ctags does not work for long file names

I used
ctags -e -R foo
to generate TAGS. But TAGS includes file names as tags, like
foo/a/bar/d/vvv.cpp
when the real file name is
foo/abc/bar/ddd/vvv.cpp
so it cannot find the correct files.
Is it because my file names are too long?

This turned out to be because I ran
ctags -e -R ./foo/
The issue does not occur if I run
ctags -e -R foo/

Rename files in bash based on content inside

I have a directory which has 70000 xml files in it. Each file has a tag which looks something like this, for the sake of simplicity:
<ns2:apple>, <ns2:orange>, <ns2:grapes>, <ns2:melon>. Each file has only one fruit tag, i.e. there cannot be both apple and orange in the same file.
I would like to rename every file (add "1_" to the beginning of each filename) which has one of <ns2:apple>, <ns2:orange>, <ns2:melon> inside of it.
I can find such files with egrep:
egrep -r '<ns2:apple>|<ns2:orange>|<ns2:melon>'
So how would it look as a bash script, which I can then use as a cron job?
P.S. Sorry, I don't have a bash script draft; I have very little experience with it, and time is of the essence right now.
This may be done with a script along these lines:
#!/bin/sh
find /path/to/directory/with/xml -type f | while IFS= read -r f; do
    grep -q -E '<ns2:apple>|<ns2:orange>|<ns2:melon>' "$f" && mv "$f" "$(dirname "$f")/1_$(basename "$f")"
done
But it will rescan the directory each time it runs and add another 1_ prefix to every file containing one of your tags. This means a lot of excess I/O, and files with certain tags will get an extra 1_ prefix on each run, resulting in names like 1_1_1_1_file.xml.
You should probably think more about the design, e.g. move processed files into two directories based on whether or not they contain the tags:
#!/bin/sh
# create output dirs
mkdir -p /path/to/directory/with/xml/with_tags/ /path/to/directory/with/xml/without_tags/
find /path/to/directory/with/xml -maxdepth 1 -mindepth 1 -type f | while IFS= read -r f; do
    if grep -q -E '<ns2:apple>|<ns2:orange>|<ns2:melon>' "$f"; then
        mv "$f" /path/to/directory/with/xml/with_tags/
    else
        mv "$f" /path/to/directory/with/xml/without_tags/
    fi
done
Run this command as a dry run first, then remove --dry-run to actually rename the files:
grep -Pl '(<ns2:apple>|<ns2:orange>|<ns2:melon>)' *.xml | xargs rename --dry-run 's/^/1_/'
The command-line utility rename comes in many flavors, and most of them should work for this task. I used rename version 1.601 by Aristotle Pagaltzis. To install rename, simply download its Perl script and place it in a directory on your $PATH, or install it with conda, like so:
conda install rename
Here, grep uses the following options:
-P : Use Perl regexes.
-l : Suppress normal output; instead print the name of each input file from which output would normally have been printed.
SEE ALSO:
grep manual
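The grep half of that pipeline can be tried on its own before involving rename. A small sketch with made-up file names and contents (using -E here, since plain alternation does not need Perl regexes):

```shell
# Sketch: grep -l prints only the names of files that contain a match,
# which is exactly what gets piped to rename. Demo files are made up.
dir=$(mktemp -d) && cd "$dir"
printf '<ns2:apple>\n'  > one.xml
printf '<ns2:grapes>\n' > two.xml
grep -El '<ns2:apple>|<ns2:orange>|<ns2:melon>' *.xml
```

Only one.xml is listed; two.xml contains no matching tag, so it never reaches rename.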

Rpmbuild copying folders specified by a mapping file

Currently working with RHEL. I need to create a script that can use a mapping file (source and target need to be dynamic). What file type should be used for the mapping file (.csv, .txt, .json)?
Directories end in a slash, e.g. src/
cp -r can be used on a single file, e.g. cp -r src/file.fil tgt/
The script will be kicked off by the spec file, and files will be copied during the build.
Source and target need to be dynamic.
Script example
cp -r src/file.fil tgt/
Mapping file
<src>\t<tgt>\n e.g. src/file.fil \t tgt/ \n OR 'src/file.fil'\t'tgt/'\n
xargs can be used for that:
xargs --arg-file=your-file.txt -L 1 cp
-L 1 makes xargs run one cp per input line, so each line supplies one source and one target (without it, xargs would batch many lines into a single cp call).
The file should contain lines like
file1 tgt/
file2 tgt/
or a for loop reading from a file with just source file names, provided that your target is constant (note that this relies on word splitting, so it breaks on file names containing spaces):
for f in $(cat yourfile.txt); do
    cp "$f" tgt/
done
or even
while read -r src tgt; do
cp "$src" "$tgt"
done < yourfile.txt
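A slightly more defensive variant of that last loop, as a self-contained sketch (the mapping entries and paths are demo stand-ins): it splits on an explicit tab, so the "<src>\t<tgt>" format from the question is honored even if a path contains spaces, and it creates the target directory if it is missing:

```shell
#!/bin/sh
# Sketch: copy files according to a tab-separated "<src>\t<tgt>" mapping file.
# All paths below are demo stand-ins created in a temp dir.
work=$(mktemp -d) && cd "$work"
mkdir src
printf 'data' > src/file.fil
printf 'src/file.fil\ttgt/\n' > map.txt

while IFS="$(printf '\t')" read -r src tgt; do
    mkdir -p "$tgt"        # create the target directory on the fly
    cp -r "$src" "$tgt"    # -r also handles directories used as sources
done < map.txt
```

A plain .txt mapping file with one tab-separated pair per line is enough here; .csv or .json would only pay off if the entries need more fields.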

Linux bash script to copy files by list

I'm new to bash and I need some help, please. I have a file called list.txt containing patterns like
1210
1415
1817
What I want to do is write a bash script which will copy every file in my current directory whose name contains one of those patterns into a new directory called toto.
Example of my file in the current directory :
1210_ammm.txt
1415_xdffmslk.txt
1817_lsmqlkksk.txt
201247_kksjdjdjd.txt
The goal is to copy 1210_ammm.txt, 1415_xdffmslk.txt, 1817_lsmqlkksk.txt to toto.
My list.txt and toto directory are in my current directory. This is what I tried:
#!/bin/bash
while read p; do # read my list file
for i in `find -name $p -type f` # find all file match the pattern
do
cp $i toto # copy all files find into toto
done
done < partB.txt
I don't have an error but it doesn't do the job.
Here is what you need to implement:
read tokens from an input file
for each token
search the files whose name contain said token
for each file found
copy it to toto
To read tokens from the input file, you can use a read command in a while loop (see the Bash FAQ generally, and Bash FAQ 24 specifically).
To search for files whose names contain a string, you can use a for loop and globbing. For example, for file in ./*test*; do echo "$file"; done will print the names of the files in the current directory which contain test.
To copy a file, use cp.
You can check this ideone sample for a working implementation.
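The steps above can be sketched as a script (list.txt, toto, and the file names mirror the question; the ./*"$p"* glob does the substring matching):

```shell
#!/bin/sh
# Sketch of the outline above: for each token in list.txt, glob for
# files whose names contain it and copy the matches into toto/.
# Demo setup mirrors the question's example file names.
dir=$(mktemp -d) && cd "$dir"
mkdir toto
printf '1210\n1415\n' > list.txt
touch 1210_ammm.txt 1415_xdffmslk.txt 201247_kksjdjdjd.txt

while IFS= read -r p; do          # read one token per line
    for f in ./*"$p"*; do         # glob: names containing the token
        [ -e "$f" ] && cp "$f" toto/   # skip the unmatched literal glob
    done
done < list.txt
ls toto
```

This avoids find entirely: the question's find -name $p fails because -name matches the whole name exactly, while the glob here matches the token anywhere in the name.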
Use the below script:
ls | grep -f list.txt | xargs -I {} cp {} toto
ls | grep -f list.txt greps the ls output for the patterns found in list.txt.
xargs then runs cp once per matched file, copying it into the toto directory.
NOTE: If list.txt and toto are not in the current directory, provide absolute paths in the script.
I needed this too. I tried @Zaziln's answer, but it gave me errors. I just found a better answer; I think others will be interested too.
mapfile -t files < test1.txt
cp -- "${files[@]}" Folder/
I found it on this post --> https://unix.stackexchange.com/questions/106219/copy-files-from-a-list-to-a-folder#106231

Command line zip everything within a directory, but do not include any directory as the root

I can't find the answer to this for the life of me. Because I am packaging a zip in a specific way for a build process, I don't want to include a folder at all in the resulting zip at the root. For example, if I have this file path:
MyFolder/
A.png
B.txt
C.mp3
And I use either the command:
zip -r -X "MyFolder.zip" MyFolder/*
or
cd MyFolder; zip -r -X "../MyFolder.zip" *
I end up with a zip file that has the root element of MyFolder. What I want instead, when I unzip it, is for everything to be dumped right into the directory, like this:
A.png
B.txt
C.mp3
In other words, I don't want MyFolder or any other folder as the root. I read through the whole manual, tried numerous options, and did a lot of Google searching, and zip seems to really want to have a folder at the root.
Thanks!
It was Archive Utility's fault (a Mac OS X unzipper app). When I used the unzip command from the command line, it worked great.
(cd MyFolder && zip -r -X "../MyFolder.zip" .)
Stumbled across this answer but didn't want to have to change in and out of directories. I found the -j option useful: it adds all files to the root of the zip. Note that it is all files, so subdirectory structure will not be preserved.
So with this folder structure:
MyFolder
- MyFile1
- MySubFolder
- MyFile2
And this command:
zip -rj MyFolder.zip MyFolder
You get this:
MyFolder.zip
- MyFile1
- MyFile2
I found an easy way to make an encrypted zip file from just the files in a folder, using the Terminal app on macOS.
The command for the terminal:
zip -j -e wishedname.zip yourfolder/*
That's it. Enjoy!
For more information on the zip command in the terminal app:
man zip
What do -j and -e do?
-j
--junk-paths
Store just the name of a saved file (junk the path), and do not store directory names. By default, zip will store the full path (relative to the current directory).
-e
--encrypt
Encrypt the contents of the zip archive using a password which is entered on the terminal in response to a prompt (this will not be echoed; if standard error is not a tty, zip will exit with an error). The password prompt is repeated to save the user from typing errors.
Install zip
sudo apt install zip
Use zip:
zip -r foo.zip .
You can use the flags -0 (no compression) to -9 (best compression) to change the compression rate.
Excluding files can be done via the -x flag. From the man-page:
-x files
--exclude files
Explicitly exclude the specified files, as in:
zip -r foo foo -x \*.o
which will include the contents of foo in foo.zip while excluding all the files that end in .o. The backslash avoids the shell filename substitution, so that the name matching is performed by zip at all directory levels.
Also possible:
zip -r foo foo -x@exclude.lst
which will include the contents of foo in foo.zip while excluding all the files that match the patterns in the file exclude.lst.
The long option forms of the above are
zip -r foo foo --exclude \*.o
and
zip -r foo foo --exclude @exclude.lst
Multiple patterns can be specified, as in:
zip -r foo foo -x \*.o \*.c
If there is no space between -x and the pattern, just one value is assumed (no list):
zip -r foo foo -x\*.o
See -i for more on include and exclude.

How to extract only one kind of file from the archive?

Given a .zip or .rar archive containing 10 files, each with different extensions.
Given I only want the .jpg file in it.
How do I extract the *.jpg in it without having to extract the 9 other files?
Try this :
unzip test.zip '*.jpg'
The argument after the archive name is the file to be extracted. See the man unzip Arguments section:
[file(s)]
An optional list of archive members to be processed, separated
by spaces. (VMS versions compiled with VMSCLI defined must
delimit files with commas instead. See -v in OPTIONS below.)
Regular expressions (wildcards) may be used to match multiple
members; see above. Again, **be sure to quote expressions that
would otherwise be expanded** or modified by the operating system.
tar -xf test.tar --wildcards --no-anchored '*.jpg'
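A self-contained check of that invocation (file names are made up; --wildcards and --no-anchored are GNU tar options):

```shell
# Sketch: build a small archive, then extract only the .jpg member.
dir=$(mktemp -d) && cd "$dir"
printf 'jpg' > photo.jpg
printf 'txt' > notes.txt
tar -cf test.tar photo.jpg notes.txt

mkdir out
# --wildcards enables glob matching on member names;
# --no-anchored lets '*.jpg' match at any path depth inside the archive.
tar -xf test.tar -C out --wildcards --no-anchored '*.jpg'
ls out
```

Only photo.jpg lands in out/; notes.txt stays inside the archive.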
You can use:
while read -r f; do
    tar -xf "$zipFile" "$f"
done < <(tar -tf "$zipFile" | grep '\.jpg$')
Note that this re-reads the archive once per matching file.