how to escape parentheses in ls commands - bash

I can't get ls to match files with parentheses in their names (in bash):
$ ls -1
a_échéancier(1).pdf
a_échéancier(2).pdf
a_échéancier(3).pdf
a_échéancier(4).pdf
a_échéancier(5).pdf
a_échéancier(6).pdf
a_échéancier.pdf
$
Here is what I tried:
$ ls "*).pdf"
ls: cannot access '*).pdf': No such file or directory
$
$ ls '*\).pdf'
ls: cannot access '*\).pdf': No such file or directory
$

You are escaping too many characters; the only character that needs to be escaped is ):
ls *\).pdf
but everything else except the * can be escaped:
ls *").pdf"
The shell itself is what expands the glob before ls even runs; ls just gets an explicit list of filenames. Quoting the * makes ls try to list the single file named *).pdf, not every file in the current directory that matches the pattern.
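To see the quoting rules concretely, here is a small sketch in a scratch directory; the directory name /tmp/paren-demo and the shortened file names are made up for illustration:

```shell
# Scratch demo: create files with parentheses, then glob with only ')' protected
mkdir -p /tmp/paren-demo && cd /tmp/paren-demo
touch 'a(1).pdf' 'a(2).pdf' 'a.pdf'
ls *\).pdf     # escaped paren: lists a(1).pdf and a(2).pdf, but not a.pdf
ls *").pdf"    # quoting everything except the * gives the same result
```

Both spellings leave the * unquoted so the shell can still expand it, while protecting the ) from being parsed as shell syntax.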

ls -1 | grep ')\.pdf' should do the job: in a grep regex, ) is a literal character, and escaping the dot stops it from matching arbitrary characters. (Giving ls a quoted pattern as an argument doesn't help, because ls does no pattern matching at all; it looks for a file literally named *).pdf.)


Piping the contents of a file to ls

I have a file called "input.txt" that contains one line:
/bin
I would like to make the contents of the file be the input of the command ls
I tried doing
cat input.txt | ls
but it doesn't output the list of files in the /bin directory
I also tried
ls < input.txt
to no avail.
You are looking for the xargs command, which converts its input into command-line arguments.
xargs ls < input.txt
You say you want /bin to be the "input" to ls, but that's not correct; ls doesn't do anything with its input. Instead, you want /bin to be passed as a command-line argument to ls, as if you had typed ls /bin.
Input and arguments are completely different things; feeding text to a command as input is not the same as supplying that text as an argument. The difference can be blurred by the fact that many commands, such as cat, will operate on either their input or their arguments (or both) – but even there, we find an important distinction: what they actually operate on is the content of files whose names are passed as arguments.
The xargs command was specifically designed to transform between those two things: it interprets its input as a whitespace-separated list of command-line arguments to pass to some other command. That other command is supplied to xargs as its command-line argument(s), in this case ls.
Thanks to the input redirection provided by the shell via <, the arguments xargs supplies to ls here come from the input.txt file.
There are other ways of accomplishing the same thing; for instance, as long as input.txt does not have so many files in it that they won't fit in a single command line, you can just do this:
ls $(< input.txt)
Both the above command and the xargs version will treat any spaces in the input.txt file as separating filenames, so if you have filenames containing space characters, you'll have to do more work to interpret the file properly. Also, note that if any of the filenames contain wildcard/"glob" characters like ? or * or [...], the $(<...) version will expand them as wildcard patterns, while xargs will not.
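That wildcard-expansion difference is easy to demonstrate. This sketch uses a made-up scratch directory /tmp/xargs-glob-demo and writes a literal glob pattern into input.txt:

```shell
# '*.txt' written into input.txt: $(< ...) globs it, xargs passes it literally
mkdir -p /tmp/xargs-glob-demo && cd /tmp/xargs-glob-demo
touch star.txt
printf '*.txt\n' > input.txt
ls -d $(< input.txt)       # shell expands the glob: input.txt star.txt
xargs ls -d < input.txt    # ls receives the literal string '*.txt' and fails
```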
ls takes filenames from its command line, not from its standard input; standard input is what | ls and ls < file supply.
If you have only one file listed in input.txt and the filename doesn't contain trailing newlines, it's enough to use (note quotes):
ls "$(cat input.txt)"
Or, in most shells other than plain POSIX sh:
ls "$(< input.txt)"
If there are many filenames in the file, you'd want to use xargs, but to deal with whitespace in the names, use -d "\n" (with GNU xargs) to take each line as a filename.
xargs -d "\n" ls < input.txt
Or, if you need to handle filenames with newlines, you can separate them using NUL bytes in the input, and use
xargs -0 ls < input.txt
(This also works even if there's only one filename.)
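A quick sketch of the NUL-separated variant; the list file name list0 and the file names are made up for illustration:

```shell
# Build a NUL-separated list with printf '%s\0', then read it with xargs -0
mkdir -p /tmp/nul-demo && cd /tmp/nul-demo
touch 'name with spaces.txt' 'other.txt'
printf '%s\0' 'name with spaces.txt' 'other.txt' > list0
xargs -0 ls -d < list0    # both names arrive intact, spaces and all
```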
Try xargs
cat file | xargs ls

delete n largest files in a directory in ubuntu terminal

I want to delete n (say 2 in our case) largest files in a directory.
files=$(ls -S | head -2)
rm $files
This doesn't work because the file names have space and all sorts of special characters in them. I got similar results with this ls -xS | head -2 | xargs rm. I guess one should escape all the special characters in the file name but there are various types of special characters. Although it's doable, I didn't expect it to be this complicated.
I used -Q option to quote the file names, but I still get the same error.
Downloads > files=$(ls -SQ | head -1)
Downloads > echo $files
"[ www.UsaBit.com ] - Little Children 2006 720p BRRip x264-PLAYNOW.mp4"
Downloads > rm $files
rm: cannot remove ‘"[’: No such file or directory
rm: cannot remove ‘www.UsaBit.com’: No such file or directory
rm: cannot remove ‘]’: No such file or directory
rm: cannot remove ‘-’: No such file or directory
rm: cannot remove ‘Little’: No such file or directory
rm: cannot remove ‘Children’: No such file or directory
rm: cannot remove ‘2006’: No such file or directory
rm: cannot remove ‘720p’: No such file or directory
rm: cannot remove ‘BRRip’: No such file or directory
rm: cannot remove ‘x264-PLAYNOW.mp4"’: No such file or directory
choroba's answer works well, and even though use of eval happens to be safe in this case, it's better to form a habit of avoiding it if there are alternatives.
The same goes for parsing the output of ls.
The general recommendations are:
Avoid use of eval on input you don't control, because it can result in execution of arbitrary commands.
Do not parse ls output; if possible, use pathname expansion (globbing).
That said, sometimes ls offers so much convenience that it's hard not to use it, as is the case here: ls -S conveniently sorts by file size (in descending order); hand-crafting the same logic would be nontrivial.
The price you pay for parsing ls output is that filenames with embedded newlines (\n) won't be handled correctly (as is true of choroba's answer as well). That said, such filenames are rarely a real-world concern.
While xargs applies word-splitting to its input lines by default - which is why handling of filenames with embedded whitespace fails - it can be made to recognize each input line as a distinct, as-is argument (note that ls, when not outputting to a terminal, outputs each filename on its own line by default):
GNU xargs (as used on most Linux distros):
ls -S | head -2 | xargs -d $'\n' rm # $'\n' requires bash, ksh, or zsh
-d $'\n' tells xargs to treat each input line as a whole as a separate argument when passing arguments to rm.
BSD/macOS xargs (also works with GNU xargs):
This xargs implementation doesn't support the -d option, but it supports -0 to split the input into arguments by NULs (0x0 bytes). Therefore, an intermediate tr command is needed to translate \n to NULs:
ls -S | head -2 | tr '\n' '\0' | xargs -0 rm
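If you do want to avoid parsing ls entirely, GNU find, sort, head, and cut can do the whole job NUL-safely. This is a sketch using GNU-only options (find -printf and the -z flags of sort, head, and cut); the trailing echo is a dry-run guard you would remove to actually delete:

```shell
# GNU-only sketch: size-sort via find -printf, keep the 2 largest, NUL-safe throughout
find . -maxdepth 1 -type f -printf '%s\t%p\0' \
  | sort -z -nr \
  | head -z -n 2 \
  | cut -z -f2- \
  | xargs -0 echo rm --    # drop 'echo' to really delete
```

Because every stage is NUL-delimited, even file names containing newlines are handled correctly, which the ls-based pipelines cannot guarantee.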
If your ls supports the -Q option, it will wrap every name in double quotes (and backslash-escape any embedded double quotes).
You can't use such an output directly as the argument of rm, as word-splitting won't respect the quotes. You can use eval to force a new word splitting:
eval rm $(ls -Q | head -2)
Use this with care! eval is dangerous: it can turn data into running code that you don't control. In my tests, ls -Q renders an embedded newline as \n, which is not interpreted as a newline inside double quotes, so such names survive the round trip.

Need to concatenate a string to each line of ls command output in unix

I am a beginner in shell scripting. Below is my requirement, in the UNIX Korn shell.
Example:
When we list files using ls command redirect to a file the file names will be stored as below.
$ ls FILE*>FLIST.TXT
$ cat FLIST.TXT
FILE1
FILE2
FILE3
But I need the output as below, with the constant string STR, prefixed to each line:
$ cat FLIST.TXT
STR,FILE1
STR,FILE2
STR,FILE3
Please let me know what ls command I should use to achieve this output.
You can't use ls alone to prepend data to each file name; ls exists to list files.
You will need to use other tools alongside ls.
You can prepend to the front of each line using the sed command:
cat FLIST.TXT | sed 's/^/STR,/'
This will send the changes to stdout.
If you'd like to change the actual file, run sed in place:
sed -i -e 's/^/STR,/' FLIST.TXT
To do the append before writing to the file, pipe ls into sed:
ls FILE* | sed 's/^/STR,/' > FLIST.TXT
The following should work:
ls FILE* | xargs -I{} echo "STR,{}" > FLIST.TXT
It takes each file name produced by ls and adds the "STR," prefix before it is written to FLIST.TXT. (The lowercase -i spelling is deprecated in GNU xargs; -I{} is the portable form.)
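If you'd rather skip ls and sed entirely, a plain shell loop over the glob does the same thing. This is a sketch assuming the FILE* names from the question:

```shell
# Prepend 'STR,' to each matching file name using only the shell and printf
for f in FILE*; do
  printf 'STR,%s\n' "$f"
done > FLIST.TXT
```

This also sidesteps any issues with unusual characters in file names, since each name is expanded by the shell and quoted.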

Input redirection to grep

I have a directory with contents like this -
vishal.yadav#droid36:~/Shell$ ls
lazy_dog.txt ls-error.txt ls-output.txt ShellCommands.txt TheTimeMachineHGWells.txt words.txt words.txt.bak
First Command
If I try using ls | grep *.txt I get the following output -
ShellCommands.txt: $ cat > lazy_dog.txt
ShellCommands.txt: $ cat lazy_dog.txt
ShellCommands.txt: $ cat < lazy_dog.txt
ShellCommands.txt:input from the keyboard to the file lazy_dog.txt. We see that the result is the
Second Command
And if I use ls | grep .*.txt I get this as output -
lazy_dog.txt
ls-error.txt
ls-output.txt
ShellCommands.txt
TheTimeMachineHGWells.txt
words.txt
words.txt.bak
Isn't .*.txt and *.txt one and the same?
In the First Command, is the output of ls the regex for grep or is it the list of files?
Similarly, for the Second Command, is the output of ls the regex or list of files?
In the first command (ls | grep *.txt), the output from ls is completely ignored by grep because it sees:
grep lazy_dog.txt ls-error.txt ls-output.txt ShellCommands.txt TheTimeMachineHGWells.txt words.txt
It has one pattern, lazy_dog.txt, and five files, so it reads each file in turn looking for the pattern, and prefixes each matching output line with the name of the file that contained it. If there were only one file name, it would not list the file name before the matched lines.
It appears that the only one of the five files grep searches (ls-error.txt, ls-output.txt, ShellCommands.txt, TheTimeMachineHGWells.txt, words.txt) that contains the text lazy_dog.txt is ShellCommands.txt, so that's what you see in the output. Note that a line containing lazy_dogstxt would also match the regex (but not the shell glob).
In the second command (ls | grep .*.txt), there are no files that match .*.txt, so that argument is passed to grep unexpanded; grep then has only a pattern, so it reads its standard input, which this time is the output from ls. All the file names match the regex .*.txt (even though none of them match the shell glob .*.txt), so they're all listed. Note that it would also pick up many other lines, even one containing just "etxt", because the . is a grep metacharacter (the .*.txt regex matches any string of zero or more characters, followed by one arbitrary character, then txt).
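The regex-versus-glob difference is easy to see with grep alone; these two pipelines use literal test lines, no files needed:

```shell
# '.*.txt' as a regex matches almost anything; anchor and escape for a real suffix check
printf 'etxt\nwords.txt\n' | grep '.*.txt'    # matches both lines
printf 'etxt\nwords.txt\n' | grep '\.txt$'    # matches only words.txt
```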
Do ls -al:
you will find that the current directory is listed as . and the parent directory as ..
That is a red herring here, though: grep does no path matching at all.
grep sees the pattern .*.txt as a regex, not a glob, so the leading .* matches any prefix.
So you can use either ls *.txt (a glob) or ls | grep '\.txt$' (a regex).

Listing files in date order with spaces in filenames

I am starting with a file containing a list of hundreds of files (full paths) in a random order. I would like to list the details of the ten latest files in that list. This is my naive attempt:
$ ls -las -t `cat list-of-files.txt` | head -10
That works, so long as none of the files have spaces in, but fails if they do as those files are split up at the spaces and treated as separate files. File "hello world" gives me:
ls: hello: No such file or directory
ls: world: No such file or directory
I have tried quoting the files in the original list-of-files file, but the command substitution still splits the files up at the spaces in the filenames, treating the quotes as part of the filenames:
$ ls -las -t `awk '{print "\"" $0 "\""}' list-of-files.txt` | head -10
ls: "hello: No such file or directory
ls: world": No such file or directory
The only way I can think of doing this, is to ls each file individually (using xargs perhaps) and create an intermediate file with the file listings and the date in a sortable order as the first field in each line, then sort that intermediate file. However, that feels a bit cumbersome and inefficient (hundreds of ls commands rather than one or two). But that may be the only way to do it?
Is there any way to pass "ls" a list of files to process, where those files could contain spaces - it seems like it should be simple, but I'm stumped.
Instead of letting bash split on "one or more blank characters", you can force it to split only on newlines by changing the field separator:
OIFS=$IFS
IFS=$'\n'
ls -las -t $(cat list-of-files.txt) | head -10
IFS=$OIFS
However, I don't think this code would be more efficient than doing a loop; in addition, that won't work if the number of files in list-of-files.txt exceeds the max number of arguments.
Try this (GNU xargs):
xargs -d '\n' -a list-of-files.txt ls -last | head -n 10
Without -d '\n', xargs would split the file names at their spaces, which is exactly the problem you are trying to avoid.
I'm not sure whether this will work, but did you try escaping spaces with \? Using sed or something. sed "s/ /\\\\ /g" list-of-files.txt, for example.
This worked for me:
xargs -d\\n ls -last < list-of-files.txt | head -10
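A minimal reproduction of why the newline delimiter matters, using made-up file names in a scratch directory:

```shell
# File names with spaces survive only with a newline delimiter (GNU xargs)
mkdir -p /tmp/space-demo && cd /tmp/space-demo
touch 'hello world.txt' 'plain.txt'
printf '%s\n' 'hello world.txt' 'plain.txt' > list-of-files.txt
xargs -d '\n' ls -last < list-of-files.txt   # both files listed correctly
```

Without -d '\n', xargs would hand ls three arguments (hello, world.txt, plain.txt), reproducing the original error.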
