list all files in multiple directories - bash

I would like to list all files in a certain directory, and list them with their full path.
I have a series of directories like this:
user.newskims.131017222704/
user.newskims.131017222741/
user.newskims.131017222822/
user.newskims.131017222949/
If I do
ls user.newskims.131017222*
The output has lines like this which I want to eliminate:
user.newskims.131017222822:
It also doesn't give the full path. Is there a way to make it list all of the files inside, and only those files and no additional rows, and with the full path?

You can list files with their full paths for the given directories using printf:
printf "$PWD/%s\n" user.newskims.131017222*/*

Related

How do I combine txt files from a list of file locations

I have a problem: I used "Everything" to extract the path of every txt file under a specific directory so that I can merge them, but in EmEditor I can't find a way to merge files from a list of locations.
Here is what the Everything output looks like:
E:\Main directory\subdirectory 1\file.txt
E:\Main directory\subdirectory 2\file.txt
E:\Main directory\subdirectory 3\file.txt
E:\Main directory\subdirectory 4\file.txt
The list runs to over 40k locations. Is there a way to use a program to read all the locations in the text file and combine the files?
Also, the subdirectories contain other txt files that I don't want, so I can't just merge every txt file under the main directory. Another complication is that there are variations of "file.txt", such as "Files.txt".
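Although the question mentions Windows tools, this is straightforward in a Unix-style shell. A minimal sketch, assuming the Everything export is saved as list.txt and the shell is WSL (where drive E: is mounted at /mnt/e); list.txt and merged.txt are placeholder names:
: > merged.txt                       # start with an empty output file
while IFS= read -r winpath; do
    # convert "E:\dir\file.txt" to "/mnt/e/dir/file.txt"
    unixpath=$(printf '%s\n' "$winpath" | sed -e 's|\\|/|g' -e 's|^E:|/mnt/e|')
    cat "$unixpath" >> merged.txt
done < list.txt
If list.txt still has Windows line endings, run it through dos2unix (or strip the trailing carriage return in the sed call) first.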

Extracting contents of many zipped folders into a single directory

Kind of easy question, but I can't find the answer. I want to extract the contents of multiple zipped folders into a single directory. I am using the bash console, which is the only tool available on the particular website I am using.
For example, I have two folders: a.zip (which contains a1.txt and a2.txt) and b.zip (which contains b1.txt and b2.txt). I want to extract all four text files into a single directory.
I have tried
unzip \*.zip -d \newdirectory
But it creates two directories (a and b) with two text files in each.
I also tried concatenating the two zipped folders into one big folder and extracting it, but it still creates two directories, even when I specify a new directory.
I can't figure what I am doing wrong. Any help?
Thanks in advance!
Use the -j option to discard the directory structure stored inside the archives:
unzip -j -d /path/to/your/directory '*.zip'
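Applied to the example in the question, a sketch might look like this (-o simply suppresses the overwrite prompt in case two archives contain files with the same name):
mkdir -p newdirectory
unzip -j -o '*.zip' -d newdirectory
After this, newdirectory holds a1.txt, a2.txt, b1.txt and b2.txt with no subfolders.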

copy files from different locations and with different names into one target directory

I have really looked for solutions to this, but they all assume either that the file names are similar (like file1, file2, file3, ...) so a pattern can be used, or that find can look for files with a certain name. In my case, though, I have completely different file names.
Assume I have
file1 in E:\dir1\, input99 in C:\dir2, result3 in F:\dir3
so these three files reside in different locations, on different drives, but are still local.
Is it possible to use cp with absolute pathnames?
Like:
cp /dir1/file1, /dir2/input99, /dir3/result3 /targetdirectory
cp accepts a list of source files (specified either as absolute paths or relative paths) and copies all of them to the target directory (the last parameter). So:
cp E:\dir1\file1 C:\dir2\input99 F:\dir3\result3 C:\targetdirectory
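Note that drive-letter paths like E:\dir1 are Windows notation; if cp is actually run from a Unix-style shell on Windows, the drives appear under a mount prefix instead. A sketch assuming WSL's default /mnt layout (Cygwin would use /cygdrive/e, Git Bash just /e):
cp /mnt/e/dir1/file1 /mnt/c/dir2/input99 /mnt/f/dir3/result3 /mnt/c/targetdirectory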

Terminals - Creating Multiple Identical Folders within Subdirectories and Moving Files

I have a bunch of files I'm trying to organize quickly, and I had two questions about how to do that. I really appreciate any help! I tried searching but couldn't find anything on these specific commands for OSX.
First, I have about 100 folders in a directory - I'd like to place a folder in each one of those folders.
For example, I have
Cars/Mercedes/
Cars/BMW/
Cars/Audi/
Cars/Jeep/
Cars/Tesla/
Is there a way I can create a folder named "Pricing" inside each of those in one command, i.e.:
Cars/Mercedes/Pricing
Cars/BMW/Pricing
Cars/Audi/Pricing
Cars/Jeep/Pricing
Cars/Tesla/Pricing
My second question is a little tougher to explain. In each of these folders, I'd like to move certain files into the newly created folders (above) in the subdirectory.
Each file has a slightly different filename but contains the same string of letters - for example, in each of the above folders, I might have
Cars/Mercedes/payment123.html
Cars/BMW/payment432.html
Cars/Audi/payment999.html
Cars/Jeep/payment283.html
Is there a way to search each subdirectory for a file containing the string "payment" and move that file into a subfolder of that subdirectory - i.e. into the hypothetical "Pricing" folders we just created above - with one command for all the subdirectories in Cars?
Thanks so much! Help with either of these would be invaluable.
I will assume you are using bash, since it is the default shell in OS X. One way to do this uses a for loop over each directory to create the subdirectory and move the file. Wildcards are used to find all of the directories and the file.
for DIR in Cars/*/ ; do
    mkdir "${DIR}Pricing"
    mv "${DIR}"payment*.html "${DIR}Pricing/"
done
The first line finds every directory inside Cars and runs the loop body once for each, with ${DIR} holding the current directory (including its trailing slash). The second line creates the Pricing subdirectory using that substitution; the double quotes are needed only if the path could contain spaces. The third line moves every file in the directory whose name starts with "payment" and ends with ".html" into the subdirectory - note that the payment*.html wildcard is left outside the quotes so the shell still expands it. If several files match, they are all moved. The final line simply marks the end of the loop.
If you are typing this directly into the command line, you can combine it into a single line:
for DIR in Cars/*/ ; do mkdir "${DIR}Pricing"; mv "${DIR}"payment*.html "${DIR}Pricing/"; done
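If some Cars subdirectory has no payment file at all, the mv in the loop just reports "no such file" for it and carries on. As an alternative sketch for the second question, the move step can also be done with a single find command (assuming the two-level layout from the question; -execdir runs mv inside the directory where each match was found, so Pricing/ is resolved relative to that directory):
for DIR in Cars/*/ ; do mkdir -p "${DIR}Pricing"; done
find Cars -maxdepth 2 -name 'payment*.html' -execdir mv {} Pricing/ \;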

Finding and Removing Unused Files Through Command Line

My website's file structure has gotten very messy over the years from uploading random files to test different things out. I have a list of all my files, such as this:
file1.html
another.html
otherstuff.php
cool.jpg
whatsthisdo.js
hmmmm.js
Is there any way I can input my list of files via the command line, search the contents of all the other files on my website, and output a list of the files that aren't mentioned anywhere in my other files?
For example, if cool.jpg and hmmmm.js weren't mentioned in any of my other files then it could output them in a list like this:
cool.jpg
hmmmm.js
And then any of those other files mentioned above aren't listed because they are mentioned somewhere in another file. Note: I don't want it to just automatically delete the unused files, I'll do that manually.
Also, of course I have multiple folders so it will need to search recursively from my current location and output all the unused (unreferenced) files.
I'm thinking the command line would be the fastest/easiest way, unless someone knows of another. Thanks in advance for any help you guys can give!
Yep! This is pretty easy to do with grep. In this case, you would run a command like:
for orphan in $(cat orphans.txt); do
    echo "Checking for presence of ${orphan} in current directory..."
    grep -rl "$orphan" .
done
And orphans.txt would look like your list of files above, one file per line. You can add -i to the grep if you want to match case-insensitively, and you would want to run the command in /var/www or wherever your distribution keeps its webroot. If you see a "Checking for..." line with no matching files listed beneath it, nothing references that file name, so it is a candidate for removal.
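That shows where each name is referenced; to get straight to the list of unreferenced files the question asks for, a slightly different sketch inverts the test and prints only the names grep cannot find anywhere (orphans.txt is excluded so that the list itself does not count as a reference):
while IFS= read -r name; do
    grep -rqF --exclude=orphans.txt -- "$name" . || echo "$name"
done < orphans.txt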
