Concatenate contents of all files of type X in folder/subfolders to a variable - macOS

I'm trying to write a script that:
Reads a single file into a variable
Iterates through all files of a given type in all subfolders and appends their contents to that same variable
Here is the approach I tried:
common_css=$(cat "$path/normalize.css") #$path is valid
find $path/common -type f -name *.css | while read f; do
common_css=$common_css $(cat "$f")
echo "$common_css"
done
echo "$common_css"
Example:
FILES
/normalize.css = body{ background-color : red; }
/common/name.css = table{ display : block; }
/common/oauth.conf = key="somekey"
/common/place.css = p{ font-size: 2em; }
DESIRED OUTPUT
body{ background-color : red; }
table{ display : block; }
p{ font-size: 2em; }

Try the following (assuming you really want to build up the entire string in memory):
common_css=$(cat "$path/normalize.css";
find "$path/common" -type f -name '*.css' -exec cat {} +)
A single command substitution can output the concatenation of the contents of all files of interest, no need for loops.
-exec cat {} +, thanks to the trailing +, passes as many matching file paths as will fit onto a single command line to cat, resulting in typically just one invocation.
I've added quoting to make the command more robust - notably, you should always quote the glob argument passed to -name, to prevent premature expansion by the shell.
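For completeness: the loop in the question also fails because the pipe into while runs the loop body in a subshell, so the assignments to common_css are thrown away when the loop ends. If you do want an explicit loop, here is a minimal sketch, assuming bash (process substitution is not POSIX sh):
#!/bin/bash
common_css=$(cat "$path/normalize.css")
# read find's output in the current shell rather than a subshell,
# so the appends to common_css survive the loop
while IFS= read -r f; do
    common_css+=$'\n'$(cat "$f")
done < <(find "$path/common" -type f -name '*.css')
echo "$common_css"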

Related

Recursively convert media directory from HEVC to h.264 with ffmpeg

I have media server with two directories: Movies and TV Shows. Within each of those directories, each entry exists in a sub-directory which contains the video file and subtitle files.
I've scoured the web and have found an excellent perl script from Michelle Sullivan, posted here:
#!/usr/bin/perl
use strict;
use warnings;
open DIR, "ls -1 |";
while (<DIR>)
{
chomp;
next if ( -d "$_"); # skip directories
next unless ( -r "$_"); # if it's not readable skip it!
my $file = $_;
open PROBE, "ffprobe -show_streams -of csv '$file' 2>/dev/null|" or die ("Unable to launch ffmpeg for $file! ($!)");
my ($v, $a, $s, @c) = (0,0,0);
while (<PROBE>)
{
my @streaminfo = split(/,/, $_);
push(@c, $streaminfo[2]) if ($streaminfo[5] eq "video");
$a++ if ($streaminfo[5] eq "audio");
$s++ if ($streaminfo[5] eq "subtitle");
}
close PROBE;
$v = scalar @c;
if (scalar @c eq 1 and $c[0] eq "ansi")
{
warn("Text file detected, skipping...\n");
next;
}
warn("$file: Video Streams: $v, Audio Streams: $a, Subtitle Streams: $s, Video Codec(s): " . join (", ", @c) . "\n");
if (scalar @c > 1)
{
warn("$file has more than one video stream, bailing!\n");
next;
}
if ($c[0] eq "hevc")
{
warn("HEVC detected for $file ...converting to AVC...\n");
system("mkdir -p h265");
my @params = ("-hide_banner", "-threads 2");
push(@params, "-map 0") if ($a > 1 or $s > 1 or $v > 1);
push(@params, "-c:a copy") if ($a);
push(@params, "-c:s copy") if ($s);
push(@params, "-c:v libx264 -pix_fmt yuv420p") if ($v);
if (system("mv '$file' 'h265/$file'"))
{
warn("Error moving $file -> h265/$file\n");
next;
}
if (system("ffmpeg -xerror -i 'h265/$file' " . join(" ", @params) . " '$file' 2>/dev/null"))
{
warn("FFMPEG ERROR. Cannot convert $file restoring original...\n");
system("mv 'h265/$file' '$file'");
next;
}
} else {
warn("$file doesn't appear to need converting... Skipping...\n");
}
}
close DIR;
The script performs perfectly - as long as it is run from within the directory containing the media.
My question: Can this script be modified to run recursively from the root directory? How?
Thanks in advance.
(Michelle's script can be seen here: http://www.michellesullivan.org/blog/1636)
Why do you want to run recursively? Do you mean that you want to run it on all the files under a particular directory?
For problems like this, I'd rather separate the part that generates the list of files to process from the part that does the processing. With a long list of files, I might take the lines from standard input instead:
while( <> ) {
...
}
Pipe the list into the script:
$ find ... | script
Or take it from a file:
$ script list_of_files.txt
With a short list, I might use a favorite xargs trick:
$ find ... -print0 | xargs -0 script
In that case I go through the command-line arguments:
foreach ( @ARGV ) {
...
}
If you want to do all of this in the program, you can use File::Find.
Beyond that, it sounds like you are asking someone to do the work for you.
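To make the first suggestion concrete: since the script processes whatever directory it is run from (it reads ls -1), you can let find enumerate the sub-directories and run the script inside each one. A minimal sketch, assuming Michelle's script is saved as /path/to/convert.pl (a placeholder name):
find /path/to/media -type d -print0 | while IFS= read -r -d '' dir; do
    ( cd "$dir" && perl /path/to/convert.pl )   # subshell, so the cd doesn't leak
done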

Looping through folders and renaming all files in each folder sequentially with folder name

I have a series of subfolders, each with images on them.
Structure looks like
/stuff/
IMG_012.JPG
IMG_013.JPG
IMG_014.JPG
/morestuff/
IMG_022.JPG
IMG_023.JPG
IMG_024.JPG
I would like to write a script on my mac terminal to loop through each folder and rename the images sequentially including the folder name. So, the above structure would look like:
/stuff/
stuff_1.JPG
stuff_2.JPG
stuff_3.JPG
/morestuff/
morestuff_1.JPG
morestuff_2.JPG
morestuff_3.JPG
I originally tried creating an Automator workflow and using variables, but had difficulty assigning the folder name as the variable and getting the loop to work.
I'm hoping there is an elegant solution with the terminal.
Any ideas?
This should work nicely for you. Save it in your HOME directory as "RenameImages", then make it executable like this:
cd
chmod +x RenameImages
Then you can run it (via find's -exec) on every directory (-type d) like this:
find . -type d -exec ./RenameImages {} \;
Here is the script:
#!/bin/bash
# Ignore case, i.e. process *.JPG and *.jpg
shopt -s nocaseglob
shopt -s nullglob
# Go to where the user says
echo Processing directory $1
cd "$1" || exit 1
# Get last part of directory name
here=$(pwd)
dir=${here/*\//}
i=1
for image in *.JPG
do
echo mv "$image" "${dir}_${i}.JPG"
((i++))
done
At the moment the script does nothing except show you what it would do (a dry run). If you like what it is doing, simply remove the word echo from the third-to-last line and run it again.
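A side note on the dir=${here/*\//} line: it uses pattern substitution to delete everything up to the last slash. If you prefer, the same result can be had with the more common idioms:
dir=${PWD##*/}           # parameter expansion: strip the longest prefix ending in /
dir=$(basename "$PWD")   # same result via the basename utility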
I would just like to throw out another way to do this, since you mentioned you tried using Automator. This file search and process software, JFileProcessor (http://www.softpedia.com/get/File-managers/JFileProcessor.shtml, https://github.com/stant/jfileprocessor),
will let you search for files with glob or regex, in subfolders to X or all depth, by name, size, or date. You can save the results to a List window or file. Then you can run a Groovy (think Java) script to do whatever you want to the list of files: zip or tar them, modify the list strings like sed, delete, move, or copy files, grep or ls -l them, whatever. In your case you could modify an existing Groovy example to do something like:
int numItems = defaultComboBoxModel.getSize();
System.out.println( "defaultComboBoxModel.getSize() num of items =" + numItems + "=" );
String str = "";
for( int i = 0; i < numItems; i++ )
{
str = defaultComboBoxModel.getElementAt( i ).toString();
System.out.println( "file of index =" + i + " str =" + str + "=" );
String cmd = "mv " + str + " " + i + ".jpg"; // or whatever
def list = cmd.execute().text // this stuff just captures output and write to a log file
list.eachLine{
outFile << it;
}
outFile << System.getProperty("line.separator") + "-------------------------------------" + System.getProperty("line.separator");
}
It will also let you massage your list: add to it, delete from it, or subtract one list from another.

How to find a specific files recursively in the directory, rename it by prefixing sub-directory name, and move it to different directory

I am a Perl noob, and am trying to do the following:
Search recursively for files with a specific string in their name, say 'abc.txt', in a directory
The file can be in two different sub-directories, say dir_1 or dir_2
Once a file is found, if it is in dir_1, rename it to dir_1_abc.txt. If it is in dir_2, rename it to dir_2_abc.txt.
Once all the files have been found and renamed, move them all to a new directory named, say, dir_3
I don't mind using a module to accomplish this. I have been trying to do it using File::Find::Rule and File::Copy, but am not getting the desired result. Here is my sample code:
#!/usr/bin/perl -sl
use strict;
use warnings;
use File::Find::Rule;
use File::Copy;
my $dir1 = '/Users/macuser/ParentDirectory/logs/dir_1';
my $dir2 = '/Users/macuser/ParentDirectory/logs/dir_2';
#ideally I just want to define one directory but because of the logic I am using in IF
#statement, I am specifying two different directory paths
my $dest_dir = '/Users/macuser/dir_3';
my (@old_files) = find(
file => (),
name => '*abc.txt',
in => $dir1, $dir2 ); #not sure if I can give two directories, works with one
foreach my $old_file (@old_files) {
print $old_file; #added this for debug
if ($dest_dir =~ m/dir_1/)
{
print "yes in the loop";
rename ($old_file, "dir_1_$old_file");
print $old_file;
copy "$old_file", "$dest_dir";
}
if ($dest_dir =~ m/dir_2/)
{
print "yes in the loop";
rename ($old_file, "dir_2_$old_file");
print $old_file;
copy "$old_file", "dest_dir";
}
}
The code above does not change the file name; instead, when I print $old_file inside the if, it prints the whole directory path where the file was found, and it prefixes that path with dir_1 or dir_2 respectively. Something is horribly wrong. Please help.
If you have bash (I assume it is available on OS X), you can do this in a few lines (usually I put them on one line).
destdir="your_dest_dir"
for i in `find /Users/macuser/ParentDirectory/logs -type f -iname '*abc.txt' `
do
prefix=`dirname $i`
if [[ $prefix = *dir_1* ]] ; then
prefix="dir_1"
fi
dest="$destdir/${prefix}_`basename $i`"
mv "$i" "$dest"
done
The advantage of this method is that you can have many sub-directories under logs and you don't need to specify them. You can search for files like blah_abc.txt and tada_abc.txt too. If you want an exact match, just use abc.txt instead of *abc.txt.
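One caveat: for i in `find ...` word-splits on whitespace, so the loop breaks on paths containing spaces, and the dirname fix-up above only handles dir_1. A hedged variant of the same loop (a sketch, not the answer's original code) that survives spaces and prefixes with whichever sub-directory the file was actually in:
destdir="your_dest_dir"
find /Users/macuser/ParentDirectory/logs -type f -iname '*abc.txt' -print0 |
while IFS= read -r -d '' i; do
    prefix=$(basename "$(dirname "$i")")        # dir_1, dir_2, ...
    mv "$i" "$destdir/${prefix}_$(basename "$i")"
done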
If the files can be placed in the destination as you rename them, try this:
#!/usr/bin/perl
use strict;
use File::Find;
use File::Copy;
my $dest_dir = '/Users/macuser/dir_3';
foreach my $dir ('/Users/macuser/ParentDirectory/logs/dir_1', '/Users/macuser/ParentDirectory/logs/dir_2') {
my $prefix = $dir; $prefix =~ s/.*\///;
find(sub {
move($File::Find::name, "$dest_dir/${prefix}_$_") if /abc\.txt$/;
}, $dir);
}
If you need to do all the renaming first and then move them all, you could either remember the list of files you have to move or you can make two passes making sure the pattern on the second pass is still OK after the initial rename in the first pass.

How to preserve single and double quotes in shell script arguments WITHOUT the ability to control how they pass

I need to receive arguments I have no control over into a shell script, and preserve any single or double quotes. For instance, a script that simply outputs the given arguments should act as follows:
> my_script.sh "double" 'single' none
"double" 'single' none
I don't have the privilege of augmenting the arguments such as in:
> my_script.sh \"double\" \'single\' none
or
> my_script.sh '"double"' "'single'" none
And neither "$@" nor "$*" work.
I also thought of reading from STDIN and try something like:
> echo "double" 'single' none | my_script.sh
thinking it may help, but no breakthrough so far.
Any suggestions?
CSH / PERL solutions are welcomed.
This is not possible (without escaping), because the shell processes the arguments and removes the quotes before your script is called. As a result, your script does not know about the quotes specified on the command line.
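A quick way to see this (show_args.sh is a hypothetical stand-in for my_script.sh):
#!/bin/sh
# print each argument exactly as the script receives it
for arg in "$@"; do printf '<%s>\n' "$arg"; done
Running ./show_args.sh "double" 'single' none prints <double>, <single> and <none>: the quotes were consumed by the calling shell before the script ever started.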
You cannot recover the single/double quotes exactly as they were, because the shell 'eats' them. If you need to call some other script from your script, you can e.g. single quote the arguments again. Here is a PERL solution I use:
sub args2shell
{
local (@argv) = @_;
local $" = '\' \'';
local (@margv);
@margv = map { s/'/'\\''/g; $_ } @argv;
return "\'@margv\'" if @margv;
return undef;
}
Example usage:
$args = args2shell @ARGV;
open F, "find -follow $args ! -type d |";
...
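For comparison, a bash sketch of the same re-quoting idea (printf '%q' is a bash extension, not POSIX):
# build a re-quoted argument string that survives another round of shell parsing
quoted=''
for arg in "$@"; do
    quoted+=" $(printf '%q' "$arg")"
done
echo "find -follow$quoted ! -type d"   # mirrors the Perl example above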

Locating a file on the path

Does anybody know how to determine the location of a file that's in one of the folders specified by the PATH environmental variable other than doing a dir filename.exe /s from the root folder?
I know this is stretching the bounds of a programming question but this is useful for deployment-related issues, also I need to examine the dependencies of an executable. :-)
You can use the where.exe utility in the C:\Windows\System32 directory.
For WindowsNT-based systems:
for %i in (file) do @echo %~dp$PATH:i
Replace file with the name of the file you're looking for.
If you want to locate the file at the API level, you can use PathFindOnPath. It has the added bonus of being able to specify additional directories, in case you want to search in additional locations apart from just the system or current user path.
On Windows I'd say use %WINDIR%\system32\where.exe
Your question's title doesn't specify Windows, so I imagine some folks might find this question looking for the same with a POSIX OS on their mind (like myself).
This php snippet might help them:
<?php
function Find( $file )
{
foreach( explode( ':', $_ENV[ 'PATH' ] ) as $dir )
{
$command = sprintf( 'find -L %s -name "%s" -print', $dir, $file );
$output = array();
$result = -1;
exec( $command, $output, $result );
if ( count( $output ) == 1 )
{
return( $output[ 0 ] );
}
}
return null;
}
?>
This is slightly altered production code I'm running on several servers. (i.e. taken out of OO context and left some sanitation and error checking out for brevity.)
Using PowerShell on Windows...
Function Get-ENVPathFolders {
#.Synopsis Split $env:Path into an array
#.Notes
# - Handle 1) folders ending in a backslash 2) double-quoted folders 3) folders with semicolons 4) folders with spaces 5) double-semicolons i.e. blanks
# - Example path: 'C:\WINDOWS\;"C:\Path with semicolon; in the middle";"E:\Path with semicolon at the end;";;C:\Program Files;
# - 2018/01/30 by Chad@ChadsTech.net - Created
$NewPath = @()
$env:Path.ToString().TrimEnd(';') -split '(?=["])' | ForEach-Object { #remove a trailing semicolon from the path then split it into an array using a double-quote as the delimiter, keeping the delimiter
If ($_ -eq '";') { # throw away a blank line
} ElseIf ($_.ToString().StartsWith('";')) { # if line starts with "; remove the "; and any trailing backslash
$NewPath += ($_.ToString().TrimStart('";')).TrimEnd('\')
} ElseIf ($_.ToString().StartsWith('"')) { # if line starts with " remove the " and any trailing backslash
$NewPath += ($_.ToString().TrimStart('"')).TrimEnd('\') #$_ + '"'
} Else { # split by semicolon and remove any trailing backslash
$_.ToString().Split(';') | ForEach-Object { If ($_.Length -gt 0) { $NewPath += $_.TrimEnd('\') } }
}
}
Return $NewPath
}
$myFile = 'desktop.ini'
Get-ENVPathFolders | ForEach-Object { If (Test-Path -Path $_\$myFile) { Write-Output "Found [$_\$myFile]" } }
I also blogged the answer with some details over at http://blogs.catapultsystems.com/chsimmons/archive/2018/01/30/parse-envpath-with-powershell
In addition to the 'where' (MS Windows) and 'which' (unix/linux) utilities, I have written my own utility which I call 'findinpath'. In addition to finding the executable that would be executed if handed to the command line interpreter (CLI), it will find all matches, returned in path-search order, so you can find path-order problems. In addition, my utility returns not just executables, but any file-specification match, to catch those times when a desired file isn't actually executable.
I also added a feature that has turned out to be very nifty; the -s flag tells it to search not just the system path, but everything on the system disk, known user-directories excluded. I have found this feature to be incredibly useful in systems administration tasks...
Here's the 'usage' output:
usage: findinpath [ -p <path> | -path <path> ] | [ -s | -system ] <file>
or findinpath [ -h | -help ]
where: <file> may be any file spec, including wild cards
-h or -help returns this text
-p or -path uses the specified path instead of the PATH environment variable.
-s or -system searches the system disk, skipping /d /l/ /nfs and /users
Writing such a utility is not hard and I'll leave it as an exercise for the reader. Or, if asked here, I'll post my script - it's in 'bash'.
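Since the script itself wasn't posted, here is a minimal sketch of the core idea in bash (a reconstruction, not the author's utility; the -p and -s options are omitted):
#!/bin/bash
# findinpath <filespec>: list every match for a (possibly wildcard) file
# spec in each PATH directory, in path-search order
IFS=':' read -ra dirs <<< "$PATH"
for dir in "${dirs[@]}"; do
    for match in "$dir"/$1; do   # $1 deliberately unquoted so wildcards expand
        [ -e "$match" ] && printf '%s\n' "$match"
    done
done
Invoke it as findinpath 'git*' with the pattern quoted, so the calling shell doesn't expand it first.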
Just for kicks, here's a one-liner PowerShell implementation:
function PSwhere($file) { $env:Path.Split(";") | ? { test-path $_\$file* } }
