find command in foreach doesn't work in tcsh

Assume that the command find dir -name "script" is correct and prints all locations of script under dir.
I want to do the same in a tcsh script:
foreach f (find dir -name "script")
echo "$f"
$f #execute this script
end
but it doesn't work.

I think you forgot the backticks around your find command:
foreach f (`find dir -name script`)
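For completeness, a minimal sketch of the corrected loop, assuming each found script should be run with tcsh (executing $f directly would require the execute bit to be set):
foreach f (`find dir -name script`)
    echo "$f"
    tcsh "$f"    # run each script that find located
end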

set locations = `find dir -name "script"`
echo "#\!/bin/tcsh" > scriptRunList
foreach location ( $locations )
echo $location >> scriptRunList
end
chmod +x scriptRunList
./scriptRunList
I think that does what you want and generates a list of the scripts that were run as a side effect.
tcsh $f would also work, with no side-effect files. I'm in the habit of keeping log files, as I usually need to know what was run on which version at what time. Old habits.
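For instance, a minimal sketch of that logging habit (the runlog file name is hypothetical):
#!/bin/tcsh
foreach location (`find dir -name script`)
    echo "`date`: running $location" >> runlog
    tcsh "$location"
end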

Related

Start a shell script with Cygwin from a batch file that is in a different directory than C:

I am trying to make my life easier by scripting. Since the script I want to use is a shell script (I couldn't make it work in PowerShell because of a problem reading the XML file, so a colleague wrote a shell script that I am using now), I am trying to call it from a batch (.cmd) file.
So here is the idea of how it should work:
We have a .sh script that removes all the timestamps from the XML files in the directory the .sh file is in.
Code:
#!/bin/bash
for i in `find . -name "*.xml"`
do
echo $i; sed -i '/UCOMPSTAMP/d' $i
done
for i in `find . -name "*.xml"`
do
echo $i; sed -i '/DAT name="UTIMESTAMP"/d' $i
done
for i in `find . -name "*.xml"`
do
echo $i; sed -i '/DAT name="U_INTF"/d' $i
done
for i in `find . -name "*.xml"`
do
echo $i; sed -i '/DAT name="U_SVCUSE"/d' $i
done
for i in `find . -name "*.xml"`
do
echo $i; sed -i '/DAT name="U_FSEQ"/d' $i
done
This script works and deletes the timestamps from the XML files in that directory.
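As an aside, the five passes could be collapsed into a single sed invocation per file; a minimal sketch, assuming the same five patterns:
#!/bin/bash
# Delete all five timestamp-related lines in one pass per file
for i in `find . -name "*.xml"`
do
    echo $i
    sed -i -e '/UCOMPSTAMP/d' \
        -e '/DAT name="UTIMESTAMP"/d' \
        -e '/DAT name="U_INTF"/d' \
        -e '/DAT name="U_SVCUSE"/d' \
        -e '/DAT name="U_FSEQ"/d' $i
done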
I have a directory U:\bla\bla\compare
I also export both XML files that I am going to compare into that directory.
Let's say XML_LIVE.xml and XML_TEST.xml.
Now I have a batch file (called "execute_sh.cmd") that tries to run the .sh file:
@echo off
C:\cygwin64\bin\bash -l /cygdrive/u/bla/compare/01_remove_timestamps.sh
pause
Right now it doesn't do anything. It also doesn't say that it can't find the path. I tried using ./01_remove_timestamps.sh
U:\bla\Compare\01_remove_timestamps.sh
but then I get the error that it couldn't find the file. If I try to execute the command in Cygwin, I have to change the directory with
cd /cygdrive/u/bla/compare/
and then
./remove_timestamps.sh
and this executes the shell script, so why is this not possible from the .cmd?
And my final .cmd (called "execute_all")
call "%0\..\execute_sh.cmd"
has this code in it, so I just have to start this one .cmd and everything runs automatically.
This entire thing works if I put all these .cmd and .sh files and the XML files in my home directory in Cygwin -> C:\cygwin64\home\myName
The code would be
@echo off
C:\cygwin64\bin\bash -l 01_remove_timestamps.sh
pause
But I want to use the D: drive and a specific directory there. I hope my problem is clear.
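One likely explanation, offered as an assumption: bash -l starts a login shell in the Cygwin home directory, and the script works on find ., i.e. on the current directory, so it silently does its work in the wrong place. A minimal sketch of a batch file that sets the working directory inside bash first (paths as in the question):
@echo off
C:\cygwin64\bin\bash -l -c "cd /cygdrive/u/bla/compare && ./01_remove_timestamps.sh"
pause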

Issue with shell script running on crontab: \n doesn't work

I am working on a cron job to check and recover ark files if required. I need to get the biggest of the .ark files, and if TheIsland.ark is smaller, the job should automatically back it up and copy the biggest one over it. While I have this working outside of crontab, one part of the script fails.
Which is:
actualmap=$(find $PWD -type f -printf '%p\n' -name "*.ark"| sort -nr | head -1)
If I remove the \n it actually runs, but then sort cannot separate the results, as they are not on separate lines.
The output I get on cron job with \n is:
/srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/recovery.sh (which is the recovery script)
The same line of code ran in terminal produces the correct output of:
/srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/SavedArks/TheIsland_NewLaunchBackup.bak
Without \n using crontab I get:
/srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/SavedArks/TheIsland_27.06.2019_21.46.20.ark/srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/SavedArks/TheIsland_28.06.2019_15.15.34.ark
I have attached the full code, which works when run manually.
#!/bin/bash
export DISPLAY=:0.0
##ARK Map Recovery Script
cd /srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/SavedArks
#Check file size of current ark map
file=TheIsland.ark
echo $file
currentsize=$(wc -c <"$file")
echo $currentsize
#Find biggest map file.
actualmap=$(find $PWD -type f -printf '%p\n' -name "*.ark"| sort -nr | head -1)>/srv/daemon-data/da4aaa1b-0ce9-46d2-bd60-5f599cc089ae/ShooterGame/Saved/SavedArks/log.txt
echo $PWD
echo $actualmap
biggestsize=$(wc -c < "$actualmap")
echo $biggestsize
if [ $currentsize -ge $biggestsize ]; then
    echo No map recovery required as over $biggestsize bytes
else
    echo Uh Oh! size is under $biggestsize bytes Attempting map recovery
    echo Checking for Backup dir and creating if necessary
    mkdir -p BackupFiles
    #Move old map into backup dir in the saved location
    echo Moving old Map File to backup dir
    mv $file BackupFiles
    #Stop server using docker commands
    echo Stopping servers
    docker kill da4aaa1b-0ce9-46d2-bd60-5f599cc089ae
    #Copy biggest map file with correct name
    echo Copying backup file
    cp $actualmap $file
fi
Using the -printf option to the find command is not required here. -print will do just fine.
I obtain the result you want (find returning the found filenames, one per line) with this:
find $PWD -type f -name "*.ark" -print
With -printf, %p gives you the filename anyway.
From man find: -print True; print the full file name on the standard output, followed by a newline.
Option -print already does what you want to do.
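So the assignment in the script becomes (a sketch, quoting $PWD as a minor hardening; find evaluates its expression left to right, so in the original command -printf, placed before -name, printed every regular file, which is how recovery.sh could show up):
actualmap=$(find "$PWD" -type f -name "*.ark" -print | sort -nr | head -1)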

Bash/shell search for subfolder in current dir and cd into it

How would I go about searching whether the folder "test" is anywhere within my current dir and, once the first occurrence is found, automatically cd into it? Using the terminal on Mac (bash).
You can write:
cd "$(find . -type d -name test -print -quit)"
(Caveat: this works for test, but will not work for any filename ending in newlines. Fortunately, I've never heard of anyone having a real filename that ended in a newline — it's possible, but never done — and the filename is an argument under your control. So I can't imagine that this will be a problem.)
The below will work if that dir exists:
cd `find . -name test -type d`
For bash, I guess the below should work:
cd $(find . -name test -type d)
You could use the globstar option of bash which searches directories recursively.
shopt -s globstar
for i in **/*test; do
    if [[ -d $i ]]; then
        cd "$i"
        break
    fi
done
You can try:
if [ -d "test" ]; then cd test; fi
Or simply:
[ -d "test" ] && cd test
Or just do ...
cd test
... if test is a directory, then you just cd'ed into it.
If it is not ... so what.

How to get files and subfolders of a folder

I am trying to create a simple bash script which will echo all the files from a folder, including subfolders. The following is my code, but the output I am getting is just ls $fromFolder.
#! /bin/bash
fromFolder="~/proj/activex"
toFolder="~/proj/outgoing"
files='ls $fromFolder'
for file in $files
do
echo $file
done
Thanks
No need to use the ls command here. You can simply replace your for loop with:
for file in ~/proj/outgoing/*
do
echo $file
done
find $fromFolder -print
will print all of the files and subdirectories in $fromFolder.
This lists regular files:
find $fromFolder -type f -print
This lists directories:
find $fromFolder -type d -print
In your code, this line has a problem:
files='ls $fromFolder'
$fromFolder will never be expanded to its value by bash because of the single quotes.
Double quotes would let the shell expand the fromFolder variable, but to actually run ls and capture its output you need command substitution:
files=$(ls "$fromFolder")
Although anubhava's solution is better
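Putting the pieces together, a minimal sketch of a corrected script (using find, as above, and leaving the tilde unquoted in the assignment so that it expands):
#!/bin/bash
fromFolder=~/proj/activex
# -print emits every file and subfolder under $fromFolder, one per line
find "$fromFolder" -print | while read -r file
do
    echo "$file"
done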

Winter bashing with find

Recently (that is, in winter, a few days ago) I wrote a simple script which packs some folders; the script is listed below:
#!/bin/bash
for DIR in `find -name "MY_NAME*" -type d`
do
tar -zcvf $DIR.tar.gz $DIR &
done
echo "Packing is done" > packing.txt
It works fine, except that it searches for MY_NAME* in every sub-directory of the folder where it runs.
Because the MY_NAME* folders contain lots of files, and packing takes long hours, I want to limit the time lost: I want the find command to find those MY_NAME* directories only within the folder where the script is running (without sub-directories). Is that possible with the find command?
If you want it only in the folder you are in, don't use find. Try this:
for DIR in MY_NAME*/
do
    DIR="${DIR%/}"    # strip the trailing slash the glob leaves on
    tar -zcvf "$DIR".tar.gz "$DIR" &
done
echo "Packing is done" > packing.txt
It seems you want to use the -maxdepth option on the find command (placed before the tests, or GNU find will warn about option ordering):
find -maxdepth 1 -name "MY_NAME*" -type d
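Combined with the original loop, a minimal sketch (with a wait added so the message is not written before the background tar jobs finish):
#!/bin/bash
# Only match MY_NAME* directories in the current folder, not below it
for DIR in `find . -maxdepth 1 -name "MY_NAME*" -type d`
do
    tar -zcvf "$DIR".tar.gz "$DIR" &
done
wait    # let the background tar jobs complete
echo "Packing is done" > packing.txt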
