Bash command route malfunction - bash

Given this (among other code):
compile_coffee() {
    echo "Compile COFFEESCRIPT files..."
    i=0
    for folder in ${COFFEE_FOLDER[*]}
    do
        for file in $folder/*.coffee
        do
            file_name=$(echo "$file" | awk -F "/" '{print $NF}' | awk -F "." '{print $1}')
            file_destination_path=${COFFEE_DESTINATION_FOLDER[${i}]}
            file_destination="$file_destination_path/$file_name.js"
            if [ -f $file_path ]; then
                echo "+ $file -> $file_destination"
                $COFFEE_CMD $COFFEE_PARAMS $file > $file_destination #FAIL
                #$COFFEE_CMD $COFFEE_PARAMS $file > testfile
            fi
        done
        i=$i+1
    done
    echo "done!"
    compress_javascript
}
And just to clarify: everything except the #FAIL line works flawlessly (if I'm doing something wrong, just tell me). The problem I have is:
The line executes and does what it has to do, but it doesn't write the file that I put in "file_destination".
If I delete a folder in that path (it's relative to this script, see below), bash throws an error saying that the folder doesn't exist.
If I make the folder again, no errors, but no file either.
If I change $file_destination to "testfile", it creates the file with the correct contents.
The $file_destination path is OK -as you can see, my script echoes it-.
If I echo the entire line, copy the exact command with params and execute it in a shell in the same directory as the script, it works.
I don't know what is wrong with this; I've been wondering about it for two hours...
Script output (real paths):
(alpha)[pyron@vps herobrine]$ ./deploy.sh compile && ls -l database/static/js/
===============================
=== Compile ===
Compile COFFEESCRIPT files...
+ ./database/static/coffee/test.coffee -> ./database/static/js/test.js
done!
Linking static files to django staticfiles folder... done!
total 0
Complete command:
coffee --compile --print ./database/static/coffee/test.coffee > ./database/static/js/test.js
What am I missing?
EDIT: I've made some progress on this.
In the shell, if I deactivate the Python virtualenv the script works, but if I call deactivate from the script it says command not found.

Assuming the destination file names contain no special characters such as spaces, the directories exist, etc., I'd try adding 2>&1, e.g.
$COFFEE_CMD $COFFEE_PARAMS $file > testfile 2>&1
Compilers may put the desired output and/or compilation messages on stderr instead of stdout. You may also want to use the full path to coffee, e.g. /usr/bin/coffee, instead of just the compiler name.
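For example, a minimal sketch reusing the variable names from the question (the error-log name is just a placeholder):
# Send both the compiler's stdout and stderr into the destination file:
$COFFEE_CMD $COFFEE_PARAMS "$file" > "$file_destination" 2>&1
# Or keep the compiled output clean and collect diagnostics in a separate log
# (the log file name here is arbitrary):
$COFFEE_CMD $COFFEE_PARAMS "$file" > "$file_destination" 2>> coffee_errors.log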

Found that the problem wasn't the bash script itself. A few lines later the deploy script performs Django's collectstatic step. I noticed that up to that line the files were there, and then I read that collectstatic has a caching system. A very weird one IMO, since I had to delete all the static files and start from scratch to get the script working.
So... the problem wasn't the bash script but the Django cache system. I'm not giving reputation to myself anyway.
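For reference, a minimal sketch (not taken from the deploy script; it assumes a standard manage.py layout) of what "starting from scratch" can look like:
# Remove Django's previously collected static files and re-collect them;
# --clear wipes STATIC_ROOT before copying, --noinput skips the confirmation.
python manage.py collectstatic --clear --noinput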
The full deploy script is here: https://github.com/pyronhell/deploy-script-boilerplate and everyone is welcome to improve it.
Cheers.

Related

Cannot append to a file from bash script

If I run only this, it works as expected:
user_conf=( "USER_CONFIG_FILE=user_conf.sh"
    "USER_CONFIG_FILE=user_conf.sh" "USER_CONFIG_FILE=user_conf.sh" )
USER_CONFIG_FILE="user_conf.sh"
echo "${user_conf[@]}"
for i in "${user_conf[@]}"; do
    echo "$i" >> "$USER_CONFIG_FILE"
done
echo "User config file initiated."
However, when I run it in my install.sh, nothing is appended to the desired config file. The same code as above is at the bottom of install.sh. You can also see there that I first tried to do it with echo-appends in multiple places (the commented-out lines), but only the two echo statements above the separator line (===) worked - not the ones below that line. I really have no idea what causes it. What have I missed?
EDIT:
Don't worry about running install.sh (without args).
It only creates one entry in your .bash_aliases and creates a directory 'wenv' in your HOME. You can delete both afterwards.
You just have to figure out what directory you're in when that code executes. make_base_dir calls make_log_file, which does a cd but never changes back to the previous directory.
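For instance, a minimal sketch (make_base_dir and make_log_file are the function names mentioned above; the paths and the log name are placeholders) of two ways to avoid leaving the script in a different directory:
make_log_file() {
    # Option 1: do the cd in a subshell, so the caller's directory is untouched
    ( cd "$log_dir" && touch install.log )
}
make_base_dir() {
    mkdir -p "$base_dir"
    # Option 2: pushd/popd around anything that has to change directory
    pushd "$base_dir" > /dev/null
    make_log_file
    popd > /dev/null
}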

Change the output directory on command line For Loop

I have been working with CSVKIT to build out a solution for parsing fixed-width data files to CSV. I have put together the code below to iterate through all the files in a directory, but it also places the output files back into the same directory they came from. As best practice I believe in using an 'IN' folder and an 'OUT' folder. I am also running this on the command line on a Mac.
for x in $(ls desktop/fixedwidth/*.txt); do x1=${x%%.*}; in2csv -f fixed -s desktop/ff/ala_schema.csv $x > $x1.csv; echo "$x1.csv done."; done
I feel that I am missing something, or that I need to change something within the snippet shown below, but I just can't put my finger on it.
x1=${x%%.*}
Any help on this would be wonderful and I thank you in advance.
When you run,
in2csv -f fixed -s desktop/ff/ala_schema.csv $x > $x1.csv
the output of your program will be written to the file ${x1}.csv.
So, just set the correct path for the output file:
output_dir=/path/to/your/dir/
base=${x##*/}                      # drop the input directory from the path
output_file=${output_dir}${base%%.*}.csv
in2csv -f fixed -s desktop/ff/ala_schema.csv "$x" > "$output_file"
But you should create output_dir before running this code; otherwise you may get an error that the directory doesn't exist.
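Putting it together, a minimal sketch of the whole loop with separate IN and OUT folders (the OUT folder name is just an assumption):
in_dir=desktop/fixedwidth
out_dir=desktop/fixedwidth_out
mkdir -p "$out_dir"        # create the OUT folder before writing to it
for x in "$in_dir"/*.txt; do
    base=${x##*/}          # file name without the directory part
    in2csv -f fixed -s desktop/ff/ala_schema.csv "$x" > "$out_dir/${base%%.*}.csv"
    echo "$out_dir/${base%%.*}.csv done."
done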

bash is zipping entire home

I am trying to back up all the world* folders from /home/mc/server/ and drop the zipped file in /home/mc/backup/.
#!/bin/bash
moment=$(date +"%Y%m%d%H%M")
backup="/home/mc/backup/map$moment.zip"
map="/home/mc/server/world*"
zipping="zip -r -9 $backup $map"
eval $zipping
The zipped file is created in the backup folder as expected, but when I unzip it, it contains the entire /home dir. I am running this bash script in two ways:
Manually
Using user's crontab
Finally, if I echo $zipping, it correctly prints the command that I need to run. What am I missing? Thank you in advance.
There's no reason to use eval here (and no, justifying it on DRY grounds because you want to both log a command line and subsequently execute it does not count as a good reason IMO).
Define a function and call it with the appropriate arguments:
#!/bin/bash
moment=$(date +"%Y%m%d%H%M")
zipping () {
    output=$1
    shift
    zip -r -9 "$output" "$@"
}
zipping "/home/mc/backup/map$moment.zip" /home/mc/server/world*
(I'll admit, I don't know what is causing the behavior you report, but it would be better to confirm it is not somehow specific to the use of eval before trying to diagnose it further.)
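If the point of the variable was to both log the command line and run it, a minimal sketch (not from the answer above) that does the same with an array and no eval:
# Keep the command words in an array; print them for logging, then run them.
cmd=(zip -r -9 "/home/mc/backup/map$moment.zip" /home/mc/server/world*)
printf '%q ' "${cmd[@]}"; echo   # log the exact, safely quoted command line
"${cmd[@]}"                      # execute it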

bash downloading files ftp url

I'm working on a function that parses a file containing a list of desired file names to download. I'm using curl to download them, but is there a better way? The output is shown, which is okay, but is there a way for the output not to be shown? Is there a way to handle exceptions if a file isn't found, and to move on to the next file to download if something happens? You might want to ignore what I do to build the proper link name; it was a pain. The directory structure follows a pattern based on the name of the file.
#!/bin/bash
# reads from the file and assigns to $MYARRAY and download to Downloads/*
FILENAME=$1
DLPATH=$2
VARIABLEDNA="DNA"
index=0
function Download {
    VARL=$1
    #VARL=$i
    echo $VARL
    VAR=${VARL,,}
    echo $VAR
    VAR2=${VAR:1:2}
    echo $VAR2
    HOST=ftp://ftp.wwpdb.org/pub/pdb/data/structures/divided/pdb/
    HOSTCAT=$HOST$VAR2
    FILECATB='/pdb'
    FILECATE='.ent.gz'
    NOSLASH='pdb'
    DLADDR=$HOSTCAT$FILECATB$VAR$FILECATE
    FILECATNAME=$NOSLASH$VAR$FILECATE
    echo $DLADDR
    curl -o Downloads/$FILECATNAME $DLADDR
    gunzip Downloads/$FILECATNAME
}
mkdir -p Downloads
while read line ; do
    MYARRAY[$index]="$line"
    index=$(($index+1))
done < $FILENAME
echo "MYARRAY is: ${MYARRAY[*]}"
echo "Total pdbs in the file: ${index}"
for i in "${MYARRAY[@]}"
do
    Download $i
done
I'm trying to write the log file to a folder that I made before downloading, but the file doesn't seem to end up in that folder. It gets written to the directory of the script being executed, and it isn't written correctly either. My syntax might be wrong?
curl -o Downloads/$FILECATNAME $DLADDR >> Downloads\LOGS\$LOGFILE 2>&1
Okay, first of all, I'm not sure if I got it all right, but I'll give it a try:
I'm using curl to download them but is there a better way?
I don't know a better one. You could use wget instead of curl, but curl is much more powerful.
The output is shown which is okay but is there way for the output not be shown?
You could use nohup (e.g. nohup curl -o Downloads/$FILECATNAME $DLADDR). If you don't redirect the output to a specific file, it will be stored in nohup.out. By adding an ampersand (&) at the end of your command you can also let it run in the background, so the command keeps executing even if you lose the connection to the server.
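As a hypothetical example (assuming the script above is saved as download_pdbs.sh and is called with the list file and download path as its two arguments):
# Run the downloader detached: all output goes to download.log instead of the
# terminal, and the job keeps running even if the connection drops.
nohup ./download_pdbs.sh id_list.txt Downloads > download.log 2>&1 &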
Is there way to handle exceptions if the file isn't found and move on to the next file to be download if something happens?
You could use something like test ... exists. Or you could just check your nohup.out for errors or anything else with a grep.
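Alternatively, a minimal sketch (not part of the original answer) using curl's own options inside the Download function: -s silences the progress output, -f makes curl treat server errors as failures, and the exit status lets the loop skip a file that couldn't be fetched:
if curl -sf -o "Downloads/$FILECATNAME" "$DLADDR"; then
    gunzip "Downloads/$FILECATNAME"
else
    echo "download failed for $DLADDR, skipping" >&2
fi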
I hope this helped you in any way.
cheers

unix tee command not working as expected

While I'm using the following command at the Unix command prompt everything works fine and the log file is created fine:
ls -l|echo "[`date +%F-%H-%M-%S`] SUCCESS - FILES"|tee -a logger2.log
but using the same thing inside the shell script it shows the error
No such file or directory.
I'm not getting what the problem is here!!
If I read between the lines: you want a list of files, followed by a date and a message? (Note that in your original command the output of ls -l is piped into echo, which ignores its stdin, so the file listing is simply discarded.)
try:
{ ls -l ; echo "[$(date "+%F-%H-%M-%S")] SUCCESS - FILES" ; } |tee -a logger2.log
That should give you, in 'logger2.log' in the current directory, the lines:
.................... file
.................... file2
(i.e. the list of all files and dirs in the current dir, EXCEPT those starting with a ".")
[2013-12-26-..-..-..] SUCCESS - FILES
Please note that, if you put nothing at the top of the script, it could be started by a different shell than the one you use when testing. It depends on what/who invokes the script: if it is a crontab, it will probably be interpreted by "sh" (or by "bash" in "sh-compatibility" mode)...
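For instance, a minimal sketch of making the interpreter explicit (the log name is taken from your command):
#!/bin/bash
# Force bash as the interpreter (this must be the first line of the script),
# so the behaviour is the same interactively and from cron.
{ ls -l ; echo "[$(date "+%F-%H-%M-%S")] SUCCESS - FILES" ; } | tee -a logger2.log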
Please tell us what the above gives you as output, how you start the script (by a user, at the prompt? or via a crontab [which one: /etc/crontab? or a user's crontab? which user?], etc.), and what error messages you see.
