Wildcard in bash script

I have a bash script to retrieve files from FTP.
The files have a date string as one part of the filename, but also numbers that change with every file. I want to download the files based on the date.
This is my code; I only need the wildcard trick, the FTP part of the script already works.
filename=$(echo $TIMESTAMP'0***vel.radar.h5')
The stars stand for 3 digits whose values I can't predict, so I would use a wildcard for them.
Thank you

It sounds like you want to handle multiple files, but your script can only handle one file at a time. Furthermore, because you specified FTP, it sounds like the files are on the FTP server, in which case local filename expansion will not help.
You probably want to use the ftp client's mget command to download multiple files matching a pattern on the remote side. You also want to include $TIMESTAMP as part of the pattern. I'd suggest something like this:
ftp remote-hostname <<EOF
cd path/to/log/files
prompt
mget ${TIMESTAMP}0???vel.radar.h5
bye
EOF
This uses a here-document (<<EOF to EOF on a line by itself) to supply input text to the ftp command. It will expand the variable $TIMESTAMP so it becomes part of the mget command, e.g. if $TIMESTAMP was 12345, the ftp command will be told mget 123450???vel.radar.h5.
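If you want to see exactly what the client will be told before running it, you can (as a quick sketch, with a made-up timestamp) replace ftp remote-hostname with cat:
TIMESTAMP=12345
cat <<EOF
cd path/to/log/files
prompt
mget ${TIMESTAMP}0???vel.radar.h5
bye
EOF
This prints the commands with the variable already expanded, including mget 123450???vel.radar.h5.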

With the ? character you match not only digits but any single character. If you want to match only digits, replace the ??? with [0-9][0-9][0-9].
Example:
If you have the following files:
$ ls
0123vel.h5 0333vel.h5 033vel.h5 0pecvel.h5
this ls shows only the intended files:
$ ls 0[0-9][0-9][0-9]vel.h5
0123vel.h5 0333vel.h5
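For comparison, the looser ? pattern also matches the non-numeric name:
$ ls 0???vel.h5
0123vel.h5 0333vel.h5 0pecvel.h5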

Related

Error processing variables with special characters in bash script

I need help finding a way for a bash script to read file names that contain special characters. The user starts the script, but if the folder or the file has special characters, the script fails with an error. I have tried several options I found online, but I have not been able to make them work with the script.
The script takes user input with the read command:
read -r -p "Enter directory name : " var1
If the user input is "account&orders", the script fails because of the & character: it won't find the directory or file.
The script then looks for files with specific extensions under the input folder and copies them to a different directory. The issue I am running into is that some of those files or directories have special characters, and the script cannot expand the variables and cannot find the files.
The script uses a for loop to check every file in the directory, and if a file's name has a special character, the loop fails.
Example file names:
file1#depot.rct
file2&logrecord.rct
cd $var1
ls: cannot access '/sharepool/comunityshare//'\''account.&.orders'\''': No such file or directory
line 141: cd: '/sharepool/comunityshare//'\''account.&.orders'\''': No such file or directory
I have tried wrapping the variable in single quotes and adding backslashes, but the variable is still not readable.
Please note that I am not a coder or developer; I know some basic Linux commands, and I am trying to make this work while a better process is developed. I appreciate your help with this.
I was able to solve the issue using this line:
filename=$(echo "$filename" | sed 's/[&()#+*#!%^'\''^]/\\&/g')
That inserts a backslash before each special character, so account.&.orders becomes account.\&.orders.
Thank you for your help and support.
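For what it's worth, the more usual fix is to quote the variable everywhere it is used, which avoids the escaping entirely. A minimal sketch, assuming the same read prompt and the .rct files from the question (the destination directory here is a placeholder):
#!/bin/bash
read -r -p "Enter directory name : " var1
# Quoting "$var1" keeps a name containing &, #, spaces, etc. as one word
cd -- "$var1" || exit 1
for file in *.rct
do
    # /tmp/destination is a hypothetical target directory
    cp -- "$file" /tmp/destination/
done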

How to create one output file for each file passed to a loop in bash?

I have files that I pass to a command in a loop, like so:
for file in /file/list/*
do
command
done
I wish to save the output that would have gone to standard out on each iteration to a text file in my working directory. Currently I am trying this:
for file in /file/list/*
do
command | tee "$file_command output.txt"
done
What I expect to see are new files created in my current directory, named file1.txt_commandoutput.txt, file2.txt_commandoutput.txt, etc., with the output of the command saved in a different file for each input file. However, I get only one file, called ".txt", which can't be opened by any standard software on Mac. I am new to bash scripting, so help would be much appreciated!
Thanks.
Your problem comes from the variable name you're using:
"$file_command output.txt" looks for a variable named file_command (the space and the dot cannot be in a variable name, but alphanumeric characters and the underscore all can), and that variable is unset.
What you're looking for is "${file}_command output.txt", with braces to make the variable name explicit.
You have two issues in your script.
First, the wrong parameter/variable is expanded (file_command instead of file) because it's followed by a character that can be interpreted as part of the name (the underscore, _). To fix it, enclose the parameter name in braces, like this: ${file}_command (see Shell Parameter Expansion in bash manual).
Second, even with the variable name expansion fixed, the file won't be created in your working directory, because $file holds an absolute pathname (/file/list/name). To fix it, you'll have to strip the directory from the pathname. You can do that with either the basename command, or even better with a modified shell parameter expansion that strips the longest matching prefix, like this: ${file##*/} (again, see Shell Parameter Expansion, section on ${parameter##word}).
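As a quick illustration of that expansion:
$ file=/file/list/file1.txt
$ echo "${file##*/}"
file1.txt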
All put together, your script now looks like:
#!/bin/bash
for file in /file/list/*
do
command | tee "${file##*/}_command output.txt"
done
Also, to just save the command output to a file without printing it in the terminal, you can use a simple redirection instead of tee, like this: command > "${file##*/}_com...".
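That variant in full:
#!/bin/bash
for file in /file/list/*
do
    # > sends the output straight to the file instead of the terminal
    command > "${file##*/}_command output.txt"
done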
Another approach, if your input names are listed in a file rather than matched by a glob, is to read them line by line:
$ ls
file
$ cat > file
one
two
three
$ while read this; do touch "$this"; done < ./file
$ ls
file one three two

scp multiple files in a shell script

I have a list of directory names.
I want to scp into a remote machine, go into each of my directory names and copy a file back to my local computer.
So far I have:
while read line
do
scp remote_machine:/home/$line/$line.dat ./local
done < file_with_directory_names.txt
I have authorisation keys set up so that I don't have to enter the password each time, but this method logs in to the remote machine for every file it transfers. I imagine that there is a much better way than this.
You can specify multiple files in a single scp argument by separating them with spaces; you just need to make sure it's one argument to scp itself. This should work in your case:
scp "remote_machine:$(
sed 's,.*,/home/&/&.dat,' file_with_directory_names.txt | xargs)" ./local
The sed command sticks the /home/ prefix and name.dat suffix on each line; the xargs outputs all the resulting pathnames on a single line separated by spaces. Plug that all into the source argument after the remote_machine: part, all inside double quotes so it's still a single argument to scp, and you're good to go.
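For instance, if file_with_directory_names.txt contained the (hypothetical) names alpha and beta, the command substitution would produce:
$ sed 's,.*,/home/&/&.dat,' file_with_directory_names.txt | xargs
/home/alpha/alpha.dat /home/beta/beta.dat
so the command that actually runs is scp "remote_machine:/home/alpha/alpha.dat /home/beta/beta.dat" ./local.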

Shell Script Not Finding File

Hello, I am trying to write a simple shell script, for use in a cron job, that copies a backup archive of website files to a remote server via FTP.
The script below works when I type the file name in by hand, but with the date and filename specified as a variable it reports that it can't find ".tar.gz", as if it were ignoring the first part of the filename.
I would be grateful if someone could tell me where I am going wrong.
#!/bin/sh
NOW=$(date +"%F")
FILE="$NOW_website_files.tar.gz"
# set the local backup dir
cd "/home/localserver/backup_files/"
# login to remote server
ftp -n "**HOST HIDDEN**" <<END
user "**USER HIDDEN**" "**PASSWORD HIDDEN**"
cd "/backup_files"
put $FILE
quit
END
This is because it is looking for a variable named NOW_website_files, which does not exist, and thus the resulting file name evaluates to .tar.gz.
To solve it, do:
#!/bin/sh
NOW=$(date +"%F")
FILE="${NOW}_website_files.tar.gz"
       ^   ^
instead of
FILE="$NOW_website_files.tar.gz"
This way it will concatenate the variable $NOW with the _website_files.tar.gz text.
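You can see the difference directly in the shell (the date shown is just illustrative):
$ NOW=$(date +"%F")
$ echo "$NOW_website_files.tar.gz"
.tar.gz
$ echo "${NOW}_website_files.tar.gz"
2023-09-15_website_files.tar.gz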
You could do this:
FILE=$(date +"%F_website_files.tar.gz")
instead of this:
NOW=$(date +"%F")
FILE="$NOW_website_files.tar.gz"
IMPORTANT
By the way, consider adding binary to your FTP script, as you are clearly PUTting a binary file and you don't want CR/LF translation to occur in binary files...
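A sketch of the same here-document with binary mode enabled:
ftp -n "**HOST HIDDEN**" <<END
user "**USER HIDDEN**" "**PASSWORD HIDDEN**"
binary
cd "/backup_files"
put $FILE
quit
END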

How do I use a variable in an FTP shell script?

Using a handy script that I found for FTP-ing, I modified it to my own use, and it works, but too well, or too literally.
#!/bin/sh
HOST='10.0.1.110'
USER='myName'
PASSWD='myPass'
FILE='*.sql' # WILDCARD NOT WORKING - Takes literal string of '*.sql'
# Stripped unrelated code
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
prompt
binary
cd Desktop/SweepToDiskBK
mput /home/myAcct/SQLbackups/"$FILE" "$FILE"
quit
END_SCRIPT
exit
That is, the file that gets put is named *.sql, and it replaces any previous version, instead of producing file1.sql, file2.sql, etc. In the original script, they were doing a put instead of an mput, with a single file named text.txt. I've also tried changing the single quotes after FILE to double quotes, and got the same result. Can someone let me in on the 'trick' to using variables for CLI FTP-ing?
Thanks in advance,
LO
I'd try this:
FILE=*.sql
without any quotes (quoting the assignment is harmless, though, since the shell doesn't expand wildcards in assignments anyway), and:
mput /home/myAcct/SQLbackups/$FILE
with just one $FILE and no quotes around it. Inside a here-document, quote characters are not removed, so with "$FILE" the ftp client receives the quote characters literally as part of the pattern; without them, mput gets the plain pattern *.sql and expands it itself. If you need to upload the files with put instead, you'll have to do it one by one with some sort of a loop.
To check what the script does, change ftp -n $HOST to cat. Also try the same commands in a plain interactive ftp session. Do they work?
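For example, a quick preview (same script with cat substituted for ftp, and the login and stripped parts omitted):
#!/bin/sh
FILE='*.sql'
# cat prints exactly what the ftp client would have received on stdin
cat <<END_SCRIPT
prompt
binary
cd Desktop/SweepToDiskBK
mput /home/myAcct/SQLbackups/$FILE
quit
END_SCRIPT
Note that $FILE expands inside the here-document, but the shell does not glob there, so mput receives the pattern *.sql and expands it against the local files itself.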
