How to get the oldest file in an FTP directory using a bash script

I have a working script that gets the full file list of an FTP directory and saves it to a local file with this:
curl -s -l ftp://username:password@ftpserver.com/directory/ > source.txt
Now, I need to sort this result by creation date instead of name. I only need to write the oldest file's name to the source.txt file. Is that possible?
Thank you.

To get the filename (and further information) of the file with the oldest modification date in a given directory, use lftp:
Example:
lftp -u anonymous,anonymous -e "ls -t; quit" ccrma-ftp.stanford.edu/pub | tail -n 1
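Note that ls -t prints whole listing lines (permissions, size, date, name), so if you only want the name itself you can keep the last field. A sketch, assuming the filename is the last whitespace-separated field and contains no spaces:

# oldest entry, name only (assumption: no spaces in filenames)
lftp -u anonymous,anonymous -e "ls -t; quit" ccrma-ftp.stanford.edu/pub | tail -n 1 | awk '{print $NF}'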

Finally, this command works for me: lftp -u user,password -e "cls --sort=date; quit" ftpserveraddress/Folder 2> /dev/null | tail -n 1
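Since the goal was to write only the oldest file's name into source.txt, redirecting that output should be the last step. A sketch building on the command above (cls prints bare names by default, so no field extraction is needed):

lftp -u user,password -e "cls --sort=date; quit" ftpserveraddress/Folder 2> /dev/null | tail -n 1 > source.txt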

Related

Terminal/Bash command to recursively cat each file in directory, and store first 100 lines in text file named the same as the file selected

As the title says, I have a directory in a remote server with a bunch of huge files. I just want to CAT the first 100 lines of each file in the directory, and store it in a .txt file named after the huge file that was "cat'd", in a local directory. Is this possible through one command? Or is a bash script necessary?
#!/usr/bin/bash
remote_dir="/home/gary/dir"
local_output_dir="/home/gary/data"

# List the remote directory; -Q quotes names containing special characters
dir_listing="$(ssh user@host ls -Q "$remote_dir")"
echo DIR: $dir_listing

# Split the listing on newlines only, not on spaces
IFS='
'
for file in $dir_listing
do
    echo "Processing $file..."
    # $file still carries the quotes added by ls -Q, which protects the
    # name when the remote shell re-parses it; head runs remotely so only
    # the first 100 lines cross the wire
    ssh -q user@host "cat ${remote_dir}/$file | head -100" > "${local_output_dir}/hun.$file"
done
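A variant that avoids changing IFS for the whole script is to stream the listing into a while read loop. This is only a sketch, using the same hypothetical user@host and paths as above, and it assumes filenames contain no embedded newlines:

#!/usr/bin/bash
remote_dir="/home/gary/dir"
local_output_dir="/home/gary/data"

# Read the remote listing line by line instead of word-splitting it
ssh user@host ls "$remote_dir" | while IFS= read -r file; do
    echo "Processing $file..."
    # -n stops the inner ssh from swallowing the loop's stdin;
    # head runs remotely so only 100 lines are transferred
    ssh -qn user@host "head -100 '$remote_dir/$file'" > "$local_output_dir/hun.$file"
done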

Checking file existence in Bash using commandline argument

How do you use a command line argument as a file path and check for file existence in Bash?
I have the simple Bash script test.sh:
#!/bin/bash
set -e
echo "arg1=$1"
if [ ! -f "$1" ]
then
echo "File $1 does not exist."
exit 1
fi
echo "File exists!"
and in the same directory, I have a data folder containing stuff.txt.
If I run ./test.sh data/stuff.txt I see the expected output:
arg1=data/stuff.txt
"File exists!"
However, if I call this script from a second script test2.sh, in the same directory, like:
#!/bin/bash
fn="data/stuff.txt"
./test.sh $fn
I get the mangled output:
arg1=data/stuff.txt
does not exist
Why does the call work when I run it manually from a terminal, but not when I run it through another Bash script, even though both are receiving the same file path? What am I doing wrong?
Edit: The filename does not have spaces. Both scripts are executable. I'm running this on Ubuntu 18.04.
The filename was getting an extra whitespace character added to it as a result of how I was retrieving it in my second script. I didn't note this in my question, but I was retrieving the filename from a folder listing over SSH, like:
fn=$(ssh -t "cd /project/; ls -t data | head -n1" | head -n1)
Essentially, I wanted to get the filename of the most recent file in a directory on a remote server. Apparently, head includes the trailing newline character. I fixed it by changing it to:
fn=$(ssh -t "cd /project/; ls -t data | head -n1" | head -n1 | tr -d '\n' | tr -d '\r')
Thanks to @bigdataolddriver for hinting at the problem likely being an extra character.
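Since $( ) already strips trailing newlines, the stray character from ssh -t is almost certainly a carriage return (the pseudo-terminal turns \n into \r\n). A shorter fix, as a sketch, is plain parameter expansion:

# strip a trailing carriage return, if any, without extra processes
fn=${fn%$'\r'}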

Correct Regex in SFTP bash script

I want to automate an SFTP process that transfers the last file created on the local server to a remote server.
On the local server, under "/Source/Path/", I have files named like below:
Logfile_2019-04-24
Logfile_2019-04-24_old.txt
This is my current script:
dyear=`date +'%Y' -d "1 day ago"`
dmonth=`date +'%b' -d "1 day ago"`
ddate=`date +%Y-%m-%d -d "1 day ago"`
HOST='192.168.X.X'
USER='user'
PASSWD='password'
localpath='/Source/Path/'$dyear'/'$dmonth'/'*$ddate*'.txt'
remotepath='/Destination/Path/'$dyear'/'$dmonth'/'
echo $localpath
echo $remotepath
export SSHPASS=$PASSWD
sshpass -e sftp $USER@$HOST << EOF
put '$localpath' '$remotepath'
EOF
When I do echo $localpath it prints the correct file but in the script I get this error:
Connecting to 192.168.X.X...
sftp> put '/Source/Path/2019/Apr/*2019-04-24*' '/Destination/Path/2019/Apr/'
stat /Source/Path/2019/Apr/*2019-04-24*: No such file or directory
What would the correct wildcard pattern be in this part *$ddate*'.txt' of the following line:
localpath='/Source/Path/'$dyear'/'$dmonth'/'*$ddate*'.txt'
in order to transfer the file "Logfile_2019-04-24_old.txt"?
Thanks in advance
Replace
put '$localpath' '$remotepath'
with
put "$(echo $localpath)" '$remotepath'
to force wildcard (*) replacement in your here-doc.
This does not work if your wildcard is replaced by multiple files.
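Another way to sidestep the here-doc quoting is to let the shell expand the glob into an array before sftp ever runs, so sftp only sees a literal path. A sketch, assuming exactly one file matches the pattern:

# the glob expands here, in the local shell, not inside sftp
files=( $localpath )
sshpass -e sftp $USER@$HOST << EOF
put ${files[0]} $remotepath
EOF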
I don't think you need a regex for this problem. You can get the latest file created in the directory by the following shell command and assign it to your localpath variable.
ls -t directoryPath | head -n1
latestfile=`ls -t /Source/Path/$dyear/$dmonth | head -n1`
localpath='/Source/Path/'$dyear'/'$dmonth'/'$latestfile''
remotepath='/Destination/Path/'$dyear'/'$dmonth'/'
If you are able to get the filename, source and destination directories properly, you can directly use scp to copy the file to the remote server:
sshpass -p $PASSWD scp $localpath $USER@$HOST:$remotepath
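If the here-doc quoting keeps getting in the way, sftp's batch mode reads commands from a file instead. A sketch, assuming a single file matches the pattern (untested against your server; -oBatchMode=no is needed so sshpass can still feed the password):

printf 'put %s %s\n' $localpath "$remotepath" > /tmp/batch.sftp
sshpass -e sftp -oBatchMode=no -b /tmp/batch.sftp $USER@$HOST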

Creating a concatenated file in unix then mailing that file to you all in the same script

I am trying to create a script which will concatenate all the out.* files in the directory /home/rfranklin/stackDump/ and then pipe that concatenated file to a mailx command - so I can mail it to myself
I've tried two methods so far and neither of them seems to be working. Hoping someone can tell me why!
So far in the /home/rfranklin/stackDump/ directory I have the files:
out.file1
out.file2
out.file3
out.file4
otherfile.txt
otherfile2.txt
First of all I tried to write a for loop:
#!/bin/bash
#
OUT_FILES_DIRECTORY="/home/rfranklin/stackDump/out.*"
for file in $OUT_FILES_DIRECTORY
do
cat $file > stack_dump_`date +%Y%m%d` | mailx -s stack_dump_`date +%Y%m%d` rfranklin@gmail.com
done
This returns:
Null message body; hope that's ok
Null message body; hope that's ok
Null message body; hope that's ok
Null message body; hope that's ok
And I receive 4 blank emails. BUT the concatenated file is created so I know something is working.
Next I tried to use a here document:
#!/bin/bash
#
bash <<'EOF'
cd /home/rfranklin/stackDump
cat out.* > stack_dump_`date +%Y%m%d` | mailx -s stack_dump_`date +%Y%m%d` rfranklin@gmail.com
EOF
This does not work for me either. Where am I going wrong?
Thanks
I don't see the point of creating a file here; you could just as easily pipe the output of cat to mailx:
cat /home/rfranklin/stackDump/out.* |
mailx -s "stack_dump_$(date +%Y%m%d)" rfranklin#gmail.com
If you prefer an attachment to content in the mail body:
cat /home/rfranklin/stackDump/out.* |
uuencode "stack_dump_$(date +%Y%m%d)" |
mailx -s "stack_dump_$(date +%Y%m%d)" rfranklin#gmail.com
You can use tee for this:
#!/bin/bash
d=$(date +%Y%m%d)
for file in /home/rfranklin/stackDump/out.*
do
cat "$file" | tee -a "stack_dump_$d" | mailx -s "stack_dump_$d" rfranklin#gmail.com
done
tee copies standard input to a file, as well as to standard output. The -a option appends to the file rather than overwriting it.
In your original version of the script, the > was redirecting the output of cat to the file, which meant that the pipe to mailx was empty.
I am assuming that your script doesn't run over more than one day, so I have moved the calls to date outside the loop.
From what I understand, you want to concatenate all files in a given directory into one file, and then mail that to yourself. Correct me if I'm wrong.
So first concatenate all files into one:
cat /home/rfranklin/stackDump/out.* > concatFile
Then mail it to yourself:
dat=$(date +%Y%m%d)
mail -s "stack_dump_$dat" rfranklin#gmail.com < concatFile
Edit
You can put it in a script:
dir=$1
cat "$dir"/out.* > concatFile
dat=$(date +%Y%m%d)
mail -s "stack_dump_$dat" rfranklin@gmail.com < concatFile
run it as so:
./script /home/rfranklin/stackDump
The concatFile will be created in your current directory.

Copy a list of files from a file

I have a file containing a list of files, one per line:
$ cat file_list
file1
file2
file3
I want to copy this list of files with FTP
How can I do that? Do I have to write a script?
You can turn your list of files into list of ftp commands easily enough:
(echo open hostname.host;
echo user username;
cat filelist | awk '{ print "put " $1; }';
echo bye) > script.ftp
Then you can just run:
ftp -s script.ftp
Or possibly (with other versions of ftp)
ftp -n < script.ftp
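The same thing works without a temporary script file by feeding a here-doc to ftp. A sketch, using the same placeholder hostname, username and filelist as above:

# -n suppresses auto-login so the user command can supply credentials
ftp -n <<EOF
open hostname.host
user username password
$(awk '{ print "put " $1 }' filelist)
bye
EOF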
Something along these lines - the somecommand depends on what you want to do - I don't get that from your question, sorry.
#!/bin/bash
# Iterate through lines in file
for line in `cat file.txt`; do
    # your ftp command goes here, e.g.:
    somecommand $line
done
edit: If you really want to pursue this route for multiple files (you shouldn't!), you can use the following command in place of somecommand $line:
ncftpput -m -u username -p password ftp.server.com /remote/folder $line
ncftpput probably also takes an arbitrary number of files to upload in one go, but I haven't checked it. Notice that this approach will connect and disconnect for every single file!
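For the record, ncftpput does accept several local files in one invocation (host, remote dir, then the files), so the whole list can go over a single connection. A sketch, assuming GNU xargs (for -a) and filenames without spaces:

# one connection for the whole batch instead of one per file
xargs -a file_list ncftpput -m -u username -p password ftp.server.com /remote/folder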
Thanks for the very helpful example of how to feed a list of files to ftp. This worked beautifully for me.
After creating my ftp script in Linux (CentOs 5.5), I ran the script with:
ftp -n < ../script.ftp
My script (with names changed to protect the innocent) starts with:
open <ftpsite>
user <userid> <passwd>
cd <remote directory>
bin
prompt
get <file1>
get <file2>
And ends with:
get <filen-1>
get <filen>
bye
