I want to ask how to save wget or curl output that is being written to the terminal.
For example:
wget -O - "some file im downloading"
The terminal shows me how much of the file has been downloaded and the current download speed.
I want to know how to save all of these changing values to a file.
The status information of wget is always printed to stderr (channel 2). So you can redirect that channel to a file:
wget -O - "some file im downloading" >downloaded_file 2>wget_status_info_file
Channel 1 (stdout) is redirected to the file downloaded_file and stderr to wget_status_info_file.
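If you also want to watch those changing values live while they are being written, you can follow the status file as it grows. A minimal sketch (the URL is a placeholder; note that wget switches to line-by-line "dot" progress output when stderr is not a terminal):

# start the download in the background; status lines go to the log
wget -O - "http://example.com/file" > downloaded_file 2> wget_status_info_file &

# follow the status file as it is written, here or from another terminal
tail -f wget_status_info_file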
I'm running YCSB, which sends a generated workload to MongoDB. It writes to standard output, which I am storing in the file outputLoad.
./bin/ycsb load mongodb -s -P workloads/workloada -p mongodb.database=ycsb > outputLoad
The -s parameter tells it to print a client status report. That report is printed directly to my terminal. How can I get this status into a log file?
Redirect standard error (file descriptor 2) to a file.
./bin/ycsb [...options...] > outputLoad 2> mylog.log
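If you would rather have the workload output and the status report interleaved in one file, you can merge stderr into stdout instead. A sketch of the same command, with paths and parameters as in the question:

# 2>&1 sends the status report (stderr) to the same place as stdout
./bin/ycsb load mongodb -s -P workloads/workloada -p mongodb.database=ycsb > outputLoad 2>&1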
I want to download a bunch of .txt.gz files over FTP. I've written this shell script. How do I get all the files on the server without specifying each file?
Some code:
#!/bin/bash
ftp -i -n <<Here
open ftplink.com
user Username password
bin
get XXX_xxxx_mp.txt.gz
get XXX_xxxx_mp.txt.gz
close
quit
Here
Use wget instead:
wget "ftp://user:pass@example.com/dir/*_mp.txt.gz"
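If you need to stay with the ftp client, its mget command accepts wildcards, and the -i flag you already pass suppresses the per-file prompt. A sketch using the same placeholder host and credentials as your script:

#!/bin/bash
# -i: no per-file prompting for mget, -n: no auto-login
ftp -i -n <<Here
open ftplink.com
user Username password
bin
mget *_mp.txt.gz
close
quit
Here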
I am using this command to export:
export PGPASSWORD=${PASSWORD}
pg_dump -i -b -o --host=${HOST} --port=5444 --username=${USERNAME} --format=c --schema=${SCHEMA} --file=${SCHEMA}_${DATE}.dmp ${HOST}
I just want to know how I can include a log file in it so that I can get logs as well.
I assume you mean you want to capture, in a file, any errors, notifications, etc. that pg_dump outputs.
There is no specific option for this, but pg_dump will write these to STDERR, so you can easily capture them like this:
pg_dump -i -b -o ...other options... 2> mylogfile.log
In a shell, 2> redirects STDERR to the given file.
This advice is good for nearly any command line tool you are likely to find on a *nix system.
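Put together with the command from the question, a minimal sketch (${DATABASE} is assumed to hold the name of the database to dump, where the question passes ${HOST}):

export PGPASSWORD=${PASSWORD}
# the dump itself goes to --file; errors and notices go to the log
pg_dump -i -b -o --host=${HOST} --port=5444 --username=${USERNAME} \
    --format=c --schema=${SCHEMA} \
    --file=${SCHEMA}_${DATE}.dmp ${DATABASE} 2> pg_dump_${DATE}.log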
Using Bash scripting, I'm trying to upload the content of variable into an FTP server.
The variable is $HASHED, and it contains a hashed password:
echo $HASHED
The output of the above command is: M0eSl8NR40wH
I need to do the following:
Create a time/date stamped file in the FTP server (i.e. PASSWORD_18_02_2014)
The file needs to have the same content as the $HASHED value (i.e. PASSWORD_18_02_2014 needs to have M0eSl8NR40wH inside it).
Trying Curl, I couldn't get it working using the following:
UPLOAD="curl -T $HASHED ftp://192.168.0.1/passwords/ --user username:password"
$UPLOAD
Your help is very much appreciated.
Something like this might help you (tested on Linux Mint 13):
#!/bin/bash
FILENAME=PASSWORD_`date +%d_%m_%Y`
echo "$HASHED" >"$FILENAME"
ftp -n your_ftp_site <<EOF
user your_user_name_on_the_ftp_server
put $FILENAME
EOF
rm $FILENAME
A few caveats:
You have to export HASHED, e.g. when you set it, set it like this: export HASHED=M0eSl8NR40wH
The above assumes you will be running this from a terminal and can type in your password when prompted
You may have to add some cd commands after the line that starts "user" and before the line that starts "put", depending on where you want to put the file on your ftp server
Don't forget to make the script executable:
chmod u+x your_command_script_name
You can put the password after the user name on the line that starts "user", but this carries a big risk: anyone who can read the script can discover your password on the ftp server. At the very least, make the bash command script readable only by you:
chmod 700 your_command_script_name
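A run of the whole thing might then look like this (upload_hash.sh is a hypothetical name for the script above):

export HASHED=M0eSl8NR40wH
chmod 700 upload_hash.sh    # once, to make it executable and private
./upload_hash.sh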
Please try this:
HASHED="M0eSl8NR40wH"
echo "$HASHED" | curl --silent --show-error --upload-file \
-ftp://192.168.0.1/passwords/$(date +PASSWORD_%d_%m_%Y) --user username:password
Where:
--silent : suppresses the progress meter
--show-error : shows errors, if any
--upload-file - : reads the upload from stdin
The target file name is given as part of the URL.
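To check that the upload worked, you can list the directory afterwards: with an FTP URL ending in a slash, curl prints a directory listing (same placeholder credentials as above):

curl --silent --user username:password ftp://192.168.0.1/passwords/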
I have the following script (let's call it move_site.sh) that copies a website directory structure to another server
#!/bin/bash
scp -r /usr/local/apache2/htdocs/$1 http@$2:/local/htdocs 1>$1$2.out 2>&1
So, calling it from the command line, I pass it the website directory name and the destination server, as such:
nohup ./move_site.sh site1 server1 &
However, in the resulting file, named site1server1.out, there are only stderr messages, if any.
Can someone tell me how I can get the names of the copied files and directories included in the output file, so that I have some kind of record?
Thanks.
A quick try :
Maybe it is because scp doesn't print anything to stdout when everything goes fine.
Have a try: run your scp command outside the script; most probably you won't see anything on stdout (and redirecting nothing to $1$2.out still gives you nothing :)).
I don't think it is possible with scp, but with rsync you can track on stdout what has been transferred. So replacing scp -r with rsync -r -v -e ssh should do the trick (at least if you can go for rsync instead of scp).
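Applied to move_site.sh, the line would become something like this (a sketch; rsync's -e takes the remote shell as its argument, and -v names every transferred file on stdout, which now ends up in the .out file):

#!/bin/bash
rsync -r -v -e ssh /usr/local/apache2/htdocs/$1 http@$2:/local/htdocs 1>$1$2.out 2>&1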