I'm running YCSB, which sends a generated workload to MongoDB and writes its results to standard output, which I am storing in the file outputLoad:
./bin/ycsb load mongodb -s -P workloads/workloada -p mongodb.database=ycsb > outputLoad
The -s parameter in the command tells it to print a client status report, which goes directly to my terminal. How can I get this status into a log file?
Redirect standard error (file descriptor 2) to a file.
./bin/ycsb [...options...] > outputLoad 2> mylog.log
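As a sketch of how the two streams separate (using a hypothetical stand-in function rather than the real ycsb command):

```shell
# Stand-in for ycsb: like ycsb -s, it writes results to stdout
# and the status report to stderr.
run_workload() { echo "workload result"; echo "status report" >&2; }

run_workload > outputLoad 2> mylog.log   # results and status in separate files
run_workload > outputLoad 2>&1           # or merge both into one file
```

Note that in the merged form, 2>&1 must come after the > redirection; the order is significant.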
I have a Go binary file called "runme" that successfully runs like so:
./runme encrypt --password=password < plaintext.txt > encrypted.txt
It successfully reads in a file called "plaintext.txt" and outputs an encrypted file called "encrypted.txt".
Now I would like to use the dlv debugger for Go to debug it like so:
dlv exec ./runme -- encrypt -password=password < plaintext.txt > encrypted.txt
However I get the following error message from the dlv debugger:
Stdin is not a terminal, use '-r' to specify redirects for the target process or --allow-non-terminal-interactive=true if you really want to specify a redirect for Delve
So I try again slightly differently:
dlv exec -r ./runme -- encrypt -password=password < plaintext.txt > encrypted.txt
But I get the exact same error message shown above. Then I try the following:
dlv exec --allow-non-terminal-interactive=true ./runme -- encrypt -password=password < plaintext.txt > encrypted.txt
This time I get a different error message:
Command failed: command not available
It seems like a simple thing, yet I am not able to do it in the debugger. What could I be doing wrong?
With help from @tkausl and @gopher I was able to figure it out. The solution is:
dlv exec -r stdin:plaintext.txt -r stdout:encrypted.txt ./runme -- encrypt -password=password
I have a script that automates the rebuilding of MongoDB databases for our servers:
#!/bin/sh
mongo local:host 127.0.0.1 mongodb-create-ourdatabase.js > validate.txt
mongoimport --host 127.0.0.1 --db ourdatabase --collection ourUser --file create-ourUser.js > validate.txt
The output of the first command, where the database is created, is written to the file, but the output of the second command, where the collection ourUser is created, goes to the screen.
What am I missing?
First, both calls create a new, empty validate.txt file, so the second call clobbers the first call's result. I doubt that this is what you want, so you should change the second > to >> to append to your log file.
Second, executables write output through two streams: standard output (aka stdout, used for normal output and results) and standard error (aka stderr, used for warnings and errors). It is not possible to tell which stream a message went to just by looking at the terminal.
To merge both streams and capture all process output, you have to redirect stderr into stdout, using 2&>1 (duplicate fd 2, stderr, onto fd 1, stdout):
mongo local:host 127.0.0.1 mongodb-create-ourdatabase.js 2&>1 > validate.txt
mongoimport --host 127.0.0.1 --db ourdatabase --collection ourUser --file create-ourUser.js 2&>1 >> validate.txt
Thanks for the response, Jean-Francois; unfortunately that did not work, but it was close. What worked was:
#!/bin/sh
mongo localhost:27017 mongodb-create-our-database.js > validate.txt 2>&1
mongoimport --host 127.0.0.1 --db ourdatabase --collection ourUser --file create-ourUser.js >> validate.txt 2>&1
Using 2&>1 had the script looking for a file named 2. I found an excellent explanation (scroll down to the first answer), which worked for me.
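The reason the placement of 2>&1 matters is that the shell processes redirections left to right. A small sketch, using a hypothetical stand-in command, illustrates the difference:

```shell
# Stand-in command that writes one line to each stream.
talk() { echo "to stdout"; echo "to stderr" >&2; }

talk > both.log 2>&1   # 1 -> both.log first, then 2 duplicates 1:
                       # both lines land in the file
talk 2>&1 > only.log   # 2 duplicates 1 while 1 still points at the terminal:
                       # stderr stays on screen, only stdout lands in only.log
```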
I'm trying to make a remote mysqldump and afterwards download it with rsync, which all works fine, but I also want to log the remote errors that I currently only see in the terminal output.
I mean errors like this: mysqldump: Got error: 1044: Access denied for user 'root'@'localhost' to database 'information_schema' when using LOCK TABLES
This is the important part of my code:
MYSQL_CMD="mysqldump -u ${MYSQL_USER} -p${MYSQL_PASS} $db -r /root/mysql_${db}.sql"
$SSH -p ${SSH_PORT} ${SSH_USER}@${SSH_HOST} "${MYSQL_CMD}" >> "${LOGFILE}"
In my research I only found solutions for getting the exit code and return values.
I hope someone can give me a hint, thanks in advance.
These error messages are being written to stderr. You can redirect it to a file using 2> or 2>>, just as you do for stdout with > and >>. E.g.:
ssh ... 2>/tmp/logerrors
Note there is no space between 2 and >. You can merge stderr into the same file as stdout by replacing your >> "${LOGFILE}" with
ssh ... &>> "${LOGFILE}"
Again, no space in &>, which can also be written >&.
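Note that &> and &>> are bash extensions; if the script runs under plain /bin/sh, the portable equivalent appends stdout first and then duplicates stderr onto it. A minimal sketch, with a stand-in function in place of the actual ssh call:

```shell
LOGFILE=run.log
# Stand-in for the remote command: one line per stream.
remote_cmd() { echo "dump ok"; echo "mysqldump: Got error" >&2; }

# Portable equivalent of: remote_cmd &>> "${LOGFILE}"
remote_cmd >> "${LOGFILE}" 2>&1
```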
I want to ask how to save wget or curl output that is written to the terminal.
For example:
wget -O - "some file im downloading"
Now the terminal shows me how much of the file has been downloaded and the current download speed. I want to know how to save all these changing values to a file.
wget always prints its status information to stderr (file descriptor 2), so you can redirect that stream to a file:
wget -O - "some file im downloading" >downloaded_file 2>wget_status_info_file
Channel 1 (stdout) is redirected to the file downloaded_file and stderr to wget_status_info_file.
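If you also want to watch the status while logging it, you can swap the streams so that only stderr goes through tee, keeping the downloaded data separate. A sketch using a hypothetical stand-in function instead of a real download:

```shell
# Stand-in for wget -O -: data on stdout, progress on stderr.
fake_download() { echo "file contents"; echo "50% 1.2M/s" >&2; }

# 2>&1 points stderr at the pipe, then > points stdout at the file,
# so tee receives only the status lines.
fake_download 2>&1 > downloaded_file | tee wget_status_info_file
```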
I am using this command to export.
export PGPASSWORD=${PASSWORD}
pg_dump -i -b -o -host=${HOST} -port=5444 -username=${USERNAME} -format=c -schema=${SCHEMA} --file=${SCHEMA}_${DATE}.dmp ${HOST}
I just want to know how I can include a log file so that I also get the logs.
I assume you mean you want to capture any errors, notifications, etc that are output by pg_dump in a file.
There is no specific option for this, but pg_dump will write these to STDERR, so you can easily capture them like this:
pg_dump -i -b -o ...other options... 2> mylogfile.log
In a shell, 2> redirects STDERR to the given file.
This advice is good for nearly any command line tool you are likely to find on a *nix system.
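Since pg_dump also returns a non-zero exit status on failure, the stderr log combines nicely with an exit-status check. A sketch with a hypothetical stand-in function in place of pg_dump:

```shell
# Stand-in for a failing pg_dump: complains on stderr and exits non-zero.
fake_pg_dump() { echo "pg_dump: error: connection failed" >&2; return 1; }

# Capture the errors in the log and react to the failure.
if ! fake_pg_dump 2> mylogfile.log; then
    echo "dump failed, see mylogfile.log"
fi
```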