I have to write a single command that runs the date, ls, and pwd utilities at the same time and redirects their output to a text file. I can't figure this one out: running a single command is fine, but running three at once is the issue.
You should execute the commands in a subshell and then redirect to your output file.
(date; ls; pwd) > /path/to/file
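If you'd rather avoid the subshell, a brace group produces the same combined output in the current shell (note the required spaces and the trailing semicolon); this is just an optional variation:
{ date; ls; pwd; } > /path/to/file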
Hope this helps.
I googled this command but found nothing about it.
grep -m 1 "\[{" xxx.txt > xxx.txt
However, when I typed this command, no error occurred.
There was also no output from this command.
Can anyone explain how this command works?
This command reads from and writes to the same file, but not in a left-to-right fashion. In fact > xxx.txt runs first, emptying the file before the grep command starts reading it. Therefore there is no output. You can fix this by storing the result in a temporary file and then renaming that file to the original name.
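A minimal sketch of that temporary-file approach, reusing xxx.txt from your command (the .tmp suffix is just an example):
grep -m 1 "\[{" xxx.txt > xxx.txt.tmp && mv xxx.txt.tmp xxx.txt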
PS: Some commands, like sed, have an output file option which works around this issue by not relying on shell redirects.
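For instance, GNU sed's -i option rewrites the file in place via its own temporary file, so no shell redirect is needed (a generic substitution shown here, not your grep):
sed -i 's/old/new/g' xxx.txt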
I'm trying to run several sets of commands in parallel on a few remote hosts.
I've created a script that constructs these commands, and then writes the output in a local file, something along the lines of:
ssh <me>@<ip1> "command" 2> ./path/to/file/newFile1.txt &
ssh <me>@<ip2> "command" 2> ./path/to/file/newFile2.txt &
ssh <me>@<ip2> "command" 2> ./path/to/file/newFile3.txt;
...(the same repeats itself, with new commands and new file names)...
My issue is that, when my script runs these commands, I am getting the following errors:
bash: ./path/to/file/newFile1.txt: No such file or directory
bash: ./path/to/file/newFile2.txt: No such file or directory
bash: ./path/to/file/newFile3.txt: No such file or directory
...
These files do NOT exist yet; they are supposed to be created by the redirection. That being said, the directory paths are valid.
The strange thing is that, if I copy and paste the whole big command, it works without any issue. I'd rather have it automated though ;).
Any ideas?
Edit - more information:
My filesystem is the following:
- home
  - User
    - Desktop
      - Servers
        - Outputs
        - ...
I am running the bash script from home/User/Desktop/Servers.
The script creates the commands that need to be run on the remote servers. First things first: the script creates the directories where the output files will be stored.
outputFolder="./Outputs"
...
mkdir -p "${outputFolder}/f${fileNumb}"
...
The script then continues to create the commands that will be called on the remote hosts; their respective outputs will be placed in the created directories.
The directories are there. Running the commands from the script gives me the errors, yet printing the commands and then copying them into a terminal in the same location works for some reason. I have also tried giving the full path to the directory, still the same issue.
Hope I've been a bit clearer.
If this is the exact error message you get:
bash: ./path/to/file/newFile1.txt: No such file or directory
Then you'll note that there's an extra space between the colon and the dot, so it's actually trying to open a file called " ./path/to/file/newFile1.txt" (without the quotes).
However, to accomplish that, you'd need to use quotes around the filename in the redirection, as in
something ... 2> " ./path/to/file/newFile1.txt"
Or the first character would have to be something other than a regular space: a non-breaking space perhaps, possibly something an editor might insert if you hit Alt-Space or the like.
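If you want to hunt for such an invisible character, one option (assuming the script is UTF-8 encoded; the script name here is hypothetical) is to search for the non-breaking space's byte sequence:
grep -n $'\xc2\xa0' yourscript.sh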
I don't believe you've shown enough to correctly answer the question.
This doesn't look like a problem with ssh, but with the way you are calling the (ssh) commands.
You say that you are writing the commands into a file... presumably you are then running that file as a script. Could you show the code you use to do that? I believe that's your problem.
I suspect you have made a false assumption about the way the working directory changes when you run a script. It doesn't. You are listing relative paths, so it's important to know what they are relative to. That is the most likely reason it works when you copy and paste it... you are executing from a different working directory.
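As a quick sketch using the directories from your edit, the same relative path resolves differently depending on the working directory:
cd /home/User/Desktop/Servers && ls ./Outputs   # finds the directory
cd /home/User && ls ./Outputs                   # No such file or directory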
I am new to bash scripting and was building my script based on another one I had seen. I was "running" the command by simply calling the variable where the command was stored:
$cmd
Solved by using:
eval $cmd
instead. My bad, should have given the full script from the start.
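For anyone else hitting this, a minimal illustration of the difference (the file name is just an example): plain expansion of $cmd splits the string into words but never interprets the redirection operators, whereas eval re-parses it as shell code.
cmd='echo hello 2> ./out.txt'
$cmd         # prints: hello 2> ./out.txt   (redirection treated as literal arguments)
eval "$cmd"  # prints: hello, and the redirection actually happens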
I have a piece of third-party software that accepts command-line arguments. I want to redirect its output to a file. I have found that, for some inexplicable reason, the code hangs if I try:
./run_third_part.py &> log
but it works if
./run_third_part.py
I believe that redirecting the output is interfering with the way the program reads its command-line arguments, although other ideas are welcome. How can I isolate the program from the redirection? (I was thinking about wrapping it in some sort of parentheses.)
Probably the script is waiting for input on an interactive prompt. The easiest way around this is usually to give it some input:
./run_third_part.py < /dev/null &> log
Can you try creating a subshell and running the script:
bash$ `./run_third_part.py` &> log
Please note that ` (backtick) is not ' (single quote).
I'm trying to use a CGI script to run a find and replace command on a specific text file.
I currently have a CGI script (foo.sh) which then executes a non-CGI shell script (bar.sh). In the non-CGI shell script (bar.sh), I'm able to perform a number of simple bash commands such as wget, mkdir and cd, and I'm also able to execute a .js file with the standard dot-slash bash syntax.
However, I can't get any find and replace commands to work when executed with CGI. I've tried sed, awk and perl, all of which work perfectly when used either directly on the command line or if I execute bar.sh from the command line. But once I try to execute the CGI script from the browser, the find and replace commands no longer work.
The syntax I've tried is below. Any suggestions appreciated.
sed -i 's/foo/bar/g' text.txt
{ rm text.txt && awk '{gsub("foo", "bar", $0); print}' > text.txt; } < text.txt
perl -p -i -e 's/foo/bar/g' text.txt
Just a few thoughts:
- If run by the web server, the script will probably be executed as a different user, which could lead to permission-related problems.
- Maybe the awk/sed etc. commands are not in the PATH used by the web server process (try using an absolute path here as well).
- Is there anything in the web server's error log?
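For example, on a Debian-style Apache setup (the exact log path varies by distribution and web server), you could watch the error log while reloading the page:
tail -f /var/log/apache2/error.log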
Thanks for the input, Michael, and sorry for the late response. You were correct in that it was a permissions issue, caused by the fact that the CGI script runs not as me but as the "www-data" user.
The problem occurred because that user didn't have write privileges to execute a sed find and replace command on the target folder (in this case a directory within /usr/share/).
I changed permissions on the target folder to full read/write for everyone, and now the script runs successfully.
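For reference, the change was roughly along these lines (the path is only illustrative; granting write access to just the www-data user would be a tighter fix than making the folder world-writable):
chmod -R a+rw /usr/share/<target-folder>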
Thanks again.
I have a program (that I did not write) which is not designed to read commands from a file. Entering commands on STDIN is pretty tedious, so I'd like to automate it by writing the commands in a file for re-use. Trouble is, if the program hits EOF, it loops infinitely trying to read the next command, dropping an endless torrent of menu options on the screen.
What I'd like to be able to do is cat a file containing the commands into the program via a pipe, then use some sort of shell magic to have it switch from the file to STDIN when it hits the file's EOF.
Note: I've already considered using cat with the '-' for STDIN. Unfortunately (I didn't know this before), piped commands wait for the first program's output to terminate before starting the second program -- they do not run in parallel. If there's some way to get the programs to run in parallel with that kind of piping action, that would work!
Any thoughts? Thanks for any assistance!
EDIT:
I should note that my goal is not only to prevent the system from hitting the end of the commands file. I would like to be able to continue typing in commands from the keyboard when the file hits EOF.
I would do something like
(cat your_file_with_commands; cat) | sh your_script
That way, when the file with commands is done, the second cat will feed your script with whatever you type on stdin afterwards.
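A toy demonstration of the idea (the file name and the use of plain sh are made up for the example):
printf 'echo first\necho second\n' > commands.txt
(cat commands.txt; cat) | sh   # runs the two canned commands, then keeps reading commands you type until Ctrl-D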
Same as Idelic's answer, with simpler syntax ;)
cat your_file_with_commands - | sh your_script
I would think expect would work for this.
Have you tried using something like tail -f commandfile | command? I think that should pipe the lines of the file to command without closing the file descriptor afterwards. Use -n to specify the number of lines to be piped if tail -f doesn't catch all of them.
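For example, to send the whole file from its first line and keep the pipe open afterwards (reusing the placeholder names above):
tail -n +1 -f commandfile | command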