I was wondering how to execute all the commands present in a text file at the same time in Linux.
Brief background:
I have created a text file with the following content:
nohup execute_command1
nohup execute_command2
.
.
.
nohup execute_command30
And now I want to execute all the commands present in the text file at the same time on a Linux server.
How do I do that?
Put & at the end of each line.
You have already created a file, so you can turn it into a script by adding a hash bang to the top of the file (for bash, use #!/usr/bin/env bash).
Then make the script executable by running chmod +x filename, and run it with ./filename.
This will run each of your commands in order; to run them all at the same time, put & at the end of each command (as mentioned by bib).
Your file should look like:
#!/usr/bin/env bash
command1 options &
command2 options &
....
commandn options &
All of the processes will be running in the background and the script will end. If these are long-running processes, you will need to find and kill each one once you are finished with it.
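If you need to stop them later, one option is to record each background PID as you start it. A minimal sketch (pids.txt is just an example filename, and the commands are the placeholders from the question):
#!/usr/bin/env bash
# Start each command in the background and remember its PID
nohup execute_command1 & echo $! >> pids.txt
nohup execute_command2 & echo $! >> pids.txt
# ...
nohup execute_command30 & echo $! >> pids.txt

# Later, to stop everything that was started:
# xargs kill < pids.txt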
I need some help:
(On macOS, bash shell)
If I run a .sh file which calls e.g. exit 1 (any exit code), my terminal session ends (and the iTerm2 tab/window closes).
I'm calling the script like this $ . myscript.sh
I'm pretty sure it should not behave like that, or at least it didn't a while ago.
Using:
. myscript.sh
You are actually running the script in the existing shell, i.e. "sourcing" the script. With exit at the end of the script, this means the terminal session itself will also exit.
Alternatively:
./myscript.sh
or
bash myscript.sh
This will run the script in a separate bash shell, so exit only ends that child shell and your terminal session stays open.
Instead of . myscript.sh you can run ./myscript.sh which will run it in a separate bash shell and will not exit the current session.
If you control the content of this .sh file and you do want to source the script, simply return 1 instead of exit 1, and use proper error handling.
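A minimal sketch of that idea (some_tool is just a placeholder for whatever check your script performs); the return ... || exit fallback makes the same line behave sensibly whether the file is sourced or executed:
#!/usr/bin/env bash
# When sourced, "return 1" hands control back to the calling shell instead of
# closing it; when executed directly, "return" fails outside a function, so
# the "|| exit 1" fallback takes over.
if ! command -v some_tool >/dev/null 2>&1; then
    echo "some_tool not found" >&2
    return 1 2>/dev/null || exit 1
fi
echo "all good"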
I'm a novice at writing bash scripts. (Feel free to suggest a better title if the one I've given is incorrect.)
I want to run a jar file in a loop using a bash script, and write the output of the java command to a file.
Bash file datagenerate.sh
#!/bin/bash
echo Total iterations are 500
for i in {1..500}
do
the_output="$(java -jar data-generator.jar 10 1 mockData.csv data_200GB.csv)"
echo "$the_output"
echo "Iteration $i processed"
done
no_of_lines="$(wc -l data_200GB.csv)"
echo "${no_of_lines}"
I'm running the above script using the command nohup sh datagenerate.sh > datagenerate.log &. I want to run this script in the background, so that even if I log out from ssh it keeps running and the output goes into datagenerate.log.
But when I run the above command and then hit Enter or close the terminal, the process ends. Only Total iterations are 500 gets logged to the output file.
Let me know what I'm missing. I followed the following two links to create the above shell script: link-1 & link2.
nohup sh datagenerate.sh > datagenerate.log &
nohup should work this way without using the screen program, but depending on your distro, your sh shell might be linked to dash.
Just make your script executable:
chmod +x datagenerate.sh
and run your command like this:
nohup ./datagenerate.sh > datagenerate.log &
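You can check what sh actually resolves to on your system, for example:
ls -l /bin/sh          # shows the symlink target, e.g. dash on Debian/Ubuntu
readlink -f /bin/sh    # prints the resolved path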
You should check this out:
https://linux.die.net/man/1/screen
With this program you can close your shell while a command or script is still running. It will not be aborted, and you can pick the session up again later.
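A typical screen workflow looks roughly like this (the session name is just an example):
screen -S mysession            # start a named session
./datagenerate.sh > datagenerate.log
# detach with Ctrl-A then d; the script keeps running inside the session
screen -ls                     # list running sessions
screen -r mysession            # reattach later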
I'm trying to execute commands sequentially through Python.
My aim is to do something like this:
command1
calculate something
command2
...
I want the CMD window to remain open after executing 'command1', since 'command2' depends on 'command1'.
I've tried these answers with no results:
This one gives me the error:
ValueError: write to closed file
while executing two communicate commands.
Python Popen - how to execute commands in nested sub shell using python
Execute Commands Sequentially in Python
Thanks in advance!
I think you want this:
import os
os.system('start cmd /k "command1 & command2"')
Unless you want it to quit after executing both of the commands:
import os
os.system('start cmd /c "command1 & command2"')
If you want to add more commands, add & after command2 (inside the quotes) and write your commands.
The first example will run both of the commands in the same window, but won't close it.
The second example will run both of the commands in the same window, and will close it afterwards.
I have a bash script start.sh which calls another run.sh, which takes me to another prompt where I have to delete a file file.txt and then exit out of that prompt.
When I call run.sh from inside start.sh, I see the prompt and I believe that it deletes file.txt, but the inner/new prompt waits for me to exit out of it while the script is running, meaning it needs manual intervention to proceed. How do I avoid that in bash?
In Python I can use Popen and get it going but not sure about bash.
EDIT: I would rather like to know what command to provide to exit out of the shell (generated from running "run.sh") so I can go back to the prompt where "start.sh" was started.
Etan: To answer your question
VirtualBox:~/Desktop/ > ./start
company#4d6z74d:~$    -> this is the new shell
company#4d6z74d:~$ logout    -> I pressed Control-D here so the script could continue.
Relevant part of start.sh:
/../../../../run.sh (this is the one that takes us to the new $ prompt)
echo "Delete file.txt "
rm -f abc/def/file.txt
You can run run.sh in the background using &. In start.sh, you would invoke the script via /path/run.sh &. Now, start.sh will exit without waiting for run.sh to finish (which is running in the background).
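For example, the relevant part of start.sh could look roughly like this (the path to run.sh is a placeholder):
#!/bin/bash
/path/to/run.sh &              # runs in the background; start.sh does not wait for its prompt
echo "Delete file.txt"
rm -f abc/def/file.txt
# If start.sh should not finish before run.sh does, add:
# wait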
When I run nohup some_command &, the output goes to nohup.out; man nohup says to look at info nohup which in turn says:
If standard output is a terminal, the command's standard output is appended to the file 'nohup.out'; if that cannot be written to, it is appended to the file '$HOME/nohup.out'; and if that cannot be written to, the command is not run.
But if I already have one command running under nohup with its output going to nohup.out, and I want to run another nohup command, can I redirect the output to nohup2.out?
nohup some_command &> nohup2.out &
and voila.
Older syntax for Bash version < 4:
nohup some_command > nohup2.out 2>&1 &
For some reason, the above answer did not work for me; I did not return to the command prompt after running it as I expected with the trailing &. Instead, I simply tried with
nohup some_command > nohup2.out&
and it works just as I want it to. Leaving this here in case someone else is in the same situation. Running Bash 4.3.8 for reference.
The methods above will overwrite your output file's data whenever you run the nohup command.
To append output to a user-defined file, you can use >> in the nohup command.
nohup your_command >> filename.out &
This command will append all output to your file without removing the old data.
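To keep two nohup'd commands from fighting over the same file, you can give each its own log and follow it live (the command and file names here are just examples):
nohup command1 >> command1.log 2>&1 &
nohup command2 >> command2.log 2>&1 &
tail -f command2.log    # watch the output as it is written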
Since file handles point to inodes (which are stored independently of file names) on Linux/Unix systems, you can rename the default nohup.out to any other filename at any time after starting nohup something &. So one could also do the following:
$ nohup something&
$ mv nohup.out nohup2.out
$ nohup something2&
Now something adds lines to nohup2.out and something2 to nohup.out.
My start.sh file:
#!/bin/bash
nohup forever -c php artisan your:command >>storage/logs/yourcommand.log 2>&1 &
There is only one important thing: the first command must be "nohup", the second command must be "forever", the "-c" parameter belongs to forever, and the "2>&1 &" part is for "nohup". After running this line you can log out from your terminal, log back in, and run "forever restartall". Voilà... You can restart, and you can be sure that if the script halts then forever will restart it.
I <3 forever