Scenario:
I have a pipeline in a bash script and a list of processes along with their arguments. I want to run a Python script after the execution of each process (executable) in the pipeline if that process is in my list.
(I use Python 2.7)
My proposed solution:
Use a Python wrapper script. I have replaced every executable in the pipeline with my custom Python script (a sketch follows the list below), which:
1) checks whether the process is in the list and, if so, sets FLAG=True
2) executes the process via the original executable using subprocess.Popen(process.command, shell=True).communicate()
3) if FLAG==True, does something afterwards.
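A minimal sketch of what such a wrapper might look like (the WATCHED set, the argument handling, and the post-processing hook are illustrative assumptions, not the actual script):
#!/usr/bin/env python
import subprocess
import sys

WATCHED = set(["P1", "P2"])   # hypothetical list of watched executables

def main():
    command = sys.argv[1:]    # the original executable and its arguments
    flag = bool(command) and command[0] in WATCHED

    # run the original executable unchanged
    subprocess.Popen(" ".join(command), shell=True).communicate()

    if flag:
        # placeholder for the "do something" step
        pass

if __name__ == "__main__":
    main()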
Problem:
With the current solution, when I run the processes using
subprocess.Popen().communicate(), they execute separately,
and the output of the inner process (child) never reaches the outer process (parent).
For example:
#!/bin/bash
Mean=`P1 $Image1 -M`
P2 "$Image2" $Mean -F
The output value of Mean is not available when the second line executes.
The second line ends up being executed like:
subprocess.Popen("P2 $Image2 \nP1 $Image1 -M -F", shell=True).communicate()
Therefore, it returns an error!
Is there a better way in Python to execute processes like this?
Please let me know if there is any other suggestion for this scenario (I'm a complete beginner in bash).
There's no need to use bash at all.
Assuming modern Python 3.x:
#!/usr/bin/env python3
import subprocess
import sys

image1 = sys.argv[1]
image2 = sys.argv[2]

# text=True makes stdout a str instead of bytes
p1 = subprocess.run(['P1', image1, '-M'], check=True, capture_output=True, text=True)
p2 = subprocess.run(['P2', image2, p1.stdout.strip(), '-F'], check=True, capture_output=True, text=True)
print(p2.stdout)
Note that we pass p1.stdout.strip() where the mean value is needed among P2's arguments.
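If the script above is saved as, say, pipeline.py (the name is only illustrative) and made executable, the whole bash snippet reduces to ./pipeline.py "$Image1" "$Image2".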
Related
I'm trying to use pexpect to test an application which spawns a podman container. I'm using sendline(), but its argument is only sent to the child and not executed, as if there were no 'return'.
Once I call child.interact(), all previously sent content is executed at once, but I cannot use interact() in my code.
Any idea what to change so that the child's bash process executes its input after sendline()?
Using pexpect 4.8.0 from PyPI with python3-3.11.1-1.fc37.x86_64 on Fedora 37.
import pexpect
l = open('/tmp/pexpect_session.log', 'wb')
child = pexpect.spawn('podman run --rm -ti fedora bash', logfile=l)
# few seconds delay
child.sendline('echo hello world;')
print(child.buffer)
At this point child.buffer contains only b'' and the logfile has only the content I sent via sendline(), not the output of the command itself.
If I run child.interact() at this point the echo command is executed.
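For reference, a minimal sketch of the usual expect/sendline pattern; the '# ' prompt pattern is an assumption about the fedora image's default root prompt, and the rest is illustrative rather than a guaranteed fix:
import pexpect

logfile = open('/tmp/pexpect_session.log', 'wb')
child = pexpect.spawn('podman run --rm -ti fedora bash', logfile=logfile)

# wait for the container's shell prompt instead of sleeping;
# expect() is also what makes pexpect read from the pty at all
child.expect('# ', timeout=60)

child.sendline('echo hello world')
child.expect('# ')    # wait for the next prompt
print(child.before)   # everything before that prompt: the echoed command and its output

child.sendline('exit')
child.expect(pexpect.EOF)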
I want to execute some kind of bash script in Robot Framework.
In a terminal I use this command:
bash /home/Documents//script.sh --username=root --password=hello --host=100.100.100.100 --port=400 - --data='{"requestId":1,"parameters":{"name":"check","parameters":{"id":"myID"}}}'
and it works
In the robot script I try:
Running script
    ${result} =    Run Process    bash    /home/Documents//script.sh    "username\=root"    "password\=hello"    "host\=100.100.100.100"    "port\=400"    "data\='{"requestId":1,"parameters":{"name":"check","parameters":{"id":"myID"}}}'"    shell=True    stdout=stdout.txt
    Log To Console    ${result}
    Log    ${result}
    Log    ${result.stdout}
    Log    ${result.stderr}
But I get Missing required arguments: username, password, host, port.
The process doesn't recognise the arguments.
How to pass script arguments in Robot Framework with Process Library?
Please show examples; I have already checked the Process Library documentation on specifying command and arguments, but I don't understand it.
After sleeping on it I found the solution:
Running script
    ${result} =    Run Process    bash    /home/Documents//script.sh    username\=root    password\=hello    host\=100.100.100.100    port\=400    data\='{"requestId":1,"parameters":{"name":"check","parameters":{"id":"myID"}}}'    shell=True    stdout=stdout.txt
The options should be unquoted, but the = signs should be escaped with a backslash (\=).
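(The escaping matters because Robot Framework would otherwise interpret anything of the form name=value as a named argument to the Run Process keyword instead of passing it through to the script.)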
My requirement is to run multiple shell scripts at the same time.
After searching on Google, I concluded that I can append "&" to the command to run it in the background, like:
sh file.sh &
The thing is, I have a for loop which generates the values and passes them as runtime parameters to the shell script:
sample code:
declare -a arr=("1" "2")
for ((i=0;i<${#arr[@]};++i));
do
sh fileto_run.sh ${arr[i]} &
done
This successfully triggers fileto_run.sh in parallel, but then it just seems to hang. Imagine I have an echo statement in the script; the following is how the 'hang' looks:
-bash-x.x$ 1
2
Until I press Ctrl+C, the execution doesn't seem to exit.
I thought of using a break statement, but that breaks the loop.
Am I doing anything wrong?
Please do correct me.
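For reference, a minimal sketch of the usual pattern (assuming fileto_run.sh simply echoes its argument): background each invocation with & and then wait, so the parent script only returns once every child has finished:
#!/bin/bash
declare -a arr=("1" "2")

for ((i = 0; i < ${#arr[@]}; ++i)); do
    sh fileto_run.sh "${arr[i]}" &    # run each instance in the background
done

wait    # block until all background jobs have finished
echo "all background jobs finished"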
I have an issue with executing commands in Python.
Problem is:
Our company has bought commercial software that can be used through either a GUI or a command-line interface. I have been assigned the task of automating it as much as possible. First I thought about using the CLI instead of the GUI, but then I ran into a problem executing multiple commands.
Now, I want to start the CLI version of that software with arguments and then keep executing commands in its menu (I don't mean running the script with arguments again; once the initial command has executed, the software opens a menu, and I want to execute the software's own commands inside that menu in the background). Then I want to redirect the output to a variable.
I know I must use subprocess with PIPE, but I haven't managed to get it working.
import subprocess
proc=subprocess.Popen('./Goldbackup -s -I -U', shell=True, stdout=subprocess.PIPE)
output=proc.communicate()[0]
proc_2 = subprocess.Popen('yes\r\n/dir/blabla/\r\nyes', shell=True, stdout=subprocess.PIPE)
# This is what I want to execute inside the first subprocess
Set stdin=PIPE if you want to pass commands to a subprocess via its stdin:
#!/usr/bin/env python
from subprocess import Popen, PIPE
proc = Popen('./Goldbackup -s -I -U'.split(), stdin=PIPE, stdout=PIPE,
universal_newlines=True)
output = proc.communicate('yes\n/dir/blabla/\nyes')[0]
See Python - How do I pass a string into subprocess.Popen (using the stdin argument)?
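(communicate() writes the given string to the process's standard input, closes it, waits for the process to finish, and returns a (stdout, stderr) tuple.)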
I am writing a Python script that will help me submit homework assignments. What I want to do is pass the login credentials I have stored in a file to a submission script, after which I will hand control over to the user for further input. I have come up with the shell command cat login_credentials.txt /dev/stdin | python3 submit_hw0.py. I am almost finished; I just don't know how to execute that shell command from Python, where submit_hw0.py is replaced by an arbitrary Python file whose name I have stored in a variable named submission_script. I have also tried:
import subprocess
subprocess.call("cat login_credentials.txt /d")
However, python3 just closes with no output.
Any help would be appreciated,
Kind regards, Kabelo Moiloa.
Take a look at the Popen class of the subprocess module (link). You can pass stdin and stdout parameters to it. That should solve your problem.
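A minimal sketch of one way to run that command from Python, simply letting the shell handle the pipe and /dev/stdin rather than wiring up stdin/stdout by hand (the hard-coded script name is purely illustrative and would come from the submission_script variable):
import shlex
import subprocess

submission_script = "submit_hw0.py"   # illustrative; in practice this is set elsewhere

# reproduce: cat login_credentials.txt /dev/stdin | python3 submit_hw0.py
command = "cat login_credentials.txt /dev/stdin | python3 {}".format(
    shlex.quote(submission_script))
subprocess.call(command, shell=True)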