I would like to run a Python script multiple times, in a new terminal each time, so I am writing a bash script to do this.
#!/bin/bash
alias bot_1="cd ../Folder1"
alias bot_2="cd ../Folder2"
gnome-terminal
bot_1
python3 bot_one.py
gnome-terminal
bot_2
python3 bot_two.py
With my script, a new terminal opens, but the subsequent commands are executed in the old terminal, not the new one.
gnome-terminal has the ability to execute a command other than the default interactive shell: everything after the -- separator is run inside the new terminal.
gnome-terminal --working-directory ../Folder1 -- python3 bot_one.py
gnome-terminal --working-directory ../Folder2 -- python3 bot_two.py
I am hoping to write a small method that can interact with a subprocess (bash in this case) and should be able both to write commands and to have those commands print their output back to my shell when running the Ruby file.
So far, I can do something similar with this code:
require 'io/console'

$shell = IO.popen('/bin/bash', 'w')

def run(command)
  puts command
  $shell.puts command
  puts 'Done'
end

run 'var=3'
run 'echo $var'
run 'sleep 2'
run 'ls docs'

$shell.close
And then when I run this code, all of the Ruby output is printed first, and only later does any of the shell output appear:
var=3
Done
echo $var
Done
sleep 2
Done
ls docs
Done
3
<ls output>
I was trying to read some of the tests for io/console as I'm almost certain there exists a really straightforward way to interact with a subprocess like this and get the output inline with the commands being run:
https://github.com/ruby/io-console/blob/master/test/io/console/test_io_console.rb
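For illustration, here is a rough sketch of that kind of interaction using Open3.popen2e and a sentinel marker to detect where each command's output ends; the marker trick and the helper's signature are assumptions for the example, not something io/console provides:

require 'open3'

# Start a single bash process; its stdout and stderr are merged into one stream.
stdin, stdout, wait_thr = Open3.popen2e('/bin/bash')

def run(stdin, stdout, command)
  puts command
  stdin.puts command
  # Ask bash to print a unique marker so we know when this command's output ends.
  marker = "__done_#{rand(1_000_000)}__"
  stdin.puts "echo #{marker}"
  stdin.flush
  while (line = stdout.gets)
    break if line.include?(marker)
    print line
  end
  puts 'Done'
end

run stdin, stdout, 'var=3'
run stdin, stdout, 'echo $var'
run stdin, stdout, 'sleep 2'
run stdin, stdout, 'ls docs'

stdin.close
wait_thr.join

Because every command goes to the same bash process, state such as var=3 persists between calls, so the echo $var line prints 3 in the right place instead of at the end.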
I'm creating the following .bash_profile (from the Linux From Scratch guide) for the lfs user:
exec env -i HOME=$HOME TERM=$TERM PS1='\u:\w\$ ' /bin/bash
When executing su - lfs I get:
[1]+ Stopped su - lfs
Executing fg resumes the lfs user's shell. Why is this happening?
That's because exec replaces the current process with the given command; normally a command is executed in a child shell/environment. Try the following:
$ bash # open second shell
$ exec false # close second shell
$ echo $? # get exit code
$ exit # close terminal
The man page isn't really helpful here. I often use exec if I run a script through a Qt process and it should end after some period of time, regardless of whether the command is finished or not.
I'm using the sass-lint NPM package to style-check .scss files from within a Rake task, thus:
sass_lint_cmd = "sass-lint --config #{ui_library_path}/scss/.sass-lint.yml '#{ui_library_path}/scss/*.scss' -v -q --max-warnings=0"
output, status = Open3.capture2e(sass_lint_cmd)
raise IOError, output unless status == 0
This basically works, insofar as in the event of any linter warnings or errors the Rake task aborts and the sass-lint output, including errors, is dumped to the console.
However, when run directly, sass-lint produces nice colorized output. When captured by capture2e, the colors are lost.
I assume the issue is that sass-lint (or Node) detects it's not running in a TTY, and so outputs plain text. Is there some Process.spawn() option I can pass to Open3.capture2e(), or some other method, by which I can make it think it's running in a TTY?
(Note: I did look at Trick an application into thinking its stdout is a terminal, not a pipe, but the BSD version of script that ships with macOS doesn't seem to support either the --return or the -c options, and I'm running on macOS.)
Update: I tried script -q /dev/null and PTY.spawn() as per Piccolo's answer, but no luck.
script -q /dev/null … works from the command line, but doesn't work in Open3.capture2e() (it runs, but produces monochrome output and a spurious Bundler::GemNotFound stack trace).
As for PTY.spawn(), replacing the code above with the following:
r, _w, pid = PTY.spawn(scss_lint_command)
_, proc_status = Process.wait2(pid)
output, status = [r, proc_status.exitstatus]
(warn(output); raise) unless status == 0
the subprocess never seems to complete; if I run ps in another terminal, it shows as being in interruptible sleep state. Killing the subprocess doesn't free up the parent process.
The same happens with the block form.
output, status = nil
PTY.spawn(scss_lint_command) do |r, _w, pid|
  _, proc_status = Process.wait2(pid)
  output, status = [r, proc_status.exitstatus]
end
(warn(output); raise) unless status == 0
Have you considered using Ruby's excellent pty library instead of Open3?
Pseudo terminals, per the thread you linked, seem to emulate an actual TTY, so the script wouldn't know it wasn't in a terminal unless it checked for things like $TERM, but that can also be spoofed with relative ease.
According to this flowchart, the downside of using pty instead of Open3 is that STDERR does not get its own stream.
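If you do go the pty route, the detail that seems to matter is to read from the reader end before waiting on the child; otherwise the child can block once the pty buffer fills. A rough sketch, assuming scss_lint_command is the command string from your question and that draining the output until EOF (or Errno::EIO on Linux) is acceptable:

require 'pty'

output = ''
status = nil

PTY.spawn(scss_lint_command) do |r, _w, pid|
  begin
    # Drain the child's output first; calling wait2 before reading can deadlock.
    r.each_line { |line| output << line }
  rescue Errno::EIO
    # Raised on some platforms when the child closes its side of the pty.
  end
  _, status = Process.wait2(pid)
end

(warn(output); raise) unless status.success?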
Alternatively, per this answer, also from the thread you linked, script -q /dev/null $COMMAND appears to do the trick on Mac OS X.
On Macs, ls -G colorizes the output of ls, and as a brief test, I piped ls -G into cat as follows:
script -q /dev/null ls -G | cat
and it displayed with colors, whereas simply running
ls -G | cat
did not.
This method also worked in irb, again using ls -G:
$ touch regular_file
$ touch executable_file
$ mkdir directory
$ chmod +x executable_file
$ irb
2.4.1 :001 > require 'Open3'
=> true
2.4.1 :002 > output, status = Open3.capture2e("ls -G")
=> ["directory\nexecutable_file\nregular_file\n", #<Process::Status: pid 39299 exit 0>]
2.4.1 :003 > output, status = Open3.capture2e("script -q /dev/null ls -G")
=> ["^D\b\b\e[1m\e[36mdirectory\e[39;49m\e[0m \e[31mexecutable_file\e[39;49m\e[0m regular_file\r\n", #<Process::Status: pid 39301 exit 0>]
2.4.1 :004 >
I have an issue with executing commands in Python.
The problem is:
Our company has bought commercial software that can be used through either a GUI or a command-line interface. I have been assigned the task of automating it as much as possible. At first I thought about using the CLI instead of the GUI, but then I encountered a problem with executing multiple commands.
Now I want to execute the CLI version of that software with arguments and then continue executing commands in its menu (I don't mean executing the script with arguments again; I mean that once the initial command has run, the software opens a menu, and I want to execute the software's commands inside that menu in the background). Then I want to redirect the output to a variable.
I know I must use subprocess with PIPE, but I haven't managed to get it working.
import subprocess

proc = subprocess.Popen('./Goldbackup -s -I -U', shell=True, stdout=subprocess.PIPE)
output = proc.communicate()[0]
proc_2 = subprocess.Popen('yes\r\n/dir/blabla/\r\nyes', shell=True, stdout=subprocess.PIPE)
# This is the one I want to execute inside the first subprocess
Set stdin=PIPE if you want to pass commands to a subprocess via its stdin:
#!/usr/bin/env python
from subprocess import Popen, PIPE
proc = Popen('./Goldbackup -s -I -U'.split(), stdin=PIPE, stdout=PIPE,
             universal_newlines=True)
output = proc.communicate('yes\n/dir/blabla/\nyes')[0]
See Python - How do I pass a string into subprocess.Popen (using the stdin argument)?