Read output from a subprocess without using pipes - Windows

I'm trying to run bash.exe (Bash on Ubuntu on Windows) as a build command for Sublime Text. However, bash.exe has a bug and does not support writing its stdout to any pipe.
The question is this: how can I run a command line (e.g. "bash.exe -c ls") and capture its output without ever making bash.exe write into a pipe on Windows?
I'm open to using any language or environment on Windows to make this tool.
Edit
I ran
bashTest = subprocess.Popen(["bash.exe", "-c", "ls"], stdout=subprocess.PIPE)
Which yielded:
bashTest.communicate()[0]
b'E\x00r\x00r\x00o\x00r\x00:\x00\x000\x00x\x008\x000\x000\x007\x000\x000\x005\x007\x00\r\x00\r\x00\n\x00'
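Decoding those bytes as UTF-16-LE makes the message readable: bash.exe is reporting error 0x80070057 (which appears to be the Win32 "parameter is incorrect" code) instead of producing the listing. A minimal sketch of the decode step, assuming the output is first captured into a variable:
output = bashTest.communicate()[0]
print(output.decode('utf-16-le'))   # roughly: Error: 0x80070057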

This is currently not possible. There's a GitHub issue about it which was closed as a known limitation. If you want to increase awareness of it, there are two related User Voice ideas: Allow Windows programs to spawn Bash and Allow native Win32 applications to launch Linux tools/commands.
There are ways you could hack around it, however. One way would be to write a script which loops forever in a bash.exe console. When the script gets a signal, it runs the Linux commands with their output redirected to a file, then signals that it is complete. Here's a rough sketch of both sides:
Linux:
while true; do
    while [ ! -f /mnt/c/dobuild ]; do
        sleep 1
    done
    gcc foo.c > /mnt/c/build.log
    rm /mnt/c/dobuild
done
Windows:
type nul > C:\dobuild
:wait
if exist C:\dobuild (
    timeout /t 1 > nul
    goto wait
)
type C:\build.log
This does require keeping a bash.exe console always open with the script running, which is not ideal.
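The Windows side of the handshake could equally be driven from Python (for example, from a Sublime build command). A rough sketch, reusing the marker and log file names from the sketch above:
import os
import time

def run_build(timeout=300):
    open(r'C:\dobuild', 'w').close()            # signal the bash loop to start
    for _ in range(timeout):
        if not os.path.exists(r'C:\dobuild'):   # bash removes the marker when the build is done
            break
        time.sleep(1)
    with open(r'C:\build.log') as log:          # read whatever the build wrote
        return log.read()

print(run_build())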
Another potential workaround, which was already mentioned, is to use ReadConsoleOutput.

You need to use the option shell=True in Popen() so that the shell interprets the pipe inside the command string.
As in this example; note that you don't need to split the command.
>>> import subprocess as sp
>>> cmd = 'echo "test" | cat'
>>> process = sp.Popen(cmd,stdout=sp.PIPE,shell=True)
>>> output = process.communicate()[0]
>>> print output
test
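Applied to the command from the question it would look like the sketch below; note, though, that this still connects bash.exe's stdout to a pipe, so by itself it does not avoid the bug described above:
import subprocess as sp

process = sp.Popen('bash.exe -c "ls"', stdout=sp.PIPE, shell=True)
output = process.communicate()[0]
print(output)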

Your only realistic option, if you can't wait for a fix, would be to use ReadConsoleOutput and/or the related functions.
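For reference, a rough ctypes sketch of what reading the screen buffer looks like; it assumes the target program writes into the same console the script is attached to (reading another process's console would additionally require AttachConsole), and it only grabs the first 80-column row:
import ctypes

class COORD(ctypes.Structure):
    _fields_ = [("X", ctypes.c_short), ("Y", ctypes.c_short)]

STD_OUTPUT_HANDLE = -11
kernel32 = ctypes.windll.kernel32

handle = kernel32.GetStdHandle(STD_OUTPUT_HANDLE)
buf = ctypes.create_unicode_buffer(80)     # room for one 80-column row
n_read = ctypes.c_ulong(0)
# read 80 characters starting at row 0, column 0 of the screen buffer
kernel32.ReadConsoleOutputCharacterW(handle, buf, 80, COORD(0, 0), ctypes.byref(n_read))
print(buf.value[:n_read.value])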

Related

running multiple commands using subprocess.Popen in the same shell

Hi, I am at a loss on how to run multiple commands using Popen.
I am trying to automate a series of steps that are normally typed into the Windows command line. The basic steps, usually run from the Windows cmd line, are:
Run a Windows command script (.cmd) file to set up environment variables, e.g. C:\Program Files (x86)\appsettings\setupvariables.cmd
Type in the command to connect to the database
Type in the command to get data from the database
Stop the connection to the database
All these commands must run in the same command line window, one after another, not as separate processes or in separate command line windows. Instead of opening a cmd window and typing in the commands, I want to use Python's subprocess.Popen.
So far I have:
from subprocess import Popen, PIPE

args = []
args.append(r'C:\Program Files (x86)\appsettings\setmyvars.cmd')
args.append(r'start db on db_path="my_url"')
args.append(r'get_data_from_db>c:\temp\output.txt')
args.append(r'stop db on db_path="my_url"')
p = Popen(args, stdout=PIPE, stderr=PIPE, shell=True)
stdout, stderr = p.communicate()
if stderr:
    print "you have an error", stderr
else:
    print "well done you have data", stdout
This isn't quite working. I can see that the first line is run, i.e. setmyvars.cmd is executed, but nothing else; none of the other arguments get called. If they did, I would see the results in the output.txt file.
How do I run a series of commands one after the other using Popen? Why does only the first command seem to be executed and none of the others?
I am using Python 2.7 on Windows.
Regards.
You have a couple of issues going on. You still have to tell popen() which program to run. Just using shell=True does not obviate the need to provide cmd.exe as the program to run. If you really want to run all of these commands with one invocation of cmd.exe, then you will need to string them together with &&.
from subprocess import *

args = []
args.append(r'C:\Windows\System32\cmd.exe')
args.append(r'/C')
args.append(r'(echo 1 && echo 2 && echo 4)')
p = Popen(args, stdout=PIPE, stderr=PIPE, shell=True)
stdout, stderr = p.communicate()
if stderr:
    print "you have an error", stderr
else:
    print "well done you have data", stdout
It would probably be better to use the %ComSpec% environment variable than to hardcode the location of cmd.exe. The path you have is -usually- correct. :-)
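For example, a minimal sketch of that variant (shell=True should not be needed here, since cmd.exe is already being invoked explicitly):
import os
from subprocess import Popen, PIPE

# look up cmd.exe via %ComSpec% instead of hard-coding its path
cmd_exe = os.environ.get('ComSpec', r'C:\Windows\System32\cmd.exe')
p = Popen([cmd_exe, '/C', 'echo 1 && echo 2 && echo 4'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
print(stdout)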

How do I send commands to the ADB shell directly from my app?

I want to send commands in the ADB shell itself, as if I had done the following in cmd:
>adb shell
shell#:/ <command>
I am using Python 3.4 on a Windows 7 64-bit machine. I can send one-line shell commands simply using subprocess.getoutput, such as:
subprocess.getoutput ('adb pull /storage/sdcard0/file.txt')
as long as the adb commands themselves are ones adb recognizes, such as pull and push. However, there are other commands, such as grep, that need to be run IN the shell, as above, since they are not recognized by adb. For example, the following line will not work:
subprocess.getoutput ('adb shell ls -l | grep ...')
To enter the commands in the shell, I thought I needed some kind of expect library, as that is what 'everyone' suggests; however, pexpect, wexpect, and winexpect all failed to work. They were written for Python 2, and after porting them to Python 3 and going through the .py files by hand, even those tweaked for Windows, nothing was working, each of them for different reasons.
How can I send the input I want to the adb shell directly?
If none of the already recommended shortcuts work for you, you can still go the 'regular' way and enter commands in the adb shell using subprocess.Popen:
import time
import subprocess
from subprocess import PIPE

cmd1 = 'adb shell'
cmd2 = 'ls -l | grep ...'
p = subprocess.Popen(cmd1.split(), stdin=PIPE)
time.sleep(1)
p.stdin.write(cmd2.encode('utf-8'))
p.stdin.write('\n'.encode('utf-8'))
p.stdin.flush()
time.sleep(3)
p.kill()
Some things to remember:
Even though you import subprocess, you still need to invoke subprocess.Popen.
Sending cmd1 as a single string or as items in a list should work too, but .split() does the trick and is easier on the eyes.
Since you only specified that you want to enter input to the shell, you only need stdin=PIPE; stdout=PIPE would only be necessary if you wanted to receive output from the shell.
time.sleep(1) isn't really necessary; however, since many have complained about input timing differences between Python 2 and 3, consider using it. 'They' might have been using versions of 'expect' that need the shell's reply first. This code also worked when I tested it with time.sleep(0).
stdin.write will raise an error if the input is not encoded properly; Python's default string type is unicode. Writing a bytes literal directly (b'ls ...') did not work for me in my tests, but .encode() worked. Don't forget the newline!
If you use .encode(), there is a worry that the line might not get sent right away, so to be sure it is good to include a flush().
time.sleep(3) is completely unnecessary, but if your command takes a long time to execute (e.g. a recursive search through the entire device piped out to a txt file on the memory card), maybe give it some extra time before killing anything.
Remember to kill the process. If you don't kill it, the pipe may remain open, and even after exiting the test app on the console the next command still went to the shell, even though the prompt appeared to be my regular cmd prompt. (See the sketch below for a variant that avoids the sleeps and the kill.)
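A variant of the same idea that avoids the sleeps and the explicit kill() is to let communicate() write stdin, close it, and wait for adb to exit. A sketch, with a placeholder grep pattern:
import subprocess

p = subprocess.Popen(['adb', 'shell'],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)
# communicate() sends the commands, closes stdin and waits for the process to exit
out, _ = p.communicate(b'ls -l | grep txt\nexit\n')   # 'txt' is just a placeholder pattern
print(out.decode('utf-8'))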
Amichai, I have to start by pointing out that your own "solution" is pretty awful. And your explanation makes it even worse. You are doing all those unnecessary things just because you do not understand how shell command parsing works (here I mean your PC's OS shell, not adb).
When all you needed was just this one command:
subprocess.check_output(['adb', 'shell', 'ls /storage/sdcard0 | grep ...']).decode('utf-8')

log all stderr to file and console

There are plenty of threads here discussing how to do this for scripts or for the cmdline (mostly involving pipes, redirections, tee).
What I didn't find is a solution which can be set up once and then just works globally, without manipulating single scripts or adding something to every command line.
What I want to achieve is something like what is described in the top answer of
How do I write stderr to a file while using "tee" with a pipe?
Isn't it possible to configure the bash session so that all stderr output is logged to a file, while still writing it to console? Something I could add to .bashrc and thus automatically set up every time I login?
Software: Bash 4.2.24(1)-release (x86_64-pc-linux-gnu), xterm, Ubuntu 12.04
Try this variation on #0xC0000022L's previous solution (put it in your .bash_profile):
exec 2> >( tee log.file > /dev/tty )
A couple of caveats:
The prompt and anything you type at the command line are printed to stderr, and so will be logged in your file.
There could be an issue with the newline that terminates a command not being displayed in your terminal; I observe it on my Linux host, but not on my Mac OS X laptop. Perhaps someone else can explain and/or fix the issue. For example, if I type "echo stdout", I see the following:
$ echo stdoutstdout
$

spawning a process in ruby, capturing stdout, stderr, getting exit status

I want to run an executable from a ruby rake script, say foo.exe
I want the STDOUT and STDERR outputs from foo.exe to be written directly to the console I'm running the rake task from.
When the process completes, I want to capture the exit code into a variable. How do I achieve this?
I've been playing with backticks, Process.spawn and system, but I can't get all the behaviour I want, only parts of it.
Update: I'm on Windows, in a standard command prompt, not cygwin
system gets the STDOUT behaviour you want. It also returns true for a zero exit code, which can be useful.
$? is populated with information about the last child process, so you can check it for the exit status:
system 'foo.exe'
$?.exitstatus
I've used a combination of these things in Runner.execute_command for an example.
Backticks will capture stdout into the resulting string.
foo.exe suggests you are running Windows; do you have anything like Cygwin installed? If you run your script within a unixy shell you can do this:
result = `foo.exe 2>&1`
status = $?.exitstatus
Quick googling says this should also work in the native Windows shell, but I can't test this assumption.

Terminal emulator implementation - problems with repeated input

I am trying to implement a terminal emulator in Java. It is supposed to be able to host both cmd.exe on Windows and bash on Unix-like systems (I would like to support at least Linux and Mac OS X). The problem I have is that both cmd.exe and bash repeat on their standard output whatever I send to their standard input.
For example, in bash, I type "ls", hit enter, at which point the terminal emulator sends the input line to bash's stdin and flushes the stream. The process then outputs the input line again "ls\n" and then the output of the ls command.
This is a problem, because other programs apart from bash and cmd.exe don't do that. If I run, inside either bash, or cmd.exe, the command "python -i", the python interactive shell does not repeat the input in the way bash and cmd.exe does. This means a workaround would have to know what process the actual output came from. I doubt that's what actual terminal emulators do.
Running "bash -i" doesn't change this behaviour. As far as I know, cmd.exe doesn't have distinct "interactive" and "noninteractive" modes.
EDIT: I am creating the host process using the ProcessBuilder class. I am reading the stdout and stderr and writing to the stdin of the process using a technique similar to the stream gobbler. I don't set any environment variables before I start the host process. The exact commands I use to start the processes are bash -i for bash and cmd for cmd.exe. I'll try to post a minimal code example as soon as I manage to create one.
On Unix, run stty -echo to disable "local echo" (i.e. the shell repeating everything that you type). This is usually enabled so a user can edit what she types.
In your case, BASH must somehow allocate a pseudo TTY; otherwise, it would not echo every command. set +x would have a similar effect but then, you'd see + ls instead of ls in the output.
With cmd.exe the command @ECHO OFF should achieve the same effect.
Just execute those after the process has been created and it should work.
