On Windows XP, I have a BAT file which calls a Perl script, which in turn must run a "make" command in a Cygwin context (I can't change this BAT -> PERL -> CYGWIN structure). This works up to a point, but I can't find how to exit from the Cygwin environment.
For example, in my Perl script, these lines seem to work:
print STDOUT "START PERL SCRIPT\n" ;
system ("$ENV{CYGWIN_HOME}\\bin\\bash | $ENV{CYGWIN_HOME}\\bin\\make -version");
print STDOUT "END OF PERL SCRIPT\n" ;
However, in my cmd.exe, this is my output:
START PERL SCRIPT
GNU Make 3.80
Copyright (C) 2002 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
bash-3.2$
My problem is that I stay in the bash environment: the "bash-3.2$" prompt still waits for a command, and I can't find a way to exit from it automatically (the only workaround is to type "exit" manually in cmd.exe).
My Perl script is therefore stuck instead of automatically continuing its job.
I haven't succeeded with solutions such as
"Using perl to run more than one system command in a row"
Do you know how to write the Perl script and commands so that it leaves the bash environment and the Perl script continues?
Thanks a lot,
Calim
Update
So finally I have moved to another solution. It seems to work (i.e. I get back to my Perl script after the shell command is executed); I just hope there will be no unexpected side effects. Do you have any opinion about it?
print STDOUT "START PERL SCRIPT\n";
open ( CYGWIN , "|-" , "$ENV{CYGWIN_HOME}\\bin\\bash" ) ;
print CYGWIN "$ENV{CYGWIN_HOME}\\bin\\make -w -C /cygdrive/c/tmp/Generation makefile" ;
close CYGWIN ;
print STDOUT "END OF PERL SCRIPT\n";
Why are you piping the output of bash to make?!
system("$ENV{CYGWIN_HOME}\\bin\\make -version");
Or maybe you need some environment variables set up by bash.
system("$ENV{CYGWIN_HOME}\\bin\\bash -c 'make -version'");
So finally I have moved to another solution. It seems to work (i.e. I get back to my Perl script after the shell command is executed); I just hope there will be no unexpected side effects. Do you have any opinion about it?
print STDOUT "START PERL SCRIPT\n";
open ( CYGWIN , "|-" , "$ENV{CYGWIN_HOME}\\bin\\bash" ) ;
print CYGWIN "$ENV{CYGWIN_HOME}\\bin\\make -w -C /cygdrive/c/tmp/Generation makefile" ;
close CYGWIN ;
print STDOUT "END OF PERL SCRIPT\n";
Related
I am running Windows 10 and am trying to save the error output of a test.sh file to a text file.
So I created the test.sh file and wrote an unknown command in it (i.e. "blablubb").
After that I open the terminal (cmd.exe), switch to the directory and type test.sh 2>> log.txt.
Another window opens with "/usr/bin/bash --login -i \test.sh" in the title bar, shows me "bash: blablubb: command not found" and then closes immediately.
I want to save that output because the bash window only opens for a split second. Every Google search brings me to websites talking about redirecting the output, saying that stream 2 is STDERR and that I should therefore use test.sh 2>> log.txt or something similar that takes care of the STDERR stream.
If I try the same with a test.sh file and the content:
#!/bin/bash
echo hi there
I get the output in the briefly open bash-window:
bash: #!/bin/bash: No such file or directory
hi there
But the log.txt file is empty.
If I only have echo hi there in the test.sh file, I get bash: echo: command not found in the bash window.
The log.txt file is also empty.
If I type the following directly in the terminal, the output is written in the log.txt:
echo hi > log.txt 2>&1
If I type directly in the terminal:
echdo hi > log.txt 2>&1
I get 'Der Befehl "echdo" ist entweder falsch geschrieben oder konnte nicht gefunden werden.' (German for: the command "echdo" is either misspelled or could not be found) in the log.txt file.
So I guess the redirecting of the output works fine until I use test.sh.
I know that .sh files come from the Unix world and that the problem might lie there, but I don't know why I cannot redirect the output briefly shown in the bash console to a text file.
The 2>> redirection syntax only works if the command line containing that syntax is interpreted by bash. So it won't work from the Windows command prompt, even if the program you are running happens to be written in bash. By the time bash is running, it's too late; it gets the arguments as they were interpreted by CMD or whatever your Windows command interpreter is. (In this case, I'm guessing that means the shell script will find it has a command line argument [$1] with the value "2".)
If you open up a bash window (or just type bash in the command prompt) and then type the test.sh 2>>log.txt command line in that shell, it will put the error message in the file as you expect.
I think you could also do it in one step by typing bash -c "test.sh 2>>log.txt" at the Windows command prompt, but I'm not sure; Windows quoting is different than *nix quoting, and that may wind up passing literal quotation marks to bash, which won't work.
Note that CMD does have the 2>> syntax as well, and if you try to run a nonexistent windows command with 2>>errlog.txt, the "is not recognized" error message goes to the file. I think the problem comes from the fact that CMD and bash disagree on what "standard error" means, so redirecting the error output from Windows doesn't catch the error output by bash. But that's just speculation; I don't have a bash-on-Windows setup handy to test.
It would help to know whether you are running Windows Subsystem for Linux (Beta) or something else; I'm assuming that is what you are using on Windows 10.
If this is the case are you using bash to run the script?
Are you using win-bash?
If it is win-bash, I'm not very familiar with it and would recommend Windows Subsystem for Linux (Beta) for reasons like this; win-bash, while cool, might not be compatible with redirection operators like 2>>.
You have stdout and stderr; by default (if you don't specify a stream), >> (append) will only append standard output to the txt file.
If you prefix it with 2, it will append standard error to the txt file instead. Example: test.sh 2>> log.txt
This could be better described at this site.
To get the exact command for appending both stdout and stderr, see this page.
Please tell me if this doesn't answer your question. It would also help to describe what you have already searched for and why it didn't solve the problem, or to clarify the problem in more detail. I've had fun with your question and I hope I've helped.
That makes a lot of sense. Thanks Mark!
Taking what Mark says into account, I would get Windows Subsystem for Linux (Beta). There are instructions here. Then run your script from there.
I'm trying to run bash.exe (Bash on Ubuntu for Windows) as a build command for Sublime Text. However, bash.exe has a bug and does not support outputting its stdout to any pipe.
The question is this: how can I run a command line (e.g. "bash.exe -c ls") and capture the output without ever making bash.exe write its output to a pipe on Windows?
I'm open to using any languages or environment on Windows to make this tool.
Edit
I ran
bashTest = subprocess.Popen(["bash.exe", "-c", "ls"], stdout=subprocess.PIPE)
Which yielded:
bashTest.communicate()[0]
b'E\x00r\x00r\x00o\x00r\x00:\x00\x000\x00x\x008\x000\x000\x007\x000\x000\x005\x007\x00\r\x00\r\x00\n\x00'
This is currently not possible. There's a GitHub issue about it which was closed as a known limitation. If you want to increase awareness of it, I see two related User Voice ideas: Allow Windows programs to spawn Bash and Allow native Win32 applications to launch Linux tools/commands.
There are ways you could hack around it, however. One way would be to write a script which loops forever in a bash.exe console. When the script gets a signal, it runs Linux commands with the output piped to a file, then signals that it is complete. Here's a rough sketch:
Linux (bash, left running in the open bash.exe console):
#!/bin/bash
while true; do
    # wait until the Windows side asks for a build
    while [ ! -e /mnt/c/dobuild ]; do
        sleep 1
    done
    gcc foo.c > /mnt/c/build.log 2>&1   # capture compiler errors too
    rm /mnt/c/dobuild
done
Windows (cmd batch file):
type nul > C:\dobuild
:wait
if exist C:\dobuild (
    timeout /t 1 /nobreak >nul
    goto wait
)
type C:\build.log
This does require keeping a bash.exe console always open with the script running, which is not ideal.
Another potential workaround, which was already mentioned, is to use ReadConsoleOutput.
You need to use the option shell=True in Popen() to have pipes work.
With shell=True, as in this example, you don't need to split the command into a list.
>>> import subprocess as sp
>>> cmd = 'echo "test" | cat'
>>> process = sp.Popen(cmd,stdout=sp.PIPE,shell=True)
>>> output = process.communicate()[0]
>>> print output
test
Your only realistic option, if you can't wait for a fix, would be to use ReadConsoleOutput and/or the related functions.
I have a Java program that I'm running using a Bash script on Mac OS X. I have two files - a FIFO that allows me to pipe commands into the program, and an output log file.
The Bash script consists of the following code:
#!/bin/bash
java -jar file.jar <./run/command-fifo >>./run/server.log 2>&1 &
echo $! >| ./run/server.pid
I honestly can't remember why I used >| in the third line (I just know that it works). In the java line, the first < redirects the fifo file to standard input. The >> should redirect standard output to the file, and the 2>&1 should redirect standard error to it as well. It then runs in the background.
The problem is that nothing is ever written to the server.log file. The command-fifo file is read, but the log is not written. The program IS writing to standard output (if I run it on its own it works fine).
I also tried script as suggested in this question but it didn't work either:
script -q /dev/null java -Xmx4G -Xms4G -jar current.jar --nogui <./run/command-fifo >>./run/server.log 2>&1 &
Anyone have ideas to get this to write to the log properly?
FOLLOWUP: I should explain a bit more of how the software works for this explanation to make sense. There are three parts at work here:
A plist that launchd uses to start the program at boot by calling the launcher script
A launcher script that handles kill signals and waits for the pid of the java process
A start script, called by the launcher script, that launches the program and saves its pid
The script given above is the start script. This launches the java process, echoes its pid to a file, then returns. The launcher script (not given here) then waits for the pid to exit before terminating. If it terminates, launchd automatically relaunches the launcher script.
Launchd has a feature that can set the standard output path for the file it launches. Essentially, it redirects stdout of the launcher script to the given file.
I did this, and lo and behold, it works. By changing the start script line to the following:
java -jar file.jar <>./run/command-fifo &
it allows standard output to be captured by launchd and written to the file. It's a somewhat different solution, but it does in fact work. It's strange, because the launcher script technically has nothing to output since the java process is in the background, but however it works, it does in fact work.
Of course, I'd prefer to explicitly redirect the program's standard output into a file myself (in other scripts there may be more than one process and I need to keep their outputs separate). I'm still going to experiment and try to find a solution.
I think @torek's comment about buffering is probably right on the money. You can force your Java process to line-buffer its output using the stdbuf utility:
#!/bin/bash
stdbuf -oL java -jar file.jar <./run/command-fifo >>./run/server.log 2>&1 &
echo $! >| ./run/server.pid
Regarding the >| operator, @torek is also correct. Here is the bash manual entry.
There are plenty of threads here discussing how to do this for scripts or for the cmdline (mostly involving pipes, redirections, tee).
What I didn't find is a solution which can be set up once and then just works globally, without manipulating single scripts or adding something to every command line.
What I want to achieve is something like described in the top answer of
How do I write stderr to a file while using "tee" with a pipe?
Isn't it possible to configure the bash session so that all stderr output is logged to a file, while still writing it to console? Something I could add to .bashrc and thus automatically set up every time I login?
Software: Bash 4.2.24(1)-release (x86_64-pc-linux-gnu), xterm, Ubuntu 12.04
Try this variation on @0xC0000022L's previous solution (put it in your .bash_profile):
exec 2> >( tee log.file > /dev/tty )
A couple of caveats:
The prompt and anything you type at the command line are printed to stderr, and so will be logged in your file.
There could be an issue with the newline that terminates a command not being displayed in your terminal; I observe it on my Linux host, but not on my Mac OS X laptop. Perhaps someone else can explain and/or fix the issue. For example, if I type "echo stdout", I see the following:
$ echo stdoutstdout
$
Is it possible to call WinRAR through Perl on a Windows system, such as
perl -e "rar a -rr10 -s c:\backups\backup.rar #backup.lst"
If so, is there a more efficient way to do this?
I've looked up "perl -e" +winrar on google, however none of the results gave me any answer that was remotely close to what i was looking for. The system Im running this on is a Windows XP system. Im open to doing this in another language like python if its easier, however I am more comfertable with perl.
You can access the RAR facilities in Windows using the CPAN module Archive::Rar:
use Archive::Rar;
my $rar = Archive::Rar->new(-archive => $archive_filename);
$rar->Extract();
One way to execute external commands from a Perl script is to use system:
my $cmd = 'rar a -rr10 -s c:\backups\backup.rar #backup.lst';
if (system $cmd) {
print "Error: $? for command $cmd"
}
To use external applications from your Perl program, use the system builtin.
If you need the output from the command, you can use the backtick (``) or qx operator as discussed in perlop. You can also use pipes as discussed in perlipc.
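As a small sketch of the capture route, using the same command line as above (the archive path and list file are the question's examples and rar is assumed to be on PATH): qx// runs the command through the shell, returns its output, and leaves the wait status in $?.
my $cmd = 'rar a -rr10 -s c:\backups\backup.rar @backup.lst';
# 2>&1 folds rar's error messages into the captured output as well.
my $output = qx($cmd 2>&1);
if ( $? != 0 ) {
    print "Error: $? for command $cmd\n";
}
print $output;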