I found that you can run an external command from ruby, like this:
command = "find /home/user/workspace -name '*.java'"
%x(#{command})
and it works nicely for commands that don't take long to execute. But for commands like the one above, which take more time and emit output progressively, there is no way I can see the results until the command completes.
What I would like is the same look and feel as when the command is run directly from the shell: in this particular case, as soon as a file is found, it is shown on the console.
Is this possible?
Use IO.popen or Open3.
IO.popen("echo 1; sleep 1; echo 2; sleep 1; echo 3") do |io|
  io.each_line do |line|
    puts line
  end
end
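Since the answer also mentions Open3, here is the equivalent sketch with Open3.popen2e, which merges the child's stdout and stderr into one stream, so error output is shown as it happens too:

```ruby
require 'open3'

# popen2e yields the child's stdin, a merged stdout+stderr stream,
# and a waiter thread whose value is the Process::Status.
Open3.popen2e("echo 1; sleep 1; echo 2; sleep 1; echo 3") do |stdin, out_err, wait_thr|
  out_err.each_line do |line|
    puts line # printed as each line arrives, not after the command finishes
  end
  wait_thr.value # wait for the command and get its exit status
end
```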
I have a program where I test different data sets and configurations, and a script to execute all of them.
Imagine my code as:
start = omp_get_wtime()
function()
end = omp_get_wtime()
print(end-start)
and the bash script as
for a in "${first_option[@]}"
do
    for b in "${second_option[@]}"
    do
        for c in "${third_option[@]}"
        do
            echo "$a $b $c"
            ./exe "$a" "$b" "$c" >> logs.out
        done
    done
done
Now when I execute the exact same configurations by hand, I get varying results, from 10 seconds down to 0.05 seconds. But when I execute the script, I get the same results on the high end, yet for some reason I can't get any timings lower than 1 second. All the configurations that take less than a second when run manually get written to the file as 1.001; 1.102; 0.999; etc.
Any ideas of what is going wrong?
Thanks
My suggestion would be to remove the ">> logs.out" to see what happens with the speed.
From there you can try several options:
Replace ">> log.out" with "| tee -a log.out"
Investigate stdbuf, and if your code is Python, look at the "PYTHONUNBUFFERED=1" environment variable. See also: How to disable stdout buffer when running shell
Redirect bash print command with ">&2" (write to stderr) and move ">> log.out" or "| tee -a log.out" behind the last "done"
You can probably see what is causing the delay by using:
strace -f -t bash -c "<your bash script>" | tee /tmp/strace.log
With a little luck you will see which system call is causing the delay at the bottom of the screen. But it is a lot of information to process. Alternatively, look for the name of your "./exe" in "/tmp/strace.log" after tracing is done, and then look for the system calls after its invocation (the process start of ./exe) that eat the most time. It could just be many calls ... Don't spend too much time on this if you don't have the stomach for it.
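As a sketch of the last two options combined (the loop values and the run_one stand-in for ./exe are made up so the example is self-contained): progress goes to stderr, and the log file is redirected once, after the final done, so it is only opened a single time:

```shell
#!/bin/sh
# Hypothetical stand-in for ./exe so this sketch runs on its own.
run_one() { echo "result: $1 $2"; }

for a in 1 2; do
  for b in x y; do
    echo "$a $b" >&2        # progress to stderr: visible, never mixed into the log
    run_one "$a" "$b"
  done
done >> /tmp/logs.out        # one redirect after the loop: file opened once
```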
I have a simple script written in Ruby that runs some predefined shell commands and saves the output strings.
The script works well, but I need a way to branch conditionally if a command fails. I've tried using the $? object, but the script exits before it gets there.
#!/usr/bin/ruby

def run_command(cmd)
  `#{cmd}`
  if $?.success?
    # continue as normal
  else
    # ignore this command and move on
  end
end

run_command('ls')
run_command('not_a_command')

# Output:
# No such file or directory - not_a_command (Errno::ENOENT)...
I've tried $?.exitstatus and even just puts $?, but the script always exits before reaching that line, because it obviously runs the command before hitting it.
Is there a way to check if the command will run before actually running it in the script?
Hope that's clear, thanks for your time!
Use system, which returns true or false depending on the exit code (and nil if the command could not be run at all), instead of backticks, which return the output string:
if system(cmd)
  ...
else
  ...
end
If you want it to run quietly without polluting your logs / output:
system(cmd, out: File::NULL, err: File::NULL)
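If you also want to keep the output string, as the original script does, one option (a sketch, not part of the answer above) is Open3.capture2, which returns both the output and the exit status. Note that a nonexistent binary still raises Errno::ENOENT, so that case needs a rescue:

```ruby
require 'open3'

def run_command(cmd)
  out, status = Open3.capture2(cmd)
  if status.success?
    out   # continue as normal, with the output string in hand
  else
    nil   # the command ran but failed: ignore it and move on
  end
rescue Errno::ENOENT
  nil     # the command does not exist at all
end
```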
I have a Ruby script called 'process_files.rb' that gets triggered every minute by a cron job.
process_files.rb executes another Ruby script called 'process_client.rb' with parameters, like so:
ruby process_client.rb -c ABC -s 123 -f /path/to/client/file/client_file
Since process_files.rb runs every minute I want to avoid running process_client.rb if a version of it is currently running with same -c parameter. So if process_client.rb is currently running for -c ABC, it will not
execute the ruby command above.
Both of the scripts are in the same directory called /cdir.
This is what I've got in process_files.rb but it is not working:
client = "ABC"
name = "process_client.rb -c #{client}"
needle = `find /cdir -maxdepth 1 -type f -name #{name}`.to_i
if needle > 0
  puts "DONT RUN the ruby script for client abc"
else
  puts "RUN the ruby script for client abc"
  system("ruby process_client.rb -c ABC -s 123 -f /path/to/client/file/client_file")
end
Then, before I execute process_files.rb, I execute process_client.rb for client ABC, which has some code to put it into sleep mode for 30 seconds, like so:
...some code
sleep 30
...some code
The problem is that process_files.rb never finds the running process_client.rb for client ABC, and starts another instance when it should not.
Something is probably wrong with the find command, but I don't know what.
I would use a Ruby script to run everything.
You can have your Ruby script launch process_files.rb and process_client.rb using Process.spawn; you get the process PID back, which lets you know whether they are running or not.
Obviously you will need to modify the code below to suit your needs, but this should get you started.
def process_client(client)
  client_pid = Process.spawn('/usr/bin/ruby', 'process_client.rb', '-c', client)
  Process.detach(client_pid)
  client_pid
end

def process_files(some_arg)
  # do some stuff
end

process_client_pid = process_client('ABC')

loop do
  begin
    Process.getpgid(process_client_pid)
    # This means the process_client.rb script is running
  rescue Errno::ESRCH
    # Code here for when the client is not being processed
    process_client_pid = process_client('ABC')
    process_files(some_arg)
  end
  sleep 30
end
I think your problem is that you are using the wrong strategy to detect a match from find. It will return a list of filespecs (possibly empty) separated by new lines. Calling to_i on that list doesn't make sense. You could instead pipe it to word count to get the number of lines, and convert that to an int:
`find ... | wc -l`.to_i
Also, in the line that assigns a value to name, did you mean to use backticks rather than double quotes?
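Note also that find searches the filesystem, not the process table, so it can never see a running script. One alternative sketch (assuming the procps pgrep utility is available; the helper name is mine, not from the question) matches the full command line of running processes instead:

```ruby
# pgrep -f matches against each process's full command line, so a running
# "process_client.rb -c ABC" is found. pgrep never matches its own process,
# so the check itself cannot produce a false positive.
def client_running?(client)
  system('pgrep', '-f', "process_client.rb -c #{client}", out: File::NULL)
end
```

Usage: `puts client_running?('ABC') ? "DONT RUN" : "RUN"`.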
I'm converting an XLS file to CSV with a system command in Ruby.
After the conversion I process the CSV files, but the conversion is still running when the program tries to process them, so at that point they don't exist yet.
Can someone tell me if it's possible to let Ruby wait the right amount of time for the system command to finish?
Right now I'm using:
sleep 20
but if it ever takes longer, that won't be right, of course.
What I do specifically is this:
#Call on the program to convert xls
command = "C:/Development/Tools/xls2csv/xls2csv.exe C:/TDLINK/file1.xls"
system(command)
do_stuff
def do_stuff
#This is where i use file1.csv, however, it isn't here yet
end
Ruby's system("...") method is synchronous: it waits for the command it calls to exit, and returns true if the command exited with a 0 status and false otherwise. Ruby's backticks return the output of the command:
a = `ls`
will set a to a string with a listing of the current working directory.
So it appears that xls2csv.exe is returning an exit code before it finishes what it's supposed to do. Maybe this is a Windows issue. So it looks like you're going to have to loop until the file exists:
until File.exist?("file1.csv")
  sleep 1
end
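A variation on that loop (a sketch; the timeout value is my assumption, not part of the answer) gives up after a bounded wait instead of spinning forever if the converter never produces the file:

```ruby
# Poll for the file, but stop after max_wait seconds so a failed
# conversion cannot hang the script indefinitely.
def wait_for_file(path, max_wait: 60)
  waited = 0
  until File.exist?(path)
    return false if waited >= max_wait
    sleep 1
    waited += 1
  end
  true
end

# wait_for_file("file1.csv") and do_stuff
```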
Try using threads:
command = Thread.new do
  system('ruby programm.rb') # long-running program
end
command.join # main program waits for the thread
puts "command complete"
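The same effect is possible without a thread (a sketch; the spawned command is a stand-in, since 'programm.rb' is the asker's placeholder): Process.spawn returns immediately with the child's pid, and Process.wait blocks until that child exits, much like Thread#join above.

```ruby
# spawn starts the child and returns at once with its pid.
pid = Process.spawn('ruby', '-e', 'sleep 1') # stand-in for the long-running program
# ... the main program could do other work here while the child runs ...
Process.wait(pid) # blocks until the child exits
puts "command complete"
```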
I have a Perl script that launches another Perl script in a new console through Win32::Process as follows:
Win32::Process::Create($ProcessObj,
                       "C:\\Perl\\bin\\perl.exe",
                       "$path_to_other_perl_script",
                       0,
                       NEW_CONSOLE,
                       ".");
$ProcessObj->Suspend();
$ProcessObj->Resume();
$ProcessObj->Wait(0);
The problem is that there is no stdout in the newly created console. If I don't use the new-console option, the script runs silently in the background.
If I use cmd.exe to launch the Perl script, I can see the output fine but now I cannot control the child Perl script through Win32::Process.
Does anyone have a solution that works?
Update: Based on your comments, I get the feeling that your programs are not examples of best practices on Linux or Windows. However, I am sure, when reading the documentation for Win32::Process, you noticed that you can call the Kill method on the process to terminate it. So, I changed the example below to do that.
Your chances of getting useful help increase exponentially if you provide real code. For reference, the arguments to Win32::Process::Create are:
$obj: scalar that receives the process object
$appname: full path to the executable
$cmdline: command line passed to the program
$iflags: flag: inherit calling process's handles or not
$cflags: creation flags (e.g. NEW_CONSOLE, NORMAL_PRIORITY_CLASS)
$curdir: working directory of the new process
Now, I am not sure if you are trying to capture the STDOUT of the second process or if you are trying to have its STDOUT output show up in the same console as the parent.
If the latter, then the following scripts illustrate one way of doing that:
parent.pl
#!/usr/bin/perl

use strict;
use warnings;

use Win32;
use Win32::Process;

$| = 1;

my $p;

print "Starting child process ... \n";

Win32::Process::Create(
    $p,
    'c:/opt/perl/bin/perl.exe',
    'perl hello.pl',
    1,
    NORMAL_PRIORITY_CLASS,
    '.',
) or die Win32::FormatMessage( Win32::GetLastError() );

print "Waiting three seconds before killing 'hello.pl'\n";

for (1 .. 3) {
    print;
    sleep 1;
}

$p->Kill(0)
    or die "Cannot kill '$p'";
hello.pl
#!/usr/bin/perl

$| = 1;

print "Hello World\n";
print "Sleeping 1000 seconds\n";

for (1 .. 1000) {
    sleep 1;
    print '.';
}
Output:
Starting child process ...
Waiting three seconds before killing 'hello.pl'
1Hello World
Sleeping 1000 seconds
2.3.
Now, I am still not sure why you are using Win32::Process. Unless there is a specific reason to tie your script to Win32, I would recommend using standard Perl facilities. For example, read perldoc -f open and perldoc perlipc (see esp. Using open for IPC).
Explain your question better to get answers that address your particular situation rather than generalities.