Perl script is running slow when calling bash - bash

I am running a Perl script that calls another script via the command line, but it runs extremely slowly. The backticks make it run via the command line.
for my $results (@$data) {
`/home/data/push_to_web $results->{file}`;
}
If I run the same command via bash, /home/data/push_to_web book.txt, the same script runs extremely fast. If I build a bash file that contains
/home/data/push_to_web book_one.txt
/home/data/push_to_web book_two.txt
/home/data/push_to_web book_three.txt
The code executes extremely fast. Is there any secret to speeding up Perl when it calls another script?

Your Perl script fires up a new bash shell for each element in the array, whereas running the commands in bash from a file doesn't have to create any new shells.
Depending on how many files you have, and what's in your bash startup files, this could add a significant overhead.
One option would be to build a list of semicolon-separated commands in the for loop, and then run one system command at the end to execute them all in one bash process.
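A minimal sketch of that approach, assuming the same $data structure and push_to_web path as in the question (the single-quoting added here is just so the joined command survives spaces in file names):
my @commands;
for my $results (@$data) {
    my $file = $results->{file};
    $file =~ s/'/'\\''/g;                        # escape any embedded single quotes for the shell
    push @commands, "/home/data/push_to_web '$file'";
}
# A single system() call, so only one shell is started for the whole batch.
system(join('; ', @commands)) if @commands;
Whether this helps depends on how much of the per-file cost really is shell startup rather than push_to_web itself.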

Related

Script piped into bash fails to expand globs during rm command

I am writing a script with the intention of being able to download and run it from anywhere, like:
bash <(curl -s https://raw.githubusercontent.com/path/to/script.sh)
The command above allows me to download the script, run interactive commands (e.g. read), and, for the most part, it Just Works. I have run into an issue during the cleanup portion of my script, however, and haven't been able to discern a fix.
During cleanup I need to remove several .bkp files created by the script's execution. To do so I run rm -f **/*.bkp inside the script. When a local copy of the script is run, this works great! When run via bash/curl, however, it removes nothing. I believe this has something to do with a failure to expand the glob as a result of the way I've connected the I/O of bash and curl, but I have been unable to find a way to get everything to play nice.
How can I meet all of the following requirements?
Download and run a script from a remote resource
Ensure that the user's keyboard input is connected for use in e.g. read calls within the script
Correctly expand the glob passed to rm
Bonus points: colorize output with e.g. echo -e "\x1b[31mSome error text here\x1b[0m" (also not working, suspected to be related to the same bash/curl I/O issues)

Getting continuous output from shell script that is run by Applescript in Cocoa

I have a shell script that uses echo to give continuous output (the progress of an rsync), which I am using AppleScript to run with administrator privileges. Before, I was using NSTask to run the shell script, but I couldn't find a way to run it with the privileges that it needed, so now I am using AppleScript to run it. When it was running via NSTask, I could use an output pipe and waitForDataInBackgroundAndNotify to get the continuous output and put it into a text field, but now that I am using AppleScript, I cannot seem to find a way to accomplish this. The shell script is still using echo, but it seems to get lost in the AppleScript "wrapper." How do I make sure that the AppleScript sees the output from the shell script and passes it on to the application? Remember, this isn't one single output, but continuous output.
Zero is correct. When you use do shell script, you can consider it similar to using backticks in Perl. The command will be executed, and everything sent to STDOUT will be returned as the result.
The only workaround would be to have your command write the output to a temporary file and then use do shell script "foo" without waiting. From there, you can read from the file sequentially using native AppleScript commands. It's clunky, but it'll work in a pinch.

Redirect bash output called from batch file to console

I have a batch file (build.bat) which calls a bash script (makelibs.sh). The bash script contains several commands which build 20 libraries from source.
If I run makelibs.sh from MSYS, I get continuous output. If I call it from the batch file, then I see the full output only at the end of every single command.
This makes it difficult to assess the current status of the process.
Is it possible to redirect the output of makelibs.sh in order to get a continuous feedback on the execution?
I have a batch file (build.bat) which calls a bash script (makelibs.sh)
I strongly advise against doing this. You are calling a script with a script, when you could simply open up Bash and put
makelibs.sh
However, if you insist on doing this, then perhaps start would work:
start bash.exe makelibs.sh

Ensuring Programs Run In Ordered Sequence

This is my situation:
I want to run Python scripts sequentially, starting with scriptA.py. When scriptA.py finishes, scriptB.py should run, followed by scriptC.py. After these scripts have run in order, I need to run an rsync command.
I plan to create a bash script like this:
#!/bin/sh
python scriptA.py
python scriptB.py
python scriptC.py
rsync blablabla
Is this the best solution for performance and stability?
To run a command only after the previous command has completed successfully, you can use a logical AND:
python scriptA.py && python scriptB.py && python scriptC.py && rsync blablabla
Because the whole statement will be true only if all are true, bash "short-circuits" and only starts the next statement when the preceding one has completed successfully; if one fails, it stops and doesn't start the next command.
Is that the behavior you're looking for?
If you have some experience with Python, it will almost certainly be better to write a Python script that imports and executes the relevant functions from the other scripts. That way you will be able to use Python's exception handling. Also, you can run the rsync from within Python.

Get unix script to behave as if it was run from a different folder

I am using a scheduler to run a unix script which starts up my application. The script is in the PATH of the user used by the scheduler and hence can be run from any directory.
My application log files are created relative to where the script is run from. Unfortunately, the scheduler does not run the script from the folder I had hoped, so the log files are not going to the correct folder.
Is there any way I can get the script to run and behave as if it was run from a specified folder, e.g. ./ScriptName.sh Working_Folder | Run_Folder?
Note: I cannot change the script
If your scheduler runs your tasks using a shell (which it probably does) you can use { cd /log/dir ; script; } directly as the command.
If not, you need to use a wrapper script as stated by @Gilles, but I would do:
#!/bin/sh
cd /log/dir
exec /path/to/script "$@"
to save a little memory. The extra exec will make sure only the script interpreter is in memory instead of both (sh and the script interpreter).
If you can't change the script, you'll have to make the scheduler run a different command, not the script directly. For example, make the scheduler run a simple wrapper script.
#!/bin/sh
cd /desired/directory/for/log/files
/path/to/script "$@"
