How to use output of Laravel-(artisan)-command in bash-script? - laravel

I am trying to write a small startup script for one of my docker containers. The problem is that the bash script has to wait until an artisan command echoes "1".
The artisan command's handle function looks like this:
$condition = /* something to check */;
if ($condition) { echo 1; } else { echo 0; }
What I had in mind was something like this:
#!/bin/bash
while php artisan myapp:mycommand == "0"
do
sleep 1m
done
do-something-else
How can I achieve this?
EDIT:
For anyone coming here via Google: James Taylor's answer pointed me in the right direction, which is why I accepted it and edited my solution into the question.
The approach was to edit the handle function like this:
/**
 * Execute the console command.
 *
 * @return int
 */
public function handle()
{
    $mycondition = true; // or whatever

    try {
        if ($mycondition == false) {
            echo "Some fancy status message based on the condition being false\n";
            exit(1);
        } else {
            echo "Some fancy status message based on the condition being true\n";
            exit(0);
        }
    } catch (\Exception $e) {
        echo "Some fancy status message based on the condition with an exception\n";
        exit(1);
    }
}
And set up the bash-script like this:
#!/bin/bash
until php /path/to/artisan myapp:mycommand
do
    echo "Condition is false!"
    sleep 1m # or whatever you wanna do
done
echo "Condition is true!"
do-something

Update your myapp:mycommand to return an exit code instead of echoing a value.
exit(1);
Reference the PHP exit function: https://www.php.net/manual/en/function.exit.php
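If the command cannot be changed to use exit codes, the echoed output ("0"/"1", as in the original handle function) can also be compared directly via command substitution. A minimal sketch, assuming the same command name and one-minute polling interval as in the question:
#!/bin/bash
# Poll until the command stops echoing "0" (output-based check instead of exit codes).
while [ "$(php /path/to/artisan myapp:mycommand)" = "0" ]
do
    sleep 1m
done
# do-something-else
With either variant it is worth capping the loop with a retry counter so a container whose condition never becomes true does not wait forever.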

Related

Out of memory when running JavaScript with PhantomJS

My shell script is written in Cygwin for Windows:
#!/bin/bash
# main.sh
[ "$#" -lt 1 ] && echo "Usage: thisscript.sh <filename.txt>" && exit 0
filename=`basename -s .txt $1`
i=0
while [ $i -eq 0 ]
do
    phantomjs --web-security=no myXHR.js $filename.txt
    logLastLine=`tail -n 1 $filename.log`
    if [[ "$logLastLine" =~ "Error" ]]; then
        echo "Error occurs, now keep looping it..."
    elif [[ "$logLastLine" =~ "503" ]]; then
        echo "Error occurs, now keep looping it..."
    elif [[ "$logLastLine" =~ "500" ]]; then
        echo "Error occurs, now keep looping it..."
    else
        echo "Complete! Exiting the execution..."
        i=1
    fi
done
And here is the code contained in myXHR.js:
// myXHR.js
phantom.onError = function(msg, trace) {
    console.log("PhantomJS Error");
    phantom.exit();
};
var fs = require('fs'), system = require('system');
if (system.args.length < 2) {
    console.log("Usage: myXHR.js <FILE>");
    phantom.exit(1);
}
var content = '',
    f = null,
    lines = null,
    filename = '',
    eol = "\n";
try {
    f = fs.open(system.args[1], "r");
    filename = system.args[1].replace(/\.txt/, "");
    content = f.read();
} catch (e) {
    console.log(e);
}
if (f) {
    f.close();
}
var request = new XMLHttpRequest();
if (content) {
    lines = content.split(eol);
    for (var i = 0; i < (lines.length - 1); i++) {
        request.open('GET', "http://stackoverflow.com/", false);
        request.send();
        if (request.status === 200) {
            try {
                fs.write(filename + ".log", lines[i] + " Succeed!", 'a');
            } catch (e) {
                console.log(e);
            }
        } else {
            try {
                fs.write(filename + ".log", lines[i] + " Error!", 'a');
            } catch (e) {
                console.log(e);
            }
        }
    }
}
phantom.exit();
To illustrate: the JavaScript, executed by PhantomJS, reads the first argument (a filename.txt file) passed into the shell script line by line. For each line it sends an XMLHttpRequest, checks the request status, and writes the result into the filename.log file.
Error status numbers include 503 and 500. Luckily these statuses are unlikely to occur again if I resend the same XMLHttpRequest. So what I need is an error handler that resends the same XMLHttpRequest when an error occurs.
In this error handler, I use X=$(tail -n 1 $filename.log) to see if there is an error status number (a "503" or "500" string) in the last line of the log. For instance, if [[ "$X" =~ "503" ]]; then the JavaScript is executed again, because i is never set to 1 and the while loop does not exit until the last line of the imported file has been read without any error status numbers.
(I know it is awkward to handle errors like this, but it was a quick solution that came to my mind.)
But this is theoretical. In practice, the script ends with a "Memory exhausted" error. I reckon this error is triggered by the large number of lines (>100k) in the $1 file, and it occurs in the JavaScript execution part. I used the free -m command to get memory usage information, and I noticed that while the JavaScript is running, the used swap keeps increasing!
Could anybody teach me how to release the memory while the script is being executed?

Empty function in BASH

I'm using the FPM tool to create a .deb package. This tool generates the before/after-remove package scripts from the supplied files.
Unfortunately, the bash script generated by FPM contains a function like this:
dummy() {
}
And this script exits with an error:
Syntax error: "}" unexpected
Does bash not allow empty functions? Which versions of bash/Linux have this limitation?
You could use :, which is equivalent to true and is mostly used as a do-nothing operator...
dummy(){
:
}
A one-liner:
dummy(){ :; }
: is the null command; the ; is needed in the one-line format.
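A quick sanity check that the stub parses and returns success (the echo line is only for illustration):
dummy(){ :; }
dummy && echo "dummy returned 0"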
An empty bash function body is a syntax error; a function that contains only comments is considered empty too.
A ":" (null command) can be placed in the function if you want it to "DO NOTHING".
see: http://tldp.org/LDP/abs/html/functions.html
I recommend this one:
dummy(){ unused(){ :;} }
If you use the : null command, it will be printed by the xtrace option:
(
    set -o xtrace
    dummy(){ :; }
    dummy "null command"
)
echo ------
(
    set -o xtrace
    dummy(){ unused(){ :;} }
    dummy "unused function"
)
output:
+ dummy 'null command'
+ :
------
+ dummy 'unused function'
For debugging I use a wrapper like this:
main() {(
    pwd # doing something in a subshell
)}
print_and_run() {
    clear
    (
        eval "$1() { unused() { :; } }"
        set -o xtrace
        "$@"
    )
    time "$@"
}
print_and_run main aaa "bb bb" ccc "ddd"
# output:
# + main aaa 'bb bb' ccc ddd
# ..
dummy_success(){ true; } #always returns 0
dummy_fail(){ false; } #always returns 1
Minimal functions that always return an OK or ERROR status.
It is also useful to redefine missing functions with an empty function
(or to update them with some default action, for example a debug warning):
#!/bin/bash
# avoid error if calling unimportant_func, which is undefined
declare -F unimportant_func >/dev/null || unimportant_func() { true; }
# get error if calling important_func, which is undefined
declare -F important_func >/dev/null || important_func() { false; }
# print a debug assert if the function does not exist
declare -F some_func >/dev/null || some_func() {
    echo "assert: '${FUNCNAME[0]}() is not defined. called from ${BASH_SOURCE[1]##*/}[${BASH_LINENO[0]}]:${FUNCNAME[1]}($#)" 1>&2; }
my_func(){
    echo $(some_func a1 a2 a3)
    important_func && echo "important_func ok" || echo "important_func error"
    unimportant_func && echo "unimportant_func ok" || echo "unimportant_func error"
}
my_func
my_func
output:
$> testlib.sh
assert: 'some_func() is not defined. called from testlib.sh[15]:my_func(a1 a2 a3)
important_func error
unimportant_func ok

How to read the output AND the return value of an external program in a shell script?

I'm making a script that reads a tracking code, looks at the result of posting the code to a website, prints some messages, and has a return value.
Here's part of the python code:
# update return True if there was a change to the .msg file
def update(cod):
    msg = extract_msg(cod)
    if msg == 'ERROR':
        print('ERROR: invalid code\n')
        sys.exit(2)
    file = open('.msg', "r+")
    old_msg = file.read()
    if msg == old_msg:
        return False
    else:
        print('Previous message: ' + old_msg)
        print('Latest message: ' + msg)
        file = overwrite(file, msg)
        file.close()
        return True

def main(argv):
    if len(argv) > 1:
        cod_rastr = argv[1]
    else:
        print("Error: no arg, no code\n")
        return -1
    # Verify if file exists
    if os.path.isfile(".msg") == False:
        arq = open('.msg', 'w')
        arq.close()
    # post() returns the source code of the resulting page of the posted code.
    cod = post(cod_rastr)
    if update(cod) == False:
        return 0
    else:
        print('\n Message!\n')
        return 1
And here I want to read not only the prints (for the end user) but also the return values (for conditional use). This script should read the output of the .py and send me an email in case there is an update since the last check (I'll put this script in the crontab):
#!/bin/bash
if [ -z "$1" ]; then
    echo usage: $0 CODE
    exit
fi
CODE=$1
STATUS=$(myscript.py $CODE 2>&1)
VAL=$?
FILE=$(<.msg)
# always prints 0 (zero)
echo $VAL
# I want to check for an existing update case
if [[ $STATUS == 'Message!' ]]
then
    echo $STATUS
    echo $FILE | mail myuser#mydomain.com -s '$CODE: Tracking status'
fi
The problem is that $? always returns 0, and my string check inside the if is not working, because I think it reads the update() prints too, which contain variables.
How can I make this shell script run, without changing the python script?
Thanks in advance.
I suspect that you can do what you want with the subprocess module. Either use rc = subprocess.call(...) to get a return code while directing stdout to a file, or use p = subprocess.Popen(...) and then perhaps p.communicate() to get the output and p.returncode to get the return code.
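On the shell side, note also that == inside [[ ... ]] is a full pattern match against everything the script printed, so an exact comparison with 'Message!' fails whenever other lines are captured as well. A minimal sketch of a substring check, assuming the 'Message!' marker from the question and leaving the rest of the script unchanged:
STATUS=$(myscript.py $CODE 2>&1)
VAL=$?                               # exit status of the command run inside the substitution
if [[ $STATUS == *'Message!'* ]]     # glob pattern: matches 'Message!' anywhere in the output
then
    echo "$STATUS"
fi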

How can I check if stdin exists in PHP ( php-cgi )?

Setup and Background
I am working on a script that needs to run as /usr/bin/php-cgi instead of /usr/local/bin/php, and I'm having trouble checking for stdin.
If I use /usr/local/bin/php as the interpreter I can do something like:
if (defined('STDIN')) { ... }
This doesn't seem to work with php-cgi; the constant always appears to be undefined. I checked the man page for php-cgi but didn't find it very helpful. Also, if I understand correctly, the STDIN constant is a file handle for php://stdin. I read somewhere that this constant is not supposed to be available in php-cgi.
Requirements
The shebang needs to be #!/usr/bin/php-cgi -q
The script will sometimes be passed arguments
The script will sometimes receive input via STDIN
Current Script
#!/usr/bin/php-cgi -q
<?php
$stdin = '';
$fh = fopen('php://stdin', 'r');
if ($fh)
{
    while ($line = fgets($fh)) {
        $stdin .= $line;
    }
    fclose($fh);
}
echo $stdin;
Problematic Behavior
This works OK:
$ echo hello | ./myscript.php
hello
This just hangs:
./myscript.php
These things don't work for me:
Checking defined('STDIN') // always returns false
Looking to see if CONTENT_LENGTH is defined
Checking variables and constants
I have added this to the script and run it both ways:
print_r(get_defined_constants());
print_r($GLOBALS);
print_r($_COOKIE);
print_r($_ENV);
print_r($_FILES);
print_r($_GET);
print_r($_POST);
print_r($_REQUEST);
print_r($_SERVER);
echo shell_exec('printenv');
I then diff'ed the output and it is the same.
I don't know any other way to check for / get stdin via php-cgi without locking up the script if it does not exist.
/usr/bin/php-cgi -v yields: PHP 5.4.17 (cgi-fcgi)
You can use the stream_select function, like so:
$stdin = '';
$fh = fopen('php://stdin', 'r');
$read = array($fh);
$write = NULL;
$except = NULL;
if (stream_select($read, $write, $except, 0) === 1) {
    while ($line = fgets($fh)) {
        $stdin .= $line;
    }
}
fclose($fh);
Regarding your specific problem of hanging when there is no input: php stream reads are blocking operations by default. You can change that behavior with stream_set_blocking(). Like so:
$fh = fopen('php://stdin', 'r');
stream_set_blocking($fh, false);
$stdin = fgets($fh);
echo "stdin: '$stdin'"; // immediately returns "stdin: ''"
Note that this solution does not work with that magic file handle STDIN.
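Either approach can be sanity-checked from the shell with the two invocations from the question (assuming the script is saved as myscript.php and is executable):
echo hello | ./myscript.php    # stdin present: prints "hello"
./myscript.php                 # no piped stdin: returns immediately instead of hanging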
stream_get_meta_data helped me :)
And, as mentioned in the previous answer by Seth Battin, stream_set_blocking($fh, false); works very well 👍
The following code reads data from stdin when it is provided and skips it when it is not.
For example:
echo "x" | php render.php
and php render.php
In the first case, I provide some data from another stream (I really need to see the changed files from git, something like git status | php render.php).
Here is an example of my solution which works:
$input = [];
$fp = fopen('php://stdin', 'r+');
$info = stream_get_meta_data($fp);
if (!$info['seekable'] && $fp) {
    while (false !== ($line = fgets($fp))) {
        $input[] = trim($line);
    }
    fclose($fp);
}
The problem is that you create an endless loop with the while ($line = fgets($fh)) part in your code.
$stdin = '';
$fh = fopen('php://stdin', 'r');
if ($fh) {
    // read *one* line from stdin, up to "\r\n"
    $stdin = fgets($fh);
    fclose($fh);
}
echo $stdin;
The above would work if you're piping input like echo foo=bar | ./myscript.php, and it will read a single line when you call it like ./myscript.php.
If you'd like to read more lines and keep your original code, you can send end-of-input with CTRL + D.
To get parameters passed like ./myscript.php foo=bar you could check the contents of the $argv variable, in which the first argument always is the name of the executing script:
./myscript.php foo=bar
// File: myscript.php
$stdin = '';
for ($i = 1; $i < count($argv); $i++) {
    $stdin .= $argv[$i];
}
I'm not sure that this solves anything, but perhaps it gives you some ideas.

What does a bash function return when there is no “return” statement?

Is the return value of a bash function the exit status of the last executed command?
I wrote this test and it looks like that's the case. I just want to verify. Apparently no one has asked this question before, and tutorials don't mention it.
Test program:
funa() {
    echo "in funa";
    true;
};
funb() {
    echo "in funb"
    false;
};
funa && echo "funa is true";
funb && echo "funb is true";
Output when I run the program:
in funa
funa is true
in funb
Does anyone know the answer?
Yes. Per man bash, under Shell Function Definitions:
When executed, the exit status of a function is the exit status of the last command executed in the body. (See FUNCTIONS below.)
Did you try reading the manpage? It's in there:
When executed, the exit status of a function is the exit status of the last command executed in the body. (See FUNCTIONS below.)
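The same rule is easy to observe with a function whose last command is a test; a small sketch (the function name is illustrative):
is_even() {
    # the exit status of this test becomes the function's exit status
    [ $(( $1 % 2 )) -eq 0 ]
}
is_even 4 && echo "4 is even"   # the test succeeds, so the function returns 0
is_even 3 || echo "3 is odd"    # the test fails, so the function returns 1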
