Laravel queues run forever - laravel-4

I have a page that supports a kind of mail notification. When a user inserts some data, I want to send a mail to another user. I know Mail::send() works perfectly, but it is slow, so I want to push the mail onto a queue. I use iron.io as the provider. Everything works perfectly until I close the console.
So is it possible to keep php artisan queue:listen running forever after I close the console, on Windows and Linux?

You can run any process in the background on Linux by using nohup:
nohup php artisan queue:listen
This will keep the process running even if you close your terminal; nohup makes the process ignore hangup (SIGHUP) signals.
nohup writes the command's output to a logfile (nohup.out by default). If you want to suppress this, you can add
>/dev/null 2>&1 &
after your command.
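Putting the two together, the full command becomes:
nohup php artisan queue:listen >/dev/null 2>&1 &
The trailing & puts the process in the background, so you get your prompt back immediately and the listener keeps running after you close the session.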

Related

Deploy a TCP Server written in Ruby

I've written a TCP server in Ruby running on port 2000 with EventMachine.
Right now, what I do is ssh into my server and run the command ruby lib/tcp_server.rb to turn on the server, but it shuts down when I log out.
I've tried nohup and using &, but nothing seems to keep the server up for long.
So my question is: how do I deploy this server on port 2000 and keep it running, like how we deploy Rails to nginx?
It's not a web server, but a TCP server for a connected device, if that helps.
Thanks!
Solution 1: tmux or screen
This is the simplest approach: create a tmux or screen session, start your server in that session, then detach. The server keeps running after you log out, and you can reattach later.
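For example, with tmux (the session name tcp_server is arbitrary):
tmux new -s tcp_server
ruby lib/tcp_server.rb
Detach with Ctrl+b d; after logging back in, reattach with tmux attach -t tcp_server.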
Solution 2: nohup
nohup ruby lib/tcp_server.rb > stdout.log 2> stderr.log &
You've already tried nohup and &, so I suppose you know how this works.
Solution 3: daemonize
You can detach from the shell and daemonize the process by forking
it twice, setting the session ID and changing the current working directory.
def daemonize
  exit if fork        # parent exits; the forked child carries on
  Process.setsid      # child becomes a session leader, detached from the controlling terminal
  exit if fork        # session leader exits so the daemon can never reacquire a terminal
  Dir.chdir '/'       # avoid holding the original working directory open
end
With this approach, you will have to redirect stdout and stderr to keep logs.
Another way to daemonize is to use gems like daemons.
update:
To restart the process automatically after being killed, you need a process manager like god or pm2.
To start the process automatically after booting, you need to write an init script, but what it looks like depends on your operating system and its service management system. One of the most well-known is System V init. If you are using Ubuntu, you might want to take a look at Upstart or systemd.
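As a minimal sketch, a systemd unit for this server could look like the following; the unit name, paths and user are assumptions to adapt to your setup:
[Unit]
Description=Ruby TCP server
After=network.target

[Service]
ExecStart=/usr/bin/ruby /path/to/app/lib/tcp_server.rb
WorkingDirectory=/path/to/app
Restart=always
User=deploy

[Install]
WantedBy=multi-user.target
Saved as /etc/systemd/system/tcp_server.service, it can be enabled to start at boot and restart after crashes with systemctl enable --now tcp_server.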

How to kill the laravel queue:listen --queue=notification?

For a cron job I am using the following code in Laravel 5.1, running the command every minute. But even after stopping the cron job in the crontab, the Laravel code still executes.
$this->call('queue:listen', [
    '--queue' => 'notification-emails',
    '--timeout' => '30',
]);
What could be the problem? How can I stop this queue listener?
You are probably looking for queue:work, which will stop when there are no jobs left, whereas queue:listen keeps running.
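As a sketch, the equivalent invocation would look something like this; exact flag support varies by Laravel version:
php artisan queue:work --queue=notification-emails --timeout=30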
If you want to kill an existing process, you have to do it manually, because there is no Laravel command to kill all queue:listen processes.
Keep in mind that you will not find a process named artisan queue:listen; look for artisan schedule:run instead, because queue:listen, when called internally, does not create a separate process.
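To find and kill it manually from a shell, something like the following should work; the match pattern comes from the process name above:
pgrep -f 'artisan schedule:run'   # list matching PIDs
kill <PID>                        # terminate a PID from the list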

Laravel Schedule command does not work

I used Laravel 5 in my project. I wanted to create a scheduler for inserting users. In Kernel.php, I put my code and set up the scheduler.
I created a command class named "InsertUser" and added it to the $commands array in Kernel.php.
On the command line, I ran "php artisan schedule:run", but got "No scheduled commands are ready to run.". If I used the call function instead of the command function (in Kernel.php), it worked fine. Please help me.
The Laravel Scheduler needs a cron job that runs the php artisan schedule:run command periodically, which in turn evaluates any scheduled commands and runs them accordingly.
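For reference, the cron entry from the Starting The Scheduler section of the documentation looks like this (the project path is a placeholder):
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1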
From your screenshot I see you're running Windows, which means you can't use that snippet because there is no cron on Windows. For this reason the task scheduler is not officially supported on Windows, and the documentation contains no instructions for it.
You could however get around the problem by creating a batch file, say scheduler.bat, that has the following contents:
cd c:\lamp\www\larasoft
php artisan schedule:run 1>> NUL 2>&1
Then you can add a Windows Scheduler Task to run that file every minute.
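One way to create such a task from the command line; the task name is arbitrary and the path matches the example above:
schtasks /create /sc minute /mo 1 /tn "Laravel Scheduler" /tr "c:\lamp\www\larasoft\scheduler.bat"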
Windows does support the Laravel scheduler, but you have to run the schedule:run command yourself repeatedly if you don't set up a scheduled task to do it every minute the way a Linux crontab would. If you're using Windows as a development environment and want to test whether your command works, you can try this:
If you run the
php artisan schedule:run
command multiple times, leaving a minute's gap between runs, it will work.
If you want to run the command directly, you can do the following:
"path\to\php.exe" "artisan" YourCommand > "NUL" 2>&1 &
In your case
Run "where php.exe" in command prompt
Copy The php location
"paste\your\php\location" "artisan" InsertUser > "NUL" 2>&1 &

How to tell Bash to not stop the simulations when ssh disconnects?

I am running some simulations on another machine via ssh. Here is what I do
ssh username@ip.ip.ip.ip
Go to the right directory
cd path/to/folder
And then I just call my executable
.\myexecutable.exe
The issue is that every time the SSH session disconnects, the simulation stops. How can I make sure the simulation doesn't stop on the other machine? And will I somehow receive potential error messages (assuming the code crashes) once I reconnect over ssh?
You should launch a screen or tmux session to create a terminal that you can detach from, leave running in the background, and later reattach to.
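For example, with screen (the session name sim is arbitrary):
screen -S sim
./myexecutable.exe
Detach with Ctrl+a d, log out, and after reconnecting run screen -r sim to reattach and see any output, including error messages, produced while you were away.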
Further reading:
http://ss64.com/osx/screen.html
https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man1/screen.1.html
You may also want to try out Byobu:
http://byobu.co
Run your command as follows:
nohup ./myexecutable.exe > nohup.out 2>&1 &
The & runs the command in the background.
The > nohup.out 2>&1 sends your stdout and stderr to nohup.out.
Note the './' as opposed to the '.\' in your question; the backslash won't work on OS X.

Executing a script which runs even if I log off

So, I have a long-running script (on the order of a few days), say execute.sh, which I am planning to execute on a server on which I have a user account.
Now, I want to execute this script so that it keeps running even if I log off or disconnect from the server.
How do I do that?
Thanks
You have a couple of choices. The most basic would be to use nohup:
nohup ./execute.sh
nohup executes the command as a child process that ignores hangup signals (SIGHUP), so it keeps running after the terminal closes. SIGHUP means "signal hangup" and is sent when you close a terminal while a process is still attached to it.
The output of the process is redirected to a file, by default nohup.out in the current directory.
You may also use bash's disown functionality. Start your script in bash:
./execute.sh
Then press Ctrl+z to suspend it, resume it in the background, and detach it from the shell:
bg
disown
The process will now run in the background, detached from the terminal. If you care about the script's output you may redirect it to a logfile when starting it:
./execute.sh > execute.log 2>&1
Another option would be to install screen on the remote machine, run the command in a screen session and detach from it. You'll find a lot of tutorials about this.
nohup (no hangup) it and run it in the background:
nohup execute.sh &
Output that normally would have gone to the terminal (stdout) will go to a file called nohup.out in the current directory.
