Basic Windows script to keep Node running - Windows

I am using the following very, very basic script in a .bat file to keep a Node.js server running on a Windows machine.
: loop
npm start
goto loop
: end
However, if the server goes down, it does not restart automatically.
I do know that there are preferable ways of keeping node up and running (example), but I really want to focus on other parts of the code at the moment and keep integrating with the other partners there. So I am really looking for a very, very simple .bat file that can restart the server when it goes down (on Windows). What could be wrong with the one I have above?

Probably (you will have to check it), npm is a batch file (.bat or .cmd).
When you invoke a batch file from another batch file, execution is transferred to the called batch file and does not return to the caller. In your case, goto loop is never reached because npm never returns.
You need to use call npm start, so the execution will continue in the caller when the called batch ends.
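For reference, a minimal sketch of the corrected loop; the timeout line is an optional addition (not required for the fix) so that a server which dies immediately does not restart in a tight loop:

@echo off
:loop
rem call returns control to this script when npm exits
call npm start
rem optional: wait a few seconds before restarting
timeout /t 5 /nobreak >nul
goto loop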

Related

Is there a way to prevent a bash script from running certain commands if the script has to be run again?

I have a bash script that works at the moment. It gets an image and JDK 8 from a link and then runs an installer for JDK 8 before moving on to setting up another piece of software.
As I was debugging the script, I kept finding myself having to delete directories and even the Java installation, because every time I introduce a fix and rerun the script I have to wait for everything to download again and worry about duplicate files messing up my current logic (which can probably be improved, but I'll take that to the Code Review Stack Exchange site later).
At the moment, I would like to know what approaches there are to prevent commands (like downloading the JDK and running the JDK installer script, among others) from running all over again.
What kind of general approaches are out there for cases such as these?
For the JDK download and installer, I did think of simply checking for the existence of java on the system; if it is there, then the script would know not to run those commands.
However, there are other commands I do not want rerun, and I do want to simply check, for example, the existence of certain files to avoid wget-ing them all over again and moving them, which causes duplicates. (Or should I maybe suck it up and do that anyway, if that is best practice?)
I also thought of outputting something like a 1 to a text file after each successful command and mapping each line in that file to the commands run in the script (for example, using an if statement to see whether that command had a 1 or not in the text file); if it was a 0, the script would know to run only that command and skip the 1s.
That sounded clunky to me and I am pretty sure that is not a good approach.

GetExitCodeProcess returns 1

I have an MFC application which uses CreateProcess and then calls GetExitCodeProcess to get the exit code. But GetExitCodeProcess returns 1 and the process fails.
More detail about my application:
My application runs two processes: the first is a bat file, and after successful completion of the first process I create a second process, which is a VB script. Both run in an automation environment. This script simply shows a message box.
My second process fails with error 1 returned by GetExitCodeProcess().
When I run the scripts in reverse order, first the VBScript and then the bat file, both execute successfully.
I am not able to understand why my VB script fails with error code 1.
Please help. Thanks in advance!
Jyoti
Thank you very much for looking at my query and trying to answer it.
I have resolved this issue, so I thought I would share the solution.
In my application I was executing the VB script with 'cscript' as the command line parameter.
When I changed that parameter to 'wscript', it prompted me with a message that filename.vbs does not exist.
Then I understood that the process was executing, but it could not find the VB script file, and hence the process failed with GetExitCodeProcess giving a return value of 1.
The problem was hard to see because I was using cscript: 'cscript' runs entirely in the command line and is ideal for non-interactive scripts, whereas 'wscript' pops up Windows dialog boxes for user interaction.
Hence, when I used wscript, I saw the root cause.
The solution for this issue is to check whether the VBScript file exists; the application goes ahead only if the file exists, and until then it waits for the VBScript file to be created.
As said in "GetExitCodeProcess() return 1 when process is not yet finished", GetExitCodeProcess returns a BOOL to signal success or failure of the call itself; the process's return code is returned in the second parameter of GetExitCodeProcess.

Jenkins Timeout because of long script execution

I have some issues regarding Jenkins and running a PowerShell script within it. Long story short: the script takes about 8x longer to execute than when I run it manually (just a few minutes) on the server (slave).
I'm wondering why.
The script contains functions which invoke commands like & msbuild.exe or & svn commit. I found out that the script hangs on the lines where the aforementioned commands are executed. The result is that Jenkins times out because the script takes that long. I could raise the timeout threshold in the Jenkins job configuration, but I don't think that is the solution to the problem.
There are no error outputs or any information on why it takes that long, and I do not have any further idea of the reason. Maybe one of you could tell me how Jenkins invokes those commands internally.
This is what Jenkins does (Windows batch plugin):
powershell -File %WORKSPACE%\ScriptHead\DeployOrRelease.ps1
I had created my own PowerShell CI service before I found that Jenkins supports its own plugin for this. But in my implementation, and in my current job configs, we follow a simple segregation rule: more (smaller) steps is better. I found that my CI service works better when it is separated into different steps (in case of error it is also a lot easier to do a root cause analysis). The single responsibility principle is also helpful here, so as in Jenkins we have pre-, post-, build and email steps as separate scripts.
About msbuild.exe: as far as I remember, in my case there were issues related to operations on file system paths, so when the script was divided into separate functions we got better performance (with additional checks of the parameters).
Use "divide and conquer" technique. You have two choices: modify your script so that will display what is doing and how much it takes for every step. Second option is to make smaller scripts to perform actions like:
get the code source,
compile/build the application,
run the test,
create a package,
send the package,
archive the logs
send notification.
The most problematic step is usually the first one: getting the source code from Git, SVN, Mercurial or whatever you have as your version control system. Make sure this step is not embedded in your script.
During the job run, Jenkins captures the output and uses AJAX to display the result in your browser. In the script, make sure you flush standard output after every step or every few steps. Some languages buffer standard output, so you only see the results at the end.
Also, you can create log files, which are helpful for archiving and verifying the activity status of older runs. From my experience, using Jenkins with more than 10 steps requires you to create a specialized application that can run multiple steps, such as Robot Framework.
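As a rough sketch of the first option (showing what is running and how long each step takes) at the level of the Windows batch build step, something like the following could be used. The -NoProfile and -NonInteractive switches are optional additions on my part, not part of the original job, meant to avoid profile loading and stop the build from hanging on a hidden interactive prompt:

@echo off
echo [%date% %time%] deploy script starting
rem -NoProfile and -NonInteractive are assumptions, not part of the original job:
rem they skip profile loading and refuse hidden interactive prompts
powershell -NoProfile -NonInteractive -File "%WORKSPACE%\ScriptHead\DeployOrRelease.ps1"
echo [%date% %time%] deploy script finished, exit code %ERRORLEVEL%

With timestamps around each step in the Jenkins console it becomes much easier to see which command (msbuild, svn, etc.) is actually eating the time.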

VBScript - How to know when complete?

I'm running a simple/single vbscript in Windows Scheduler to perform 13 individual file exports from our SalesForce app. The script runs as expected. Depending upon network traffic, the 13 exports take 3-5 minutes total to complete.
My intent was to run these exports serially, but vbscript seems happy to run them in parallel. SalesForce accommodates with no issue or complaint.
Upon successful completion of the Export, I run a second vbscript to import these results into another application (via an msaccess function). This second vbscript also provides the desired result.
Question: Is there any way to programmatically determine when the Export script has completed, so I can safely kick off the Import script? Currently I have set up a second Scheduler job to run the Import script 10 minutes after the separate Export script, but this could fail. I am looking to tie these two scripts more closely to one another.
Any suggestions?
Thanks!
There are a couple of options. If both scripts are running on the same system with the same permissions, you could have the first script actually kick off the second script whenever it's finished.
If the scripts require different permissions, or you need them to start from a task manager, have your first script start by looking for an existing file such as SCRIPT1.COMPLETE. If that file exists, have script1 delete the file and start processing. When script1 finishes its processing, create that file. Then in script2, create a while loop that looks for SCRIPT1.COMPLETE. If the file is not there, hold off for a few seconds, then try again. Don't exit the while loop until the complete file shows up. Have script2 delete the COMPLETE file when it finishes processing. I would recommend setting your "wait a while" function to at least 30 seconds or so; that way your script isn't constantly checking.
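Going back to the first option (having one script kick off the other), one way to get the same serial behaviour without touching either VBScript is a small batch wrapper scheduled in place of the two jobs; cscript runs each script to completion before the next line executes. The paths and script names below are placeholders for your actual Export and Import scripts:

@echo off
rem run the export; cscript does not return until the script finishes
cscript //nologo "C:\scripts\export.vbs"
if errorlevel 1 goto failed
rem export finished cleanly, so the import can start safely
cscript //nologo "C:\scripts\import.vbs"
goto :eof
:failed
echo export.vbs failed, skipping import

This removes the guesswork of the 10-minute gap entirely, since the Import never starts until the Export process has exited.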

How to run a specified bat file at a particular time, i.e. via a scheduler

I have been using the at command to schedule the task
ex: at 14:45 my.bat
and I am getting the output on the command prompt as
"JOB ID is added"
But this command is not getting fired at the time I have scheduled.
Can anyone please help me?
I suspect the issue is not that the BAT file is not executing at all, but rather either or both of i) individual commands within the BAT file are failing, or ii) the output isn't getting sent to the place you're looking for it. (Things get even weirder if anything in the batch file requests input, since a) by default batch files may not be able to interact with the "console" at all and b) the system is probably unattended anyway at the time the batch file executes.) If my suspicion is right, there is no one "fix-everything" but rather a whole bunch of small fixes ...and you have to hit every one of them.
Find out the password needed to actually log in as 'admin' (rather than your usual user), open a DOS box, and try to run the batch file. There should be some sort of error message that you can see. Fix that problem. Then try again and fix the next problem. Keep correcting problems and trying again until finally everything works.
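One low-tech way to see those error messages when the job runs unattended is to schedule a small wrapper batch file that captures everything my.bat prints into a log file you can read afterwards (the paths and file names here are only examples):

@echo off
rem run_my.bat - wrapper that logs the scheduled run (example paths)
call C:\jobs\my.bat > C:\jobs\my.log 2>&1
echo exit code was %ERRORLEVEL% >> C:\jobs\my.log

Then schedule the wrapper instead of my.bat itself, for example at 14:45 C:\jobs\run_my.bat, and check my.log after the scheduled time to see which command inside the batch file is failing.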
