Execute Windows batch command does not run completely - windows

I have a set of commands which I am trying to run in an "Execute Windows batch command" step in Jenkins.
I am facing issues because the complete set of commands does not run, yet Jenkins marks the build as a success.
Command 1
Command 2 # This command takes almost 90 minutes to complete
Command 3 # This never runs
I am trying to understand whether there is a timeout involved here. What are the ways to overcome this problem?
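A frequent first step in debugging this is to make each command's exit code explicit, so a step that fails or stalls can't silently leave the build marked SUCCESS. A minimal sketch of that pattern in Python (the three placeholder commands are hypothetical stand-ins for Command 1-3, not the real ones):

```python
import subprocess
import sys

# Hypothetical stand-ins for Command 1-3, made portable here by invoking
# the Python interpreter; replace each with the real batch command.
commands = [
    [sys.executable, "-c", "print('command 1 done')"],
    [sys.executable, "-c", "print('command 2 done')"],
    [sys.executable, "-c", "print('command 3 done')"],
]

def run_all(commands):
    for cmd in commands:
        result = subprocess.run(cmd)
        # Stop at the first non-zero exit code and report it, instead of
        # letting the wrapper continue and the build end as SUCCESS.
        if result.returncode != 0:
            return result.returncode
    return 0

print("exit code:", run_all(commands))
```

In an actual batch step, the equivalent is checking `%ERRORLEVEL%` after each command and exiting on failure.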

Related

Sending a command in the windows command prompt that terminates after a certain time

I am using a batch script to create a short functional test for technicians at my work, which would normally require them to send several commands back-to-back themselves.
There is a command that brings up an image on the test device, but the command hangs until you press Ctrl+C. When doing everything manually that's fine, but I don't want the technicians to do that during the script because they might accidentally exit out of the whole thing.
Is there a way to make the script stop that command and move on to the next line? Something like a timeout on the command? Or a key press that stops the command but doesn't close the script?
Code sample:
echo "Booting Device..."
adb start device
timeout 40
adb shell load_image yosimite.png
timeout 5
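One way to get the behavior asked for is to have a wrapper start the command, wait a bounded time, and kill it if it is still running, then fall through to the next line. A minimal sketch of the technique in Python (the sleeping child is a stand-in for the hanging `adb shell load_image` call):

```python
import subprocess
import sys

def run_with_timeout(cmd, seconds):
    """Start cmd, kill it if still running after `seconds`,
    and return its exit code so the script can continue."""
    proc = subprocess.Popen(cmd)
    try:
        proc.wait(timeout=seconds)
    except subprocess.TimeoutExpired:
        proc.kill()   # the scripted equivalent of the technician's Ctrl+C
        proc.wait()
    return proc.returncode

# Stand-in for the hanging command: a child that would sleep for 60 s,
# cut off here after 2 s.
code = run_with_timeout([sys.executable, "-c", "import time; time.sleep(60)"], 2)
print("child finished with code:", code)
```

On Windows the same idea can be done in pure batch by `start`-ing the command in the background and using `taskkill` after a `timeout` delay.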

Windows "Start" command doesn't return from within "Execute shell script" step

Within a kettle job we need to call a program that doesn't return until it is stopped. From a command line this can be done with the Start command of Windows:
Start "some title" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"
This works well, starting the program in another shell while the calling shell returns and can continue. From within a Kettle job this should be possible too, I think, by simply running the above command in an Execute a shell script step with the Insert script option.
However instead of returning from running the program in a new shell, the execution waits for the program to finish. This is not what we want because while the program is running (it's a VPN connection) we need to perform some other steps before the program is stopped again.
I suspect this might have something to do with how Kettle runs the script, namely by putting the commands in a temporary batch file and then running that one. At least that's how it is presented in the job log:
2019/09/17 09:40:24 - Step Name - Executing command : cmd.exe /C "C:\Users\username\AppData\Local\Temp\kettle_69458258-d91e-11e9-b9fb-5f418528413ashell.bat"
2019/09/17 09:40:25 - Step Name - (stdout)
2019/09/17 09:40:25 - Step Name - (stdout) C:\pentaho_path>Start "some title" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"
For a quick solution, you can use parallel execution in the job.
From the Start step (or whichever step precedes the need for the VPN), activate the option to run the subsequent steps in parallel. Then you can put the shell script step in its own branch while the rest of the job can continue (with a wait step on the other branch to allow the VPN to start).
From the question, you are probably running jobs from the Pentaho server. If you happen to run them from a scheduler with kitchen.bat, you could of course start the VPN before calling kitchen.
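The fire-and-forget behavior of `Start "title" /b` can be sketched as follows in Python, with a short-lived sleeper standing in for the VPN program: the parent launches the process without waiting, keeps doing its other steps while the process is up, and stops it when done.

```python
import subprocess
import sys

# Stand-in for the VPN program: a child that stays up for a few seconds.
# Popen returns immediately, like `Start /b`, instead of blocking.
vpn = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(5)"])

# ... the rest of the job's steps run here while the process is still up ...
print("process still running:", vpn.poll() is None)

# When the job no longer needs the connection, stop the background process.
vpn.terminate()
vpn.wait()
```

This mirrors the parallel-branch workaround above: the blocking call lives on its own branch while the main branch continues.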

Running shell script from jenkins

When I execute the job from a terminal it works for one hour without any issue. When I execute the shell script from Jenkins it runs for just one minute and then stops. The output from the Jenkins console is as follows:
Creating folder path in /jenkins/workspace/load_test/scripts/loadtest/loadtest1
PWD is : /jenkins/workspace/load_test/scripts/loadtest
Running /jenkins/workspace/load_test/scripts/loadtest/loadtest1/testRestApi.sh
1495126268
1495129868
3600
Process leaked file descriptors. See http://wiki.jenkins-ci.org/display/JENKINS/Spawning+processes+from+build for more information
Finished: SUCCESS
Any ideas/suggestions to make the script run for one hour from the Jenkins job?
Have you tried BUILD_ID=dontKillMe? It is commonly used for daemons: https://wiki.jenkins-ci.org/display/JENKINS/ProcessTreeKiller. This should let you run your script.
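The suggestion above amounts to launching the long-running script with BUILD_ID overridden in its environment, so the marker that the (legacy) Jenkins ProcessTreeKiller matches on no longer identifies it as a build child. A minimal sketch in Python, with a trivial child standing in for testRestApi.sh:

```python
import os
import subprocess
import sys

# Spawn the long-running script with BUILD_ID overridden; the legacy
# ProcessTreeKiller matches children on this variable. (The child here
# just echoes the variable back as a stand-in for testRestApi.sh.)
env = dict(os.environ, BUILD_ID="dontKillMe")
child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['BUILD_ID'])"],
    env=env, capture_output=True, text=True,
)
print(child.stdout.strip())
```

In a Jenkins shell step the same effect is usually written inline, e.g. `BUILD_ID=dontKillMe nohup ./testRestApi.sh &` (newer Jenkins versions key on JENKINS_NODE_COOKIE instead).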

Psexec failing when running multiple commands in sequence

Using Windows Task Scheduler I am running multiple commands; I'll call them task1.bat, task2.bat, and task3.bat. Each one of these scripts runs a different PsExec command (PsExec version 2.11).
When run individually, task1.bat, task2.bat, and task3.bat run successfully; however, when run in succession, task1.bat will run successfully, then task2.bat and task3.bat will usually fail with the error "Couldn't access servername. Access is denied. The syntax of the command is incorrect".
It seems like an error with PsExec, since when run individually the commands work fine. Is there a way to force PsExec to exit before moving on to the next script (besides just putting in a timeout)? It seems like PsExec is hung, which causes the next one to fail.
The .bat scripts will run sequentially if you create and run a wrapper batch file:
CALL task1.bat
CALL task2.bat
CALL task3.bat
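The reason CALL helps is that each task only starts after the previous one has fully exited, so two PsExec sessions never overlap. The same waiting guarantee, sketched in Python with placeholder children instead of the real PsExec calls:

```python
import subprocess
import sys

# Hypothetical stand-ins for task1.bat .. task3.bat.
tasks = [
    [sys.executable, "-c", "print('task1')"],
    [sys.executable, "-c", "print('task2')"],
    [sys.executable, "-c", "print('task3')"],
]

for task in tasks:
    # subprocess.run blocks until the child has exited, so task2 can
    # never start while task1's session is still open; check=True
    # aborts the sequence if a task fails.
    completed = subprocess.run(task, check=True)
```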

Running remotely Linux script from Windows and get execution result code

I have the current scenario to deal with:
I have to schedule the backup of my company's Linux-based server (under Suse Linux) with ARCServe R15 (installed on Windows 2003R2SP2).
I know I have the ability in my backup software (ARCServe) to add pre/post execution scripts to my backup-jobs.
On failure of the script, ARCServe would be configured NOT to run the backup job; on success, to run it. I have no problem with this.
The problem is, I want to make a Windows script (to be launched by ARCServe) that executes a Linux script on the cluster:
- If this Linux script fails, I want my Windows script to fail, so my backup job in ARCServe won't run
- If the Linux script succeeds, I want my Windows script to end normally with error code 0, so my ARCServe job will run normally
I've tried creating this batch file (let's call it HPC.bat):
echo ON
start /wait "C:\Program Files\PUTTY\plink.exe" -v -l root -i "C:\IST\admin\scripts\HPC\pri.ppk" [cluster_name] /appli/admin/backup_admin
exit %errorlevel%
If I manually launch this .bat by double-clicking it, or by running it in a command prompt under Windows, it executes normally and then ends.
If it is launched by ARCServe, the script seems never to end.
My job stays in "waiting" status; it seems the exit code of the Linux script isn't returned to my batch file, so the batch file never closes.
In my mind, what's happening is that plink just opens the connection to the Linux machine, sends the script execution command, and then closes the connection, so the exit code can't be returned to the batch file. Am I right?
Is what I want to do possible or am I trying something impossible to do ?
So, do I have to proceed differently ?
Do I have to use PUTTY or CygWin instead of plink ?
Please, it's giving me headaches ...
If you install Cygwin, you can do it exactly as you would from Linux to Linux, i.e. remotely run a command with ssh someuser@remoteserver.com somecommand
This command will return on the calling client with the same exit code that the command exited with on the remote end. If you use SSH shared keys for authentication instead of passwords, it can also be scripted without user interaction.
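The key point in the answer above is that the remote exit code comes back to the caller and can be re-raised locally, so ARCServe sees it. A sketch of that propagation in Python (a child exiting with code 3 stands in for the remote backup script; the real wrapper would invoke ssh or plink here):

```python
import subprocess
import sys

def remote_exit_code(cmd):
    """Run the (stand-in) remote command and return its exit code so the
    calling wrapper can exit with the same value."""
    return subprocess.run(cmd).returncode

# Stand-in for `plink ... /appli/admin/backup_admin`: a child that exits
# with code 3, as a failing backup script might.
code = remote_exit_code([sys.executable, "-c", "import sys; sys.exit(3)"])
print("remote script exited with:", code)
# A real HPC.bat-style wrapper would then end with `exit %ERRORLEVEL%`
# (or sys.exit(code) here) so ARCServe sees the failure.
```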
