Windows "Start" command doesn't return from within "Execute shell script" step - cmd

Within a kettle job we need to call a program that doesn't return until it is stopped. From a command line this can be done with the Start command of Windows:
Start "some title" /b "C:\windows-style\path with spaces\program.exe" unqoted_param -i -s "quoted param"
This works well: the program starts in another shell while the calling shell returns and can continue. From within a kettle job this should be possible too, I think, by simply running the above command in an "Execute a shell script" step with the "Insert script" option.
However, instead of returning after launching the program in a new shell, the execution waits for the program to finish. This is not what we want: while the program (a VPN connection) is running, we need to perform some other steps before the program is stopped again.
I suspect this might have something to do with how kettle executes the script, namely by putting the commands in a temporary batch file and then running that. At least that's how it is presented in the job log:
2019/09/17 09:40:24 - Step Name - Executing command : cmd.exe /C "C:\Users\username\AppData\Local\Temp\kettle_69458258-d91e-11e9-b9fb-5f418528413ashell.bat"
2019/09/17 09:40:25 - Step Name - (stdout)
2019/09/17 09:40:25 - Step Name - (stdout) C:\pentaho_path>Start "some title" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"

For a quick solution, you can use parallel execution in the job.
From the Start step (or whichever step precedes the need for the VPN), activate the option to run the next entries in parallel. Then you can put the shell script step in its own branch while the rest of the job continues (with a Wait step on the other branch to give the VPN time to start).
From the question, you are probably running jobs from the Pentaho server. If you happen to run them from a scheduler with kitchen.bat, you could of course start the VPN before calling kitchen, as sketched below.
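A minimal sketch of such a wrapper, assuming the kitchen.bat path, the job file name, and the 10-second wait (none of which are in the question):
REM Start the VPN detached, give it time to come up, then run the job.
start "VPN" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"
REM Assumption: 10 seconds is enough for the VPN to establish.
timeout /t 10 /nobreak
call "C:\pentaho\data-integration\kitchen.bat" /file:"C:\jobs\vpn_job.kjb"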

Related

Running shell script commands sequentially in Jenkins

In Jenkins, I have created a job which runs many shell script commands:
command1
command2
...etc
command1 is an ssh command which calls a shell script file on another server machine. I have to wait until it is finished, and only AFTER that should command2 run.
So, how can I make sure that the script file on the other machine, started by command1, has already finished its job when the next command (command2) is started in the Jenkins job?
Or, alternatively, how can I make sure that command2 won't be started until the shell script on the other machine (started by command1) has finished?
You can check out "How to send many commands to shell and wait for the command behind ends" in order to chain commands and wait for their completion.
When you execute a command through an ssh session, you might have to wrap that command in a script that loops/waits for the command's completion.
See an example in "How can I make ssh wait until the command exits?".
Or (a simpler wrapper): How do I know when a command run over ssh has finished?
#!/bin/bash
# Run the command passed as arguments, then print a marker line.
"$@"
echo "==== Command Output Finished ===="
Look for the string ==== Command Output Finished ==== in your I/O routines to determine where the boundaries between command outputs are.
Or you can try to isolate those commands in their own Jenkins shell build step.
(Not a different job, just a different build step within the same job)

Exiting batch script after it starts

I have this batch script:
C:\{Directory}\PsExec.exe -u {UserName} -p {Password} \\{IP_Address} /accepteula "C:\batchfiles\{BatchScript}.bat"
And the {BatchScript}.bat script is:
C:\{Directory}\infacmd.bat wfs startWorkflow -DomainName {Domain_Name} -ServiceName {Service_Name} -UserName {Username} -Password {Password} -Application {Application} -Workflow {Workflow} -wait
This script kicks off an Informatica process to build a data warehouse (not sure if that's important, but thought I would mention it). When I run the first batch script, it kicks off the second batch script. However, the command prompt seems to wait for Informatica to finish before it exits. My issue is that I have other processes that need to run, and this process currently takes 5 hours. Is there a command I can add to my second (or first) script that will exit the command prompt immediately after it kicks off the process? I don't believe this will impact the data warehouse build, since I don't need Windows to monitor the process.
start "windowtitle" C:\...whatever...\infacmd.bat w....
should do what you want...
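Spelled out against the second script from the question (the braces are the question's placeholders, not valid batch):
REM {BatchScript}.bat, rewritten so it returns as soon as the workflow is launched.
start "infacmd" C:\{Directory}\infacmd.bat wfs startWorkflow -DomainName {Domain_Name} -ServiceName {Service_Name} -UserName {Username} -Password {Password} -Application {Application} -Workflow {Workflow} -wait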

Windows batch start command and ECHO on completion and close the cmd's window

I'm trying to schedule a script to run on Windows. The triggering part works fine. The important part of my script looks like:
start C:\staging-script -arg1 arg -arg2 arg & ECHO "Did staging"
start C:\prod-script -arg1 arg -arg2 arg & ECHO "Did prod"
When I run it from cmd.exe, two more cmd windows are opened, both execute their script, and then the windows don't close. When I try to run this via the Windows scheduler, it fails because the "resource is still in use".
Additionally, the ECHOs happen in the original window (which is where they should happen), but they happen right away, not when the started task completes.
start creates an independent process. Once the process is started, the message is produced and the next line is executed.
If you want the two started processes to execute in parallel and you're only bothered by those processes' windows not closing, insert
exit
at the end of the started scripts.
If you want to execute the processes serially, that is, complete process1 before producing the message and starting process2, then CALL the batches, don't start them, as in the sketch below.
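A minimal sketch of the serial variant, reusing the question's script names (the arguments stay placeholders):
REM CALL runs each script in the current window and waits for it to finish.
call C:\staging-script -arg1 arg -arg2 arg
ECHO "Did staging"
call C:\prod-script -arg1 arg -arg2 arg
ECHO "Did prod"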
Try adding exit to the end of each script that the new windows execute.

In batch programming, can one command run before the previous command finishes executing?

In batch programming, does one command wait until it has completed before the next one is run? What I mean is, for example:
net stop wuauserv
net start wuauserv
Since net stop wuauserv takes a while to complete, is it given time to complete, or do I need another command to wait until it completes?
The NET STOP command does wait (or time out while waiting) for a service to stop or start.
You can check %ERRORLEVEL% after the command to get more information about whether there was a problem or it worked as expected.
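A minimal sketch of such a check (the service name is the one from the question):
net stop wuauserv
REM ERRORLEVEL is 0 on success; anything else signals a problem.
if %ERRORLEVEL% neq 0 (
    echo Stopping wuauserv failed with exit code %ERRORLEVEL%
    exit /b %ERRORLEVEL%
)
net start wuauserv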
In general, most system command line tools return control once they are done executing. A few specialized programs call into other services or systems and may return control before execution is complete. You will need to check the docs for whatever you are trying to run, but generally a process exits once the task it performs is complete.
In a batch file, all commands run sequentially, and execution waits for each command to complete.
In your example, net stop wuauserv would complete before net start wuauserv gets run.
You could confirm that by running something you know will take a long time, such as
ping www.google.com
ping www.stackoverflow.com
and you'll see that the second ping does not start until the first completes.
In your case, yes: the second command will not execute until the first finishes.
However, GUI apps will start up and return control to the batch file.
For example,
PING localhost
NOTEPAD
DIR
The DIR command will execute even if NOTEPAD is still running.

Running remotely Linux script from Windows and get execution result code

I have the following scenario to deal with:
I have to schedule the backup of my company's Linux-based server (under Suse Linux) with ARCServe R15 (installed on Windows 2003R2SP2).
I know I have the ability in my backup software (ARCServe) to add pre/post execution scripts to my backup-jobs.
If the script fails, ARCServe is configured NOT to run the backup job; if it succeeds, the job runs. I have no problem with this.
The problem is, I want to make a Windows script (to be launched by ARCServe) that executes a Linux script on the cluster:
- If this Linux script fails, I want my Windows script to fail, so my backup job in ARCServe wouldn't run
- If the Linux script succeeds, I want my Windows script to end normally with error code 0, so my ARCServe job would run normally
I've tried creating this batch file (let's call it HPC.bat):
echo ON
start /wait "C:\Program Files\PUTTY\plink.exe" -v -l root -i "C:\IST\admin\scripts\HPC\pri.ppk" [cluster_name] /appli/admin/backup_admin
exit %errorlevel%
If I launch this .bat manually, by double-clicking it or running it in a command prompt under Windows, it executes normally and then ends.
If it is launched by ARCServe, the script never seems to end.
My job stays in "waiting" status; it seems the exit code of the Linux script isn't returned to my batch file, so the batch file never closes.
In my mind, what's happening is that plink just opens the connection to the Linux machine, sends the script execution signal, and then closes the connection, so the exit code can't be returned to the batch. Am I right?
Is what I want to do possible, or am I trying something impossible?
So, do I have to proceed differently?
Do I have to use PuTTY or Cygwin instead of plink?
Please, it's giving me headaches...
If you install Cygwin, you can do it exactly as you would from Linux to Linux, i.e. remotely run a command with ssh someuser@remoteserver.com somecommand
This command will return on the calling client with the same exit code that the command exited with on the remote end. If you use SSH shared keys for authentication instead of passwords, it can also be scripted without user interaction.
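A minimal sketch of HPC.bat along those lines, assuming Cygwin is installed under C:\cygwin64 and the .ppk key has been converted to an OpenSSH-format key (pri_openssh is a hypothetical name; [cluster_name] stays a placeholder):
@echo off
REM ssh returns on the client with the remote command's exit code.
C:\cygwin64\bin\ssh.exe -i /cygdrive/c/IST/admin/scripts/HPC/pri_openssh root@[cluster_name] /appli/admin/backup_admin
REM Hand that exit code back to ARCServe.
exit /b %ERRORLEVEL%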
