Exiting batch script after it starts - Windows

I have this batch script:
C:\{Directory}\PsExec.exe -u {UserName} -p {Password} \\{IP_Address} /accepteula "C:\batchfiles\{BatchScript}.bat"
And the {BatchScript}.bat script is:
C:\{Directory}\infacmd.bat wfs startWorkflow -DomainName {Domain_Name} -ServiceName {Service_Name} -UserName {Username} -Password {Password} -Application {Application} -Workflow {Workflow} -wait
This script kicks off an Informatica process that builds a data warehouse (not sure if that's important, but I thought I would mention it). When I run the first batch script, it kicks off the second batch script. However, it seems like the command prompt waits for Informatica to finish before it exits. My issue is that I have other processes that need to run, and this process currently takes 5 hours. Is there a command I can add to my second (or first) script so that the command prompt exits immediately after it kicks off the process? I don't believe this will impact the data warehouse build, since I don't need Windows to monitor the process.

start "windowtitle" C:\...whatever...\infacmd.bat w....
should do what you want...
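A minimal sketch of how the second script ({BatchScript}.bat) might look with that change (the command and flags are taken from the question; the window title is illustrative):
REM Hypothetical {BatchScript}.bat: launch infacmd in its own window so this
REM script returns immediately instead of waiting for the workflow to finish.
start "Informatica workflow" C:\{Directory}\infacmd.bat wfs startWorkflow -DomainName {Domain_Name} -ServiceName {Service_Name} -UserName {Username} -Password {Password} -Application {Application} -Workflow {Workflow} -wait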

Related

How to prevent Powershell from killing its own background processes when spawned via SSH?

We have a Powershell script in C:\test\test.ps1. The script has the following content (nothing removed):
Start-Process -NoNewWindow powershell { sleep 30; }
sleep 10
When we open a command line window (cmd.exe) and execute that script by the following command
c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File C:\test\test.ps1
we can see in the Windows task manager (as well as Sysinternals Process Explorer) that the script behaves as expected:
Immediately after having executed the command above, two new entries appear in the process list, one being Powershell executing the "main" script (test.ps1), and one being Powershell executing the "background script" ({ sleep 30; }).
When 10 seconds have passed, the first entry (related to test.ps1) disappears from the process list, while the second entry remains in the process list.
When an additional 20 seconds have passed (that is, 30 seconds in total), the second entry (related to { sleep 30; }) also disappears from the process list.
This is the expected behavior, because Start-Process starts new processes in the background no matter what, unless -Wait is given. So far, so good.
But now we have a hairy problem which has already cost us two days of debugging until we finally figured out the reason for the misbehavior of one of our scripts:
Actually, test.ps1 is executed via SSH.
That is, we have installed Microsoft's implementation of the OpenSSH server on a Windows Server 2019 and have configured it correctly. Using SSH clients on other machines (Linux and Windows), we can log into the Windows Server, and we can execute test.ps1 on the server via SSH by executing the following command on the clients:
ssh -i <appropriate_key> administrator@ip.of.windows.server c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File C:\test\test.ps1
When observing the task manager on the Windows Server, we can again see the two new entries in the process list as described above as soon as this command is executed on a client.
However, both entries then disappear from the process list on the server after 10 seconds.
This means that the background process ({ sleep 30; }) gets killed as soon as the main process ends. This is the opposite of what is documented and of what should happen, and we really need to prevent it.
So the question is:
How can we change test.ps1 so that the background process ({ sleep 30; }) does not get killed under any circumstances when the script ends, even when the script is started via SSH?
Some side notes:
This is not an academic example. We actually have a fairly complex script system in place on that server which consists of about a dozen PowerShell scripts, one of them being the "main" script which executes the other scripts in the background as shown above; the background scripts themselves in turn might start further background scripts.
It is important to not start the main script in the background via SSH in the first place. The clients must process the output and the exit code of the main script, and must wait until the main script has done some work and returns.
That means that we must use Powershell's capabilities to kick off the background processes in the main script (or to kick off a third-party program which is able to launch background processes which don't get killed when the main script ends).
Note: The original advice to use jobs here was incorrect, as when the job is stopped along with the parent session, any child processes still get killed. As it was incorrect advice for this scenario, I've removed that content from this answer.
Unfortunately, when the PowerShell session ends, the child processes created from that session are terminated as well. However, when using PSRemoting we can tell Invoke-Command to run the command in a disconnected session with the -InDisconnectedSession parameter (aliased to -Disconnected):
$icArgs = @{
ComputerName = 'RemoteComputerName'
Credential = ( Get-Credential )
Disconnected = $true
}
Invoke-Command @icArgs { ping -t 127.0.0.1 }
From my testing with an infinite ping, if the parent PowerShell session closes, the remote session should continue executing. In my case, the ping command continued running until I stopped the process myself on the remote server.
The downside here is that it doesn't seem you are using PowerShell Remoting, but instead are invoking shell commands over SSH that happen to be a PowerShell script. You will also have to use PowerShell Core if you require the SSH transport for PowerShell remoting.
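As a minimal sketch (assuming PowerShell Remoting is enabled on the server and that the server can target itself; the 'localhost' target and the timings are illustrative), test.ps1 might then look like this:
# Hypothetical rewrite of test.ps1: run the background block in a disconnected
# session so it keeps running after this script (and the SSH session) ends.
$icArgs = @{
    ComputerName          = 'localhost'   # illustrative; requires PSRemoting to be enabled
    InDisconnectedSession = $true
}
Invoke-Command @icArgs -ScriptBlock { sleep 30 }
sleep 10   # the "main" work of test.ps1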

Windows "Start" command doesn't return from within "Execute shell script" step

Within a kettle job we need to call a program that doesn't return until it is stopped. From a command line this can be done with the Start command of Windows:
Start "some title" /b "C:\windows-style\path with spaces\program.exe" unqoted_param -i -s "quoted param"
This works well, starting the program in another shell while the calling shell returns and can continue. From within a kettle job this should be possible too, I think, by simply running the above command in an Execute a shell script step with the Insert script option.
However instead of returning from running the program in a new shell, the execution waits for the program to finish. This is not what we want because while the program is running (it's a VPN connection) we need to perform some other steps before the program is stopped again.
I suspect this might have something to do with how kettle performs the script, namely by putting the commands in a temporary batch file, then running that one. At least that's how it is presented in the job log:
2019/09/17 09:40:24 - Step Name - Executing command : cmd.exe /C "C:\Users\username\AppData\Local\Temp\kettle_69458258-d91e-11e9-b9fb-5f418528413ashell.bat"
2019/09/17 09:40:25 - Step Name - (stdout)
2019/09/17 09:40:25 - Step Name - (stdout) C:\pentaho_path>Start "some title" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"
For a quick solution, you can use parallel execution in the job.
From the Start step (or whichever step precedes the need for the VPN), activate the option to run the subsequent steps in parallel. Then you can put the shell script step in its own branch while the rest of the job can continue (with a wait step on the other branch to allow the VPN to start).
From the question, you are probably running jobs from the Pentaho server. If you happen to run them from a scheduler with kitchen.bat, you could start the VPN before calling kitchen of course.
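As a rough sketch for that case (the paths, wait time and job file are illustrative; only the Start line is taken from the question):
REM Hypothetical wrapper batch: start the VPN program detached, give it time
REM to connect, then run the kettle job with kitchen.
start "VPN" /b "C:\windows-style\path with spaces\program.exe" unquoted_param -i -s "quoted param"
timeout /t 30 /nobreak
call kitchen.bat /file:"C:\jobs\myjob.kjb"
REM ...stop the VPN again here once the job has finished.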

Windows Task Scheduler - Task Won't Die

I have a task scheduled to run daily that executes a .bat file and I have checked the options for ending the task after an hour and forcing it to stop if it does not end when requested to, but it runs indefinitely until I manually kill it. Any ideas? (I do not want to use pskill in my script). Script below:
sqlcmd -S MyServer -U username -P password -Q "BACKUP DATABASE x to y"
net use z: "\\server1\project"
cd C:\trial1
copy * "Z:\backups"
net use z: /delete
exit
Thanks to Hans Passant for the answer! See below:
Taking away 'net use' and just copying directly to the server cleared up an error that Task Scheduler was getting stuck on.
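A sketch of what the revised script could look like, copying straight to the UNC path (server, database and folder names are the placeholders from the question):
REM Hypothetical revision: back up the database, then copy straight to the
REM UNC path instead of mapping a drive with net use.
sqlcmd -S MyServer -U username -P password -Q "BACKUP DATABASE x to y"
cd C:\trial1
copy * "\\server1\project\backups"
exit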

Executing same commands with multiple arguments at same time in Windows

I have 3 commands to run in a Windows terminal, but all of them should start in parallel at roughly the same time. I created a .bat file, put my 3 commands in there and ran that bat file. But the commands are executed serially (the second command executes only after the first one is finished). Is there a special way to do this in Windows?
What if I have to run the same command/exe in parallel, but with different arguments? Let's suppose I have two commands, iperf.exe -s -u -i 1 10.10.100.1 and iperf.exe -s -u -i 1 10.10.100.2; will these execute in parallel if I enter the lines below in a .bat file and run it?
start iperf.exe -s -u -i 1 10.10.100.1
start iperf.exe -s -u -i 1 10.10.100.2
And even if they run in parallel, will there be any interruptions (even the slightest) while the parallel processes execute, for example if one thread depends on another, or will they execute strictly in parallel, with each command an independent process? I need the programs to run at the same time, and they should behave like separate, independent processes without any dependency or interruptions. Will START provide that kind of functionality?
In batch, use the start command, followed by the command you want to launch and all of its arguments:
start iperf.exe -s -u -i 1 10.10.100.1
start iperf.exe -s -u -i 1 10.10.100.2
In PowerShell, use Start-Process (you can also call it "start"; that's a built-in alias for Start-Process):
Start-Process iperf.exe -ArgumentList '-s -u -i 1 10.10.100.1'
Start-Process iperf.exe -ArgumentList '-s -u -i 1 10.10.100.2'
[Note: My answer appears to repeat information from the question because the question was edited to include information from my answer. The original question didn't mention the start command, it just asked how to start processes in parallel using batch or PowerShell.]
To answer the revised question: Both cmd's start and PowerShell's Start-Process spawn a child process, not a thread in the original process. In Windows, child processes are entirely independent from each other and from the parent. I've used both of these many, many times to launch applications that run separately, and can assure you that their operation is no more interdependent than if you had launched them in any other way.
The equivalent of start in PowerShell is Start-Process, as has been pointed out, but getting the parameters to the exe is a bit tricky, e.g.:
Start-Process iperf -arg -s,-u,-i,1,10.10.100.1
Start-Process iperf -arg -s,-u,-i,1,10.10.100.2
I assume the exe runs until you press Ctrl+C or press enter? Otherwise the window opens, runs and then exits.
I think you just can't, neither in batch nor at the OS level.
To minimize OS process-loading overhead, you could try, via the Windows API, creating all the processes suspended (CreateProcess with CREATE_SUSPENDED) and then resuming their main threads in a tight loop (ResumeThread).
But I can't see any way to do that from the command line; you would have to do it programmatically.
Sorry to post this as an answer, but I don't have enough reputation to comment.
Instead of:
Start-Process iperf -arg -s,-u,-i,1,10.10.100.1
Start-Process iperf -arg -s,-u,-i,1,10.10.100.2
You can do:
$params='-s -u -i 1 '
$ips=@('10.10.100.1','10.10.100.2') # and more IPs
$ips | foreach-Object{ Start-Process iperf -ArgumentList "$($params)$($_)"}
Then you can get the IPs from a list file, scale it out, and more.
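For example, a small sketch that reads the addresses from a text file (the file name is illustrative, one IP per line):
$params = '-s -u -i 1 '
# Hypothetical: start one iperf process per address listed in the file.
Get-Content 'C:\iperf-targets.txt' | ForEach-Object { Start-Process iperf -ArgumentList "$($params)$($_)" }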
I was looking for something like this when I found your question.

Running a Linux script remotely from Windows and getting the exit code

I have the current scenario to deal with:
I have to schedule the backup of my company's Linux-based server (under Suse Linux) with ARCServe R15 (installed on Windows 2003R2SP2).
I know I have the ability in my backup software (ARCServe) to add pre/post execution scripts to my backup-jobs.
If the script fails, ARCServe is configured NOT to run the backup job; if it succeeds, the job runs. I have no problem with this.
The problem is, I want to make a Windows script (to be launched by ARCServe) that executes a Linux script on the cluster:
- If this Linux script fails, I want my Windows script to fail, so my backup job in ARCServe won't run
- If the Linux script succeeds, I want my Windows script to end normally with exit code 0, so my ARCServe job will run normally.
I've tried creating this batch file (let's call it HPC.bat):
echo ON
start /wait "C:\Program Files\PUTTY\plink.exe" -v -l root -i "C:\IST\admin\scripts\HPC\pri.ppk" [cluster_name] /appli/admin/backup_admin
exit %errorlevel%
If I manually launch this .bat by double-clicking on it, or launching it in a command prompt under Windows, it executes normally and then ends.
If it is launched by ARCServe, the script never seems to end.
My job stays in "waiting" status; it seems the exit code of the Linux script isn't returned to my batch file, so the batch file doesn't close.
In my mind, what's happening is that plink just opens the connection to the Linux machine, sends the script execution command, and then closes the connection, so the exit code can't be returned to the batch file. Am I right?
Is what I want to do possible, or am I attempting something impossible?
So, do I have to proceed differently?
Do I have to use PuTTY or Cygwin instead of plink?
Please help, it's giving me headaches...
If you install Cygwin, you can do it exactly as you would from Linux to Linux, i.e. remotely run a command with ssh someuser@remoteserver.com somecommand
This command returns on the calling client with the same exit code that the command exited with on the remote end. If you use SSH key authentication instead of passwords, it can also be scripted without user interaction.
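A sketch of what HPC.bat could then look like (the Cygwin install path and key file are illustrative; note that Cygwin's ssh needs an OpenSSH-format key, not the PuTTY .ppk from the question):
@echo on
REM Hypothetical HPC.bat: run the Linux script via Cygwin's ssh and pass its
REM exit code back to ARCServe.
C:\cygwin64\bin\ssh.exe -i "C:\IST\admin\scripts\HPC\id_rsa" root@[cluster_name] /appli/admin/backup_admin
exit /b %errorlevel%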
