GCE Windows startup script is not running - Windows

I have a simple Django app that I want to keep running on a specific GCE instance. Sometimes the instance gets restarted for reasons outside my control. I created a batch script and tried putting it in the Startup folder, both for my user and in the common folder. It didn't work. I also tried supplying the script via sysprep-specialize-script-url (using Cloud Storage), sysprep-specialize-script-cmd and sysprep-specialize-script-bat. That didn't work either. Here's the content of the batch script:
cd C:\Users\kartik_domadiya\Desktop\happierMiscGoogleCloud
manage.py runserver 0.0.0.0:80
pause
I tried running C:\Program Files\Google\Compute Engine\metadata_scripts\run_startup_scripts.cmd manually and it worked (with any of the metadata keys), so I can see that there's no problem with the script itself.
I even tried putting the batch script in Task Scheduler, which didn't work either.
So is there any way I can debug the problem and find out why the batch script isn't running? I am using Windows Server 2012 R2, if that matters.
PS: I know that's a development server and should not be used in production.

I moved the code to C:/code (basically out of any particular user's folder), granted all users access to it (Right Click > Properties > Security), updated the batch file, and put it into the Startup folder (Run > shell:startup).
It started working after that. I suppose the issue was due to access permissions.
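For reference, the relocated batch file might look roughly like this; a sketch only, assuming the project folder moved to C:\code\happierMiscGoogleCloud and that python is on the PATH (both assumptions, adjust to your layout):
REM start_django.cmd (hypothetical name), launched from the Startup folder
cd /d C:\code\happierMiscGoogleCloud
REM call the interpreter explicitly so the script does not depend on .py file associations
python manage.py runserver 0.0.0.0:80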

Related

pm2 process using the old system Environment Variables on resurrection

I have created a Node application that subscribes to an OPC UA server and stores the data in our S3 bucket. I am using the node-opcua module for that purpose.
I am working on a Windows server via RDP, and the node-opcua module creates some files under %LOCALAPPDATA%\Temp as part of the process and uses them. I am using pm2 to run the application, and it gets the path of those files via the TMP and TEMP environment variables, which are generated dynamically by the process itself.
When the Windows server restarts, those files are deleted and the location of the new files changes. I had already run pm2 save and put the pm2 resurrect command in a batch file with a shortcut in the Windows Startup folder, to make sure the process gets started automatically.
The issue was that the pm2 process was resurrected, but the node-opcua code running under pm2 still threw a %LOCALAPPDATA%\Temp\{some_path} file not found error. I ran pm2 restart manually, but that didn't work either.
At first I treated it as a problem with the node-opcua module and thought about how to make it use the new system variables, but that was not in my hands, since the process keeps creating and deleting temporary files. What I really needed was for pm2 to use the new environment variables, which hold the updated path after a system reboot; they were not being refreshed even after pm2 restart.
So, for updating the variables I figured out two solutions:
Either delete the old process and initiate a new pm2 process to run the application, and put that in the batch file which is called at server reboot, or
Add pm2 restart {name} --update-env after pm2 resurrect, and the environment variables will be updated.
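For example, the startup batch file could look roughly like this; a sketch, where opcua-app is a placeholder for the actual pm2 process name:
REM pm2-startup.cmd (sketch), launched at server startup
REM "call" is needed because pm2 is a .cmd wrapper on Windows
call pm2 resurrect
REM --update-env makes the process pick up the current TMP/TEMP values
call pm2 restart opcua-app --update-env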

Unable to save output from R scripts in system directory using DevOps pipeline

I am running R scripts on a self-hosted DevOps agent. My Windows agent is able to access the system directory where it is hosted. Below is the directory structure for my code:
Agent location: F:/agent
Source code: F:/agent/deployment/projects/project1/sourcecode
DWH dump: F:/agent/deployment/DWH_dump/2021/
Output location: F:/agent/deployment/projects/project1/output_data/2021
The agent uses CMD in the DevOps pipeline to trigger R from the system and uses the libraries from the system directory.
Problem statement: I am unable to save the output from my R script into the output location directory. It gives an error, probable reason: 'Permission denied', pointing to that directory.
Output file format: file_name.rds, but the same issue happens even for a csv file.
Command leading to failure: saveRDS(object, paste0(output_loc, "/", "file_name.rds")), where output_loc is the output location path above.
Workaround: I did find a workaround: save the output files in the source code directory first and then copy them to the output location directory. This works perfectly fine, but it costs me 2 extra hours of run time, because I have to save all the intermediary files and delete them at the end. Keeping the intermediary files in memory eats up my RAM.
I have not opened that directory anywhere on the machine. The only application open in my explorer is the browser where the pipeline is running. I spent hours trying to figure out the reason, with no success. I even checked the system PATH to see whether I had mentioned that directory there, and it's not present.
When I run the same script directly on the machine using RStudio, I have no issues saving the file to any directory.
I have spent 2 full days on this already. Any pointers to the root cause could save me a few hours of runtime.
The solution was to set the Azure Pipelines agent service in Windows to run with admin credentials. The agent was not configured as an admin during creation, so after running it under my user ID, which has admin access on the VM, the pipelines were able to save files without any trouble.
Feels great, saved a few hours of run time!
I was able to achieve this by following this post.
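If the agent runs as a Windows service, one way to switch the account it runs under is through the service's Log On tab in services.msc, or from an elevated prompt; a sketch, where the service name and credentials are placeholders (self-hosted agent services are typically named vstsagent.<org>.<pool>.<agent>):
REM run from an elevated command prompt; service name, account, and password are placeholders
sc config "vstsagent.ORG.POOL.AGENT" obj= ".\myadminuser" password= "mypassword"
net stop "vstsagent.ORG.POOL.AGENT" && net start "vstsagent.ORG.POOL.AGENT"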

monitor folder and execute command

I'm looking for a solution to monitor a folder for new file creation and then execute a shell command on the created file. The scenario is: I have a host machine that runs a virtual machine, and they share a folder. What I want is that when I create or copy a new file into that shared folder on the host machine, the VM should be able to detect the change. I have tried incron and inotify, but they only work when I do the copy or create as a user inside the VM. Thanks
Method 1 in this answer may help: Bash script, watch folder, execute command
Just run that script in your VM, and you should be able to detect changes made by the host.

Doxygen not running when started with the Windows task scheduler

I've started using Doxygen to document my team's project source code (we have C#, Objective-C, and Android/Java projects). I wrote up a Windows batch script which checks out the latest trunk versions of each project and uses the command-line Doxygen to generate HTML documentation sites and publish to a directory on the local file system which IIS 7 already hosts. This batch script works perfectly and does everything it needs to, though it takes 10 - 20 minutes to run completely.
Now I'm trying to automate the process so that it will run at the end of every day. I added a scheduled task which simply runs the batch script. Every part of the script seems to work except for the Doxygen part. I can log into the machine and watch the file system and see that working copies are being checked out with no problem and the cleanup stuff works. However it never generates the Doxygen HTML output. The output/target directories Doxygen is configured to use will stay empty every time. I'm finding no error messages of any kind (in Scheduled Tasks and eventvwr). It doesn't work whether I let the task scheduler start it on its own or I tell it to run the task now. As said earlier, I can double-click the batch file and run it normally and everything works fine that way.
The process runs on our development server, an older Dell workstation running Windows Vista Business 32-bit. I have the scheduled task running under the "System" account, though I have also tried "Local Service" and my own Active Directory domain account (which is an administrator on this server), and it still doesn't work.
Has anyone else successfully used the task scheduler to automate Doxygen? I have no idea what I'm doing wrong. What should I look for next?
I can post slightly anonymized versions of my batch file and Doxygen config files if necessary.
In your batch file, try adding redirection of the doxygen output to a log file. Then run it through the scheduler and see what output was generated. If doxygen encounters an error when run that way you should see it in the log file.
doxygen doxyfile > doxygen.log 2>&1
Also make sure that your bat file runs correctly even when invoked from a directory other than the one where the doxygen files reside. When run through the task scheduler, I think the current directory will be c:\windows\system32, so try this:
c:\windows\system32>c:\path\to\batchfile\mybatch.bat
If that gives path errors, you have to fix them.
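Combining both suggestions, the batch file can make itself independent of the scheduler's working directory and capture the doxygen output; a sketch, assuming the Doxyfile sits next to the batch file:
REM %~dp0 expands to the directory containing this batch file
cd /d "%~dp0"
doxygen Doxyfile > doxygen.log 2>&1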

Windows Scheduled Task not working for PHP script

I'm using Windows Server 2003 and trying to execute a PHP script every day at 9 AM. Here's the command in the task's Run field:
"C:\Program Files\Zend\ZendServer\bin\php.exe" C:\Program Files\Zend\Apache2\htdocs\signout\teachernotify.php
It's supposed to execute the PHP script above, which should e-mail me, but it doesn't work. Accessing the script through a browser works, but not through Scheduled Tasks.
When I run it through the task scheduler, it acts as though something happened, but it doesn't appear to work.
Any advice?
Thanks
It sounds like it could be a permissions problem. When run through Apache via a web browser, the permissions would be based on whatever permissions Apache is running under. As a scheduled task, the permissions are based on the user specified for the task.
Make sure that the user specified in the "Run as" field in the task settings dialog has permissions to send an email.
Based on your most recent comment, it seems the problem is that the filename needs double quotes around it. I failed to notice that in the original question.
"C:\Program Files\Zend\ZendServer\bin\php.exe" "C:\Program Files\Zend\Apache2\htdocs\signout\teachernotify.php"
