I installed 3 scheduled tasks and triggered them at Windows startup.
After disabling hibernation, this worked fine for only one task, seemingly at random, after several tries.
All of them run the same exe file with different parameters.
I then created 3 binary folders, so the exe file and its binaries live in 3 different locations, and it turned out that this works.
The problem is that I want to create many more scheduled tasks with the same exe, called with different arguments.
So the question is: do I really have to copy a new folder for every new task?
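A separate folder per task should not be required; one possibility is to register each task against the same exe with a different argument string. A minimal sketch, assuming schtasks is acceptable (the exe path, task names and arguments below are made-up examples):

```python
# Sketch: register several startup-triggered tasks that all run the SAME
# exe from ONE folder, differing only in their argument.
# The EXE path, task names and arguments are hypothetical examples.
import subprocess

EXE = r"C:\Tools\MyApp\app.exe"  # hypothetical location of the shared exe

def schtasks_command(task_name, argument):
    """Build the schtasks /Create command line for one task."""
    return [
        "schtasks", "/Create",
        "/TN", task_name,              # task name
        "/TR", f'"{EXE}" {argument}',  # same exe, different argument
        "/SC", "ONSTART",              # trigger at Windows startup
        "/F",                          # overwrite an existing task
    ]

if __name__ == "__main__":
    for i, arg in enumerate(["--job a", "--job b", "--job c"], start=1):
        cmd = schtasks_command(f"MyApp_Task{i}", arg)
        print(" ".join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment on Windows to register
```

Task Scheduler itself places no restriction on several tasks sharing one executable, so if this registers cleanly, the per-folder copies should not be needed.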
I would like to examine the possibility of starting a program with Task Scheduler. The program starts with a different config .ini file depending on the network status (online/offline).
This is managed with a script today, but I'm curious whether this is possible to do with Task Scheduler.
I'm not asking for a complete solution, just whether I should even consider it.
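Task Scheduler alone cannot branch on network status, but it can launch a thin wrapper that does the branching and then starts the real program. A minimal sketch of such a wrapper (the probe host/port, the .ini names and the program path are assumptions, not taken from the question):

```python
# Sketch: a wrapper script that Task Scheduler starts; it probes the
# network and launches the program with the matching .ini file.
# Probe host/port, .ini names and the program path are all assumptions.
import socket

def is_online(host="8.8.8.8", port=53, timeout=2.0):
    """Crude connectivity probe: can we open any TCP connection?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_config(online):
    """Map network status to the config file the program should load."""
    return "online.ini" if online else "offline.ini"

if __name__ == "__main__":
    config = pick_config(is_online())
    print(config)
    # import subprocess
    # subprocess.run([r"C:\Tools\app.exe", "/config", config], check=True)
```

So the scheduled task would point at this wrapper rather than at the program directly; the branching logic stays in one small script.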
I have written a script in Python 3.8 that takes two file names as command-line args and does some operations. Now I wish to add the .exe file of that script to Windows 10's context menu.
I manually added an entry to the registry under "HKEY_CLASSES_ROOT\*\shell\MY_APP\command", but what it does is execute the .exe file twice instead of passing the selections as multiple parameters to a single instance of my app.
Ask:
How can I make it execute only once, passing the multiple selected file names as parameters to a single app instance?
How can I build an installer that adds the entry to the registry automatically when the user installs my app?
Thank you
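For reference, the static registration in question can be expressed as a generated .reg file. A sketch (the verb name "MY_APP" and the exe path are placeholders), which also shows why each selected file spawns its own process:

```python
# Sketch: emit a .reg file for a context-menu verb on all file types ("*").
# "MY_APP" and the exe path are placeholders. The "%1" in the command
# receives ONE selected file per invocation, which is why a plain static
# registration launches the exe once per selected file.
def make_reg(exe_path, verb="MY_APP"):
    key = rf"HKEY_CLASSES_ROOT\*\shell\{verb}\command"
    escaped = exe_path.replace("\\", "\\\\")  # .reg strings escape backslashes
    return (
        "Windows Registry Editor Version 5.00\n\n"
        f"[{key}]\n"
        f'@="\\"{escaped}\\" \\"%1\\""\n'
    )

if __name__ == "__main__":
    print(make_reg(r"C:\Tools\my_app.exe"))
```

On the installer side of the question: installer frameworks can write this key at install time; Inno Setup, for example, has a [Registry] section for exactly this purpose.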
The classic static verb registration you are doing here is always going to launch multiple instances.
There are two possible workarounds:
Implement IExplorerCommand or IDropTarget COM shell extensions.
When your application is started, try to find another instance already running and pass it the new path with some kind of IPC, for example the WM_COPYDATA message.
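A minimal sketch of that second workaround, with one substitution: WM_COPYDATA is Windows-only, so this uses a fixed localhost port as the IPC channel instead (the port number is an arbitrary choice, and nothing here is taken from the asker's app):

```python
# Sketch of workaround 2: the first instance listens on a fixed localhost
# port; every later invocation hands its path over and exits.
# WM_COPYDATA would play the same role on Windows; the port is arbitrary.
import socket

PORT = 48231  # arbitrary fixed port that identifies "my app is running"

def send_to_running_instance(path, port=PORT):
    """Hand the path to an existing instance; False if none is running."""
    try:
        with socket.create_connection(("127.0.0.1", port), timeout=1.0) as s:
            s.sendall(path.encode("utf-8"))
        return True
    except OSError:
        return False

def serve_once(srv, handle):
    """First instance: accept one handover and pass the path to `handle`."""
    conn, _ = srv.accept()
    with conn:
        data = conn.recv(65536)
    if data:
        handle(data.decode("utf-8"))

if __name__ == "__main__":
    # Startup flow: try the handover first; otherwise become the server.
    if not send_to_running_instance("demo.txt"):
        print("no instance running; this one would now listen on", PORT)
        # srv = socket.create_server(("127.0.0.1", PORT))
        # then loop: serve_once(srv, process_file) alongside normal work
```

With this in place, the context-menu verb can keep launching one process per file; all but the first exit immediately after handing their path to the surviving instance.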
Our software project uses Inno Setup to roll it out to customers. We found that on some target computers it takes a long time for some DLLs to be copied to the system32 directory (about 2 minutes per file). The first suspicion was that these files were being closely monitored by the virus scanner, but it has nothing to do with the specific files themselves: if I change the order of the DLLs to be copied, it is always the first 3 files that take this long. Because of this behaviour, we can rule out that the virus scanner finds something peculiar within the files.
I found another strange thing: in the PrepareToInstall function the installer executes vcredist_x86_vc++_2013.exe. If I exchange this file for another executable that is packed within the installation, the problem persists. But if I disable the execution of the prerequisite installation file, all DLLs are later copied within a fraction of a second.
Summarised: if I run a prerequisite file (it doesn't matter which) beforehand, the first 3 or 4 DLLs copied afterwards take a very long time.
Has somebody here seen a similar problem, or does anyone know a way to solve it? Any help is welcome.
Other information:
It's the same on both Win7 and Win10
The user has local administrative rights
I have attached a script to a folder to run whenever something is added to the folder. The problem is that if multiple files are added one after another, the process gets queued. How can I make the script run even if it is already running for another file?
You can't stop the process from queuing. The "on adding folder items to xxfolder after receiving xxfiles" handler gets all files dropped at the same time (xxfiles is a list). When the system takes too much time to add all the files (e.g. a copy over a slow network), it splits the list of added files into sub-lists, each one triggering a call to the script. That's the way it has been built! But what trouble do several calls cause compared to only one? As long as all the added files are processed, it should be OK.
FYI, the system makes periodic calls to check folder actions, using the launchd process, but it is not documented where the period is set.
I am trying to keep two directories synchronized with the same files in them.
Files are dropped into Directory A throughout the day. I would like to create a file watcher script that will copy files from Directory A to Directory B as soon as they are dropped.
My thought was to run the job every minute and simply copy everything that dropped in the last minute, but I am wondering if there is a better solution out there.
I'm running MKS toolkit under Windows. Different servers, same operating system.
Thanks for your help!
If you use Linux, you can hook into the kernel using the inotify API to get notified when something in a folder changes. There are command-line tools such as inotifywatch(1) as well.
To copy the files, I suggest using rsync(1): it is clever, knows how to clean up after itself, and it creates new files hidden while they are copied, so users and programs are less likely to pick them up before they are complete.
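If inotify is not an option (the question mentions MKS Toolkit on Windows), the run-every-minute idea from the question can be a small polling script. A minimal sketch, which also borrows rsync's hidden-temp-file trick (directory names and the .part suffix are arbitrary choices):

```python
# Minimal polling sketch of the Directory A -> Directory B copy: each pass
# copies files that B lacks or that have changed in A. Borrows the rsync
# trick of writing to a hidden temp name so readers never see partial files.
import shutil
from pathlib import Path

def sync_once(src, dst):
    """Copy new/changed files from src to dst; return the copied names."""
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in src.iterdir():
        if not f.is_file():
            continue
        target = dst / f.name
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            tmp = dst / ("." + f.name + ".part")  # hidden while copying
            shutil.copy2(f, tmp)                  # copy2 keeps the mtime
            tmp.replace(target)                   # rename once complete
            copied.append(f.name)
    return copied

if __name__ == "__main__":
    import tempfile
    a, b = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
    (a / "report.csv").write_text("x")
    print(sync_once(a, b))
    # a real watcher would loop: sync_once(...); time.sleep(60)
```

Because copy2 preserves the modification time, an unchanged file is skipped on every subsequent pass, so the minute-by-minute loop only ever moves new or updated files.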