VBScript to close windows, log off and log back on - vbscript

I realize that this might sound ambiguous at first, but just hear me out.
I need a VBScript to run either locally or remotely to close some windows (I have already done this locally with SendKeys), and I can log the account off too. However, since a VBScript can't run unless the account that launched it is still logged in, I'm curious whether anyone has any ideas. I've thought about starting a script on my local machine to run the script on the server, then having a separate script log it back in, but that is tedious, and I need it done hopefully with just one click on my local machine (as many scripts as needed can run on the server). Anyone have ideas?
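A minimal sketch of the SendKeys approach mentioned above (the window title is a placeholder and has to match the window being closed):

    ' Bring a window to the foreground by title and send Alt+F4 to close it
    Set sh = CreateObject("WScript.Shell")
    If sh.AppActivate("Untitled - Notepad") Then
        WScript.Sleep 500      ' give the window a moment to take focus
        sh.SendKeys "%{F4}"    ' Alt+F4 closes the activated window
    End If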

Related

How to run script as admin on startup

I have a case where my users run a script (a .bat file) that I wrote, on Windows 7, as admin. I'm looking for a simple way (without installing any tools) to make a different script that I wrote run on Windows startup through this batch file.
I tried using the Startup folder, but that runs my script without admin rights. I also read about a solution with the runas command, but it didn't work, and it's also problematic to know the user details in advance. I looked online but couldn't find anything to help me do this automatically through the command line.
UPDATE
Looking at the answers, I'm thinking the situation may not be clear enough.
I write this script on my PC and give it (a batch file) to my clients, who know how to do only simple things, such as opening cmd as admin and running the batch file I wrote in advance. To sum up, I need this batch file to set a process (a different batch or VBS file) to run with admin privileges on startup of the PC (again, without requiring my clients to do anything complicated; I'm hoping my script can do everything for them).
If you do not have the credentials for the administrator account, you will not be able to run the script with elevated privilege. If you do have those credentials, then you can set up a scheduled task (described at this SevenForums post), running it under the administrator account.
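A minimal sketch of that scheduled-task approach (the task name, script path, and account are placeholders; /rp * prompts once for the password):

    REM Create a logon-triggered task that runs the batch file elevated
    schtasks /create /tn "MyStartupScript" /tr "C:\Scripts\startup.bat" /sc onlogon /rl highest /ru MYPC\Administrator /rp *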
Check the script carefully, and ensure that it's not incorporating anything that may cause problems, like an unavoidable GUI presentation - this question on ServerFault discusses that pitfall.
Make a shortcut to your batch file, set it to run as administrator under Properties > Advanced, and then move the shortcut to the Startup directory.

Automation: How to run a program on multiple Windows 7 PCs?

I have a program that I need to run on multiple PCs (>100), all in the same domain and sharing the same user and password.
If I run the program manually, it opens a console window where I can see the logs scrolling. I can Remote Desktop in and still see the console window open and the logs running. This helps with debugging, as I can see what's going on. The issue is that I want to avoid running it manually on all PCs.
I have the following requirements:
a. It should be able to run the program on multiple PCs remotely.
b. The program should open in the foreground, where the console logs can be seen.
c. The launching PC should launch the program on PC1 and then (without waiting for the program to finish on PC1) launch the program on PC2, and so on.
I explored STAF; it needs to be installed on all stations, it's heavyweight, and it needs complex configuration.
I explored PowerShell; remoting needs to be enabled on all stations, and it also probably (I read somewhere) has issues with running tasks in the foreground.
PsExec can run the program, but it does so in the background; if I use the -i option I need to give a session id, so it will only show the console for that specific Remote Desktop session. Another major issue with -i is that it is interactive, so it waits until the program exits.
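For reference, the PsExec invocation described above looks roughly like this (host, credentials, and session id are placeholders); with -i it shows the console only in the given session, and PsExec waits for the program to exit:

    psexec \\PC1 -u MYDOMAIN\user -p password -i 2 C:\tools\myprogram.exe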
I looked at PAExec; it's similar to PsExec and doesn't solve the issue I am facing with PsExec.
Any help is greatly appreciated.
If you have a server, you can put the program on the server and create a shortcut to it (assuming the program is small/simple enough to run on a network share).
You can use One-Click to deploy the programs, and give each of your users a link.
You can change the program into a website for central access.
You can use group policy to deploy the program when a user logs in, assuming you can create a .msi file from it.

Run an AppleScript from a server/in the cloud

Is it possible to run AppleScripts from a server or from a cloud service?
I want to have some scripts that can run if my computer is sleeping/off.
I looked around a bit on Google, but haven't found anything promising.
If this doesn't exist, I'll basically need to remove the password from my computer and wake it up whenever the script needs to run.
It largely depends on what you want to do with the script. There are a few options.
You can use 'stay open' script bundles that, for example, check a certain folder and run when you interact with that folder (see the idle-handler sketch after these options).
You can launch certain scripts when the server boots.
You need to have a server that is always on for this to work. I have this running myself and it works just fine. However, as I said before, it largely depends on what you want to do with your scripts.
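As an illustration of the 'stay open' option, the usual pattern is an idle handler saved as a stay-open application (the interval and the work itself are placeholders):

    on idle
        -- check the folder, run the sync, or do whatever periodic work is needed
        return 300 -- run this handler again in 300 seconds
    end idle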

I get "access denied" when my build script is running

I have a huge .bat file build script that calls into compilers and what not.
I have until recently been able to run this without any problems (and as I have not made any changes, I suspect there's a Windows update and/or setting at play?)
Using Windows 7
I am logged in as Administrator
I run the command line with "Run as administrator"
My build script then compiles and runs my application the first time (the application shuts itself down, which I have confirmed in Task Manager).
This goes well.
But when it tries to delete the file (the application), I get "access denied".
This basically stops me completely, because it is a large build system that, when not automated, takes a very long time and is very error-prone to run manually.
The kicker is that if I try to delete the file in Windows Explorer I get "need admin rights", which is kind of peculiar since... I am already an administrator... and I have also run Windows Explorer with "Run as administrator". However, after some time and an attempted rename it seems to budge and allow the file to be deleted. (I have checked, and there is no process listed in Task Manager that should be "holding" the file in any way, so I tend to think this is an access rights issue... somehow.)
I am... Open to ideas? :)
I have decided to move this question to:
https://superuser.com/questions/493249/windows-file-access-denied
as I think it is my Windows that is messed up somehow :(
I will still monitor this question on SO, but I think this is a peculiar Windows issue I am experiencing.

Script to automatically sync on directory/file modification between a Mac OS X machine and a Linux machine

I have some source code on my Mac, and in order to test I'm interested in synchronizing it with a VM containing a similar web server setup to the production environment. Therefore I need to be able to automatically copy files over to the VM every time there are changes.
I know I can use rsync to do this manually whenever a script is run but I need some way of getting it to run in the background every single time a file in a particular directory or one of its sub-directories is modified.
I know inotifywait exists on Linux machines and could solve this problem. I've also read about the FSEvents API and kqueue. However, none of these seem to be accessible from the command line and I really don't want to spend a long time making something to do this...
I guess I could use a cronjob but a minute is a pretty long time to wait to see changes on a website...
Any ideas?
I do this all the time, developing on a Windows/Linux/Mac workstation, and saving changes to a remote Linux server where they're immediately served back to my workstation's browser for testing. You've got a couple options:
You could mount the remote files locally (like via sshfs; see the sketch after these options) and make changes directly to them. I.e., your Mac thinks the files are local, so you can edit them with your GUI editor, but when you File->Save, it actually saves the file remotely. The main downside to this is that you can't work when disconnected from the server.
Mount the local files remotely. This would allow you to work locally while disconnected but won't allow the test site to work when disconnected -- which may not be a big deal. This option might not be doable if you don't have the right tools/access on the remote server.
(My preference.) Use NetBeans IDE, which has a very nice "copy to remote" feature. You maintain a full copy of all files locally, and edit them directly. When you hit File->Save on a file, NetBeans will save it locally and transparently scp/ftp it to your remote server.
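For the first option above, the sshfs mount is essentially a one-liner (host and paths are placeholders):

    mkdir -p ~/mnt/site
    sshfs user@server:/var/www/site ~/mnt/site   # edit files under ~/mnt/site with any local editor
    umount ~/mnt/site                            # unmount when done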
How about using a DVCS like git or mercurial, and having the local repo run post-commit hooks to run the rsync and then the test itself?
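A sketch of that hook, assuming a Git repo with placeholder paths and test command (the hook file must be executable):

    #!/bin/sh
    # .git/hooks/post-commit -- runs after every local commit
    rsync -az -e ssh ~/project/ user@vm:/var/www/site/
    ssh user@vm '/var/www/site/run_tests.sh'    # then kick off the test remotely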
I'm a bit confused about why you can't just run rsync from the same script that runs the test. If you run rsync -e ssh you can set up automatic public key authentication between the VM and the Mac. There won't be anything manual about the rsync in that case.
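For example, assuming the VM is reachable over SSH (hostnames and paths are placeholders):

    ssh-keygen -t ed25519      # one-time: create a key pair, accept the defaults
    ssh-copy-id user@vm        # one-time: install the public key on the VM
    rsync -avz -e ssh ~/project/ user@vm:/var/www/site/   # after that, the sync needs no manual input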
You might be able to set up a launchd agent to do what you want for a simple setup. See this question and the man page for launchd.plist for more information about the launchd WatchPaths key. But it looks like WatchPaths may not work for changes within sub-directories.
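A minimal WatchPaths agent sketch with placeholder paths, saved to ~/Library/LaunchAgents and loaded with launchctl load (it assumes key-based SSH auth is already set up, as described in the previous answer):

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>Label</key>
        <string>com.example.syncproject</string>
        <key>ProgramArguments</key>
        <array>
            <string>/usr/bin/rsync</string>
            <string>-az</string>
            <string>-e</string>
            <string>ssh</string>
            <string>/Users/me/project/</string>
            <string>user@vm:/var/www/site/</string>
        </array>
        <key>WatchPaths</key>
        <array>
            <string>/Users/me/project</string>
        </array>
    </dict>
    </plist>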
