This may be tricky or straightforward enough, but I'm trying to run a shell script every time a file is opened, to stop people accessing in-use files. The file is FUSE-mounted from a Google Cloud Storage bucket onto an Ubuntu VM, with a copy living in another bucket. When a user opens a file, I want to run a shell script that sets the copy to read-only. Does anyone know a way, or a potential avenue I could explore, to achieve this?
Thanks in advance
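One avenue worth exploring is inotify-tools, which can run a command whenever a file is opened. Below is a minimal sketch, assuming inotifywait is installed and that open events are actually delivered for the gcsfuse mount point (worth verifying, since only opens made through this VM will be seen); the mount path and the lock-copy.sh script name are placeholders for your own:

    #!/bin/bash
    # Watch the FUSE-mounted bucket and react to every file open.
    WATCH_DIR=/mnt/gcs-bucket   # placeholder gcsfuse mount point

    inotifywait -m -r -e open --format '%w%f' "$WATCH_DIR" |
    while read -r opened_file; do
        # Call your existing script that marks the copy in the other
        # bucket read-only; lock-copy.sh is a placeholder name.
        /usr/local/bin/lock-copy.sh "$opened_file"
    done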
Related
I have never written AppleScript in my life (I've done a few PowerShell and batch files for work, but I am by no means an expert on those either). I've used Automator before to do similar things, but the program that needs to run the command only accepts AppleScript.
I need an AppleScript that can do the following:
Mount a network share (a Windows machine's network share, if that matters; the username and password are stored in Keychain)
Move (NOT copy) all the contents from a folder on my MacBook to that share
Unmount that share.
Alternatively, I already have this in an Automator workflow, so if there is a way to convert/export/whatever a workflow into an AppleScript, that would be awesome too!
Any help would be greatly appreciated!
I am working on building an application that will pull data from an SFTP server.
Basically, I have written a shell script that runs as a daily cron job.
Now I want to know if I can implement some logic in the shell script that will scan the files for security threats (such as viruses, worms, Trojans, adware, etc.) before downloading them. Is that possible, and how?
You basically want to "remote control" your SFTP session. There are several ways to do this, but I don't see how you can scan a file for a virus using the (S)FTP protocol without downloading it. That would require executing a program on the remote side, and as far as I know, FTP does not support this.
Maybe ssh would be the tool of choice. First open an ssh session, do all the checking of the files, then transfer them with FTP. If you are really paranoid, you can calculate the MD5 sum of each file before the check and after the download, and verify that they are identical.
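A minimal sketch of that approach, assuming ClamAV (clamscan) is installed on the remote host, key-based ssh login is set up, and that the host and paths below are placeholders for your own:

    #!/bin/bash
    REMOTE=user@sftp.example.com          # placeholder host
    REMOTE_FILE=/data/export/report.csv   # placeholder file
    LOCAL_DIR=/var/incoming

    # 1. Scan the file on the remote side before touching it.
    ssh "$REMOTE" "clamscan --no-summary '$REMOTE_FILE'" || exit 1

    # 2. Record the remote checksum, then download over SFTP in batch mode.
    remote_md5=$(ssh "$REMOTE" "md5sum '$REMOTE_FILE'" | awk '{print $1}')
    echo "get $REMOTE_FILE $LOCAL_DIR/" | sftp -b - "$REMOTE"

    # 3. Verify the local copy matches what was scanned.
    local_md5=$(md5sum "$LOCAL_DIR/$(basename "$REMOTE_FILE")" | awk '{print $1}')
    [ "$remote_md5" = "$local_md5" ] || { echo "Checksum mismatch" >&2; exit 1; }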
I need to move entire directories from one computer on the network to another, in a platform-independent way. Basically, I am working on an automation tool to help developers do Build Verification Tests; for this, I have been directed to automate the installation and uninstallation of the product on multiple platforms. So, I first need to copy the files!
And this is where I need some help, both conceptual and practical.
Firstly, let me mention that using something like FileZilla or WinSCP is out of the question since I need things to happen automatically and not through button clicks. But please let me know if these tools have any command line utilities!
I tried Perl's Net::FTP, and while it looked promising, I was wondering whether it was the best way to go. Also, I want to know what the prerequisites are before I can use FTP: would I need Perl installed on the other end as well? I keep reading that Perl's FTP commands actually connect to an FTP host; does this mean it won't work if I haven't configured the remote host in some way? And if so, what is this extra piece of configuration?
Apart from this, is there any other way I could solve my problem? I am looking for APIs that would help me do platform-independent file transfers. But once again, I cannot use tools that require button clicks, because I am doing automation and everything needs to be done programmatically and automatically.
Also, I think this is a very generic problem statement: "moving files across computers connected by a LAN". So it would be wonderful if we could have a list of (possibly) many options for solving the problem in the form of answers to this post.
Thanks in advance for any help that you wish to provide.
If nearly all of the files in your directory have changed, creating an archive, sending it over the network, and unarchiving makes sense. If your LAN is fast enough, though, it may be faster not to compress the archive; just use plain tar.
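As a rough sketch of the archive-and-send approach (host and paths are placeholders), you can stream the tar archive straight over ssh without writing an intermediate file:

    # Uncompressed tar piped over ssh; use czf/xzf on both ends if you
    # do want compression after all.
    tar -cf - /path/to/files | ssh username@hostname 'tar -xf - -C /store/here/remotely'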
If only some of the files have changed, rsync, a command-line tool, will transfer only the changes. It can be used with ssh like this:
rsync -a -e ssh username@hostname:/path/to/files /store/here/locally
http://www.thegeekstuff.com/2010/09/rsync-command-examples/
On Linux and OS X, cron and crontab allow you to schedule scripts to run periodically. Windows provides the Windows Task Scheduler.
FTP is fine if you don't care about encryption over your LAN. Otherwise, SSH would be preferable.
rsync is available on OS X and Linux, but I think you can use it on Windows through Cygwin.
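Putting those pieces together, a crontab entry like this (reusing the placeholder host and paths from the rsync example above) would pull the changes every night at 2 a.m.:

    # m h dom mon dow  command
    0 2 * * * rsync -a -e ssh username@hostname:/path/to/files /store/here/locally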
I suggest making an archive (e.g. a .tar.gz file) on the source host, transferring it with scp, and unarchiving it on the target host.
You could also use unison or rsync.
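A minimal sketch of that sequence, with the file, host, and directory names as placeholders:

    tar -czf build.tar.gz /path/to/build                               # archive on the source host
    scp build.tar.gz username@hostname:/tmp/                           # transfer it with scp
    ssh username@hostname 'tar -xzf /tmp/build.tar.gz -C /opt/deploy'  # unarchive on the target host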
I would suggest developing your own FTP client in .NET. This way you will have complete control over the application, and instead of button clicks you can schedule it using the Windows Task Scheduler. Here is an article about how to create your own FTP client in VB.NET:
http://dot-net-talk.blogspot.com/2008/12/how-to-create-ftp-client-in-vbnet.html
Are there any FTP programs which can automatically copy (or rather 'move') the contents of a folder to a remote server? I have of course googled this, but only really found one or two ancient products that look really clunky and unmaintained. I was wondering if there's a way to do this from the command line, or any better solution to the underlying problem.
In more detail: new files get written to a folder every few hours. These new files need to be FTP'd elsewhere and then deleted. Mirroring or synchronisation systems are probably out of the picture, as we need to delete the source files once they've been successfully transferred.
If it's easier, the 'solution' could pull the files off the server (rather than the server pushing them to the client). Both computers will be running Windows.
You could use any off-the-shelf FTP program that supports the command line and schedule a task in the Windows Task Scheduler to run every 10 minutes: check the folder, and move any files to the FTP site.
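As a rough sketch of that idea using lftp (scriptable from the command line, though on Windows it would need something like Cygwin; the host, credentials, and paths are placeholders, and it assumes your lftp build supports --Remove-source-files):

    #!/bin/bash
    # Push everything in the outgoing folder, then delete the local files
    # that transferred successfully (mirror -R uploads local -> remote).
    lftp -u username,password -e "mirror -R --Remove-source-files /local/outgoing /remote/incoming; quit" ftp.example.com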
In the end I used a program called FTP Auto Sync: http://ftp-auto-sync.com/
I like the functionality of Dreamweaver where you can add a site, define an FTP server, and then when you save a file it saves a local copy and also uploads the file via FTP. I am trying to get similar functionality on Linux. What I have thought of doing is having inotify monitor a local folder and upload any new or changed files to an FTP site, but I am having a hard time finding information on this. Any ideas on how I can accomplish this?
Also, I do not want to install any programs on the FTP server.
Thanks
Dean
You might want to take a look at scheduling an rsync job with cron, which will efficiently copy changed files across the network at a chosen interval. rsync uses ssh or rsh (not FTP), so this might not work for your setup, but it would seem the better way in most cases.
I'd throw together a Python script which uses inotify and scp/FTP.
These are all common and should be supported by whatever distro you're using. They're also all pretty well documented.
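If a shell script is an easier starting point than Python, the same idea can be sketched with inotifywait (from inotify-tools) and scp; the watch directory and destination below are placeholders, and it assumes key-based ssh login so the loop is never blocked by a password prompt:

    #!/bin/bash
    WATCH_DIR=/home/dean/site                # placeholder local folder
    REMOTE=user@example.com:/var/www/site    # placeholder destination

    # Push every file that finishes writing or is moved into the folder.
    inotifywait -m -e close_write -e moved_to --format '%w%f' "$WATCH_DIR" |
    while read -r changed_file; do
        scp "$changed_file" "$REMOTE/"
    done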