Approach to automatically upload updated assets to FTP

Say I need to track about 10 large files (each over 1 GB) in a LAN shared folder, and whenever a file is modified, updated, or overwritten, I need to upload the new version to an FTP server and send some kind of notification, such as an email.
I don't need version control for the older copies, so nothing should inflate a repository.
What would be an easy approach to this, given that it needs to keep monitoring the files, speak FTP, and send email notifications? Would writing a C# program be a feasible way to do this? Any other suggestions? Thanks.
EDIT:
After reading a couple of SO threads about version-controlling large binaries, it seems I don't need a version control system at all. I just need to keep checking the file metadata and, once it changes, upload the file to FTP and send out an email. I'm not sure whether the Mogware tool can be configured not to keep backups of the old binaries?
Are there any other tools that would fit this specific case? Thanks!

I'm in no way affiliated with the guys at Mogware, but they have a tool called "FileHamster", a personal revision tool that lets you track file changes, call scripts after a file has changed, and so on. It also provides FTP upload.
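If you would rather roll your own than use a third-party tool, a small C# console program along these lines could cover the check-upload-notify loop the question describes. This is only a rough sketch using the stock .NET WebClient and SmtpClient; the file list, FTP URL, SMTP host and email addresses are placeholders.

    // Rough sketch: poll file metadata, upload changed files to FTP, then send an email.
    // All paths, server names, credentials and addresses are placeholders.
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Net;
    using System.Net.Mail;
    using System.Threading;

    class ShareWatcher
    {
        static readonly string[] Files =
        {
            @"\\fileserver\assets\big1.dat",
            @"\\fileserver\assets\big2.dat"
        };
        static readonly Dictionary<string, DateTime> LastSeen = new Dictionary<string, DateTime>();

        static void Main()
        {
            while (true)
            {
                foreach (var path in Files)
                {
                    var info = new FileInfo(path);
                    if (!info.Exists) continue;

                    DateTime previous;
                    // On the very first pass every file counts as "changed" and gets uploaded once.
                    if (!LastSeen.TryGetValue(path, out previous) || info.LastWriteTimeUtc > previous)
                    {
                        Upload(path);
                        Notify(path);
                        LastSeen[path] = info.LastWriteTimeUtc;
                    }
                }
                Thread.Sleep(TimeSpan.FromMinutes(5));   // polling interval
            }
        }

        static void Upload(string path)
        {
            using (var client = new WebClient())
            {
                client.Credentials = new NetworkCredential("ftpuser", "ftppassword");
                client.UploadFile("ftp://ftp.example.com/assets/" + Path.GetFileName(path), "STOR", path);
            }
        }

        static void Notify(string path)
        {
            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                smtp.Send("watcher@example.com", "admin@example.com",
                          "Asset updated: " + Path.GetFileName(path),
                          path + " changed and was uploaded to FTP.");
            }
        }
    }

For files over 1 GB you would probably also want to confirm the size has stopped changing before uploading, so a file isn't caught in the middle of being copied onto the share.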

Related

Coreftp won't delete source

I'm using Core FTP to automatically pull files daily from an external FTP server via SFTP. I'm able to pull the files; however, despite using the 'delsrc' flag, it won't actually delete the source files, so they may build up over time. I think it may have to do with the fact that I can't push to the server, yet I can delete the files through the Core FTP GUI. Thanks for the help.
I recommend using "FluentFTP".
It is better than Core FTP.
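One caveat worth noting: FluentFTP speaks FTP and FTPS, not SFTP, so if the remote end really is SFTP-only you would need an SSH-based library instead. Assuming plain FTP/FTPS, the daily pull-then-delete could look roughly like this; the host, credentials and paths are placeholders, and exact method signatures vary a little between FluentFTP versions.

    // Rough FluentFTP sketch: download each remote file, then delete it at the source.
    using System.IO;
    using FluentFTP;

    class DailyPull
    {
        static void Main()
        {
            using (var client = new FtpClient("ftp.example.com", "user", "password"))
            {
                client.Connect();

                foreach (var item in client.GetListing("/outgoing"))
                {
                    if (item.Type != FtpFileSystemObjectType.File) continue;

                    var local = Path.Combine(@"C:\incoming", item.Name);
                    client.DownloadFile(local, item.FullName);

                    if (File.Exists(local))       // only delete once the local copy exists
                        client.DeleteFile(item.FullName);
                }

                client.Disconnect();
            }
        }
    }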

Automatically uploading text files to an FTP site

I'm looking to automate an upload of a text file to an FTP site. This upload would need to occur daily, and I have access to a server that would run whatever script needed to do the upload. I've looked around for a solution to this and found some information on howtogeek, but neither idea there seemed to be automatic. I'm looking to do this without third-party software if possible. I would appreciate any pointers.
If you're on Windows, I'd use VBScript (more functionality can be added easily) or a .bat file (if you don't need extra functionality) to call the built-in Windows ftp client, provided you don't need anything super secure. Just build the .bat file to call ftp and append the connection information accordingly, as sketched below. Then, to make it automatic, use Task Scheduler to schedule how you want the script to run.
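A minimal version of that setup could look like the following; the host, credentials, file names and schedule are placeholders, and the -s: switch feeds a plain-text command script to the built-in ftp.exe.

    upload.ftp  (plain-text command script read by ftp.exe, one command per line):
        open ftp.example.com
        myuser
        mypassword
        put C:\exports\report.txt
        bye

    upload.bat  (the batch file that Task Scheduler runs):
        ftp -s:C:\scripts\upload.ftp

    One-time registration of the daily task, e.g. every day at 02:00:
        schtasks /create /tn "DailyFtpUpload" /tr "C:\scripts\upload.bat" /sc daily /st 02:00

Keep in mind that plain ftp.exe sends credentials in the clear, which is why the answer above hedges with "provided you don't need anything super secure".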

Allow people to upload files to the same Dropbox folder (Ruby)

I am very new to this, so sorry if this is a naive question. I've been through the Ruby tutorial for the Dropbox API, but I'm still confused about where to start.
My situation:
I am running a copy shop, and usually my customers either bring their USB sticks, upload to Gmail and then download in my shop, or upload to Dropbox and then print their docs/PDFs/photos.
One day a customer asked me if I knew Dropbox. I said yes, I know it. He then asked whether it's possible to share a folder with him, so that he can put his files into the folder at home, then come to my shop, open it and print. Neat!
But other clients also want to use this service, and they don't want their files exposed to other people (private photos, secret business plans, important letters, etc.). The other problem is that I also want people who do not have Dropbox to be able to upload files to my Dropbox folder so that they can come and print.
The reason for Dropbox is that it's free for up to 18 GB, and since I can remove the files once a customer has printed, 2-10 GB will be enough for 1-3 days of buffering.
What I am thinking of is implementing a website that allows people to upload DOCs/PDFs/photos and saves these files to my Dropbox folder.
People who have Dropbox accounts would have a folder called copyshop in their Dropbox, and they would drop files in it as they usually do. On my side I would have a folder App/copyshop/, and everyone who puts files in their copyshop folder would appear in my Dropbox as a sub-folder under App/copyshop, e.g. App/copyshop/Tom, App/copyshop/Mary, etc.
Non-Dropbox users could upload through my website instead, which would then save the files to my Dropbox folder.
Is this possible with Dropbox API? From the official statement:
The API provides methods to read and write from Dropbox securely, so your users can bring all their important files with them to your app. Any changes they make will be saved back to all their computers, tablets and mobile phones.
It looks like this is not the recommended way to do it.
Thank you! Every reply is appreciated.
If you make a site to upload files, there is no need for Dropbox; just let them upload to a folder that is available in the shop.
Making it safe with Dropbox would be a lot of work. I suppose customers don't want their files exposed to others, so it's really only suitable for regular customers; for occasional customers, the best method I can think of is to let them create a public link to a Dropbox file and send it to you.
Another drawback of Dropbox is that the size of shared files counts against both the sharer and the recipient, so you could run into the quota limits.
You could also write a script that monitors a public Dropbox folder and, immediately on arrival, moves the files to a safe location not accessible to others.
I suppose FTP would be more manageable: you could give big customers their own folder and password, and give occasional users write-only access so they can't read other people's uploads.
Answer from a Dropboxer:
Yes, this would be possible. There are a number of ways you might do this, and the method you choose will be up to you, so I'll just touch on a few.
1. Without even using the API, you could have your customers enable and use this feature to send you a read-only link to any file or folder in their Dropbox: https://www.dropbox.com/help/167
This isn't a shared folder exactly, but it sounds like it should be sufficient.
2. You could use the API to build an app that would essentially do option 1, but help them along with it. Essentially, you would have them authorize your app, and then let them select a file or folder, on which you would call /files (GET) (or /shares if that is more convenient for whatever reason) to download or share the files.
Hope this helps!
Greg
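For what it's worth, the /files (GET) and /shares calls mentioned above belonged to Dropbox's Core API v1, which has since been retired (newer apps use API v2). Purely to illustrate the shape of the call being described, a sketch in C# might have looked like this; the access token, the customer path, and the local print-queue folder are placeholders.

    // Illustration only: the retired Core API v1 "files (GET)" call referred to in the answer.
    using System;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class DropboxV1FilesGet
    {
        const string AccessToken = "ACCESS_TOKEN";   // placeholder OAuth token

        static async Task Main()
        {
            using (var http = new HttpClient())
            {
                http.DefaultRequestHeaders.Add("Authorization", "Bearer " + AccessToken);

                // v1 returned the raw file contents from api-content.dropbox.com/1/files/auto/<path>
                var bytes = await http.GetByteArrayAsync(
                    "https://api-content.dropbox.com/1/files/auto/copyshop/Tom/order.pdf");

                File.WriteAllBytes(@"C:\printqueue\order.pdf", bytes);
                Console.WriteLine("Downloaded {0} bytes", bytes.Length);
            }
        }
    }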

Check in - Check out process/version control for PSDs and Image files

The title may not be very clear, but the issue I am facing is this:
Our designers are working on large Photoshop files across the network, which causes a number of network traffic and file corruption issues that I am trying to overcome.
The way I want to do this is to have the designers copy the files to their machines (Mac OS X) and work on them locally. But then the problem remains that they may forget to copy them back up, or that another designer may start work on the version stored on the network.
What I need is a system where a designer checks out the files or folders from the server, which locks those files so no other user can copy them until they are checked back in. We do not need to store revisions of the files.
My initial idea was to use SVN or, preferably, Git and somehow force a lock on checkout. Does this sound feasible, or is there a better system?
How big are the files on average? I'm not sure about Git as I haven't used it, but SVN should be OK. If you do go with SVN, I would trial checking out over HTTP/HTTPS versus a network path to the repo, as you may get a speed advantage out of one or the other. When we VPN to our repo at work, it is literally 100 times faster over HTTP than checking out using a network \\path to the repo.
SVN is a good option, but you will have revisions (this is the whole point of SVN). SVN doesn't lock files by default, but you may configure it so that it does. See http://svnbook.red-bean.com/nightly/en/svn-book.html#svn.advanced.locking
I don't know git very well, but since it's not a centralized VCS, I'm pretty sure it isn't the right tool for your situation.
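For reference, the locking workflow that the svnbook chapter describes is driven by the svn:needs-lock property plus svn lock/unlock. Roughly, with placeholder file names:

    # Mark the binaries so working copies keep them read-only until a lock is taken
    svn propset svn:needs-lock '*' design/mockup.psd
    svn commit -m "Require locks on PSD files"

    # A designer takes the lock before editing; nobody else can commit changes to it meanwhile
    svn lock -m "Editing the home page mockup" design/mockup.psd

    # Committing releases the lock by default, or it can be released explicitly
    svn commit -m "Updated mockup" design/mockup.psd
    svn unlock design/mockup.psd

Note that a lock only stops other people from committing changes to the file; it does not stop them from checking it out or copying it, so it addresses the two-people-editing-at-once problem rather than read access.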

Script to execute on CVS check-in, without access to the server?

Is it possible to write a script that executes certain instructions, and is triggered by any check-in to a CVS repository?
The script would scan the list of files in the change-set and do a copy operation on certain files in a certain sub-directory.
I would hopefully be able to execute various console applications, including ones written in .NET.
Problem is, I need this done quickly and I don't have access to the CVS server, due to corporate IT red-tape, etc.
Is there a way to set this up on one of the client workstations instead?
Can it be done without interfering with my working folder?
Can you get commit notifications by email? If so, you should be able to use maildrop (or good old procmail, etc.) to run arbitrary commands and scripts on your workstation when the commit notification mails arrive, along the lines of the sketch below.
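As a rough sketch, a procmail rule for that could look like the one below; the subject pattern and the script path are assumptions that would have to match your actual commit-notification setup.

    # ~/.procmailrc
    # Pipe CVS commit notification mails into a local script.
    :0
    * ^Subject:.*cvs commit
    | $HOME/bin/on-cvs-commit.sh

The script itself could then parse the changed-file list out of the mail body and do the copy operations described in the question.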
I found a .NET library that seems up to the task - SharpCVSLib.
http://csharpopensource.com/sharpcvslib.aspx
(Hopefully it will work on a developer workstation and not need to be hosted on the CVS server.)
