Syncing a file from a client to a server

I'm trying to keep a file updated in real time with the server. It's more like real-time syncing with a very small delay. Is there any application that lets me do this? Or would you suggest using a local host as the server?

I don't know how you are connected to your server, but I assume it will be something like SCP/SFTP/FTP, and I don't know your OS. WinSCP will do exactly what you need: you can set it to watch your filesystem (a specified folder) and it will update the files on the server as soon as a file on your drive changes.
It also supports command-line scripting, so you can use it from within your own applications.
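For example, a rough sketch of a WinSCP script using its keepuptodate command, assuming an SFTP connection; the host, credentials, host key, and paths are placeholders:
# sync.txt - watch the local folder and push any change to the server
open sftp://username:password@example.com/ -hostkey="ssh-ed25519 255 placeholder-fingerprint"
keepuptodate C:\local\project /home/username/project
You would run it with winscp.com /script=sync.txt /log=sync.log; keepuptodate keeps running and re-uploading changed files until you abort it.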

Related

Trouble Uploading Large Files to RStudio using Louis Aslett's AMI on EC2

After following this simple tutorial http://www.louisaslett.com/RStudio_AMI/ and video guide http://www.louisaslett.com/RStudio_AMI/video_guide.html I have set up an RStudio environment on EC2.
The only problem is, I can't upload large files (> 1GB).
I can upload small files just fine.
When I try to upload a file via RStudio, it gives me the following error:
Unexpected empty response from server
Does anyone know how I can upload these large files for use in RStudio? This is the whole reason I am using EC2 in the first place (to work with big data).
OK, so I had the same problem myself and it was incredibly frustrating, but eventually I realised what was going on. The default home directory size for AWS is less than 8-10GB regardless of the size of your instance. As the upload was going to home, there was not enough room. An experienced Linux user would not have fallen into this trap, but hopefully other Windows users new to this who come across the problem will see this. If you upload to a different drive on the instance, the problem is solved. As the Louis Aslett RStudio AMI is based in this 8-10GB space, you will have to set your working directory outside it, i.e. outside the home directory; this is not intuitively apparent from the RStudio Server interface. Whilst this is an advanced forum and this is a rookie error, I am hoping no one deletes this question, as I spent months on this and I think someone else will too. I hope this makes sense to you.
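A rough sketch of that workaround from the shell (the /data path and the larger attached volume are assumptions, not part of the AMI):
df -h                               # confirm the root/home volume is the small ~8-10GB one
sudo mkdir -p /data/uploads         # a directory on a larger attached EBS volume
sudo chown $(whoami) /data/uploads  # make it writable by your login user
Then set your RStudio working directory (and upload target) to /data/uploads.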
Don't you have shell access to your Amazon server? Don't rely on RStudio's upload (which may reasonably have a 2 GB limit) and use proper Unix dev tools:
rsync -avz myHugeFile.dat amazonusername@my.amazon.host.ip:
Run this on your local PC's command line (install Cygwin or another Unix compatibility layer if you are on Windows). It will transfer your huge file to your Amazon server and compress the data in transit; add -P (--partial --progress) if you want an interrupted transfer to resume from where it stopped.
For a Windows GUI for something like this, WinSCP is what we used back in the bad old days before Linux.
This could have something to do with your web server. Are you using nginx or Apache as your web server? If so, you can raise the upload limit. If you are running nginx on the front end, I would recommend the following fix in your nginx.conf file:
http {
    ...
    client_max_body_size 100M;
}
https://www.tecmint.com/limit-file-upload-size-in-nginx/
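After editing nginx.conf, a quick sanity check and reload (assuming nginx is managed by systemd on your instance):
sudo nginx -t                # validate the edited configuration
sudo systemctl reload nginx  # apply it without dropping connections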
I had a similar problem with a 5 GB file. What worked for me was to use SQLite to create a database from the CSV file I needed: use SQLite to create the database, then use a function in RStudio to communicate with the local database. That way I was able to bring the CSV file in. I can track down the R code I used if you like.
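For reference, a minimal sketch of that import using the sqlite3 command-line shell (database, table, and file names are placeholders; since the table does not exist yet, .import takes the CSV's header row as the column names):
sqlite3 mydata.db <<'EOF'
.mode csv
.import bigfile.csv mytable
EOF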

Saving CSV file on UNIX Server from windows based Lotus Server using Lotus Scripting

I have to write a script on a Lotus server, which is on a Windows server, to save a CSV file on a UNIX server. The UNIX server path requires authentication. Can somebody help me or suggest how to do it?
Thanks in advance.
Siddhartha
Could setting up an FTP server on Domino and accessing it from your UNIX server be an option?
Mindoo FTP server
I once resolved this in two steps:
1. Save the file to a temporary directory on the Domino server using LotusScript.
2. Create a scheduled task on the Windows server to copy the file to the second server (a sketch of this step follows below).
Advantages:
You can specify any user in the scheduled task and you don't have to care about accessibility of the other server.
Disadvantages:
Two separate processes.
Hope that helps.
Michael
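A rough sketch of what the scheduled task in step 2 might run, assuming the UNIX server exposes a Samba/CIFS share; the hosts, paths, and file names are placeholders:
rem copy_export.cmd - run by the scheduled task under an account with rights on the share
robocopy "D:\domino\temp\export" "\\unixhost\incoming" report.csv /R:3 /W:30 /NP /LOG+:D:\domino\logs\copy.log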
In my scenario, which was very similar to yours, I did the following:
On the Windows server, I created a mapped drive to the folder on the Unix OS. This also handled the authentication.
In the LotusScript agent, I extracted to this mapped drive, which worked 100%.
You need to provide more details. Presuming you can access the Unix folder from Windows Explorer, map the drive and let Windows store the password. Then access it through the mapped drive letter.
LotusScript can't write to UNC locations, so you need the drive letter.
That file will probably be picked up by another program. CSV is the worst approach for that. You could offer to write to a web service, or provide one.
Update
On Unix, "access" more often than not doesn't mean CIFS (a.k.a. Windows share) access, but SSH (or FTP). For SSH you would want to:
configure SSH keys, so you don't actually need a username/password any more
use a Java library, as asked on Stack Overflow before (or an alternative)
or write the file to a temp directory and call a cmd file for the copy operation (see the sketch at the end of this answer)
With a little care (make the cmd file configurable) this will keep working when you move your Domino server to Unix/Linux too.
Let us know how it goes
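For that temp-directory-plus-cmd-file route, a minimal sketch assuming PuTTY's pscp.exe is installed and the key in the .ppk file is already authorized on the UNIX host (paths, host, and user are placeholders):
rem push_csv.cmd - called after the agent has written the file to a temp directory
pscp -batch -i C:\keys\domino.ppk D:\domino\temp\report.csv unixuser@unixhost:/data/incoming/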

Encrypted FTP Storage

I guess this is kind of a programming question, because I'm going to write a program if this doesn't exist.
So I found a very cheap web host (I don't really care about the actual web hosting). They will give me a domain name and an FTP server with a ton of storage space. Anyway, I want to back up a few hundred gigs of data (mostly family photos and scans of important documents). I also want to back up any future family photos/documents. I don't care if everything on my local NAS dies in a fire; I just want the photos and important documents backed up off-site.
So I want some program that lets me select folders locally and schedules them to be backed up to the FTP server. I'm a bit of a security nut, so I'd like the files to be encrypted locally before being transferred up to the server.
I know I can do this with TrueCrypt volumes, but I don't want to transfer an entire encrypted volume blob up to the server every time I change a file in it. I could use multiple TrueCrypt volumes, but that would be a pain to manage.
Also, this must be Mac/Linux compatible, although I'll primarily be on Linux.
I basically need rsync + TrueCrypt + cron + SFTP all rolled into one cryptographically secure program.
I've been searching for days with no luck. Any ideas?
Mozy Backup does this - it doesn't use FTP; it has a custom uploader.
P.S. Remember that a typical home ADSL connection only manages about 1Gb/day upstream.
A Linux option:
Out of the box, probably duplicity (for example, see http://www.howtoforge.com/creating-encrypted-ftp-backups-with-duplicity-and-ftplicity-on-debian-lenny).
Otherwise, if these are basically rarely-changed archive copies of files, I would roll my own: GnuPG (or dpad) for individual file encryption, a script that detects changed files, and FTP or rsync.
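A minimal sketch of the duplicity route over FTP (the key ID, credentials, host, and paths are placeholders; duplicity's FTP backend reads the password from the FTP_PASSWORD environment variable):
export FTP_PASSWORD='your-ftp-password'   # placeholder
duplicity --encrypt-key ABCD1234 /home/me/photos ftp://backupuser@ftp.example.com/photos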

Efficiently creating tar files

Note: I'm using Windows file servers and .NET
If I were to create a TAR file from files on a remote file server (meaning, the TAR file would be created on the remote file server, where the original files are), would the bytes need to come to my machine and then go back to the file server (since my machine is running the code that's generating the TAR), or would they stay on the file server? I'm asking about the best possible (theoretical) implementation.
Thank you!
The bytes need to be where they are processed.
If you process them on your remote system, they must be transferred.
If you process them on your server, they don't need to be transferred.
If your goal is to minimize bandwidth usage, your best bet would be to have a script on your server that will generate the tar files for you when triggered by your remote system.
The best possible implementation really depends on what your goals and constraints are.
The bytes would have to be read into your machine. The only way I know of to do the tarring entirely on the remote server is to have the remote server generate the TAR itself - for example, you could connect via SSH and run a shell command on the remote server.
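A rough sketch of that SSH route, assuming the file server runs an SSH service and has tar installed (host, user, and paths are placeholders); the archive is built where the files live, so their contents never cross the network:
ssh user@fileserver "tar -czf /data/archive.tar.gz -C /data/files ."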
Unfortunately, in the scenario described, the TAR operation will use network bandwidth. You need to run the tar program on the file server to avoid using bandwidth.

Automated FTP to Windows servers

I need to set up some sort of infrastructure to automatically FTP some files from one remote server to another. The FTP transaction will occur on a scheduled basis. Both these servers are Windows boxes and the location of the files that need to be FTP'ed will depend on the current date (the folder they sit in will be named the current day's date).
I would really hate to have to write something like that from scratch, so are there tools/utilities or the like out there?
Windows command line ftp will read input from a text file. For example:
ftp -ni < ftpscript.txt
Where these options are used:
-n - Suppresses auto-login upon initial connection.
-i - Turns off interactive prompting during multiple file transfers.
The ftpscript.txt would look something like this:
open ftp.mycompany.com
user someuser password
binary
get /somedir/myfile.dat
It should be reasonably straightforward to write a script that outputs an FTP script containing the correct file name.
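A rough sketch of such a wrapper as a Windows batch file; the server, credentials, and folder layout are placeholders, and the %DATE% substring offsets assume a US-style "Ddd MM/DD/YYYY" locale:
rem fetch_today.cmd - build today's FTP script, then run it
set TODAY=%DATE:~10,4%-%DATE:~4,2%-%DATE:~7,2%
(
  echo open ftp.mycompany.com
  echo user someuser password
  echo binary
  echo get /somedir/%TODAY%/myfile.dat
) > ftpscript.txt
ftp -ni < ftpscript.txt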
You would think that this would be an easy task. It isn't.
The best tool I've found is SyncBack and I think there is a free/lite version available.
If you're in a Windows environment, consider using PowerShell and the System.Net.WebClient .NET Framework class, as in this post. That is a download example, but the WebClient object also provides an UploadFile method.
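For the upload direction, a rough sketch in PowerShell (the URL, credentials, and local path are placeholders):
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("someuser", "password")
$client.UploadFile("ftp://ftp.mycompany.com/somedir/myfile.dat", "C:\data\myfile.dat")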
