I need to periodically transfer webserver-log-like files from Windows production servers in the US to Linux servers here in India. The files are ~4 MB each and I get about one file per minute. I can tolerate about a 5-minute lag between the files being written on Windows and their becoming available on the Linux machines. I am a bit confused between the various options here, as I am quite inexperienced in this kind of design:
I am thinking of writing a service in C#.NET which will periodically archive, compress, and send the files over to the Linux machines. The files are highly compressible: WinRAR turns 32 MB of them into a 1.2 MB archive, so compression should take care of the network transfer speed issue. But how exactly do I transfer the files to Linux? I could mount a Linux drive on the Windows server using Samba, set up an FTP server, or send each file serialized as a POST request. Which would be best? I also have to minimize the load on the Windows server.
Alternatively, mount the Windows drive on Linux instead. I could use the mount command or Samba here (what are the pros and cons of the two?). I could then do the compressing and copying on the Linux side itself.
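For illustration, the mount-command variant would look roughly like this (the share name, mount point, and credentials file are all made up):

    # Mount the Windows log share read-only over CIFS (the protocol Samba speaks)
    sudo mkdir -p /mnt/winlogs
    sudo mount -t cifs //winserver/logs /mnt/winlogs \
        -o ro,credentials=/root/.smbcred

    # /root/.smbcred holds two lines:
    #   username=loguser
    #   password=secret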
I don't trust the internet connection to be very stable, so there should be a good retry mechanism and failure protection too. What are the potential gotchas in these situations, and what other points should I be worried about?
Thanks,
Hari
RAR is bad. Stick to 7-Zip or bzip2. Transfer the archives over SSH, probably with rsync, since it tolerates link failures well.
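As a rough sketch, run wherever the raw files are visible locally (7z here is p7zip's command-line tool; all paths and the hostname are placeholders):

    #!/bin/bash
    # Compress a batch of logs with 7-Zip's CLI, then push the archive
    # over SSH with rsync, retrying until the transfer succeeds.
    SRC=/var/spool/winlogs
    ARCHIVE=/tmp/logs-$(date +%Y%m%d%H%M).7z

    7z a "$ARCHIVE" "$SRC"/*.log

    # --partial keeps interrupted transfers so a retry can resume them
    until rsync --partial --timeout=60 -e ssh "$ARCHIVE" hari@linuxhost:/data/incoming/
    do
        echo "transfer failed, retrying in 30s" >&2
        sleep 30
    done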
WinSCP can transfer files from Windows to Linux in batch with a script. You can then configure the Windows Task Scheduler to run the script periodically.
I learned the steps from this post: https://techglimpse.com/batch-script-automate-file-transfer-winscp/
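For reference, the WinSCP script ends up looking something like this (host, credentials, host key, and paths are placeholders; check WinSCP's scripting documentation for the exact host-key syntax):

    # upload.txt - run with: winscp.com /script=C:\scripts\upload.txt
    open sftp://loguser:secret@linuxhost/ -hostkey="ssh-rsa 2048 xx:xx:..."
    put C:\logs\*.log /data/incoming/
    exit

A Task Scheduler entry then runs it on whatever interval you need, e.g. schtasks /create /sc minute /mo 5 /tn UploadLogs /tr "winscp.com /script=C:\scripts\upload.txt".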
Related
I need to pull log files from a Windows server. The log files appear every minute, and I am trying to FTP to the server, pull each file back to the Linux box, and process it. I have been looking around and found inotify, but I am not sure how to use it in a bash script after FTPing to the Windows server. I am open to other implementations; it does not have to be ftp/inotify, but I am not sure how else this could be done.
Any ideas?
Two products come to mind, depending on how you plan to approach the solution.
I personally use Splunk on a variety of platforms (Windows and Linux servers; local Linux and OS X dev environments). It is a real-time log aggregator that features an API and the ability to query. Even if it doesn't solve your problem directly, the free version has some very robust features you should consider: http://www.splunk.com
The second approach would be to synchronize your web directories with something like rsync. I've used rsync on Linux boxes and have always appreciated what it can do, and I see it now has a Windows port: https://www.itefix.no/i2/cwrsync
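A minimal rsync invocation for that kind of pull might look like this (the host, paths, and the decision to delete source files are all illustrative):

    # Pull the log directory; -a preserves attributes, -z compresses on
    # the wire, --remove-source-files deletes each file from the sender
    # once it has been transferred, so nothing is pulled twice.
    rsync -az --remove-source-files user@winhost:/logs/ /var/log/pulled/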
Recently we bought a dedicated server running Windows. I was really surprised that Windows allows me to copy files (Ctrl+C) from my local system and paste them (Ctrl+V) on the dedicated server, and vice versa.
Before this we tried a Linux dedicated server; to transfer a file from the local system (Windows) to the dedicated server (Linux), I needed to use FTP. On the Windows server none of that is needed.
I am curious to know how this works internally. I have searched a lot but haven't found clear material, so if you know of any good material, please suggest it.
Thanks.
I'm not sure precisely what your question is. Are you using Remote Desktop to log into your Windows server, and then copying and pasting files between your local system and the remote server? If so, the copying and pasting is happening within your Remote Desktop client, which redirects the clipboard between the two machines; it's not really related to your server's operating system, per se.
We need to distribute lots of small JPG files to offline systems. Right now we send them as a 7z (or plain zip) archive, which is 800 MB (230K files), and use 7-Zip to unzip it. It takes about an hour to unzip on fairly fast four-core machines.
Is there a way on Windows 7 (or Windows Server 2008) to create and unpack a package of files of this size in a more reasonable time frame?
(I will entertain even far-out answers, such as putting it all in a single CloudDB database as binary blobs and shipping that to the target machine, or creating a VM or a virtual disk image; but I will need some pointers on doing that sort of thing.)
So then here's your far-out answer: ;)
The problem probably doesn't lie in computing power; the filesystem and/or hard disk are most likely the bottleneck.
On Win7 (and AFAIK Server 2008 R2 as well) you could use a Virtual Hard Disk (VHD) instead of zipping the files. Win7 has native support for VHD files and can mount their contents as a drive or subfolder via Disk Management, so there would be no need to unzip the files at all.
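A sketch of building such a VHD with diskpart on the packaging machine (file name, size, and drive letter are arbitrary; this assumes the native VHD support mentioned above):

    rem makevhd.txt - run with: diskpart /s makevhd.txt
    create vdisk file="C:\dist\images.vhd" maximum=4096 type=expandable
    select vdisk file="C:\dist\images.vhd"
    attach vdisk
    create partition primary
    format fs=ntfs quick
    assign letter=V

Copy the JPG tree to V:\, then detach the vdisk. On the receiving machine, "attach vdisk readonly" (or Disk Management's Attach VHD) makes the files immediately browsable with no unpacking step at all.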
I had the same problem, and solved it. The issue is likely the Windows Attachment Service, which subjects downloaded or attached zip files to additional scrutiny for security reasons.
To bypass this:
Right-click the file
Choose Properties
Check Unblock
For more info, see: Why is WinZip slow?
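If you need to unblock from a script rather than the Properties dialog, Sysinternals' streams.exe can delete the Zone.Identifier alternate data stream that triggers this extra scrutiny (path is a placeholder):

    rem -d deletes alternate data streams, including Zone.Identifier
    streams.exe -d C:\dist\images.zip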
I spoke to some colleagues, and they may have an easier solution. Since the size is under 4 GB and I only need read-only access, I can create an ISO image and then mount it on Win7 or Win2008 Server using this Microsoft utility:
This utility enables users of Windows XP, Windows Vista, and Windows 7 to mount ISO disk image files as virtual CD-ROM drives.
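Creating the ISO itself can be scripted, e.g. with oscdimg from the Windows Automated Installation Kit (paths are invented; -n allows long file names and -m ignores the image size limit, but check the tool's help):

    rem Pack the jpg tree into a read-only ISO image
    oscdimg -n -m C:\dist\jpgs C:\dist\jpgs.iso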
I'm building a (for now pretty minimal) network sync system for some of our users, involving a Samba server on one end and an rsync cron job which is "installed" on OS X or Linux clients by running a simple bash script linked from our intranet.
I need to do the same thing for Windows clients. I know there are several rsync implementations on Windows (I used cwRsync ages ago), but are there any (off the top of your head) to which I can silently pass a config during install? As it is, I guess I'm going to have to write a crappy old batch file to interface with the Windows Task Scheduler, but I'd at least like clients installing this not to have to input any more than their username and password.
Thanks!
I've had success with
RichCopy
RoboCopy
Cygwin rsync.exe
All using scheduled tasks.
RichCopy (and maybe Robocopy) has options to save config files from the GUI. All worked well for me from a batch file.
All three have restartable/incremental modes, and most are aware of NTFS-specific features (see the sketch after this list); think:
NTFS encryption
NTFS compression
permissions (ACLs)
alternate NTFS streams
junctions/reparse points
hardlinks/symlinks
etc.
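As a sketch, a robocopy run that exercises several of those features, wired to a scheduled task (the share, directories, and task name are made up):

    rem /MIR mirrors the tree, /Z is restartable mode, /COPYALL copies
    rem data, attributes, timestamps, ACLs, owner, and auditing info;
    rem /R and /W control retries on flaky links.
    robocopy \\server\share C:\syncdir /MIR /Z /COPYALL /R:5 /W:30 /LOG+:C:\sync.log

    rem run the batch file containing the line above every 15 minutes
    schtasks /create /sc minute /mo 15 /tn ShareSync /tr "C:\scripts\sync.bat"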
I'm not familiar with Linux and Debian systems; I work mostly with Windows computers. But one of my clients uses a Debian Linux web server, and I need to upgrade the server's RAID array.
Before I do anything with the server, I need a full system backup. I have searched the internet for a solution, including this site, but I haven't found an acceptable answer.
I would need something like an LVM snapshot, but I don't want to convert everything to LVM partitions just for a backup. I found that dd can make a bit-by-bit copy of the hard drive, but I would have to unmount the drive for that, and I don't want too much downtime; the RAID reconfiguration will be enough downtime on its own. I also found suggestions to tar the necessary files and send them through SSH, but that isn't a full system backup (I already back up the files and settings every month).
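For reference, the tar-over-SSH variant I mean would be roughly this (the remote host and the exclusion list are illustrative, and it is not a true point-in-time snapshot, since files can change mid-read):

    # Stream a compressed tar of the live root filesystem to another
    # machine, skipping pseudo-filesystems that should not be archived.
    tar -cpzf - --one-file-system \
        --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp / \
      | ssh backup@remotehost 'cat > debian-root.tar.gz'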
I need a solution that produces an easily restorable image of the server for emergencies. If the RAID reconfiguration fails, I will need an SOS restoration of the full system back to the old configuration.
You should use rsync.
It's not a snapshot, but if you don't want to use LVM, it's a start.
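A common pattern for a full-system copy with rsync, assuming a reachable backup host (the destination and exclusion list are placeholders to adapt):

    # -a preserves permissions/ownership/times, -A ACLs, -X extended
    # attributes, -H hard links; pseudo-filesystems are excluded.
    rsync -aAXH --numeric-ids \
        --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run --exclude=/tmp \
        / backup@remotehost:/backups/debian-root/

Note that this copies files rather than producing a bootable image: restoring means installing a minimal system (or booting a live CD), rsyncing the tree back, and reinstalling the bootloader.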