I would like to copy a site that is currently hosted live on an Easyspace web domain to another domain hosted by the web company Parallels. I wondered if there was a way of doing this without first taking the site down, copying the files back to the local machine, and then re-uploading them again.
If anyone has any advice on this, I would be most grateful!
Regards,
Robert Y
First check whether the FXP protocol (a protocol that supports file transfer directly between two FTP servers) is supported by both servers; it is usually disabled for security reasons. If it is active (ask your host admin), you can use FXP software such as FlashFXP, SmartFTP, FTP Rush (freeware), or similar.
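To illustrate what that FXP software is doing under the hood, here is a minimal sketch using Python's ftplib. The hostnames, credentials, and file name are placeholders, and both servers must actually permit FXP for this to work:

    # FXP sketch: the file moves directly between the two servers; only
    # FTP control commands pass through the machine running this script.
    import re
    from ftplib import FTP

    src = FTP('ftp.source-host.example')       # server that has the files
    src.login('srcuser', 'srcpassword')
    dst = FTP('ftp.destination-host.example')  # server that receives them
    dst.login('dstuser', 'dstpassword')

    filename = 'site.zip'

    # Binary mode on both control connections.
    src.voidcmd('TYPE I')
    dst.voidcmd('TYPE I')

    # Ask the destination to listen (passive), then tell the source to
    # connect straight to it using the address the PASV reply contains.
    pasv = dst.sendcmd('PASV')  # e.g. "227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)"
    src.sendcmd('PORT ' + re.search(r'\((.+?)\)', pasv).group(1))

    # Start the transfer on both sides and wait for the final replies.
    dst.putcmd('STOR ' + filename)
    src.putcmd('RETR ' + filename)
    for conn in (src, dst):
        while not conn.getresp().startswith('2'):
            pass  # skip the preliminary 1xx "transfer starting" replies

    src.quit()
    dst.quit()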
SUPER FAST:
For direct server-to-server file transfers: if you know each server's FTP URL or IP address, you can use FlashFXP (not related to Adobe Flash).
Make a zip file that contains all the live files on your source server. You can do this via the command line (a short scripted example follows these steps) or, more easily, through your server's cPanel. Compression time depends on your server's CPU speed and the number and size of the files.
Download a copy of FlashFXP; it works well and has a free trial:
http://www.flashfxp.com/download
Log in to your source server and navigate to the directory containing the files you want to copy.
Log in to your destination server and navigate to the directory where the transferred files should be stored.
Once both connections are open and the directories are visually lined up, just drag and drop the desired files into the desired directory on the opposing server.
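As promised, here is the scripted alternative for the zip step above: a couple of lines of Python run on the server will do it. The path is a placeholder for wherever your site actually lives:

    # Bundle the live site into one zip for a faster, single-file transfer.
    import shutil

    # Creates site_backup.zip in the current directory from the given folder.
    shutil.make_archive('site_backup', 'zip', root_dir='/home/youruser/public_html')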
Overall transfer speed depends on internet backbone traffic, each server's connection and CPU speed, and any throttling between your two servers. GOOD NEWS: the speed is NOT reliant on your cable modem's connection speed. No slow download and then an even slower upload, just a clear straight shot to the other server.
This FlashFXP Server to Server video tutorial is nice --> https://www.youtube.com/watch?v=6XXQgeRWWRw
Related
I've been using a .NET DLL, Renci SSH.NET, which transfers files from the local machine to a secure FTP (SFTP) server. It was working fine for small files, but it has a problem when uploading large files through my application. I increased the buffer size and the operation timeout, but it still doesn't upload. When I debugged my code, it got stuck at the point where I upload, and it does not even throw an error.
Any Suggestions Please?
Thanks
From your description, it's not clear whether the issue is caused by your code or by the SFTP server, so I would like to suggest:
Use an SFTP client to upload the same file to the SFTP server and see if that works.
If it works, use a network monitoring tool such as Wireshark or tcpdump to check whether the issue is caused by an unstable network.
Also, with the network monitoring tool, you can check whether the upload stops at the same point every time.
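A quick test with a different SFTP client library can also help rule your own code in or out, and a progress callback shows exactly how far the upload gets before it stalls. Here is a minimal sketch using Python's paramiko; the host, credentials, and paths are placeholders:

    # Upload a large file and print progress, to see where (or if) it stalls.
    import paramiko

    transport = paramiko.Transport(('sftp.example.com', 22))
    transport.connect(username='user', password='password')
    sftp = paramiko.SFTPClient.from_transport(transport)

    def progress(bytes_sent, bytes_total):
        # Called periodically by paramiko during the transfer.
        print(f'{bytes_sent}/{bytes_total} bytes sent')

    sftp.put('bigfile.bin', '/upload/bigfile.bin', callback=progress)

    sftp.close()
    transport.close()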
Hopefully that helps.
I am trying to figure out a way of connecting two web servers' file systems together so they can access each other's files natively. The servers run Windows Server 2012 and are connected directly to the internet via public IPs. One server will store large files; the other will host the scripts and database (the web server).
Essentially, I need a way to securely map a drive/folder between these servers so it shows up as a folder, e.g. the "d:\www\assets" folder is actually on the other server (a junction). As this link needs to be accessible to the SYSTEM account (the Apache service), not a single user, a mapped drive is not ideal. Open, unsecured shares are also not a good idea.
Playing with junctions and links in the console doesn't show any method to provide a login/password to the remote system. The junction is created but inaccessible. If I map a drive, it is only for that user and not available to the SYSTEM account that Apache is using. If I run Apache as a user and map a drive as that user, it likely won't survive a reboot or work without being logged in on the console all the time.
Are there any native ways to hook these 2 servers together securely? I have full admin access on both servers and can create as many users as required, but they are not in a domain or potentially even on the same subnet.
You may be able to use a directory symlink over a CIFS/Windows share, given that you have access to the local disk on one of the servers and you are sharing the folder you want to symlink to.
For example, on server A:
1. Navigate to server A's local disk: d:\www\
2. mklink /D assets \\serverb\assets
Option 2: DFS (unconfirmed)
If you can create a DFS namespace on one of the Windows boxes, I believe you can set the DFS target so that the folder assets points to \\serverb\assets.
I have an Excel workbook with macros that runs on Mac OS.
One of the macros uses Workbooks.Open to open a file in a network folder (a SharePoint site).
It works fine if there is a connection to the network.
It also works fine the first time the Mac user profile runs the macro, whether the network connection is on or off: if there is no connection, the Mac returns an error, as it should on a first run.
However, the problem is that once the user has successfully used the file (macro) with the connection on, Workbooks.Open does not return any error when the macro is run a second time with the network connection down.
I added an Exit Sub command right after the Open method and could see that the same network file was open in Excel with a status of 'Offline file'. It was the same file that had been opened when the macro was first used with the network available. However, the file was not fetched from the network drive, as this time there was no connection to the network.
I tried to find the file in the Mac file system, without success.
What creates the offline copy and where is it stored?
How can I delete the offline file via VBA code, or prevent Excel (or the Mac) from ever creating it again?
Br,
MikkoT
You should disable oplocks in the SMB protocol.
Oplocks are opportunistic locks, a client-side performance enhancement that requires cooperation between a Windows client and the SMB service. If the SMB service supports oplocks, the client can request to cache a file locally, in order to perform read and write operations on the cached file rather than directly on the server. This saves network bandwidth and increases performance for the SMB client. If another SMB client requests access to the file, the SMB service notifies the holder of the oplock, and that client should write changes from its cache back to the SMB service. The SMB service does not let another client have access to the file until the first client has finished writing.
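How you disable them depends on what is actually serving the share. If, for example, the share happens to be served by Samba, oplocks can be switched off per share in smb.conf; the share name below is a placeholder:

    [macshare]
        oplocks = no
        level2 oplocks = no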
We are building a system on Windows where we centrally (on a server) need to fopen either local files or remote SMB resources. The idea is, in the case of remote resources, to authenticate before doing the fopen (with UNC paths).
We need to authenticate with the credentials the user (client application) supplied for this resource on that remote share. We don't want to copy any resources.
Using the WNet API this works smoothly, since Windows stores the given credentials so that subsequent fopens in the same or in different processes succeed.
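Roughly, the pattern looks like this; the sketch below uses Python/ctypes purely for illustration (our real code is native), and the server, share, and credentials are placeholders:

    # Authenticate a UNC path with explicit credentials via the WNet API;
    # afterwards a plain fopen()/open() on that path succeeds for this
    # logon session, even from other processes.
    import ctypes
    from ctypes import wintypes

    class NETRESOURCE(ctypes.Structure):
        _fields_ = [
            ('dwScope',       wintypes.DWORD),
            ('dwType',        wintypes.DWORD),
            ('dwDisplayType', wintypes.DWORD),
            ('dwUsage',       wintypes.DWORD),
            ('lpLocalName',   wintypes.LPWSTR),
            ('lpRemoteName',  wintypes.LPWSTR),
            ('lpComment',     wintypes.LPWSTR),
            ('lpProvider',    wintypes.LPWSTR),
        ]

    RESOURCETYPE_DISK = 1

    res = NETRESOURCE()
    res.dwType = RESOURCETYPE_DISK
    res.lpLocalName = None                   # no drive letter needed
    res.lpRemoteName = r'\\fileserver\share'

    err = ctypes.windll.mpr.WNetAddConnection2W(
        ctypes.byref(res), 'password', r'DOMAIN\user', 0)
    if err != 0:
        raise OSError(f'WNetAddConnection2 failed: error {err}')

    with open(r'\\fileserver\share\some\file.txt', 'rb') as f:
        data = f.read()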
But there is a problem:
Many of you probably know the following message from Windows when trying to connect to an SMB share with different credentials than the ones used for a previous connection:
"Multiple connections to a server or shared resource by the same user, using more than one user name, are not allowed. Disconnect all previous connections to the server or shared resource and try again."
See http://support.microsoft.com/kb/938120 for the defined limitation and possible "work arounds".
Since we have a central server application running as a service (under the 'Local System' account), we hit this limitation as soon as there are two different users :).
Closing the previously established connection to allow for the 2nd one is not an option (ongoing processing).
On the one hand it's great that Windows caches authentication information; on the other hand it's too limited.
Modifying the hosts file for each user does not look very nice.
Using SMB client libraries (like libsmb++ or Impacket) doesn't seem to be the solution, since we need authentication that works across processes.
Configuring a "master" smb share user is also not wanted.
Maybe passing Windows user auth tokens around is a way?
This problem is general in nature (i.e. independent of language), and I'm convinced that there are people out there who have solved it (in a more or less elegant way ;))
I hope my explanation is understandable.
Thanks in advance for any hint.
felix
I have a Microsoft Windows Server 2003 machine and I can connect to it with Remote Desktop Connection.
The internet connection speed on the server is good. Where I live, some well-known websites are blocked, like YouTube and all file-hosting websites.
I want to download my favorite files from these websites, but the download speed is very low (even after using a VPN connection), so I decided to download my files to the server first and then download them via FTP to my PC.
My problem is that I always have to connect to the server remotely. Is there any way, or any software, that can do this for me? I do not want to connect remotely!
You have no other option but to connect to the remote desktop (or VNC, etc.).
Or you can set up a very simple proxy on the Windows server and point the browser on your desk to that proxy. Traffic will then go to your server first and from there to your PC. Granted, this does not solve the speed problem over the VPN, but you will no longer be blocked from the restricted sites. If you know the exact list of files (URLs) to download, there is plenty of software that can do the download work for you (the simplest is DownThemAll in Firefox).
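If you go the download-on-the-server route, a small script scheduled on the server (e.g. via Task Scheduler) can fetch a list of URLs into a folder your FTP server exposes, so you never have to stay logged in. A minimal sketch; the paths and the URL-list file are placeholder assumptions:

    # Read URLs (one per line) and download each into the FTP-served folder.
    import os
    import urllib.request

    URL_LIST = r'C:\downloads\urls.txt'
    DEST_DIR = r'C:\downloads\files'

    os.makedirs(DEST_DIR, exist_ok=True)
    with open(URL_LIST) as f:
        for url in (line.strip() for line in f):
            if not url:
                continue
            name = url.rstrip('/').rsplit('/', 1)[-1] or 'index.html'
            print('fetching', url)
            urllib.request.urlretrieve(url, os.path.join(DEST_DIR, name))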
Or you can setup some very simple proxy on Windows server and point your browser(on your desk) to this proxy.Then traffic will go to your server first and then to your PC.Okay, this does not solve speed throught VPN, but you will not be blocked to restricted sites.If you know exact list of files (URLs) to download, there are many software which can do download work for you (simplest is DownThenAll in FF)