Windows file copy using HTTP

Is there a Windows command to copy or download files from an HTTP URL to the filesystem? I've tried copy, xcopy and robocopy, and they don't seem to support HTTP URLs.

You can use a PowerShell function to accomplish this. Define the Get-Web function below (for example by dot-sourcing the script file you save it in) and call it like:
Get-Web http://www.msn.com/ -toFile www.msn.com.html
function Get-Web($url,
    [switch]$self,
    $credential,
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #   Downloads a file from the web
    #.Description
    #   Uses System.Net.WebClient (not the browser) to download data
    #   from the web.
    #.Parameter self
    #   Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #   The credentials to use to download the web data
    #.Parameter url
    #   The page to download (e.g. www.msn.com)
    #.Parameter toFile
    #   The file to save the web data to
    #.Parameter bytes
    #   Download the data as bytes
    #.Example
    #   # Downloads www.live.com and outputs it as a string
    #   Get-Web http://www.live.com/
    #.Example
    #   # Downloads www.msn.com and saves it to a file
    #   Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        # WebClient exposes a Credentials (plural) property
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            $webClient.DownloadData($url)
        } else {
            $webClient.DownloadString($url)
        }
    }
}
Source: http://blogs.msdn.com/mediaandmicrocode/archive/2008/12/01/microcode-powershell-scripting-tricks-scripting-the-web-part-1-get-web.aspx

I am not aware of any built-in Windows command that can do that, but I always download GNU Wget on Windows for these and similar purposes.
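For example, once wget.exe is on your PATH, a single command does the job (the URL and output file name below are just placeholders):
wget -O homepage.html http://www.example.com/
The -O option writes the downloaded page to the given file instead of deriving the name from the URL.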

cURL comes to mind.
curl -o homepage.html http://www.apptranslator.com/
This command downloads the page and stores it in the file homepage.html.
Thousands of options are available.

Use the BITSAdmin tool (bitsadmin is a command-line utility on Windows).
Example:
bitsadmin /transfer "Download_Job" /download /priority high "http://www.sourceWebSite.com/file.zip" "C:\destination\file.zip"
where Download_Job is any relevant job name you want.

I can't remember any command-line utility for this.
Maybe you can implement something similar using JavaScript (with WinHttpRequest) and run it like this:
wscript your_script.js
Or just install MSYS with wget.

Just use the Win32 API (1 line of code in C...)

Related

Update files on FTP server folder hierarchy with local files from a single folder

I have a little-big problem. I need to copy/overwrite JPG files from my local FOLDER to server FOLDERS.
Is there a way to search for and match the JPG files on the SERVER with my LOCAL files and overwrite them in the server folders? I do it manually and it takes a lot of time.
There are 50,000 JPGs on the server and I need to overwrite 20,000 of them in a short time.
Many thanks for any answers!
There's no magic way to do your very specific task. You have to script it.
If you are on Windows, it's rather trivial to write a PowerShell script for this, using the WinSCP .NET assembly and its Session.EnumerateRemoteFiles method:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "username"
    Password = "password"
}

$remotePath = "/remote/path"
$localPath = "C:\local\Path"

# Connect
Write-Host "Connecting..."
$session = New-Object WinSCP.Session
$session.SessionLogPath = "upload.log"
$session.Open($sessionOptions)

# Enumerate remote files
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, "*.*", [WinSCP.EnumerationOptions]::AllDirectories)

# And look for a matching local file for each of them
foreach ($fileInfo in $fileInfos)
{
    $localFilePath = (Join-Path $localPath $fileInfo.Name)
    if (Test-Path $localFilePath)
    {
        Write-Host ("Found local file $localFilePath matching remote file " +
            "$($fileInfo.FullName), overwriting...")
        # Comment out this line with # for a dry run
        $session.PutFiles($localFilePath, $fileInfo.FullName).Check()
    }
    else
    {
        Write-Host ("Found no local file matching remote file " +
            "$($fileInfo.FullName), skipping...")
    }
}

Write-Host "Done"
Save the script to a file (SortOutFiles.ps1), extract the contents of the WinSCP .NET assembly package along with the script, and run it like:
C:\myscript>powershell -ExecutionPolicy Bypass -File SortOutFiles.ps1
Connecting...
Found local file C:\local\path\aaa.txt matching remote file /remote/path/1/aaa.txt, overwriting...
Found local file C:\local\path\bbb.txt matching remote file /remote/path/2/bbb.txt, overwriting...
Found local file C:\local\path\ccc.txt matching remote file /remote/path/ccc.txt, overwriting...
Done
You can first dry-run the script by commenting out the line with the $session.PutFiles call.
(I'm the author of WinSCP)
download "Filezilla"... Upload your local files (all 50000 images).. If a image is already there in server,, it will ask you options.. select 'overwrite' and use 'apply for all'...

I want to fetch the name of the latest updated folder at a particular path on an FTP server

Using this command I am able to get the latest updated folder in Unix
ls -t1 | head -1
But how can I get the same from an FTP server on Windows?
I want to get the name of the latest updated folder at a particular path on the FTP server. Could anyone please help?
There's no easy way to do this with Windows shell commands.
You can:
Use ftp.exe to execute ls /path c:\local\path\listing.txt to save a directory listing to a text file (a sketch follows after these steps).
Exit ftp.exe.
Parse the listing and find the latest files. Not an easy task for Windows shell commands.
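A rough sketch of the ftp.exe part: put the commands into a text file (the host name, credentials and paths below are placeholders) and run it with ftp -n -s:, where -n suppresses auto-login and -s: names the command file:
open ftp.example.com
user myusername mypassword
ls /path c:\local\path\listing.txt
bye
Run it as:
ftp -n -s:commands.txt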
It would be way easier with a PowerShell script.
You can use the FtpWebRequest class, though it does not have an easy way to retrieve a structured directory listing either. It offers only the ListDirectoryDetails and GetDateTimestamp methods (see the sketch below).
See Retrieving creation date of file (FTP).
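A minimal PowerShell sketch of the GetDateTimestamp approach (the host, credentials and path are placeholders, and many servers answer the underlying MDTM command only for files, not folders, so you may need to fall back to parsing ListDirectoryDetails output):
$request = [System.Net.FtpWebRequest]::Create("ftp://ftp.example.com/path/somefolder")
$request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
# FtpWebResponse.LastModified holds the parsed timestamp
$response = [System.Net.FtpWebResponse]$request.GetResponse()
Write-Host "Last modified: $($response.LastModified)"
$response.Close()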
Or use a 3rd-party library for the task.
For example, with the WinSCP .NET assembly you can do:
param (
    $sessionUrl = "ftp://user:mypassword@example.com/",
    $remotePath = "/path"
)

# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.ParseUrl($sessionUrl)

# Connect
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)

# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)

# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
}
else
{
    Write-Host "The latest file is $latest"
}
See full example Downloading the most recent file.
(I'm the author of WinSCP)

FTP: copy, check integrity and delete

I am looking for a way to connect to a remote server with ftp or lftp and carry out the following steps:
Copy files from FTP server to my local machine.
Check that the downloaded files are intact (e.g. via an MD5 checksum).
If the download was fine then delete the downloaded files from the FTP server.
This routine will be executed each day from my local machine. What would be the best option to do this? Is there a tool that abstracts all 3 steps?
I am running Linux on both client and server machines.
Update: Additionally, I also have a text file that contains the association between the files on the FTP server and their MD5 sums. They were computed on the FTP server side.
First, make sure your remote server supports checksum calculation at all. Many do not. I believe there's not even a standard FTP command to calculate the checksum of a remote file. There have been many proposals and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
So even if your server supports checksum calculation, you have to find a client that supports the same command.
Some of the commands that can be used to calculate checksum are: XSHA1, XSHA256, XSHA512, XMD5, MD5, XCRC and HASH.
You can test that with WinSCP, which supports all the commands mentioned above. Test its checksum calculation function or the checksum scripting command (a command-line sketch follows). If they work, enable logging and check what command and syntax WinSCP uses against your server.
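For instance, a quick test from the command line might look like this (a sketch only; the URL is a placeholder and the command fails if the server supports none of the checksum extensions):
winscp.com /command "open ftp://username:password@ftp.example.com/" "checksum sha-1 /remote/path/file.dat" "exit"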
Neither ftp (neither the Windows nor the *nix version) nor lftp supports checksum calculation, let alone automatic verification of a downloaded file.
I'm not even aware of any other client that can automatically verify a downloaded file.
You can definitely script it with the help of some feature-rich client.
I wrote this answer before the OP specified that they are on Linux. I'm keeping the Windows solution in case it helps someone else.
On Windows, you could script it in PowerShell using the WinSCP .NET assembly:
param (
    $sessionUrl = "ftp://username:password@example.com/",
    [Parameter(Mandatory)]
    $localPath,
    [Parameter(Mandatory)]
    $remotePath,
    [Switch]
    $pause = $False
)

try
{
    # Load WinSCP .NET assembly
    Add-Type -Path (Join-Path $PSScriptRoot "WinSCPnet.dll")

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.ParseUrl($sessionUrl)

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        Write-Host "Downloading $remotePath to $localPath..."
        $session.GetFiles($remotePath, $localPath).Check()

        # Calculate remote file checksum
        $buf = $session.CalculateFileChecksum("sha-1", $remotePath)
        $remoteChecksum = [BitConverter]::ToString($buf)
        Write-Host "Remote file checksum: $remoteChecksum"

        # Calculate local file checksum
        $sha1 = [System.Security.Cryptography.SHA1]::Create()
        $localStream = [System.IO.File]::OpenRead($localPath)
        $localChecksum = [BitConverter]::ToString($sha1.ComputeHash($localStream))
        Write-Host "Downloaded file checksum: $localChecksum"

        # Compare checksums
        if ($localChecksum -eq $remoteChecksum)
        {
            Write-Host "Match, deleting remote file"
            $session.RemoveFiles($remotePath).Check()
            $result = 0
        }
        else
        {
            Write-Host "Does NOT match"
            $result = 1
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    $result = 1
}

# Pause if -pause switch was used
if ($pause)
{
    Write-Host "Press any key to exit..."
    [System.Console]::ReadKey() | Out-Null
}

exit $result
You can run it like:
powershell -file checksum.ps1 -remotePath ./file.dat -localPath C:\path\file.dat
This is partially based on the WinSCP example for Verifying checksum of a remote file against a local file over SFTP/FTP protocol.
(I'm the author of WinSCP)
The question was later edited to say that the OP has a text file with the checksums. That makes it a completely different question: just download the file, calculate the local checksum and compare it to the checksum you have in the text file. If they match, delete the remote file. A sketch of that local comparison follows.
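A minimal PowerShell sketch of that local comparison, assuming the list file has lines like "<md5> <filename>" (checksums.txt and file.dat are hypothetical names; Get-FileHash needs PowerShell 4.0 or newer, or pwsh on Linux):
# Find the expected MD5 for the downloaded file in the list
$fileName = "file.dat"
$line = Get-Content "checksums.txt" | Where-Object { $_ -match [regex]::Escape($fileName) } | Select-Object -First 1
$expected = ($line -split '\s+')[0]
# Calculate the MD5 of the downloaded copy
$actual = (Get-FileHash -Algorithm MD5 -Path $fileName).Hash
# String -eq is case-insensitive in PowerShell, so hex case does not matter
if ($expected -and ($actual -eq $expected)) {
    Write-Host "Checksums match, safe to delete the remote file"
} else {
    Write-Host "Checksum mismatch, keeping the remote file"
}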
That's a long shot, but if the server supports PHP, you can exploit that.
Save the following as a PHP file (say, check.php) in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then visit the page check.php, and you should get the MD5 hash of your file.
Related questions:
Calculate file checksum in FTP server using Apache FtpClient
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691

Windows / Perl / Net::SSLeay / OpenSSL: What locations are CA certificates loaded from?

Here's a program that does an HTTPS request, with some code at the start that I'm going to explain below:
use 5.012;
use LWP::UserAgent;
use HTTP::Request::Common;
use Net::SSLeay;
BEGIN {
    return unless $^O eq 'MSWin32';  # only needed on Windows
    print STDERR "attempting to set HTTPS_CA_FILE to PEM file path\n";
    require Mozilla::CA;             # load module to determine PEM file path
    my $pemfile = do {
        my $path = $INC{ 'Mozilla/CA.pm' };
        $path =~ s#\.pm$#/cacert.pem#;
        $path;
    };
    if ( -f $pemfile ) {
        $ENV{HTTPS_CA_FILE} = $pemfile;
        print STDERR "HTTPS_CA_FILE set to $pemfile\n";
    }
    else {
        warn "PEM file $pemfile missing";
    }
} # ==========================================================================

$Net::SSLeay::trace = 2;

my $ua  = LWP::UserAgent->new;
my $req = GET 'https://client.billsafe.de/';
my $rsp = $ua->request( $req );

say $rsp->is_success ? 'success' : 'failure';
say $rsp->status_line;
say '=================';
say substr $rsp->decoded_content, 0, 200;
say '=================';

# possibly relevant module versions
for ( qw/Net::SSLeay Crypt::SSLeay LWP::Protocol::https Mozilla::CA/ ) {
    no strict 'refs';
    say $_, "\t", ${"${_}::VERSION"};
}
The code at the beginning sets the environment variable HTTPS_CA_FILE to the path of the PEM file cacert.pem from Mozilla::CA that gets loaded by default (I checked using procmon.exe; the file is fully read by default).
The reason for this apparently nonsensical setting is that we have some Windows machines (Windows Server 2008) where the SSL setup fails with "certificate verify failed" when the environment variable is not set. It is a mystery to us why this is so, and it works fine on other Windows machines with identical versions of Net::SSLeay, LWP::Protocol::https and Mozilla::CA.
Our module versions are:
Net::SSLeay 1.36
Crypt::SSLeay -/-
LWP::Protocol::https 6.02
Mozilla::CA 20110409
Now the question: Are there other places, apart from cacert.pem, that root certificates are loaded from in this constellation (Windows, Perl, Net::SSLeay)? If so, what are they? Where can I read up on it?
Update
The OpenSSL docs do not mention any certificate store other than a plain file and a plain directory:
SSL_CTX_set_cert_store(3)
SSL_CTX_load_verify_locations(3)
The Windows C API functions used to open the system certificate store are the following:
CertOpenStore
CertOpenSystemStore
I checked out the OpenSSL HEAD from CVS. The CertOpenStore function is indeed used in engines/e_capi.c. I haven't investigated further to find out what is used to access a store in the OpenSSL versions on the servers in question.
If you do a web search you'll see that a couple of people have wondered whether OpenSSL can access the Windows certificate store directly, or have proposed patching OpenSSL accordingly. There's also this recent issue on the TortoiseSVN list (Windows Certificate Store / OpenSSL CAPI). Some more research is needed to find out what's going on here.
Since LWP 6.00 you can pass an ssl_opts hashref to new, specifying the certificate files among other options:
my $ua = LWP::UserAgent->new(
    ssl_opts => {
        verify_hostname => 1,
        SSL_cert_file   => $ssl_cert_file,
        SSL_key_file    => $ssl_key_file,
    },
);

FTP: copy a file to another place on the same FTP server

I need to upload the same file to 2 different places on the same FTP server. Is there a way to copy the file on the FTP server to the other place, instead of uploading it again? Thanks.
There's no standard way to duplicate a remote file over the FTP protocol. Some FTP servers support proprietary or non-standard extensions for this, though.
Some FTP clients do support remote file duplication, either using those extensions or via a temporary local copy of the remote file.
For example, the WinSCP FTP client supports duplication using both drag&drop and a menu/keyboard command:
It supports the SITE CPFR/CPTO FTP extension (supported for example by the ProFTPD mod_copy module); see the raw command sketch after this list.
It falls back to automatic duplication via a local temporary copy if the above extension is not available.
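For reference, the raw FTP exchange used by the mod_copy extension looks roughly like this (the paths are placeholders):
SITE CPFR /source/path/file.zip
SITE CPTO /target/path/file.zip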
(I'm the author of WinSCP)
Another workaround is to open a second connection to the FTP server and make the server upload the file to itself by piping a passive-mode data connection to an active-mode data connection. This solution is shown in the answer by @SaadAchemlal. It is basically use of the FXP protocol, but for a single server. Though many FTP servers will reject this, as they won't allow a data connection to/from an address different from the client's.
Side note: people often confuse move with copy. If you actually want to move, that's a completely different question; moving a file on FTP is widely supported.
I don't think there's a way to copy files without downloading and re-uploading them; at least I found nothing like this in the List of FTP commands, and no client I have seen so far supports something like this.
Yes, the FTP protocol itself can support this in theory. FTP RFC 959 discusses this in section 5.2 (see the paragraph starting with "When data is to be transferred between two servers, A and B..."). However, I don't know of any client that offers this sort of dual-server control operation.
Note that this method would transfer the file from the FTP server to itself over its own network, which won't be as fast as a local file copy but would almost certainly be faster than downloading and then re-uploading the file.
I can copy files between remote folders on Linux-based systems.
In my particular case, I'm using the very common file manager PCManFM:
Menu "Go" --> "Connect to server"
FTP Login info, etc
Open new tab in PCManFM
Connect to same server
Copy from tab to tab...
It's a bit slow, so I guess it could be downloading and re-uploading the files behind the scenes, but it's done automatically and is very user-friendly.
The code below makes the FTP server upload the file to itself (using a loopback connection). It requires the FTP server to allow both passive and active connection modes.
If you want to understand the FTP commands used here, here is a list of them: List of FTP commands
function copyFile($filePath, $newFilePath)
{
    $ftp1 = ftp_connect('192.168.1.1');
    $ftp2 = ftp_connect('192.168.1.1');

    ftp_raw($ftp1, "USER ftpUsername");
    ftp_raw($ftp1, "PASS mypassword");
    ftp_raw($ftp2, "USER ftpUsername");
    ftp_raw($ftp2, "PASS mypassword");

    // Put the second connection into passive mode and parse the address/port from the reply
    $res = ftp_raw($ftp2, "PASV");
    $addressAndPort = substr($res[0], strpos($res[0], '(') + 1);
    $addressAndPort = substr($addressAndPort, 0, strpos($addressAndPort, ')'));

    ftp_raw($ftp1, "CWD ." . dirname($newFilePath));
    ftp_raw($ftp2, "CWD ." . dirname($filePath));

    // Point the first connection's active-mode data connection at the second connection's
    // passive port, then let the server send the file to itself: STOR on one, RETR on the other
    ftp_raw($ftp1, "PORT " . $addressAndPort);
    ftp_raw($ftp1, "STOR " . basename($newFilePath));
    ftp_raw($ftp2, "RETR " . basename($filePath));

    ftp_raw($ftp1, "QUIT");
    ftp_raw($ftp2, "QUIT");
}
I managed to do this by using WebDrive to mount the FTP site as a local folder, then "downloading" the files with FileZilla directly into that folder. It was a bit slower than a normal download, but you don't need to have the space on your HDD.
Here's another workaround, using PHP cURL to execute a copy request on the server, feeding it parameters from the local machine and reporting the outcome:
Local code:
In this simple test routine, I want to copy the leaning tower photo to the correct folder, Pisa:
$ch = curl_init();
$data = array ('pic' => 'leaningtower', 'folder' => 'Pisa');
curl_setopt($ch, CURLOPT_URL,"http://travelphotos.com/copypic.php");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
Server code (copypic.php):
On the remote server, I have simple error checking. On this server I had to mess with the path designation, i.e., I had to use "./" for an acceptable path reference, so you may have to tinker with it a bit.
$pic = $_POST["pic"];
$folder = $_POST["folder"];
if (!$pic || !$folder) exit();
$sourcePath = "./unsortedpics/".$pic.".jpg";
$destPath = "./sortedpics/".$folder."/".$pic.".jpg";
if (!file_exists($sourcePath )) exit("Source file not found");
if (!is_dir("./sortedpics/".$folder)) exit("Invalid destination folder");
if (!copy($sourcePath , $destPath)) exit("Copy not successful");
echo "File copied";
You can do this from cPanel:
Log into your cPanel.
Go into the File Manager.
Find the file or folder you want to duplicate.
Right-click and choose Copy.
Type in the new directory you want to copy it to.
Done!
You can rename the file to be copied, giving the full path of your desired destination as the new name.
For example:
If you want to move the file "file.txt" into the folder "NewFolder", you can write it as:
ftp> rename file.txt NewFolder/file.txt
This worked for me.
