Update files on FTP server folder hierarchy with local files from a single folder - ftp

I have a little-big problem. I need to copy/overwrite JPG files from my local FOLDER to server FOLDERS.
Is there a way to search and match the JPG files on the SERVER with my LOCAL files and overwrite them in the server folders? I do it manually and it takes a lot of time.
There are 50,000 JPGs on the server and I need to overwrite 20,000 of them in a short time.
Many thanks for any answers!

There's no magic way to do your very specific task. You have to script it.
If you are on Windows, it's rather trivial to write a PowerShell script for this, using the WinSCP .NET assembly and its Session.EnumerateRemoteFiles method:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "username"
    Password = "password"
}

$remotePath = "/remote/path"
$localPath = "C:\local\path"

# Connect
Write-Host "Connecting..."
$session = New-Object WinSCP.Session
$session.SessionLogPath = "upload.log"
$session.Open($sessionOptions)

# Enumerate remote files
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, "*.*", [WinSCP.EnumerationOptions]::AllDirectories)

# And look for a matching local file for each of them
foreach ($fileInfo in $fileInfos)
{
    $localFilePath = (Join-Path $localPath $fileInfo.Name)
    if (Test-Path $localFilePath)
    {
        Write-Host ("Found local file $localFilePath matching remote file " +
            "$($fileInfo.FullName), overwriting...")
        # Comment out this line with # for a dry run
        $session.PutFiles($localFilePath, $fileInfo.FullName).Check()
    }
    else
    {
        Write-Host ("Found no local file matching remote file " +
            "$($fileInfo.FullName), skipping...")
    }
}

Write-Host "Done"
Save the script to a file (SortOutFiles.ps1), extract the contents of the WinSCP .NET assembly package alongside the script, and run it like:
C:\myscript>powershell -ExecutionPolicy Bypass -File SortOutFiles.ps1
Connecting...
Found local file C:\local\path\aaa.txt matching remote file /remote/path/1/aaa.txt, overwriting...
Found local file C:\local\path\bbb.txt matching remote file /remote/path/2/bbb.txt, overwriting...
Found local file C:\local\path\ccc.txt matching remote file /remote/path/ccc.txt, overwriting...
Done
You can first dry-run the script by commenting out the line with the $session.PutFiles call.
(I'm the author of WinSCP)

download "Filezilla"... Upload your local files (all 50000 images).. If a image is already there in server,, it will ask you options.. select 'overwrite' and use 'apply for all'...

Related

PowerShell to move file type up one dir

I'm looking for a script that moves files of a certain type (*.msg) up one directory, while keeping them in their project ID folder.
The current file structure looks like:
Host Folder\C12345678\A\test.msg
Host Folder\C12345678\B\test.msg
Host Folder\C99999999\F\test.msg
Host Folder\C56351114\T\test.msg
Host Folder\C69365814\I\test.msg
I want to remove the letter folders, so the project ID folder directly hosts the *.msg files:
Host Folder\C12345678\test(A).msg
Host Folder\C12345678\test(B).msg
Host Folder\C99999999\test.msg
Host Folder\C56351114\test.msg
Host Folder\C69365814\test.msg
Currently, my code just moves the various *.msg files straight into the Host folder; I'm unable to keep them in their project ID folders.
$src = "..\Host Folder"
$files = Get-ChildItem $src -file -recurse
foreach ($file in $files) {
#Move-Item $file.PSPath $src
}
I mostly write batch files, so I'm not so good with PowerShell. Do I need to add $src\$files?
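A minimal sketch of one way to do this, assuming the letter folders sit directly under each project ID folder. Note that with this simple collision handling, the first test.msg keeps its plain name and only later duplicates get the letter suffix; matching the C12345678 example exactly would need an extra pre-scan for duplicate names:
$src = "..\Host Folder"
foreach ($file in Get-ChildItem $src -Filter *.msg -File -Recurse) {
    $letterDir  = $file.Directory          # e.g. ...\C12345678\A
    $projectDir = $letterDir.Parent        # e.g. ...\C12345678
    $target = Join-Path $projectDir.FullName $file.Name
    if (Test-Path $target) {
        # Name collision: append the letter folder's name, e.g. test(B).msg
        $newName = "{0}({1}){2}" -f $file.BaseName, $letterDir.Name, $file.Extension
        $target = Join-Path $projectDir.FullName $newName
    }
    Move-Item -LiteralPath $file.FullName -Destination $target
}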

How to get the Dropbox folder in Powershell in Windows

Same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, or here for OSX: How to get the location of the currently logged-in Dropbox folder
Same thing in PowerShell: I need the path of the Dropbox folder to copy files to it (I build software and then copy it to Dropbox to share with my team).
This Dropbox help page tells us where this info is stored, ie, in a json file in the AppData of the user: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
    $DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop |
        ConvertFrom-Json | % 'personal' | % 'path'
    return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
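If you do need the business account instead, the same info.json should carry a 'business' key of the same shape (an assumption based on the Dropbox help page above; check your own info.json):
# Assumption: a business account stores its path under a 'business' key shaped like 'personal';
# this returns nothing if no business account is linked
$DropboxBusinessPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop |
    ConvertFrom-Json | % 'business' | % 'path'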
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
--
An alternative way is to use the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = gc $env:appdata\Dropbox\host.db | select -index 1 # -index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from base64 to ascii

I want to fetch the name of the latest updated folder at a particular path on an FTP server

Using this command I am able to get the latest updated folder in Unix
ls -t1 | head -1
But how can I do the same on an FTP server from Windows?
I want to get the name of the latest updated folder at a particular path on the FTP server. Could anyone please help?
There's no easy way to do this with Windows shell commands.
You can:
- Use ftp.exe to execute ls /path c:\local\path\listing.txt to save a directory listing to a text file.
- Exit ftp.exe.
- Parse the listing and find the latest files. Not an easy task for Windows shell commands.
It would be way easier with a PowerShell script.
You can use the FtpWebRequest class, though it does not have an easy way to retrieve a structured directory listing either. It offers only the ListDirectoryDetails and GetDateTimestamp methods.
See Retrieving creation date of file (FTP).
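For a rough idea, timestamping a single remote item with FtpWebRequest could look like this (a sketch; the host, credentials, and path are placeholders). You would still have to call this in a loop over a directory listing to find the latest entry:
# Issue an MDTM request for one known remote path
$request = [System.Net.WebRequest]::Create("ftp://ftp.example.com/path/item")
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $request.GetResponse()
Write-Host "Last modified: $($response.LastModified)"
$response.Dispose()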
Or use a 3rd-party library for the task.
For example with WinSCP .NET assembly you can do:
param (
    $sessionUrl = "ftp://user:mypassword@example.com/",
    $remotePath = "/path"
)

# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.ParseUrl($sessionUrl)

# Connect
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)

# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)

# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
}
else
{
    Write-Host "The latest file is $latest"
}
See full example Downloading the most recent file.
(I'm the author of WinSCP)

FTP: copy, check integrity and delete

I am looking for a way to connect to a remote server with ftp or lftp and perform the following steps:
1. Copy files from the FTP server to my local machine.
2. Check that the downloaded files are fine (i.e. verify an md5 checksum).
3. If the download was fine, delete the downloaded files from the FTP server.
This routine will be executed each day from my local machine. What would be the best option to do this? Is there a tool that abstracts all three steps?
I am running Linux on both the client and server machines.
Update: Additionally, I also have a text file that contains the association between the files on the FTP server and their MD5 sums. They were computed on the FTP server side.
First, make sure your remote server supports checksum calculation at all. Many do not. I believe there is not even a standard FTP command to calculate the checksum of a remote file. There have been many proposals, and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
So even if your server supports checksum calculation, you have to find a client that supports the same command.
Some of the commands that can be used to calculate checksum are: XSHA1, XSHA256, XSHA512, XMD5, MD5, XCRC and HASH.
You can test that with WinSCP. WinSCP supports all the previously mentioned commands. Test its checksum calculation function or the checksum scripting command. If they work, enable logging and check what command and what syntax WinSCP uses against your server.
Neither ftp (in either its Windows or *nix version) nor lftp supports checksum calculation, let alone automatic verification of a downloaded file.
I'm not even aware of any other client that can automatically verify a downloaded file.
You can definitely script it with the help of some feature-rich client.
I wrote this answer before the OP specified that they are on Linux. I'm keeping the Windows solution in case it helps someone else.
On Windows, you could script it with PowerShell using WinSCP .NET assembly.
param (
    $sessionUrl = "ftp://username:password@example.com/",
    [Parameter(Mandatory)]
    $localPath,
    [Parameter(Mandatory)]
    $remotePath,
    [Switch]
    $pause = $False
)

try
{
    # Load WinSCP .NET assembly
    Add-Type -Path (Join-Path $PSScriptRoot "WinSCPnet.dll")

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.ParseUrl($sessionUrl)

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        Write-Host "Downloading $remotePath to $localPath..."
        $session.GetFiles($remotePath, $localPath).Check()

        # Calculate remote file checksum
        $buf = $session.CalculateFileChecksum("sha-1", $remotePath)
        $remoteChecksum = [BitConverter]::ToString($buf)
        Write-Host "Remote file checksum: $remoteChecksum"

        # Calculate local file checksum
        $sha1 = [System.Security.Cryptography.SHA1]::Create()
        $localStream = [System.IO.File]::OpenRead($localPath)
        $localChecksum = [BitConverter]::ToString($sha1.ComputeHash($localStream))
        Write-Host "Downloaded file checksum: $localChecksum"

        # Compare checksums
        if ($localChecksum -eq $remoteChecksum)
        {
            Write-Host "Match, deleting remote file"
            $session.RemoveFiles($remotePath).Check()
            $result = 0
        }
        else
        {
            Write-Host "Does NOT match"
            $result = 1
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    $result = 1
}

# Pause if -pause switch was used
if ($pause)
{
    Write-Host "Press any key to exit..."
    [System.Console]::ReadKey() | Out-Null
}

exit $result
You can run it like:
powershell -file checksum.ps1 -remotePath ./file.dat -localPath C:\path\file.dat
This is partially based on WinSCP example for Verifying checksum of a remote file against a local file over SFTP/FTP protocol.
(I'm the author of WinSCP)
The question was later edited to say that the OP has a text file with the checksums. That makes it a completely different question. Just download the file, calculate the local checksum, and compare it to the checksum in the text file. If they match, delete the remote file.
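A minimal sketch of that comparison, assuming the text file uses the usual md5sum format of "<hash> <filename>" per line (the file names and paths below are placeholders):
# Assumptions: the file is already downloaded and checksums.txt is the server-provided list
$localPath = "C:\path\file.dat"
$name = [System.IO.Path]::GetFileName($localPath)
$line = (Select-String -Path "checksums.txt" -Pattern ([regex]::Escape($name)) |
    Select-Object -First 1).Line
$expected = ($line -split '\s+')[0]
$actual = (Get-FileHash -Algorithm MD5 -Path $localPath).Hash
# PowerShell's -eq is case-insensitive for strings, so hash casing does not matter
if ($actual -eq $expected) {
    Write-Host "Match, safe to delete the remote file"
}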
That's a long shot, but if the server supports PHP, you can exploit that.
Save the following as a PHP file (say, check.php), in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then, visit the page check.php, and you should get the md5 hash of your file.
Related questions:
Calculate file checksum in FTP server using Apache FtpClient
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691

How to check a folder for the appearance of any new file in it?

Help please. I cannot find a solution. (Windows platform)
I need to:
1. Scan the folder.
2. Detect if any new file arrives.
3. Process the file.
Another method to detect "new files" is the archive attribute. Whenever a file is created or changed, this attribute is set by Windows.
Whenever you process a file, unset its archive attribute (attrib -a file.ext).
The advantage is that you don't depend on any timing.
To list "new" (or changed) files, use dir /aa (dir /a-a will list processed files).
For more info, see dir /? and attrib /?
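The same idea works from PowerShell, which may be easier to wire into a processing script (a sketch; the directory and the processing step are placeholders):
# Pick up files whose archive attribute is still set, i.e. new or changed since the last run
$newFiles = Get-ChildItem "C:\TargetDirectory" -File |
    Where-Object { $_.Attributes -band [IO.FileAttributes]::Archive }
foreach ($file in $newFiles) {
    # ... process $file.FullName here ...
    attrib -a $file.FullName    # clear the attribute so the file is not picked up again
}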
Without knowing exactly what you're trying to execute, this is all I can provide. You would theoretically run this as a scheduled task every hour:
foreach ($file in (Get-ChildItem "C:\TargetDirectory" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-1) })) {
    # Execute-Command -Target $file
}
You could use the FileSystemWatcher class to monitor the folder for new files.
It can easily be used from PowerShell as well:
$FSW = New-Object System.IO.FileSystemWatcher
Then use Register-ObjectEvent to "listen" for the events it raises, as in the sketch below.
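A bare-bones sketch of that wiring (the folder path is a placeholder):
$FSW = New-Object System.IO.FileSystemWatcher
$FSW.Path = "C:\TargetDirectory"
$FSW.Filter = "*.*"
$FSW.EnableRaisingEvents = $true
Register-ObjectEvent $FSW Created -SourceIdentifier FileCreated -Action {
    # Runs every time a new file shows up in the watched folder
    Write-Host "New file: $($Event.SourceEventArgs.FullPath)"
}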
FileSystemWatcher is a utility I have recently learned about and will definitely use in the future. The best part is that it relies on .NET eventing, so you don't need to build an external triggering structure.
Here is an example of how I am using this in a 24/7 production environment (the full script receives an XML file, processes it, and inserts the results into SQL in under 3 seconds).
Function Submit-ResultFile {
    # Actual file processing takes place here
}

Function Create-Fsw {
    Register-ObjectEvent $fsw Created -SourceIdentifier "Spectro FileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $file = $Event.SourceEventArgs.FullPath
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -ForegroundColor Green
        Submit-ResultFile -xmlfile $file
    }
}

# In the following line, you can change IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $watchFolder, $watchFilter -Property @{
    IncludeSubdirectories = $false
    NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}

# Sweep for files that existed before the watcher was registered
$xmlFiles = Get-ChildItem -Path $ResultsDirectory -Filter *.xml
foreach ($file in $xmlFiles)
{
    Submit-ResultFile -xmlfile $file.FullName
}

# Register a new File System Watcher
Create-Fsw
Several important points to be aware of:
- If files exist in that location before the FSW is created, they WILL NOT trigger an "objectevent", so in my script you'll observe that I begin by running a sweep for existing files.
- When the FSW does trigger, you want it to process only one file at a time, since the next file-creation event will generate a new "objectevent". Structuring an FSW to work on multiple files per trigger will eventually result in a crash.
