I want to fetch the name of the latest updated folder at a particular path on an FTP server - Windows

Using this command I am able to get the latest updated folder on Unix:
ls -t1 | head -1
But how can I do the same on an FTP server from Windows?
I want to get the name of the most recently updated folder at a particular path on the FTP server. Could anyone please help?

There's no easy way to do this with Windows shell commands.
You can:
Use ftp.exe to execute ls /path c:\local\path\listing.txt to save a directory listing to a text file (see the sketch below).
Exit ftp.exe.
Parse the listing and find the latest entry. This is not an easy task for Windows shell commands.
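For illustration, a minimal sketch of the ftp.exe step (the host, credentials, and file names are placeholders). Save the FTP commands to a file, e.g. commands.txt:
user myuser mypassword
ls /path c:\local\path\listing.txt
quit
Then run ftp.exe with auto-login suppressed (-n) so the user command from the script file is used:
ftp -n -s:commands.txt example.com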
It would be way easier with a PowerShell script.
You can use the FtpWebRequest class, though it does not have an easy way to retrieve a structured directory listing either. It offers only the ListDirectoryDetails and GetDateTimestamp methods.
See Retrieving creation date of file (FTP).
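For illustration, a rough sketch using FtpWebRequest (the host, credentials, and path are placeholders; note that GetDateTimestamp issues an MDTM command, which many servers support for files only, so it may not work for folders):
# List the names in the remote directory
$url = "ftp://example.com/path/"
$credentials = New-Object System.Net.NetworkCredential("user", "password")
$listRequest = [System.Net.WebRequest]::Create($url)
$listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$listRequest.Credentials = $credentials
$reader = New-Object System.IO.StreamReader($listRequest.GetResponse().GetResponseStream())
$latestName = $null
$latestTime = [DateTime]::MinValue
while (($name = $reader.ReadLine()) -ne $null)
{
    # Query each entry's timestamp with an MDTM request
    $tsRequest = [System.Net.WebRequest]::Create($url + $name)
    $tsRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
    $tsRequest.Credentials = $credentials
    try { $time = $tsRequest.GetResponse().LastModified } catch { continue }
    if ($time -gt $latestTime) { $latestTime = $time; $latestName = $name }
}
$reader.Close()
Write-Host "The latest entry is $latestName"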
Or use a 3rd-party library for the task.
For example with WinSCP .NET assembly you can do:
param (
$sessionUrl = "ftp://user:mypassword@example.com/",
$remotePath = "/path"
)
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.ParseUrl($sessionUrl)
# Connect
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
$directoryInfo.Files |
Where-Object { $_.IsDirectory -and ($_.Name -ne ".") -and ($_.Name -ne "..") } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
# Any file at all?
if ($latest -eq $Null)
{
Write-Host "No file found"
}
else
{
Write-Host "The latest file is $latest"
}
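If you save the script as, say, SelectLatest.ps1 (an illustrative name) next to WinSCPnet.dll, you can run it like:
powershell.exe -ExecutionPolicy Bypass -File SelectLatest.ps1 -sessionUrl "ftp://user:mypassword@example.com/" -remotePath "/path"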
See full example Downloading the most recent file.
(I'm the author of WinSCP)

Related

Powershell script: List files with specific change date (Amount if possible)

For licensing purposes I am trying to automate the counting process instead of having to log in to every single server, go into the directory, search for a file name, and count the results based on the change date.
What I'm aiming for:
Running a PowerShell script every month that checks the directory "C:\Users" for the file "Outlook.pst" recursively, then filters the results by change date (one month or newer), and finally packs this into an email to send to my inbox.
I'm not sure if that's possible, as I am fairly new to PowerShell. Would appreciate your help!
It is possible.
I don't know offhand how to start a PS session on a remote computer, but I think the cmdlet Enter-PSSession will do the trick. At least it was the first result when searching for "open remote powershell session". If that does not work, use Invoke-Command as suggested by lit to get $outlookFiles as shown below.
For the rest use this.
$outlookFiles = Get-ChildItem -Path "C:\Users" -Recurse | Where-Object { $_.Name -eq "Outlook.pst" }
Now you have all files with this name. If you are not familiar with the pipe in PowerShell: it passes every object found by Get-ChildItem to the next pipeline section, where Where-Object filters the received objects. If the current object ($_) passes the condition, it is returned by the whole command.
Now you can filter these objects again to include only the latest ones:
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = $outlookFiles | Where-Object { $_.LastWriteTime -gt $latestDate }
Now you have all the data you want in one object. You only have to format it how you like it, e.g. you could use $mailBody = $newFiles | Out-String and then use Send-MailMessage -To x@y.z -From r@g.b -Body $mailBody to send the mail.
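Put together, a consolidated sketch of the whole monthly job might look like this (the SMTP server and addresses are placeholders you would replace with your own):
# Find Outlook.pst files changed within the last month
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = Get-ChildItem -Path "C:\Users" -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -eq "Outlook.pst" -and $_.LastWriteTime -gt $latestDate }
# Format the result and mail it
$mailBody = $newFiles | Select-Object FullName, LastWriteTime | Out-String
Send-MailMessage -To "me@example.com" -From "report@example.com" `
    -Subject "Outlook.pst report ($($newFiles.Count) files)" -Body $mailBody `
    -SmtpServer "smtp.example.com"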

Update files on FTP server folder hierarchy with local files from a single folder

I have a little-big problem. I need to copy/overwrite JPG files from my local FOLDER to server FOLDERS.
Is there a way to search for and match JPG files on the SERVER with my LOCAL files and overwrite them in the server folders? I do it manually and it takes a lot of time.
There are 50,000 JPGs on the server and I need to overwrite 20,000 of them in a short time.
Many thanks for answers!!
There's no magic way to do your very specific task. You have to script it.
If you are on Windows, it's rather trivial to write a PowerShell script for this, using WinSCP .NET assembly and its Session.EnumerateRemoteFiles method:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "username"
    Password = "password"
}
$remotePath = "/remote/path";
$localPath = "C:\local\Path";
# Connect
Write-Host "Connecting..."
$session = New-Object WinSCP.Session
$session.SessionLogPath = "upload.log"
$session.Open($sessionOptions)
# Enumerate remote files
$fileInfos =
$session.EnumerateRemoteFiles(
$remotePath, "*.*", [WinSCP.EnumerationOptions]::AllDirectories)
# And look for a matching local file for each of them
foreach ($fileInfo in $fileInfos)
{
$localFilePath = (Join-Path $localPath $fileInfo.Name)
if (Test-Path $localFilePath)
{
Write-Host ("Found local file $localFilePath matching remote file " +
"$($fileInfo.FullName), overwriting..."
# Command-out this line with # for a dry-run
$session.PutFiles($localFilePath, $fileInfo.FullName).Check()
}
else
{
Write-Host ("Found no local file matching remote file " +
"$($fileInfo.FullName), skipping..."
}
}
Write-Host "Done"
Save the script to a file (SortOutFiles.ps1), extract the contents of the WinSCP .NET assembly package alongside the script, and run it like:
C:\myscript>powershell -ExecutionPolicy Bypass -File SortOutFiles.ps1
Connecting...
Found local file C:\local\path\aaa.txt matching remote file /remote/path/1/aaa.txt, overwriting...
Found local file C:\local\path\bbb.txt matching remote file /remote/path/2/bbb.txt, overwriting...
Found local file C:\local\path\ccc.txt matching remote file /remote/path/ccc.txt, overwriting...
Done
You can first dry-run the script by commenting out the line with $session.PutFiles call.
(I'm the author of WinSCP)
download "Filezilla"... Upload your local files (all 50000 images).. If a image is already there in server,, it will ask you options.. select 'overwrite' and use 'apply for all'...

How to get the Dropbox folder in Powershell in Windows

Same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, or here for OSX: How to get the location of currently logined Dropbox folder
Same thing in PowerShell. I need the path of Dropbox to copy files to it (building software and then copying it to Dropbox to share with the team).
This Dropbox help page tells us where this info is stored, ie, in a json file in the AppData of the user: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
    $DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json | % 'personal' | % 'path'
    return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
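If you need to handle both account types, a hedged variant (assuming info.json contains 'personal' and/or 'business' keys, each with a 'path' property) could fall back like this:
function GetDropBoxPath
{
    $info = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -Raw -ErrorAction Stop | ConvertFrom-Json
    if ($info.personal.path) { return $info.personal.path }
    if ($info.business.path) { return $info.business.path }
    throw "No Dropbox path found in info.json"
}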
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
--
Alternative way is using the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = gc $env:appdata\Dropbox\host.db | select -index 1 # -index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from base64 to ascii

Remote Powershell parsing

I'm trying to retrieve a parsed list of different pieces of information about remote executables within a Windows domain. Permissions are taken care of and the individual PowerShell commands are working; my issue is outputting this recursive list to a file (putting it all together properly):
My desired Output (per computer):
computer_name.csv # Filename
$application1Name.exe, $application1Version, $application1LastModifiedDateMMDDYY, $application1MD5HASH
$application2Name.exe, $application2Version, $application2LastModifiedDateMMDDYY, $application2MD5HASH
...
So far I have all the pieces:
#A way to recursively retrieve executables from a given remote path (Name + LastModified):
get-childitem \\192.168.X.X\C$\defaultPath\FoldersAndSubfoldersWithExecutables\ -Include *.exe -Recurse | ForEach-Object {$_.Name, $_.LastWriteTime} > C:\LOCALPATH\output.txt
#A way to retrieve the version info from remote executables (Version):
[System.Diagnostics.FileVersionInfo]::GetVersionInfo("\\192.168.X.X\C$\defaultPath\application1.exe").FileVersion
#A way to retrieve the MD5 Hash from remote executable files (MD5HASH):
get-FileHash \\192.168.X.X\C$\defaultPath\application1.exe -Algorithm MD5 | ForEach-Object { $_.Hash }
My issue is building the script structure to accommodate the desired output listed above. I have a list of IP addresses to loop this script through, but I'm having issues connecting the dots.
Thanks!
Each operation you listed can be executed within the ForEach-Object loop, and a resultant csv string containing all the necessary data points can be built using string interpolation.
Get-ChildItem \\192.168.x.x\C$\defaultPath\FoldersAndSubfoldersWithExes\ -Include *.exe -Recurse | ForEach-Object {
    $Name = $_.Name
    $LastWriteTime = $_.LastWriteTime
    $Version = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
    $Hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
    "$Name, $Version, $LastWriteTime, $Hash"
} | Out-File computerName.csv
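To extend this to your list of IP addresses, here is a sketch of the outer loop (computers.txt and the paths are placeholders; emitting [PSCustomObject]s through Export-Csv also gives you a proper CSV header row):
$ipList = Get-Content "C:\LOCALPATH\computers.txt"  # one IP or hostname per line
foreach ($ip in $ipList)
{
    Get-ChildItem "\\$ip\C$\defaultPath\FoldersAndSubfoldersWithExes\" -Include *.exe -Recurse |
        ForEach-Object {
            [PSCustomObject]@{
                Name          = $_.Name
                Version       = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
                LastWriteTime = $_.LastWriteTime
                MD5           = (Get-FileHash $_.FullName -Algorithm MD5).Hash
            }
        } |
        Export-Csv -Path "C:\LOCALPATH\$ip.csv" -NoTypeInformation
}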

How to check a folder for the appearance of any new file?

Help please, I cannot find a solution. (Windows platform)
I need to:
Scan a folder.
Detect when any new file appears in it.
Process the new file.
Another method to detect "new" files is the archive attribute. Whenever a file is created or changed, this attribute is set by Windows.
Whenever you process a file, unset its archive attribute (attrib -a file.ext).
The advantage is that you don't depend on any timing.
To list "new" (or changed) files, use dir /aa (dir /a-a will list already-processed files).
For more info, see dir /? and attrib /?. A PowerShell equivalent is sketched below.
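The same idea in PowerShell (a sketch; Process-File is a hypothetical placeholder for your own processing step, and the folder path is illustrative):
# Pick up files whose archive attribute is set, process them, then clear the attribute
Get-ChildItem "C:\TargetDirectory" -File -Attributes Archive | ForEach-Object {
    # Process-File $_.FullName
    attrib -a $_.FullName
}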
Without knowing exactly what you're trying to execute, this is all I can provide. You would theoretically run this as a scheduled task every hour:
foreach ($file in (Get-ChildItem "C:\TargetDirectory" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-1) })) {
    # Execute-Command -Target $file
}
You could use the FileSystemWatcher class to monitor the folder for new files.
It can easily be used from PowerShell as well:
$FSW = New-Object System.IO.FileSystemWatcher
Then use Register-ObjectEvent to "listen" for events raised from it
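A minimal end-to-end sketch (the folder path and filter are placeholders):
$FSW = New-Object System.IO.FileSystemWatcher "C:\TargetDirectory", "*.*"
# Events are not raised until this is enabled
$FSW.EnableRaisingEvents = $true
Register-ObjectEvent $FSW Created -SourceIdentifier "FolderNewFile" -Action {
    Write-Host "New file: $($Event.SourceEventArgs.FullPath)"
}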
FileSystemWatcher is a utility I have recently learned and will definitely use in the future. The best part is that it relies on .NET eventing, so you don't need to build an external triggering structure.
Here is an example of how I am using this in a 24/7 production environment (the full script receives an xml, processes it, and inserts the results into SQL in under 3 seconds).
Function Submit-ResultFile {
    # Actual file processing takes place here
}
Function Create-Fsw {
    Register-ObjectEvent $fsw Created -SourceIdentifier "Spectro FileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $file = $Event.SourceEventArgs.FullPath
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -ForegroundColor Green
        Submit-ResultFile -xmlfile $file
    }
}
# $watchFolder and $watchFilter are defined earlier in the full script.
# You can change IncludeSubdirectories to $true if required; events won't fire unless EnableRaisingEvents is set.
$fsw = New-Object IO.FileSystemWatcher $watchFolder, $watchFilter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'; EnableRaisingEvents = $true}
# Sweep for files that existed before the watcher starts (see the note below)
$xmlFiles = Get-ChildItem -Path $ResultsDirectory -Filter *.xml
foreach ($file in $xmlFiles)
{
    Submit-ResultFile -xmlfile $file.FullName
}
# Register a new File System Watcher
Create-Fsw
Several important points to be aware of:
- If files exist in that location before the FSW is created, they WILL NOT trigger an object event, so in my script you'll observe that I begin by running a sweep for existing files.
- When the FSW does trigger, you want it to process only one file at a time, since the next file-creation event will generate a new object event. Structuring an FSW to work on multiple files per trigger will eventually result in a crash.
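When the watcher is no longer needed (e.g. on script shutdown), the event subscription can be removed using the same identifier registered above:
Unregister-Event -SourceIdentifier "Spectro FileCreated"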
