Not able to transfer multiple files even though the correct path was provided
###### Transfer file
Write-Host ("Start script.")
try
{
$todaysDate = (Get-Date).ToString('yyyy-MM-dd')
Add-Type -Path "D:\WinSCP\WinSCPnet.dll"
# Setup session options
$session = New-Object WinSCP.Session
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
$SessionOptions.Timeout = New-TimeSpan -Seconds 90
$sessionOptions.HostName = "hostname"
$sessionOptions.UserName = "username"
$sessionOptions.PortNumber = "portnumber"
$sessionOptions.Password = ""
$sessionOptions.SshPrivateKeyPath = "D:\privatekey"
$sessionOptions.SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxx"
#Write-Host ("Connecting.")
$session = New-Object WinSCP.Session
$session.SessionLogPath = "D:\WinSCPSessionLog_$todaysDate.log"
#Upload files
try
{
# Connect
$session.Open($sessionOptions)
#File list:
Write-Host ("File list: ")
#transferoptions
$transferOptions = New-Object WinSCP.TransferOptions
#$transferOptions.FileMask = "*.*"
$transferOptions.FilePermissions = $Null # This is default
$transferOptions.PreserveTimestamp = $False # if timestamp on file is enabled
$localPath = Get-ChildItem "D:\dq\*.csv" | Where-Object {($_.LastWriteTime -ge [datetime]::today)}
$remotePath = "/Outbox/"
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath, $False, $transferOptions)
# Upload files, collect results
#$transferResult = $session.PutFiles(($localPath + "*.*"), ($remotePath, + "*.*") $False, $transferOptions).Check()
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
#$transferResult = $session.PutFiles($localPath, $remotePath, $False, $transferOptions).Check()
Write-Host ("Upload of {0} succeeded, moving to save" -f $transfer.FileName)
}
else
{
Write-Host ("Upload of {0} failed: {1}" -f $transfer.FileName, $transfer.Error.Message)
}
}
#End of files:
Write-Host ("End of files. ")
}
finally
{
# Disconnect, clean up
$session.Dispose()
Write-Host ("Disconnected.")
}
#exit 0
}
catch [Exception]
{
$todaysDate = (Get-Date).ToString('yyyy-MM-dd')
Set-Content -Path "D:\WinSCPError_$todaysDate.log" $_.Exception.Message
#exit 1
}
Log File:
File list:
Upload of D:\dq\ADD.csv D:\dq\MINUS.csv D:\dq\DIVIDE.csv failed: File or folder 'D:\dq\ADD.csv D:\dq\MINUS.csv D:\dq\DIVIDE.csv' does not exist.
System Error. Code: 123.
The filename, directory name, or volume label syntax is incorrect
Error:
Not able to transfer the files because WinSCP reports that the file does not exist, even though I have provided the correct path.
Tools:
Using a very old version of .NET
Using a very old version of Windows Server
Expected result:
Able to transfer all files matching "D:\dq\*.csv" whose modified date is today's date.
"I have inputted correct path"
Nope, you haven't. From the WinSCP documentation for the localPath argument:
Full path to local file or directory to upload. Filename in the path can be replaced with Windows wildcard to select multiple files. To upload all files in a directory, use mask *.
That is, it expects one path, not multiple.
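That also explains the log message above: $localPath holds an array of FileInfo objects from Get-ChildItem, and when that array is squeezed into the single string parameter of PutFiles, PowerShell joins the elements with spaces, producing the nonexistent combined path in the error. A quick illustration, using the file names from the log:
$localPath = Get-ChildItem "D:\dq\*.csv"   # FileInfo objects, e.g. ADD.csv, MINUS.csv, DIVIDE.csv
"$localPath"                               # "D:\dq\ADD.csv D:\dq\MINUS.csv D:\dq\DIVIDE.csv" - one invalid path string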
Given your requirement for filtering on LastWriteTime, passing "D:\dq\*.csv" as the argument is not a viable option, so the solution is to upload the files one-by-one:
# ...
try{
# Connect
$session.Open($sessionOptions)
#File list:
Write-Host ("File list: ")
#transferoptions
$transferOptions = New-Object WinSCP.TransferOptions
#$transferOptions.FileMask = "*.*"
$transferOptions.FilePermissions = $Null # This is default
$transferOptions.PreserveTimestamp = $False # if timestamp on file is enabled
# Loop through each relevant file
foreach($localFile in Get-ChildItem "D:\dq\*.csv" | Where-Object {($_.LastWriteTime -ge [datetime]::today)}){
$remotePath = "/Outbox/"
# Upload files, collect results
$transferResult = $session.PutFileToDirectory($localFile.FullName, $remotePath, $false, $transferOptions)
# Success or error?
if ($transferResult.Error -eq $Null) {
#$transferResult = $session.PutFiles($localPath, $remotePath, $False, $transferOptions).Check()
Write-Host ("Upload of {0} succeeded, moving to save" -f $transferResult.FileName)
}
else {
Write-Host ("Upload of {0} failed: {1}" -f $transferResult.FileName, $transferResult.Error.Message)
}
}
}
finally{
# ...
}
The answer by @Martias is correct. You cannot pass multiple paths to Session.PutFiles.
But you can let WinSCP itself pick today's *.csv files with just two lines of code, using a time constraint in the file mask.
Using Session.PutFilesToDirectory would also make the code somewhat simpler.
$transferOptions = New-Object WinSCP.TransferOptions
# (your other transfer options)
$transferOptions.FileMask = ">=today"
$transferResult =
$session.PutFilesToDirectory(
$localPath, $remotePath, "*.csv", $False, $transferOptions).Check()
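Note that with Session.PutFilesToDirectory the first two arguments are directory paths rather than a file listing, so in terms of the question they would be something like this (values taken from the question):
$localPath  = "D:\dq"      # local source directory
$remotePath = "/Outbox/"   # remote target directory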
Related
I frequently have to copy a single file to multiple destinations, so I'm trying to write a script to make that go faster. It seems to work fine when I'm dealing with local files, but it fails without any errors when running on a file that is on a mapped network drive.
At first I was using Copy-Item, and I couldn't make that work, so I used robocopy. That does the trick, but if the file already exists, I have an if statement using Test-Path which is supposed to skip to a user prompt that asks whether you want to overwrite; this is not working. I should say the check that the folder exists is working, but the one that checks for the file name always comes back true. For now, I have it just forcing an overwrite with robocopy, because most of the time that's what I'll want to do anyway.
Here's what I have right now. "K:" is the mapped network drive I'm copying to, and I'm usually copying files from another mapped network drive, "T:". I should also mention I have this set up to run from the context menu in Windows (7) Explorer, and it passes the file path to the script via %L and $args.
Any advice is appreciated. (I apologize in advance, I know it's rather rough; this is somewhat new to me.)
$Folders = @("K:\OKKHM 800" , "K:\OKKHM 1000" , "K:\OKKHM 1002" , "K:\OKKHM 1003" , "K:\OKKHM 1004", "K:\OKKHM 1250")
$source = $args[0]
$Filename = Split-Path -Path $source -Leaf
$sourcefolder= split-path -path $source -parent
$COUNTER = 0
$successful=0
$CONFIRMATION=0
foreach($Folder in $Folders){
$newpath = $folder + "\" + $filename
WRITE-HOST $NEWPATH
if(-not(test-path -path $newpath)) {
if((test-path -path $folder)) {
WRITE-HOST 'TEST 2'
robocopy $sourcefolder $folder $filename -is -it
$successful=1
}
else{
write-host 'folder does not exist'
}
}
else {
$title = 'Existing File Will Be Overwritten'
$question = 'Are you sure you want to proceed?'
$choices = New-Object Collections.ObjectModel.Collection[Management.Automation.Host.ChoiceDescription]
$choices.Add((New-Object Management.Automation.Host.ChoiceDescription -ArgumentList '&Yes'))
$choices.Add((New-Object Management.Automation.Host.ChoiceDescription -ArgumentList '&No'))
$decision = $Host.UI.PromptForChoice($title, $question, $choices, 1)
if ($decision -eq 0) {
Write-Host 'confirmed'
$CONFIRMATION=1
}
else {
Write-Host 'cancelled'
$CONFIRMATION=0
}
IF ($CONFIRMATION -EQ 1) {
try {
robocopy $sourcefolder $folder $filename
$successful=1
}
catch {
throw "NO GOOD"
}
}
}
$COUNTER++
}
if ($successful -eq 1) {
WRITE-HOST 'SUMMARY: ' $COUNTER ' FILES COPIED SUCCESSFULLY.'
}
Start-Sleep 5
I am trying to understand what this error actually means. I am new to PowerShell and cannot figure this one out. I have searched for similar questions, but their content differs from my requirement.
In a nutshell, the script queries a data historian system for a batch/lot number and the start time of that batch.
The script will run every minute using Task Scheduler. This has not been set up yet, as I am still in the testing phase.
I have set up a service account in order for the script to run; its details are stored in a cred file.
The script creates a folder using this batch/lot number.
The script creates a log file with the batch number and the start date and time of the batch.
The script then watches a source folder on the server; when a file is uploaded from the factory floor into the source folder, the script moves the file into the already created folder with the correct batch number.
Files that fall outside of the batch start and end time are moved to a "no batch" folder, where they will be reviewed manually.
I have done tests whereby I manually added files to the source folder on the server; everything worked, and I did not get the "a positional parameter cannot be found that accepts argument '+'" error from the script.
I am looking into the server configuration and permission levels, but to my knowledge nothing has changed. I cannot see what is wrong with the script, but hopefully someone can give me some pointers.
Error code below:
PS C:\Users\a-graydx2> E:\Kistler Script\Batch ID with log 2021-11-29.ps1
An error occurred:
Key not valid for use in specified state.
Add-Content : A positional parameter cannot be found that accepts argument '+'.
At E:\Kistler Script\Batch ID with log 2021-11-29.ps1:186 char:11
+ Add-Content -Path $ErrorFileName -Value (Get-Date -Format " ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Add-Content], ParameterBindingException
+ FullyQualifiedErrorId :
PositionalParameterNotFound,Microsoft.PowerShell.Commands.AddContentCommand
An error occurred:
Key not valid for use in specified state.
Add-Content : A positional parameter cannot be found that accepts argument '+'.
At E:\Kistler Script\Batch ID with log 2021-11-29.ps1:186 char:11
+ Add-Content -Path $ErrorFileName -Value (Get-Date -Format " ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Add-Content], ParameterBindingException
+ FullyQualifiedErrorId :
PositionalParameterNotFound,Microsoft.PowerShell.Commands.AddContentCommand
Script is below
Thanks for your help
# Declare global variables
$fmSourcePath = "E:\Kistler\CoMo Services\Data\5336_L1.4345277\"
$shSourcePath = "E:\Kistler\CoMo Services\Data\5338_L1.5338_L1\"
$fmDesinationPath = "E:\Kistler XML Files\FM\"
$shDesinationPath = "E:\Kistler XML Files\SH\"
$fmWebAPI = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
$shWebAPI = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
# the path to stored credential
$credPath = "E:\Kistler Script\Cred.xml"
$logFileName = "BatchLog.txt"
#Path to the error log file
$ErrorFileName = "E:\Kistler Script\errorlog.txt"
function Move_Kistler_Files {
param (
[string]$url,
[string]$SourcePath,
[string]$DestinationPath
)
try {
# check for stored credential
if ( Test-Path $credPath ) {
#credential is stored, load it
$cred = Import-CliXml -Path $credPath
} else {
# no stored credential then: create store, get credential and save it
$parent = split-path $credpath -parent
if ( -not ( test-Path $parent ) ) {
New-Item -ItemType Directory -Force -Path $parent
}
$cred = get-credential
$cred | Export-CliXml -Path $credPath
}
# Get the current batch id using the Web-API call
$result = Invoke-RestMethod -Uri $url -Credential $Cred
$BatchID = $result.Value
$BatchFolder = $DestinationPath + $BatchID
Write-Host $BatchFolder
# Create a new folder in the destination path based on the Batch ID
If(!(test-path $BatchFolder))
{
New-Item -ItemType Directory -Force -Path $BatchFolder | Out-Null
# Add the current date/time to the log file
$LogFile = $DestinationPath + $logFileName
# if file exists, update the last record with the batch end date
If((test-path $LogFile)){
$txt = get-content $LogFile
$txt[$txt.length - 1 ] = $txt[$txt.length - 1 ] + ", " + (Get-Date)
$txt | set-content $LogFile
}else{
#add a header row in the file
Add-Content -Path $LogFile -Value "BatchID, StartDate, EndDate"
}
# create a new record in the log file with current Batch Id and date as start of batch indicator
$msg = $BatchID + ", " + (Get-Date)
Add-Content -Path $LogFile -Value $msg
}
##############################################################################
# Copy the Kistler XML files from the source to the destination
##############################################################################
# get al the Kistler XML files in the source folder
$Files = get-childitem -path $SourcePath -Recurse | Where-Object {$_.Extension -eq ".XML"} | Sort-Object LastWriteTime -Descending
# If we have files to process do it
if ($Files.Length -gt 0) {
# read back the batch start and end dates from the log table
$LogFile = $DestinationPath + $logFileName
$txt = get-content $LogFile
# Get the latest Batch Id and its start date
$FileMoveCount = 0
$FileNotMoveCount = 0
$ptr = 1
$batchArray =$txt[$txt.length - $ptr ].Split(",")
$MoveToPath = $DestinationPath + $batchArray[0]
$batchStartDate = $batchArray[1]
#Process each XML file
Foreach ($File in $Files ) {
$FileTime = $File.LastWriteTime
#write-host $File.FullName $File.Name $FileTime $MoveToPath $batchStartDate
#if the XML file's date-time is older than the batch start time, skip to the previous Batch Id and start time
while ( ([DateTime]$FileTime -lt [DateTime]$batchStartDate) -and ($ptr -lt ($txt.length)-1) ) {
#Write a log for the number of files copied
if ($FileMoveCount -gt 0){
Add-Content -Path $ErrorFileName -Value ((Get-Date -Format "dd/MM/yyyy HH:mm") + ": " + $FileMoveCount + " XML files moved to " + $MoveToPath)
$FileMoveCount = 0
}
$ptr++
$batchArray =$txt[$txt.length - $ptr ].Split(",")
$MoveToPath = $DestinationPath + $batchArray[0]
$batchStartDate = $batchArray[1]
#write-host $MoveToPath $batchStartDate
}
#Copy the XML file to the destination folder
if ([DateTime]$FileTime -ge [DateTime]$batchStartDate){
Move-Item $File.FullName -Destination ($MoveToPath + "\" + $File.Name)
$FileMoveCount++
}else{
Move-Item $File.FullName -Destination ($DestinationPath + "\NoBatch\" + $File.Name)
$FileNotMoveCount++
}
}
#Write a log for the number of files copied
if ($FileMoveCount -gt 0){
Add-Content -Path $ErrorFileName -Value ((Get-Date -Format "dd/MM/yyyy HH:mm") + ": " + $FileMoveCount + " XML files moved to " + $MoveToPath)
}
if ($FileNotMoveCount -gt 0){
Add-Content -Path $ErrorFileName -Value ((Get-Date -Format "dd/MM/yyyy HH:mm") + ": Could not find batch ID for " + $FileNotMoveCount + " XML files " )
}
}
}catch{
#Write the error
Write-Host "An error occurred:" -ForegroundColor red
Write-Host $_ -ForegroundColor red
Add-Content -Path $ErrorFileName -Value (Get-Date -Format "dd/MM/yyyy HH:mm") + ": " + $_
}
}
### Process the FM Kistler files
Move_Kistler_Files $fmWebAPI $fmSourcePath $fmDesinationPath
### Process the SH Kistler files
Move_Kistler_Files $shWebAPI $shSourcePath $shDesinationPath
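For what it's worth, the "A positional parameter cannot be found that accepts argument '+'" message usually means that an expression passed to a parameter is not fully wrapped in parentheses, so the + and the operands after it are parsed as additional positional arguments. That matches the Add-Content call in the catch block, which only parenthesizes the Get-Date part. A minimal sketch of the likely fix:
# As posted: only (Get-Date ...) is grouped, so + ": " + $_ become extra positional arguments
Add-Content -Path $ErrorFileName -Value (Get-Date -Format "dd/MM/yyyy HH:mm") + ": " + $_
# Grouped so the concatenation happens before binding to -Value
Add-Content -Path $ErrorFileName -Value ((Get-Date -Format "dd/MM/yyyy HH:mm") + ": " + $_)
# Or with a subexpression inside a string
Add-Content -Path $ErrorFileName -Value "$(Get-Date -Format 'dd/MM/yyyy HH:mm'): $_"
The preceding "Key not valid for use in specified state." message is typically Import-CliXml failing to decrypt the stored credential because Cred.xml was exported under a different user account (the encryption is per-user), which is worth checking separately given that the script is meant to run under a service account.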
I wrote a PowerShell utility that takes in a couple parameters, and transfers files from a source directory to a destination directory.
Initially, all was done as a single function, and worked well enough.
Before adding some features, I broke repeated logic into its own function.
Then, the ISSUES began.
It appears that the Param() variables are seeded with incorrect values. Running the script yields the following:
PS ...> .\photoTransfer.ps1 E:\DCIM\100OLYMP
Cannot convert value "" to type "System.Boolean". Boolean parameters accept only Boolean values and numbers, such as
$True, $False, 1 or 0.
At C:\Users\SWPhantom\Desktop\admin\photoTransfer.ps1:85 char:3
+ [Parameter(Mandatory=$true, Position = 0, HelpMessage = "The path o ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : MetadataError: (:) [], ArgumentTransformationMetadataException
+ FullyQualifiedErrorId : RuntimeException
I can confirm that something's strange with Write-Output "src: $source", which spits out src: True. (Expected to be src: E:\DCIM\100OLYMP)
HOWEVER: I can get the value I expect to be passed in with an $args[0].
I expect that the issue is simple, but I can't pick up on it, as this was my first foray into more... mature PowerShell scripting.
I am getting around the immediate problem by using the $args[i] method, but it'd be nice to not get an error message and to use the seemingly nice and orderly Params. (Especially since they seemed to work before I made the separate Transfer function.)
Thanks!
Full code:
# Purpose: Transfer all photos from a memory card, to a destination folder, organized by year, month, date.
# Ensure that the Date Modified and Date Created is preserved.
function Transfer {
Param(
[Parameter(Mandatory, Position = 0)]
[string]$src,
[Parameter(Mandatory, Position = 1)]
[string]$dst,
[Parameter(Mandatory, Position = 2)]
[string]$extension
)
# Look at the source directory. Enumerate files to be sent over. (Only copy .ORF/.MOV files)
$files = Get-ChildItem -Path $src -Include $extension -Recurse
$numberOfFiles = $files.Count
if($numberOfFiles -eq 0) {
return "No $extension files found in $src!"
}
# Give user last chance to stop program. Show them number of files and destination folder.
Write-Output "Ensure the action is correct:"
read-host "Copying $numberOfFiles files from $src to $dst ?`nPress Enter to continue"
# Iteration for progress tracking.
$iter = 1
# Foreach file, check the Date Modified field. Make sure the destination folder has the folder structure like:
# Drive/Photos/YYYY/MM/DD/
# Where the YMD matches the Date Modified field of every photo.
foreach ($file in $files) {
$originalCreationTime = $file.LastWriteTime
[string]$year = $file.LastWriteTime.Year
[string]$month = $file.LastWriteTime.Month
[string]$date = $file.LastWriteTime.Day
# Add leading zero, if necessary
if($month.length -lt 2) {
$month = "0" + $month
}
if($date.length -lt 2) {
$date = "0" + $date
}
# Test the path of destinationPath/YYYY/MM/DD/
$path = $dst + "$year\$month\$date\"
if (!(Test-Path -Path $path)) {
if($verb) {
Write-Output " $path"
}
New-Item -ItemType Directory -Path $path
}
# The filepath exists!
if($verb) {
Write-Output " ($iter/$numberOfFiles) $file -> $path"
}
$iter += 1
Copy-Item $file.FullName -Destination $path
# Fix the Creation Time
$(Get-Item -Path "$path$($file.Name)").CreationTime=$originalCreationTime
}
Write-Output "`nCopying done!`n"
# Delete items?
Write-Output "Delete $numberOfItems items?"
$del = read-host "Deleting copied files from $src ?`nY to continue"
if($del -eq "Y") {
foreach ($file in $files) {
Remove-Item $file.FullName
}
}
}
Param(
# Source Folder
[Parameter(Mandatory=$true, Position = 0, HelpMessage = "The path of the source of the media")]
[Alias("s")]
[string]$source,
# Photo Destination
[Parameter(Mandatory=$false, Position = 1, HelpMessage = "The path of the folder you want to move photos to")]
[Alias("pd")]
[string]$photoDestination,
# Video Destination
[Parameter(Mandatory=$false, Position = 2, HelpMessage = "The path of the folder you want to move videos to")]
[Alias("vd")]
[string]$videoDestionation,
# Verbosity
[Parameter(Position = 3, HelpMessage = "Turn extra logging on or off")]
[Alias("v")]
[bool]$verb = $true
)
$usageHelpText = "usage:`n photoTransfer.ps1 <DriveName> <pathToDestinationRootFolder>`nex:`n .\photoTransfer.ps1 C T:\Photos"
#TODO: Solve this conundrum, where passing a via CMD
# Write-Output "Source before treatment: $($args[0])"
# Write-Output "Source before treatment: $($args[1])"
# Write-Output "Source before treatment: $($args[2])"
# Write-Output "Source before treatment: $($args[3])"
$source = $args[0]
$verb = $true
# I expect a drive name. If a ':' is missing, I add it.
if(!$source.Contains(":")) {
$source = $source + ":"
}
# The assumption is that the photos are coming from my Olympus camera, which has the following path to the files.
# $olympusFolderPath = "DCIM\100OLYMP\"
# $source += $olympusFolderPath
# Make sure the destination path has a terminating '\'
# if(!($photoDestination -match "\\$")) {
# $photoDestination = $photoDestination + "\"
# }
$photoDestination = "T:\Photos\"
$videoDestionation = "T:\Footage\"
# Check if the source and destination paths are valid.
if (!(Test-Path -Path $source)) {
Write-Output "Source disk ($source) doesn't exist`n$usageHelpText"
exit 0
}
if (!(Test-Path -Path $photoDestination)) {
Write-Output "Destination path ($photoDestination) doesn't exist`n$usageHelpText"
exit 0
}
if (!(Test-Path -Path $videoDestionation)) {
Write-Output "Destination path ($videoDestionation) doesn't exist`n$usageHelpText"
exit 0
}
Transfer $source $photoDestination "*.ORF"
Transfer $source $videoDestionation "*.MOV"
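A likely cause, given that the error points at the script-level Param() block: a param block must be the first statement in a .ps1 file (only comments, #requires statements and using statements may precede it). Because the Transfer function and other code sit above it here, the block no longer acts as the script's parameter declaration, so the positional argument is not bound the way you expect. A minimal sketch of the rearrangement, with the declarations themselves unchanged:
# param() moved to the very top of photoTransfer.ps1, above the function definition
Param(
    [Parameter(Mandatory=$true, Position = 0, HelpMessage = "The path of the source of the media")]
    [Alias("s")]
    [string]$source,
    [Parameter(Mandatory=$false, Position = 1, HelpMessage = "The path of the folder you want to move photos to")]
    [Alias("pd")]
    [string]$photoDestination,
    [Parameter(Mandatory=$false, Position = 2, HelpMessage = "The path of the folder you want to move videos to")]
    [Alias("vd")]
    [string]$videoDestionation,
    [Parameter(Position = 3, HelpMessage = "Turn extra logging on or off")]
    [Alias("v")]
    [bool]$verb = $true
)

function Transfer {
    # ... (function body unchanged) ...
}

# With the param block in effect, ".\photoTransfer.ps1 E:\DCIM\100OLYMP" binds to $source,
# and the later "$source = $args[0]" line should be removed - once parameters are declared,
# $args is empty and that assignment would overwrite the bound value with $null.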
What I do is copy photo files from an SD card to an HDD using a PowerShell .ps1 file and Windows PowerShell ISE.
I get the date taken from the image EXIF data and add it to the destination path.
The problem is that robocopy creates the folders with a strange prefix added, which I do not want to have.
As a result I can see two subfolders with the same name, "2020": one folder created by hand and the other created by robocopy.
The prefix is only seen when I list folders with CMD.
The prefix is not seen in output.log or in PowerShell.
$copy_from = "G:\DCIM\100MSDCF\"
$copy_to = "C:\Photos\"
function GetDateTaken {
param (
[Parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('FullName')]
[String]
$Path
)
begin {
$shell = New-Object -COMObject Shell.Application
}
process {
$returnvalue = 1 | Select-Object -Property Name, DateTaken, Folder
$returnvalue.Name = Split-Path $path -Leaf
$returnvalue.Folder = Split-Path $path
$shellfolder = $shell.Namespace($returnvalue.Folder)
$shellfile = $shellfolder.ParseName($returnvalue.Name)
$returnvalue.DateTaken = $shellfolder.GetDetailsOf($shellfile, 12)
$returnvalue.DateTaken
}
}
$file = Get-ChildItem -Path $copy_from -recurse -include ('*.jpg','*.arw')
$i = 0
$jpg = 0
$arw = 0
$logifile = 'output.log'
if ([System.IO.File]::Exists($logifile)) {
Clear-Content $logifile
Write-Host ("Logfile cleaned: $logifile")
} else {
try {
New-Item -Path . -Name $logifile | Out-Null
Write-Host ("New logfile created: $logifile")
}
catch {
"Failed to create $logifile"
}
}
foreach ($file in $file) {
if ($file.extension -eq '.JPG') { $jpg++ }
if ($file.extension -eq '.ARW') { $arw++ }
$i++
$datetaken = ($file.fullname | GetDateTaken).Split(' ')[0]
$datetaken_Day = $datetaken.Split('.')[0]
$datetaken_Month = $datetaken.Split('.')[1]
$datetaken_Year = $datetaken.Split('.')[2]
$TargetPath = "$copy_to$datetaken_Year\$datetaken_Month\$datetaken_Day\"
Write-Host ("$i. " + $file.Name + " `tDate taken: " + $datetaken)
robocopy $copy_from $TargetPath $file.Name /ts /fp /v /np /unilog+:$logifile | Out-Null
}
Write-Host ("`nTotal: " + $i + " files (" + $jpg + " JPG files, " + $arw + " ARW files)")
It does not help if I write $TargetPath = $copy_to + $datetaken_Year + "\" + $datetaken_Month + "\" + $datetaken_Day + "\".
It does not help if I set the /fat option for robocopy.
But, for example, when I set the year manually, everything is OK: $datetaken_Year = 2020.
What should be fixed to create correct folder names?
Using the GetDetailsOf() method from the COM object returns localized results, which leads to your function on my Dutch machine returning the date in 'dd-MM-yyyy HH:mm' format (with invisible characters surrounding it).
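If you want to keep the COM/GetDetailsOf() approach, a possible workaround (assuming the invisible characters are the usual Unicode left-to-right/right-to-left marks, U+200E/U+200F, that Explorer embeds in these strings) is to strip them before returning the value:
# hypothetical tweak inside the process block of GetDateTaken
$raw = $shellfolder.GetDetailsOf($shellfile, 12)
# remove the invisible directionality marks so they cannot end up in folder names
$returnvalue.DateTaken = $raw -replace '[\u200e\u200f]'
$returnvalue.DateTaken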
A better approach IMO would be to get the date taken using System.Drawing.Imaging.Metafile to read the EXIF data as a null-terminated byte array and parse the date from that into a DateTime object, using the function below:
function Get-ExifDate {
# returns the 'DateTimeOriginal' property from the Exif metadata in an image file if possible
[CmdletBinding(DefaultParameterSetName = 'ByName')]
Param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true, Position = 0, ParameterSetName = 'ByName')]
[Alias('FullName', 'FileName')]
[ValidateScript({ Test-Path -Path $_ -PathType Leaf})]
[string]$Path,
[Parameter(Mandatory = $true, ValueFromPipeline = $true, Position = 0, ParameterSetName = 'ByObject')]
[System.IO.FileInfo]$FileObject
)
Begin {
Add-Type -AssemblyName 'System.Drawing'
}
Process {
# the function received a path, not a file object
if ($PSCmdlet.ParameterSetName -eq 'ByName') {
$FileObject = Get-Item -Path $Path -Force -ErrorAction SilentlyContinue
}
# Parameters for FileStream: Open/Read/SequentialScan
$streamArgs = @(
$FileObject.FullName
[System.IO.FileMode]::Open
[System.IO.FileAccess]::Read
[System.IO.FileShare]::Read
1024, # Buffer size
[System.IO.FileOptions]::SequentialScan
)
try {
$stream = New-Object System.IO.FileStream -ArgumentList $streamArgs
$metaData = [System.Drawing.Imaging.Metafile]::FromStream($stream)
# get the 'DateTimeOriginal' property (ID = 36867) from the metadata
# Tag Dec   TagId Hex   TagName            Writable   Group     Notes
# -------   ---------   -------            --------   -----     -----
# 36867     0x9003      DateTimeOriginal   string     ExifIFD   (date/time when original image was taken)
# get the date taken as an array of bytes
$exifDateBytes = $metaData.GetPropertyItem(36867).Value
# transform to string, but beware that this string is Null terminated, so cut off the trailing 0 character
$exifDateString = [System.Text.Encoding]::ASCII.GetString($exifDateBytes).TrimEnd("`0")
# return the parsed date
return [datetime]::ParseExact($exifDateString, "yyyy:MM:dd HH:mm:ss", $null)
}
catch{
Write-Warning -Message "Could not read Exif data from '$($FileObject.FullName)'"
}
finally {
If ($metaData) {$metaData.Dispose()}
If ($stream) {$stream.Close()}
}
}
}
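With that function in place, the call in the question's loop could be as simple as this (a sketch; note that it returns a DateTime object rather than a string, which matters for the formatting discussed further below):
$dateTaken = Get-ExifDate -Path $file.FullName   # a [datetime] object, or nothing if the EXIF data could not be read
if (-not $dateTaken) { continue }                # skip files (e.g. some raw formats) where no date taken was found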
Another option would be to download and unzip ExifTool
(you can download the zip files from here)
Then use it like:
$exifTool = 'Path\To\Unzipped\ExifTool.exe' # don't forget to 'Unblock' after downloading
$file = 'Path\To\The\ImageFile' # fullname
# retrieve all date tags in the file
# -s2 (or -s -s) returns short tag names and adds the colon directly after them
$allDates = & $exifTool -time:all -s2 $file
# try to find a line with tag 'DateTimeOriginal', 'CreateDate' or 'ModifyDate'
# which will show a date format of 'yyyy:MM:dd HH:mm:ss'
# and parse a DateTime object out of this string
$dateTaken = switch -Regex ($allDates) {
'^(?:DateTimeOriginal|CreateDate|ModifyDate):\s(\d{4}:\d{2}:\d{2} \d{2}:\d{2}:\d{2})' {
[datetime]::ParseExact($matches[1], 'yyyy:MM:dd HH:mm:ss', $null)
break
}
}
Short explanation of what the above returns
Both methods return the date the image was taken as a DateTime object, not a string.
This object has properties like .Year, .Month, .Day etc. It also has various methods like .AddDays(), .ToShortDateString(), .ToString() and a lot more.
If you do $datetaken = ($datetaken -split ' ')[0] as per your comment, you are asking PowerShell to implicitly convert it to a string using the default ToString() method.
You can use that ToString() method in your code by giving it the formatting string you need between the brackets, any way you like.
If you for instance do $dateTaken.ToString('yyyy\\MM\\dd'), you'll get a string 2020\10\08 if $dateTaken was today, which could serve as part of a file path.
In your code, you could do:
$TargetPath = Join-Path -Path $copy_to -ChildPath $dateTaken.ToString('yyyy\\MM\\dd')
# if that path does not exist yet, create it
if (!(Test-Path -Path $TargetPath -PathType Container)) {
$null = New-Item -Path $TargetPath -ItemType Directory
}
Then go ahead and copy the file to the now existing $TargetPath
Please have a look at all the standard format strings and custom format specifiers you can use on a DateTime object.
Hoping someone can guide me / help me.
The issue: I have 2 servers, one running Ubuntu, which hosts a website for clients to log in and download / view reports. The other is a Windows Server 2012 R2 machine which creates / stores the reports. I need to move the files from the Windows server to the Ubuntu server so clients can view them. The data is large, currently 7 GB, and growing at 3 GB a year.
I need a batch file to connect using FTP and then copy the folder to a local folder. However, it only needs to copy those files which have been modified.
I have only ever written one batch file, and I can't seem to find any FTP batch scripts which only copy modified files.
You're my last resort, as I can't seem to find a coder who knows batch script (it's a dying art). I have never used PowerShell, so I would not know where to start here.
Any help or advice would be appreciated; please let me know.
Thanks
John
You can do it with PowerShell and WinSCP. Example:
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Sftp
HostName = "example.com"
UserName = "user"
Password = "mypassword"
SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
$transferResult = $session.PutFiles("d:\toupload\*", "/home/user/", $False, $transferOptions)
# Throw on any error
$transferResult.Check()
# Print results
foreach ($transfer in $transferResult.Transfers)
{
Write-Host ("Upload of {0} succeeded" -f $transfer.FileName)
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host ("Error: {0}" -f $_.Exception.Message)
exit 1
}
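Since the requirement is to upload only files that have changed, it is worth noting that the same WinSCP .NET assembly also provides Session.SynchronizeDirectories, which compares both sides and transfers only new or modified files. A minimal sketch, reusing the session setup and placeholder paths from the example above:
# inside the inner try block, instead of PutFiles:
$synchronizationResult = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "d:\toupload", "/home/user/", $False)

# Throw on any error
$synchronizationResult.Check()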
This would be a way to do it in PowerShell. It takes files that are older than 31 days and uploads them.
function FTP-Upload {
[CmdletBinding()]
param(
[Parameter(Mandatory=$true)]
[string]$Source_File,
[Parameter(Mandatory=$true)]
[string]$Target_File,
[Parameter(Mandatory=$true)]
[string]$Target_Server,
[Parameter(Mandatory=$true)]
[string]$Target_Username,
[Parameter(Mandatory=$true)]
[string]$Target_Password
)
$FTP = [System.Net.FTPWebRequest]::Create("ftp://$Target_Server/$Target_File")
$FTP = [System.Net.FTPWebRequest]$FTP
$FTP.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$FTP.Credentials = New-Object System.Net.NetworkCredential($Target_Username,$Target_Password)
$FTP.UseBinary = $true
$FTP.UsePassive = $true
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes($Source_File)
$FTP.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $FTP.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
}
$Upload_Server = "server.network.tld"
$Upload_Location = "/data/"
$Upload_Username = "ftpuser"
$Upload_Password = "ftppassword"
$Files_To_Upload = Get-ChildItem E:\Path\To\Files -Recurse | Where-Object {($_.CreationTime -le (Get-Date).AddDays(-31)) -and (!$_.PSIsContainer)}
Foreach ($File in $Files_To_Upload) {
FTP-Upload -Source_File $File.FullName -Target_File ($Upload_Location + $File.Name) -Target_Server $Upload_Server -Target_Username $Upload_Username -Target_Password $Upload_Password
}