How do I create a PowerShell script that moves files from one location to another, while excluding any file that is less than 5 days old?
I used Move-Item to move files from one location to another, but I need to move only the files which are more than 5 days old.
I think this pipeline should work for you:
Get-ChildItem -Path C:\Downloads -File | Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-5) } | Move-Item -Destination C:\button
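If you want to check what would be moved before running it for real, you can add -WhatIf for a dry run. Also note that CreationTime is when the file was created on that volume; if "5 days old" should really mean "not modified in 5 days", LastWriteTime may be the better property. A small sketch using the same paths from the question:
# Dry run: list what would move without actually moving anything (remove -WhatIf to do it)
Get-ChildItem -Path C:\Downloads -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5) } |
    Move-Item -Destination C:\button -WhatIf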
I want to count all files with a particular prefix in a directory and then display the results per subdirectory.
The directory tree is as follows
Test
    January
        sms20180101_110.unl
        rec20180101_110.unl
        data20180101_110.unl
    February
        sms20180201_110.unl
        rec20180201_110.unl
        data20180201_110.unl
    March
        sms20180301_110.unl
        rec20180301_110.unl
        data20180301_110.unl
So I need to count, for example, the total number of data files in each subdirectory and display the results as follows:
January 1
February 5
March 10
I wrote the following command in PowerShell:
Get-ChildItem -Path D:\Test -Filter *data* -Force -Recurse | Measure-Object | %{$_.Count}
The problem is that it gives me the total number of files under the root directory rather than a count for each subdirectory.
A similar question was asked here: Recursively count files in subfolders, but I have not been able to adapt the solutions provided there to my needs.
Based on your scenario, you can use Group-Object like this:
Get-ChildItem -Path D:\Test -Filter *data* -Force -Recurse | Group-Object -Property Directory | Select-Object Name, Count
This will list the names of the folders and sub-folders along with the count of files that have data in their names.
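If you want the output keyed by just the folder name (January, February, March) rather than the full path, you could group on the directory's Name instead; this is a small, untested variation on the same idea:
Get-ChildItem -Path D:\Test -Filter *data* -Force -Recurse |
    Group-Object -Property { $_.Directory.Name } |
    Select-Object Name, Count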
I'm new to PowerShell and I want to check whether a file is readable and regular. In Unix we can do this in one line using -f and -r. For example, the following shell script function accepts a filename as an argument and checks whether the file is regular and readable. What's the PowerShell equivalent of this?
_ChkRegularFile_R() # checks whether a file is regular and readable
{
    _CRFRfilename=$1  # name of the file to be checked
    _CRFRsts=1        # default is error
    if [ -f "$_CRFRfilename" ]
    then
        if [ -r "$_CRFRfilename" ]
        then
            _CRFRsts=0  # success: regular file is readable
        fi
    fi
    return $_CRFRsts
}
To test if a file is readable, you try to open it. If you get an error, then it's not readable. You need to either trap or catch exceptions or stop on errors, as appropriate. Remember, Windows locks files that are open for writing, so applications need to expect that they sometimes can't open a file.
If you absolutely have to, you can use something like this to test if you can read a file:
try {
    [System.IO.File]::OpenRead($FullPathName).Close()
    $Readable = $true
}
catch {
    $Readable = $false
}
And this to test if you can write to a file:
try {
    [System.IO.File]::OpenWrite($FullPathName).Close()
    $Writable = $true
}
catch {
    $Writable = $false
}
That logic is fairly easy to wrap into a function if you really need it.
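For example, a minimal sketch of such a wrapper might look like this (Test-FileReadable is a made-up name, not a built-in cmdlet):
function Test-FileReadable {
    param([string]$FullPathName)
    try {
        # Try to open the file for reading and close the handle immediately
        [System.IO.File]::OpenRead($FullPathName).Close()
        $true
    }
    catch {
        $false
    }
}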
As far as file types, nearly everything in the file system in Windows is a plain file or a directory, since Windows doesn't have the "everything is a file" convention. So, normally you can test as follows:
# Test if file-like
Test-Path -Path $Path -PathType Leaf
# Test if directory-like
Test-Path -Path $Path -PathType Container
If you're working with a FileInfo or DirectoryInfo object (i.e., the output of Get-Item, Get-ChildItem, or a similar object representing a file or directory) you'll have the PSIsContainer property which will tell you if the item is a file or a directory.
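A quick illustration (the paths are just well-known examples):
(Get-Item -Path C:\Windows).PSIsContainer                # True  - it's a directory
(Get-Item -Path C:\Windows\notepad.exe).PSIsContainer    # False - it's a file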
That covers probably 99.999% of cases.
However, if you need to know if something is an NTFS hard link to a file (rare, but the oldest of these link types), an NTFS junction to a directory, an NTFS symlink, an NTFS volume mount point, or any other type of NTFS reparse point, it gets much more complicated. This answer does a good job describing the first three.
Let's create a simple NTFS folder to test with:
# Create a test directory and change to it.
New-Item -Path C:\linktest -ItemType Directory | Select-Object -ExpandProperty FullName | Push-Location
# Create an empty file
New-Item -Path .\file1 -ItemType file -Value $null | Out-Null
New-Item -Path .\file2 -ItemType file -Value $null | Out-Null
# Create a directory
New-Item -Path .\dir1 -ItemType Directory | Out-Null
# Create a symlink to the file
New-Item -ItemType SymbolicLink -Path .\sfile1 -Value .\file1 | Out-Null
# Create a symlink to the folder
New-Item -ItemType SymbolicLink -Path .\sdir1 -Value .\dir1 | Out-Null
# Create a hard link to the file
New-Item -ItemType HardLink -Path .\hfile1 -Value .\file1 | Out-Null
# Create a junction to the folder
New-Item -ItemType Junction -Path .\jdir1 -Value .\dir1 | Out-Null
# View the item properties
Get-ChildItem -Path . | Sort-Object Name | Format-Table -Property Name, PSIsContainer, LinkType, Target, Attributes -AutoSize
Your output will be:
Name   PSIsContainer LinkType     Target                Attributes
----   ------------- --------     ------                ----------
dir1            True              {}                    Directory
file1          False HardLink     {C:\linktest\hfile1}  Archive
file2          False              {}                    Archive
hfile1         False HardLink     {C:\linktest\file1}   Archive
jdir1           True Junction     {C:\linktest\dir1}    Directory, ReparsePoint
sdir1           True SymbolicLink {C:\linktest\dir1}    Directory, ReparsePoint
sfile1         False SymbolicLink {C:\linktest\file1}   Archive, ReparsePoint
Note that both file1 and hfile1 are hard links, even though file1 wasn't created as such.
To clean up the above garbage, do:
Get-ChildItem -Path C:\linktest\ | ForEach-Object { $_.Delete() }
There's a bug in Remove-Item when deleting some container links that prevents the command from removing those items, which is why the items' .Delete() method is used above instead.
The general solution would be to get the item and test it:
# Get the item. Don't use Get-ChildItem because that will get a directory's contents
$Item = Get-Item -Path $Path
# Is it a container
$Item.PSIsContainer
# Is it a link of some kind?
-not [System.String]::IsNullOrWhiteSpace($Item.LinkType)
$Item.LinkType -eq 'Junction'
# Is it a Reparse Point?
($Item.Attributes -band [System.IO.FileAttributes]::ReparsePoint) -eq [System.IO.FileAttributes]::ReparsePoint
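If you find yourself doing those checks often, they're easy to bundle into a small helper; here's a rough sketch (Get-ItemKind is a made-up name, and the property set is just one way to slice it):
function Get-ItemKind {
    param([string]$Path)
    $item = Get-Item -Path $Path -Force
    [PSCustomObject]@{
        Path           = $item.FullName
        IsContainer    = $item.PSIsContainer
        LinkType       = $item.LinkType
        IsLink         = -not [System.String]::IsNullOrWhiteSpace($item.LinkType)
        IsReparsePoint = ($item.Attributes -band [System.IO.FileAttributes]::ReparsePoint) -ne 0
    }
}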
There are several other potential attributes, too:
PS> [System.Enum]::GetNames([System.IO.FileAttributes])
ReadOnly
Hidden
System
Directory
Archive
Device
Normal
Temporary
SparseFile
ReparsePoint
Compressed
Offline
NotContentIndexed
Encrypted
IntegrityStream
NoScrubData
Note that Device is documented as reserved for future use. Ain't no device file type in Windows.
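Testing any of these works the same way as the ReparsePoint check above; for instance, to check whether one of the test files created earlier is hidden:
$item = Get-Item -Path C:\linktest\file1 -Force
($item.Attributes -band [System.IO.FileAttributes]::Hidden) -ne 0    # $false for this file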
For volume mount points, I'm not 100% sure how those look. I know you can create them on Windows 8.1 and later with Get-Partition followed by an appropriate Add-PartitionAccessPath, but I'm on Windows 7 currently. I'm afraid I have no means of testing this at the moment.
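For the record, an untested sketch of what that would look like on Windows 8.1 or later (the disk and partition numbers and the mount folder are pure assumptions):
# Requires the Storage module and an elevated session
New-Item -Path C:\mount\data -ItemType Directory | Out-Null
Get-Partition -DiskNumber 1 -PartitionNumber 2 |
    Add-PartitionAccessPath -AccessPath 'C:\mount\data'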
Finally, I have no idea how exactly PowerShell Core 6.0 on Linux handles file types.
So...
This is not something I regularly do, but if memory serves, in *nix a regular file is one that contains data and is not a directory, device, or other special file.
Again, this is not something I normally have to worry about in PoSH.
So, you are testing for whether the object is a writable file (and/or non-zero length), a directory, or a binary?
So, in PoSH prior to v3, you would do something like this:
$IsDir = {$_.PsIsContainer}
$IsFile = {!$_.PsIsContainer}
ls D:\Temp | Where $IsDir
    Directory: D:\Temp
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 1/4/2018 2:31 PM ArchiveDestination
d----- 1/4/2018 1:40 PM ArchiveSource
d----- 1/1/2018 3:34 PM diff
...
ls D:\Temp | Where $IsFile
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 6/7/2017 5:28 PM 512 CombinedSources07Jun2017.txt
-a---- 2/24/2018 6:29 PM 115 EmpData.csv
-a---- 11/18/2017 6:47 PM 11686 fsoVolume.docx
...
In PoSH v3 and higher this is supported natively, e.g.:
ls -directory
ls -ad
ls -file
ls -af
Of course any of the above can be set to just return true or false using if/then or try/catch.
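For instance, a quick sketch of turning the checks into booleans, using one of the sample files from the listing above:
# if/then form: $true only if the path exists and is a file rather than a directory
if (Test-Path -Path D:\Temp\fsoVolume.docx -PathType Leaf) { $IsRegular = $true } else { $IsRegular = $false }
# try/catch form: $true only if the file can actually be opened for reading
try   { [System.IO.File]::OpenRead('D:\Temp\fsoVolume.docx').Close(); $IsReadable = $true }
catch { $IsReadable = $false }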
If all the above is a bit more typing than you'd like, you can create your own function and give it whatever alias you choose, as long as it's not an alias that is already in use.
See the help files ...
# Get parameters, examples, full and Online help for a cmdlet or function
(Get-Command -Name Get-ChildItem).Parameters
Get-Help -Name Get-ChildItem -Examples
Get-Help -Name Get-ChildItem -Full
Get-Help -Name Get-ChildItem -Online
Get-Help about_*
Get-Help about_Functions
Get-Alias -Definition Get-ChildItem
# Find all cmdlets / functions with a target parameter
Get-Help * -Parameter Append
# All Help topics locations
explorer "$pshome\$($Host.CurrentCulture.Name)"
Of course you can check / modify file attributes as well. See this article on the topic:
File Attributes in PowerShell
Fun with file and folder attributes, via PowerShell and the DIR command.
https://mcpmag.com/articles/2012/03/20/powershell-dir-command-tricks.aspx
So, you could do something like this to achieve the same attribute check:
Get-ChildItem -Path $FilePath -File -Force | Where {$_.Attributes -notmatch 'ReadOnly'}
Or a function with an alias:
Function Test-RegularFile
{
    [CmdletBinding()]
    [Alias('trf')]
    Param
    (
        [string]$FilePath
    )
    try
    {
        $file = Get-ChildItem -Path $FilePath -File -Force -ErrorAction Stop |
            Where-Object { $_.Attributes -notmatch 'ReadOnly' }
        if ($file)
        {
            "$FilePath is a regular file"   # success: regular file is readable
        }
        else
        {
            Write-Warning -Message "$FilePath is not a regular file."
        }
    }
    catch
    {
        Write-Warning -Message "$FilePath is not a regular file."
    }
}
trf -FilePath D:\Temp\fsoVolume.txt
Since you are new to PoSH, it's really important that you get a base understanding before looking at conversion comparisons.
See this post for folks providing some paths for learning PowerShell.
https://www.reddit.com/r/PowerShell/comments/7oir35/help_with_teaching_others_powershell
To test whether it's a regular file:
Test-Path -PathType Leaf foo.txt
To test whether it's read-only (note that the r flag in the Mode string means read-only, not readable):
Get-ChildItem foo.txt | ? { $_.Mode -match 'r'}
To test whether it's hidden:
Get-ChildItem -force foo.txt | ? { $_.Mode -match 'h'}
All I need is a list of sub-directories, given a starting directory, listing the owner of that directory.
Any ideas on an easy way to do this?
Use the Get-ChildItem cmdlet to retrieve the sub-directories and determine the owner using the Get-Acl cmdlet:
Get-ChildItem '\\server\yourshare' |
where PSIsContainer |
select FullName, @{e={Get-Acl $_.FullName | select -expand Owner}; l="Owner"}
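If you also want to walk the whole tree and keep the results, a hedged variation (the share path and CSV location are just placeholders) would be:
Get-ChildItem -Path '\\server\yourshare' -Recurse |
    Where-Object { $_.PSIsContainer } |
    Select-Object FullName, @{ l = 'Owner'; e = { (Get-Acl -Path $_.FullName).Owner } } |
    Export-Csv -Path C:\temp\FolderOwners.csv -NoTypeInformation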
Is it possible to copy from the source location only new folders that have been added?
I have a Source location that is updated with folders every 5 minutes. The PS1 script will run every 5 minutes and copy all the folders to the destination location.
The issue I'm having is that it copies over everything. I only want it to compare against what has already been copied over and copy only the newly added folders, instead of copying everything that is already there again. Is this possible?
Also, if possible, once it has copied the recently added folders, can the script then email out a completion notice listing what it has done?
So far I have the following:
Copy-Item -Recurse \\192.168.1.37\d$\Transactions\* -Destination D:\UK_Copy\Transactions_Bk -Force -Verbose
You could create a script that just loops. Something like:
$Source = "\\192.168.1.37\d$\Transactions\"
$Destination = "D:\UK_Copy\Transactions_Bk"
$LastScan = Get-ChildItem $Destination -recurse
While ($true) {
    $NewScan = Get-ChildItem $Source -Recurse
    # Copy anything that is new or changed since the last scan to the matching path under the destination
    Compare-Object -ReferenceObject $NewScan -DifferenceObject $LastScan -PassThru |
        ForEach-Object { Copy-Item -Path $_.FullName -Destination ($_.DirectoryName -replace [regex]::Escape($Source), $Destination) -Force }
    $LastScan = $NewScan
    Start-Sleep -Seconds 300
}
That will get a listing of files in the destination, then look for files in the source that are new, or have changed compared to the destination and copy those over. Then it sets that scan as the last known listing to compare to, sleeps for 5 minutes, and loops again looking for files that are new or updated since the last scan.
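For the email part of the question, one rough sketch (the SMTP server and addresses are placeholders, and capturing the compare results into a variable first is just one way to do it) is to collect what was copied on each pass and send it with Send-MailMessage:
# Inside the loop: capture the compare results so they can be both copied and reported
$Copied = Compare-Object -ReferenceObject $NewScan -DifferenceObject $LastScan -PassThru
$Copied | ForEach-Object { Copy-Item -Path $_.FullName -Destination ($_.DirectoryName -replace [regex]::Escape($Source), $Destination) -Force }
if ($Copied) {
    Send-MailMessage -SmtpServer 'smtp.example.com' -From 'backup@example.com' -To 'admin@example.com' `
        -Subject "Transactions copy completed $(Get-Date)" -Body ($Copied.FullName -join "`r`n")
}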