I have been assigned the following task.
Create a foreach loop. You can use the following template:
$directory-variable-here
foreach ($item in $directory) {
    Script block here
}
Above the foreach condition, set a variable, $directory, to the contents of the current directory.
Replace the script block placeholder with the command to enumerate the ACL of a file, using the $item variable in place of the file name.
You'll need to use the following cmdlets:
Get-ChildItem (or any alias of Get-ChildItem, such as ls or dir)
Get-Acl
I need some help solving this. Secondly, what can be used to store the current directory path? I am confused about that part.
So far I have tried the following script, but it returns the ACLs for the current directory's contents only, not for anything in subdirectories.
$dirpath = $PSScriptRoot
foreach ($item in $dirpath) {
    $var = Get-ChildItem $item
    Get-Acl $var
}
I tried a lot, and after some searching I found the solution. Here is the script:
$mypath = $PSScriptRoot
$var = Get-ChildItem -Recurse -Path $mypath
foreach ($item in $var) {
    Get-Acl $item.FullName
}
We're migrating our FTP and I would like to only migrate folders that have had files in them that have been used/written in the last 6 months. I would think this would be something that I find all over the place with google, but all the scripts I've found have the same fatal flaw.
It seems with everything I find, it depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into it, there are files that are being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with a LastWriteTime -lt (Get-Date).AddDays(-180) filter. If stuff that's "new" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-180) }
#$OldStuff = gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, plus the file and its dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
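If you want to keep either list around for the migration, you could, for example, export it to CSV (the file name here is just a placeholder):
$result | Export-Csv -Path 'C:\temp\foldersToMigrate.csv' -NoTypeInformation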
I want to do the following in PowerShell, but it seems like the curly brackets pose an issue.
Here's how I would do it in Bash:
mkdir -p /path/to/dir/{dir1,dir2,dir3...dir10}
This creates the parent directory, then several subdirectories (dir1 through dir10) inside "dir".
This should work for you
New-Item -ItemType Directory C:\temp1\dir1,C:\temp1\dir2,C:\temp1\dir3
One more way:
$dirs = 1..10
$dirs | % {
    New-Item -ItemType Directory (Join-Path -Path 'c:\temp\dir' -ChildPath ('dir' + $_))
}
If you are after Directories 1 to 10 (or similar) - something like the following will work for you:
1..10 | ForEach {
    New-Item -ItemType Directory -Path ("C:\Temp\Dir" + $_)
}
If you are looking to create an array of named subfolders, then something like the following will work (including the ability to create sub-sub-folders):
$subFolderNameArray = @(
    "folder1",
    "Folder2",
    "Folder1\Subfolder3"
)
ForEach ($subFolderName in $subFolderNameArray) {
    New-Item -ItemType Directory -Path ("C:\Temp\" + $subFolderName)
}
This relies on creating the folders in the correct order (you must create parent folders before subfolders). If you want to do things out of order (or can't guarantee the starting array is sorted parents-first), you can use the force switch:
New-Item -ItemType Directory -Path ("C:\Temp\Folder\Subfolder") -Force
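For something closer to the Bash brace expansion in the question, you can also build the full list of paths up front and pass it to New-Item in one call, since -Path accepts an array of strings. A minimal sketch (the parent path is just an example):
$parent = 'C:\temp1'
$paths = 'dir1', 'dir2', 'dir3' | ForEach-Object { Join-Path $parent $_ }
New-Item -ItemType Directory -Path $paths -Force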
I am trying to loop through all files in a folder, no matter the type, and replace a string with one that is input by the user.
I can do this now with the code below, but only for one file extension.
This is my code:
$NewString = Read-Host -Prompt 'Input New Name Please'
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
$InputFiles = Get-Item "$scriptPath\*.md"
$OldString = 'SolutionName'
$InputFiles | ForEach {
    (Get-Content -Path $_.FullName).Replace($OldString,$NewString) | Set-Content -Path $_.FullName
}
echo 'Complete'
How do I loop through the files, no matter the extension? So whether it is md, txt, cshtml or something else, it will replace the string as instructed.
To get all the files in a folder you can use Get-ChildItem. Add the -Recurse switch to also include files inside of sub-folders.
E.g. you could rewrite your script like this:
$path = 'c:\tmp\test'
$NewString = Read-Host -Prompt 'Input New Name Please'
$OldString = 'SolutionName'
Get-ChildItem -Path $path | Where-Object { !$_.PSIsContainer } | ForEach-Object {
    (Get-Content $_.FullName).Replace($OldString, $NewString) | Set-Content -Path $_.FullName
}
This will first get all the files inside the folder defined in $path, then replace the value given in $OldString with what the user entered when prompted, and finally save the files.
Note: the script doesn't check whether the content of a file actually changed; it rewrites every file, so every file's modified date gets updated. If that information is important to you, you need to add a check to see if the file contains $OldString before changing and saving it.
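A minimal sketch of that check, assuming PowerShell 3.0+ (for -Raw and -File); only files that actually contain $OldString get rewritten:
Get-ChildItem -Path $path -File -Recurse | ForEach-Object {
    $content = Get-Content -Path $_.FullName -Raw
    # Only write the file back if the old string is present,
    # so the modified date of untouched files stays the same
    if ($content -match [regex]::Escape($OldString)) {
        $content.Replace($OldString, $NewString) | Set-Content -Path $_.FullName
    }
}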
The folder structure is:
--root
--root\source-code\
--root\powershell-scripts\
I need the method below that is inside the \powershell-scripts folder to target files inside \source-code:
function Test($param)
{
    dir -Include ASourceCodeFile.txt -Recurse |
        % { SomeMethod $_ $param }
}
What am I missing?
The $PSScriptRoot automatic variable contains the path of the directory in which the current script is located. Use Split-Path to find its parent (your --root) and Join-Path to get the path to the source-code folder:
Join-Path -Path (Split-Path $PSScriptRoot -Parent) -ChildPath 'source-code'
$PSScriptRoot was introduced in PowerShell 3.0
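Applied to the function from the question, that could look like the following sketch (SomeMethod and ASourceCodeFile.txt are the placeholders from your example):
function Test($param)
{
    # Build the path to --root\source-code relative to this script's folder
    $sourceCode = Join-Path -Path (Split-Path $PSScriptRoot -Parent) -ChildPath 'source-code'
    Get-ChildItem -Path $sourceCode -Include ASourceCodeFile.txt -Recurse |
        ForEach-Object { SomeMethod $_ $param }
}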
A bit late, but maybe still helpful for someone:
Directory structure:
MyRoot\script\scriptrunning.ps1
config:
MyRoot\config.xml
To read the XML file from scriptrunning.ps1:
[xml]$Config = Get-Content -Path "${PSScriptRoot}\..\config.xml"
If you have a script in --root\powershell-scripts\ and you want to reference something in --root\source-code\ (or, say, run Get-Content on it), you can do this:
cd --root\powershell-scripts\
get-content '..\source-code\someFile.txt'
The ..\ references the parent directory, which contains \source-code\, and from there you can reference or pull in files and scripts from that directory.
This is a trick I used in VBS that I converted to PowerShell:
$scriptPath = Split-Path $MyInvocation.MyCommand.Path -Parent
$a = $scriptPath.Split("\")
for ($i = 0; $i -lt $a.Count - 1; $i++) {
    $parentDir = $parentDir + $a[$i]
    if ($i -lt $a.Count - 2) { $parentDir = $parentDir + "\" }
}
Write-Output $parentDir
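For what it's worth, Split-Path can give the same result in one call, since it strips the last component of a path:
# Equivalent to the loop above
$parentDir = Split-Path $scriptPath -Parent
Write-Output $parentDir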
How can I do an ls using PowerShell? This is what I have in Bash:
for i in `ls`
do
if [ -d $i ] # check if it's a directory
then
echo "Directory"
else echo "File"
fi
done
POWERSHELL
$llistat = ls
foreach ($element in $llistat) {
    ??? # is this possible?
}
A more PoSh way is to use a pipeline, and perhaps a hashtable:
$type = @{
    $true  = 'Directory'
    $false = 'File'
}
Get-ChildItem | ForEach-Object { $type[$_.PSIsContainer] }
PowerShell even has a default alias ls for Get-ChildItem, so you could use more Unix-ish syntax:
ls | % { $type[$_.PSIsContainer] }
In PowerShell, the Get-ChildItem cmdlet works like ls (at least with the file system provider). All items returned have a PowerShell-specific property called PSIsContainer, indicating whether it's a directory or not:
foreach($item in (Get-ChildItem)){
if($item.PSIsContainer){
"Directory"
} else {
"File"
}
}
If you want to see what's inside each directory, one level down:
foreach($item in (Get-ChildItem)){
if($item.PSIsContainer){
# Directory! Let's see what's inside:
Get-ChildItem -Path $item.FullName
}
}
As of PowerShell version 3.0 and up, Get-ChildItem supports -File and -Directory switches on the filesystem provider, so if you ONLY want directories, you could do:
Get-ChildItem -Directory
So the second example becomes:
Get-ChildItem -Directory | Get-ChildItem
You could also list files recursively (like ls -R):
Get-ChildItem -Recurse