Remote PowerShell, Local Variables, and ForEach

I am looking to get a file inventory of a specific file on some remote servers using PowerShell and Invoke-Command. I want to gather the info and then export it as CSV to a specific folder.
I have a couple of local variables for the date (to append to the file name) and a list of servers. I understand that I need to use the -ArgumentList parameter to pass the local variables, but the syntax is confounding me with the ForEach aspect involved (I am not a programmatically minded person). Here is what I have:
$FileServerList = "Server01","Server02","Server03"
$DateTime = Get-Date -Format s
ForEach ($FileServer in $FileServerList) {
    Invoke-Command -ComputerName $FileServer -ScriptBlock {
        Get-ChildItem -Path "D:\PathtoFile\*" -Recurse -Include "index.html" |
            Select-Object Name,DirectoryName,CreationTime,LastWriteTime |
            Export-Csv -Path C:\Users\Public\Documents\Data_$DateTime.csv -NoTypeInformation -Verbose
    }
}
Should I save the script block itself in a variable or go another route? Any guidance is appreciated.

Since Invoke-Command is not called with a separate credential, I assume that the user in the current PowerShell session has credentials to access the servers.
For remote locations, you can use UNC paths, which sidesteps the variable-passing problem entirely:
$FileServerList = "Server01","Server02","Server03"
# 'yyyyMMdd_HHmmss' avoids the ':' characters that -Format s would put into the file name
$DateTime = Get-Date -Format yyyyMMdd_HHmmss
ForEach ($FileServer in $FileServerList) {
    Get-ChildItem -Path "\\$FileServer\d$\PathtoFile\*" -Recurse -Include "index.html" |
        Select-Object Name,DirectoryName,CreationTime,LastWriteTime |
        Export-Csv -Path "\\$FileServer\c$\Users\Public\Documents\Data_$DateTime.csv" -NoTypeInformation -Verbose
}
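If you do want the inventory to run remotely, as in your original attempt, here is a minimal sketch of passing the local variable in via -ArgumentList and a param() block (on PowerShell 3.0+ you could instead reference $using:DateTime directly). Note that the CSV then lands on each remote server, and the script block can equally be saved in a variable first and passed to -ScriptBlock:

$FileServerList = "Server01","Server02","Server03"
$DateTime = Get-Date -Format yyyyMMdd_HHmmss

ForEach ($FileServer in $FileServerList) {
    Invoke-Command -ComputerName $FileServer -ScriptBlock {
        param($DateTime)   # receives the value supplied via -ArgumentList below
        Get-ChildItem -Path "D:\PathtoFile\*" -Recurse -Include "index.html" |
            Select-Object Name,DirectoryName,CreationTime,LastWriteTime |
            Export-Csv -Path "C:\Users\Public\Documents\Data_$DateTime.csv" -NoTypeInformation
    } -ArgumentList $DateTime
}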

Related

PowerShell replace file content with output of previous command [duplicate]

I am having a helluva time trying to understand why this script is not working as intended. It is a simple script in which I am attempting to import a CSV, select the few columns that I want, then export the CSV over itself. (Basically we have archived data that I only need a few columns from for another project, due to memory size constraints.) This script is very simple, which apparently has an inverse relationship with how much frustration it causes when it doesn't work... Right now the end result is an empty CSV instead of one containing only the columns I selected with Select-Object.
$RootPath = "D:\SomeFolder"
$csvFilePaths = Get-ChildItem $RootPath -Recurse -Include *.csv |
ForEach-Object{
Import-CSV $_ |
Select-Object Test_Name, Test_DataName, Device_Model, Device_FW, Data_Avg_ms, Data_StdDev |
Export-Csv $_.FullName -NoType -Force
}
Unless you read the input file into memory in full, up front, you cannot safely read from and write back to the same file in a given pipeline.
Specifically, a command such as Import-Csv file.csv | ... | Export-Csv file.csv will erase the content of file.csv.
The simplest solution is to enclose the command that reads the input file in (...), but note that:
The file's content (transformed into objects) must fit into memory as a whole.
There is a slight risk of data loss if the pipeline is interrupted before all (transformed) objects have been written back to the file.
Applied to your command:
$RootPath = "D:\SomeFolder"
Get-ChildItem $RootPath -Recurse -Include *.csv -OutVariable csvFiles |
    ForEach-Object{
        (Import-CSV $_.FullName) | # NOTE THE (...)
            Select-Object Test_Name, Test_DataName, Device_Model, Device_FW,
                          Data_Avg_ms, Data_StdDev |
            Export-Csv $_.FullName -NoType -Force
    }
Note that I've used -OutVariable csvFiles in order to collect the CSV file-info objects in output variable $csvFiles. Your attempt to collect the file paths via $csvFilePaths = ... doesn't work, because it attempts to collect Export-Csv's output, but Export-Csv produces no output.
Also, to be safe, I've changed the Import-Csv argument from $_ to $_.FullName to ensure that Import-Csv finds the input file (because, regrettably, file-info object $_ is bound as a string, which sometimes expands to the mere file name).
A safer solution would be to output to a temporary file first, and (only) on successful completion replace the original file.
With either approach, the replacement file will have default file attributes and permissions; if the original file had special attributes and/or permissions that you want to preserve, you must recreate them explicitly.
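A minimal sketch of that safer temporary-file approach, assuming the same columns as above (the .tmp suffix is an arbitrary choice):

$RootPath = "D:\SomeFolder"
Get-ChildItem $RootPath -Recurse -Include *.csv | ForEach-Object {
    $tmp = "$($_.FullName).tmp"   # temporary file next to the original
    Import-Csv $_.FullName |
        Select-Object Test_Name, Test_DataName, Device_Model, Device_FW, Data_Avg_ms, Data_StdDev |
        Export-Csv $tmp -NoTypeInformation
    # Replace the original only if the temp file was written without error.
    if ($?) { Move-Item -LiteralPath $tmp -Destination $_.FullName -Force }
}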
As Matt commented, by that point $PSItem ($_) no longer refers to the Get-ChildItem output; it refers to the output of Select-Object, which doesn't have a FullName property.
You can use a different foreach approach:
$RootPath = "D:\SomeFolder"
$csvFilePaths = Get-ChildItem $RootPath -Recurse -Include *.csv
foreach ($csv in $csvFilePaths)
{
    (Import-CSV $csv.FullName) | # (...) reads the file in full before Export-Csv truncates it
        Select-Object Test_Name,Test_DataName,Device_Model,Device_FW,Data_Avg_ms,Data_StdDev |
        Export-Csv $csv.FullName -NoType -Force
}
Or, keeping your pipeline, add a $CsvPath variable containing the CSV path and use it later on:
$RootPath = "D:\SomeFolder"
Get-ChildItem $RootPath -Recurse -Include *.csv | ForEach-Object{
    $CsvPath = $_.FullName
    (Import-CSV $CsvPath) | # (...) again reads the file in full before writing back
        Select-Object Test_Name,Test_DataName,Device_Model,Device_FW,Data_Avg_ms,Data_StdDev |
        Export-Csv $CsvPath -NoType -Force
}
So I have figured it out. In the original code I was attempting to pipe the output of Import-Csv directly down the pipeline; instead, I simply had to declare a variable that uses the Import-Csv cmdlet as its definition and pipe that variable through to the Select-Object and Export-Csv cmdlets. Here is the code snippet that gets what I wanted done, done. Thank you all for your assistance, I appreciate it!
$RootPath = "\someDirectory\"
$CsvFilePaths = #(Get-ChildItem $RootPath -Recurse -Include *.csv)
$ColumnsWanted = #('Test_Name','Test_DataName','Device_Model','Device_FW','Data_Avg_ms','Data_StdDev')
for($i=0;$i -lt $CsvFilePaths.Length; $i++){
$csvPath = $CsvFilePaths[$i]
Write-Host $csvPath
$importedCsv = Import-CSV $csvPath
$importedCsv | Select-Object $ColumnsWanted | Export-CSV $csvPath -NoTypeInformation
}

Trying to get folder sizes for all users directory

I am trying to write a PowerShell script that will look at a network share and write out to a CSV the full name of each user home directory folder and its size in MB or GB.
This is my code so far:
$StorageLocation = '\\wgsfs01\USERDIR\USERS'
$Roots = Get-ChildItem $StorageLocation | Select Fullname
ForEach ($Root in $Roots) { (Get-ChildItem $Root -Recurse | Measure-Object -Property Length -Sum).Sum }
I believe there is something wrong with my ForEach statement as this is my error message
Get-ChildItem : Cannot find path 'C:\@{FullName=\\wgsfs01\USERDIR\USERS}' because it does not exist.
I appreciate any advice and thank you in advance.
The issue you have is that $Root contains an object with a FullName property, not a plain path string, so when Get-ChildItem coerces it to a path you get the @{FullName=...} text shown in the error message. You have two options:
Change your Select-Object to -ExpandProperty, which unwraps FullName to a plain string:
Select-Object -ExpandProperty Fullname
Or refer to the FullName property on the object when you pass it:
Get-ChildItem -Path $Root.FullName -Recurse
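Putting the first option into the original script, a sketch:

$StorageLocation = '\\wgsfs01\USERDIR\USERS'
$Roots = Get-ChildItem $StorageLocation | Select-Object -ExpandProperty FullName
ForEach ($Root in $Roots) {
    # $Root is now a plain string path, so Get-ChildItem can resolve it
    (Get-ChildItem $Root -Recurse | Measure-Object -Property Length -Sum).Sum
}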
This is one solution to what you are trying to achieve; note that errors (e.g. access denied) are ignored.
Get-ChildItem $StorageLocation | ForEach-Object {
    $sizeInMB = (Get-ChildItem $_.FullName -Recurse -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1MB
    New-Object PSObject -Property @{
        FullName = $_.FullName
        SizeInMB = $sizeInMB
    }
}
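Since the stated goal is a CSV report, you can pipe those objects straight to Export-Csv; the output path below is an assumption:

$StorageLocation = '\\wgsfs01\USERDIR\USERS'
Get-ChildItem $StorageLocation | ForEach-Object {
    New-Object PSObject -Property @{
        FullName = $_.FullName
        SizeInMB = [math]::Round((Get-ChildItem $_.FullName -Recurse -ErrorAction SilentlyContinue |
                   Measure-Object -Property Length -Sum).Sum / 1MB, 2)
    }
} | Export-Csv -Path 'C:\Reports\UserFolderSizes.csv' -NoTypeInformation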

Is there a faster way to move pictures by a PowerShell script

I want to move all pictures from several folders to one destination folder, if they are listed in my txt file.
The script works, but with about 81k pictures and 450k names (e.g. samlpe-green-bigpic-detail-3.jpg) in the txt file, it is damn slow.
Is there a way to script it so it works faster?
$qpath = "c:\sample\picz\"
$Loggit = "c:\sample\pic_move.log"
$txtZeileU = "c:\sample\names.txt"
$d_pic = "C:\sample\moved_picz"
$arrZeileU = Get-Content -Path $txtZeileU
foreach ($Zeile in $arrZeileU) {
    Get-ChildItem -Path $qpath -Recurse |
        where {$_.Name -eq $Zeile} |
        Move-Item -Destination $d_pic -Verbose -Force *>&1 |
        Out-File -FilePath $Loggit -Append
}
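The main cost above is re-scanning the whole directory tree once per name in the text file (450k scans). A sketch of a faster inversion, assuming exact name matches like the original -eq comparison: scan the tree once and test each file against a case-insensitive hash set of the wanted names.

$qpath = "c:\sample\picz\"
$Loggit = "c:\sample\pic_move.log"
$txtZeileU = "c:\sample\names.txt"
$d_pic = "C:\sample\moved_picz"

# Build an O(1) lookup of wanted names (OrdinalIgnoreCase mirrors -eq's case-insensitivity).
$wanted = [System.Collections.Generic.HashSet[string]]::new(
    [string[]](Get-Content -Path $txtZeileU),
    [System.StringComparer]::OrdinalIgnoreCase)

# Walk the tree once and move every file whose name is in the set.
Get-ChildItem -Path $qpath -Recurse -File |
    Where-Object { $wanted.Contains($_.Name) } |
    Move-Item -Destination $d_pic -Verbose -Force *>&1 |
    Out-File -FilePath $Loggit -Append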

Is it possible to use Invoke-Command within a workflow?

Does anyone know if I can use Invoke-Command in a PowerShell workflow?
Currently I have a script that loops through a text file with the list of servers, but I would like it to push to all of the servers at once rather than going through them one by one. Is this possible?
This is the current script block I am working with:
{
    ForEach ($Server in $Servers) {
        Write-Host "Copying code to $Server..."
        If (!(Test-Path -Path \\$Server\c$\Websites\Versions\v$version)) {
            New-Item \\$Server\c$\Websites\Versions\v$version -Type Directory | Out-Null
        }
        Copy-Item .\Packages\v$version\* \\$Server\c$\Websites\Versions\v$version -Force -Recurse
        Write-Host "Converting to application on $Server..."
        Invoke-Command -ComputerName $Server -ScriptBlock $Script -ArgumentList $Version | Out-Null
    }
}
The PowerShell Workflow engine is not capable of directly invoking PowerShell cmdlets. Instead, if a script writer calls a PowerShell cmdlet inside a Workflow definition, the PowerShell Workflow engine will automatically wrap invocations of PowerShell cmdlets inside the InlineScript Workflow Activity.
workflow test
{
    ForEach ($Server in $Servers) {
        Write-Host "Copying code to $Server..."
        If (!(Test-Path -Path \\$Server\c$\Websites\Versions\v$version)) {
            New-Item \\$Server\c$\Websites\Versions\v$version -Type Directory | Out-Null
        }
        Copy-Item .\Packages\v$version\* \\$Server\c$\Websites\Versions\v$version -Force -Recurse
        Write-Host "Converting to application on $Server..."
        InlineScript {
            Invoke-Command -ComputerName $Server -ScriptBlock $Script -ArgumentList $Version | Out-Null
        }
    }
}
As for whether or not it will work, you'll have to try it out, as suggested by Mathias.
@Trevor's response is good as an overall skeleton, but it won't work as it is.
There are several things missing or incorrect:
Passing arguments to the workflow
Passing arguments to InlineScript
Passing a ScriptBlock as an argument
Using Out-Null in a workflow
The working example:
$serversProd=#"
server1
server2
server3
server4
"#-split'\r\n'
$reportScript = "report.script.ps1"
$generateReport = {
param($reportScript)
cd D:\Automations\ConnectivityCheck
powershell -file $reportScript
}
workflow check-connectivity {
Param ($servers, $actionBlock, $reportScript)
# Prepare the results folder
$resultsFolder = "D:\Automations\ConnectivityCheckResults"
$unused1 = mkdir -Force $resultsFolder
# Run on all servers in parallel
foreach -parallel ($server in $servers) {
# Upload script to the server
$unused2 = mkdir -Force \\$server\D$\Automations\ConnectivityCheck
cp -Force $reportScript \\$server\D$\Automations\ConnectivityCheck\
"Starting on $server..."
# Execute script on the server. It should contain Start-Transcript and Stop-Transcript commands
# For example:
# $hostname = $(Get-Wmiobject -Class Win32_ComputerSystem).Name
# $date = (Get-Date).ToString("yyyyMMdd")
# Start-Transcript -path ".\$date.$hostname.connectivity.report.txt"
# ...Code...
# Stop-Transcript
$results = InlineScript {
$scriptBlock = [scriptblock]::Create($Using:actionBlock)
Invoke-Command -computername $Using:server -ScriptBlock $scriptBlock -ArgumentList $Using:reportScript
}
# Download transcript file from the server
$transcript = [regex]::Match($results,"Transcript started.+?file is \.\\([^\s]+)").groups[1].value
"Completed on $server. Transcript file: $transcript"
cp -Force \\$server\D$\Automations\ConnectivityCheck\$transcript $resultsFolder\
}
}
cls
# Execute workflow
check-connectivity $serversProd $generateReport $reportScript

How to output errors to a log file in PowerShell when copying files

I am trying to write a PowerShell script that does the following:
Check to see if a folder on a remote machine (from a text list of computers) exists; if so, delete it.
Copy a folder from a remote share to the same machine; if there is an error, output to an error log file, if not, output to a success log file.
I have searched but have been unable to find a solution to my seemingly simple problem, please see my code below:
$computers = Get-Content C:\pcs.txt
$source = "\\RemoteShare\RemoteFolder"
$dest = "C$\Program Files\Destination"
foreach ($computer in $computers) {
    If (Test-Path \\$computer\$dest) {
        Remove-Item \\$computer\$dest -Force -Recurse
    }
    Copy-Item $source \\$computer\$dest -Recurse -Force -ErrorAction SilentlyContinue
    If (!$error) {
        Write-Output $computer | Out-File -Append -FilePath "C:\logs\success.log"
    }
    Else {
        Write-Output $computer | Out-File -Append -FilePath "C:\logs\failed.log"
    }
}
Currently, when the script runs, everything is put in the failed.log file, regardless of whether it fails or not.
How can I properly handle errors in PowerShell while running through a foreach loop?
Here's an example.
$array = @(3,0,1,2)
foreach ($item in $array)
{
    try
    {
        1/$item | Out-Null   # division by zero throws a terminating error that catch can see
        $item | Add-Content -Path "C:\logs\success.log"
    }
    catch
    {
        "Error: $_" | Add-Content -Path "C:\logs\failed.log"
    }
}
Don't use $error; it always contains an array of recent error objects, even if the last command was successful. To check the result of the last command, use $?: it will be $false if the last command failed.
See about_Automatic_Variables for more details on these variables.
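Applied to the loop from the question, a sketch: note that Copy-Item and Remove-Item raise non-terminating errors by default, so try/catch only sees them if you add -ErrorAction Stop.

$computers = Get-Content C:\pcs.txt
$source = "\\RemoteShare\RemoteFolder"
$dest = "C$\Program Files\Destination"
foreach ($computer in $computers) {
    try {
        if (Test-Path "\\$computer\$dest") {
            Remove-Item "\\$computer\$dest" -Force -Recurse -ErrorAction Stop
        }
        # -ErrorAction Stop promotes the non-terminating copy error to a terminating one
        Copy-Item $source "\\$computer\$dest" -Recurse -Force -ErrorAction Stop
        $computer | Add-Content -Path "C:\logs\success.log"
    }
    catch {
        "$computer : $_" | Add-Content -Path "C:\logs\failed.log"
    }
}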
