PowerShell exporting to text file - Windows

I'm working on a script that checks the folders in a specific directory. When I run the script for the first time, it generates a txt file containing the folders in that directory.
I need the script to add any new directories that are found to the previously created txt file when the script is run again.
Does anyone have any suggestions on how to make that happen?
Here is my code so far:
$LogFolders = Get-ChildItem -Directory mydirectory ;
If (-Not (Test-Path -path "txtfilelocated"))
{
    Add-Content txtfilelocated -Value $LogFolders
    break;
}
else
{
    $File = Get-Content "txtfilelocated"
    $File | ForEach-Object {
        $_ -match $LogFolders
    }
}
$File

Something like this?
You can specify which directory to check by adding the path to the Get-ChildItem cmdlet in the first line.
$a = get-childitem | ? { $_.psiscontainer } | select -expand fullname    # for V2.0 and above
$a = get-childitem -Directory | select -expand fullname                  # for V3.0 and above
if ( test-path .\list.txt )
{
    compare-object (gc list.txt) ($a) -PassThru | Add-Content .\list.txt
}
else
{
    $a | set-content .\list.txt
}
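Note that, as written, Compare-Object passes through differences in both directions, so a folder that is listed in list.txt but no longer exists on disk would get appended to the file again. A variant (a sketch, under the same list.txt layout) that only appends folders found on disk but missing from the file:
$a = Get-ChildItem -Directory | Select-Object -ExpandProperty FullName
if (Test-Path .\list.txt)
{
    # '=>' marks items that are in $a (on disk) but not yet in list.txt
    Compare-Object (Get-Content .\list.txt) $a |
        Where-Object { $_.SideIndicator -eq '=>' } |
        Select-Object -ExpandProperty InputObject |
        Add-Content .\list.txt
}
else
{
    $a | Set-Content .\list.txt
}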

Related

Getting root folder name with PS

I am trying to create a PowerShell script to fetch the name of the root-level folder whose subdirectories contain error files with today's date. Below is the sample code I have tried so far to pick the folder names.
Root log folder: C:\Errorlogs, which contains many other application-level log folders.
$targetDir="C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*"|
where {([datetime]::now.Date -eq $_.lastwritetime.Date)} |
select FullName
I have tried the above code; however, it's giving me the whole path as result, whereas I only need the folder name.
Result - C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\04\Error.txt
Required - AsyncCreateUsersAPIProcessor
Use a regular expression with a named capture group:
$rootPath = "C:\Temp\Errorlogs"
$date = [DateTime]::Now.ToString("yyyyMM\\\\dd")
$pattern = '\\(?<folder>\w+)\\' + $date + '\\Error.*$'
$files = Get-ChildItem -Path $rootPath -Recurse | Select-Object -Property Fullname | Where-Object {$_.Fullname -Match $pattern}
foreach($file in $files)
{
$file.Fullname -match $pattern
Write-Host "folder = " $Matches.folder
}
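If you would rather have the folder names on the pipeline (for example, to write them to a file) instead of Write-Host output, a small variant of the same loop, sketched with the same $files and $pattern variables:
$folders = foreach ($file in $files)
{
    # emit only the captured folder name when the path matches
    if ($file.FullName -match $pattern) { $Matches.folder }
}
$folders | Select-Object -Unique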
Looks like you can do it just by splitting the path using '\' as the delimiter and then picking the 3rd token (index 2 of the array):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
Select-Object @{ N='Name'; E={ $_.FullName.Split('\')[2] }}
Another option if you want 2 levels up in the folder hierarchy is to query the .Directory property of the file then the .Parent property of the parent folder (2 times or as many times as needed):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
Select-Object @{ N='Name'; E={ $_.Directory.Parent.Parent.Name }}
As long as the subfolders inside the folder you are after all have numeric-only names, you can loop upwards through the parents to reach the first non-numeric folder name and output that.
$targetDir = "C:\Errorlogs"
Get-ChildItem -Path $targetDir -File -Filter "*Error*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } | ForEach-Object {
$parentDir = $_.Directory
while ($parentDir.Name -match '^\d+$') { $parentDir = $parentDir.Parent }
$parentDir.Name
}
That way, even a path C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\02\04\1234\567\Error.txt would produce folder name AsyncCreateUsersAPIProcessor
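If $targetDir can sit more than one level below the drive root, hard-coding Split('\')[2] breaks; a sketch (not part of the original answers) that takes the first path segment relative to $targetDir instead:
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
Select-Object @{ N='Name'; E={ $_.FullName.Substring($targetDir.Length).TrimStart('\').Split('\')[0] }}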

Trying to pull out the newest errors in a log file and then create a custom .txt output

I am new to PowerShell scripting and have been tasked with creating some alerts based on errors in certain log files. These are just logs from a bespoke application.
My current code is:
$OutputFile3 = (Get-Location).Path + ".\Results.txt"
$Sourcefolder = "C:\Users\dewana\Documents\Test\"
$Targetfolder = "C:\Users\dewana\Documents\Test\Test3"
Get-ChildItem -Path $Sourcefolder -Recurse |
Where-Object {
    $_.LastWriteTime -gt [datetime]::Now.AddMinutes(-5)
} | Copy-Item -Destination $Targetfolder
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object
{ $_.Contains("errors") }
Remove-Item $OutputFile3
New-Item $OutputFile3 -ItemType file
try
{
    $stream = [System.IO.StreamWriter] $OutputFile3
    $stream.WriteLine('clientID 1111')
    $stream.WriteLine('SEV 1')
    $stream.WriteLine('Issue with this process')
}
finally
{
    $stream.close()
}
What I am struggling with is this line:
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object { $_.Contains("errors") }
I am trying to store the latest string in the log file that contains the word "errors". I then want to use the stored string in an if statement, so that if $Testing5 has a new error value assigned, a custom text file is created.
I can't seem to find out why Get-Content is not working with Where-Object.
The only issue I can see is that your Where-Object script block is on the next line:
Get-Content -Tail -1 -Path $tempfile | Where-Object
{ $_.Contains("errors") }
If you separate at the pipe it's fine.
Get-Content -Tail -1 -Path $tempfile |
Where-Object { $_.Contains("errors") }
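For the second part of the question (only writing the custom file when an error line is actually found), a rough sketch that reuses the question's own paths and $OutputFile3 variable; this is an illustration, not a tested solution:
$Testing5 = Get-Content -Tail 1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" |
    Where-Object { $_.Contains("errors") }
if ($Testing5)   # only create the alert file when an error line was found
{
    # the StreamWriter cast creates/overwrites the file, so Remove-Item/New-Item are not needed
    $stream = [System.IO.StreamWriter] $OutputFile3
    try
    {
        $stream.WriteLine('clientID 1111')
        $stream.WriteLine('SEV 1')
        $stream.WriteLine('Issue with this process')
        $stream.WriteLine("$Testing5")   # include the matched log line(s)
    }
    finally
    {
        $stream.Close()
    }
}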

Recursively check all folders and export the list of folders that have no backup file created in them during the last 24 hours

There is a folder containing database backup files. I need to recursively check all folders and export a list of the last backup file in each folder.
I have code here that contains my idea; I just have to add this part:
- For each selected file (the selected file is the last created backup file), check whether its creation time is older than 24 hours and, if so, export it to a CSV file.
Thanks in advance
[Cmdletbinding()]
param(
    [Parameter(Position=0,Mandatory=$false,ValueFromPipeline=$true)]
    $path = "F:\backups",
    [Parameter(Position=1,Mandatory=$false,ValueFromPipeline=$true)]
    $OutPutFilepath = "f:\backup-daily.csv"
)
function Get-LastestWroteFile{
    [Cmdletbinding()]
    param(
        [Parameter(Position=0,Mandatory=$true)]$Folder
    )
    begin{
        $Latest = Get-ChildItem $Folder.FullName -File |
            select FullName, CreationTime, LastAccessTime, LastWriteTime, Attributes,
                   @{N='SizeInMb';E={$_.Length/1mb}}, Name |
            Sort-Object CreationTime | select -Last 1
    }
    process{
    }
    end{
        # new custom object with the output properties
        if($Latest){
            return New-Object PSObject -Property @{
                "FullName"      = $Latest.Name
                "LastWriteTime" = $Latest.LastWriteTime
                "Folder"        = $Folder.FullName
                "SizeInMB"      = [math]::Round($Latest.SizeInMB, 3)
            } # FileInfo=$Latest
        }
    }
}
$OutPut = @()
Get-ChildItem -Directory -Path $path -Recurse | foreach{
    $OutPut += Get-LastestWroteFile $_
}
$OutPut | ConvertTo-Csv -NoTypeInformation -Delimiter '|' | Out-File -FilePath $OutPutFilepath
An advanced function is not required here; try the below:
Get-ChildItem -Path $Path -Recurse -File | Where-Object -FilterScript {
    ([datetime]::Now - $_.CreationTime).TotalHours -gt 24    # TotalHours, not Hours, so the window is a true 24 hours
} | Select-Object -Property Name, LastWriteTime, FullName, @{N='SizeInMb'; E={$_.Length/1mb}}
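Since the question asks for a CSV export, the same pipeline can be finished with Export-Csv; a minimal sketch assuming the $path and $OutPutFilepath values from the question's param block:
Get-ChildItem -Path $path -Recurse -File |
    Where-Object { ([datetime]::Now - $_.CreationTime).TotalHours -gt 24 } |
    Select-Object -Property Name, LastWriteTime, FullName, @{N='SizeInMb'; E={$_.Length/1mb}} |
    Export-Csv -Path $OutPutFilepath -NoTypeInformation -Delimiter '|'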

Merging Text files in Different Folders in a Parent folder

I have 50 sub-folders inside a single parent folder.
Inside each sub-folder there are multiple .txt files. I want to merge all the text files in each sub-folder into one .txt file.
But I want a command that does this in one go for all the sub-folders; I don't want to write a command for each sub-folder.
For example:
ABCD (parent folder) contains two sub-folders, A and B.
A\0001.txt
A\0002.txt
I want to merge these into a single text file in A.
B\0001.txt
B\0002.txt
I want to merge both the text files in the B folder.
Can it be done in one go?
This is probably a lot easier using PowerShell.
Try the following and change $basedir to the parent folder of all your subdirectories.
$basedir = "C:\Basedir"
$folderlist = Get-childitem -Path $basedir
foreach ($folder in $folderlist)
{
$dir = $folder
$outFile = Join-Path $dir "merged.txt"
# Build the file list
$fileList = Get-ChildItem -Path $dir -Filter File*.txt -File
# Get the header info from the first file
Get-Content $fileList[0] | select -First 2 | Out-File -FilePath $outfile -Encoding ascii
# Cycle through and get the data (sans header) from all the files in the list
foreach ($file in $filelist)
{
Get-Content $file | select -Skip 2 | Out-File -FilePath $outfile -Encoding ascii -Append
}
}
Maybe old, but useful: this version works with folders and subfolders recursively:
$basedir = "..."
$folderlist = Get-childitem -Path $basedir -Recurse -Directory | Select-Object FullName
foreach ($folder in $folderlist)
{
Write-Host $folder.FullName
$dir = $folder.FullName
$outFile = Join-Path $basedir "merged.txt"
# Build the file list
$fileList = Get-ChildItem -Path $dir -Filter *.log | Select-Object FullName
# Get the header info from the first file
#Get-Content $fileList[0] | select -First 2 | Out-File -FilePath $outfile -Encoding ascii
# Cycle through and get the data (sans header) from all the files in the list
foreach ($file in $filelist)
{
Write-Host $file.FullName
Get-Content $file.FullName | Out-File -FilePath $outfile -Encoding ascii -Append
}
}
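For the layout described in the question (plain 0001.txt, 0002.txt files, no header handling), a shorter per-sub-folder sketch; the parent path and the merged.txt output name are assumptions, so adjust them to suit:
$basedir = "C:\ABCD"   # parent folder
Get-ChildItem -Path $basedir -Directory | ForEach-Object {
    $outFile = Join-Path $_.FullName "merged.txt"
    # collect the content first so the output file is not picked up while reading
    $content = Get-ChildItem -Path $_.FullName -Filter *.txt -File |
        Where-Object { $_.Name -ne 'merged.txt' } |
        Sort-Object Name |
        Get-Content
    if ($content) { $content | Set-Content -Path $outFile -Encoding ascii }
}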

How can I use PowerShell to copy the same file to many directories with the same name?

I'm trying to copy one file to any subfolder in a directory that has a specific name. I am part way there, but just can't quite get it to work.
I am able to find all of the subfolders called "help" using:
Get-ChildItem -Path Y:\folder1\subfolder -Directory -Recurse | ? { ($_.PSIsContainer -eq $true) -and ($_.Name -like 'help')}
That will get any folder in Y:\folder1\subfolder named help. So I have been trying:
$folder = Get-ChildItem -Path Y:\folder1\subfolder -Directory -Recurse | ? { ($_.PSIsContainer -eq $true) -and ($_.Name -like 'help')}
foreach ($f in $folder){
Copy-Item Y:\Info.html -Destination $folder[$f]
}
and that does not work. Bonus points if you can also tell me how to have it write out to a csv file all of the directories it copies the file to.
Thanks
I wrote this with version 3, but I think it will work with 1 and 2 since I used Set-StrictMode -Version <number> to test them.
The CSV output will look something like this for every line: Y:\Info.html,Y:\folder1\subfolder\help
$logpath = 'C:\log.csv'
$logopts = @{filepath=$logpath; append=$true; encoding='ascii'}
$file = 'Y:\Info.html'
$path = 'Y:\folder1\subfolder'
$search = 'help'
gci $path -d -s `
    | ?{ $_.psIsContainer -and $_.name -match $search } `
    | %{
        cp $file $_.fullName;                 # copy file
        $line = $file, $_.fullName -join ','; # build output
        $line | out-file @logopts;            # write output
    }
Version 1
$folders = @(
    (gci Y:\folder1\subfolder -dir -r | ? {$_.Name -like 'help'}).fullname
)
ForEach ($f in $folders) {
    Copy-Item Y:\Info.html $f
}
Version 2
(gci Y:\folder1\subfolder -dir -r | ? {$_.Name -like 'help'}).fullname | % {cp Y:\Info.html $_}
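For the bonus CSV request with these shorter versions, a sketch that records every destination with Export-Csv (the log path and column names are my own choices):
$log = foreach ($f in (Get-ChildItem Y:\folder1\subfolder -Directory -Recurse |
        Where-Object { $_.Name -like 'help' }))
{
    Copy-Item -Path Y:\Info.html -Destination $f.FullName
    [pscustomobject]@{ Source = 'Y:\Info.html'; Destination = $f.FullName }
}
$log | Export-Csv -Path C:\log.csv -NoTypeInformation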
