I feel like this one should be easy, but it's giving me a bit of trouble. The goal is to get downloaded WSUS updates migrated to a separate drive that can be burned to a disk and moved to a secure environment.
The current process was set up by another guy and I am taking over. He has two batch files that run; we'll call them first.bat and second.bat.
first.bat runs and spits out a log of the new files; we'll call it new.txt. It simply contains the full paths of the changed hash files, e.g. C:\folder\sub\1A\HASHFILE.cab
Then we copy the file paths from new.txt by hand, paste them into second.bat, and add an xcopy command and a destination folder on the new drive, e.g. xcopy C:\folder\sub\1A\HASHFILE.cab E:\export\1A. However, we have to add the hash folder identifier (1A, for example) manually. I would like the script to pick it up from the source folder path and add it to the destination, to eliminate the potential for human error.
What I am trying to accomplish with a script is this:
run first.bat
take the info from new.txt and modify it to
add an xcopy command for every new file listed in new.txt, along with the destination folder path
automatically append the hash folder callout (e.g. 1A) to the end of the destination folder path, i.e. E:\export\1A
run second.bat
Anyone have any ideas? I would like to wrap all of this up (or a similar function if that's easier, which I imagine it might be) into a handy script that will automate this tedious process.
I have tinkered with a few ideas; I can get it to spit out the destination path, but it doesn't actually move anything. Here is what I have so far, which successfully tags the hash marker onto the end of the destination folder but does nothing else:
$source = "C:\temp\test folder\"
$dest = "C:\Temp2\export\"
$folders = Get-ChildItem $source | Where-Object {$_.PSIsContainer -eq $true} | Sort-Object
foreach ($folder in $folders){
$fname = $folder.FullName
$sourcefolder = $fname.Replace($source,"")
$newdest = $dest + $sourcefolder
Write-Output $newdest
}
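What I think the missing copy step looks like is something along these lines; a rough, untested sketch that assumes new.txt holds one full source path per line (the C:\folder\new.txt location below is just a placeholder) and that E:\export is the destination root:
$dest = "E:\export"
# Placeholder location for new.txt; point this at wherever first.bat writes it
Get-Content "C:\folder\new.txt" | ForEach-Object {
    $sourceFile = $_.Trim()
    if ($sourceFile) {
        # The parent folder of the .cab is the hash folder identifier (e.g. 1A)
        $hashFolder = Split-Path (Split-Path $sourceFile -Parent) -Leaf
        $target = Join-Path $dest $hashFolder
        if (-not (Test-Path $target)) { New-Item -ItemType Directory -Path $target | Out-Null }
        Copy-Item $sourceFile -Destination $target
    }
}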
I downloaded a backup folder of about 3,000 files from our email service provider. None of the files have an associated filetype; instead the file extension was appended to the name of each individual file. For example:
community-involvement-photo-1-jpg
social-responsibility-31-2012-png
report-02-12-15-pdf
I can manually change the last dash to a period and the files work just fine. I'm wondering if there is a way to batch convert all of the files so they can be sorted and organized properly. I know at the command line I can do something like ren *. *.jpg, but there are several different file types in the folder, so it wouldn't work for all of them. Is there any way I can tell it to convert the last "-" in each file name into a "."?
I'm on Windows 10; unable to install any filename conversion programs unless I want to go through weeks of trouble with the IT group.
$Ordner = "c:\temp\pseudodaten"
$Liste = (Get-ChildItem -Path $Ordner).Name
cd $Ordner
foreach ($Datei in $Liste) {
    # Assumes the trailing extension token is always three characters long (jpg, png, pdf, ...)
    $Length = $Datei.Length
    $NeuerName = $Datei.Substring(0, $Length - 4) + "." + $Datei.Substring($Length - 3, 3)
    Rename-Item -Path $Datei -NewName $NeuerName
}
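If the extension token isn't always three characters long (jpeg, html, ...), a variation that simply swaps the last dash for a dot might be safer. A sketch, assuming the files sit in the same c:\temp\pseudodaten folder:
# Replace the last "-" in each file name with "." so the final token becomes a real extension
Get-ChildItem -Path "c:\temp\pseudodaten" -File |
    Where-Object { $_.Name -match '-[^-]+$' } |
    Rename-Item -NewName { $_.Name -replace '-([^-]+)$', '.$1' }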
I'm relatively new to programming so I need some help.
I have 2 directories
C:\test1
C:\test2
So test1 will constantly get new files.
Which look like this:
testA000_00001.txt0..txt
test00A0_00102.txt1..txt
test00A0_00102_00123.txt45..txt
...
testG000_00999.txt999..txt
testH000_00013.txt0..txt
Since it's essential that the files in test1 stay exactly the way they are, I need copies of them in test2.
And since test2 needs to be the current version, this has to happen the moment the files land in test1.
But without the .txt0. to .txt999. part:
testA000_00001.txt
test00A0_00102.txt
test00A0_00102_00123.txt
...
testG000_00999.txt
testH000_00013.txt
It's also essential that these files are only copied once, since they aren't going to stay in test2 for long.
I tried it with xcopy and some other copy variants, but each time the files are copied back into test2: after I move the files out of test2, they get copied into it again.
(Sorry, can't comment yet.)
I don't have exact code yet (working on it).
What I think would work is to:
Check a text file for things it has already copied
Schedule a task to run every minute or so
Move to the folder you want
Use the dir command to get all files in the current directory
Copy them
Write these to a text file
Change the names
Loop
When I get the code done I will edit.
Also, moving them as soon as they enter the directory is very resource-intensive, but it might be possible if the code is very efficient.
#make a folder
md "C:\test\test3"
#move to test1
cd "C:\test\test1"
# Presets
$txt = ".txt"
$files = dir
#clean up the files list
$files = $files -split "`r`n"
#for each file it found
foreach ($line in $files) {
#copy the file from test1 to test3
Copy-Item "C:\test\test1\$line" -destination "C:\Test\test3"
#take the first 14 chars from the file name
$line_name = $line.Substring(0, 14)
#add a .txt extension
$line_name = $line_name + $txt
#rename the files in test3
Rename-Item -path "C:\test\test3\$line" -NewName "$line_name"
#move the files to test2 and, if they already exist, replace (update) them
Move-Item -Path "C:\test\test3\$line_name" -Destination "C:\Test\test2" -Force
}
Remove-Item -Path "C:\test\test3" -Recurse -Force
This code is super close to working, but I might not be able to finish today.
It only fails to move the file test00A0_00102_00123.txt45..txt,
because its longer name breaks the fixed 14-character substring. I will update the script when it works fully.
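In the meantime, one way to handle names of any length might be to build the new name with a regex instead of a fixed 14-character substring, replacing the two $line_name lines with something like:
# Drop the ".txt<number>..txt" tail and keep a plain .txt extension,
# e.g. "test00A0_00102_00123.txt45..txt" becomes "test00A0_00102_00123.txt"
$line_name = $line -replace '\.txt\d+\.\.txt$', '.txt'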
Story:
I have multiple folders with 1000+ files in each that are named similarly to each other but differ slightly, even though they relate to the same content.
For example, in one folder I have files named quite simply "Jobs to do.doc" and in another folder "Jobs to do (UK) (Europe).doc" etc.
This is on Windows 10, not Linux.
Question:
Is there a script to compare each folder's content and rename the files based on minimum similarity? The end result would be to remove all the jargon and have each file in each folder (multiple) named the same as one another, but STILL remain in its respective folder?
Basically, compare the contents of multiple folders to one folder's contents and rename the files so each file in each folder is named the same?
Example:
D:/Folder1/Name_Of_File1.jpeg
D:/Folder2/Name_Of_File1 (Europe).jpeg
D:/Folder3/Name_of_File1_(Random).jpeg
D:/folder1/another_file.doc
D:/Folder2/another_file_(date_month_year).txt
D:/Folder3/another_file(UK).XML
I have used different file extensions in the above example in the hope that someone can write a script that ignores file extensions.
I hope this makes sense. So either a script to remove the content in brackets and keep the files' integrity, or to rename ALL files across all folders based on minimum similarity.
The problem is that there are 1000+ files in each folder, so I want to run it as an automated job.
Thanks in advance.
If the stuff you want to get rid of is always in brackets, then you could write a regex like
(.*?)([\s_]*\(.*\))
Try something like this
$folder = Get-ChildItem 'C:\TestFolder'
$regex = '(.*?)([\s_]*\(.*\))'
foreach ($file in $folder){
if ($file.BaseName -match $regex){
Rename-Item -Path $file.FullName -NewName "$($matches[1])$($file.extension)" -Verbose #-WhatIf
}
}
Regarding consistency, you could run a precheck using the same regex:
#for each file, record the basename it would have after the rename (or the current basename if the regex doesn't match)
$folder1 = get-childitem 'D:\T1' | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
$folder2 = get-childitem 'D:\T2' | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
#compare basenames in two folders - if all are the same nothing will be returned
Compare-Object $folder1 $folder2
Maybe you can build on that idea.
I have 3 folders: /Incoming, /Processed and /Temp. The incoming folder is updated with new files hourly and currently has 120k+ individual .zip files in it. Every hour these files are copied to the processed folder, where they are unzipped and the records are inserted into a SQL table. The table is dropped and recreated every hour and all files are re-imported. This process is starting to take a long time.
All the file transfers are currently being done in a cmd batch file, using robocopy /MIR and a SQL .dtsx file for importing.
I am trying to find a method to compare the incoming folder with the processed folder before new files are copied every hour, and to copy the differences to the temp folder so that only those are added to SQL instead of dropping and recreating the table every hour.
Any help would be awesome as I have spent hours on this single issue with no luck.
This solution will compare two folders (e.g. Incoming and Processed) and copy the new files in the first folder (Incoming) to the third folder (Temp) for processing.
$Folder1 = (Get-ChildItem -Recurse -path "C:\Incoming")
$Folder2 = (Get-ChildItem -Recurse -path "C:\Processed")
(Diff $Folder1 $Folder2 | ? {$_.SideIndicator -eq "<="}).InputObject |
ForEach-Object {
$ItemName1 = $_;
$ItemName2 = "C:\Incoming\" + $ItemName1;  # build the source path under Incoming (not Temp)
Copy-Item $ItemName2 -Destination "C:\Temp" -Force
}
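With 120k+ files, diffing two full directory listings like this can get slow. A name-only comparison against a set of what has already been processed might be quicker; a rough sketch, assuming file names are unique across the tree and using the same example paths:
# Build a case-insensitive set of every file name already in Processed,
# then copy only the files from Incoming whose names are not in that set.
$processed = [System.Collections.Generic.HashSet[string]]::new(
    [string[]]@(Get-ChildItem -Recurse -File -Path "C:\Processed" | Select-Object -ExpandProperty Name),
    [System.StringComparer]::OrdinalIgnoreCase)
Get-ChildItem -Recurse -File -Path "C:\Incoming" |
    Where-Object { -not $processed.Contains($_.Name) } |
    Copy-Item -Destination "C:\Temp" -Force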
You can directly use the fc command,
e.g.: fc srcfile destfile > logfile
or use ROBOCOPY.
There's something I want to accomplish with administrating my Windows file server:
I want to change the "Last Modified" date of all the folders on my server (just the folders and subfolders, not the files within them) to match the "Created" (or maybe "Last Modified") date of the most recent file within each folder. (In many cases, the date on the folder is MUCH newer than the newest file within it.)
I'd like to do this recursively, from the deepest subfolder to the root. I'd also like to do this without me manually entering any dates and times.
I'm sure with a combination of scripting and a Windows port of "touch" I could maybe accomplish this. Do you have any suggestions?
This closed topic seems really close, but I'm not sure how to touch only the folders without touching the files inside, or how to get the most recent file's date: Recursive touch to fix syncing between computers
If it's for backup purposes, in Windows there is the Archive flag (instead of modifying the timestamp). You can set it recursively with ATTRIB /S (see ATTRIB /?).
If it is for other purposes, you can use some touch.exe implementation together with a recursive FOR:
FOR /R (see FOR /?)
http://ss64.com/nt/for_r.html
http://ss64.com/nt/touch.html
I think that you can do this in PowerShell. I just tried throwing something together and it seems to work correctly. You could invoke this in PowerShell using Set-DirectoryMaxTime(".\Directory") and it will operate recursively on each directory under that.
function Set-DirectoryMaxTime([System.IO.DirectoryInfo]$directory)
{
# Grab a list of all the files in the directory
$files = Get-ChildItem -File $directory
# Get the current CreationTime of the directory we are looking at
$maxdate = Get-Date $directory.CreationTime
# Find the most recently edited file's LastWriteTime
foreach($file in $files)
{
if($file.LastWriteTime -gt $maxdate) { $maxdate = $file.LastWriteTime }
}
# This needs to be in a try/catch block because there is a reasonable chance of it failing
# if a folder is currently in use
try
{
# Give the directory a LastWriteTime equal to the newest file's LastWriteTime
$directory.LastWriteTime = $maxdate
} catch {
# One of the directories could not be updated
Write-Host "Could not update directory: $directory"
}
# Get all the subdirectories of this directory
$subdirectories = Get-ChildItem -Directory $directory
# Jump into each of the subdirectories and do the same thing to each of their CreationTimes
foreach($subdirectory in $subdirectories)
{
Set-DirectoryMaxTime($subdirectory)
}
}
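For example, invoking it once against the top of the tree (the path below is just a placeholder) is enough, since the function recurses into every subdirectory itself:
# Placeholder root path; substitute the real share or folder root
Set-DirectoryMaxTime "D:\Shares"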