Evaluating Variables first in PowerShell Commands

I'm trying to get PowerShell to evaluate variables before executing a command, for example:
$OutputPath = "C:\Temp"
Get-ChildItem -include *.mp3 | Move-Item -Destination $OutputPath
However, Move-Item seems to interpret this literally, so nothing gets moved. The script works fine whenever I enter the path directly, but I need to be able to control the path with a variable. How do I do this?

This answer could evolve, but as your question stands I see a big issue that is unfortunately poorly documented: -Include and -Exclude only perform their intended functions when partnered with -Recurse. Used without it, they can yield 0 results. In your case that would mean nothing is passed through the pipe, and Move-Item is never executed.
Currently you are just filtering on *.mp3, which is basic enough to be handled by the -Filter parameter instead. While you don't strictly need to, I would also recommend specifying -Path so that you move the files you intended.
Get-ChildItem -Filter *.mp3 | Other-Stuff ...
You mentioned in comments the following error:
Move-Item : Cannot create a file when that file already exists.
That error is very specific. Either from previous testing or an oversight in new names, that file indeed already exists. Two things can help with that problem. First, use the -WhatIf switch, which quotes the target file path on the verbose stream so you know where the file would end up.
Second, if you understand the data risk, use -Force so that the existing file is overwritten by the new one. With Copy-Item that is not a big deal, since the original file still exists; mistakes with Move-Item can be permanent.
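Putting those pieces together, a minimal sketch of the corrected pipeline might look like this (the C:\Music source path is an assumption for illustration):
$OutputPath = "C:\Temp"
# Preview first: -WhatIf reports where each file would go without moving anything.
Get-ChildItem -Path C:\Music -Filter *.mp3 | Move-Item -Destination $OutputPath -WhatIf
# When the preview looks right, drop -WhatIf; add -Force only if overwriting is acceptable.
Get-ChildItem -Path C:\Music -Filter *.mp3 | Move-Item -Destination $OutputPath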

Related

PS script copy new files

I'm pretty new at writing PS scripts. I wrote this up and have been actively using it, although it still requires some manual intervention; my goal is to automate the process completely.
I will try my best to explain clearly:
I am trying to copy '.bak' files to a specific directory from a source folder that has files dropped into it on a daily basis. The problem is that, the way I created the script, every time it runs it creates a new folder containing some of the same files that were already copied.
The files being copied all follow the same name structure in date sequence:
xxxx_2018_01_01_2131231.bak
xxxx_2018_01_02_2133212.bak
xxxx_2018_01_03_2199531.bak
How could I write the script so that it copies newer files only and not what has already been copied previously?
It would also be nice to only create a new folder when a certain part of the file name changes.
Here is the script:
$basedir = "Path:\path"
$today = (Get-Date).ToString('MM_dd_yy')
$Filter = '*.bak'
$location = New-Item -Path $basedir -Type Directory -Name $today
Copy-Item -Path 'Path:\path' -Destination $location -Filter $Filter -Recurse
Any pointers are greatly appreciated!
Thanks in advance for your help!
I'm not sure if there is an easy way to code this, but the general answer would be to use the Get-ChildItem cmdlet.
"The Get-ChildItem cmdlet gets the items in one or more specified locations. If the item is a container, it gets the items inside the container, known as child items. You can use the -Recurse parameter to get items in all child containers and use the -Depth parameter to limit the number of levels to recurse."
By using Get-ChildItem, you could get the listing of files in both directories and then compare them to see if they have the same name. Then build an if() statement based on the criteria you wish to use to compare them.
It's not the complete answer, but it is a good starting point.
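As a rough sketch of that idea (the archive subfolder name is hypothetical, and this is untested against your data), you could compare the two listings by name and copy only what the destination is missing:
$source = 'Path:\path'            # the drop folder (placeholder path from the question)
$dest   = 'Path:\path\archive'    # hypothetical destination folder
# Collect the names already present anywhere under the destination.
$existing = Get-ChildItem -Path $dest -Filter '*.bak' -Recurse | ForEach-Object { $_.Name }
# Copy only the .bak files whose names have not been copied before.
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { $existing -notcontains $_.Name } |
    Copy-Item -Destination $dest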
Thanks everyone for pitching in, much appreciated!
I have switched over to the batch file route and have created the following to accomplish my goal:
@echo off
setlocal
set _source="C:\users\user1\desktop\source"
set _dest="C:\users\user1\desktop\dest"
robocopy %_source% %_dest% *.bak /mir /XC /XN /XO
Any opinion on this script is encouraged!
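For comparison, a rough PowerShell equivalent of the copy-only-new part of that robocopy call might look like the following sketch (it deliberately ignores /mir, which would also delete destination files that no longer exist in the source):
$source = 'C:\users\user1\desktop\source'
$dest   = 'C:\users\user1\desktop\dest'
# Copy each .bak file only if a file of the same name is not already in the destination.
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { -not (Test-Path (Join-Path $dest $_.Name)) } |
    Copy-Item -Destination $dest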

How to use short-cut paths to Compress-Archive to zip current folder into same destination

I am using Compress-Archive and want to zip the current directory into the same path. However I do not want to have to type out the entire file path both times. Is there an easy way to do this?
I am using Windows 10 Pro.
This works for the most part: Compress-Archive . test.zip. But I want the zip to end up on the same level as the current directory, so I need to put it back one spot.
Something like this is what I want:
path/test
path/test.zip
What I am getting:
path/test
path/test/test.zip
It is going inside the actual folder, which is not what I want.
You probably want this:
Compress-Archive * ..\test.zip
The wildcard * prevents the name of the folder itself from being put inside the zip.
Using .. for the output path, we go one level up in the directory tree.
This command will fail if test.zip already exists. Either add the -Update parameter to update the archive, or add -Force to overwrite it. Both can be used even if the archive does not already exist.
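For example, a minimal variant that recreates the archive on every run:
Compress-Archive * ..\test.zip -Force   # overwrites ..\test.zip if it already exists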
If the current working directory is "t", it can be included using the following command. I would note that I do not think putting the destination .zip file in the directory being compressed is a good idea.
Compress-Archive -Path $(Get-ChildItem -Recurse -Exclude t.zip) -DestinationPath .\t.zip -Force
It is shorter if you are willing to use aliases and cryptic switches.
Compress-Archive $(gci -r -e t.zip) .\t.zip -Force
If I have misinterpreted your situation, please leave a comment or improve the information provided by editing the question.

Powershell find and replace not recursing through sub-directories

I would like to recurse through all subfolders of a given folder, and for all files of a given extension (optionally wildcard), search and replace some piece of text.
There's a similar previous question here, and I'm trying to get a modified version of the top answer working for my purpose. It uses Powershell for the find and replace. However, my modified version below is not recursing. I don't remember where I found the recurse part of the first line, so maybe I'm using that incorrectly.
Get-ChildItem *.* -exclude *.ps1* -recurse |
Foreach-Object {
$c = ($_ | Get-Content)
$c = $c -replace 'foo','bar'
[IO.File]::WriteAllText($_.FullName, ($c -join "`r`n"))
}
I am running the code from the Powershell command line as a PS batch file, .ps1, hence the exclusion of that extension.
This works fine if I run it directly in a folder that has some files I want searched, but if I run it from a parent of that folder, the searching/recursion of subfolders does not happen. It doesn't even try to, which I can tell because the command finishes instantly. If I run it in my first subfolder, which has about 50 text files, there is a couple seconds' delay before it finishes, indicating that it is actually searching those files.
I've also tried this with the exclude portion removed and the behaviour is the same.
My guess is that *.* matches only items with a dot in them. Folders have no extension, so they aren't matched.
Replacing the first line with gci -rec -force -exclude *.ps1 | should do what you want.
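A fuller sketch of the fixed loop, with a files-only filter so Get-Content is never called on a directory (an assumption-level rewrite, so test it on a copy first):
Get-ChildItem -Recurse -Force -Exclude *.ps1 |
    Where-Object { -not $_.PSIsContainer } |
    ForEach-Object {
        $c = Get-Content $_.FullName
        $c = $c -replace 'foo', 'bar'
        [IO.File]::WriteAllText($_.FullName, ($c -join "`r`n"))
    }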

Recursive listing of a particular file type in Windows PowerShell

I was trying to recursively list all the *.py files in a directory in Windows PowerShell using ls. It didn't work. I checked the man page and then an online page: the -Recurse flag doesn't accept wildcards, so *.py wouldn't work.
I want to know whether there is a simple in-built way to recursively list the files of a particular file extension in a directory using Windows PowerShell 2.0?
I am a newbie in PowerShell and don't want to get into shell scripting at this stage, so please recommend only in-built commands, flags, etc., if any.
By commands I mean the in-built keywords of PowerShell. I am not sure if that is the proper vocabulary for PowerShell.
Use the Filter parameter:
ls $path -filter *.py -recurse
This will do the trick for you.
gci -Recurse | ? {$_.Name -match '\.py$'}
or
gci -Recurse -Include *.py
ls is an alias for Get-ChildItem (so are dir and gci). A simple Google search will give you all kinds of examples, but the thing to know is that -Include (and -Exclude) are built-in parameters that show (or hide) the file types you are looking for. An additional parameter, -Filter, can also be used for partial file names. The switch parameter -Recurse checks the contents of subfolders in the directory as well. Example01:
gci -path \\$computername\$directory -Include *.py -recurse
Example02:
gci -path \\$computername\$directory -Filter *.py -recurse
I like to suppress errors, so a full example would look like this:
gci -path \\Svr01\C$ -Include *.py -recurse -erroraction SilentlyContinue
I don't see how anybody would have got this working at any time.
gci -recurse works fine and lists everything.
gci -filter *.txt works fine and lists all .txt files, but does not recurse.
gci -filter *.txt -recurse returns either only the .txt files from the root, or nothing.
It seems to apply the *.txt filter to the directory names, therefore sees no directories and thus does not recurse at all.
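If -Filter combined with -Recurse misbehaves in a given host, one workaround (a sketch, not an explanation of the behavior above) is to recurse unfiltered and match the extension afterwards:
Get-ChildItem -Recurse | Where-Object { $_.Extension -eq '.txt' }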

Windows PowerShell script to rename all files in each directory to the parent's name

I'm working on learning Windows PS - but until then I need some help writing a quick command:
I have a directory of directories, each directory has a unique name.
I need to rename all files within each of these directories to their parent's name, as follows:
Current Structure:
/pdfdirectory/pdf_title_directory/filename.pdf
/pdfdirectory/pdf_title_directory123/filename.pdf
After shell script:
/pdfdirectory/pdf_title_directory/pdf_title_directory.pdf
/pdfdirectory/pdf_title_directory123/pdf_title_directory123.pdf
Thanks!
With good faith that you are learning Powershell and will try out stuff:
gci .\test -recurse | ?{ -not $_.PsIsContainer } |
%{rename-item -path $_.fullname -newname ($_.Directory.Name + $_.Extension)}
Of course, the above will fail if there is more than one file in the directory; one possible workaround is sketched below.
To understand the above, learn basic Powershell commands like gci (an alias for Get-ChildItem) and Powershell pipeline concepts.
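If some directories do hold more than one file, a hedged variation could number the extras instead of failing (sketch only; note that a file already carrying the target name gets bumped to _1, so test on a copy first):
gci .\test -Recurse | ?{ -not $_.PsIsContainer } | %{
    $i = 0
    do {
        # try dirname.pdf first, then dirname_1.pdf, dirname_2.pdf, ...
        $suffix  = if ($i -eq 0) { '' } else { "_$i" }
        $newName = $_.Directory.Name + $suffix + $_.Extension
        $i++
    } while (Test-Path (Join-Path $_.DirectoryName $newName))
    Rename-Item -Path $_.FullName -NewName $newName
}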
It's worth noting that you can pipe the output from Get-ChildItem directly into Rename-Item; you don't need a foreach. Also, and this is the clever part, the -NewName parameter value can be a script block that yields the new name. This script block can use $_ to refer to the file object currently being processed.
So you could do something like this:
dir -Recurse -Include filename.pdf | ren -NewName { "$($_.Directory.Name).pdf" }
(I think it was Keith Hill that made me aware of this trick.)
