I am learning PS and Couldn't figure this out:
$fileName = @("file1.url","file2.url","file3.url")
foreach ($a in $fileName) {
Get-ChildItem -Path \\mySharePath\* -include $fileName -Recurse
} | Remove-Item -Force
The purpose is to search for each file name in $fileName and, once it's found, delete that file.
Summarizing from the comments: I see you got it to work on your local machine but are having trouble doing it against a share. If that share path is just the server name without the actual path to the shared folder, it will not work. You cannot browse a server like that for its shares. You can use WMI or the newer server module cmdlets to get share names from the server, but you have to target the share directly to browse it and interact with it.
For example, try to just gci that \\mySharePath\* server path without the -Include and -Recurse and you will get nothing back. The share path should be:
\\FileServer\sharedfolder
Recurse that with the correct server name and shared folder and you will get the results you are seeking, as long as the files are in the shared folder.
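Putting the two points together, a minimal sketch might look like this (the server name FileServer and share name Backups are placeholders for your real share):

```powershell
# File names to look for
$fileName = @("file1.url", "file2.url", "file3.url")

# Target the share itself, not just the server name.
# -WhatIf previews the deletions; remove it to actually delete.
Get-ChildItem -Path "\\FileServer\Backups\*" -Include $fileName -Recurse |
    Remove-Item -Force -WhatIf
```

Note that the foreach loop in the question isn't needed: -Include accepts the whole array, and Get-ChildItem's output can be piped straight into Remove-Item.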
I have a Script that reads a log file and creates a text file with some output taken from the log file.
The script works and picks up the right log file as long as it is on my C:\ drive.
The original file is located on the network drive S:\, but if I try to read that log by entering the full path where the file lives, I get an error saying the drive wasn't found and that a drive called S does not exist.
How can I connect to a network drive?
$inVar = Select-String -Path C:\Dev\Script\Ldorado.log -Pattern "IN:"   # works
$inVar = Select-String -Path S:\Lic_Debug\Ldorado.log -Pattern "IN:"   # does not work!
Thanks for all the answers - I actually managed it by changing the path to the \\fs01\ form, because that is what S:\ represents, and it works. Thanks!
Try prepending the path with "FileSystem::" i.e. -path FileSystem::s:\Lic_Debug\Ldorado.log
This ensures your script uses the FileSystem provider which should correctly recognise the drive.
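Applied to the failing line from the question, that would look like:

```powershell
# The FileSystem:: prefix forces the FileSystem provider,
# so the mapped S: drive is resolved correctly
$inVar = Select-String -Path "FileSystem::S:\Lic_Debug\Ldorado.log" -Pattern "IN:"
```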
I am very new to PowerShell but have learned that it is a very powerful tool to master. I am attempting to take a folder from a location on my PC, zip it, and then paste it into a folder located on a network drive. I have figured out how to create a zip of a folder and send it to a location on my machine, but cannot figure out how to send it anywhere else.
Is this the way to go about it?
Is this even possible?
I have been able to use
Compress-Archive -Path C:\ProgramData\Microsoft C:\users\MYACT\Desktop\Folder
(I am using Google as an example because I am not sure if I can put the actual location on here.)
I am attempting to use (but failing):
Compress-Archive -Path C:\ProgramData\Microsoft ftp://ftp.google.com\parent\Sub
When I run the command I receive:
Compress-Archive : The path 'ftp:\ftp.google.com\parentFolder\SubFolder' either does not exist or is not a valid
system path.
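Compress-Archive only accepts file-system paths, so an ftp:// URL will always fail with this error. If the destination is actually a network share rather than a real FTP server, one approach is to build the archive locally and then copy it (the \\server\parent\Sub path below is a placeholder):

```powershell
# Build the zip in a local temp location first
$zip = Join-Path $env:TEMP "Microsoft.zip"
Compress-Archive -Path "C:\ProgramData\Microsoft" -DestinationPath $zip -Force

# Copy-Item works against a UNC share (placeholder path shown);
# a real FTP server would need an actual FTP upload instead
Copy-Item -Path $zip -Destination "\\server\parent\Sub"
```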
Pretty noob at writing PS scripts - I wrote this up and have been actively using it, although it still requires some manual intervention to achieve my goal, which I would like to automate completely.
I will try my best to explain clearly;
I am trying to copy '.bak' files to a specific directory from a source folder that has files dropped into it on a daily basis. Problem is, the way I created the script, every time it runs it creates a new folder containing some of the same files as previously copied.
The files being copied all follow the same name structure, in date sequence:
xxxx_2018_01_01_2131231.bak
xxxx_2018_01_02_2133212.bak
xxxx_2018_01_03_2199531.bak
How could I write the script so that it copies newer files only and not what has already been copied previously?
It would also be nice to only create a new folder when a certain part of the file name changes.
Here is the script:
$basedir = "Path:\path"
$today = (Get-Date).ToString('MM_dd_yy')
$Filter = '*.bak'
$location = New-Item -Path $basedir -Type Directory -Name $today
Copy-Item -Path 'Path:\path' -Destination $location -Filter $Filter -Recurse
Any pointers are greatly appreciated!
Thanks in advance for your help!
I'm not sure if there is an easy way to code this, but the general answer would be to use the Get-ChildItem cmdlet.
"The Get-ChildItem cmdlet gets the items in one or more specified locations. If the item is a container, it gets the items inside the container, known as child items. You can use the -Recurse parameter to get items in all child containers and use the -Depth parameter to limit the number of levels to recurse."
By using Get-ChildItem, you could get the listing of files in both directories and then compare them to see if they have the same name. Then build an if() condition based on the criteria you wish to use to compare them.
It's not the complete answer, but it is a good starting point.
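A minimal sketch of that idea (the source and destination paths are placeholders; this compares by file name only, matching the dated naming scheme in the question):

```powershell
$source = "C:\users\user1\desktop\source"
$dest   = "C:\users\user1\desktop\dest"

# Names of .bak files already present in the destination
# (-Recurse in case earlier runs created dated subfolders)
$existing = Get-ChildItem -Path $dest -Filter '*.bak' -Recurse |
    Select-Object -ExpandProperty Name

# Copy only the .bak files that have not been copied before
Get-ChildItem -Path $source -Filter '*.bak' |
    Where-Object { $existing -notcontains $_.Name } |
    Copy-Item -Destination $dest
```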
Thanks everyone for pitching in, much appreciated!
I have switched over to the batch file route and have created the following to accomplish my goal:
@echo off
setlocal
set _source="C:\users\user1\desktop\source"
set _dest="C:\users\user1\desktop\dest"
robocopy %_source% %_dest% *.bak /mir /XC /XN /XO
Any opinion on this script is encouraged!
Developing our own application for our company only, we have a script used for installation from a shared drive. Besides the installation itself, the script should also create/update values in the registry of the particular user (HKEY_CURRENT_USER).
These values are separated for:
Directories (HKCU:\Software\Classes\Directory)
All File Extensions (HKCU:\Software\Classes\*)
For the Directory key the update is immediate, whereas for the extensions it seems to take quite some time depending on the machine hardware (from 40 seconds to 2 minutes).
Now there is trouble creating an "entry" in the registry for the key named * only. I have asked a separate question to resolve this (PowerShell: How do I create selector on file/folder, whose name is '*' (asterisk/star)?).
Ignoring the issue above, we have found a solution for how the string path works; however, I'm not sure what is happening behind the code and do not understand why it takes so long.
# Directory
New-Item -Path "HKCU:\Software\classes\Directory" -Name "shell" | Out-Null
# All Files Extension
New-Item -Path "HKCU:\Software\classes\[*]" -Name "shell" | Out-Null
One idea is that the [*] solution actually goes through all the file extensions, but the registry itself shows this New-Item under the * key and not under the particular extensions.
Another idea: when we use a registry file (*.reg), running the file adds the registry entry immediately and resolves the case.
Questions:
What is actually happening when we run the query to add an entry under the [*] selector?
How can this process be optimized to lower the time needed to create a new key in the registry for all file extensions?
I suspect what's happening is that the -Path in your New-Item call is being expanded as a wildcard because of the *, hence the delay.
Here's a workaround to the issue:
Set-Location -LiteralPath "HKCU:\Software\classes\*"
New-Item -Name "shell"
New-Item uses the current location as the -Path if not explicitly passed to the function.
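If you'd rather not leave the script's working location changed, the same trick works with Push-Location/Pop-Location:

```powershell
# -LiteralPath stops the * from being treated as a wildcard
Push-Location -LiteralPath "HKCU:\Software\Classes\*"
New-Item -Name "shell" | Out-Null
Pop-Location   # restore the previous location
```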
I am using Compress-Archive and want to zip the current directory into the same path. However I do not want to have to type out the entire file path both times. Is there an easy way to do this?
I am using windows 10 pro.
This works for the most part: Compress-Archive . test.zip, but I want the zip to be on the same level as the current directory, so I need to put it back one level.
Something like this is what I want:
path/test
path/test.zip
What I am getting:
path/test
path/test/test.zip
It is going inside the actual folder which is not what I want
You probably want this:
Compress-Archive * ..\test.zip
The wildcard * prevents the name of the folder from being put inside the zip.
Using .. for the output path, we go one level up in the directory tree.
This command will fail if test.zip already exists. Either add the -Update parameter to update the archive or add -Force to overwrite it. Both can be used even if the archive does not already exist.
If the current working directory is "t", it can be included using the following command. I would note that I do not think putting the destination .zip file inside the directory being compressed is a good idea.
Compress-Archive -Path $(Get-ChildItem -Recurse -Exclude t.zip) -DestinationPath .\t.zip -Force
It is shorter if you are willing to use aliases and cryptic switches.
Compress-Archive $(gci -r -e t.zip) .\t.zip -Force
If I have misinterpreted your situation, please leave a comment or improve the information provided by editing the question.