I have a script that reads a log file and creates a text file with some output taken from the log file.
The script works and picks up the right log file as long as it is on my C:\ drive.
The original file is located on a network drive called S:\, but when I point the script at the full path where the file lives, I get an error saying that the drive wasn't found and that a drive called S does not exist.
How can I connect to a network drive?
$inVar = Select-String -Path C:\Dev\Script\Ldorado.log -Pattern "IN:"   # works
$inVar = Select-String -Path S:\Lic_Debug\Ldorado.log -Pattern "IN:"    # does not work!
Thanks for all the answers. I actually managed it by changing the path to start with \\fs01\, because that is what S:\ represents, and it works. Thanks.
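In code form the change looks something like this; the exact layout of the share under \\fs01 is a guess, since the post only names the server:
# UNC form of the path (share layout after \\fs01 is assumed, not confirmed in the post)
$inVar = Select-String -Path \\fs01\Lic_Debug\Ldorado.log -Pattern "IN:"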
Try prepending the path with "FileSystem::", i.e. -Path FileSystem::S:\Lic_Debug\Ldorado.log
This ensures your script uses the FileSystem provider, which should correctly recognise the drive.
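A minimal example of that form, reusing the path from the question:
# Qualify the mapped drive with the FileSystem provider so S: resolves
$inVar = Select-String -Path FileSystem::S:\Lic_Debug\Ldorado.log -Pattern "IN:"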
We have an internal process set up in PowerShell which runs an executable, internal.exe, that creates a lot of log files in an absolute path "C:\This\is\absolute" containing all the log files of the past 30 days. "C:\This\is\absolute" also contains log files from other applications, and while internal.exe runs and creates log files in "C:\This\is\absolute", another application might create a log file there as well.
Now we need to send the log files created by internal.exe, and for this they have to be moved to another folder, "C:\Move\here", after they have been created.
The process is currently simply set up as
Start-Process -FilePath "internal.exe"
I was looking for something like
Get-Outputfiles (Start-Process -FilePath "internal.exe") | foreach {Move-Item -Path $_ -Destination "C:\Move\here"}
but I only found ways to write output to files, e.g. via Out-File. Is there a way to get something like this Get-Outputfiles, which lists the paths of the output files created by a process?
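As far as I know there is no built-in Get-Outputfiles cmdlet. A minimal sketch of one workaround, assuming internal.exe can be run to completion with -Wait: record the start time and move only files created after it. If other applications also write to the folder during the run, an extra filter on the file name pattern (not given in the question) would still be needed.
$logDir = "C:\This\is\absolute"
$dest   = "C:\Move\here"
$start  = Get-Date
# Run the process to completion, then pick up only files created after $start
Start-Process -FilePath "internal.exe" -Wait
Get-ChildItem -Path $logDir -File |
    Where-Object { $_.CreationTime -ge $start } |
    Move-Item -Destination $dest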
I am very new to PowerShell but have learned that it is a very powerful tool to master. I am attempting to take a folder from a location on my PC, zip it, and then paste it into a folder located on a network drive. I have figured out how to create a zip of a folder and send it to a location on my machine, but I cannot figure out how to send it anywhere else.
Is this the way to go about it?
Is this even possible?
I have been able to use
"Compress-Archive -Path C:\ProgramData\Microsoft C:\users\MYACT\Desktop\Folder"
(I am using google as an example because I am not sure if I can put the actual location on here.)
I am attempting to use (but failing)
"Compress-Archive -Path C:\ProgramData\Microsoft ftp://ftp.google.com\parent\Sub"
When I run the command I receive:
Compress-Archive : The path 'ftp:\ftp.google.com\parentFolder\SubFolder' either does not exist or is not a valid
system path.
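Compress-Archive only accepts filesystem paths, so an ftp:// URL will not work as the destination. A sketch of one approach, assuming the target is also reachable as a UNC network path (\\server\share below is a placeholder, not from the question): build the archive locally first, then copy it across.
# Create the zip in a local temp folder, then copy it to the network location (UNC path is hypothetical)
$zip = Join-Path $env:TEMP "Microsoft.zip"
Compress-Archive -Path C:\ProgramData\Microsoft -DestinationPath $zip -Force
Copy-Item -Path $zip -Destination "\\server\share\parent\Sub"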
I am learning PS and couldn't figure this out:
$fileName = @("file1.url","file2.url","file3.url")
foreach ($a in $fileName) {
    Get-ChildItem -Path \\mySharePath\* -Include $a -Recurse | Remove-Item -Force
}
The purpose is to search for each file name in $fileName and, once it's found, delete that file.
Summarizing from the comments: I see you got it to work on your local machine but are now trying to do it against a share. If that share path is just the server without the actual path to the shared folder, it will not work. You cannot browse a server like that for its shares. You can use WMI or the newer server module cmdlets to get the share names from a server, but you have to target a share directly to browse it and interact with it.
For example, try to just Get-ChildItem that \\mySharePath\* server path with -Recurse but without the -Include, and you will get nothing back. The share path should be:
\\FileServer\sharedfolder
Recurse that with the correct server name and shared folder and you will get the results you are seeking if the files are in the shared folder.
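Putting that together with the original loop, a sketch against a hypothetical \\FileServer\sharedfolder share; note that -Include accepts an array, so the foreach isn't strictly needed:
# Target the share itself, not just the server (server and share names are placeholders)
$fileName = @("file1.url","file2.url","file3.url")
Get-ChildItem -Path \\FileServer\sharedfolder\* -Include $fileName -Recurse | Remove-Item -Force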
Developing our own application for our company only, we have written a script used for installation from a shared drive. Besides the installation itself, the script should also create/update values in the registry of the particular user (HKEY_CURRENT_USER).
These values are separated for:
Directories (HKCU:\Software\Classes\Directory)
All File Extensions (HKCU:\Software\Classes\*)
For the Directory key the update is immediate, whereas for the extensions it seems to take quite some time depending on the machine hardware (from 40 seconds to 2 minutes).
Now the trouble is creating an "entry" in the registry for the key named * only. I have asked a separate question to resolve this (PowerShell: How do I create selector on file/folder, whose name is '*' (asterisk/star)?).
Ignoring the issue above, we have found a solution for how the string path works; however, I'm not sure what is happening behind the code and do not understand why it takes so long.
# Directory
New-Item -Path "HKCU:\Software\classes\Directory" -Name "shell" | Out-Null
# All Files Extension
New-Item -Path "HKCU:\Software\classes\[*]" -Name "shell" | Out-Null
One idea is that the [*] solution actually goes through all the file extensions, but the registry itself shows this new item under the * key and not under the particular extensions.
Another idea: when we have a registry file (*.reg), running the file adds the registry entry immediately and resolves the case.
Questions:
What is actually happening when we run the query to add an entry under the [*] selector?
How can this process be optimized to lower the time for creating a new key in the registry for all file extensions?
I suspect what's happening is that the -Path in your New-Item call is being expanded as a wildcard because of the *, hence the delay.
Here's a workaround to the issue:
Set-Location -LiteralPath "HKCU:\Software\classes\*"
New-Item -Name "shell"
New-Item uses the current location as the -Path if one is not explicitly passed.
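If the script needs to return to where it was afterwards, the same workaround can be wrapped in Push-Location/Pop-Location (a small variation, not part of the original answer):
Push-Location -LiteralPath "HKCU:\Software\Classes\*"   # -LiteralPath stops the * being treated as a wildcard
New-Item -Name "shell" | Out-Null
Pop-Location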
I'm in a small business environment (Win 7 Pro 64 workgroup, single LAN) where I would like to run a script that makes a link on the local machine to a folder on our NAS, copies system restore files, and then removes the link. The objective is to isolate system restores as an added layer of protection; ideally this process includes severing direct links to the NAS from any user machine except while copying these files.
I have already allowed access to the System Volume Information folder for my account.
Searches have turned up a lot of posts about scripts with mklink, but I couldn't find a solid example of what I wanted, and the languages I've seen used range from bash to PowerShell.
I have scripted a lot in VBA but not with PowerShell, or even much in VBScript, so which language is most appropriate for this? If PowerShell, I'm going to have to install it on the Win 7 machines, which is not a huge deal. VBScript or batch would be the easiest for me to write and distribute. I also have Perl installed on my machine but would have to install it on a handful of other machines.
My first question is: which language would be best in this situation? I would prefer VBScript or batch if possible, or PowerShell. Perl if necessary.
Second question: can someone give me an example script? A typical mklink command looks like this (from what I understand):
mklink /d :name: :target:
Should I use the /j flag instead of /d? Should I mount the drive instead (I'm unclear on how to do this with the Windows command line or PowerShell)?
Also if this will not work in the first place feel free to let me know. Thanks.
So you could do this with PowerShell easily enough. You would not use mklink, but would instead create a temporary mapping to the NAS location (it has been several years since I worked with a real NAS; I'm hoping I remember right and that they do in fact have a UNC path such as \\NAS01\Share). You would use New-PSDrive and just not use the -Persist parameter, so the connection ends when the script finishes and the PowerShell session exits. Then you can copy files as needed. In fact, you could set it to check all the files in both locations and only upload the files that are not already on the share. So, assuming you have a folder created on the share for each computer to be backed up, you could do this:
New-PSDrive -Name NAS -Root "\\NAS01\Share\$env:COMPUTERNAME" -PSProvider FileSystem
$Source = 'C:\System Volume Information'
Compare-Object (Get-ChildItem $Source -Recurse) (Get-ChildItem 'NAS:\' -Recurse) -PassThru |
    Where{ $_.SideIndicator -eq "<=" } |
    ForEach{ Copy-Item $_.FullName -Destination ($_.FullName -replace [regex]::Escape($Source), 'NAS:') }
That essentially creates a drive named NAS mapped to the network path described, pulls a directory listing for both locations, and for items that are in the source but not in the destination, copies them to the destination. Once the script finishes running, PowerShell should exit and remove the link to the NAS when it does.
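If you would rather not rely on the session ending to drop the mapping, an explicit cleanup at the end of the script (a small addition, not in the original answer) works as well:
Remove-PSDrive -Name NAS   # drop the temporary mapping as soon as the copy is done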