Block a file with PowerShell - Windows

I want to block (not unblock) a file with Powershell. I want to cause Windows to believe that a file on disk was downloaded from the internet, or whatever other scenario exists such that files become blocked. I need this to test how some software I'm developing behaves in the presence of a blocked file.

If you're just trying to add the zone identifier, you could try something like this:
$data = "[ZoneTransfer]
ZoneId=3"
Set-Content example.txt -Stream "Zone.Identifier" -Value $data
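To confirm the mark was applied, you can read the alternate data stream back, and remove it again with Unblock-File once the test is done (a quick check, assuming the file sits on an NTFS volume, since Zone.Identifier streams require NTFS):
# Print the stream contents (should show [ZoneTransfer] / ZoneId=3)
Get-Content example.txt -Stream 'Zone.Identifier'
# List all alternate data streams attached to the file
Get-Item example.txt -Stream *
# Remove the mark once testing is complete
Unblock-File example.txt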

Related

Windows PowerShell - Input file name, output file path

I've just started using PowerShell and I have a task where I need to be able to have the file path displayed on screen when I enter the file name.
Is there a script that allows me to do the below?
Ex 1: I enter "test.txt" and I get "C:\Program Files...."
Ex 2: I enter a file name "My Documents" and I also get its path.
I have searched online on how to do this but I didn't quite find what I was looking for and all the queries/answers were too complicated for me to understand.
Can anyone help me out, please?
Thanks in advance!
Here is a starter sample for you.
This example searches only within the confines of the paths present in the Path system environment variable. It also only looks for files and does not recurse through those paths.
So anything you could access directly from the command line should be available to you through it.
Now, if you want to search the whole drive, you could replace the $DefaultPaths assignment with Get-ChildItem -Path 'C:\' -Recurse, but doing that each time won't be super efficient.
You could do it and it will work... but it will be slow.
To search the whole drive or the whole filesystem, there are alternative methods that might work better. Some examples:
Using a database, which you have to build and maintain, to index all the files, so that when you search, results are instantaneous and/or very fast.
Parsing the MFT (if using a Windows/NTFS filesystem only) instead of using Get-ChildItem (this is not something natively doable through a simple cmdlet, though).
Relying on third-party software and interfacing with it (for example, the Void Tools Everything search engine already parses the MFT and builds its own database, allowing users to search a Windows NTFS filesystem instantly. It also has its own SDK you can call from PowerShell to retrieve what you seek instantly. The caveat is that you need the software installed first for that solution to work; see the sketch just below).
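If you go the Everything route, the simplest way to reach it from PowerShell is its command-line client es.exe rather than the full SDK. A minimal sketch, assuming Everything is installed and running and es.exe is on your PATH (both of those are assumptions to verify against your install):
# Assumes Void Tools Everything is running and its command-line client
# es.exe is on PATH; es.exe prints the full path of each matching file.
& es.exe 'notepad.exe'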
Example: Searching through all paths defined in the Path variable
# What you are looking for. Accepts wildcard characters (*)
$Filter = 'notepad.exe'
# Get the system Path environment variable as an array
$DefaultPaths = $env:Path -split ';'
$Paths =
    Foreach ($P in $DefaultPaths) {
        # Search for files matching the filter. Ignore errors (often a path listed in Path no longer exists)
        $MatchingFiles = Get-ChildItem -Path $P -Filter $Filter -File -ErrorAction SilentlyContinue
        if ($MatchingFiles.Count -gt 0) {
            $MatchingFiles.Directory.FullName
        }
    }
$Paths | Out-String | Write-Host -ForegroundColor Cyan
Output for the Notepad.exe search using this method:
C:\Windows\system32
C:\Windows

Unable to copy item from mapped network drive using PowerShell as file contains weird characters

I am trying to copy files from a mapped network drive. Some of them get copied, but others are not copied because the filename contains some weird characters.
for example my mapped network drive Z: contains the following files:
skifteretsattest 1(1).pdf
MailBody.msg
kørekort terje(3).pdf
I am able to copy the first two files from the mapped network drive, but not the last one, using the command below:
Copy-Item -LiteralPath Z:\$name -Destination I:\Dat\SomePath\ss/ -Force
The error which I get is:
Copy-Item : Could not find file 'Z:\kørekort terje(3).pdf'
I tried [WildcardPattern]::Escape($name) but that also did not work
Kindly help if anybody knows the solution
Maybe you could use robocopy.exe or xcopy.exe instead?
Maybe old "dir /x" can help to find out the old "8.3" filename (like "GET-GP~1.PS1" for "Get-GPProcessingTime.ps1") and this can be used to copy or rename the file?
I also remember something about bypassing file system logic using unc-like syntax like \\0\driveletter\directory or whatever - unfortunately I don't remember the exact syntax. Maybe someone else does?
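For what it's worth, the half-remembered prefix is most likely the Win32 extended-length path prefix \\?\ (or \\?\UNC\server\share for network paths), which bypasses part of Windows' path normalization. Cmdlet support for it varies by PowerShell version, so treat this as something to experiment with rather than a confirmed fix:
# Extended-length path prefix; cmdlet support varies by PowerShell version
Copy-Item -LiteralPath '\\?\Z:\kørekort terje(3).pdf' -Destination 'I:\Dat\SomePath\ss' -Force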
Try something like this:
$files = Get-ChildItem -Path "Z:\"
# -LiteralPath avoids wildcard and special-character interpretation in the names
$files | % { Copy-Item -LiteralPath $_.FullName -Destination "I:\Dat\SomePath\ss" -Force }

Request assistance writing a PS script to search for a list of files (path + filename) against all Windows servers in my environment

What I'm trying to accomplish:
Create a PS script to run from a single Admin machine, but search against C$ on all Windows servers in AD.
Search for a specific list of paths\filenames that I provide.
If ANY of the specific list of paths\filenames are found on a server, THEN output the server name, and paths\filenames to a *.CSV file titled "Badfiles.csv" on the Admin machine.
Was trying to build from the following syntax but admittedly my old brain is not good at this stuff, and I've only specified a single file here - how do I refer to a list of multiple paths\files? Thank you for helping an old lady out. :)
$name= gc env:computername
$computers= get-content -path C:\Windows\Temp\v.bat
$csvfile = "c:\temp\$badfiles.csv"
foreach ($computer in $computers) {
"\$computer\C$\" | Get-ChildItem -recurse -filter "*.bat"
}
To refer to a list of items, whether those are files or computer names, you will need to use what is called an Array.
You can create an array in many ways; in your case it might be best to create a list in a TXT file, then read the list contents in PowerShell using Get-Content and save the result in a variable - it will automatically be saved as an array!
Then iterate through each item using what is called a foreach loop, which basically lets you take each item in the array, do something with it, then move to the next item, and so on until every item has been dealt with.
Now the most important part of what you want to achieve is not clear. Let me explain.
To check if a file exists you can use Test-Path. It returns true or false, and you can then act on the result. You need to provide the exact path and name of a file for this to work.
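For instance (the server and file names here are made up for illustration):
# Returns $true or $false; -LiteralPath avoids wildcard interpretation
Test-Path -LiteralPath '\\server01\c$\temp\virus.bat'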
If you don't know the exact names and paths of the files to check, you can use Get-ChildItem, similar to the code you provided. The caveat is that you have to narrow the scope of the search as much as you can. In your example you search for the .bat extension across the whole machine, and that can cause issues: a typical C drive holds hundreds of thousands, if not millions, of files and folders, and parsing all of them can take a long time.
This is an important distinction to understand, and what causes confusion for me is that in step 2 you say "Search for a specific list of paths\filenames that I provide", yet in the code you use Get-ChildItem to get all files instead of providing a list of filenames.
Going further, I will assume you have a list of filenames with exact, known paths.
Now in your given code I can see you have found some of the right commands but they need to be arranged differently to produce the results you need.
Please review this example code that might help you further:
Example ComputerList.txt file content (a list of computer hostnames to check):
computer1
serverXYZ
laptop123
Example FileList.txt file content (a list of files to check for on each of the above computers):
c:\temp\virus.bat
c:\games\game.exe
c:\Pictures\personal.jpg
Now the PowerShell code:
# Gets the list of items from TXT files and saves them as arrays in variables
$ComputerNames = Get-Content 'c:\temp\ComputerList.txt'
$FileList = Get-Content 'c:\temp\FileList.txt'
# Define the path and name of CSV report
$BadFiles = "c:\temp\badfiles.csv"
# Define the foreach loop that will iterate through each hostname in computer list
foreach($computer in $ComputerNames){
# Define foreach loop that will iterate through each file in the list and test their path in the current computer in the current iteration
foreach($file in $FileList){
# Test the path of the current file in the loop and append the CSV file if it was found
# Convert the file path to C$ share path
$file = $file -replace("c:","c$")
# Define path of file to test
$FileToTest = "\\$computer\$file"
if (test-path $FileToTest -ErrorAction SilentlyContinue){
# This block will run only when a bad file has been found
# This part can be tricky but it is used to make sure we properly format the current bad file entry when we append it to the resulting CSV file
$BadFile = "" | select Computer,File
# Save information about current computer
$BadFile.computer = $computer
# Save information about current file
$BadFile.file = $file
# Append the entry to an array of found bad files
$BadFileList += $badfile
}
}
}
# When done iterating through every computer and file, save the results in a CSV file
$BadFileList | ConvertTo-Csv -NoTypeInformation | Out-File $BadFiles
The above is a full code snippet you can test and run in your environment. First, please create the two TXT files, and make sure you run PowerShell with the appropriate permissions to access the C$ network shares of the servers.
The snippet should work but I have not tested it myself. Let me know if there are any errors.
Please test and feel free to ask if you have any follow up questions.
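As a side note, on PowerShell 3.0 and later the "" | Select-Object trick can be swapped for a [PSCustomObject] literal, which is faster and easier to read. A sketch of how the inner block could look (same variable names as above; collect the loop output into $BadFileList instead of using +=):
if (Test-Path $FileToTest -ErrorAction SilentlyContinue) {
    # Emit a ready-made object with the right columns for the CSV
    [PSCustomObject]@{
        Computer = $computer
        File     = $file
    }
}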

Downloading and opening a series of image URLs

What I am trying to do is download 2 images from URLs and open them after download. Here's what I have:
#echo off
set files='https://cdn.suwalls.com/wallpapers/cars/mclaren-f1-gtr-42852-400x250.jpg','http://www.dubmagazine.com/home/media/k2/galleries/9012/GTR_0006_EM-2014-12-21_04_GTR_007.jpg'
powershell "(%files%)|foreach{$fileName='%TEMP%'+(Split-Path -Path $_ -Leaf);(new-object System.Net.WebClient).DownloadFile($_,$fileName);Invoke-Item $fileName;}"
I'm getting "Cannot find drive. A drive with the name 'https' cannot be found."
It's the Split-Path command that is having problems, but I can't seem to find a solution.
You could get away with basic string manipulation but, if the option is available, I would opt for anything that is data-aware. In your case you could use the [uri] type accelerator to help with this. I would also opt for pure PowerShell instead of splitting the work between batch and PowerShell.
$urls = 'https://cdn.suwalls.com/wallpapers/cars/mclaren-f1-gtr-42852-400x250.jpg',
    'http://www.dubmagazine.com/home/media/k2/galleries/9012/GTR_0006_EM-2014-12-21_04_GTR_007.jpg'
$urls | ForEach-Object {
    $uri = [uri]$_
    Invoke-WebRequest $_ -OutFile ([io.path]::Combine($env:TEMP, $uri.Segments[-1]))
}
Segments gets you the parts of the URL path, and the last one is a proper file name in your case. Combine() builds the target destination path for you. Feel free to add your Invoke-Item logic, of course.
This also lacks error handling for when a URL cannot be accessed, so be aware of that possibility. The code above was kept brief to give direction.
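For instance, a variant with the Invoke-Item step and minimal error handling might look like this (a sketch under the same assumptions as above, not production hardening):
$urls | ForEach-Object {
    $uri = [uri]$_
    $target = [io.path]::Combine($env:TEMP, $uri.Segments[-1])
    try {
        Invoke-WebRequest $uri -OutFile $target
        # Open the downloaded image with its default handler
        Invoke-Item $target
    }
    catch {
        Write-Warning "Failed to download $uri : $_"
    }
}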

Picking file names out of a website to download in PowerShell

Problem: I'm working on making a PowerShell script that will download the site's source code, find all the file targets, and then download said targets. I'm alright on authentication for the moment, so on my test website I enabled anonymous authentication, enabled directory browsing, and disabled all other default pages, so all I get is a list of files on my site. What I have so far is this:
$source = "http://testsite/testfolder/"
$webclient = New-Object system.net.webclient
$destination = "c:/users/administrator/desktop/test/"
$webclient.downloadstring($source)
The $webclient.downloadstring call will return basically the source code of my site, and I can see the files I want wrapped in the rest of the code. My question to you guys is: what is the best and/or easiest way of isolating the links I want so I can run a foreach command to download all of them?
Also, for extra credit, how would I go about adding code to download folders and the files within those folders from my site? I can at least make separate scripts to pull the files from each subfolder, but obviously it would be much nicer to get it all in one script.
If you are on PowerShell v3 the Invoke-WebRequest cmdlet may be of help.
To get an object representing the website:
Invoke-WebRequest "http://stackoverflow.com/search?tab=newest&q=powershell"
To get all the links in that website:
Invoke-WebRequest "http://stackoverflow.com/search?tab=newest&q=powershell" | select -ExpandProperty Links
And to just get a list of the href elements:
Invoke-WebRequest "http://stackoverflow.com/search?tab=newest&q=powershell" | select -ExpandProperty Links | select href
If you are on PowerShell v2 or earlier you'll have to create an InternetExplorer.Application COM object and use that to navigate the page:
$ie = new-object -com "InternetExplorer.Application"
# sleep for a second while IE launches
Start-Sleep -Seconds 1
$ie.Navigate("http://stackoverflow.com/search?tab=newest&q=powershell")
# sleep for a second while IE opens the page
Start-Sleep -Seconds 1
$ie.Document.Links | select IHTMLAnchorElement_href
# quit IE
$ie.Application.Quit()
Thanks to this blog post where I learnt about Invoke-WebRequest.
Update:
One could also download the website source like you posted and then extract the links from the source. Something like this:
$webclient.downloadstring($source) -split "<a\s+" | %{ [void]($_ -match "^href=[`'`"]([^`'`">\s]*)"); $matches[1] }
The -split part splits the source at every occurrence of <a followed by one or more whitespace characters. The result is an array, which I then pipe through a ForEach-Object block. There I match each chunk against a regexp that extracts the href value and outputs it.
If you want to do more with the output you can pipe it further through another block, as in the sketch below.
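For example, piping the extracted links straight into a download step (a sketch; it assumes the extracted hrefs are absolute URLs, which you should check first):
$webclient.downloadstring($source) -split "<a\s+" |
    ForEach-Object { if ($_ -match "^href=[`'`"]([^`'`">\s]*)") { $matches[1] } } |
    ForEach-Object {
        # Download each link into $destination, keeping the original file name
        $webclient.DownloadFile($_, (Join-Path $destination (Split-Path $_ -Leaf)))
    }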
