Windows batch file: download many small files fast

I have a list of urls (urls.txt):
https://example.com/1.webp
https://example.org/bar2.webp
... 10k more
Files vary in size from 1 KB to 100 KB.
How can I download these files quickly on a Windows 10 machine without installing any third-party software?
I need it to be a single file that the user can double-click, without installing any additional software.
It should work on any decently up-to-date Windows 10 PC. AFAIK that means the PowerShell version is 5.1.
Additional information.
I tried this:
powershell -Command "Invoke-WebRequest https://example.com/1.webp -OutFile 1.webp"
but it is extremely slow due to sequential execution.
So far this works in PowerShell fast enough:
Get-Content .\urls.txt | ForEach-Object {
    $FileName = Split-Path -Leaf $_
    Invoke-WebRequest $_ -OutFile $FileName
}
But I can't figure out how to invoke this script with a double-click on a file.
Invoking .ps1 file from a .bat file doesn't work. Error:
download.ps1 cannot be loaded because running scripts is disabled on this system.
Asking user to adjust permissions is not an option.
This works in a clickable .bat file:
powershell -command ^
Invoke-WebRequest https://example.com/1.webp -OutFile 1.webp;
But this script fails silently:
powershell -command ^
Get-Content .\urls.txt |ForEach-Object { ^
$FileName = Split-Path -leaf $_ ^
Invoke-WebRequest $_ -OutFile $FileName ^
} ^

"...how do I iterate over a file lines with it? Sry, I never used Windows" (that must feel like me after a Linux machine).
Open a PowerShell prompt (Start → Run → PowerShell) or just type PowerShell.exe on the command prompt.
At the PowerShell prompt, to run the task in parallel using ForEach-Object -Parallel:
1..9 | ForEach-Object -Parallel { Invoke-WebRequest "https://example.com/$_.webp" -OutFile "$_.webp" }
Where "$_" is the current item (1to9`), you might also use a list here, like:
'One', 'Two', 'Three' |ForEach-Object -Parallel { ...
In case you "need to read it directly from the file", (presuming that you want use the name in the url as your filename) you might do something like this:
Get-Content .\urls.txt | ForEach-Object -Parallel {
    $FileName = Split-Path -Leaf $_
    Invoke-WebRequest $_ -OutFile $FileName
}
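Note that ForEach-Object -Parallel runs only 5 items at a time by default; you can raise (or lower) that with the -ThrottleLimit parameter. A minimal sketch, assuming PowerShell 7+ and that the files should land in the current directory:
# Run at most 10 downloads at a time; -ThrottleLimit caps the number of parallel runspaces
Get-Content .\urls.txt | ForEach-Object -Parallel {
    $FileName = Split-Path -Leaf $_
    Invoke-WebRequest $_ -OutFile $FileName
} -ThrottleLimit 10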
Update
(based on the additional information in your question and comments in this answer)
Final steps to make your command line easy to launch for a novice user, taking into account that passing "complex" commands with special characters (such as newlines, spaces and double quotes) from a batch file interpreter to PowerShell is quite a hassle, as there are a lot of exceptions on the exceptions (see the related Stack Overflow questions on this).
In your case it might be as simple as putting your commands on a single (quoted) command line and separating each statement with a semicolon (;):
powershell -command "Get-Content .\urls.txt |ForEach-Object { $FileName = Split-Path -leaf $_; Invoke-WebRequest $_ -OutFile $FileName }"
But to be on the safe side (in case e.g. a PowerShell command or parameter needs to be quoted itself), I would rather supply a more robust solution, which is encoding your command line to Base64 and using the -EncodedCommand parameter. See also these answers:
running powershell as shell command having error in StartTime variable for FilterHashtable
Pass complex arguments to powershell script through encoded command
Encoding
To encode your command line to base64:
$Command = {
    Get-Content .\urls.txt | ForEach-Object {
        $FileName = Split-Path -Leaf $_
        Invoke-WebRequest $_ -OutFile $FileName
    }
}.ToString()
$Bytes = [System.Text.Encoding]::Unicode.GetBytes($Command)
[Convert]::ToBase64String($Bytes)
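To double-check the result, you can decode the Base64 string back to the original command text with the same .NET classes:
$Encoded = [Convert]::ToBase64String($Bytes)
# Decoding should print the original command block again
[System.Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($Encoded))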
Download.bat
To include the encoded command line in a (single) batch file, add the following command to your batch file, where the Base64 string is copied from the ToBase64String conversion above:
PowerShell -EncodedCommand CgAgACAAIAAgACAAIAAgACAARwBlAHQALQBDAG8AbgB0AGUAbgB0ACAALgBcAHUAcgBsAHMALgB0AHgAdAAgAHwARgBvAHIARQBhAGMAaAAtAE8AYgBqAGUAYwB0ACAAewAKACAAIAAgACAAIAAgACAAIAAgACAAIAAgACQARgBpAGwAZQBOAGEAbQBlACAAPQAgAFMAcABsAGkAdAAtAFAAYQB0AGgAIAAtAGwAZQBhAGYAIAAkAF8ACgAgACAAIAAgACAAIAAgACAAIAAgACAAIABJAG4AdgBvAGsAZQAtAFcAZQBiAFIAZQBxAHUAZQBzAHQAIAAkAF8AIAAtAE8AdQB0AEYAaQBsAGUAIAAkAEYAaQBsAGUATgBhAG0AZQAKACAAIAAgACAAIAAgACAAIAB9AAoAIAAgACAAIAA=

You could try the ForEach-Object -Parallel method for this case; I tried something similar once with multiple process starts for robocopy to get around 1000 small files (5-10 KB) onto another hard drive.
I will look it up if I can find it again.
Edit 1: you can go over it like this, for example (assuming your csv holds one url per row in a column named url; adjust the property name to your file):
$allmylinks = Import-Csv -Path "path to your csv"
# ForEach-Object -Parallel needs PowerShell 7+
$allmylinks | ForEach-Object -Parallel {
    Invoke-WebRequest $_.url
}

Related

Using ruby variable in a PowerShell script with winrm

EDIT:
I would like to count files/folders of storage containers via virtual machine manager PowerShell cmdlet.
I went over similar questions, but still am struggling with syntax.
I have a ruby script that is executing a PowerShell script on a remote server.
I want to use a Ruby variable within the PowerShell script.
For example
path_str = "\\XXX\YYY\" #This is the ruby var
PSoutput = shell.run(" #This part is executing the PS script
$Files = Get-ChildItem -Recurse -File -Path #{path_str} | Measure-Object | %{$_.Count}" | stdout
How do I use the ruby variable path_str with the PS script?
I have tried
# {path_str}
\" " + path_str + " \"
Double quotes and single quotes
Nothing worked for me.
There are a few things that I see causing the issues.
# is a reserved character in PowerShell. When you use #, anything after it is a comment.
You are assigning the output of Get-ChildItem to $Files. There will be no output from shell.run() to assign to PSoutput, because the output from the cmdlet is getting assigned to $Files.
Get-ChildItem is a PowerShell-specific command, not a command line / DOS command that you can execute from within the shell without first calling the PowerShell executable (this I am a little doubtful on, but quite sure it's correct).
What you can do from Ruby instead (this should work) is:
PSoutput = `powershell -Command "Get-ChildItem -Recurse -File -Path #{path_str} | Measure-Object | % {$_.Count}"`
Once the command executes, PSoutput should hold the total number of files under the path_str directory.

Copy files, and skip the ones in use

I am attempting to make a PowerShell script to run every night, to copy over PDF files from C:\share1 to C:\share2 and write to the event log.
My current file copy script bit looks like this:
Try
{
    Get-ChildItem -Path C:\Share1 | ForEach-Object { Copy-Item $_.FullName C:\share2 -Verbose }
    #Writes to event log as success.
}
Catch
{
    #Logs the event as failed
}
The issue I run into here is that the files being copied/replaced are in use.
When a file is in use, the script stops copying at that file with an error:
PS>TerminatingError(Copy-Item): "The process cannot access the file 'C:/share2/1.pdf' because it is being used by another process."
I would like to at least modify my script so it continues to copy the remaining files.
If I, for example, had 100 files and the 3rd one was in use, the entire transfer stops.
How can I modify my script to continue on remaining items?
You are looking for the common -ErrorAction parameter set to SilentlyContinue:
Get-ChildItem -Path C:\Share1 | ForEach-Object { Copy-Item $_.FullName C:\share2 -Verbose -ErrorAction SilentlyContinue }
Note: You don't need the Foreach-Object cmdlet here:
Get-ChildItem -Path C:\Share1 | Copy-Item -Destination C:\share2 -Verbose -ErrorAction SilentlyContinue
Note 2: As gvee mentioned, this will ignore all errors. Another option would be to use handle.exe from the Sysinternals suite to check whether there is an open handle.
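If you also want the per-file event log entries the question mentions, here is a minimal sketch that skips locked files but still records them; it assumes an event source named PdfCopy was registered beforehand (e.g. with New-EventLog), so treat the log and source names as placeholders:
Get-ChildItem -Path C:\Share1 -Filter *.pdf | ForEach-Object {
    try {
        Copy-Item -LiteralPath $_.FullName -Destination C:\share2 -ErrorAction Stop
        Write-EventLog -LogName Application -Source PdfCopy -EntryType Information -EventId 1 -Message "Copied $($_.Name)"
    }
    catch {
        # Typically "file in use"; log the failure and continue with the next file
        Write-EventLog -LogName Application -Source PdfCopy -EntryType Warning -EventId 2 -Message "Skipped $($_.Name): $_"
    }
}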

Powershell output doesn't log properly when called from Task Scheduler

I need to log my PowerShell output. My .ps1 file is something like this:
#Set-ExecutionPolicy Unrestricted
$ErrorActionPreference="SilentlyContinue"
Stop-Transcript | out-null
$ErrorActionPreference = "Continue"
$date = (Get-Date).tostring("MMddyy HHmmss")
$filename = 'C:\apierror\logs\' + $date + '.txt'
Start-Transcript -path $filename -append
$python = "C:\Python34\python.exe"
$python_path = "C:\script.py"
cd (split-path $python_path)
& $python $python_path
Stop-Transcript
Now, when I run this file directly from PowerShell, the output is logged correctly. But when I try to run it from Task Scheduler, only some portion of the console output is stored in the file.
Any ideas why that might be?
Using a transcript only stored partial output for some reason. I ended up logging directly in the Python file instead of in PowerShell. Seems to be working correctly.
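An alternative sketch, in case someone wants to keep the logging in PowerShell: drop Start-Transcript entirely and redirect all output streams of the Python call to the log file yourself, which behaves the same whether the script runs interactively or under Task Scheduler (this reuses the variables from the script above):
# Instead of Start-Transcript/Stop-Transcript, append every stream of the call to the log file
& $python $python_path *>> $filename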

PowerShell's Start-Process command doesn't start an exe anywhere outside PowerShell ISE

I'm writing a simple script to unarchive (rar) a project from Teamcenter to a temp directory, then run a specific program (Mentor), then archive it again.
I've read a lot of examples about starting an exe from PS, but they mostly relate to small exes like notepad, without DLLs and other resources.
In PowerShell ISE the script works perfectly. But when I call the script from Teamcenter, Mentor is missing DLLs.
Before I run Mentor, in the script, I do:
Get-ChildItem Env:
to check environment variables, and all variables exist. I tried to set environment variables manually, like this:
$wf_classpath = Get-ChildItem Env:WF_CLASSPATH
[System.Environment]::SetEnvironmentVariable("WF_CLASSPATH", $wf_classpath.Value, "Process")
Does not work.
I tried to set homefolder:
$mentor = Start-Process $file.FullName -Wait -WorkingDirectory $workdir
Does not work.
Then I tried to call a batch file with the environment variables from the script; does not work.
Tried calling cmd.exe /c ...; does not work.
Full script below; it works perfectly only in PowerShell ISE. If I call the script from other programs, the exe does not start.
$shell = new-object -com shell.application
$invocation = $MyInvocation.MyCommand.Definition
$rootpath = $PSScriptRoot
$outpath = "$($PSScriptRoot)\out"
$pathtorar = "c:\Siemens\menutils\Rar.exe"
Remove-Item -Recurse -Force $outpath
New-Item $outpath -ItemType directory
$archive = get-childitem $rootpath | where { $_.extension -eq ".rar" } | Select-Object -First 1
$arglist = "x $($archive.FullName) $($outpath)"
Start-Process -FilePath $pathtorar -ArgumentList $arglist -Wait
Remove-Item -Recurse -Force $archive.FullName
$file = get-childitem $outpath -Recurse | where { $_.extension -eq ".prj" } | Select-Object -First 1
Write-Host "$(get-date -Format yyyy-MM-dd-hh-ss)
Start process: $($file.FullName)"
$mentor = Start-Process $file.FullName -Wait
$arglist = "a -m0 -r -ep1 $($archive.FullName) $($outpath)"
Start-Process -FilePath $pathtorar -ArgumentList $arglist -Wait
Remove-Item -Recurse -Force $outpath
Read-Host -Prompt "Press Enter to exit"
What's the difference between running the script from Powershell Ise and other programs?
How should I set environment variables to run the script from other scripts/programs?
It's probably that your current directory is not correct, and WorkingDirectory in my experience is buggy. The DLLs will be obtained from the current directory if they are not in the regular system paths.
Use this call before Start-Process:
[IO.Directory]::SetCurrentDirectory($Dir)
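In the script above that could look roughly like this (a sketch, assuming the DLLs live next to the .prj file):
# Make the process current directory match the project folder before launching Mentor
[IO.Directory]::SetCurrentDirectory((Split-Path $file.FullName))
$mentor = Start-Process $file.FullName -Wait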

How to condense PowerShell script to fit on a single line

Quick question. I am trying to write the following PowerShell script, but I would like it to fit on a single line:
$app = New-Object -comobject Excel.Application
$wb1 = $app.Workbooks.Open("C:\xampp\upload_files\Launchpad.xlsm")
$app.Run("Refresh")
$wb1.Close($false)
$app.Quit()
The pseudo-code would look something like this:
$app = New-Object -comobject Excel.Application AND $wb1 = $app.Workbooks.Open("C:\xampp\upload_files\Launchpad.xlsm") AND $app.Run("Refresh") AND $wb1.Close($false) AND $app.Quit()
The reason I want it to fit on one line is that I would like to insert the arguments directly in the 'arguments' box of Windows Task Scheduler. I need this because script execution has been disabled for some reason (e.g. I cannot call a .ps1 file...).
I know this will still work, as I already have a "one liner" PS script running. What would the syntax look like?
Kind regards,
G.
PowerShell statements can be separated with semicolons:
$app = New-Object -COM 'Excel.Application'; $wb1 = $app.Workbooks.Open("..."); ...
The PowerShell executable takes a -Command parameter that allows you to specify a command string for execution in PowerShell:
powershell.exe -Command "stmnt1; stmnt2; ..."
To run this via Task Scheduler you'd put powershell.exe into the "program" field and -Command "stmnt1; stmnt2; ..." into the "arguments" field of the task.
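With the asker's five statements joined that way, the arguments field would look roughly like this (single quotes inside the string avoid having to escape the surrounding double quotes):
-Command "$app = New-Object -ComObject Excel.Application; $wb1 = $app.Workbooks.Open('C:\xampp\upload_files\Launchpad.xlsm'); $app.Run('Refresh'); $wb1.Close($false); $app.Quit()"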
However, as @alroc said: you should verify why script execution has been restricted. If it's just the default setting, you can simply change it by running Set-ExecutionPolicy RemoteSigned or override it by adding -ExecutionPolicy Bypass to a PowerShell command line. However, if the setting is enforced by policy, changing/bypassing it will fail, and you could get into quite some trouble for violating company policies.
Here is a solution that you might use if the script is not that easy to convert, but you are on Windows running at least PowerShell V5.
It converts the code into Base64 and uses PowerShell.exe with the -encodedCommand parameter to pass the encoded command as a string.
$command = Get-Content .\YourPowerShellFileContainingTheCode.ps1 -raw
# Get-Content may require "-encoding utf8" or other encodings depending on your file
$encodedCommand = [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($command))
Write-Output "Text for application:"
Write-Output "PowerShell.exe" ""
Write-Output "Text for argurments:"
Write-Output "-encodedCommand $encodedCommand"
It would look like this, but with a much larger command:
Text for application:
PowerShell.exe
Text for arguments:
-encodedCommand SABvACAASABvACAASABvACwAIABzAHQAYQBjAGsAbwB2AGUAcgBmAGwAbwB3AA==
