Curl behaves inconsistently in Windows PowerShell - windows

I'm trying to write a Windows PowerShell script, but when I run, for example, curl wttr.in, I get the expected output; however, when I do $a=curl wttr.in;echo $a, I just get gibberish. I'm using the curl executable located in C:\Windows\System32\, since I removed the default PowerShell aliases related to Invoke-WebRequest (curl and wget). Is there something I'm doing wrong?
Here is what I mean:
curl wttr.in (expected output)
$a=curl wttr.in;echo $a (wrong output)

I believe it has to do with encoding. A workaround is to simply add Out-String when capturing:
$a = C:\Windows\system32\curl.exe wttr.in | Out-String
$a
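As an aside, without Out-String the captured value is an array of lines rather than a single string, which you can confirm yourself (a quick diagnostic sketch):
$a = C:\Windows\system32\curl.exe wttr.in
$a.GetType().FullName   # System.Object[] - one element per output line
$a.Count                # number of lines captured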

I could not test it (the response was "no more querys"), but you can also force the output into a specific encoding:
Encode a string in UTF-8
It may take some testing to find the right encoding.
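For instance, a minimal sketch that forces the console decoding to UTF-8 before capturing (untested, as noted; the original setting is saved and restored):
# Tell PowerShell to decode curl.exe's output as UTF-8.
$prev = [Console]::OutputEncoding
[Console]::OutputEncoding = [Text.UTF8Encoding]::new()
$a = C:\Windows\system32\curl.exe wttr.in | Out-String
[Console]::OutputEncoding = $prev
$a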

Related

What's the reason for the file encoding of a redirected Ruby IO stream in Windows PowerShell?

I have a strange problem. On Windows 7, I try to run this command in PowerShell:
ruby -E UTF-8 -e "puts 'どうぞよろしくお願いします,Mr Jason'" > test.txt
When I read the test.txt file:
ruby -E UTF-8 -e "puts gets" < test.txt
the result is:
�i0F0^0�0�0W0O0J0X�D0W0~0Y0,Mr Jason
I checked the test.txt file and found that its encoding is Unicode (i.e. UTF-16), not UTF-8.
What should I do?
How should I ensure the encoding of the output file after redirection? Please help me.
tl;dr
Unfortunately, the solution (on Windows) is much more complicated than one would hope:
# Make PowerShell both send and receive data as UTF-8 when talking to
# external (native) programs.
# Note:
# * In *PowerShell (Core) 7+*, $OutputEncoding *defaults* to UTF-8.
# * You may want to save and restore the original settings.
$OutputEncoding = [Console]::OutputEncoding = [Text.UTF8Encoding]::new()
# Create a BOM-less UTF-8 file.
# Note: In *PowerShell (Core) 7+*, you can less obscurely use:
# ruby -E UTF-8 -e "puts 'どうぞよろしくお願いします,Mr Jason'" | Set-Content test.txt
$null = New-Item -Force test.txt -Value (
ruby -E UTF-8 -e "puts 'どうぞよろしくお願いします,Mr Jason'"
)
# Pipe the resulting file back to Ruby as UTF-8, thanks to $OutputEncoding
# Note that PowerShell has NO "<" operator - stdin input must be provided
# via the pipeline.
Get-Content -Raw test.txt | ruby -E UTF-8 -e "puts gets"
In terms of character encoding, PowerShell communicates with external (native) programs via two settings that contain .NET System.Text.Encoding instances:
$OutputEncoding specifies the encoding to use to send data TO an external program via the pipeline.
[Console]::OutputEncoding specifies the encoding used to interpret (decode) data FROM an external program('s stdout stream); for decoding to work as intended, this setting must match the external program's actual output encoding.
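You can inspect the current values of both settings directly:
$OutputEncoding            # encoding for data sent TO external programs
[Console]::OutputEncoding  # encoding used to decode data FROM external programs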
As of PowerShell 7.3.1, PowerShell only "speaks text" when communicating with external programs, and an intermediate decoding and re-encoding step is invariably involved - even when you're just using > (effectively an alias of the Out-File cmdlet) to send output to a file.
That is, PowerShell's pipelines are NOT raw byte conduits the way they are in other shells.
See this answer for workarounds and potential future raw-byte support.
Whatever output operator (>) or cmdlet (Out-File, Set-Content) you use will apply its own default character encoding, which is unrelated to the encoding of the original input; by the time the operator / cmdlet operates on the data, it has already been decoded into .NET strings.
> / Out-File in Windows PowerShell defaults to "Unicode" (UTF-16LE) encoding, which is what you saw.
While Out-File and Set-Content have an -Encoding parameter that allows you to control the output encoding, in Windows PowerShell they don't allow you to create BOM-less UTF-8 files. Curiously, New-Item does create such files, which is why it is used above. If a UTF-8 BOM is acceptable, ... | Set-Content -Encoding utf8 will do in Windows PowerShell.
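If in doubt, you can check for a UTF-8 BOM by inspecting the file's first three bytes (a minimal sketch):
$bytes = [System.IO.File]::ReadAllBytes("$PWD\test.txt")
# $true if the file starts with the UTF-8 BOM (0xEF 0xBB 0xBF):
$bytes.Length -ge 3 -and $bytes[0] -eq 0xEF -and $bytes[1] -eq 0xBB -and $bytes[2] -eq 0xBF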
Note that, by contrast, PowerShell (Core) 7+, the modern, cross-platform edition, now thankfully consistently defaults to BOM-less UTF-8.
That said, with respect to [Console]::OutputEncoding on Windows, it still uses the legacy OEM code page by default as of v7.3.1, which means that UTF-8 output from external programs is by default misinterpreted - see GitHub issue #7233 for a discussion.
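As the note in the tl;dr code suggests, you may want to save and restore the original settings around the section that needs UTF-8; a minimal sketch:
$prevOut = $OutputEncoding
$prevConsole = [Console]::OutputEncoding
$OutputEncoding = [Console]::OutputEncoding = [Text.UTF8Encoding]::new()
try {
    # ... call external programs here ...
} finally {
    $OutputEncoding = $prevOut
    [Console]::OutputEncoding = $prevConsole
}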

Curl in PowerShell on Windows 10 slower than in Command Prompt - why?

Just a pretty standard curl command calling an S3 endpoint for a download, using all default values. On a Mac, or on a PC using the command line, I get 103 MB/sec if cached on the CDN and 80 MB/sec otherwise. Same command, same bucket, same object, using "curl.exe", and I get 1 MB/sec when calling through PowerShell. I guess PowerShell does something different that makes it totally slow? I tried using the newest curl binary, but it's still the same.
I guess I am misunderstanding what PowerShell is doing when I use a curl command:
curl.exe yourfileonS3 >> output.bin
To complement briantist's helpful answer:
In PowerShell, the redirection operators > and >> are in effect aliases of Out-File and Out-File -Append.
> and >> are therefore not mere byte-stream conduits, and, in fact, PowerShell as of v7.2 does not support sending raw byte output to a file.
Instead, PowerShell invariably decodes output from any external program as text ([string] instances), based on the character encoding reported by [Console]::OutputEncoding, and then, on saving to the target file with Out-File (possibly via > / >>), re-encodes these strings using that cmdlet's default character encoding (unless overridden with -Encoding in an explicit Out-File call).
Not only does this fail to preserve the external program's raw byte output, it adds significant overhead.
To get raw byte processing, call cmd.exe[1] and use its redirection operators:
cmd /c 'curl.exe yourfileonS3 >> output.bin'
See this answer for more information.
[1] On Unix-like platforms, use sh -c 'curl yourfileonS3 >> output.bin'
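Alternatively, since curl can write directly to a file with its -o / --output option, you can bypass PowerShell's pipeline (and its decoding and re-encoding) entirely; note that unlike >>, this overwrites rather than appends:
curl.exe yourfileonS3 -o output.bin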
See mklement0's answer for full context on this (I recommend accepting that one!), and for the important point that handling of byte streams in redirection is problematic and error-prone in PowerShell and should be avoided.
So I looked into this and I believe the reason is that >> (file redirection) is the slow part.
I originally suspected you might be calling curl (which is aliased to Invoke-WebRequest in Windows PowerShell), but I was able to reproduce the speed difference between calling curl.exe directly in PowerShell and calling it via cmd.exe, and to measure it, this way:
# call curl.exe and do redirection in PowerShell
Measure-Command -Expression { curl.exe https://uploader.codecov.io/v0.1.0_6943/linux/codecov >> delme.bin }
del delme.bin
# call cmd.exe and do redirection there
Measure-Command -Expression { & cmd.exe /c 'curl.exe https://uploader.codecov.io/v0.1.0_6943/linux/codecov >> delme.bin' }
del delme.bin
This was enough to show a stark difference.
I also confirmed that this problem is a little bit worse in Windows PowerShell as opposed to later cross-platform versions (pwsh.exe). In Windows, with version 7.1.0, the same commands above still show a large difference.

Return results for Ubuntu Updates in a Variable

I am trying to get the result of /usr/lib/update-notifier/apt-check on an Ubuntu 16 server into an array, to build an XML response for a monitoring tool, but somehow the value of this apt-check just refuses to end up in my variable. For simplicity's sake, I have omitted the XML creation part.
#!/bin/bash
APTCHECK="/usr/lib/update-notifier/apt-check"
APTResult="$(${APTCHECK})"
echo "Result is $APTResult"
exit 0
If you now run this code with bash -x, you will see that the result is printed to the terminal, but not assigned to the variable. If I substitute the command with something simple like ls -lah, everything works fine.
I just don't know why this is not working. Anybody?
apt-check prints to stderr, so you need to capture that instead, with aptresult=$(/usr/lib/update-notifier/apt-check 2>&1).
The other option is the --human-readable switch, which prints to stdout. The only problem then is that you have to parse the text output (unless the text output is what you actually want).

Batch scripting: Why doesn't redirecting stdout work in this scenario?

If you open up a command prompt and type this:
echo foobar > nul
it will print nothing, since nul swallows all of its input. But if you run the command with PowerShell:
powershell "echo foobar" > nul
it will output foobar to the console. Why is this, and what can I do to fix it?
edit: Here is the output of $PSVersionTable. It looks like I'm using PowerShell v5.0.
Note: I'm assuming you're invoking your command from cmd.exe, not from within PowerShell, which is consistent with the symptoms I'm seeing.
Methinks you've stumbled upon a bug in PS (PowerShell) v5 (not present in v3; comments on the question suggest it's also not in v4), though I don't fully understand why PS is to blame, because I'd expect cmd.exe to handle the redirection.
I may be missing something, however, so do let me know.
PowerShell should send its so-called success stream - things output by default, including with echo, which is an alias of Write-Output - to the outside world's stdout.
In older PS versions, >NUL effectively suppresses PowerShell's output.
Curiously, the bug in v5 only affects NUL, whereas redirecting to an actual file works.
As for workarounds:
If your code is v2-compatible, try this:
powershell -version 2 "echo foobar" > NUL
Otherwise, redirect to an actual file and delete that file afterward:
powershell "echo foobar" > "%TEMP%\NUL-bug-workaround" & del "%TEMP%\NUL-bug-workaround"

Does PSCP work with PowerShell?

I have a PowerShell script that produces a text file. At the end, I would like to copy this file to a Linux server.
From CMD.EXE, I can use PSCP (from PuTTY); it works and copies the file.
But from PowerShell, whether interactively or from a PowerShell script, PSCP has no visible effect: no error messages, and the file is not copied.
Even if I simply run .\PSCP.EXE without arguments, on the CMD command line it displays the options, but from PowerShell it does nothing.
Can PSCP be used from inside PowerShell?
Executing a program from within PowerShell should work identically to CMD, but depending upon how that program produces its output (does it write to STDOUT, STDERR, or something else?), it may behave differently.
I've been using Rebex's components for FTPS & SFTP within .NET apps & PowerShell scripts; the SFTP package includes an SCP class. Yes, it costs money, but depending upon your usage it may be worthwhile.
I just attempted to automate PSCP from PowerShell. Remember to use pscp's -batch parameter so that, should you do something like enter the wrong password, you won't get asked for input.
$Cmd = "pscp -l username -pw password -batch c:\folder\file.txt server:/home/user1"
Invoke-Expression "& $( $Cmd )"
Otherwise your script will just grind to a halt.
Yes - most any executable can be called from PowerShell. There isn't anything peculiar about pscp.exe in this regard. You may need to preface it with the call operator - the ampersand - &:
PS C:\>& "C:\Program Files (x86)\Putty\pscp.exe" -V
pscp: Release 0.62
The above is direct output from my PowerShell prompt. The call operator is particularly helpful if the path to your executable contains spaces - the call operator is used to tell PowerShell to treat what would be considered a string as something it should try to execute instead.
Please include the full command you are trying to execute, as it will help in providing a better answer. You may have a problem with your PATH variable or something else weird if you don't get any output.
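If you need to pass arguments, a pattern that is more robust than building a command string for Invoke-Expression (as in the answer above) is to keep the arguments in an array and splat them; a sketch with placeholder paths and credentials:
$pscp = 'C:\Program Files (x86)\Putty\pscp.exe'
$pscpArgs = '-batch', '-l', 'username', '-pw', 'password',
            'c:\folder\file.txt', 'server:/home/user1'
& $pscp @pscpArgs   # array splatting passes each element as a separate argument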
If using pscp from inside a script, e.g. Perl:
no ampersand
quote like this "my password"
e.g.
"C:\Program Files\Putty\pscp.exe" -C -p -pw "password" /local_dir/file_to_copy user#hostname:/remote_directory
in Perl (beware that \ is an escape character in a "string"):
$cmd = q("C:\Program Files\Putty\pscp.exe" -C -p -pw "password" /local_dir/file_to_copy user@hostname:/remote_directory);
system($cmd);
