WebClient PowerShell download returns an empty file (image)

I am trying to download the image below into a local directory, but the resulting image appears empty.
http://www.capecountyhomes.com/Homes/Images/Listings/77977723/1/Photo.jpg
This seems like a simple task, but for some reason the resulting image is 0 bytes. Below is my code:
$wc = new-object System.Net.WebClient
$wc.DownloadFile("http://www.capecountyhomes.com/Homes/Images/Listings/77977723/1/Photo.jpg","C:/images/test.jpeg")
But the image test.jpeg in my local directory is 0 bytes.
Does anyone know what the issue could be?
Thanks

It seems to have something to do with the headers; if you add a User-Agent, the download works:
$wc = new-object System.Net.WebClient
$wc.Headers["User-Agent"] = "Safari"
$wc.DownloadFile("http://www.capecountyhomes.com/Homes/Images/Listings/77977723/1/Photo.jpg","C:/images/test.jpeg")
Requests with a blank User-Agent are probably denied access as a measure against scraping/spam.
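For comparison, the same fix sketched with Python's urllib (purely illustrative; `build_request` and `download` are made-up helper names, not part of the original answer):

```python
import urllib.request

def build_request(url, user_agent="Safari"):
    # Attach an explicit User-Agent so servers that reject blank
    # agents will serve the file.
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def download(url, dest):
    # Send the request and write the response body to disk.
    with urllib.request.urlopen(build_request(url)) as resp, open(dest, "wb") as out:
        out.write(resp.read())
```

The only difference from a plain `urlretrieve` call is the header attached before the request goes out, which mirrors setting `$wc.Headers["User-Agent"]` above.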

The URL no longer exists so I can't test, but my guess is the file really is 0 bytes in size.


Downloading/Installing Visual Studio 2019 with Powershell

I have this code. It is supposed to download Visual Studio via a PowerShell script:
$url = "https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=professional&rel=16&utm_medium=microsoft&utm_source=learn.microsoft.com&utm_campaign=link+cta&utm_content=download+commandline+parameters+vs2019"
New-Item -Path 'C:\dev\pub\vs' -ItemType Directory -force
$downloadPath = "C:\dev\pub\vs"
$filePath = "C:\dev\pub\vs\vs_professional.exe"
#Invoke-WebRequest -URI $url -OutFile $filePath
$workloadArgument = @(
    '--add Microsoft.Net.Component.4.7.1.SDK'
    '--add Microsoft.VisualStudio.Component.Windows10SDK.17134'
    '--add Microsoft.Net.Component.4.7.1.TargetingPack'
)
$optionsAddLayout = [string]::Join(" ", $workloadArgument )
$optionsQuiet = "--quiet"
$optionsLayout = "--layout $downloadPath"
$optionsIncludeRecommended = "--includeRecommended"
$vsOptions = @(
    $optionsLayout,
    $optionsIncludeRecommended,
    $optionsAddLayout,
    $optionsQuiet
)
Start-Process -FilePath $filePath -ArgumentList $vsOptions
For some reason, Invoke-WebRequest isn't downloading the file it is supposed to. If you go to the link, a file gets downloaded automatically (the correct file), but the cmdlet saves the wrong file. I was wondering how I can get this to download the correct .exe so I can use Start-Process.
Thanks.
The issue is that it's not a direct link to the "actual" installer. When you go to the "Download" page that you mentioned:
https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=professional&rel=16&utm_medium=microsoft&utm_source=learn.microsoft.com&utm_campaign=link+cta&utm_content=download+commandline+parameters+vs2019
The code displays the page, and the page then kicks off a piece of JavaScript that makes the actual request to the "real" link (i.e. the website doesn't change, and all the real links are stored in a database somewhere).
To get the "real" link, start your web browser, open up the Developer Tools (F12). Go to the "Network" tab. Go to the "download" page that you have above. Wait till you get the download request. In the Network traffic, you should see one request for the vs_Professional.exe page. Then you can copy the link address to the direct download:
This gets you the direct link, which (caution) may change or break with different versions and releases, so you might have to grab a fresh link now and then. The link I got just now is:
https://download.visualstudio.microsoft.com/download/pr/e730a0bd-baf1-4f4c-9341-ca5a9caf0f9f/4358b712148b3781631ab8d0eea42af736398c8b44fba868b76cb255b3de7e7c/vs_Professional.exe
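Once you have the direct link, fetching it is the easy part. As an illustrative sketch (Python here rather than PowerShell; `download_file` is a made-up name), streaming in chunks keeps a large installer out of memory:

```python
import shutil
import urllib.request

def download_file(url, dest, chunk_size=1 << 16):
    # Stream the response to disk in fixed-size chunks instead of
    # buffering the whole installer in memory.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out, chunk_size)
```

With the direct vs_Professional.exe URL as input, the saved file can then be handed to Start-Process as in the question.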

How to completely skip writing files to disk with libtorrent downloads?

I have the following code to download a torrent off of a magnet URI.
import time
import libtorrent as lt

# lt.storage_mode_t(0)  # tried this, didn't work
ses = lt.session()
params = {'save_path': "/save/here"}
ses.listen_on(6881, 6891)
ses.add_dht_router("router.utorrent.com", 6881)
link = "magnet:?xt=urn:btih:395603fa..hash..."
handle = lt.add_magnet_uri(ses, link, params)
while not handle.has_metadata():
    time.sleep(1)
handle.pause()  # got metadata: pause and set priorities
handle.file_priority(0, 1)
handle.file_priority(1, 0)
handle.file_priority(2, 0)
print handle.file_priorities()
# output is [1, 0, 0]
# I checked: no files have been written to disk yet.
handle.resume()
while not handle.is_finished():
    time.sleep(1)  # wait for the download
It works. However, in this specific torrent there are 3 files: file 0 - 2 KB, file 1 - 300 MB, file 2 - 2 KB.
As can be seen from the code, file 0 has a priority of 1, while the rest have priority 0 (i.e. don't download).
The problem is that once file 0 finishes downloading, I want it to stop and not download anything else, but it will sometimes download file 1 partially - sometimes 100 MB, sometimes 200 MB, sometimes a couple of KB, and sometimes the entire file.
So my question is: how can I make sure only file 0 is downloaded, and not files 1 and 2?
EDIT: I added a check for whether I got the metadata, then set the priorities and then resumed, however this still downloads the second file partially.
The reason this happens is the race between adding the torrent (which starts the download) and setting the file priorities.
To avoid this you can set the file priorities along with adding the torrent, something like this:
p = lt.parse_magnet_uri(link)
p['file_priorities'] = [1, 0, 0]
handle = ses.add_torrent(p)
UPDATE:
You don't need to know the number of files; it's OK to provide file priorities for more files than end up being in the torrent file, and the extra ones will just be ignored. However, if you don't want to download anything (other than the metadata/.torrent) from the swarm, a better way is to set the flag_upload_mode flag. See the documentation.
p = lt.parse_magnet_uri(link)
p['flags'] |= lt.add_torrent_params_flags_t.flag_upload_mode
handle = ses.add_torrent(p)

Image downloaded with wget has size of 4 bytes

I have a problem with downloading certain image.
I'm trying to download image and save it on disk.
Here is the wget command that I'm using, and it works perfectly fine with almost every image (it works fine with this URL):
wget -O test.gif http://www.fmwconcepts.com/misc_tests/animation_example/lena_anim2.gif
Almost, because when I try to download the image from this URL: http://sklepymuzyczne24.pl/_data/ranking/686/e3991/ranking.gif
it fails: the downloaded file is 4 bytes. I tried doing this using curl instead of wget, but the results are the same.
I think that the second image (the one not working) might be generated somehow (the image changes automatically, depending on store reviews). I believe that this has something to do with the issue.
Looks like some kind of misconfiguration on the server side. It won't return the image unless you specify that you accept gzip compressed content. Most web browsers nowadays do this by default, so the image is working fine in browser, but for wget or curl you need to add accept-encoding header manually. This way you will get gzip compressed image. Then you can pipe it to gunzip and get a normal, uncompressed image.
You could save the image using:
wget --header='Accept-Encoding: gzip' -O- http://sklepymuzyczne24.pl/_data/ranking/686/e3991/ranking.gif | gunzip - > ranking.gif
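The same workaround sketched in Python (illustrative only; the function names are invented, and it assumes the server really does send a raw gzip body as described above):

```python
import gzip
import urllib.request

def gunzip_bytes(data):
    # Decompress a raw gzip body, mirroring the `| gunzip -` step above.
    return gzip.decompress(data)

def fetch_gzip_only_image(url, dest):
    # Explicitly offer gzip, since this server refuses requests without it.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        with open(dest, "wb") as out:
            out.write(gunzip_bytes(resp.read()))
```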

FTP: copy, check integrity and delete

I am looking for a way to connect to a remote server with ftp or lftp and perform the following steps:
Copy files from FTP server to my local machine.
Check if the downloaded files are fine (i.e. md5checksum).
If the download was fine then delete the downloaded files from the FTP server.
This routine will be executed each day from my local machine. What would be the best option to do this? Is there a tool that abstracts all 3 steps?
I am running Linux on both client and server machines.
Update: Additionally, I also have a text file that contains the association between the files on the FTP server and their MD5 sums. They were computed on the FTP server side.
First, make sure your remote server supports checksum calculation at all. Many do not. I believe there's not even a standard FTP command to calculate a checksum of a remote file. There have been many proposals, and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
So even if your server supports checksum calculation, you have to find a client that supports the same command.
Some of the commands that can be used to calculate checksum are: XSHA1, XSHA256, XSHA512, XMD5, MD5, XCRC and HASH.
You can test this with WinSCP, which supports all the previously mentioned commands. Test its checksum calculation function or the checksum scripting command. If they work, enable logging and check what command and what syntax WinSCP uses against your server.
Neither ftp (in either its Windows or *nix version) nor lftp supports checksum calculation, let alone automatic verification of a downloaded file.
I'm not even aware of any other client that can automatically verify a downloaded file.
You can definitely script it with a help of some feature-rich client.
I wrote this answer before the OP specified that they are on Linux. I'm keeping the Windows solution in case it helps someone else.
On Windows, you could script it with PowerShell using WinSCP .NET assembly.
param (
    $sessionUrl = "ftp://username:password@example.com/",
    [Parameter(Mandatory)]
    $localPath,
    [Parameter(Mandatory)]
    $remotePath,
    [Switch]
    $pause = $False
)

try
{
    # Load WinSCP .NET assembly
    Add-Type -Path (Join-Path $PSScriptRoot "WinSCPnet.dll")

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.ParseUrl($sessionUrl)

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        Write-Host "Downloading $remotePath to $localPath..."
        $session.GetFiles($remotePath, $localPath).Check()

        # Calculate remote file checksum
        $buf = $session.CalculateFileChecksum("sha-1", $remotePath)
        $remoteChecksum = [BitConverter]::ToString($buf)
        Write-Host "Remote file checksum: $remoteChecksum"

        # Calculate local file checksum
        $sha1 = [System.Security.Cryptography.SHA1]::Create()
        $localStream = [System.IO.File]::OpenRead($localPath)
        $localChecksum = [BitConverter]::ToString($sha1.ComputeHash($localStream))
        $localStream.Close()
        Write-Host "Downloaded file checksum: $localChecksum"

        # Compare checksums
        if ($localChecksum -eq $remoteChecksum)
        {
            Write-Host "Match, deleting remote file"
            $session.RemoveFiles($remotePath).Check()
            $result = 0
        }
        else
        {
            Write-Host "Does NOT match"
            $result = 1
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    $result = 1
}

# Pause if -pause switch was used
if ($pause)
{
    Write-Host "Press any key to exit..."
    [System.Console]::ReadKey() | Out-Null
}

exit $result
You can run it like:
powershell -file checksum.ps1 -remotePath ./file.dat -localPath C:\path\file.dat
This is partially based on WinSCP example for Verifying checksum of a remote file against a local file over SFTP/FTP protocol.
(I'm the author of WinSCP.)
The question was later edited to say that OP has a text file with a checksum. That makes it a completely different question. Just download the file, calculate local checksum and compare it to the checksum you have in the text file. If they match, delete the remote file.
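That download-verify-delete loop can be sketched in Python with ftplib and hashlib (a sketch under the assumption that the text file uses md5sum's "digest  filename" layout; the function names are made up):

```python
import ftplib
import hashlib
import os

def md5sum(path, chunk_size=65536):
    # MD5 of a local file, read in chunks to handle large files.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def parse_checksum_file(path):
    # Parse "digest  filename" lines into a dict of filename -> digest.
    sums = {}
    with open(path) as f:
        for line in f:
            digest, name = line.split(None, 1)
            sums[name.strip()] = digest.lower()
    return sums

def fetch_verify_delete(ftp, remote_name, local_path, expected_md5):
    # Download remote_name, verify it, and delete the remote copy on success.
    with open(local_path, "wb") as f:
        ftp.retrbinary("RETR " + remote_name, f.write)
    if md5sum(local_path) == expected_md5.lower():
        ftp.delete(remote_name)
        return True
    os.remove(local_path)  # discard the bad download; the remote copy stays
    return False
```

Driven by a small main that opens `ftplib.FTP(host, user, password)` and loops over `parse_checksum_file(...)`, this could run daily from cron.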
That's a long shot, but if the server supports PHP, you can exploit that.
Save the following as a PHP file (say, check.php) in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then visit check.php, and you should get the MD5 hash of your file.
Related questions:
Calculate file checksum in FTP server using Apache FtpClient
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691

Tailing a log file on a Windows 2008 server

I use the following command to tail a logfile on a w2k8 server from a windows8 client pc:
get-content "file" -wait
The log file shows up and the command sits there patiently waiting for new lines, but new lines never show up when they are added.
It worked fine on a w2k3 server, but somehow tailing on the w2k8 server does not work.
The log file is updated from a C# service:
Trace.Listeners.Add(new TextWriterTraceListener(logFileName, "fileListener"));
Trace.WriteLine(....)
Does anybody know what to do about this?
I repro'd the issue on my WS08 system using the Trace class. I tried both Trace.Flush() and writing lots of data (100 KB), and neither caused get-content -wait to respond.
However, I did find a workaround; you'd have to update your C# program. (While experimenting, I came to the conclusion that gc -wait is pretty fragile.)
$twtl = New-Object Diagnostics.TextWriterTraceListener "C:\temp\delme.trace", "filelistener"
[diagnostics.trace]::Listeners.add($twtl)
# This sequence would result in gc -wait displaying output
[diagnostics.trace]::WriteLine("tracee messagee thingee")
[diagnostics.trace]::flush()
# Here I did file name completion such that c:\temp\delme.trace was in the
# completion list.
# In other words I typed something like d and pressed tab
# And this worked every time
# After searching for quite a while I finally found that
# get-itemproperty c:\temp\d*
# produced the same effect. The following sequence always worked:
# (without requiring a tab press)
[diagnostics.trace]::WriteLine("tracee messagee thingee")
[diagnostics.trace]::flush()
get-itemproperty c:\temp\d*
# To finish up
[diagnostics.trace]::Listeners.remove($twtl)
$twtl.close()
$twtl.dispose()
I think there's a bug somewhere. I suggest filing it on the Connect website.
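If updating the C# producer isn't an option, another way around the fragile change-notification path is to poll the file size yourself. A sketch in Python (illustrative; `tail_follow` is a made-up name):

```python
import os
import time

def tail_follow(path, interval=1.0, start=None):
    # Poll the file for growth instead of relying on the change
    # notifications that Get-Content -Wait depends on.
    # start: byte offset to begin at; None means the current end of file.
    with open(path, "rb") as f:
        f.seek(os.path.getsize(path) if start is None else start)
        while True:
            chunk = f.read()
            if chunk:
                yield chunk
            else:
                time.sleep(interval)
```

Each yielded chunk is whatever bytes were appended since the last read, so new Trace.WriteLine output shows up as soon as the listener flushes it.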
