Download zip file on site that requires authentication using VBScript

I'm trying to use VBScript to download a ZIP file from a site that requires authentication. If you go to the site you'll notice it pops up an authentication prompt. The problem I have is that after this script runs, the ZIP file is much smaller than it should be and is corrupt, so I can't open it.
My guess is that the download isn't actually working.
Can anyone see what I'm doing wrong?
strHDLocation = "C:\Test\file1.zip"
Set xmlHttp = CreateObject("Microsoft.XMLHTTP")
xmlHttp.Open "GET", "http://downloadsite/report-id=123456", False, "myidhere", "mypwhere"
xmlHttp.Send()
Set objADOStream = CreateObject("ADODB.Stream")
objADOStream.Open
objADOStream.Type = 1 'adTypeBinary
objADOStream.Write xmlHttp.ResponseBody
objADOStream.Position = 0 'Set the stream position to the start
Set objFSO = Createobject("Scripting.FileSystemObject")
If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
Set objFSO = Nothing
objADOStream.SaveToFile strHDLocation
objADOStream.Close
Set objADOStream = Nothing

As a bare minimum, when using IXMLHTTPRequest you should check the Status property so that no assumptions are made about what is being returned.
If xmlHttp.Status = 200 Then
    'Successful request
Else
    'Something went wrong
End If
It's likely the request has failed for one reason or another and ResponseBody contains the failed response (an error page, for example) rather than the expected ZIP file.
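A minimal sketch of folding that check into the original script (the URL and credentials are the placeholders from the question; the statusText diagnostic is an addition and not part of the original answer):
strHDLocation = "C:\Test\file1.zip"
Set xmlHttp = CreateObject("Microsoft.XMLHTTP")
xmlHttp.Open "GET", "http://downloadsite/report-id=123456", False, "myidhere", "mypwhere"
xmlHttp.Send
If xmlHttp.Status = 200 Then
    'Only write the body to disk once we know the request succeeded
    Set objADOStream = CreateObject("ADODB.Stream")
    objADOStream.Open
    objADOStream.Type = 1 'adTypeBinary
    objADOStream.Write xmlHttp.ResponseBody
    objADOStream.Position = 0 'Set the stream position to the start
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    If objFSO.FileExists(strHDLocation) Then objFSO.DeleteFile strHDLocation
    Set objFSO = Nothing
    objADOStream.SaveToFile strHDLocation
    objADOStream.Close
    Set objADOStream = Nothing
Else
    'Inspect the failure instead of saving a corrupt file
    WScript.Echo "Download failed: " & xmlHttp.Status & " " & xmlHttp.statusText
End If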

Related

How Can I Ignore Internet Errors in a VBScript?

I have a VBScript that downloads a file. I need to make it so that if there is no internet connection it won't pop up the error messages "The operation timed out" or "Failed to find the resource specified". I've tried using On Error Resume Next, but alas it does not skip any internet-related errors. Is there any way I can set a timeout or something? It is not a large file, just a 20-line text file. Here is my script:
strFileURL = "https://minecraft-statistic.net/en/server/167.114.43.185_25565/json/"
strHDLocation = "c:\users\public\mc.txt"
Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", strFileURL, false
objXMLHTTP.send()
If objXMLHTTP.Status = 200 Then
    Set objADOStream = CreateObject("ADODB.Stream")
    objADOStream.Open
    objADOStream.Type = 1 'adTypeBinary
    objADOStream.Write objXMLHTTP.ResponseBody
    objADOStream.Position = 0 'Set the stream position to the start
    Set objFSO = Createobject("Scripting.FileSystemObject")
    If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
    Set objFSO = Nothing
    objADOStream.SaveToFile strHDLocation
    objADOStream.Close
    Set objADOStream = Nothing
End If
Set objXMLHTTP = Nothing
On Error Resume Next is the only option for capturing errors in VBScript; I'm not sure why you say it doesn't work. This works for me:
On Error Resume Next
strFileURL = "https://minecraft-statistic.net/en/server/167.114.43.185_25565/json/"
strHDLocation = "c:\users\public\mc.txt"
Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", strFileURL, false
objXMLHTTP.send()
If Err.Number = 0 Then
    If objXMLHTTP.Status = 200 Then
        Set objADOStream = CreateObject("ADODB.Stream")
        objADOStream.Open
        objADOStream.Type = 1 'adTypeBinary
        objADOStream.Write objXMLHTTP.ResponseBody
        objADOStream.Position = 0 'Set the stream position to the start
        Set objFSO = Createobject("Scripting.FileSystemObject")
        If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
        Set objFSO = Nothing
        objADOStream.SaveToFile strHDLocation
        objADOStream.Close
        Set objADOStream = Nothing
    End If
    Set objXMLHTTP = Nothing
Else
    'Handle the error here.
End If
The way On Error Resume Next works is as follows:
1. A line triggers an error, which is caught by the VBScript runtime.
2. The error is recorded in the Err object.
3. The offending line is skipped and the next statement is run.
This process continues until an On Error Goto 0 line is reached, at which point the default behaviour resumes.
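A minimal illustration of that flow (not from the original answer; the division by zero is just a stand-in for any statement that fails):
On Error Resume Next
x = 1 / 0 'The error is caught by the runtime and recorded in the Err object
WScript.Echo "Error " & Err.Number & ": " & Err.Description
Err.Clear 'Reset the Err object once the error has been dealt with
On Error Goto 0 'Restore the default error handling from here on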
Useful Links
VBScript — Using error handling
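The original question also asked about setting a timeout. MSXML2.XMLHTTP does not expose one, but as an aside that is not part of the answer above, MSXML2.ServerXMLHTTP does via setTimeouts. A hedged sketch, with arbitrary values:
Set objXMLHTTP = CreateObject("MSXML2.ServerXMLHTTP.6.0")
objXMLHTTP.setTimeouts 5000, 5000, 10000, 10000 'resolve, connect, send, receive timeouts in milliseconds
objXMLHTTP.open "GET", strFileURL, False
On Error Resume Next
objXMLHTTP.send
If Err.Number <> 0 Then WScript.Echo "Request failed or timed out: " & Err.Description
On Error Goto 0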

Save Image from URL. Fails inside a loop, but not when run once

This Sub works:
Sub SaveImage(url, name)
    ' Set your settings
    strFileURL = url
    strHDLocation = "D:\Images\" & name
    ' Fetch the file
    Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
    objXMLHTTP.open "GET", strFileURL, false
    objXMLHTTP.send()
    If objXMLHTTP.Status = 200 Then
        Set objADOStream = CreateObject("ADODB.Stream")
        objADOStream.Open
        objADOStream.Type = 1 'adTypeBinary
        objADOStream.Write objXMLHTTP.ResponseBody
        objADOStream.Position = 0 'Set the stream position to the start
        Set objFSO = Createobject("Scripting.FileSystemObject")
        If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
        Set objFSO = Nothing
        objADOStream.SaveToFile strHDLocation
        objADOStream.Close
        Set objADOStream = Nothing
    End If
    Set objXMLHTTP = Nothing
End Sub
If I call it once, it downloads the image; if I put it inside a loop, it fails with different errors.
I put this Sub in a page, set the URL and image name, and it downloads the image.
The folder has permissions granted to Everyone, so this is not a permission issue.
If I put it in a separate page and take the URL and image name from the request, like this:
Dim url, name
url = Request("url")
name = Request("name")
Then call the function:
Call SaveImage(url, name)
I get:
ADODB.Stream error '800a0bbc'
Write to file failed.
If I call it from the page, like this:
Call SaveImage("http://www.example.com/images/imageName.jpg", "imageName.jpg")
It does download the image.
But if, from the same place, I call the Sub from inside a loop, like this:
Variables are declared above. Array is filled.
For row = 0 To UBound(myArray, 2)
    url = CStr(ar(1, row)) : image = id & ".jpg"
    Call SaveImage2(url, image)
Next
It fails on the first image.
ADODB.Stream error '800a0bbc'
Write to file failed.
I tried other stream writers, but they behave the same way, just with different errors.

vbscript downloading file from cache

I need to download the latest file from a website without using wget or any external utility. I found the following VBScript on Stack Overflow itself (I don't know the link because I have cleared my cache and history):
' Set your settings
strFileURL = "http://somewebsitehere.com/somefile"
strHDLocation = "D:\somepath\"
' Fetch the file
Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", strFileURL, false
objXMLHTTP.send()
If objXMLHTTP.Status = 200 Then
    Set objADOStream = CreateObject("ADODB.Stream")
    objADOStream.Open
    objADOStream.Type = 1 'adTypeBinary
    objADOStream.Write objXMLHTTP.ResponseBody
    objADOStream.Position = 0 'Set the stream position to the start
    Set objFSO = Createobject("Scripting.FileSystemObject")
    If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
    Set objFSO = Nothing
    objADOStream.SaveToFile strHDLocation
    objADOStream.Close
    Set objADOStream = Nothing
End If
Set objXMLHTTP = Nothing
However, after using this script I found it worked correctly for the first few trials, but later it downloaded the same file from the cache instead of fetching a fresh copy from the net. I get the same file from C:\Users\username\AppData\Local\Microsoft\Windows\INetCache\IE\0XQSU247.
I need to clear the INetCache and then re-run the script to get a fresh file.
So how should this script be modified so that it gets the latest file from the web server instead of serving it from the cache?
This question might be lame, but I have no knowledge of VBScript.
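One commonly suggested workaround, offered only as a sketch since it is not from the original thread: MSXML2.XMLHTTP goes through the WinINet/IE cache, so you can ask for a fresh copy with standard HTTP request headers, and if a stale copy is still served, append a throwaway query-string value so every request URL is unique. Only the request setup changes:
Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", strFileURL, false
'Ask the WinINet/IE cache and any intermediaries not to serve a stored copy
objXMLHTTP.setRequestHeader "Cache-Control", "no-cache"
objXMLHTTP.setRequestHeader "Pragma", "no-cache"
objXMLHTTP.setRequestHeader "If-Modified-Since", "Sat, 01 Jan 2000 00:00:00 GMT"
objXMLHTTP.send()
'The rest of the script (Status check and ADODB.Stream save) stays the same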

Getting "Access Denied" while downloading an object

I am trying to download an SVN client by running a batch script. For that I am using this piece of VBScript, which I call from the batch file. Now this code works, because I have successfully downloaded some files with it, but when I try to download this file from sourceforge.net I get an access denied error message after send(). Any insight on why this is happening and how it can be avoided would be helpful.
Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", strFileURL, false
objXMLHTTP.send()
do until objXMLHTTP.Status = 200 : wscript.sleep(1000) : loop
If objXMLHTTP.Status = 200 Then
    Set objADOStream = CreateObject("ADODB.Stream")
    objADOStream.Open
    objADOStream.Type = 1 'adTypeBinary
    objADOStream.Write objXMLHTTP.ResponseBody
    objADOStream.Position = 0 'Set the stream position to the start
    Set objFSO = Createobject("Scripting.FileSystemObject")
    If objFSO.Fileexists(strHDLocation) Then objFSO.DeleteFile strHDLocation
    Set objFSO = Nothing
    objADOStream.SaveToFile strHDLocation
    objADOStream.Close
    Set objADOStream = Nothing
End If
Set objXMLHTTP = Nothing
You're getting access denied because you're trying to download the file from a URL that is redirecting you. If you try to download the file directly, you'll find you won't get the error.
You should use the latest version:
Set objXMLHTTP= CreateObject("Msxml2.XMLHttp.6.0")
However, using
Set objXMLHTTP = CreateObject("Microsoft.XMLHTTP")
is OK for now.
If you add sourceforge.net to your trusted sites in IE (IE Options -> Security, select Trusted sites, open Custom Level and change "Access data sources across domains" to Enable), that should let you get past that Access Denied error.
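An alternative that is sometimes used in this situation, noted here only as a sketch since it is not part of the answer above: MSXML2.ServerXMLHTTP.6.0 is built on WinHTTP rather than WinINet, so it is not governed by the IE security zones and follows redirects on its own:
Set objXMLHTTP = CreateObject("MSXML2.ServerXMLHTTP.6.0")
objXMLHTTP.open "GET", strFileURL, False
objXMLHTTP.send
If objXMLHTTP.Status = 200 Then
    'Save objXMLHTTP.ResponseBody with ADODB.Stream exactly as in the script above
End If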

VBScript getting the same file twice when intermittently reading from a webpage

I'm using the following code to download a webpage and save it to file:
function download(sFileURL, sLocation, async)
    download = false
    set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
    objXMLHTTP.open "GET", sFileURL, async
    on error resume next
    objXMLHTTP.send()
    if err.number = 0 then
        do until objXMLHTTP.Status = 200
            wscript.echo objXMLHTTP.Status
            wscript.sleep(200)
        loop
        if objXMLHTTP.Status = 200 Then
            set objADOStream = CreateObject("ADODB.Stream")
            objADOStream.Open
            objADOStream.Type = 1 'adTypeBinary
            objADOStream.Write objXMLHTTP.ResponseBody
            objADOStream.Position = 0 'Set the stream position to the start
            set objFSO = Createobject("Scripting.FileSystemObject")
            If objFSO.Fileexists(sLocation) Then objFSO.DeleteFile sLocation
            Set objFSO = Nothing
            objADOStream.SaveToFile sLocation
            objADOStream.Close
            objXMLHTTP.Close
            set objADOStream = Nothing
            download = true
        end if
    else
        download = false
    end if
    set objXMLHTTP = Nothing
end function
I'm calling it once with (url, filename1, false), then I sleep for x seconds, and then call it again with (url, filename2, false).
I can see the x-second gap between the timestamps of the two files on disk, but the second file is exactly the same as the first one downloaded. I know this for certain because the page includes a server timer.
Is there some sort of strange caching going on, or is something wrong with my download function? To be fair, I did copy it from the internets...
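The symptom matches the WinINet/IE caching behaviour described in the earlier question above: MSXML2.XMLHTTP will happily serve a repeat GET for the same URL from the cache. As a sketch (not from the original thread; the nocache parameter name is made up and it assumes the server ignores unknown query-string values), making each request URL unique sidesteps the cache:
'Append a throwaway, ever-changing value so the cache never sees the same URL twice
cacheBuster = "?nocache=" & Timer
firstOk = download(url & cacheBuster, filename1, false)
WScript.Sleep 5000 'the x-second wait from the question (5 seconds here, arbitrary)
cacheBuster = "?nocache=" & Timer
secondOk = download(url & cacheBuster, filename2, false)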
