GPM data from GES DISC - download

I want to download GPM data from GES DISC, provided by NASA. I used wget together with the text file of URLs I received by following the steps described at https://disc.gsfc.nasa.gov/data-access. The downloading process itself runs without problems in Command Prompt, but afterwards I found that only odd-numbered days had been downloaded, even though the URLs in my text file cover every day of my desired period. How can I resolve this problem?
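For reference, a minimal sketch of the kind of wget invocation those instructions describe, assuming an Earthdata login has already been configured and that the URL list is saved as subset_list.txt (a hypothetical name):
wget --load-cookies .urs_cookies --save-cookies .urs_cookies --keep-session-cookies --auth-no-challenge=on --content-disposition -i subset_list.txt
If only alternate files arrive, it is also worth opening the list and checking for blank lines or line-ending mismatches between entries: anything that consumes two lines per iteration will skip every other URL.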

Related

Batch PowerShell, downloading a file from a URL only results in part of the file?

Alright, so I'm new-ish to the whole PowerShell thing and was figuring out how I could download a file via PowerShell launched from a batch file.
My current code is basically this:
PowerShell (New-Object System.Net.WebClient).DownloadFile('https://www.dropbox.com/s/**********/tester.exe','tester.exe')
The first time I ran this as a .bat file, it worked perfectly: the file downloaded fully and ran properly when executed.
However, after I deleted the file off my PC and ran the .bat again, I got back only a portion of the download, e.g. original file 350 KB, downloaded file 209 KB.
Because of this the file would not run, claiming incompatibility.
Please help, this is driving me nuts.
I fixed my problem by moving my file onto zippyshare.com instead of trying to get it from Dropbox.
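For what it's worth, a partial download like this is consistent with Dropbox returning its HTML preview page instead of the raw file. With Dropbox share links the usual workaround is to append ?dl=1, which requests the file directly; a minimal command-line sketch (the share token below is a placeholder):
curl -L -o tester.exe 'https://www.dropbox.com/s/SHARETOKEN/tester.exe?dl=1'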

How to download a file using command line

I need to download the following file to my remote computer using the command line:
download link
The point is that if I use wget or curl, I just get an HTML document. But if I enter this address in my browser (on my laptop), it simply starts downloading.
Now, my question is: since the only way to access my remote machine is through the command line, how can I download the file directly on that machine?
Thanks
Assuming that you are using a Linux terminal:
You can use a command-line browser like Lynx to click on links and download files.
The link you provided isn't a normal file link: it sends the filename as a GET variable, and the server responds to that request with another page containing a form, so wget and cURL will not work directly.
The website is also likely tracking your session and checking that you've submitted the form data and confirmed you're not a robot.
Try a different approach: copy the file from your local machine to the remote one via scp:
scp /localpath/to/file username@remotehost.com:/path/to/destination
Alternatively, you may export the cookies from your local machine to the remote one and pass them to wget with the --load-cookies file option, though I can't guarantee it will work if the site also ties the session ID to your IP address.
Here's a Firefox extension for exporting cookies:
https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
Once you have the cookies.txt file, just scp it to the remote machine and run wget with the --load-cookies option, as sketched below.
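A minimal sketch of that sequence; the host and download URL below are placeholders:
scp cookies.txt username@remotehost.com:~/
wget --load-cookies cookies.txt 'https://example.com/download?file=archive.zip'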
One of the authors of the corpus here.
As pointed out by a friend of mine, this tool solves all the problems:
https://addons.mozilla.org/en-GB/firefox/addon/cliget/
After installation you just click the download link and copy the generated command to the remote machine. I just tried it; it works perfectly. We should put that info on the download page.
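For illustration, the command cliget generates is an ordinary curl or wget invocation that carries the browser's cookies and headers along, roughly of this shape (every value below is a placeholder):
curl --header 'Cookie: sessionid=PLACEHOLDER' --output corpus.zip 'https://example.com/download?file=corpus.zip'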

Can't curl then unzip zip file

I'm just trying to curl this zip file, then unzip it
curl -sS https://www.kaggle.com/c/word2vec-nlp-tutorial/download/labeledTrainData.tsv.zip > labeledTrainData.tsv.zip
unzip labeledTrainData.tsv.zip labeledTrainData.tsv
but I keep getting this error:
Archive: labeledTrainData.tsv.zip
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
I'm using the same syntax as found in this response, I think. Is there something wrong with the file I'm downloading? I feel like I'm making a noob mistake. I run these two commands in a shell script.
I am able to replicate your error. This sort of error generally indicates one of two things:
The file was not packaged properly
You aren't downloading what you think you're downloading.
In this case, your problem is the latter. It looks like you're downloading the file from the wrong URL. I'm seeing this when I open up the alleged zip file for reading:
<html><head><title>Object moved</title></head><body>
<h2>Object moved to here.</h2>
</body></html>
Long story short, you need to follow the redirect to the URL the "Object moved" page points to; curl's -L flag does this automatically. Additionally, Kaggle usually requires login credentials when downloading, so you'll need to supply your username/password (or session cookies) as well.
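Putting that together, a minimal sketch, assuming your Kaggle session cookies have been exported to cookies.txt (a hypothetical file name); -sSL makes curl quiet, surfaces errors, and follows the redirect:
curl -sSL --cookie cookies.txt -o labeledTrainData.tsv.zip 'https://www.kaggle.com/c/word2vec-nlp-tutorial/download/labeledTrainData.tsv.zip'
unzip labeledTrainData.tsv.zip labeledTrainData.tsv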

LFTP, download only files created the same day I execute LFTP

How do I make LFTP download a file from a remote server only if this file was created TODAY (the same day I run LFTP)?
Use mirror.
It has a --newer-than=SPEC option to download only files newer than the specified time. For your specific need, use --newer-than=now-1days: now minus one day is yesterday, so lftp will download all files newer than yesterday.
Refer here for more info: http://lftp.yar.ru/lftp-man.html
EDIT: While I was tweaking my script, I noticed there's also an --only-newer option, which downloads only newer files and is also useful for your case, with a slight difference: --only-newer checks the destination folder and downloads any files from the source that aren't already in the destination, while --newer-than downloads any files newer than the time you specified without checking the destination folder.
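A minimal sketch of the mirror invocation; the host, credentials, and directory paths below are placeholders:
lftp -u username,password ftp.example.com -e 'mirror --newer-than=now-1days /remote/dir /local/dir; quit'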

FileZilla FTPing unzip problems

I have a bash script that uses inotify-tools to wait for .zip files to be dropped in a substructure under the root. From there they are unzipped into another directory.
When I copy the .zip files in with WinSCP, the script executes correctly. Copying the .zip files with FileZilla, however, leads to this error:
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
I've googled this error, and the two main causes seem to be an old version of Linux's unzip utility (I have a newer version) and trying to copy files larger than 2 GB (this file isn't).
Anyone know the issue here? It seems to me that Linux is trying to unzip the file before it is fully copied to disk. Like I said, only FileZilla produces this error; I don't get it with WinSCP.
I believe your main issue is that you try to process the ZIP while it is still being transferred. Probably what happens is that as soon as the transfer is initiated, FileZilla creates the destination file and writes into it as data arrives, so the create event fires your script before the zip file is completely transferred. (WinSCP likely transfers to a temporary file and renames it on completion, which is why your watch only ever sees finished files.)
That would explain why you get this error:
End-of-central-directory signature not found. Either this file is not
a zipfile,
So the solution would be to have two folders: one for in-progress transfers and one for completed files. They should be on the same filesystem so the move is atomic. On transfer completion, just move the file from the transfer folder to the complete folder, and point your watch at the latter, as sketched below.
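A minimal sketch of the watcher side, assuming inotify-tools is installed and using placeholder paths; watching for moved_to events means the handler only fires once a finished file is moved into the complete folder:
# react only to files moved into the folder, i.e. completed transfers
inotifywait -m -e moved_to --format '%w%f' /srv/ftp/complete | while read f; do
  case "$f" in
    *.zip) unzip -o "$f" -d /srv/extracted ;;
  esac
done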
