How do I download the Yahoo stock history data file?

I am trying to use wget to download the Yahoo stock history file,
https://query1.finance.yahoo.com/v7/finance/download/ARDM?period1=1504030392&period2=1506708792&interval=1d&events=history&crumb=TKe9axyOsuR
but it always reports an error:
Username/password authentication Failed.
However, if I visit that stock page
https://finance.yahoo.com/quote/ARDM/history?p=ARDM
I can click the data file link and download the CSV file without providing any username or password.
Any clue?

When you go in via the browser, the site uses your cached Yahoo login details, stored in a cookie, to take you straight to the data.
Set authentication in wget with the following:
wget --user user --password pass http://example.com/
If you are not logging in to the server, which seems to be the case here, you could try two wget calls: the first grabs a session cookie and the second downloads the data (quote the URLs so the shell does not interpret ? and &):
wget -qO- --keep-session-cookies --save-cookies cookies.txt 'https://finance.yahoo.com/quote/ARDM/history?p=ARDM'
followed by
wget -qO- --load-cookies cookies.txt 'https://query1.finance.yahoo.com/v7/finance/download/ARDM?period1=1504030392&period2=1506708792&interval=1d&events=history&crumb=TKe9axyOsuR'
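If the download still fails, it is usually because the crumb in the query string has to match the session cookie, so a hard-coded crumb will not work with a freshly fetched cookie. A rough sketch of the whole flow, assuming the crumb is embedded in the quote page HTML under a "CrumbStore" key (an assumption about Yahoo's page markup at the time; the field name may differ):

# Sketch: fetch the quote page to get a session cookie, extract the crumb
# from the page HTML, then request the CSV with the matching cookie and crumb.
SYMBOL=ARDM
PERIOD1=1504030392
PERIOD2=1506708792
page=$(wget -qO- --keep-session-cookies --save-cookies cookies.txt \
  "https://finance.yahoo.com/quote/${SYMBOL}/history?p=${SYMBOL}")
# "CrumbStore" is an assumption about the page markup; adjust if it differs.
crumb=$(echo "$page" | grep -o '"CrumbStore":{"crumb":"[^"]*"' | sed 's/.*"crumb":"//;s/"$//')
wget -qO "${SYMBOL}.csv" --load-cookies cookies.txt \
  "https://query1.finance.yahoo.com/v7/finance/download/${SYMBOL}?period1=${PERIOD1}&period2=${PERIOD2}&interval=1d&events=history&crumb=${crumb}"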

Related

How to get bitbucket OAUTH token via bash script?

I am attempting to get an OAuth token for Bitbucket via a bash script.
At the moment, I'm able to use the following URL:
https://bitbucket.org/site/oauth2/authorize?client_id={key}&response_type=token
I simply visit this via a web browser and hit authenticate; I am then redirected to the callback website, and I can see the token in the URL. For example, if my callback URL were stackoverflow.com, my URL bar would now contain stackoverflow.com/#access_token=XYZ
What I need to do is figure out how to hit this URL and get the access token from a bash script rather than from the browser's URL bar.
Relevant doc: https://bitbucket.org/site/oauth2/authorize?client_id=kdBELaEX6HkUexPMRS&response_type=token
Using curl (which can be used inside your bash script):
curl -X POST https://bitbucket.org/site/oauth2/access_token \
-d grant_type=client_credentials \
-u key:secret
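The client_credentials grant returns a JSON body containing an access_token field, so the last step in a script is to pull that field out. A minimal sketch (KEY and SECRET stand in for your OAuth consumer's key and secret; a proper JSON parser such as jq would be more robust than sed):

# Request a token and extract the access_token field from the JSON response.
KEY=your_consumer_key        # placeholder
SECRET=your_consumer_secret  # placeholder
response=$(curl -s -X POST https://bitbucket.org/site/oauth2/access_token \
  -d grant_type=client_credentials \
  -u "${KEY}:${SECRET}")
# Crude JSON parsing; assumes the token value contains no escaped quotes.
token=$(echo "$response" | sed -n 's/.*"access_token": *"\([^"]*\)".*/\1/p')
echo "$token"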

Prestashop, wget invoice

I'm trying to download an invoice with a bash script (wget), because I need to save invoices to my server automatically when an order is made. I looked for other solutions, such as generating the invoice by changing tcpdf.php, but that didn't work. The post is here: https://www.prestashop.com/forums/topic/465729-automatically-generate-invoices-in-a-folder/
So first I need to log in to the PrestaShop admin page, I suppose. Logging in got me confused, because how do I know when I'm logged in? This is what I did:
wget --save-cookies cookies.txt --keep-session-cookies --post-data 'email=example#gmail.com&passwd=example' http://www.example.ee/admin/index.php?controller=AdminLogin&token=5a01dc4e606baa6c26e95ddea91d3d14
After saving my cookie, I tried to download the invoice by:
wget --load-cookies cookies.txt -e robots=off -A pdf -r -l1 http://www.example.ee/admin/index.php?controller=AdminPdf&token=35b276c05aa6r5eb516437a8d534eb64&submitAction=generateInvoicePDF&id_order=3197
But I get an error: Deleting www.example.ee/admin/index.php?controller=AdminPdf, since it should be rejected
I think it's because I'm still not logged in?
It would also be fine if this is possible without wget.
PS: I changed my website address to example, etc.
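Two things are worth checking here. First, the unquoted & in the URLs makes the shell background the command and drop the rest of the query string, so the URLs should be quoted. Second, wget's -A pdf accept list matches the file name rather than the Content-Type, so index.php?controller=AdminPdf is downloaded and then deleted, which is exactly the "since it should be rejected" message. A rough sketch that fetches the PDF directly instead of recursing (the tokens and order id are copied from the question; whether a plain POST of email/passwd is enough to log in to your PrestaShop admin is an assumption, and newer versions may require extra form fields):

# Log in, quoting the URL so the shell does not interpret & and ?.
# Credentials are placeholders.
wget --save-cookies cookies.txt --keep-session-cookies \
  --post-data 'email=example@gmail.com&passwd=example' \
  'http://www.example.ee/admin/index.php?controller=AdminLogin&token=5a01dc4e606baa6c26e95ddea91d3d14'
# Fetch the invoice PDF straight to a file instead of using -r -A pdf.
wget --load-cookies cookies.txt -O invoice_3197.pdf \
  'http://www.example.ee/admin/index.php?controller=AdminPdf&token=35b276c05aa6r5eb516437a8d534eb64&submitAction=generateInvoicePDF&id_order=3197'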

Wget mirror site via ftp - timestamps issue

A site I'm working on requires a large number of files to be downloaded from an external system via FTP, daily. This is not of my design; it is the only solution offered by the external provider (I cannot use SSH/SFTP/SCP).
I've solved this by using wget, run inside a cron task:
wget -m -v -o log.txt --no-parent -nH -P /exampledirectory/ --user username --password password ftp://www.example.com/
Unfortunately, wget does not seem to see the timestamp differences, so when a file is modified, it still returns:
Remote file no newer than local file
`/xxx/data/data.file'
-- not retrieving.
When I manually connect via FTP, I can see differences in the timestamps, so it should be getting the updated file. I'm not able to access or control the target server via any other means.
Is there anything I can do to get around this? Can I force wget to mirror while ignoring timestamps? (I understand this defeats the point of mirroring)...
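One option, if re-downloading the whole set on each run is acceptable: -m is shorthand for -r -N -l inf --no-remove-listing, and it is the -N timestamping check that produces "Remote file no newer than local file ... not retrieving". Spelling the options out without -N makes wget re-retrieve every file, which sidesteps the timestamp comparison entirely. A sketch based on the cron command above (credentials and path are the placeholders from the question):

# Same as the original cron job, but without -N, so no timestamp check is
# done and every file is re-downloaded (and overwritten) on each run.
wget -r -l inf --no-remove-listing -v -o log.txt --no-parent -nH \
  -P /exampledirectory/ --user username --password password \
  'ftp://www.example.com/'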

Bash - download a file, log in if required

I'm using BASH and I need to download a TXT file, which is generated on server-side by request. This means the URL is something like:
http://1.1.1.1:4884/page.aspx?fileID=123456&lang=en&Export=1
Export=1 is caught by the .NET application and I'm provided with a TXT file, based on fileID.
In case I haven't logged in, I'm redirected to a login form with ?ReturnUrl in the URL, redirecting me back to my requested page upon login.
How can I download this file using bash and cURL/wget/lynx? It has to be non-interactive.
I've tried using the --cookie options for curl and wget and lynx automation (cmd-log). Lynx worked best, but for some reason, the file download prompt could not be automated.
Please help. If any additional info is required, I will provide.
Use curl.
Code the following approach:
try to download the file
if that fails (you are redirected to the login page), log in and start again
You always need to use the -c option of curl to store the cookies between curl calls.
To log in using curl, you need to know the login form on the server, that is, the names of the fields where you usually type your login and password.
To send the form data to the server, use the -d option of curl. To send the cookie back to the server, use -b (or --cookie). A rough sketch of this flow is below.
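A minimal sketch, assuming the login form posts fields named login and password to a login.aspx action (both are assumptions; inspect the real form for the actual field names and URL), and using a crude check on the downloaded content to detect the login page:

#!/bin/bash
# Try / log in / retry flow for the export URL from the question.
BASE='http://1.1.1.1:4884'
FILE_URL="${BASE}/page.aspx?fileID=123456&lang=en&Export=1"

fetch() {
  # -c writes cookies, -b sends them back, -L follows the ReturnUrl redirect.
  curl -s -L -c cookies.txt -b cookies.txt -o export.txt "$FILE_URL"
}

fetch
# Crude check: assume the login page mentions ReturnUrl somewhere in its HTML.
if grep -qi 'ReturnUrl' export.txt; then
  # Field names and login URL are assumptions; credentials are placeholders.
  curl -s -L -c cookies.txt -b cookies.txt \
    -d 'login=myuser&password=mypass' \
    "${BASE}/login.aspx"
  fetch
fi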

Create GitHub repo via bash file, NOT be prompted for password

I need to be able to create GitHub repositories via bash scripts that run from a PHP page, so I need to be able to pass the password or the API key in the curl command.
However, I cannot seem to find the API key, as I believe it may be redundant now with v3 of the GitHub API.
I followed "Is it possible to create a remote repo on GitHub from the CLI without opening browser?" and it got me as far as being prompted for the password.
Bash file looks like this:
#! /bin/bash
a=$1
curl="-u 'USERNAME' -p 'PASSWORD' https://api.github.com/user/repos -d '{\"name\":\""$a"\"}'"
curl $curl
This does not work, as it does not seem to like the -p parameter. I tried -u 'USERNAME:PASSWORD' and it did not like that either, and I cannot seem to find the answer in the GitHub docs. Ideally I would use the API key, as this would not leave my repo password exposed in my bash file, correct?
Many thanks
curl -u 'dmalikov:my_password' https://api.github.com/user/repos -d '{"name":"HI"}' works fine for me, now I have this HI repo.
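The original script's main problem is that it stuffs the whole argument list into a string variable, which breaks the quoting when the variable is expanded; pass the arguments to curl directly instead. If you would rather not expose the account password, GitHub's v3 API also accepts a personal access token, either as the password in -u or in an Authorization header. A minimal sketch that takes the repository name as its first argument (GITHUB_TOKEN is a placeholder for your own token):

#!/bin/bash
# Create a repo named by $1, authenticating with a personal access token
# instead of a password. GITHUB_TOKEN is a placeholder environment variable.
name=$1
curl -s -H "Authorization: token ${GITHUB_TOKEN}" \
  -d "{\"name\":\"${name}\"}" \
  https://api.github.com/user/repos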
