I'm trying to download an invoice with a bash script (wget), because I need to save invoices to my server automatically when an order is made. I looked for other solutions, like downloading the invoice by changing tcpdf.php, but that didn't work. The post was up here: https://www.prestashop.com/forums/topic/465729-automatically-generate-invoices-in-a-folder/
So first I need to log in to the PrestaShop admin page, I suppose. Logging in got me confused, because how do I know when I'm logged in? This is what I did:
wget --save-cookies cookies.txt --keep-session-cookies --post-data 'email=example@gmail.com&passwd=example' 'http://www.example.ee/admin/index.php?controller=AdminLogin&token=5a01dc4e606baa6c26e95ddea91d3d14'
After saving my cookie, I tried to download the invoice by:
wget --load-cookies cookies.txt -e robots=off -A pdf -r -l1 'http://www.example.ee/admin/index.php?controller=AdminPdf&token=35b276c05aa6r5eb516437a8d534eb64&submitAction=generateInvoicePDF&id_order=3197'
But I get an error: Deleting, www.example.ee/admin/index.php?controller=AdminPdf, since it should be rejected
I think it's because I'm still not logged in?
It would also be very nice if this is possible without wget.
PS: I changed my website to example, etc.
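For reference, here is a sketch of the whole flow with the URLs quoted (an unquoted & makes the shell split the command) and the PDF fetched with -O instead of -r -A pdf; the "since it should be rejected" message comes from -A pdf, which deletes the downloaded file because its name does not end in .pdf. The grep on cookies.txt is only an assumed way to check the login, and PrestaShop may also require extra hidden form fields, so treat this as a starting point rather than a verified recipe:

#!/bin/bash
# Step 1: log in and keep the session cookie.
wget --save-cookies cookies.txt --keep-session-cookies \
    --post-data 'email=example@gmail.com&passwd=example' \
    -O login_response.html \
    'http://www.example.ee/admin/index.php?controller=AdminLogin&token=5a01dc4e606baa6c26e95ddea91d3d14'

# Step 2: assumed check -- a PrestaShop session cookie should now be in cookies.txt.
if ! grep -qi 'PrestaShop' cookies.txt; then
    echo "Login seems to have failed; inspect login_response.html" >&2
    exit 1
fi

# Step 3: save the invoice PDF straight to a file; no -r/-A needed.
wget --load-cookies cookies.txt -O invoice_3197.pdf \
    'http://www.example.ee/admin/index.php?controller=AdminPdf&token=35b276c05aa6r5eb516437a8d534eb64&submitAction=generateInvoicePDF&id_order=3197'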
I am trying to use wget to download the Yahoo stock history file,
https://query1.finance.yahoo.com/v7/finance/download/ARDM?period1=1504030392&period2=1506708792&interval=1d&events=history&crumb=TKe9axyOsuR
but it always reports an error:
Username/password authentication Failed.
However, if I visit that stock page
https://finance.yahoo.com/quote/ARDM/history?p=ARDM
I can click the data file link and download the CSV file without giving any username or password.
Any clue?
When you go in via the browser, the site uses your cached Yahoo login details, stored in a cookie, to take you straight to the information.
Set the authentication in your wget call as follows:
wget --user user --password pass http://example.com/
If you are not logging in to the server, as seems to be the case here, you could try using two wget calls: the first to grab a cookie and the second to download the data, as follows:
wget -qO- --keep-session-cookies --save-cookies cookies.txt https://finance.yahoo.com/quote/ARDM/history?p=ARDM
followed by
wget -qO- --load-cookies cookies.txt 'https://query1.finance.yahoo.com/v7/finance/download/ARDM?period1=1504030392&period2=1506708792&interval=1d&events=history&crumb=TKe9axyOsuR'
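Put together as a script, with the download URL quoted so the shell does not treat the & characters as job separators. Note that the crumb value is tied to the cookie session and expires, so in practice it would have to be re-extracted from the first response (not shown here):

#!/bin/bash
# First request: visit the quote page just to obtain a session cookie.
wget -qO /dev/null --keep-session-cookies --save-cookies cookies.txt \
    'https://finance.yahoo.com/quote/ARDM/history?p=ARDM'

# Second request: download the CSV using that cookie.
wget -qO ARDM.csv --load-cookies cookies.txt \
    'https://query1.finance.yahoo.com/v7/finance/download/ARDM?period1=1504030392&period2=1506708792&interval=1d&events=history&crumb=TKe9axyOsuR'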
A site I'm working on requires a large amount of files to be downloaded from an external system via FTP, daily. This is not of my design, it is the only solution offered up by the external provider (I cannot use SSH/SFTP/SCP).
I've solved this by using wget, run inside a cron task:
wget -m -v -o log.txt --no-parent -nH -P /exampledirectory/ --user username --password password ftp://www.example.com/
Unfortunately, wget does not seem to see the timestamp differences, so when a file is modified, it still returns:
Remote file no newer than local file
`/xxx/data/data.file'
-- not retrieving.
When I manually connect via FTP, I can see differences in the timestamps, so it should be getting the updated file. I'm not able to access or control the target server via any other means.
Is there anything I can do to get around this? Can I force wget to mirror while ignoring timestamps? (I understand this defeats the point of mirroring)...
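There is no wget flag to overwrite regardless of timestamps during a mirror: per the manual, -m is shorthand for -r -N -l inf --no-remove-listing, and simply dropping -N makes wget save re-fetched files under numeric suffixes instead of overwriting. One blunt workaround, assuming a daily full re-download is acceptable, is to clear the local tree first so there is nothing to compare against:

#!/bin/bash
# Sketch: force a complete re-fetch by removing the local copies first.
# Only sensible if the data set is small enough to re-download daily.
rm -rf /exampledirectory/*
wget -m -v -o log.txt --no-parent -nH -P /exampledirectory/ \
    --user username --password password ftp://www.example.com/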
I would like to connect from a shell to a Google account, in order to simulate navigation from that account with something like wget.
I have tried this command:
wget -T 3 -t 1 -q --secure-protocol=TLSv1 --no-check-certificate --user=$username --password=$password https://mail.google.com/mail/feed0/atom -O -
But I can't manage to validate the connection...
Maybe with curl?
Thank you.
Plain login/password authentication is insecure and is not supported anymore.
You need at least an "application-specific password", or better, OAuth.
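Assuming two-step verification is enabled and an application-specific password has been generated in the Google account's security settings, the original request could be retried with it. This is only a sketch, and depending on the account Google may still insist on OAuth:

#!/bin/bash
# Hypothetical: use an app-specific password for HTTP basic auth
# against the Gmail Atom feed (standard path is /mail/feed/atom).
read -p "Gmail address: " GUSER
read -s -p "App-specific password: " GPASS; echo
wget -T 3 -t 1 -q --user="$GUSER" --password="$GPASS" \
    https://mail.google.com/mail/feed/atom -O -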
I am trying to automate a process which previously was consuming a full-time job: monitoring a series of websites for new posts. This seemed like a relatively simple scripting problem, so I tackled it, wrote a bash script, and set it to run every minute in the crontab. It works great, but after the page changes, it keeps returning false positives for an hour or so, and I can't for the life of me figure out why. It resolves itself after a while, but I don't want to deploy the script until I understand what's happening. Here's my code:
#!/bin/bash
SITENAME=example
# Fetch this month's directory listing, bypassing any caches.
wget "http://web.site.url/apache/folder/$(date +%Y)/$(date +%m)-$(date +%B)" -O "$SITENAME.backend.new" --no-cache
# Make sure the "old" snapshot exists on the first run.
touch "$SITENAME.backend.old"
diff "$SITENAME.backend.new" "$SITENAME.backend.old" > "$SITENAME.backend.diff"
if [ -s "$SITENAME.backend.diff" ]
then sendemail -xu myaddress@mydomain.com -xp password -f myaddress@mydomain.com -t myaddress@mydomain.com -s smtpout.secureserver.net -u "$SITENAME" -m backend \
&& cp "$SITENAME.backend.new" "$SITENAME.backend.old" \
&& echo true
fi
If the only difference between the diffs is absolute versus relative links, consider using the --convert-links switch for wget, as the man page says:
-k
--convert-links
After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets, hyperlinks to non-HTML content, etc.
This will convert links to absolute links.
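Applied to the script above, that just means adding -k to the wget line. A sketch (for a single document, wget permits -k together with -O and rewrites all relative URIs in the saved file as absolute ones, which should stop the diff from flagging pure absolute-versus-relative link changes):

wget -k "http://web.site.url/apache/folder/$(date +%Y)/$(date +%m)-$(date +%B)" \
    -O "$SITENAME.backend.new" --no-cache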
I am working on a simple bash script to download images from the website Tumblr. The idea is to use read to get login info from the user, and wget --post-data to log in, and this is what I have:
read -p "Tumblr login email: " EMAIL
read -p "Tumblr login password: " PASSWRD
wget --user-agent=Mozilla/5.0 --save-cookies cookies.txt --post-data 'email=$EMAIL&password=$PASSWRD' --no-check-certificate https://www.tumblr.com/login
However, it is sending the literal strings "$EMAIL" and "$PASSWRD" instead of the variables' values. Is there any way to get it to send the values entered by the user?
change:
--post-data 'email=$EMAIL&password=$PASSWRD'
to:
--post-data="email=$EMAIL&password=$PASSWRD"
See the bash manual on quoting: http://www.gnu.org/software/bash/manual/bashref.html#Quoting
Important: do not use
--header="Content-Type: text/xml"
together with --post-data. It overrides the
--header="Content-Type: application/x-www-form-urlencoded"
header that wget issues by default, and the POST data will then not be received by an HttpServlet.
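Putting the fix together, a sketch of the corrected snippet; read -s is added so the password is not echoed while typing, and note that values containing & or = would additionally need URL-encoding, which is not handled here:

#!/bin/bash
read -p "Tumblr login email: " EMAIL
# -s suppresses terminal echo for the password prompt.
read -s -p "Tumblr login password: " PASSWRD; echo
# Double quotes let the shell expand the variables inside --post-data.
wget --user-agent=Mozilla/5.0 --save-cookies cookies.txt \
    --post-data="email=$EMAIL&password=$PASSWRD" \
    --no-check-certificate https://www.tumblr.com/login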