How to set location in HTTParty - ruby

We have a method that scrapes pages using a table of URLs with HTTParty. Sometimes the pages it scrapes are in fact redirected to the homepage of the German version of the site. Our server is based in Germany, and I suspect that this is the cause, as the method works fine when run from my local machine in the UK. Is there any way to set the browsing location of HTTParty to the UK, in the same way that you can set the user-agent browser type?
HTTParty.get(URI::escape(link), :headers => {"User-Agent" => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1309.0 Safari/537.17"})
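There is no HTTParty setting for a browsing location: the site almost certainly chooses the German version from the IP address it sees, so request headers can only be a hint. Below is a minimal sketch of the two usual workarounds, assuming the site honours Accept-Language and that a UK proxy is available (the link and proxy host are placeholders):
require 'httparty'

link = "https://example.com/some-page"   # placeholder for a URL from the table

response = HTTParty.get(
  link,
  :headers => {
    "User-Agent"      => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1309.0 Safari/537.17",
    # Hint that UK English content is preferred; many sites ignore this
    # and geolocate purely by IP.
    "Accept-Language" => "en-GB,en;q=0.8"
  },
  # If the header is ignored, routing the request through a UK-based proxy
  # changes the IP the site sees, which is usually what drives the redirect.
  :http_proxyaddr => "uk-proxy.example.com",   # placeholder proxy
  :http_proxyport => 8080
)

puts response.code
HTTParty also accepts :follow_redirects => false, which makes it easy to inspect the redirect response itself and confirm what is sending you to the German homepage.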

Related

Carding attack on Magento 2 / Braintree

I am trying to fend off carding attacks on a Magento 2 store.
The attacker manually creates a shopping cart and from it is able to send repeated requests to Braintree and my store to test credit card numbers. Our security measures quickly detect when this behavior happens from a single IP address but have been much less effective when the attack is distributed.
Surprisingly, Magento 2 allows requests to come from multiple IP addresses even though they refer to a single session and shopping cart ID (note: the security settings to validate REMOTE_ADDR, HTTP_VIA, HTTP_X_FORWARDED_FOR, and HTTP_USER_AGENT are all enabled). I am trying to get Magento's attention on this problem, but in the meantime I am trying to find a workaround.
Take a look at the log below. As you can see, these are requests coming from different IP addresses, but they refer to the same shopping cart ID (oq2xk8h2h3ghvjrii93o in this case). I would like to create a mechanism that tracks the IP address used for each shopping cart ID and detects whether the IP address changes for an individual cart ID. If such a change happens, the IP addresses used for that shopping cart get banned, as well as any subsequent IP addresses that try to use the same shopping cart ID.
We already have in place: Cloudflare (free), fail2ban, and mod_security with the OWASP rules. We can leverage any of these.
209.127.191.180, 173.245.52.210, 127.0.0.1 - - [06/Aug/2020:06:05:48 -0400] foobar.com "POST /rest/foobar_view/V1/guest-carts/oq2xk8h2h3ghvjrii93o/payment-information HTTP/1.0" 400 689 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
185.164.56.185, 162.158.63.7, 127.0.0.1 - - [06/Aug/2020:06:06:01 -0400] foobar.com "POST /rest/foobar_view/V1/guest-carts/oq2xk8h2h3ghvjrii93o/payment-information HTTP/1.0" 400 689 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
45.95.99.226, 162.158.78.135, 127.0.0.1 - - [06/Aug/2020:06:06:15 -0400] foobar.com "POST /rest/foobar_view/V1/guest-carts/oq2xk8h2h3ghvjrii93o/payment-information HTTP/1.0" 400 689 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
193.8.127.117, 162.158.62.120, 127.0.0.1 - - [06/Aug/2020:06:06:27 -0400] foobar.com "POST /rest/foobar_view/V1/guest-carts/oq2xk8h2h3ghvjrii93o/payment-information HTTP/1.0" 400 689 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
My initial thought was to "watch" the log file, and whenever there is a request that matches the pattern, save the IP address, the shopping cart ID and the timestamp in a separate file. Then generate a separate log file that will contain just the IP addresses I want to ban, and then use fail2ban to read this log file and do the rest.
I have been able to create a custom filter/jail/action in fail2ban; the main challenge I have now is the part about "watching" the original server log and then creating my custom log of bad IP addresses.
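A minimal sketch of that watcher idea in Ruby is below (the paths, the combined-log format, and the ban-entry format are assumptions, and log rotation is not handled): it tails the access log, remembers which client IPs have used each guest-cart ID, and appends every IP on a cart to a separate ban log as soon as the cart is seen from a second IP.
require 'time'

ACCESS_LOG = "/var/www/vhosts/yourdomain.com/logs/access_ssl_log"   # assumed path
BAN_LOG    = "/var/log/carding_ban.log"                             # file for fail2ban to watch

# First field is the client IP; the cart ID sits in the guest-carts REST path.
CART_RE = %r{^(\d{1,3}(?:\.\d{1,3}){3}).*"POST /rest/\S+/guest-carts/(\w+)/payment-information}

carts  = Hash.new { |h, k| h[k] = [] }   # cart ID => client IPs seen so far
banned = []

File.open(ACCESS_LOG) do |log|
  log.seek(0, IO::SEEK_END)              # only watch new entries
  loop do
    line = log.gets
    unless line
      sleep 0.5                          # wait for the log to grow
      next
    end

    match = line.match(CART_RE)
    next unless match
    ip, cart = match[1], match[2]
    carts[cart] << ip unless carts[cart].include?(ip)

    # The same cart ID has now been seen from more than one IP:
    # write every IP that touched it to the ban log.
    next unless carts[cart].size > 1
    (carts[cart] - banned).each do |bad_ip|
      File.open(BAN_LOG, "a") { |f| f.puts "#{Time.now.utc.iso8601} BAN #{bad_ip} cart=#{cart}" }
      banned << bad_ip
    end
  end
end
A fail2ban jail pointed at that ban log then only needs a failregex along the lines of ^\S+ BAN <HOST> cart=\w+$.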
Thank you for your help!
You may want to see my fail2ban example here:
vi /etc/fail2ban/filter.d/restapi.conf
[Definition]
failregex = ^<HOST> - .* "POST.*HTTP.*" 400 .*$
vi /etc/fail2ban/jail.d/restapi.conf
[restapi]
enabled = true
port = http,https
filter = restapi
logpath = /var/www/vhosts/yourdomain.com/logs/access_ssl_log
bantime = 86400
findtime = 1200
maxretry = 5
service fail2ban restart
In short, fail2ban watches the log file itself, so there is no need for a separate watcher.
Unfortunately, the only thing that helps is reCaptcha.
Plain attacks from a single IP address or with a single cart ID are no longer used.
Recent attacks include:
random IP addresses from all over the world
random CartID, generated on Magento 2.3.x sites that are unprotected
So fail2ban will not actually help.
reCaptcha is included in Magento 2.4.x; if you are on 2.3.x you will need a separate plugin, e.g. https://swissuplabs.com/magento-google-recaptcha.html, which is a solid one.
If you can't do either of these, just disable guest carts and start working towards a reCaptcha installation.
If you use Cloudflare, you can put your site into "Under Attack Mode" - that puts a JavaScript challenge in front of the attackers and stops them.
I also recommend putting your payment gateway into authorize-only mode, so you do not get into trouble with your payment provider, since you won't need to refund the transactions that went through.

WP website taking over a minute to load on 000webhost.com was working fine

I have a site hosted on 000webhost.com. It had been working fine, but it now takes over a minute to load. I'm not seeing any errors, but I did notice the following:
Request URL: https://sustainablewestonma.000webhostapp.com/
Referrer Policy: no-referrer-when-downgrade
Provisional headers are shown
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.100 Mobile Safari/537.36
I'd appreciate it if someone could look at the site and suggest a course of action. The response on refresh is fine. Looking at the Network tab in the dev tools, it seems the initial request and a CSS stylesheet take the most time (over a minute); the rest seems to load as expected.

500 Internal server error received on python get request, the same url works in browser

I am trying to open and download PDFs using Python requests, based on URLs I get from an API. This works for many of the files, but for files stored at one specific site I get a 500 Internal Server Error response. In the response there is a simple HTML page with only the text: Not Authenticated.
When I paste the same URL into Chrome I get the PDF. However, I can see a "503 - Failed to load resource" error in the console, because it failed to load some icon. Could this be relevant somehow?
The URL also works when I run it in Postman with no headers at all.
I have seemingly the same issue as described in this question:
python requests http response 500 (site can be reached in browser)
However, the fix of adding a User-Agent to the request headers does not help. Could some other header data be required, and is there any way to check what request my Chrome browser sends?
Update: I logged the request Chrome is sending and copied the headers to my Python request. Still the same error. I have tried with or without the same cookie.
Here is my code:
import requests

headers = {'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
           'Accept-Encoding': 'gzip, deflate, br',
           'Accept-Language': 'nb,en-GB;q=0.9,en-US;q=0.8,en;q=0.7',
           'Connection': 'keep-alive',
           'Cookie': 'JSESSIONID=a95b392a6d468e2188e73d2c296b; NSC_FS-NL-CET-XFC-IUUQ-8081=ffffffff3d9c37c545525d5f4f58455e445a4a4229a1; JSESSIONID=7b1dd39854eee82b2db41225150e',
           'Host': url.split('/')[2],  # url is the PDF link returned by the API
           'Upgrade-Insecure-Requests': '1',
           'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36'}

response = requests.get(url, headers=headers, verify=True)
I use Python 3.6.3
I found that I only get the error when I run the GET through requests. So I changed to using: urllib.request.urlopen(url)
More info about this approach here: Download file from web in Python 3

Open Uri not opening url even though browser opens it

I am trying to open a URL with open-uri. When I open it from my browser (Safari), it loads in about a second; however, when I try to open it with open-uri, it never works and raises Net::ReadTimeout after a minute. I have tried increasing the read timeout, but that doesn't help.
open(url).read
This is the code I use to open the URL, and it never succeeds.
Looks like they're protecting their API against bare-bones requests that don't send browser-like headers.
`curl 'http://stats.nba.com/stats/commonallplayers?LeagueID=00&Season=2016-17&IsOnlyCurrentSeason=1' -H 'Accept-Encoding: gzip, deflate, sdch' -H 'Accept-Language: en-US,en;q=0.8,ru;q=0.6' -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36'`
will do it just fine.
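If you would rather stay with open-uri than shell out to curl, the same headers can be passed directly: open-uri forwards string-keyed options as request headers. A minimal sketch (Accept-Encoding is deliberately omitted so the body arrives uncompressed, and a :read_timeout is added so a blocked request fails fast):
require 'open-uri'

url = 'http://stats.nba.com/stats/commonallplayers?LeagueID=00&Season=2016-17&IsOnlyCurrentSeason=1'

# String keys become request headers; symbol keys are open-uri options.
# On Rubies older than 2.5, use plain open(url, ...) instead of URI.open.
body = URI.open(
  url,
  'Accept-Language' => 'en-US,en;q=0.8,ru;q=0.6',
  'User-Agent'      => 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36',
  :read_timeout     => 10          # fail fast instead of hanging for a minute
).read

puts body[0, 200]                  # quick sanity check on the JSON payload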

Safari for Windows Intranet Post Data missing

This problem has been puzzling me all morning; essentially, I've reduced it down to a simple PHP page that generates a form and posts the results back to itself, to narrow down the issue.
I've tested my problem on a couple of systems, XP and Win7, both fully patched. When submitting a form using a POST action on my company Intranet, Safari doesn't seem to receive any POST data; the headers suggest it is being sent, but the page doesn't receive it.
It works fine in Opera, Chrome, IE8/9, FF 6.0.1, and Safari on a Mac, but not in Safari 5.1 on Windows. I think it may be related to our Intranet NTLM authentication, but I'm somewhat stumped. Hopefully it's a very daft/easy problem to solve.
Here are some headers from Safari Web inspector when posting to the Intranet:
Request URL:http://intranet/mis/basictest.php
Request Method:POST
Status Code:200 OK
Request Headers
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Content-Type:application/x-www-form-urlencoded
Origin:http://intranet
Referer:http://intranet/mis/basictest.php
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50
Form Data
test:1
submit:Submit
Response Headers
Connection:close
Content-Type:text/html
Date:Thu, 13 Oct 2011 10:55:55 GMT
Server:Microsoft-IIS/6.0
X-Powered-By:PHP/5.2.2, ASP.NET
