How to do a proper handshake with the Coinbase WebSocket for market data?

What is the first HTTP request I should send for a proper WebSocket handshake?
I am sending GET / HTTP/1.1 along with some other standard WebSocket headers, but I get a 400 Bad Request. See below:
$ telnet localhost 8889
Trying 127.0.0.1...
Connected to localhost (127.0.0.1).
Escape character is '^]'.
GET / HTTP/1.1
Host: ws-feed.exchange.coinbase.com
Connection: Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
Origin: http://www.test.com
HTTP/1.1 400 Bad Request
Server: cloudflare-nginx
Date: Thu, 10 Sep 2015 19:25:54 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=d3fe870c84fc991b0f2f6fc2c936820471441913154; expires=Fri, 09-Sep-16 19:25:54 GMT; path=/; domain=.coinbase.com; HttpOnly
Strict-Transport-Security: max-age=15552000; includeSubDomains; preload
X-Content-Type-Options: nosniff
CF-RAY: 223d857c6bae01ee-EWR
0

The Sec-WebSocket-Version and Sec-WebSocket-Key headers are missing. RFC 6455 requires both in the client's upgrade request, so without them the server rejects the handshake with 400 Bad Request.
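For comparison, here is a complete client handshake sent by hand over TLS with openssl s_client (a sketch: the key below is the sample nonce from RFC 6455, and a real client must send 16 fresh random bytes, base64-encoded):

$ openssl s_client -crlf -connect ws-feed.exchange.coinbase.com:443 -servername ws-feed.exchange.coinbase.com
GET / HTTP/1.1
Host: ws-feed.exchange.coinbase.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

Note the blank line terminating the request. A conforming server answers with HTTP/1.1 101 Switching Protocols and a Sec-WebSocket-Accept header derived from the key. Also note the feed is reached over TLS on port 443, so plain telnet only works through a local TLS tunnel such as the one apparently listening on port 8889 above. Once upgraded, the Coinbase feed expects a JSON subscribe message before it streams market data.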

Related

Certbot unauthorized and connection errors

I have a Spring Boot application on Google Cloud (CentOS 7). I want to install an SSL certificate via Let's Encrypt and Certbot. When I run certbot --apache -d mydomain.zone I receive an error:
My domain is registered on Namecheap. My A records on Google Cloud:
I also set the Google Cloud nameservers in Namecheap, following this tutorial: https://www.wpmentor.com/setup-domain-google-cloud-platform/
Can you tell me where the issue is? I also wonder whether there is an issue with the Java code in my app. For example, sometimes while accessing the index page, error_page is called. When I have this method in my controller:
@RequestMapping(value = "/error_page", method = RequestMethod.GET)
public String homeError(Model model)
{
    return "/error_page";
}
I get a different Certbot error:
but when I comment out or remove the controller method for the error page, I receive this error:
Could it be an application bug, or an issue with Apache?
EDIT:
I tried turning off Tomcat. Now I receive this error:
Note: my Apache forwards port 80 to 8080; I don't know whether that could cause an issue:
iptables -A PREROUTING -t nat -p tcp --dport 80 -j REDIRECT --to-port 8080
Here is the output of curl -I -L http://mydomain/.well-known/acme-challenge/zySNHSFB-qL95Ubx4jcIvuHPiiNbwkphE55kFuqP8jM:
HTTP/1.1 302
Vary: Origin
Vary: Access-Control-Request-Method
Vary: Access-Control-Request-Headers
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
X-Frame-Options: DENY
Location: /error_page
Content-Language: en-US
Content-Length: 0
Date: Tue, 15 Feb 2022 20:01:50 GMT
HTTP/1.1 302
Vary: Origin
Vary: Access-Control-Request-Method
Vary: Access-Control-Request-Headers
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
X-Frame-Options: DENY
Location: /error_page
Content-Language: en-US
Content-Length: 0
Date: Tue, 15 Feb 2022 20:01:50 GMT
curl: (47) Maximum (50) redirects followed
I needed to turn off the Apache web server to free port 80, and I deleted the iptables rule that forwards traffic from port 80 to 8080. Now Certbot works.
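For reference, a sketch of that cleanup on CentOS 7 (assuming the default httpd service name; certonly --standalone is one way to let Certbot serve the challenge on port 80 itself):

# stop Apache so port 80 is free (the CentOS 7 Apache service is httpd)
sudo systemctl stop httpd
# remove the NAT redirect by repeating the rule spec with -D instead of -A
sudo iptables -t nat -D PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080
# run Certbot with its own temporary web server bound to port 80
sudo certbot certonly --standalone -d mydomain.zone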

wget gives 403 on accessible files

First-time poster with a bizarre issue. I usually install software through conda, but from one moment to the next I stopped being able to use conda install, because conda gets a 403 when it tries to access some configuration files. When I try to download those files with wget --spider --debug https://conda.anaconda.org/anaconda/noarch/current_repodata.json, I get the same 403 error.
DEBUG output created by Wget 1.19.4 on linux-gnu.
Reading HSTS entries from /home/jsequeira/.wget-hsts
URI encoding = ‘UTF-8’
Converted file name 'current_repodata.json' (UTF-8) -> 'current_repodata.json' (UTF-8)
Spider mode enabled. Check if remote file exists.
--2020-07-30 11:25:59-- https://conda.anaconda.org/anaconda/noarch/current_repodata.json
Resolving conda.anaconda.org (conda.anaconda.org)... 104.17.92.24, 104.17.93.24, 2606:4700::6811:5d18, ...
Caching conda.anaconda.org => 104.17.92.24 104.17.93.24 2606:4700::6811:5d18 2606:4700::6811:5c18
Connecting to conda.anaconda.org (conda.anaconda.org)|104.17.92.24|:443... connected.
Created socket 5.
Releasing 0x000056545deb1850 (new refcount 1).
Initiating SSL handshake.
Handshake successful; connected socket 5 to SSL handle 0x000056545deb2700
certificate:
subject: CN=anaconda.org,O=Cloudflare\\, Inc.,L=San Francisco,ST=CA,C=US
issuer: CN=Cloudflare Inc ECC CA-3,O=Cloudflare\\, Inc.,C=US
X509 certificate successfully verified and matches host conda.anaconda.org
---request begin---
HEAD /anaconda/noarch/current_repodata.json HTTP/1.1
User-Agent: Wget/1.19.4 (linux-gnu)
Accept: */*
Accept-Encoding: identity
Host: conda.anaconda.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.1 403 Forbidden
Date: Thu, 30 Jul 2020 11:25:59 GMT
Content-Type: text/html; charset=UTF-8
Connection: close
CF-Chl-Bypass: 1
Set-Cookie: __cfduid=d3cd3a67d3926551371d8ffe5a840b04f1596108359; expires=Sat, 29-Aug-20 11:25:59 GMT; path=/; domain=.anaconda.org; HttpOnly; SameSite=Lax
Cache-Control: private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Expires: Thu, 01 Jan 1970 00:00:01 GMT
X-Frame-Options: SAMEORIGIN
cf-request-id: 044111dd9600005d4732b73200000001
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Vary: Accept-Encoding
Server: cloudflare
CF-RAY: 5baeb8dc2ba65d47-LIS
---response end---
403 Forbidden
cdm: 1
Stored cookie anaconda.org -1 (ANY) / <permanent> <insecure> [expiry 2020-08-29 11:25:59] __cfduid d3cd3a67d3926551371d8ffe5a840b04f1596108359
URI content encoding = ‘UTF-8’
Closed 5/SSL 0x000056545deb2700
Remote file does not exist -- broken link!!!
These files are accessible through the browser, and were always accessible with wget and conda until yesterday, when I was installing some tools unrelated to these network accesses. How can wget fail to download them?
So this was fixed by reinstalling apt-get; some configuration file there must have been messed up.
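One detail worth noting from the transcript: the CF-Chl-Bypass: 1 response header means Cloudflare answered with a browser challenge page instead of the file, i.e. the client itself was being flagged. A quick way to test whether the block keys on wget's User-Agent (the UA string below is just an illustrative browser value):

# retry with a browser-like User-Agent instead of Wget/1.19.4
$ wget --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:79.0) Gecko/20100101 Firefox/79.0" \
    https://conda.anaconda.org/anaconda/noarch/current_repodata.json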

Chrome not setting cookie on AJAX POST

I do an AJAX POST to my web service, and the response sets two cookies, but Chrome does not store them. Safari and Firefox do, however.
Here's the request:
POST /api/login HTTP/1.1
Host: 0.0.0.0:8080
Connection: keep-alive
Content-Length: 50
accept: application/json
Origin: http://0.0.0.0:8080
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36
content-type: application/json
Referer: http://0.0.0.0:8080/form
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8,nl;q=0.6,de;q=0.4,fr;q=0.2,pl;q=0.2
Response:
HTTP/1.1 200 OK
X-Powered-By: Express
Vary: X-HTTP-Method-Override, Accept-Encoding
x-frame-options: sameorigin
set-cookie: keystone.uid=s%3A55efe88923f753865f7a0985%3Ac5aT64aih9lxXi%2BNiSMr1rUJW4kzWyyNUforvUOrckk.JovuV%2FqeoQ32PiuyNPAZ7JcbIxXBcBvj%2FWFp8vf3SQQ; Path=/; HttpOnly
set-cookie: keystone.sid=s%3ADcv5el-TjLRkOSH9vNbvxQoOai-SQj-3.ZTfPFwEZp5mdVHSDZTukO%2FnrDnSpGU3OMW3tQu%2FSz7U; Path=/; HttpOnly
Content-Type: application/json; charset=utf-8
Content-Length: 224
ETag: W/"e0-B6OeRPdDEP0WPVdlZHqarA"
Date: Fri, 06 Nov 2015 14:39:37 GMT
Connection: keep-alive
I'm out of ideas. This doesn't work with a fully qualified domain name on port 80 either.
Found the solution:
You have to send the request with credentials (XMLHttpRequest.withCredentials = true, or credentials: 'include' for WHATWG fetch).
Even though this seems pointless, since you're logging in and don't yet have any cookies (or have invalid ones), it is what makes Chrome store the cookies from the response. ¯\_(ツ)_/¯
As pointed out by @Anne, the XMLHttpRequest specification actually requires user agents to disregard returned cookies unless withCredentials is set: http://www.w3.org/TR/XMLHttpRequest/#the-withcredentials-attribute
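To confirm the server side is fine independent of the browser's cookie rules, you can replay the login with curl and watch the cookies land in a jar (the JSON body below is a stand-in for the real credentials):

$ curl -c cookies.txt -H 'Content-Type: application/json' \
    -d '{"email":"user@example.com","password":"secret"}' \
    http://0.0.0.0:8080/api/login
$ cat cookies.txt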

Parse Security: able to see request/response in plain-text via proxy server

I'm attempting to hack the Parse SDK, and it turns out we are able to see requests and responses in plain text via a proxy server placed between the app and Parse. I assumed the data was encrypted, but a malicious user can see our requests and modify them to pull out essentially all of our user information.
Does anyone have any ideas on this?
Here is an example of a custom request and response via Proxy:
POST /1/classes/_User HTTP/1.1
Host: api.parse.com
Content-Type: application/json; charset=utf8
Cookie: _parse_session=---
Accept: */*
Proxy-Connection: keep-alive
X-Parse-Application-Id: ---
X-Parse-Client-Key: ---
X-Parse-Installation-Id: ---
Accept-Encoding: gzip, deflate
X-Parse-OS-Version: 8.2 (12D508)
Accept-Language: en-us
X-Parse-Client-Version: i1.6.5
Content-Length: 51
Connection: keep-alive
X-Parse-App-Build-Version: 11
X-Parse-App-Display-Version: 1.0.0
{"where":{"email":"joe#joe.com"},"_method":"GET"}
HTTP/1.1 200 OK
Access-Control-Allow-Methods: *
Access-Control-Allow-Origin: *
Content-Type: application/json; charset=utf-8
Date: Fri, 10 Apr 2015 01:02:55 GMT
Server: nginx/1.6.0
X-Parse-Platform: G1
X-Runtime: 0.013113
Content-Length: 246
Connection: keep-alive
{"results":[{"company":"","createdAt":"2015-04-10T01:02:35.670Z","discoverable":true,"email":"joe#joe.com","firstName":"Joe","lastName":"Smith","objectId":"yPTx1kyHei","title":"","updatedAt":"2015-04-10T01:02:35.670Z","username":"joe#joe.com"}]}

Firefox sending two get request for a website

Clicking a link results in two requests to the server for the same page. I installed the Live HTTP Headers extension and inspected the headers, but I can't figure out why the second request is sent.
http://example.com/schedule?delete=290376
GET /schedule?delete=290376 HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20110207 Firefox/3.6.13
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://example.com/schedule
Cookie: Code=XXX; CodeHash=XXXXX
HTTP/1.1 200 OK
Date: Tue, 01 Mar 2011 22:09:51 GMT
Server: Apache
X-Powered-By: PHP/5.2.17
Set-Cookie: Code=XXXX; expires=Wed, 02-Mar-2011 00:09:52 GMT; path=/
Set-Cookie: CodeHash=XXXX; expires=Wed, 02-Mar-2011 00:09:52 GMT; path=/
Keep-Alive: timeout=2, max=200
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html
----------------------------------------------------------
http://example.com/schedule?delete=290376
GET /schedule?delete=290376 HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20110207 Firefox/3.6.13
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://example.com/schedule
Cookie: Code=XXXX; CodeHash=XXXXX
HTTP/1.1 302 Moved Temporarily
Date: Tue, 01 Mar 2011 22:09:55 GMT
Server: Apache
X-Powered-By: PHP/5.2.17
Set-Cookie: Code=XXX; expires=Wed, 02-Mar-2011 00:09:55 GMT; path=/
Set-Cookie: CodeHash=XXX; expires=Wed, 02-Mar-2011 00:09:55 GMT; path=/
Location: http://example.org/schedule?errors=5
Keep-Alive: timeout=2, max=200
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html
----------------------------------------------------------
In case you didn't find your solution:
I have stumbled on the same issue, and it seems to be related to the page encoding. If Firefox downloads a page containing invalid characters (for example, UTF-8 characters in a page whose Content-Type header declares a different encoding), it downloads the page a second time and re-parses it in the encoding it guessed from the invalid characters it found the first time.
So make sure your page either returns the correct Content-Type header, or contains a meta http-equiv tag with the correct encoding.
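That matches the transcript above: both responses declare Content-Type: text/html with no charset. You can confirm what the server actually sends with a quick header check (example.com stands in for the real host, as in the question):

$ curl -sI http://example.com/schedule | grep -i '^content-type'
Content-Type: text/html

If no charset shows up, declare one explicitly, e.g. Content-Type: text/html; charset=utf-8.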
You don't happen to be using Firefox with the Web Developer toolbar installed and page validation set to display, do you?
I'm guessing in the dark about your environment here, but my team and I have been able to demonstrate that having that toolbar installed in Firefox, with page validation set to display, actually duplicates the POSTs and GETs, because it sends the same page data to the validation service.
