Is Ruby's net/http request visible to user?

I'm using Ruby's net/http library to fetch HTTP pages, for example:
Net::HTTP.get URI.parse(uri)
Is this visible to the user somehow? I mean, can the user use Firebug (for example) to obtain the URI, or is this only handled by and visible to the server?

No, Net::HTTP requests are made from the server running Ruby. The user cannot monitor those requests unless they have access to the server or the server's network.
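To make that concrete, here is a minimal sketch (the upstream URL and helper method name are just placeholders) of where such a request lives:

require 'net/http'
require 'uri'

# This runs on your server, not in the user's browser. The browser only
# receives whatever your server sends back, so client-side tools like
# Firebug or the browser devtools never see this upstream URI.
def fetch_upstream(uri_string)
  Net::HTTP.get(URI.parse(uri_string))
end

body = fetch_upstream('https://example.com/data')  # hypothetical upstream URL
# ...render or embed `body` in the response you return to the user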

Related

How can I GET an API with Ruby if it's behind Cloudflare's anti-DDoS protection

I was trying to scrape this API, but when I used Net::HTTP.get, it returned the Cloudflare page asking me to wait for 5 seconds while it checks my browser.
I looked it up, and there's a module for Python and for Node.js, but none for Ruby. Is this possible with maybe an argument to Net::HTTP or using curl?
You could try specifying a user agent, perhaps:
uri = URI.parse(url)
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = (uri.scheme == 'https')
response = http.request_get(uri.request_uri, { 'User-Agent' => 'whatever' })

What is the difference between cookie and cookiejar?

Today I came across the term "cookiejar" (package net/http/cookiejar). I tried to gather some information about it, but nothing intelligible came out. I know that a cookie is a key/value pair that the server sends to a client, e.g. Set-Cookie: foo=10; the browser stores it locally, and then on each subsequent request the browser sends these cookies back to the server, e.g. Cookie: foo=10.
OK, but what about cookiejar? What is it and what does it look like?
As you described in your question, cookies are managed by browsers (HTTP clients); they let servers store information on the client's computer, which the browser then sends back automatically on subsequent requests.
If your application acts as a client (you connect to remote HTTP servers using the net/http package), then there is no browser to handle / manage the cookies for you. By this I mean storing/remembering cookies that arrive in Set-Cookie: response headers and attaching them to subsequent outgoing requests made to the same host/domain. Cookies also have an expiration date, which you would have to check before deciding to include them in outgoing requests.
The http.Client type, however, allows you to set a value of type http.CookieJar, and if you do so, you get automatic cookie management that would otherwise not exist or that you would have to implement yourself. This lets you make multiple requests with the net/http package that the server will see as part of the same session, just as if they were made by a real browser, since HTTP sessions (the session IDs) are often maintained using cookies.
The package net/http/cookiejar is a CookieJar implementation which you can use out of the box. Note that this implementation is in-memory only, which means that if you restart your application, the cookies will be lost.
So basically an HTTP cookie is a small piece of data sent from a website and stored in a user's web browser while the user is browsing that website.
CookieJar is a Go interface for a simple cookie manager (it manages cookies across HTTP request and response headers), and net/http/cookiejar provides an implementation of that interface.
In general it is a datastore where an application (browser or not) puts the cookies it uses during requests and responses. So it is really a jar for cookies.
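To make the concept concrete in Ruby terms (since this thread is otherwise about Ruby's net/http), here is a minimal, hand-rolled sketch of what a cookie jar does: remember the Set-Cookie values a server sends and attach them to later requests to the same host. The URLs are placeholders, and a real jar (such as Go's net/http/cookiejar or Ruby's http-cookie gem) also handles expiry and domain/path matching for you.

require 'net/http'
require 'uri'

jar = {}  # an over-simplified cookie jar: name => value

login_uri = URI.parse('https://example.com/login')   # hypothetical URL
res = Net::HTTP.get_response(login_uri)

# Remember every cookie the server set...
(res.get_fields('set-cookie') || []).each do |header|
  name, value = header.split(';').first.split('=', 2)
  jar[name] = value
end

# ...and send them back on the next request to the same host.
profile_uri = URI.parse('https://example.com/profile')
req = Net::HTTP::Get.new(profile_uri)
req['Cookie'] = jar.map { |k, v| "#{k}=#{v}" }.join('; ')
Net::HTTP.start(profile_uri.host, profile_uri.port, use_ssl: true) { |http| http.request(req) }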

Can't figure out how to test proxy with Soundcloud API

I am trying to use my proxy with the Soundcloud API. The format is
client = soundcloud.Client(client_id=client_id,
                           client_secret=client_secret,
                           username=username,
                           password=password,
                           proxies=proxies)
However, when I pass something into the proxies variable like
proxies = {'http': 'notavalidip'}
the client is still able to log in and function normally. Why is this happening and how can I test that when I pass an actual valid proxy it will actually be used? I believe this API uses the Python requests library, if that helps.
All those options eventually get handed down to make_request and passed as kwargs into request_func, which is indeed backed by the requests library.
Your invalid proxy goes unnoticed only because it has the wrong scheme. All connections to Soundcloud are made via https by default, not http. This means you have no proxy set up at all, since your proxies dictionary has no https key.
See here how the proxy is simply set to None because the dictionary didn't have the required scheme.
After modifying your proxies variable to use https instead of http, I got an exception (ProxyError('Cannot connect to proxy.')), so there are no silent failures.
Hope this makes sense.

Switching from https to http

Is it correct to switch from HTTPS to HTTP (say, by clicking a link whose href contains a full HTTP path)? I'd appreciate it if someone could let me know what the implications are in such cases.
Thanks.
This can actually be a security risk; it depends on your situation.
If you create a session in the HTTPS part and then visit an HTTP page on the same domain, the session cookie will be sent along with the insecure HTTP request (in plaintext). This makes your site vulnerable to session hijacking: an attacker can use this session ID and gets the same privileges as the logged-in user.
In PHP you can prevent this behaviour by calling the session_set_cookie_params() function and setting the $secure parameter to true. This tells the browser to send the cookie over HTTPS only.
The browser will load a page from a non-SSL source. No real implications as far as security is concerned.
Switching from HTTPS to HTTP is entirely correct if that is what the link is intended to do.
Implications include losing the encrypted communication link between client and server that HTTPS provides.
SSL encryption in HTTPS is used to protect transmitted information. For example, use HTTPS on the login page; after logging in, you can redirect to HTTP.
So switching between HTTPS and HTTP is quite all right.

How do I simulate a web session and store the received cookie using Ruby?

I am testing a messaging application that uses single sign-on (SSO). I need to simulate a user connecting to WAM using SSO, then get the cookie from the server and store it for further communication. Is anyone familiar with how this might be done using Ruby?
Mechanize does that automatically. There are also other HTTP client gems that may support cookies, e.g. this httparty example.
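For example, here is a minimal Mechanize sketch; the login URL and form field names below are hypothetical and must be adjusted to your actual WAM/SSO login page:

require 'mechanize'   # gem install mechanize

agent = Mechanize.new
login_page = agent.get('https://sso.example.com/login')   # hypothetical URL

form = login_page.forms.first
form['username'] = 'user'     # hypothetical field names
form['password'] = 'secret'
agent.submit(form)

# Mechanize keeps any Set-Cookie headers in its cookie jar and sends them
# back automatically on later requests; you can also persist the jar:
agent.cookie_jar.save('cookies.yml', session: true)

# Later, restore the jar and keep talking to the server in the same session:
agent.cookie_jar.load('cookies.yml')
agent.get('https://wam.example.com/messages')   # stored cookies are sent along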
If your intention or need is to do this more 'organically', as it were, then you could use Watir to drive a browser and retrieve the cookie (assuming it's stored on disk and not a session cookie) from wherever the browser stores it, using normal file I/O. If it's a session cookie that's a bit trickier, but you can usually see them with developer tools like Firebug.
If you want something operating at the HTTP level that simulates a browser, see the other answer provided by MichaelW.
