Client bandwidth usage with Ruby's net/http - ruby

I am trying to track the bandwidth usage of individual requests in Ruby, to see how my network usage is split between different API calls.
Nothing I can find in net/http or Ruby's socket classes (TCPSocket, et al.) seems to offer a decent way to do this without a lot of monkey patching.
I have found a number of useful tools on Linux for doing this, but none of them give me the granularity to inspect inside the HTTP requests at the header level (so I could figure out which URL we are requesting). The tools I am using are vnStat and ipfm, which are great for system-wide bandwidth or host/network monitoring.
Ideally I would like to do something within the Ruby code to track the data sent and received. I think if I could just get the raw headers and add their length to the body length, for both sending and receiving, that would be Good Enough™.
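For example, something along these lines is roughly what I have in mind (a rough sketch with a placeholder URL; it estimates the wire size from the serialized headers and body rather than measuring it, and ignores TCP/TLS framing):

```ruby
require 'net/http'
require 'uri'

# Approximate per-request byte counts by serializing the headers the way they
# would appear on the wire and adding the body length. This is an estimate,
# not an exact on-the-wire measurement.
def approx_bytes(first_line, headers, body)
  header_bytes = headers.map { |k, v| "#{k}: #{v}\r\n" }.join.bytesize
  first_line.bytesize + header_bytes + 2 + (body ? body.bytesize : 0)
end

uri = URI('http://example.com/api/v1/things')  # placeholder URL
req = Net::HTTP::Get.new(uri)

res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }

sent     = approx_bytes("GET #{uri.request_uri} HTTP/1.1\r\n",
                        req.each_header.to_a, req.body)
received = approx_bytes("HTTP/#{res.http_version} #{res.code} #{res.message}\r\n",
                        res.each_header.to_a, res.body)

puts "#{uri} sent~#{sent}B received~#{received}B"
```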

Can you use New Relic for Ruby? It has great visualizations/graphs for network usage and API calls.

It seems like it would be pretty easy to write a middleware to track this using Faraday.
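A rough sketch of what such a middleware could look like (the class name and URLs are placeholders, and header sizes are only estimated by serializing the header hashes, not read off the wire):

```ruby
require 'faraday'

# Hypothetical middleware that logs an approximate byte count per request.
# TCP/TLS framing is not counted.
class BandwidthTracker < Faraday::Middleware
  def call(env)
    sent = header_bytes(env.request_headers) + body_bytes(env.body)
    @app.call(env).on_complete do |response_env|
      received = header_bytes(response_env.response_headers) +
                 body_bytes(response_env.body)
      puts "#{env.url} sent=#{sent}B received=#{received}B"
    end
  end

  private

  def header_bytes(headers)
    (headers || {}).map { |k, v| "#{k}: #{v}\r\n" }.join.bytesize
  end

  def body_bytes(body)
    body.respond_to?(:bytesize) ? body.bytesize : 0
  end
end

conn = Faraday.new(url: 'http://example.com') do |f|  # placeholder URL
  f.use BandwidthTracker
  f.adapter Faraday.default_adapter
end
conn.get('/api/v1/things')  # placeholder path
```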

Related

How can I test/send multiple (fake) ajax-requests at once to a (node.js) server?

At a certain point your (node.js) app works well with single requests, and you would like to see what happens if fifty people use it at the same time. What happens to memory usage? What happens to the overall speed of the responses?
I reckon this kind of testing is done a lot, so I was thinking there might be a relatively easy helper program for that.
By relatively easy I mean something as convenient as the POSTman REST client is for single request/response testing.
What is your recommended (or favorite) method of testing this?
We use http://jmeter.apache.org/ ; it's free and powerful, and you can set up test use cases and run them.
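If you just want a quick-and-dirty smoke test before setting up JMeter, a crude sketch like this (Ruby here purely to illustrate the idea; the URL is a placeholder) fires fifty concurrent requests and reports basic numbers:

```ruby
require 'net/http'
require 'uri'

# Crude concurrency smoke test: fire 50 simultaneous GETs against the app
# under test and report status codes and average latency. JMeter does far
# more (ramp-up, assertions, reports); this only illustrates the idea.
uri = URI('http://localhost:3000/')   # placeholder for your app under test

threads = Array.new(50) do
  Thread.new do
    started = Time.now
    res = Net::HTTP.get_response(uri)
    [res.code, Time.now - started]
  end
end

results = threads.map(&:value)
puts "status codes: #{results.map(&:first).tally}"
puts "avg latency:  #{(results.sum { |_, t| t } / results.size).round(3)}s"
```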

Using the geocoder gem to make more requests

I am working with the geocoder gem and would like to process a larger number of requests from one IP. By default the Google API allows only 2,500 requests per day.
Please share your thoughts on how I can make more requests than the limit.
As stated before: using only the Google API, the only way around the limitation is to pay for it. Or, in a shadier way, make the requests from more than one IP/API key, which I would not recommend.
But to stay on the safe side I would suggest mixing services, since there are a few more geocoding APIs out there, for free.
With the right gem, mixing them is also not a big issue:
http://www.rubygeocoder.com/
It supports a couple of them with a nice interface. You would pretty much only have to add some rate-limiting counters to make sure you stay within the limits of each provider.
Or go the heavy route of implementing your own geocoding, for example with your own running OpenStreetMap database. The data can be downloaded here: http://wiki.openstreetmap.org/wiki/Planet.osm#Worldwide_data
Which way is best depends on what your actual requirements are and what resources you have available.
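A minimal sketch of the mixing idea, assuming the geocoder gem's Geocoder.configure/Geocoder.search API (the providers and limit numbers here are just examples, and the counters would need to be persisted and reset daily in a real app):

```ruby
require 'geocoder'

# Example daily limits per provider; check each provider's current terms.
LIMITS = { google: 2500, nominatim: 2000 }

# Use the first provider that still has quota left today.
def next_provider(counts)
  LIMITS.keys.find { |provider| counts[provider] < LIMITS[provider] }
end

def geocode(address, counts)
  provider = next_provider(counts) or raise 'all daily limits exhausted'
  counts[provider] += 1
  Geocoder.configure(lookup: provider)   # switch the lookup service
  Geocoder.search(address).first
end

counts = Hash.new(0)                     # in reality: persist and reset daily
result = geocode('350 Fifth Avenue, New York', counts)
puts result.coordinates.inspect if result
```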

Sending messages between computers

I'd like to start investigating client/server communication. I've started to look at Distributed Objects and a tad at CFNetwork. Let's just say I'm looking for something more my speed (which is slower).
I'd like to be able to send a message from one computer to another, possibly carrying a string or some other type of data. I'm thinking of building a simple student response system where one computer is acting as a server and the clients are connecting and sending data to it.
I'm looking for resources that might help me out as well as suggestions of where to start understanding the concepts involved. I've been teaching myself Objective-C and am a relative newbie to programming, so I know I have holes in my understanding.
"Sockets" is the canonical answer.
If you're interested, here's a great introduction to socket programming (biased toward C, but still very informative):
Beej's Guide to Network Programming
Another really simple way of doing it is to let the server host a local HTTP server (inside itself) and have the clients simply make HTTP requests. That way you let the HTTP layer do all the fancy socket work. It is simpler, with more overhead, but may be suitable for your case. It is also a lot easier to debug, since you can use your browser to test the connection. There are many ways of implementing an HTTP server in Cocoa; I can't remember which one I've used, but a quick Google search pointed me at this one, for example.
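To illustrate the idea (sketched in Ruby rather than Cocoa, purely to keep the example short; in a Cocoa app you would use one of the embeddable HTTP server libraries instead): the server embeds an HTTP endpoint and the clients just POST their messages to it.

```ruby
require 'webrick'
require 'net/http'
require 'uri'

# The "server" app hosts an HTTP endpoint that collects student responses.
server = WEBrick::HTTPServer.new(Port: 8080, AccessLog: [],
                                 Logger: WEBrick::Log.new(File::NULL))
server.mount_proc '/answer' do |req, res|
  puts "received: #{req.body}"   # e.g. a student's answer string
  res.status = 200
end
Thread.new { server.start }
sleep 0.2                        # give the accept loop a moment to spin up

# A client just makes an ordinary HTTP request; you can also poke the server
# with a browser or curl while debugging.
Net::HTTP.post(URI('http://localhost:8080/answer'), 'student-1: B')

server.shutdown
```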

Best ruby binding/gem for curl/libcurl

I want to use the curl tool through Ruby. So far I have invoked curl from the command line and then parsed the data dumped to a file. However, I would like to use it from within my application; that would give me better control over the handling, etc.
There are a few gems out there, http://curb.rubyforge.org/ and http://curl-multi.rubyforge.org/, but it's not clear which one is the best to use. I have the following criteria for the decision:
Stability and reliability of the library
Comprehensive support for the underlying curl features. (I will be relying heavily on data posting, forging HTTP headers, redirects and multi-threaded requests.)
It would be great to get some feedback.
Thanks for your help.
-Pulkit
I highly recommend Typhoeus. It relies on libcurl and allows for all sorts of parallel and async possibilities. It offers SSL, stubbing, follows redirects, allows custom headers, does true parallel requests for blazing speed, and generally has yet to let me down. Also, it is well maintained; at the moment, the last commit was 2 days ago!
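For reference, a short sketch of how Typhoeus covers the criteria above (URLs are placeholders): POSTed data, custom headers, redirect following, and parallel requests via Hydra.

```ruby
require 'typhoeus'

# POST data with custom headers, follow redirects, and run requests in
# parallel through a Hydra queue. URLs are placeholders.
hydra = Typhoeus::Hydra.new(max_concurrency: 10)

requests = %w[http://example.com/a http://example.com/b].map do |url|
  req = Typhoeus::Request.new(
    url,
    method: :post,
    body: { key: 'value' },
    headers: { 'User-Agent' => 'my-app/1.0', 'X-Custom' => 'yes' },
    followlocation: true
  )
  hydra.queue(req)
  req
end

hydra.run  # blocks until all queued requests have finished

requests.each do |req|
  puts "#{req.base_url} -> #{req.response.code} (#{req.response.total_time.round(3)}s)"
end
```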

Does some optimized web servers for single page application exists?

When we build a single-page application, the web server basically does only one thing: it returns data when the client asks for it (in JSON format, for example). So any server-side language (PHP, RoR) or tool (Apache, nginx) can do it.
But is there a language/tool that works better for this sort of single-page application, which generates lots of small requests that need low latency and sometimes a persistent connection (for realtime and push features)?
SocketStream seems like it matches your requirements quite well: "A phenomenally fast real-time web framework for Node.js ... dedicated to creating single-page real time websites."
SocketStream uses WebSockets to get lowest latency for the real-time portion. There are several examples on the site to build from.
If you want lots of small requests in realtime, with pushed data, you should take a look at socket-type connections.
Check out Node.js with Socket.io.
If you really want to optimize for speed, you could try implementing a custom HTTP server that just fits your needs, for example with the help of Netty.
It's blazingly fast and has examples for HTTP and WebSocket servers included.
Also, taking a look at G-WAN may be worthwhile (though I have not tried that one yet).
http://en.wikipedia.org/wiki/Nginx could be appropriate
