What HTTP traffic monitor would you recommend for Windows? [closed]

I need a sniffer to test the network traffic of applications I develop for Windows and Facebook.
Basic requirements:
display request and response
display HTTP headers
display the time it took to complete HTTP request
Now I'm using HTTP Analyzer.
It's a very good tool, but it terminates with an error after running for 10-15 minutes on Vista.

Wireshark if you want to see everything going on in the network.
Fiddler if you want to just monitor HTTP/s traffic.
Live HTTP Headers if you're in Firefox and want a quick plugin just to see the headers.
Also, FireBug can get you that information and provides a nice interface when you're working on a single page during development. I've used it to monitor AJAX transactions.

I now use CharlesProxy for development, but previously I used Fiddler.

Try Wireshark:
Wireshark is the world's foremost network protocol analyzer, and is the de facto (and often de jure) standard across many industries and educational institutions.
There is a bit of a learning curve but it is far and away the best tool available.

Microsoft Network Monitor (http://www.microsoft.com/downloads/details.aspx?FamilyID=983b941d-06cb-4658-b7f6-3088333d062f)

Fiddler is great when you are only interested in the http(s) side of the communications. It is also very useful when you are trying to inspect inside a https stream.

I like TcpCatcher because it is very simple to use and has a modern interface. It is provided as a jar file; you just download it and run it (no installation process). It also comes with a very useful on-the-fly packet modification feature (debug mode).

I use Wireshark in most cases, but I have found Fiddler to be less of a hassle when dealing with encrypted data.

Related

Golang websocket client [closed]

I want to make client websocket connections to an external server, with each connection having its own goroutine and reader. I was looking for information on the internet, but I only found tutorials on how to create a websocket server.
Can anyone be so kind as to make a trivial example and walk me through it? I am using the standard Go library https://golang.org/x/net/websocket.
I created some code, but when I closed one connection the program exited with an EOF error. I won't post the code because it's probably bad due to the fact that it was my first try.
I know how to read/send messages over a websocket, but I don't know how to create multiple connections.
Any information or examples would be appreciated. Thanks for reading.
You can use the Gorilla WebSocket library.
Here's an example of its use as a client.
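A minimal sketch of what the answers here describe, using the gorilla/websocket package as a client: each connection runs in its own goroutine with its own read loop, so closing one connection doesn't exit the whole program. The endpoint URL and message are placeholders, not anything from the question.

```go
// Minimal sketch: several concurrent client connections with gorilla/websocket.
package main

import (
	"log"
	"sync"

	"github.com/gorilla/websocket"
)

func runClient(id int, url string, wg *sync.WaitGroup) {
	defer wg.Done()

	// Dial the server; DefaultDialer is fine for a plain ws:// endpoint.
	conn, _, err := websocket.DefaultDialer.Dial(url, nil)
	if err != nil {
		log.Printf("client %d: dial failed: %v", id, err)
		return
	}
	defer conn.Close()

	// Send one message, then read until the connection is closed.
	if err := conn.WriteMessage(websocket.TextMessage, []byte("hello")); err != nil {
		log.Printf("client %d: write failed: %v", id, err)
		return
	}
	for {
		_, msg, err := conn.ReadMessage()
		if err != nil {
			// A closed connection shows up here as an error (e.g. EOF);
			// returning keeps it from taking the whole program down.
			log.Printf("client %d: read stopped: %v", id, err)
			return
		}
		log.Printf("client %d: received: %s", id, msg)
	}
}

func main() {
	const url = "ws://localhost:8080/ws" // placeholder endpoint

	var wg sync.WaitGroup
	for i := 0; i < 3; i++ { // one goroutine per connection
		wg.Add(1)
		go runClient(i, url, &wg)
	}
	wg.Wait()
}
```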
The official Go documentation recommends using Gorilla for building websocket-based applications. Still, the problem is that Gorilla WebSocket is not event based: applications need to handle concurrent read and write operations themselves, and developers need to write custom goroutines for handling connect, disconnect and read events.
I think it is better to have a library handling everything for you.
So, I decided to write my own client implementation, gowebsocket, on top of Gorilla. You can find a more detailed explanation here: Getting started with websocket client in go.
You can check the comparison given at this link:
https://yalantis.com/uploads/ckeditor/pictures/4265/websocket-libraries.png
The article suggests going with gobwas/ws (https://github.com/gobwas/ws). It's the best performance-wise and offers all the features needed for websocket-related applications.

How do I capacity test a websocket server? [closed]

I am looking to capacity test my websocket server but don't really know where to start.
I am able to write an AI that will send messages to test the usage, but how would I simulate/make 100, 500, 1000 connections, etc.?
I had a similar problem a little while ago when I had to load test thousands of connections against a server using the socket.io library. I was not able to find any off-the-shelf solutions to do this, so I ended up building my own test using Node.js and a few for loops.
The advantage of Node is you can pretty much copy and paste the client side javascript into your server code so it's pretty simple to simulate the client and then you only need to make multiple connections to generate load. It's a quick and easy way to run the required javascript to establish the socket connection (assuming this is how you connect to your socket).
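This answer uses Node.js and socket.io; purely as an illustration of the same idea (a loop that opens many client connections to generate load), here is a sketch in Go with the gorilla/websocket package. The endpoint, connection count, message contents and pacing are all placeholder assumptions.

```go
// Illustration of the "loop that opens many connections" idea, in Go rather
// than the Node.js/socket.io setup described in the answer above.
package main

import (
	"log"
	"sync"
	"sync/atomic"
	"time"

	"github.com/gorilla/websocket"
)

func main() {
	const (
		target   = "ws://localhost:8080/ws" // placeholder endpoint
		numConns = 1000                     // try 100, 500, 1000, ...
	)

	var ok, failed int64
	var wg sync.WaitGroup

	for i := 0; i < numConns; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			conn, _, err := websocket.DefaultDialer.Dial(target, nil)
			if err != nil {
				atomic.AddInt64(&failed, 1)
				return
			}
			atomic.AddInt64(&ok, 1)
			defer conn.Close()

			// Hold the connection open and generate a little traffic.
			for j := 0; j < 10; j++ {
				if err := conn.WriteMessage(websocket.TextMessage, []byte("load-test")); err != nil {
					return
				}
				time.Sleep(time.Second)
			}
		}()
		time.Sleep(5 * time.Millisecond) // ramp up gradually instead of all at once
	}

	wg.Wait()
	log.Printf("connected: %d, failed: %d", ok, failed)
}
```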
The gotcha I hit was that running more than 600 listeners tended to max out the CPU on my Node box, but a little bit of AWS magic solved that.
Another issue is reporting results. There's not really any concept of response time with a socket connection, at least not in the classic sense, so it's hard to know when things are going wrong - at least from the client side perspective. But from monitoring the server we were able to see when connections failed and when resources started to get scarce and this was enough for us to benchmark how many connections it could support.
Autobahn Testsuite was designed to meet that need but the performance section of the tool still says "Under Development".
You could use JMeter for this purpose and get the WebSocket sampler plug-in from here: http://github.com/maciejzaleski/JMeter
For that many connections (1000) you might need more than one agent machine to achieve your task. These don't necessarily have to be dedicated servers, as you could deploy agents on a few workstations (developers'/testers' machines) and use them for your test. You could limit the impact by scheduling test execution to run out of hours.
The JMeter plugin has severe limitations on the number of concurrent users; it worked well only up to ~450 users. Then I tried the Artillery library (https://artillery.io/docs/testing_websockets.html), but it also has restrictions with loops in its websocket package.

ruby: libraries, frameworks, servers providing concurrency for development of a web based chat [closed]

What ruby concurrency lib/framework should I use for the development of a web based chat?
I have read about Eventmachine and Celluloid libraries, and about Sinatra::Synchrony, Cramp, Goliath and Gserver concurrency-ready-servers. If I am getting this right, all these libs or servers implement concurrency using two main different approaches: the reactor pattern (mostly all of them), or the use of multithreading (i.e. gserver, ...).
Now if this is all correct, and I hope it is, could someone:
correct me if it is not...
point out other actively developed libraries or frameworks that I've missed ?
The reason I am asking this is that I am trying to build, for learning purposes, a web-based chat using Ruby on the server side. It will interact with clients using WebSockets or Server-Sent Events, with jQuery or something else.
I've also read about using Ruby with an XMPP server, or a pub/sub messaging system (like Faye). If I put one of these in the dish, am I correct in saying that it all shrinks down to only having to worry about making requests to those servers in a non-blocking way, rather than having to set up a complete "non-blocking" Ruby chat server?
I know this is a bit convoluted, but I hope it still makes sense.
But in case I am going totally the wrong direction about something, can someone please give me at least a general, vague idea of what I need to understand better?
Thanks!
Funny you should ask. Peter Cooper from Ruby Weekly mentioned (Issue 116 - October 25, 2012) a talk subtitled "Ruby developers need to stop using EventMachine. It's the wrong direction," which spawned some interesting debate on HN, since many frameworks are built on top of it (Goliath, Cramp, etc.)
The disenchanted flock either to Celluloid (with Sidekiq as its most famous client), to the Node.js platform or to other languages that offer solid concurrency primitives from the get go. Yes, Go, Erlang, Clojure...
Personally, I implemented a realtime web-based chat not long ago using Cramp, Redis Pub/Sub and Websockets, loosely adapted from the following demo code. It worked as advertised, but the traffic it gets doesn't compare to the requirements of some high volume systems elsewhere.

Use only for downloading: FTP vs HTTP in my server? [closed]

I want to get some files from my computer (text, video, images) and I'd like to download them to a folder on my Android device. I have been looking for alternatives, and I think there are two ways of doing this, but I don't know if there is a great difference between using one or the other.
Which protocol is better in my case, FTP or HTTP, and why? I don't need to upload anything, and the files are not too big (I guess the biggest file is around 5 MB).
I think HTTP is easier and FTP is faster; could that be? But thinking in terms of programming, I would like to know which is better.
In terms of speed, for file sizes larger than roughly 10 kB both are equivalent. The difference is that FTP sends pure, raw data on its data channel without any headers, so it has a slightly smaller overhead; HTTP sends around a dozen lines of text as a header for each file before blasting raw data onto the channel. So for files of around 10 kB or less, yes, HTTP overhead can be quite high - around 1% to 2% of the total bandwidth. For large files, the dozen or so lines of HTTP header become negligible (a small sketch after this answer shows one way to measure it).
FTP does waste one extra socket for the control channel, though, so for lots of users HTTP is roughly twice as scalable. Remember, your OS has a limited number of sockets that it can open.
And finally, the most important consideration is that a lot of people access the internet through firewalls. Be it corporate, or school or dormitory or apartment building. And a lot of firewalls are configured to only allow HTTP access. You may find sometimes you can't get access to your files because of this. There are ways around this of course but it's one additional hassle you have to think about.
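To make the header-overhead point above concrete, here is a small sketch (in Go, with a placeholder URL) that fetches one file and compares the size of the response headers with the size of the body. Nothing here is specific to any particular server.

```go
// Rough way to see per-request HTTP header overhead: dump the response
// headers and compare their size with the body size.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/http/httputil"
)

func main() {
	resp, err := http.Get("http://example.org/files/video.mp4") // placeholder URL
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	headers, err := httputil.DumpResponse(resp, false) // headers only, no body
	if err != nil {
		log.Fatal(err)
	}
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("header bytes: %d, body bytes: %d, overhead: %.2f%%\n",
		len(headers), len(body), 100*float64(len(headers))/float64(len(body)))
}
```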
Additional answer:
I saw you asking about access restriction and security. The slight downside with HTTP is that you may need to write your own web app to implement this, although web servers like Apache can be configured to do it just by writing a configuration file using HTTP basic authentication.
Luckily, people have had this problem before and some of them have written software to do this. Google around for "HTTP file server" and you'll find many implementations. Here's one fairly good open source web app: http://pfn.sourceforge.net/
Also, if you really want security you should set up SSL/TLS for your server regardless of whether you end up using FTP or HTTP.
I would recommend HTTP. It allows you to download a file with multiple connections, you can easily share URLs, and you would also be able to download in a restricted environment where all ports except HTTP are blocked.
FTP is more suitable if you want to control access to files on a per-user basis and also require a good amount of uploading.
Addition:
You can also implement security in HTTP using .htaccess files. However, this is not very scalable and not suitable for many users with different access rights.
There are several other methods of protecting files over HTTP. You will be able to find a lot of open source utilities on http://sourceforge.net that will let you do that. Where speed is concerned, HTTP is best: it allows you to fetch an arbitrary part of a file, which makes multi-threaded downloads possible (a range-request sketch follows after this answer).
You will notice that most file sharing sites use HTTP, and it is so for scalability reasons.
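As a concrete illustration of the range-request point in the answer above, here is a sketch (in Go, with a placeholder URL) that fetches only the first kilobyte of a file; a multi-threaded downloader repeats this for different byte ranges in parallel. This assumes the server supports range requests (it replies 206 Partial Content).

```go
// Fetch an arbitrary part of a file over HTTP with a Range request.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	req, err := http.NewRequest("GET", "http://example.org/files/video.mp4", nil) // placeholder URL
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Range", "bytes=0-1023") // ask for the first 1 KiB only

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	part, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("status: %s, bytes received: %d\n", resp.Status, len(part)) // expect 206 Partial Content
}
```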

Graphical HTTP client for Windows [closed]

I am looking for a Windows graphical utility for performing HTTP operations.
For example, I want to be able to say things like:
POST to http://example.org/test/service
With a POST body: "Data goes here"
Does anyone know a good piece of software for doing this?
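For reference, here is the request described above expressed as code, a Go sketch using the example URL from the question; a graphical client essentially makes composing this kind of request point-and-click. The content type is an assumption.

```go
// The POST described in the question, done in code for comparison.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"strings"
)

func main() {
	resp, err := http.Post("http://example.org/test/service", "text/plain",
		strings.NewReader("Data goes here"))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}
```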
I too have been frustrated by the lack of good graphical http clients available for Windows. So over the past couple years I've been developing one myself: I'm Only Resting, "a feature-rich WinForms-based HTTP client." It's open source (Apache License, Version 2.0) with freely available downloads.
It currently has fairly complete coverage of HTTP features except for file uploads, and it provides a very good user interface with great request and response management.
Update: For people that still come across this, Postman is your best bet now: https://www.getpostman.com/apps
RestClient is my favorite. It's Java based. I think it should meet your needs quite nicely. I particularly like the Auth support.
https://github.com/wiztools/rest-client
Have you looked at Fiddler 2 from Microsoft?
http://www.fiddler2.com/fiddler2/
Allows you to generate most types of request for testing, including POST. It also supports capturing HTTP requests made by other applications and reusing those for testing.
You can use Microsoft's WFetch tool also. This is a good tool for all HTTP operations.
You could try the Jsonium tool (http://jsonium.org), a nice free tool specialized in requests with JSON in bodies and responses.
http://www.ieinspector.com/httpanalyzer/
http://www.microsoft.com/downloads/details.aspx?FamilyID=B134A806-D50E-4664-8348-DA5C17129210&displaylang=en
https://addons.mozilla.org/en-US/firefox/addon/9780/
http://soft-net.net/SendHTTPTool.aspx
https://addons.mozilla.org/en-US/firefox/addon/966/
Honestly, for simplistic stuff like that I typically whip up a quick HTML form in a local file and load that up in a browser.
I like rest-client a lot for the purposes you described. It's a Java application to test REST-based web services.
If anybody is still interested, the Eclipse Labs Rest Client tool is an excellent choice. I'm trying the EXE version on Windows and it works smoothly.
I've also worked with Rest Client previously, and it's great too.
https://play.google.com/store/apps/details?id=com.snmba.restclient
Works from Android tablets and phones. Flexible enough to try various combinations.
