Related to this question, Upload files directly to Amazon S3 from ASP.NET application: is there any way to do this and have a progress bar?
---- EDIT ----
Two days later and still no luck with a direct way. Found one thing that looks promising but not free: http://www.flajaxian.com/
It uses Flash to upload directly to S3 with a progress bar.
I'm looking for a solution as well. Maybe this will be of some help, from the AWS Dev Community:
But in many languages (PHP, Java), for big files you have to use streams, through which the language environment takes chunks of your big file one after the other (rather than filling up main memory with a huge amount of data) for the HTTP POST to S3 needed for the upload.
The nice thing about streams is that they have a callback invoked whenever the next chunk is read, before it is further PUT (in the HTTPS sense) to S3. You can use this callback to compute and display the progress in the client UI.
See the libcurl documentation for the details of how all this works.
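To make the chunk-and-callback idea concrete, here is a rough Ruby sketch. The file name is a placeholder and the actual S3 request is omitted; only the per-chunk progress bookkeeping described above is shown.

# Read the file in chunks and fire a progress callback per chunk.
CHUNK_SIZE = 64 * 1024    # 64 KiB per read

def stream_with_progress(path)
  total = File.size(path)
  sent  = 0
  File.open(path, "rb") do |file|
    while (chunk = file.read(CHUNK_SIZE))
      # a real uploader would write `chunk` to the open HTTPS connection here
      sent += chunk.bytesize
      yield(sent, total)
    end
  end
end

stream_with_progress("bigfile.bin") do |sent, total|
  printf("\r%.1f%% uploaded", sent * 100.0 / total)   # hook your progress bar update in here
end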
Update: It looks like there are two straightforward options.
Flash, via the FileReference class
With a Java applet
I personally hate using 3rd party extensions (Flash, Java) to make an app function, but I haven't found another way.
HTML5 JavaScript can allow you to do it, if you don't mind the lack of browser support (Firefox and Chrome only as of this post).
Example here: https://developer.mozilla.org/en/Using_files_from_web_applications
This isn't specific to AWS, but may help you get closer.
Another approach is to use something like Uber Uploader (http://uber-uploader.sourceforge.net/), which is a Perl/PHP hybrid solution with a progress bar. You would simply upload the files to your server and then have your server FTP them in the background to the final destination. It is an extra step, but it gives you some time to do any processing / encoding / etc. that you may need to do before sending to S3.
I'm developing a new product, and one of the design requirements is to implement an embedded web server on the microcontroller.
The web pages should be responsive and dynamic, like single-page application (SPA) pages, and there are 3 pages to implement, with light images and graphics.
I plan to pick a microcontroller from the STM32 range, and my questions relate to the hardware design part:
What are the minimum microcontroller requirements, in terms of performance and memory, to implement an embedded web server?
What is the approximate memory footprint of the lwIP stack, the web server, and the client-side code?
Where should the web pages be stored: internal Flash, ROM, or external Flash?
And finally, what is the complexity level of the implementation compared to handling a traditional HTTP request?
Thanks,
Network connectivity and sufficient RAM+Flash to run your server. If using TLS (i.e. HTTPS), some processing power (preferably a crypto accelerator) will come in handy.
Depends on what you're planning to serve :) Let's assume a single concurrent client connection and a web server serving simple dynamic pages implemented in C. You'll want around 100-200 KiB of RAM for the network and HTTP server - maybe much more if doing anything non-trivial. Add around 50-100 KiB more for TLS. This will be enough to implement a few simple text-based config and status pages. As for the amount of Flash (code memory), it depends on how much code you write and how big your web assets are :) Note that TLS libraries are rather large, perhaps around 300-500 KiB. These estimates don't include any server-side scripting languages (JavaScript, Python, ...) - C only.
Unless you have specific requirements, your web assets should be few, small and fit (as text or binary blobs) into the same Flash as everything else.
It's more complex. Depends on what you compare it with. It's not like you're going to implement the HTTP protocol yourself - find a library for that. But almost nothing is free in a microcontroller environment. You manage your own memory, your own threads, your own everything.
I want to get some files from my computer (text, video, images) and download them to a folder on my Android device. I have been looking at alternatives and I think there are two ways of doing this, but I don't know if there is a great difference between them.
Which protocol is better in my case, FTP or HTTP, and why? I don't need to upload anything, and the files are not too big (I guess around 5 MB for the biggest one).
I think HTTP is easier and FTP is faster, but is that so? I would like to know which is better from a programming point of view.
In terms of speed, for file sizes larger than roughly 10kB both are equivalent. The difference is that FTP sends pure, raw data on its data channel without any headers so it has a slightly smaller overhead. But HTTP sends only around 12 or so lines of text as header for each file before blasting raw data onto the channel. So for files of around 10kB or less, yes HTTP overhead can be quite high - around 1% to 2% of the total bandwidth. For large files, the dozen or so lines of HTTP header becomes negligible.
FTP does waste one extra socket for the control channel, though, so with lots of users HTTP is roughly twice as scalable. Remember, your OS has a limited number of sockets it can open.
And finally, the most important consideration is that a lot of people access the internet through firewalls. Be it corporate, or school or dormitory or apartment building. And a lot of firewalls are configured to only allow HTTP access. You may find sometimes you can't get access to your files because of this. There are ways around this of course but it's one additional hassle you have to think about.
Additional answer:
I saw you asking about access restriction and security. The slight downside with HTTP is that you may need to write your own web app to implement this, although web servers like Apache can be configured to do it just by writing a configuration file that uses HTTP basic authentication.
Luckily, people have had this problem before and some of them have written software to do this. Google around for "HTTP file server" and you'll find many implementations. Here's one fairly good open source web app: http://pfn.sourceforge.net/
Also, if you really want security, you should set up SSL/TLS for your server regardless of whether you end up using FTP or HTTP.
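To make the basic-auth idea concrete, here is a rough Ruby sketch using the bundled WEBrick server instead of Apache. The port, realm, credentials and paths are made up for illustration, and on Ruby 3+ you need to install the webrick gem first.

require 'webrick'
require 'webrick/httpauth'

config = { Realm: 'my files' }
htpasswd = WEBrick::HTTPAuth::Htpasswd.new('my_password_file')   # created if missing
htpasswd.set_passwd(config[:Realm], 'alice', 'secret')
htpasswd.flush
config[:UserDB] = htpasswd
basic_auth = WEBrick::HTTPAuth::BasicAuth.new(config)

server = WEBrick::HTTPServer.new(Port: 8000, DocumentRoot: '/path/to/files')
server.mount_proc('/files') do |req, res|
  basic_auth.authenticate(req, res)    # replies 401 unless the credentials match
  res.body = 'authenticated - put your file-serving logic here'
end
trap('INT') { server.shutdown }
server.start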
I would recommend HTTP. It allows you to download a file over multiple connections, you can easily share URLs, and you can download even in restricted environments where all ports except HTTP are blocked.
FTP is more suitable if you want to control access to files on a per-user basis and also need a fair amount of uploading.
Addition:
You can implement security over HTTP as well, using .htaccess files. However, that is not very scalable and not suitable for many users with different access rights.
There are several other methods of protecting files over HTTP. You will be able to find a lot of open-source utilities on http://sourceforge.net that let you do that. Where speed is concerned, HTTP is best: it allows you to fetch an arbitrary part of the file, so a multi-threaded download is possible (a quick sketch follows below).
You will notice that most file-sharing sites use HTTP, and that is for scalability reasons.
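To illustrate the arbitrary-range point, here is a quick Ruby sketch of an HTTP Range request; the URL is a placeholder.

require 'net/http'

uri = URI('http://example.com/bigfile.bin')
req = Net::HTTP::Get.new(uri)
req['Range'] = 'bytes=0-1023'   # ask for the first 1 KiB only

res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }
puts res.code            # "206" (Partial Content) if the server honours ranges
puts res.body.bytesize   # 1024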
I am in the process of building a web based chat app written in Ruby. I would like to provide the ability to also connect to this chat server using an XMPP client. So I am looking for a library that will handle being a real XMPP server which I can tie into with my existing Ruby code (or by using something like Redis in between). However, I am having a hard time finding the server library (though I can find many libraries for acting as a client which consumes or interacts with the server). I'm also not very experienced with XMPP to begin with so I may be asking for the wrong thing. Do you know of an XMPP server library I can use?
XMPP server libraries generally don't make much sense, as XMPP servers (like HTTP servers for example) run as separate standalone long-lived processes. You don't usually embed them into your application.
XMPP goes even a step further than HTTP: there are HTTP server libraries that allow you to listen on a port, wait for requests, and send a response. XMPP is completely different in this respect: XMPP sessions are long-lived and require constant attention. Using an XMPP server library, your application would spend most of its time inside that library, at which point, why not just run it as a separate process?
I know it's a tempting idea, but having developed an XMPP server and thinking about this (people have requested it before you) I just concluded it made very little sense (even if it is technically possible).
Many XMPP servers allow custom plugins for integration with other systems, and there are servers in Ruby if that's a requirement for you (e.g. Vines).
Try XMPP4R
For example - connection and authentication:
require "xmpp4r"
robot = Jabber::Client::new(Jabber::JID::new("sample#xmpp.ru"))
robot.connect
robot.auth("password")
And sending message:
message = Jabber::Message.new("recipient@xmpp.ru", "Hi there!")
message.set_type(:chat)   # mark it as a one-to-one chat message
robot.send(message)
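For completeness, receiving messages goes through callbacks; a rough sketch follows (the echo reply and the Thread.stop at the end are only there to keep the example self-contained):

robot.add_message_callback do |msg|
  next if msg.body.nil?                  # skip empty stanzas such as typing notifications
  puts "#{msg.from}: #{msg.body}"
  reply = Jabber::Message.new(msg.from, "Got it!")
  reply.set_type(:chat)
  robot.send(reply)
end
Thread.stop                              # keep the process alive so callbacks can fire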
The library is somewhat unstable under Windows, but works great on Linux.
There is also XMPP server implementation under Ruby using XMPP4R - http://code.google.com/p/xmpp-rserve/
EDIT
Maybe this is what you want. Looks like a library suitable for server usage - https://github.com/sprsquish/blather
Found it on the official XMPP libraries page - http://xmpp.org/xmpp-software/libraries/
I am looking for a Windows graphical utility for performing HTTP operations.
For example, I want to be able to say things like:
POST to http://example.org/test/service
With a POST body: "Data goes here"
Does anyone know a good piece of software for doing this?
I too have been frustrated by the lack of good graphical HTTP clients available for Windows. So over the past couple of years I've been developing one myself: I'm Only Resting, "a feature-rich WinForms-based HTTP client." It's open source (Apache License, Version 2.0) with freely available downloads.
It currently has fairly complete coverage of HTTP features except for file uploads, and it provides a very good user interface with great request and response management.
Update: For people that still come across this, Postman is your best bet now: https://www.getpostman.com/apps
RestClient is my favorite. It's Java based. I think it should meet your needs quite nicely. I particularly like the Auth support.
https://github.com/wiztools/rest-client
Have you looked at Fiddler 2 from Microsoft?
http://www.fiddler2.com/fiddler2/
Allows you to generate most types of request for testing, including POST. It also supports capturing HTTP requests made by other applications and reusing those for testing.
You can use Microsoft's WFetch tool also. This is a good tool for all HTTP operations.
You could try the Jsonium tool (http://jsonium.org) - a nice free tool specialized in requests with JSON in bodies and responses.
http://www.ieinspector.com/httpanalyzer/
http://www.microsoft.com/downloads/details.aspx?FamilyID=B134A806-D50E-4664-8348-DA5C17129210&displaylang=en
https://addons.mozilla.org/en-US/firefox/addon/9780/
http://soft-net.net/SendHTTPTool.aspx
https://addons.mozilla.org/en-US/firefox/addon/966/
Honestly, for simplistic stuff like that I typically whip up a quick HTML form in a local file and load that up in a browser.
I like rest-client a lot for the purposes you described. It's a Java application to test REST-based web services.
If anybody is still interested, the Eclipse Labs Rest Client tool is an excellent choice. I'm trying the EXE version on Windows and it works smoothly.
I've also worked with Rest Client previously and it's great too.
https://play.google.com/store/apps/details?id=com.snmba.restclient
It works from Android tablets and phones and is flexible enough to try various combinations.
I need a sniffer to test the network traffic of applications I develop for Windows and Facebook.
Basic requirements:
display request and response
display HTTP headers
display the time it took to complete HTTP request
Now I'm using HTTP Analyzer.
It is a very good tool, but it terminates with an error after running for 10-15 minutes on Vista.
Wireshark if you want to see everything going on in the network.
Fiddler if you just want to monitor HTTP(S) traffic.
Live HTTP Headers if you're in Firefox and want a quick plugin just to see the headers.
Firebug can also get you that information, and it provides a nice interface when you're working on a single page during development. I've used it to monitor AJAX transactions.
I now use Charles Proxy for development, but previously I used Fiddler.
Try Wireshark:
Wireshark is the world's foremost network protocol analyzer, and is the de facto (and often de jure) standard across many industries and educational institutions.
There is a bit of a learning curve but it is far and away the best tool available.
Microsoft Network Monitor (http://www.microsoft.com/downloads/details.aspx?FamilyID=983b941d-06cb-4658-b7f6-3088333d062f)
Fiddler is great when you are only interested in the HTTP(S) side of the communication. It is also very useful when you are trying to look inside an HTTPS stream.
I like TcpCatcher because it is very simple to use and has a modern interface. It is provided as a jar file; you just download it and run it (no installation process). Also, it comes with a very useful "on the fly" packet modification feature (debug mode).
I use Wireshark in most cases, but I have found Fiddler to be less of a hassle when dealing with encrypted data.