How is it possible to compare an ASGI framework with an ASGI server? - performance

How is it possible to compare an ASGI framework with an ASGI server? What are the points of comparison? What is the reason for comparing a framework and a server?
In this benchmark I see Starlette (an ASGI framework) and Uvicorn (an ASGI server), even though Starlette actually runs on top of Uvicorn.
How is it possible to compare them?
Am I right that it's like comparing warm with salty?
https://www.techempower.com/benchmarks/#hw=ph&test=fortune&l=zijzen-sf

TL;DR: Yes, it doesn't make sense to compare the web framework with the server. The scopes are different.
That said... The application used to benchmark uvicorn is a pure ASGI application.
You can see the application used for TechEmpower here. With this setup, there's no overhead of the web framework itself.
The Starlette benchmark runs on uvicorn as well, and you can see the code here.
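To make the distinction concrete, here is a minimal sketch (not the actual TechEmpower code; the names are my own) of a bare ASGI application next to the equivalent Starlette application. Both are served by uvicorn, so any difference in the benchmark comes from the framework layer alone:

```python
# raw_asgi.py -- a bare ASGI application, no framework involved
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello, world!"})


# starlette_app.py -- the same response, but going through the framework
from starlette.applications import Starlette
from starlette.responses import PlainTextResponse
from starlette.routing import Route

async def homepage(request):
    return PlainTextResponse("Hello, world!")

app = Starlette(routes=[Route("/", homepage)])
```

Both are launched the same way (e.g. uvicorn raw_asgi:app or uvicorn starlette_app:app), which is why the two benchmark entries can sit side by side: one measures the server alone, the other measures the server plus the framework's routing, request and response machinery.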
Disclaimer: I'm a maintainer of Uvicorn & Starlette. It really doesn't matter for this question; it's just to give a bit of credibility.

Related

Can I have two or more web processes using Heroku

I'm trying to implement a reasonably complex architecture using Heroku. I have a Java application that reads/writes data from one source using REST and puts results onto a queue using RabbitMQ. A Django application then reads from this queue, parses the collected data, and saves it to its database. The Django application feeds Android and iOS apps through GraphQL. The problem I have is that Heroku only seems to let me define one web process in my Procfile, where in fact I need two: one for the Java application and one for the Django application. Is there any way I can make this work?
There is no un-hacky / good solution to this. And as the comments stated, it is a bad idea to combine the codebases here.
Following Heroku's ideas, you would split these into separate applications/services that communicate with each other via HTTP or the queue.
Many add-ons can be attached to multiple applications if they are shared, so both apps can use the same queue.
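To illustrate the queue option, here is a rough sketch of the consumer side (the Django app, deployed as its own Heroku application). It assumes a shared RabbitMQ add-on exposing its URL as CLOUDAMQP_URL and uses the pika library; the queue name and handler are hypothetical:

```python
# worker.py -- hypothetical queue consumer for the Django-side Heroku app
import os
import pika

def handle_message(channel, method, properties, body):
    # Parse the payload produced by the Java application and save it
    # to the database here (e.g. via Django's ORM).
    print("Received:", body)
    channel.basic_ack(delivery_tag=method.delivery_tag)

params = pika.URLParameters(os.environ["CLOUDAMQP_URL"])
connection = pika.BlockingConnection(params)
channel = connection.channel()
channel.queue_declare(queue="results", durable=True)
channel.basic_consume(queue="results", on_message_callback=handle_message)
channel.start_consuming()
```

The Java app would publish to the same queue from its own Heroku application, and this consumer could run as a worker process rather than a web process, since it never needs to receive HTTP traffic.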

What service to use for deploying my flask + dash application

I am building a small application with Dash and Flask, where my users can upload a CSV/Excel file and have a look at the graphs being generated.
I assume each Excel file would be at most around 50 MB, with uploads roughly weekly.
I have 'ZERO' knowledge of servers and deployment, etc. Can anyone guide or enlighten me in this area? Also, this application is just for internal use, so the budget is tight.
My random google searches gave me options like,
1. AWS
2. Heroku
Which would be the right option and why, considering price and ease of use?
Thanks !
I will share some of my web dev knowledge. In my company we use Flask for all server-side development, using many of its libraries (like marshmallow, SQLAlchemy, etc.) and making improvements to them. Flask offers a lot of flexibility and fast development, but its built-in development server handles requests poorly, so I highly recommend putting a proper WSGI server in front of it; the most popular one for Flask is Gunicorn, which is easy to set up and use.
For the HTTP server we use Nginx; it's like Apache but makes working with WebSockets easier, and to use it with Gunicorn you just set up a proxy.
For hosting we use AWS, which works very well for both big and small applications, but since your application is small and your budget is too, I recommend the PythonAnywhere service; it's easy to use and optimized for Python web servers. For the frontend we use the Vue.js framework, which makes our pages nicer and faster to develop.
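As a minimal sketch of the Gunicorn part of that setup (file and variable names are my assumptions, not anything from your project): a Dash app exposes its underlying Flask server, and that is the object a WSGI server such as Gunicorn runs:

```python
# app.py -- minimal Dash application exposing the Flask server for Gunicorn
import dash
from dash import dcc, html

app = dash.Dash(__name__)
server = app.server  # the underlying Flask instance, used by Gunicorn

app.layout = html.Div([
    html.H1("CSV/Excel upload demo"),
    dcc.Upload(id="upload-data", children=html.Button("Upload file")),
    dcc.Graph(id="graph"),
])

if __name__ == "__main__":
    # Development only; in production run something like `gunicorn app:server`
    app.run_server(debug=True)
```

Nginx (or the hosting platform's own front end) then proxies requests to the Gunicorn process; on PythonAnywhere you would point its WSGI configuration at the same server object instead of running Gunicorn yourself.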

Is the Go language built-in http server a production server?

I did not see that answer in the documentation, https://golang.org/pkg/net/http/.
It seems pretty complete, but typically I find that built-in web servers are never recommended for anything but development, such as in Python, PHP, etc.
Yes. It is a 'production' server if you use it as such. There is no reason why you would not. It was made with the intent of being used for real production applications, not just for testing and playing around with the language.

Does some optimized web servers for single page application exists?

When we build a single page application, the web server basically does only one thing: it returns some data when the client asks for it (in JSON format, for example). So any server-side language (PHP, RoR) or tool (Apache, Nginx) can do it.
But is there a language/tool that works better with this sort of single page application, which generates lots of small requests that need low latency and sometimes a permanent connection (for realtime and push features)?
SocketStream seems like it matches your requirements quite well: "A phenomenally fast real-time web framework for Node.js ... dedicated to creating single-page real time websites."
SocketStream uses WebSockets to get lowest latency for the real-time portion. There are several examples on the site to build from.
If you want to handle a lot of small requests in realtime by pushing data, you should take a look at socket-type connections.
Check out Node.js with Socket.io.
If you really want to optimize for speed, you could try implementing a custom HTTP server that just fits your needs, for example with the help of Netty.
It's blazingly fast and has examples for HTTP and WebSocket servers included.
Also, taking a look at G-WAN may be worthwhile (though I have not tried that one yet).
http://en.wikipedia.org/wiki/Nginx could be appropriate

How does Node.js perform compared to Apache?

Is Node.js quicker and more scalable than Apache? Are there any performance figures to back up Node.js's performance for a web application over Apache?
UPDATE: OK, maybe my question (above) is confusing because I am a little confused as to how Node.js sits within a web stack. Under what circumstances should I consider using Node.js instead of a more traditional stack like PHP, MySQL and Apache - or does Node.js play its part in this stack?
Node.js is a framework particularly well suited for writing high performance web applications without having to understand how to implement concurrency at a low level. It is a framework for writing server-side JavaScript apps using non-blocking IO: passing continuations to IO calls rather than waiting on results. Node.js provides a system API (filesystem access, network access, etc.) where all of the API calls take a continuation which the runtime will execute later with the result, rather than block and return the result to the original caller.
You can use it by itself, if you like. But you might want a dedicated reverse proxy in front of Node.js: something like Apache, Nginx, Lighttpd, etc. Or, for clustering a bigger app, you might want something like HAProxy in front of multiple running Node.js app servers.
There is a recent (July 28th, published 30th) Google Tech Talk about Node.js that gives a few performance numbers and also talks about scaling.

Resources