Languages Offering Scalability for Server-client Application [closed] - client-server

I want to learn how to get started with developing highly scalable server/client applications--more specifically for non-web-based/not-in-a-browser desktop clients. I figure that developing a very minimalistic chat application (roughly comparable to AIM/Skype) is a reasonable way to get started down such a path of learning about servers/clients and scalability.
I am unsure which programming language would be appropriate for this task considering my emphasis on scalability. Personally, the only languages I am interested in working with are Java, C#, and C/C++. As far as the server OS goes, I will be dealing with Linux, so C# in my case would imply Mono.
I suppose my specific interest boils down to which language to use on the server, since it is the infrastructure supporting the application that has to be highly scalable. I have heard mixed reviews of Java and C# server scalability. My intuition suggests that both are perfectly reasonable choices, but then I hear about others running into problems once they reach a certain threshold of application/user traffic. It is hard to know what to make of hearsay, but I do suppose that the lack of bare-metal control in these languages could hinder scalability past certain thresholds. When it comes to C/C++, I hear mention of the Boost libraries (e.g., Boost.Asio) offering the ultimate scalability, but then I am scared off when I hear that sockets in particular are much more complex to deal with in C/C++ than in languages like Java or C#.
What is an effective way to get started in making highly scalable server-client applications such as a chat client? Of the ones which I have mentioned, which programming language is adequately suited for developing such applications? What other languages should I consider for such an application?
EDIT: the term "scale" most directly relates to scaling to serve a large number of users (perhaps tens or hundreds of thousands, maybe millions).

"Scale" - in which way has it to scale? Scaling with CPU cores, with users or with code base?
You could ask: Which language implementation is the fastest? Which language will handle a lot of requests without problems?
In every language implementation you will need to have strategies to build a distributed system. If you have to worry about speed, you should rather worry about having a possibility to distribute your system on many machines.
If you want maximal scalability in terms of cores and non-blocking request, go with Erlang. It will handle a shitload of traffic on server side.
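To make the advice above concrete, here is a minimal sketch of a concurrent chat server. It is written in Go purely for brevity; the same shape applies to Java NIO, C# async sockets, Boost.Asio, or Erlang's one-lightweight-process-per-client model. The port number, the newline-delimited message framing, and the thin error handling are illustrative assumptions, not a production design.

```go
// Minimal chat-server sketch: one goroutine per connection, a single
// broadcaster goroutine fanning messages out to every connected client.
// Port 9000 and newline framing are arbitrary choices for illustration.
package main

import (
	"bufio"
	"log"
	"net"
)

type hub struct {
	join      chan net.Conn
	leave     chan net.Conn
	broadcast chan string
}

func (h *hub) run() {
	clients := map[net.Conn]bool{}
	for {
		select {
		case c := <-h.join:
			clients[c] = true
		case c := <-h.leave:
			delete(clients, c)
			c.Close()
		case msg := <-h.broadcast:
			for c := range clients {
				// Write errors are ignored for brevity; a real server
				// would drop slow or broken clients here.
				c.Write([]byte(msg + "\n"))
			}
		}
	}
}

func handle(c net.Conn, h *hub) {
	h.join <- c
	defer func() { h.leave <- c }()
	scanner := bufio.NewScanner(c)
	for scanner.Scan() {
		h.broadcast <- scanner.Text()
	}
}

func main() {
	h := &hub{
		join:      make(chan net.Conn),
		leave:     make(chan net.Conn),
		broadcast: make(chan string),
	}
	go h.run()

	ln, err := net.Listen("tcp", ":9000")
	if err != nil {
		log.Fatal(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			continue
		}
		go handle(conn, h)
	}
}
```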

Each of the languages you mentioned will scale.
If you're serious about this, you should choose the language you know best and build it - you haven't even prototyped your idea yet and you're already concerning yourself with scale.
We could list many programs and websites written in each of the languages above that scale perfectly well (we can also list many that don't).

Related

dart as server web application or golang [closed]

I am planning to write a social network application and am thinking about using Dart on the backend. But I am asking myself whether Dart would be the right choice for a backend application. I searched the internet on this topic but could not find anything useful that convinced me to use Dart for the backend.
I am also thinking about using Go, but I would have to learn that language from scratch. I have some experience with Dart and really like the language.
You know, I am trying to make the right choice and use the right tools, so I need some suggestions: Dart or Go?
I know that Java provides a huge ecosystem, but I don't want to learn Java at all - I just don't like it.
I would choose the Neo4j database; I am pretty sure it suits a social network.
I don't like Java very much and developed a server in Go. I enjoyed working with Go a lot. Then I started on the client, and because I also don't like JavaScript (in fact I hate it), I went with Dart, which I like a lot. I have already done a lot of server-side work in Dart. A big advantage is that I can share code between client and server, which is actually a lot in my current project.
I like Go better for server-side development, but I will stick with Dart for projects where I already use Dart for the client. It's just too difficult and time-consuming to stay up-to-date on two different languages (and ecosystems and libraries) when both are so new and fast-evolving.
I think each language is a very good fit for this job. Java has an awful lot of existing libs, tutorials, and so on, but most of the important things are covered in Go and Dart as well.
Basically I would say, stick with the language you know best. Learning a new language will impede your productivity for several months at least.
If you design your application in terms of loosely coupled modules, then you can write (prototype) modules in Dart, and if one day you need to re-implement some of them in Go (for example, because you need better performance), it will be doable. Look at REST.
It will also make your application scalable, and you will be able to expose an API for individual modules (for example, if you need some of them as a phone-app backend or for third-party developers).
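As a rough illustration of that module boundary (a sketch only, with an invented /api/friends route, port, and payload shape, and Go chosen simply because it is the language under discussion): the client only ever sees HTTP and JSON, so the module behind the route can later be rewritten in another language without touching its callers.

```go
// One module exposed behind a small REST boundary. The route, port, and
// Friend type are placeholders; a real handler would query the backing
// store (e.g. Neo4j) instead of returning canned data.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

type Friend struct {
	ID   int    `json:"id"`
	Name string `json:"name"`
}

func friendsHandler(w http.ResponseWriter, r *http.Request) {
	// Canned data standing in for a real lookup.
	friends := []Friend{{ID: 1, Name: "Alice"}, {ID: 2, Name: "Bob"}}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(friends)
}

func main() {
	http.HandleFunc("/api/friends", friendsHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

A Dart client, a phone app, or a third-party developer would just issue an HTTP GET against /api/friends and decode the JSON, regardless of what language the module happens to be written in that week.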

Online multiplayer browser-based game server technology? [closed]

I would like to build a simple cross-browser multiplayer game (like chess or a card game) which will communicate to a server using sockets.
I have some intermediate knowledge of the Ruby language, but I'm not convinced that it is a good solution for a multi-client server, so I thought Node.js or Socket.io might be a better one. I know that Java or C++ could be great for the job, but I am not too comfortable with either of them, which is why I'm gravitating towards server-side JavaScript.
My question is: what do you think is the best solution for a project like this one? What might be the best server-side technology on which to build the entire game and communication logic? Maybe some combination of them? Any comments regarding speed, server load, hosting solutions, and development speed for each technology will be greatly appreciated.
If you're comfortable with JavaScript, you've got nothing to lose by giving Node.js a go: the learning curve will be gentle. It's a pretty cool server tech.
The only disadvantage of Node.js is that, of course, it won't scale like Java. At all. This is often fine for web apps because you can put a caching layer (a reverse proxy) in front, which greatly mitigates the problem. I imagine that won't be reasonable for your application, since the game state will change too frequently.
Node.js can "scale", though, by spinning up more instances. If one server can easily accommodate more than one "game world", then this is straightforward. If you need to split a game world across multiple servers, then the servers must cooperate. Beware of that scenario, though; it's not as simple as it may first seem: it's called the "multi-master" problem and is one of the hungry internet monsters.

What is a tool to build simple team task management? [closed]

I have a specific goal in mind: I want to make a to-do-list-type app for my group at work to use. My key requirements are to have very easy entry and removal of items, as well as work in an OS X environment. The first requirement is because anything that is easy to use is more likely to be used, and the second is because that's what we're on. The application will need to be live-updating among multiple users, but authentication is not a requirement. Distribution of the same app to other remote teams is a plus, but not required. Just a shared to-do list, with task-specific things to be added. It doesn't have to be a web app; native is great but the world at large seems to be more and more web-related these days.
I've been looking at a number of technologies such as Ruby (and Rails), PHP, MAMP, Cappuccino, FileMaker, Trac, and a few other options - but the paradox of choice means that I don't really know what is "best" to use. Looking at that list, it is obvious that I don't really know what I should be looking at, let alone how to decide. I'm drowning in a sea of opportunities and a surplus of "good enough."
I am somewhat experienced with Objective-C and Cocoa, but excluding Cappuccino, those skills don't directly apply. I'm rather excited to learn new things, so my existing skill set is not especially important.
What sounds natural for this? I'm fully prepared for the fact there is no "right" option. Who here has a favorite methodology? What's a good application stack that has proven itself in rapid development time and future flexibility?
TL;DR: I want to make a concurrently-updated todo app for a small work team. It specifically does not need to be feature-filled, but should be "simple" to build and maintain. What is the right tool for the job?
EDIT: My team does nothing related to software dev, but my own personal mindset is that of a software dev. Part of the reason I am not afraid to roll up my sleeves and learn something new is a matter of personal development.
I would highly recommend that you use an existing solution rather than build your own. Teams building their own management software have a long history of sucking up lots of time, energy, and talent for little benefit.
As for which solution you should use, it depends on the kind of work your team does. If they do software development, as I suspect they might based on your question, Trac is an excellent option.
Based on your requirements, IMHO, Google Tasks is the best fit for you. If you want a desktop (not web) app, check out Things (commercial).

Are there reports or theses about the performance of Google App Engine or other cloud platforms [closed]

Are there reports or theses about the performance of Google App Engine or other cloud platforms?
I'm writing an article about how to choose an appropriate cloud platform, and want to reference some test data.
A little work with Google may turn up some material that others have found. For instance, the canonical resource for Azure benchmarking is here: http://azurescope.cloudapp.net/. However, there isn't much comparative material, because such comparisons rarely make sense.
Comparing cloud platforms solely on performance is like comparing apples with bananas with oranges. Each has its own qualities that make it appropriate for a particular kind of application.
For example, in broad terms, for multi-platform use where you have control of the underlying OS, go EC2; for a managed Windows application platform go Azure; or for a managed Java/Python platform choose App Engine. Once you've chosen the platform you can pretty much then pay for the performance you need.
Bear in mind too that "performance" means different things for different applications. The application I'm working on, for instance, relies heavily on SQL database performance. That will have a very different performance profile from (say) an application that uses a key-value pair storage system, or an application that's mostly static HTML.
So, in practice, there isn't much in the way of performance benchmarks out there, because every application is different.

Is it possible to create a faster computer from many computers? [closed]

How can I use several computers to create a faster environment? I have about 12 computers with 4GB each and 2GHz each. I need to run a time-consuming data transform and would like to use the combined power of these machines. They are all running Windows Server 2003.
Basically we have a large number of video files that we need to transform so our analysts can do their analysis. The problem is complicated by the fact I can't tell you more about the project.
I moved it to: https://serverfault.com/questions/40615/is-it-possible-to-create-a-faster-computer-from-many-computers
Yes, it's called Grid Computing or more recently, Cloud Computing.
There are many programming toolkits that are available to distribute your operations across a network. Everything from doing builds to database operations to complex mathematics libraries to special parallel programming languages.
There are solutions for every size product, from IBM and Oracle down to smaller vendors like Globus. And there are even open-source solutions, such as GridGain and NGrid (the latter is on SourceForge).
You will get more speed only if you can split the job and run it on multiple computers in parallel. Can you do that with your data transform program? One thing I am aware of is that Amazon supports map-reduce: if you can express your data transform problem as a map-reduce problem, you can potentially leverage Amazon's cloud-based Hadoop service.
There's really no "out of the box" way to just combine multiple computers into one big computer in a generic way like that.
The idea here is distributed computing, and you would have to write a program (possibly using an existing framework) that would essentially split your data transform into smaller chunks, send those off to each of the other computers to process, then aggregate the results.
Whether this would work or not would depend on the nature of your problem - can it be split into multiple chunks that can be worked on independently or not?
If so, there are several existing frameworks out there that you could use to build such an application. Hadoop, for example, which uses Map-Reduce, would be a good place to start.
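As a sketch of that split / process / aggregate shape (in Go, with a placeholder transform() and file list standing in for the real video conversion): within a single machine the "workers" are just goroutines reading from a channel; to span the 12 machines you would swap the channel for a shared queue, or use a framework such as Hadoop, and run the workers on each box.

```go
// Split the input into chunks, process the chunks concurrently, then
// aggregate the results. transform() and the file names are placeholders.
package main

import (
	"fmt"
	"sync"
)

// transform stands in for the real, time-consuming conversion of one file.
func transform(file string) string {
	return file + " (converted)"
}

func main() {
	files := []string{"a.avi", "b.avi", "c.avi", "d.avi"} // placeholder input
	jobs := make(chan string)
	results := make(chan string)

	const workers = 4 // one per core here; one per machine in the real setup
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for f := range jobs {
				results <- transform(f)
			}
		}()
	}

	// Feed the jobs, then close results once every worker has finished.
	go func() {
		for _, f := range files {
			jobs <- f
		}
		close(jobs)
		wg.Wait()
		close(results)
	}()

	// Aggregate: collect each result as it arrives.
	for r := range results {
		fmt.Println(r)
	}
}
```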
Your program will probably need to be modified to take advantage of multiple machines.
One method of doing this is to use an implementation of MPI (possibly MSMPI, as you're using Windows Server).
Try looking at Condor. Their homepage is light on info, so check out the wikipedia article first.
There are a variety of tools out there that can use a distributed network of computers.
An example is Incredibuild
Imagine a Beowulf cluster of these things!
