Real-time web browser game - asp.net-mvc-3

I'm trying to create a real-time web browser game in ASP.NET MVC 3, but I'm unsure about the best approach to processing 'real-time' events on the server side.
Imagine that the client wants to upgrade a building, and upgrading a building takes time. A record gets inserted into the database that holds the end time, and on the client an AJAX timer starts running. I was thinking about having a Windows service running all the time: every second the service checks the table and does the real processing once the current time passes the end time. However, I can imagine this getting problematic when there is a huge amount of data to process.
What would be the best way of doing this?
Cheers.

I've seen games like that before. You do not need to count down on the server yourself; you just need to store the datetime for when the building will be ready.
When a page is requested you send the timestamp of when the building will be completed, and use JavaScript to count down on the client - for example, something like the sketch below.
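A minimal Python sketch of that server-side idea (the class and field names are illustrative assumptions, not from the question): store the completion time when the upgrade starts, report the remaining seconds for the JavaScript countdown, and apply the upgrade lazily the next time the building is read, so nothing has to tick every second.

```python
from datetime import datetime, timedelta

# Hypothetical building record; in the real app this would be a database row.
class Building:
    def __init__(self, level):
        self.level = level
        self.upgrade_ends_at = None  # None means no upgrade in progress

    def start_upgrade(self, duration_seconds):
        self.upgrade_ends_at = datetime.utcnow() + timedelta(seconds=duration_seconds)

    def refresh(self):
        """Apply a finished upgrade lazily, the next time the building is read."""
        if self.upgrade_ends_at and datetime.utcnow() >= self.upgrade_ends_at:
            self.level += 1
            self.upgrade_ends_at = None

    def seconds_remaining(self):
        """Value sent to the page so JavaScript can count down client-side."""
        if self.upgrade_ends_at is None:
            return 0
        return max(0, int((self.upgrade_ends_at - datetime.utcnow()).total_seconds()))
```

The client just renders seconds_remaining() and counts down with a simple setInterval; no Windows service has to scan the table every second.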

Related

Laravel - Efficiently consuming large external API into database

I'm attempting to consume the Paypal API transaction endpoint.
I want to grab ALL transactions for a given account. This number could potentially be in the tens of millions of transactions. Each of these transactions needs to be stored in the database for processing by a queued job. I've been trying to figure out the best way to pull this many records with Laravel. PayPal has a maximum of 20 items per page per request.
My initial idea was to create a job when a user gives me their API credentials: it fetches the first 20 items and processes them, then dispatches another job carrying the next starting index, looping until it errors out. This doesn't seem to be working well, though, as it causes a gateway timeout on saving those API credentials, and the request to the API eventually times out (before getting all transactions). I should also mention that the total number of transactions is unknown, so chaining doesn't seem to be the answer, as there is no way to know how many jobs to dispatch...
Thoughts? Is getting API data best suited for a job?
Yes, a job is the way to go. I'm not familiar with the PayPal API, but it seems requests are rate limited (see PayPal's rate limiting docs), so you might want to delay your API requests a bit. You could also write a class that monitors your API request consumption by recording the latest requests you made in the database; inside the job you can then determine when to fire the next request and record it.
My humble advice:
Please don't pull all of the data. Your database will get bloated quickly and you'll need to scale every time you add a new account, which is not an easy task.
You could dispatch the same job again at the end of the first job, having it query your current database to find the starting index of the transactions for its run.
That way, even if a job errors out, you can dispatch it again and it will resume from where it previously ended.
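The original answer is about Laravel's queue, but the self-re-dispatching pattern itself is language-agnostic. A rough sketch of it in Python with Celery standing in for Laravel's queue; `stored_transaction_count`, `fetch_transactions`, and `save_transactions` are hypothetical helpers, not part of any real API:

```python
from celery import shared_task

PAGE_SIZE = 20  # PayPal's per-page limit, per the question

@shared_task
def import_transactions_page(account_id):
    """Fetch one page starting after what is already stored, then re-dispatch."""
    start_index = stored_transaction_count(account_id)              # hypothetical DB count query
    page = fetch_transactions(account_id, start_index, PAGE_SIZE)   # hypothetical API client call
    if not page:
        return  # nothing new; stop re-dispatching
    save_transactions(account_id, page)                             # hypothetical persistence helper
    # Enqueue the next page as a fresh job instead of looping inside one request,
    # so a failure can simply be re-dispatched and resume from the database state.
    import_transactions_page.delay(account_id)
```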
Maybe you will need to link your app with another data engine like AWS. In any case, I think the best idea is to create an API, pull only the most important data, keep it indexed, and keep all the big data behind another endpoint where you can reach it if you need it.

HowTo: Inform application that database table row is updated?

I am in the process of developing an MFC-based Windows application, using PostgreSQL, which needs to:
Fetch information from the UI
Perform some logic and store the related information in the database
Send the stored information over the network, either immediately or at a scheduled interval (e.g. at 5:00 on xyz date)
Currently, we have developed a dispatcher mechanism (a thread) which constantly polls the database for newly inserted information. The thread fetches the information and sends it to the network module.
But I feel this is not the correct approach, because:
Polling all the time is overhead; there can be stretches when there is nothing to process.
It is not real time, because we only poll every 5 seconds.
So
Is there a way to send a trigger to my network module as soon as information is updated in the database?
Or any better way to achieve this task?
Thanks in advance.
You can use the listen/notify feature of PostgreSQL for this.
http://www.postgresql.org/docs/current/static/sql-listen.html
http://www.postgresql.org/docs/current/static/sql-notify.html
The clients interested in the messages would execute a listen statement and the trigger would then notify them.
I don't use C#, but according to the manual you can retrieve the messages in an asynchronous manner - which still involves some "lightweight" polling, as the notification message is only sent as part of the server's answer to a statement. The manual claims that running an "empty" statement (such as ;) is enough. With Java/JDBC I used a simple select 42, which doesn't impose a big workload on the server as no tables are touched.
This polling is definitely faster and more scalable than actually retrieving the table's data.
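A minimal sketch of the listening side in Python with psycopg2 (the connection string and channel name are illustrative; the question's app is MFC/C++, so this only shows the mechanism):

```python
import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect("dbname=mydb user=myuser")  # connection details are an assumption
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

cur = conn.cursor()
cur.execute("LISTEN table_changed;")  # a trigger on the table would run: NOTIFY table_changed

while True:
    # Block until the connection's socket becomes readable (up to 5 s),
    # instead of repeatedly querying the table itself.
    if select.select([conn], [], [], 5) == ([], [], []):
        continue  # timeout, nothing happened
    conn.poll()
    while conn.notifies:
        notify = conn.notifies.pop(0)
        print("change signalled:", notify.channel, notify.payload)
        # hand off to the network module here
```

On the database side, an AFTER INSERT/UPDATE trigger function would execute NOTIFY table_changed (optionally with a row id as the payload).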
Yes, you are right #RDX: you shouldn't poll all the time. Instead you could write a trigger in Postgres and, from that trigger, call a Java program, as shown in the thread below.
Calling java pgm from Postgres trigger

AJAX Polling Question - Blocking Or Frequent?

I have a web application that relies on very "live" data - so it needs an update every 1 second if something has changed.
I was wondering what the pros and cons of the following solutions are.
Solution 1 - Poll A Lot
So every second, I send a request to the server and get back some data. Once I have the data, I wait one second before doing it all again. I would detect client-side whether the state had changed and take action appropriately.
Solution 2 - Block A Lot
So I start a request to the server that will time out after 30 seconds. The server keeps an eye on the data by checking it once per second; if it notices the data has changed, it sends the data back to the client, which takes action appropriately.
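For illustration only, here is roughly what the two client loops look like, sketched in Python with the requests library (the URL, the `wait` parameter, and the response shape are all assumptions; a browser would do the same with XMLHttpRequest/fetch):

```python
import time
import requests

POLL_URL = "https://example.com/updates"   # hypothetical endpoint

def handle(data):
    print("state changed:", data)           # stand-in for the real client-side handling

def poll_a_lot():
    """Solution 1: one request per second; each returns immediately."""
    last = None
    while True:
        data = requests.get(POLL_URL).json()
        if data != last:                     # detect the change client-side
            handle(data)
            last = data
        time.sleep(1)

def block_a_lot():
    """Solution 2: one long request; the server holds it open until the data
    changes or ~30 seconds pass, then the client immediately reconnects."""
    while True:
        try:
            resp = requests.get(POLL_URL, params={"wait": 30}, timeout=35)
        except requests.exceptions.Timeout:
            continue                         # nothing changed in this window
        if resp.status_code == 200:
            handle(resp.json())
```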
Scenario
Essentially, the data is reasonably small in size, but changes at random intervals based on live events. The thing is, the web UI will be running something in the region of 2,000 instances, so do I have 2,000 requests per second coming from the UI or do I have 2,000 long-running requests that take up to 30 seconds?
Help and advice would be much appreciated, especially if you have worked with AJAX requests under similar volumes.
One common solution for such cases is to use static JSON files. Server-side scripts update them whenever the data changes, and they are served by a fast, lightweight web server (like nginx). Since the files are static and small, the web server will serve them straight from cache, very quickly.
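A tiny sketch of the producer side of that approach in Python (the path and data shape are assumptions); clients then just poll the static file through nginx:

```python
import json
import os
import tempfile

DOCROOT_PATH = "/var/www/static/state.json"  # hypothetical nginx-served location

def publish_state(state: dict) -> None:
    """Write the latest state to a temp file, then atomically swap it into place,
    so pollers never see a half-written file."""
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(DOCROOT_PATH))
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp_path, DOCROOT_PATH)
```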
Consider a better architecture. Implementing this kind of messaging system is trivial to do right in something like nodeJS. Message dispatch will be instantaneous, and you won't need to poll for your data on either side.
You don't need to rewrite your whole system: The data producer could simply POST the updates to the nodeJS server instead of writing them to a file, and as a bonus, you don't even need to waste time on disk IO.
If you started without knowing any nodeJS, you could still be done in a couple hours, because you can just hack up the chat example.
I can't comment yet, but I would agree with geocar. Running live or almost-live web services with just polling will leave you stuck between a rock and a hard place.
You could also look into WebSockets to allow push, as that sounds like a better fit here than polling on an interval of 1 to 30 seconds.
Good luck!

tweepy Streaming API integration with Django

I am trying to create a Django webapp that utilizes the Twitter Streaming API via the tweepy.Stream() function. I am having a difficult time conceptualizing the proper implementation.
The simplest functionality I would like to have is to count the number of tweets containing a hashtag in real time. So I would open a stream, filtering by keywords, and every time a new tweet comes over the connection I increment a counter. That counter is then displayed on a webpage and updated with AJAX or otherwise.
The problem is that the tweepy.Stream() function must be continuously running and connected to Twitter (that's the point). How can I have this stream running in the background of a Django app while incrementing counters that can be displayed in (near) real time?
Thanks in advance!
There are various ways to do this, but using a messaging library (Celery) will probably be the easiest.
1) Keep a Python process running tweepy. Once an interesting message is found, create a new Celery task.
2) Inside this Celery task, persist the data to the database (the counter, the tweets, whatever). This task can happily run Django code (e.g. the ORM).
3) Have a regular Django app display the results your task has persisted.
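A condensed sketch of steps 1 and 2, using the tweepy 3.x streaming API and a Celery task; the credentials, the hashtag, and the HashtagCount model are placeholder assumptions:

```python
import tweepy
from celery import shared_task

@shared_task
def record_tweet(tweet_id):
    """Step 2: runs inside the Celery worker and can use the Django ORM."""
    from myapp.models import HashtagCount          # hypothetical Django model
    counter, _ = HashtagCount.objects.get_or_create(tag="django")
    counter.count += 1
    counter.save()

class CountingListener(tweepy.StreamListener):
    """Step 1: long-lived tweepy process; hands each tweet off to Celery."""
    def on_status(self, status):
        record_tweet.delay(status.id)

    def on_error(self, status_code):
        return False  # disconnect on errors; the supervisor restarts the process

if __name__ == "__main__":
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")  # placeholder credentials
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
    stream = tweepy.Stream(auth=auth, listener=CountingListener())
    stream.filter(track=["#django"])  # hashtag is an example
```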
As a precaution, it's probably a good idea to run the tweepy process under supervision (supervisord might suit your needs). If anything goes wrong with it, it can be restarted automatically.

Reverse AJAX? Can data changes be 'PUSHED' to script?

I have noticed that some of my ajax-heavy sites (ones I visit, not ones I have built), have certain auto-refresh features. For example, in GMail, if I get a new message, I see the new message without a page reload. It's the same with the Facebook browser-based IM client. From what I can tell, there aren't any java applets handling the server-browser binding, so I'm left to assume it's being done by AJAX and perhaps some element I'm unaware of. So by my best guess, it's done in one of two ways:
The JavaScript does a steady "ping" to a server-side script, checking for any updates that might be available (which would explain why some of these pages bring any other heavy-duty pages to a crawl), or
The JavaScript sits idly by and a server-side script actually "pushes" any updates to the browser. But I'm not sure if this is possible. I'd imagine there is some kind of AJAX function that still pings, but all it asks is "any updates?" and the server-side script has a simple boolean that says "nope" or "I'm glad you asked." But if this is the case, any data changes would need to call the script directly so that it has the data changes ready and can flip that boolean.
So is that possible/feasible/how it works? I imagine something like:
Someone sends an email/IM/DB update to the server, the server calls the script using the script's URL plus some relevant GET variable, the script notes the change and updates the "updates available" variable, the AJAX gets the response that there are in fact updates, and the AJAX runs its normal "update page" functions, which execute the normal update scripts and output them to the browser.
I ask because it seems really inefficient that the js is just doing a constant check which requires a) the server to do work every 1.5 seconds, and b) my browser to do work every 1.5 seconds just so that on my end I can say "Oh boy, I got an IM! just like a real IM client!"
Read about Comet
I've actually been working on a small .NET web app that uses the Ajax long polling technique described.
Depending on what technology you're using, you could use thread signaling mechanisms to hold your request until an update is retrieved.
With ASP.NET I'm running my server on a single machine, so I store a reference to my Producer object (which contains a thread that processes the data). To initiate the data pull, my service's Subscribe method is called, which creates a Consumer object that's registered with the Producer. If the Consumer is in long polling mode, it has an AutoResetEvent which is signaled whenever it receives new data; whenever the web client makes a request for data, the Consumer first waits on the reset event and then returns the data.
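The same signaling idea, sketched here in Python with threading.Event rather than .NET's AutoResetEvent (the class and method names are illustrative, not the original app's):

```python
import threading

class Consumer:
    """One registered long-polling client: blocks until the producer signals new data."""
    def __init__(self):
        self._event = threading.Event()
        self._data = None

    def push(self, data):
        self._data = data
        self._event.set()          # wake up the waiting request

    def wait_for_data(self, timeout=30):
        """Called by the web request handler; returns None if nothing arrived in time."""
        if not self._event.wait(timeout):
            return None
        self._event.clear()        # behave like AutoResetEvent: re-arm after one wait
        return self._data

class Producer:
    """Holds the data-processing thread's output and fans it out to consumers."""
    def __init__(self):
        self._consumers = []
        self._lock = threading.Lock()

    def subscribe(self):
        consumer = Consumer()
        with self._lock:
            self._consumers.append(consumer)
        return consumer

    def publish(self, data):
        with self._lock:
            for consumer in self._consumers:
                consumer.push(data)
```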
But you're mentioning something about PHP - as far as I know, persistence there is maintained through serialization rather than actually keeping the object in memory, so I don't know how you could reference a Producer object using $_CACHE[] or $_SESSION[]. When I developed in PHP I never really dealt with multithreading, so I didn't play around with it, but I guess you can look into that.
Using infinite loops is going to consume a lot of your processing power - I would exhaust all other options first.
