I need to show the server's real-time logs in a JSP textarea dynamically. I am currently fetching the logs with an AJAX callback, but I am not sure whether this is the right approach. Is there a way to keep the server and client synchronized so the server logs appear in the JSP page? Please help.
If you are retrieving logs from a database with fine-grained timestamps, then AJAX polling is the best approach. I would recommend caching the logs after retrieving them from the database if your application has a large user base.
Here is a good tutorial on long polling; you can reduce the wait time:
Long polling using jquery
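As a minimal sketch of the long-polling pattern described above (assuming a hypothetical `/logs` endpoint that holds the request open until new log lines are available, and a textarea with id `log` in the JSP page):

```javascript
// Build the poll URL; `since` is the timestamp/offset of the last line we saw.
function buildPollUrl(base, since) {
  return base + '?since=' + encodeURIComponent(since);
}

// Long-poll loop: the server holds each request open until it has new log
// lines (or times out), so the client re-issues the request immediately.
async function pollLogs(since) {
  try {
    const res = await fetch(buildPollUrl('/logs', since));
    const data = await res.json();            // assumed shape: { lines: [...], since: 123 }
    const area = document.getElementById('log');
    area.value += data.lines.join('\n') + '\n';
    area.scrollTop = area.scrollHeight;       // keep the newest lines visible
    since = data.since;
  } catch (e) {
    await new Promise(r => setTimeout(r, 5000)); // back off on errors
  }
  pollLogs(since);                            // open the next request right away
}
```

The page would call `pollLogs(0)` once after load; the endpoint name and response shape are assumptions, not part of the original answer.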
An AJAX request can fetch the logs from the server; that is the poll model.
If you want better performance, you can use an AJAX push pattern known as Comet.
Here is a project that implements it, CometD:
http://cometd.org/
I am hoping someone can point me in the right direction. I have a CF2021 server that uses a Node.js WebSocket server, with CF pages (via JavaScript) as clients. Messages from user to user work as expected, so no issue there.
This CF server also has a custom API, built in CFML, that handles and routes inbound SMS messages. My question is: what would be the best way to send the SMS message (by now it is JSON) to the Node.js WebSocket server so it can forward it to the user(s)?
I tried using the same JavaScript that the browser client uses, but it appears the CFML API script runs "browser-less", so that doesn't work (or should it?).
I thought something like Apache Groovy might be the solution, but I am having difficulty with every WebSocket example I have found.
Any suggestions would be greatly appreciated.
thanks in advance
Flow matters.
If you want to handle an incoming message by delivering it to all currently logged-in users who are subscribed to messages of that sort: set up your message handler to deliver it using Lucee/Adobe ColdFusion WebSockets. Be forewarned, Lucee takes some setup time, but once it is running it is a great solution.
If you don't need immediate delivery, or you need a super simple solution: I have actually had some success with long polling. You just have to remember to flush early in the request, before any pause/sleep, and then loop your message-lookup requests for new data with a 1-5 second delay between iterations. Once new data is found, I like to return the response to the client, close that polling request, and start a new one from the client side. I typically won't poll for more than 60 seconds, even if the returned JSON object is empty.
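The client side of that loop (check, sleep a few seconds, give up after 60 seconds even with an empty result, then start a fresh request) might look like this; `checkMessages` is a hypothetical function that resolves with any new data:

```javascript
const MAX_POLL_MS = 60000;   // give up after 60 s, even if nothing arrived
const LOOP_DELAY_MS = 3000;  // 1-5 s between lookups, per the answer above

// Pure helper: should we keep looping inside the current request?
function shouldKeepPolling(startedAt, now, maxMs) {
  return (now - startedAt) < maxMs;
}

// One long-poll cycle: loop message lookups until data arrives or we time
// out, then hand control back so the caller can start a fresh cycle.
async function pollOnce(checkMessages) {
  const startedAt = Date.now();
  while (shouldKeepPolling(startedAt, Date.now(), MAX_POLL_MS)) {
    const data = await checkMessages();       // e.g. fetch a "new messages" URL
    if (data && data.length) return data;     // new data: end this cycle early
    await new Promise(r => setTimeout(r, LOOP_DELAY_MS));
  }
  return [];                                  // empty result after 60 s
}
```

The caller would render the result and immediately start the next `pollOnce` cycle, matching the "close and restart" pattern the answer describes.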
The idea behind Varnish is that cached pages never touch your Ruby/PHP code base; they are served directly from the cache. What if I have an e-commerce site and, for each category page, I want to log that the page was viewed by user/IP address X at a given time? I have put this logging code in my PHP code, but when I run the app behind Varnish I lose that ability entirely. I am pretty new to gateway/proxy caches; can anyone enlighten me?
The easiest and most efficient way to solve this is to create an AJAX request that does just the logging. That way you can still cache the whole page while disabling caching for the logging request, so it records every user. Forward the client IP from Varnish to that request (with X-Forwarded-For); the URL you can easily get with JavaScript and include in the AJAX call (browser referral headers are not reliable).
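A sketch of that logging request, assuming a hypothetical uncached `/log` endpoint on the backend (the endpoint name and payload shape are made up for illustration):

```javascript
// Collect what the PHP logger used to capture; the client IP is added by
// Varnish via X-Forwarded-For, so it is not part of this payload.
function buildLogPayload(pageUrl, referrer) {
  return JSON.stringify({
    url: pageUrl,
    referrer: referrer || null,   // referral headers are unreliable anyway
    ts: Date.now()
  });
}

// Fire-and-forget logging call; Varnish must be configured to pass (not
// cache) requests to /log so every user actually hits the backend.
function logPageView() {
  fetch('/log', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildLogPayload(location.href, document.referrer)
  });
}
```

The category page stays fully cacheable; only this small request bypasses the cache.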
A rather simple alternative is to write a script that parses the Varnish logs and extracts the pages of interest, IPs, and other useful information. It can be run once a day or more often, depending on your needs.
By using an AJAX request as in Clarence's answer, you risk missing visitors who have JavaScript disabled (but you get your stats in real time).
You can add some logic to Varnish to forward the client IP address, so you would have complete web server logs of client IPs and requested URLs.
This example is for apache: Varnish Client IP not logging in Apache Logs
[Edit] The above suggestion only works for cache misses; client-side JS is recommended.
You could also consider using JavaScript to report page views to an analytics service such as Google Analytics: http://www.google.com/analytics/
I'm building a Grails app that queries several APIs across the web. The problem is that these queries are very time-consuming, and it is really annoying for the user to click a button and then wait a long time with nothing changing on the page.
The basic architecture of my app is: when the user clicks the button, the client performs an AJAX request with the Prototype library. The server then connects to the Last.fm API and retrieves a list of events. When the server has finished populating the list of events, it sends a JSON response, which the client uses to create a marker for each event on a Google map.
What I want to achieve is for the server to send a JSON response as soon as it retrieves each event, instead of waiting for all the events to be retrieved, so the client can populate the map while the remaining events are still coming in.
Any ideas on how I can implement this? I read about the concept of AJAX push, but I'm not sure it is what I need.
Thanks for the help!
There is no way to open a listening connection on the client that your server could connect to. Instead, what you need is a connection to the server that is kept alive and used to receive events. That way, the "action" request could return immediately, and the client would be informed through the persistent event connection once each Last.fm request completes.
That said, the way I would implement it is with an AJAX keep-alive connection.
Take a look at cometd plugin.
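One way to get such a persistent event connection in the browser is Server-Sent Events. A sketch, assuming a hypothetical `/events` endpoint that pushes one JSON object per retrieved Last.fm event (the endpoint and event fields are illustrative, not part of the cometd plugin):

```javascript
// Pure helper: turn one pushed event body into a marker description.
function parseEventData(json) {
  const ev = JSON.parse(json);
  return { title: ev.title, lat: ev.lat, lng: ev.lng };
}

// The browser keeps this connection open; the server pushes each event as
// soon as it comes back from the Last.fm API, so the map fills incrementally.
function listenForEvents(onMarker) {
  const source = new EventSource('/events');
  source.onmessage = function (msg) {
    onMarker(parseEventData(msg.data));   // add one map marker per event
  };
  return source;                          // caller can close() when done
}
```

The `onMarker` callback would create the Google Maps marker, replacing the single "all events at once" JSON response.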
I'm trying to improve the performance of my GWT app.
My app makes a lot of RPC requests, so I am trying to cache them on the client.
Each RPC request returns a list of records (normally 100). I am storing them on the client as a Java List, but I notice the browser cannot cope with this number of objects; performance collapses.
I am thinking of storing the result of each request in a cookie as some kind of JSON and retrieving it when needed; in other words, caching the requests in cookies rather than in the browser's RAM.
Can somebody suggest anything?
Will this approach work, or is it a bad idea?
Does anybody have a better solution?
Maybe you want to have a look at this question: Client side caching in GWT
Cookies are actually a terrible place to store stuff, because they get sent to the server on every request, even RPC (Ajax) requests.
I think what you want is local storage, which has some form of implementation in every modern browser. I'm not sure about older versions of IE, but you could fall back to Flash for local storage there.
Edit
GWT doesn't have any native support for local storage (that I know of), but you could always use JSNI for that.
Also, storing stuff in JS is a perfectly valid way to do it; you just need to make sure entries expire and fall out of the cache. Is your current cache solution just growing forever? Because that will certainly kill the browser eventually.
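A sketch of cache entries with expiry on top of Web Storage, in plain JavaScript since GWT would reach it through JSNI; the key names and the 5-minute TTL are illustrative assumptions:

```javascript
const CACHE_TTL_MS = 5 * 60 * 1000;   // assumed: entries expire after 5 minutes

// Wrap the RPC records with the time they were stored.
function makeEntry(records, now) {
  return { storedAt: now, records: records };
}

// Pure expiry check, so stale entries fall out of the cache.
function isExpired(entry, now, ttlMs) {
  return (now - entry.storedAt) > ttlMs;
}

// localStorage only holds strings, so serialize via JSON.
function cachePut(key, records) {
  localStorage.setItem(key, JSON.stringify(makeEntry(records, Date.now())));
}

function cacheGet(key) {
  const raw = localStorage.getItem(key);
  if (!raw) return null;
  const entry = JSON.parse(raw);
  if (isExpired(entry, Date.now(), CACHE_TTL_MS)) {
    localStorage.removeItem(key);     // evict stale entries on read
    return null;
  }
  return entry.records;
}
```

Unlike cookies, nothing here is sent to the server on each request, and the expiry check keeps the cache from growing forever.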
A cookie is sent as a header field in the HTTP response from the web server to the browser, and is then sent back unchanged by the browser on every subsequent request to that server.
Think about the traffic...
@darkporter I completely agree, and would add that Web Storage support is coming in the next GWT 2.3 release. So if you need it right now, you could pull those classes from GWT trunk and use them in your project.
I'm developing a web application that uses AJAX heavily. It sends several AJAX requests to the server per second and updates the page's content based on the results.
Is there a way to save all the AJAX responses sent from the server to the browser to disk, so I can analyze them later?
Showing an alert after each request is not practical: there are too many requests, and I would not be able to tell whether the browser is working well that way.
I've tried Firebug and Greasemonkey in Firefox, but I could not save the data to disk with either.
Is there any solution for it? Thanks for your help.
Have you tried Fiddler?
It's an application that monitors all HTTP traffic. It's good, and it's free.