I have an AJAX page which pulls data from a database. I'd like to add a Processing applet to visualize the data, but I can't figure out how to update the visualization as the data changes. The idea is to be able to push new data into the visualization.
I'm not tied to Processing as the technology; anything will work. Processing just seems to be the easiest way to make it look nice. Thanks for the advice.
The easiest way is to construct your own XML structure (base64-encode binary data if you need to) and add a timer in your applet that retrieves updates from the server through HTTP requests. How to prepare and process the XML is up to you.
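For illustration, a minimal polling sketch inside an applet could look something like this (the data.xml endpoint, the 5-second interval, and the updateVisualization method are all placeholders, not anything from your project):

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.io.InputStream;
import java.net.URL;
import javax.swing.Timer;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class PollingApplet extends java.applet.Applet {

    private Timer poller;

    @Override
    public void start() {
        // Poll the server every 5 seconds (interval and "data.xml" are placeholders).
        poller = new Timer(5000, new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                try {
                    // getCodeBase() points back at the server the applet came from,
                    // which keeps the request within the applet sandbox rules.
                    URL url = new URL(getCodeBase(), "data.xml");
                    InputStream in = url.openStream();
                    try {
                        Document doc = DocumentBuilderFactory.newInstance()
                                .newDocumentBuilder().parse(in);
                        updateVisualization(doc);
                    } finally {
                        in.close();
                    }
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
            }
        });
        poller.start();
    }

    @Override
    public void stop() {
        if (poller != null) {
            poller.stop();
        }
    }

    private void updateVisualization(Document doc) {
        // Placeholder: walk the DOM and push the new values into your sketch here.
    }
}
```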
Applets are a bit heavyweight for visualization, so if the same thing can be done in Flash, I'd recommend using that instead. Flash also has support for HTTP requests (or you can let JavaScript handle it).
I'm not sure how you could facilitate communication between the two, but as a possible alternative you could look at processing.js, which is Processing implemented in JavaScript.
Applets run in their own sandbox, but they can still make HTTP requests back to the server they were loaded from. Look at the java.net.URL and java.net.HttpURLConnection classes.
You could make HTTP requests from the Java applet. I don't know anything about Processing applets, though.
I am using a React Native app with Relay Modern.
Currently our app's fetchQuery implementation just does a fetch on the network (like in https://facebook.github.io/relay/docs/en/network-layer.html), although there is also the possibility of a local network layer like https://github.com/relay-tools/relay-local-schema, which returns data from a local DB like SQLite/Realm.
Is there a way to set up an offline-first response from the local network layer, followed by an automatic request to the real network, which also populates the store with fresher data (along with writing to the local DB)?
Also should/can they share the same store?
From the requirements of Network.create(), it should return a promise containing the payload; there does not seem to be a way to return multiple values.
Any ideas/help/suggestions are appreciated.
What you are trying to achieve is complex, so I'll go for the easy approach, which is a long-lived cache.
As you might know, Relay Modern uses a local store that is an exact copy of the data you are fetching. You can configure this store cache as per your needs; mutations are not cached.
To understand how this is achieved, the best library around for customising the Relay Modern or Classic network layer can be found at https://github.com/nodkz/react-relay-network-modern
My recommendation: set up your cache and watch your requests (you're going to love it).
See also Thinking in Relay:
https://facebook.github.io/relay/docs/en/thinking-in-relay.html
Thrift sounds awesome, but I can't find some basic stuff I'm used to from RPC frameworks (such as HttpServlet). Examples of things I can't find: session management, filtering, and upload/download progress.
I understand that the missing stuff might belong in a management layer on top of Thrift. If so, are there any examples of such a layer? Perhaps AOP (Aspect-Oriented Programming)?
I can't imagine such a layer that compiles to all languages, and that's what I'm missing. Taking session management as an example: there might be several clients that all need to authenticate and pass the session_id on each RPC. I would expect a similar API for doing so in all languages.
Does anyone know of a management layer for Thrift?
So Thrift itself is not going to help you out a lot here.
I have had similar desires, and have a few suggestions:
1. Put your management objects into the IDL
Simply add an API token or a common transfer-data struct as a parameter to all of your service methods. Set it as parameter id 15 so that it will always be the last parameter, even if you add others in the middle.
As the first step in your handler you can validate/store/do whatever you need with the extra data, as sketched below.
This has the advantage that it is valid in any platform that thrift supports.
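As a rough sketch of what that could look like in a Java handler (UserService, RequestContext, and User are hypothetical names standing in for your own generated types; only org.apache.thrift.TException is real):

```java
import org.apache.thrift.TException;

// Hypothetical handler for a generated UserService.Iface whose methods all
// take a RequestContext struct declared in the IDL with field id 15.
public class UserServiceHandler implements UserService.Iface {

    @Override
    public User getUser(String userId, RequestContext ctx) throws TException {
        checkContext(ctx);        // validate/store the management data first
        return loadUser(userId);  // then do the real work
    }

    private void checkContext(RequestContext ctx) throws TException {
        if (ctx == null || ctx.getSessionId() == null || !isValidSession(ctx.getSessionId())) {
            throw new TException("invalid or missing session");
        }
    }

    private boolean isValidSession(String sessionId) {
        // look the session up in whatever store you use
        return true;
    }

    private User loadUser(String userId) {
        return new User(); // placeholder
    }
}
```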
2. Use thrift over http
If you use HTTP as your transport, you can include whatever data you want as HTTP headers, with the Thrift content as the body.
This will often require a custom HTTP client for every platform you use in order to inject the data, and a custom handler on the server to use the data, but neither of those is prohibitively difficult.
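A hedged example of the client side in Java might look like the following; as far as I recall the Java THttpClient transport lets you set custom headers, and the generated UserService.Client stub and the URL are placeholders:

```java
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.protocol.TProtocol;
import org.apache.thrift.transport.THttpClient;

// Client side: send the Thrift payload over HTTP and carry the session id
// as a plain HTTP header.
public class HttpThriftClientExample {
    public static void main(String[] args) throws Exception {
        THttpClient transport = new THttpClient("http://example.com/thrift/user"); // placeholder URL
        // setCustomHeader is available on the Java THttpClient transport (check your
        // thrift version); other languages need the equivalent on their HTTP client.
        transport.setCustomHeader("X-Session-Id", "abc123");
        transport.open();

        TProtocol protocol = new TBinaryProtocol(transport);
        UserService.Client client = new UserService.Client(protocol); // hypothetical generated stub
        System.out.println(client.getUser("42"));

        transport.close();
    }
}
```

On the server side you can expose the processor behind a servlet or filter and read the header from the HttpServletRequest before handing the body to the Thrift processor.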
3. Hack the protocol
It is possible to create your own custom protocol that wraps another protocol and injects custom data. Take a look at how the multiplexed protocol works in the Thrift library for most languages:
C# here. It sends the method name across the wire as service:method. The multiplexed processor unwraps this encoding and passes it on to the appropriate processor.
I have used a similar method to encode arbitrary key/value pairs (like http headers) inside the method name.
The downside to this is that you need to write a more complicated extension for each platform you will be using, but only once. How this works varies a bit from language to language, but it is generally simple enough once you have figured it out the first time.
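To make that concrete, here is a very rough Java sketch modeled on TMultiplexedProtocol; the key=value#method encoding is invented for illustration, and a matching processor wrapper on the server would have to strip it off again:

```java
import org.apache.thrift.TException;
import org.apache.thrift.protocol.TMessage;
import org.apache.thrift.protocol.TMessageType;
import org.apache.thrift.protocol.TProtocol;
import org.apache.thrift.protocol.TProtocolDecorator;

// Wraps another protocol and smuggles a key/value pair into the method name,
// the same trick TMultiplexedProtocol uses for "service:method".
public class HeaderProtocol extends TProtocolDecorator {

    private final String key;
    private final String value;

    public HeaderProtocol(TProtocol delegate, String key, String value) {
        super(delegate);
        this.key = key;
        this.value = value;
    }

    @Override
    public void writeMessageBegin(TMessage message) throws TException {
        if (message.type == TMessageType.CALL || message.type == TMessageType.ONEWAY) {
            // e.g. "session=abc123#getUser"; the server-side processor wrapper
            // must strip this prefix back off before dispatching.
            String decorated = key + "=" + value + "#" + message.name;
            super.writeMessageBegin(new TMessage(decorated, message.type, message.seqid));
        } else {
            super.writeMessageBegin(message);
        }
    }
}
```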
These are just a few ideas I have had, and I am sure there are others. The nice thing about Thrift is how the individual components are decoupled from each other. If you have special needs, you can swap any of them out to add specific functionality.
I would like to create a web page that allows multiple users to work together on a page. As an example of this type of feature, imagine a web-based editor that allows several users to change a document at the same time.
How would more experienced programmers go about implementing this? I really can't seem to formulate any way to even begin this task. Are there any programming libraries that make implementing this feature easier, or is it just too complex to even think about?
I am creating this webapp primarily using GWT and SmartGWT if that helps.
Thanks for any input you may have.
There is indeed a Comet-style library for GWT: http://code.google.com/p/gwteventservice/
Wiki:
In web development, Comet is a neologism to describe a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it. Comet is an umbrella term for multiple techniques for achieving this interaction. All these methods rely on features included by default in browsers, such as JavaScript, rather than on non-default plugins.
In practice:
Normally, a client receives resources via request->response; it is not possible to send data directly to the client without a request. With Comet you can hold a long-lived connection between client and server and exchange data in real time.
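If you want to see the mechanics without a framework, a bare-bones long-poll endpoint using the Servlet 3.0 async API looks roughly like this (gwteventservice and other Comet libraries hide this plumbing for you; the /updates path and the publish() hook are illustrative):

```java
import java.io.IOException;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Each client "subscribes" with a GET that is parked until the server has
// something to say; publish() completes all parked requests with the update.
@WebServlet(urlPatterns = "/updates", asyncSupported = true)
public class LongPollServlet extends HttpServlet {

    private final Queue<AsyncContext> waiting = new ConcurrentLinkedQueue<AsyncContext>();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        AsyncContext ctx = req.startAsync();
        ctx.setTimeout(30000); // client simply re-polls after a timeout
        waiting.add(ctx);
    }

    // Call this from wherever a document change happens (e.g. another request).
    public void publish(String message) {
        AsyncContext ctx;
        while ((ctx = waiting.poll()) != null) {
            try {
                ctx.getResponse().getWriter().write(message);
                ctx.complete(); // the held response goes back to the browser now
            } catch (IOException ignored) {
            }
        }
    }
}
```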
Check out docs.google.com; they are using Comet.
Etherpad.com is a service that used to do this. It has since been bought by Google, and the code released as open source. You can see several links on the etherpad.com page for the source download and related information.
I am sure the answers to this question will be very subjective; I simply want to know what options are out there for building a proxy to load external content.
Typically I use cURL in PHP and pass a variable like proxy.url to fetch content, then make an AJAX call with JavaScript to populate the contents.
EDIT:
YQL (Yahoo Query Language) seems a very promising solution to me; however, it has a daily usage limit which essentially prevents me from using it for large-scale projects.
What other options do I have? I am open to any language and any platform; the key criteria are performance and scalability.
Please share your ideas, thoughts and experience on this topic.
Thanks,
You don't need a proxy server or anything else.
Just create a cronjob to fetch the contents every 5 minutes (or whenever you want).
You just need to create a script, started by the cronjob, that grabs the content from the web and saves it (to a file, a database, ...).
If somebody requests your page, you just send out the cached content and do with it whatever you want.
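As a sketch of how small the fetch-and-cache script can be (written in Java here only to match the other examples; a few lines of PHP with cURL and file_put_contents would do the same job, and the URL and cache path are placeholders):

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Run from cron, e.g.:  */5 * * * *  java -cp . CacheFetcher
// Downloads the external content and overwrites the local cache file;
// your page then serves cache/content.html instead of proxying the request.
public class CacheFetcher {
    public static void main(String[] args) throws Exception {
        URL source = new URL("http://example.com/feed"); // placeholder source
        try (InputStream in = source.openStream()) {
            // assumes the cache/ directory already exists
            Files.copy(in, Paths.get("cache/content.html"),
                       StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```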
I think scalability and performance will be no problem.
Depending on what you need to do with the content, you might consider Erlang. It's lightning fast, ridiculously reliable, and great for scaling.
I want to build an Ajax GUI that is notified of any state changes happening in my EJB application. To achieve this, I thought I would build a stateful EJB (3.0) that implements Observable, to which the Ajax client is added as an observer.
First, is this possible with Ajax? If yes, is this a good design, or is there a more appropriate way to do this?
Thanks in advance!
Cheers,
Andreas
It sounds like you are interested in 'Reverse Ajax', where the client is notified when an event happens server-side. This is different from standard Ajax, where an asynchronous request is sent to the server based on some client action. Reverse Ajax is possible, and one framework that does a very good job of enabling this and simplifying the underlying complexity is DWR.
http://directwebremoting.org/dwr/reverse-ajax
You'll want to read up on the performance implications of the various ways to implement based on your expected load, webapp container, etc. regardless of which framework you use.
As for whether or not it is good practice, that really depends on your application. If it is important to get near-real-time data pushed back to the client and you don't want to use something like Flex or other heavier frameworks, then I'd say you are on the right track. If the data does not need to be real time, or if your load is extremely high, then perhaps a simpler approach like a scheduled page refresh will save you some complexity and help with performance.
Now, some time later, there is a new possible answer to your question: use WebSockets.
From the website Pete linked previously: "The web was not designed to allow web servers to make connections to web browsers..." That is changing now with HTML5.
http://en.wikipedia.org/wiki/WebSockets
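For completeness, a minimal server-push endpoint with the Java WebSocket API (JSR 356) could look roughly like this; the /state path and the message format are just illustrative:

```java
import java.io.IOException;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;
import javax.websocket.OnClose;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

// Browsers connect with: new WebSocket("ws://host/app/state");
// the server can then push state changes to every connected client at any time.
@ServerEndpoint("/state")
public class StateEndpoint {

    private static final Set<Session> sessions = new CopyOnWriteArraySet<Session>();

    @OnOpen
    public void onOpen(Session session) {
        sessions.add(session);
    }

    @OnClose
    public void onClose(Session session) {
        sessions.remove(session);
    }

    // Call this from your application (e.g. an event observer in the EJB layer)
    // whenever the server-side state changes.
    public static void broadcast(String message) {
        for (Session s : sessions) {
            try {
                s.getBasicRemote().sendText(message);
            } catch (IOException ignored) {
            }
        }
    }
}
```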