I am currently working on a Xamarin.Forms application that needs to interact with a Web API. There are times, however, when the call may fail because of poor internet connectivity or because the server is down. In such situations, I would like to put the data that needs to be sent into a queue and try again later.
I was able to put the data into a queue; my question is how I can run some kind of timer so that I keep retrying the API at a regular interval for the items in the queue. Could someone guide me on the best practice in such scenarios?
Thanks
If you don't want to implement this yourself, there is a library for that.
In particular, take a look at the Retry feature.
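For illustration only: the question targets Xamarin.Forms/C#, where a retry library covers this, but the underlying pattern is simply "queue the failed call, retry it on a timer, cap the attempts". Here is a minimal sketch of that pattern in TypeScript; every name in it is invented and it is not any particular library's API.

```typescript
// Hypothetical sketch: queue failed API calls and retry them on a timer.
// None of these names come from a real library.
type PendingCall = {
  payload: unknown;
  attempts: number;
};

class RetryQueue {
  private queue: PendingCall[] = [];

  constructor(
    private send: (payload: unknown) => Promise<void>, // the actual Web API call
    private intervalMs = 30_000,                        // how often to retry
    private maxAttempts = 10,                           // give up after this many tries
  ) {}

  enqueue(payload: unknown): void {
    this.queue.push({ payload, attempts: 0 });
  }

  // Try every queued item once; keep the ones that still fail.
  async flush(): Promise<void> {
    const pending = this.queue;
    this.queue = [];
    for (const item of pending) {
      try {
        await this.send(item.payload);
      } catch {
        item.attempts += 1;
        if (item.attempts < this.maxAttempts) this.queue.push(item);
      }
    }
  }

  // Kick off the periodic retry timer.
  start(): void {
    setInterval(() => void this.flush(), this.intervalMs);
  }
}
```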
I am new to GraphQL. We have built a backend GraphQL server using Elixir, and we are building a frontend app using React and react-relay.
My question is whether it is better to have one large subscription at the root of my query renderer, or lots of smaller subscriptions for individual components. I would prefer many small subscriptions over a few (or even one) very large ones, but there are concerns that too many subscriptions will be very heavy. Is this concern valid?
TIA
There are a few things to consider here, and really, they all depend on what your definition of "very heavy" is. Note "very heavy" might mean something very different for your Elixir server implementation than it does on the client, so I will attempt to cover some directions you may want to investigate for both here.
What is your subscription transport? WebSockets can be expensive and difficult to scale on both ends at a certain point, but if you can deal with unidirectional data flow (server to client only), SSE (Server-Sent Events) are a great option. See more on a breakdown between SSE and WS here. This is more a comment on your server than on your client.
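If you do go the SSE route, the client side is just the standard EventSource browser API. A minimal sketch, assuming a hypothetical endpoint path and event name:

```typescript
// Minimal browser-side SSE consumer; the endpoint path and the event
// name "next" are assumptions, not part of any specific server.
const source = new EventSource("/graphql/stream");

// Unnamed events arrive via onmessage.
source.onmessage = (event) => {
  console.log("payload:", JSON.parse(event.data));
};

// Named events can be handled individually.
source.addEventListener("next", (event) => {
  console.log("subscription result:", JSON.parse((event as MessageEvent).data));
});

source.onerror = () => {
  // EventSource reconnects automatically; this is just for visibility.
  console.warn("SSE connection dropped, browser will retry");
};
```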
From an API design perspective, I'd caution against the few (or one) large subscriptions idea. Why? Inevitably, you are going to be pushing data to the client that it never asked for; this causes unnecessary work for both client and server. Furthermore, an individual component should only subscribe to data streams carrying data specifically designated for it. If you go the large-subscription route, you'll have to write a good deal of defensive code to filter the event stream, looking for the data you need. That shouldn't be your responsibility to micromanage, not to mention the dirty event stream on your server.
This is not necessarily to lead you down the "small subscription" route either. Ultimately, you might want to look at this hybrid approach, which articulates my opinions on the matter better than I can myself. TL;DR: design the subscriptions API so that you get the tightly scoped benefits of lots of small subscriptions ("per entity," as the author calls them) while still being able to share payloads and reuse the same handlers that your mutations use to resolve data.
Plus, if you want to use persisted queries, the hybrid approach is going to serve you better.
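To make the small-versus-large contrast concrete, here is a rough sketch of the two subscription shapes as plain GraphQL documents; all type and field names are invented and not from your schema:

```typescript
// Field and type names are invented for illustration only.

// One large root subscription: every client receives every event type
// and has to filter out what it doesn't care about.
const rootSubscription = /* GraphQL */ `
  subscription AppEvents {
    appEvents {
      __typename
      ... on ItemUpdated { item { id name } }
      ... on CommentAdded { comment { id body } }
    }
  }
`;

// Per-entity subscription: a component asks only for the entity it renders.
const itemSubscription = /* GraphQL */ `
  subscription ItemUpdated($itemId: ID!) {
    itemUpdated(id: $itemId) {
      id
      name
    }
  }
`;
```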
I am looking for a hosted realtime push/socket service (paid is fine) that can handle many connections/channels from many clients (JS), plus a server API that can subscribe/publish to those channels from a PHP script.
Here is an example:
The client UI renders a fleet of 100 trucks. When a truck is modified, its data is pushed on a channel (eg. /updates/truck/34) to the server (a PHP subscriber), the DB is updated, and a receipt/data is sent back on that single truck's channel.
We have a prototype working in Firebase.io, but we don't need the Firebase database; we just need the realtime transmission. One of the great features of firebase.io is that it's lightweight and we can subscribe to many small channels at once. This helps reduce payload, as only the object data that has changed is transmitted.
Correct me if I am wrong, but I think Pusher and PubNub will allow me to create 100 truck pub/subs (in this example) for each client that opens the site?
Can anyone offer a recommendation?
I can confirm that you can use Pusher to achieve this - I work for Pusher.
PubNub previously counted each channel as a connection, but they now seem to have introduced multiplexing. This FAQ states you can support 100 channels over the multiplexed connection.
So, both of these services will be able to achieve what you are looking for. There will also be more options available via this Realtime Web Tech guide which I maintain.
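As a rough client-side sketch of the per-truck channel idea using pusher-js (the app key, cluster, channel naming, event name and payload shape are all placeholders), each truck gets its own small channel while everything shares one connection:

```typescript
import Pusher from "pusher-js";

// The key, cluster, channel naming scheme, event name and payload shape
// below are all placeholders.
const pusher = new Pusher("YOUR_APP_KEY", { cluster: "eu" });

const truckIds: number[] = [1, 2, 3 /* ... up to 100 */];

for (const id of truckIds) {
  // One small channel per truck; all of them share a single connection.
  const channel = pusher.subscribe(`truck-${id}`);
  channel.bind("updated", (data: { lat: number; lng: number }) => {
    console.log(`truck ${id} moved to`, data.lat, data.lng);
  });
}
```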
[I work for Firebase]
Firebase should continue to work well for you even if you don't need the persistence features. We're not aware of any case where our persistence actually makes things harder, and in many cases it actually makes your life a lot easier. For example, you probably want to be able to ask "what is the current position of a truck" without needing to wait for the next time an update is sent.
If you've encountered a situation where persistence is actually a negative for you, we'd love to hear about it. That certainly isn't our intention.
Also - we're not Firebase.io -- we're just Firebase (though we do own the firebase.io domain name).
This is more a theoretical question than a practical one. Although I understand the principles of SOA, I am still a bit unsure whether they can be applied to any app.
The usual example is where a client wants to know something from a server, so we implement a service that can provide that information given a client request; it can be stateless or stateful, etc.
But what happens when we want to be notified when something happens on the server? For example, we call a service to register a search and want to be notified when a new item that matches our search arrives at the server.
Of course that can be implemented using polling, perhaps with long timeouts (long polling), but I cannot see a way in the usual protocols to receive events from the server without making a call to ask.
If you can point me to an example, or tell me about an architecture that could support this, you will have made my day.
Have you considered pub-sub (i.e. WS-Eventing, WS-Notification)? These are the usual means of pushing "stuff" to interested consumers/subscribers.
You want to use a publish-subscribe design. If you are using WCF, check out Programming WCF Services by Juval Lowy. In the appendix he shows how to build a pub-sub system that is actually fully per-call. It doesn't rely on CallbackContracts or keeping long-running channels open, so it doesn't require any reconnection logic when communication is broken, let alone any polling.
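Setting the WCF specifics aside, the core shape of the pub-sub idea is small: the client registers interest once and gets events pushed to it, instead of polling. A language-agnostic sketch in TypeScript, with all names invented:

```typescript
// Illustration only: clients register a callback for a topic once, and the
// server side pushes matching events instead of being polled.
type Handler<T> = (event: T) => void;

class TopicBroker<T> {
  private subscribers = new Map<string, Set<Handler<T>>>();

  subscribe(topic: string, handler: Handler<T>): () => void {
    if (!this.subscribers.has(topic)) this.subscribers.set(topic, new Set());
    this.subscribers.get(topic)!.add(handler);
    // Return an unsubscribe function so clients can deregister.
    return () => void this.subscribers.get(topic)?.delete(handler);
  }

  publish(topic: string, event: T): void {
    for (const handler of this.subscribers.get(topic) ?? []) {
      handler(event);
    }
  }
}

// Usage: the "registered search" from the question becomes a subscription.
const broker = new TopicBroker<{ itemId: string }>();
const unsubscribe = broker.subscribe("search:red-bicycles", (e) =>
  console.log("new matching item", e.itemId),
);
broker.publish("search:red-bicycles", { itemId: "42" });
unsubscribe();
```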
For a pet project, I have been looking for a web chat script capable of handling potentially tens of thousands of simultaneous users. I don't want to use any kind of applet or browser extension, so on the client side it should be simple Ajax. On the server side I'm pretty much open to anything.
I'm not looking for bells and whistles, a simple text-only chat is more than enough, as long as it supports a number of 'channels' or 'rooms' simultaneously, and a very large number of users.
When I first started researching the chat scripts out there, it seemed like the only viable option was to run an IRC server and just build a web interface on top of that. I know I could get good performance and stability with that setup, but could I get better performance by using something else?
Any ideas?
You might want to check out CometD.
I believe there are some chat scripts already using CometD.
I have no idea regarding its stability, though.
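For reference, the CometD JavaScript client is built around configure/handshake/subscribe/publish on Bayeux channels. A rough sketch of that shape; treat the bootstrap line and the URL as assumptions, since the exact setup varies by CometD version:

```typescript
// Rough sketch of the CometD JavaScript client; the bootstrap line and URL
// are assumptions (the exact setup differs between CometD versions).
declare const org: { cometd: { CometD: new () => any } };

const cometd = new org.cometd.CometD();
cometd.configure({ url: "http://localhost:8080/cometd" });

cometd.handshake((reply: { successful: boolean }) => {
  if (!reply.successful) return;

  // Each chat room maps naturally onto a Bayeux channel.
  cometd.subscribe("/chat/room1", (message: any) => {
    console.log("room1:", message.data);
  });
  cometd.publish("/chat/room1", { user: "alice", text: "hello" });
});
```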
You can have a look at Jabbify.
Not sure about the rooms and channels part, but it is built on the AJAX and MVC model.
I am going with Twitch.me, which is based on Node.js.
I want to build an Ajax GUI that is notified of any state changes happening in my EJB application. To achieve this, I thought I would build a stateful EJB (3.0) that implements the Observable interface, to which the Ajax client is added as an observer.
First, is this possible with Ajax? If yes, is this a good design idea, or is there a more appropriate way to do this?
Thanks in advance!
Cheers,
Andreas
It sounds like you are interested in 'Reverse-Ajax', where the client is notified when an event happens server side. This is different from standard Ajax, where an asynchronous request is sent to the server based on some client action. Reverse Ajax is possible, and one framework that does a very good job of allowing this and simplifying the underlying complexity is DWR.
http://directwebremoting.org/dwr/reverse-ajax
You'll want to read up on the performance implications of the various ways to implement this, based on your expected load, webapp container, etc., regardless of which framework you use.
As for whether or not it is good practice, that really depends on your application. If it is important to get near-real time data pushed back to the client and you don't want to use something like Flex or other heavier frameworks, then I'd say you are on the right track. If the data does not need to be real time, or if your load is extremely high, then perhaps a more simple approach like a scheduled page refresh will save you some complexity and help with performance.
Now, some time later, there is a new possible answer to your question: using WebSockets.
From the previously linked website by Pete: "The web was not designed to allow web servers to make connections to web browsers..." That is changing now with HTML5.
http://en.wikipedia.org/wiki/WebSockets
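On the client, the browser WebSocket API makes the server-push part straightforward. A minimal sketch, with the URL and message format made up:

```typescript
// Minimal browser-side sketch; the URL and message shapes are made up.
const socket = new WebSocket("wss://example.com/state-changes");

socket.onopen = () => {
  // Tell the server which state we want to observe (application-level protocol).
  socket.send(JSON.stringify({ subscribe: "orders" }));
};

socket.onmessage = (event) => {
  // The server can now push changes whenever they happen, no polling needed.
  const change = JSON.parse(event.data);
  console.log("state changed on the server:", change);
};

socket.onclose = () => {
  console.warn("connection closed; reconnect with backoff if needed");
};
```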