Scheduled data updates on Android Wear

I'd like some high-level input on how I'm approaching a watch face I'm building as a teaching tool. It's quite simple - grabbing trending topics from Twitter once an hour and displaying them on the watch. Basically, my flow is as follows:
I have an AlarmManager on the paired phone that polls the Twitter API and pulls down fresh data, set to fire a BroadcastReceiver
When invoked, the BroadcastReceiver uses the Data Layer API to send a one-way message to the wearable containing the downloaded data
Upon receipt, the wearable app saves the inbound data to SharedPreferences as a persistent data store and redraws the topics to the watch face from SharedPreferences
Presumably this all happens in the background, whether the watch face is actively being displayed or not.
Alternatively, I'm thinking I could save the Twitter data to SharedPreferences on the phone app and then use a more elegant Data Layer API sync job between the phone and watch, triggered when the watch face becomes visible. Or I could run things on the wearable via the Lollipop JobScheduler API.
Anyone see any glaring areas I could design better? Thanks!

Keep in mind that the Data API already gives you persistence, so you don't need to involve SharedPreferences. The way to approach this is to use only the Data API: your phone is the producer of data items, and the wearable is the consumer, which displays information based on what it receives in data items.
The way it works is like this:
your phone fetches trends from Twitter;
it goes through the existing data items, deletes stale ones, and creates new ones;
the watch face eventually receives the updates about the data items; it starts showing data from new ones and stops showing data from deleted ones;
You might also consider what the watch face does when it starts. This is simple: it just reads all the existing data items and displays data based on them.
In general: if you store the data in the data items, you don't need to copy them to any other persistent storage. Data items are persistent storage that is shared between connected devices.
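As a rough sketch of that producer/consumer split (this uses the GoogleApiClient-era Data Layer API; the /trends path and the key names are arbitrary choices for illustration):

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.*;
import java.util.ArrayList;

public class TrendsSync {

    // Phone side: publish the fresh trends as a data item.
    static void publishTrends(GoogleApiClient client, ArrayList<String> topics) {
        PutDataMapRequest request = PutDataMapRequest.create("/trends");
        request.getDataMap().putStringArrayList("topics", topics);
        // A changing timestamp keeps the payload distinct from last hour's,
        // so the change is delivered even if the topics happen to be identical.
        request.getDataMap().putLong("updated", System.currentTimeMillis());
        Wearable.DataApi.putDataItem(client, request.asPutDataRequest());
    }

    // Wearable side: redraw whenever the item changes.
    static class TrendsListener implements DataApi.DataListener {
        @Override
        public void onDataChanged(DataEventBuffer events) {
            for (DataEvent event : events) {
                if (event.getType() == DataEvent.TYPE_CHANGED
                        && "/trends".equals(event.getDataItem().getUri().getPath())) {
                    DataMap map = DataMapItem.fromDataItem(event.getDataItem()).getDataMap();
                    ArrayList<String> topics = map.getStringArrayList("topics");
                    // ... invalidate and redraw the watch face from topics
                }
            }
        }
    }
}

On startup, the watch face can also call Wearable.DataApi.getDataItems(client) to read the items that already exist, which covers the startup case described above.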

Related

Difficulty Understanding Event Sourcing Microservice Event Receiving/Communication

I've been aware of event sourcing, CQRS, DDD and microservices for a little while, and I'm now at the point where I want to start implementing something and giving it a go.
I've been looking into the technical side of CQRS and I understand the DDD concepts in there: how the write side handles commands from the UI and publishes events, and how the read side handles events and creates projections from them.
The difficulty I'm having is the communication and handling of events from service to service (both from a write service to a read service, and between microservices).
So I want to focus on Event Store (this one: https://eventstore.com/, to be less ambiguous). This is what I want to use, as I understand it is a perfect fit for event sourcing, and the simple nature of storing the events means I can use it as a message bus as well.
So my issue falls into two questions:
Between the write and the read, in order for the read side to receive/fetch the events created by the write side, am I right in thinking something like a catch-up subscription can be used to subscribe to a stream to receive any events written to it, or do I use something like polling to fetch events from a given point?
Between microservices, I am having an even harder time. When looking at CQRS tutorials/talks, they always seem to use the example of an isolated service which receives commands from the UI/API. This is fine; I understand the write side will have an API attached to it so the user can interact with it to perform commands, e.g. create a customer. However, say I have two microservices, e.g. an order microservice and a shipping microservice: how does the shipping microservice get the events published by the order microservice? Specifically, how do those events translate into commands for the shipping service?
So let's take a simple example: a command from the order API places an order, and an OrderPlacedEvent is published to the event store. How does the shipping service listen and react to this, given it then needs to DispatchOrder and in turn create an OrderDispatchedEvent?
Does the write side of the shipping microservice then need to poll, or also have a catch-up subscription to the order stream? If so, how does an event get translated into a command using a DDD approach?
something like a catch-up subscription can be used to subscribe to a stream to receive any events written to it
Yes, using catch-up subscriptions is the right way of doing it. You need to keep the stream position of your subscription persisted somewhere as well.
Here you can find some sample code that works. I am not posting the whole snippet since it is too long.
The projection service startup flow is:
Load the checkpoint (first time ever it would be the stream start)
Subscribe to the stream from that checkpoint
The runtime flow will then be:
The subscription will then call the function you provide whenever it receives an event. There's some plumbing to do there; for example, if you subscribe to $all, you need to filter out system events (this will be easier in the next version of Event Store)
Project the event
Store the new checkpoint
If you make your projections idempotent, you can store the checkpoint from time to time and save some IO.
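As a minimal sketch of that startup and runtime flow: the CheckpointStore, StreamConnection, RecordedEvent, and Projection types below are hypothetical stand-ins for whichever client library you use; the shape of the loop is the point.

import java.util.Optional;

interface CheckpointStore {
    Optional<Long> load();
    void save(long position);
}

interface RecordedEvent {
    boolean isSystemEvent();
    long streamPosition();
}

interface Projection {
    void project(RecordedEvent event);
}

interface StreamConnection {
    void subscribeFrom(String stream, long position,
                       java.util.function.Consumer<RecordedEvent> handler);
}

class ProjectionService {
    void start(StreamConnection connection, CheckpointStore checkpoints,
               Projection projection) {
        // 1. Load the checkpoint (first run: the start of the stream).
        long checkpoint = checkpoints.load().orElse(0L);

        // 2. Subscribe to the stream from that checkpoint.
        connection.subscribeFrom("$all", checkpoint, event -> {
            // Filter out system events when subscribed to $all.
            if (event.isSystemEvent()) return;

            // 3. Project the event into the read model.
            projection.project(event);

            // 4. Store the new checkpoint (batch this if projections are idempotent).
            checkpoints.save(event.streamPosition());
        });
    }
}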
how does the shipping microservice get the events published by the order microservice
When you build a brand-new system and you have a small team working on all the components, you can take a shortcut and subscribe to domain events from another service, as you'd do with projections. Within the integration context (between the boxes), ordering should not be important, so you can use persistent subscriptions and won't need to think about checkpoints; Event Store will do that for you.
Be aware that this introduces tight coupling on the domain event schema of the originating service. Your contexts will have a Partnership relationship, or the downstream service will be a Conformist.
When you move forward with your system, you might decide to decouple those contexts properly. So you introduce a stable event API for the service, which publishes events for others to consume. The same subscription that you used for integration can instead take care of translating domain (internal) events to integration (external) events. The consuming context would then use the stable API, and the upstream service would be free to iterate on its domain model, as long as it keeps the conversion up to date.
It won't be necessary to use Event Store for the downstream context; they could just as well use a message broker. Integration events usually don't need to be persisted, due to their transient nature.
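A sketch of that translation step: DomainEvent and MessageBroker below are hypothetical stand-ins, and the point is that the upstream service owns the mapping, so its internal events can change freely without breaking consumers of the stable contract.

import java.util.Map;

interface DomainEvent {
    String type();
    String get(String field);
}

interface MessageBroker {
    void publish(String topic, Map<String, String> payload);
}

class OrderEventTranslator {
    private final MessageBroker broker;

    OrderEventTranslator(MessageBroker broker) { this.broker = broker; }

    // Called by the same subscription that was previously used for integration.
    void handle(DomainEvent event) {
        if ("OrderPlaced".equals(event.type())) {
            // Expose only the stable, contract-level fields.
            broker.publish("orders.placed.v1", Map.of(
                    "orderId", event.get("orderId"),
                    "placedAt", event.get("placedAt")));
        }
        // Internal events without a mapping are simply not published downstream.
    }
}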
We are running a webinar series about Event Sourcing at Event Store, check our web site to get on-demand access to previous webinars and you might find interesting to join future ones.
The difficulty I'm having is the communication and handling of events from service to service (both from a write service to a read service, and between microservices).
The difficulty is not your fault - the DDD literature is really weak when it comes to discussing the plumbing.
Greg Young discusses some of the issues of subscription in the latter part of his Polyglot Data talk.
The Eventide Project has documentation that does a decent job of explaining the principles behind how the plumbing fits together.
Between microservices, I am having an even harder time...
The basic idea: your message store is fundamentally a database; when the host of your microservice wakes up, it queries the message store for messages after some checkpoint, and then feeds them to your domain logic (updating its own local copy of the checkpoint as needed).
So the host pulls a document with events in it from the store, and transforms that document into a stream of handle(Event) commands that ultimately get passed to your domain component.
Put another way, you build a host that polls the database for information, parses the response, passes the parsed data to the domain model, and writes its own checkpoints.
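As a sketch of such a host: all the types here are hypothetical stand-ins, but the loop of read-after-checkpoint, handle, advance, persist is the essential shape.

import java.util.List;

interface MessageStore { List<StoredEvent> readAfter(long position, int max); }
interface StoredEvent { long position(); }
interface Domain { void handle(StoredEvent event); }
interface PositionStore { long load(); void save(long position); }

class PollingHost {
    static final int BATCH_SIZE = 100;
    static final long POLL_INTERVAL_MS = 500;

    void run(MessageStore store, Domain domain, PositionStore checkpoints)
            throws InterruptedException {
        long checkpoint = checkpoints.load();
        while (true) {
            List<StoredEvent> batch = store.readAfter(checkpoint, BATCH_SIZE);
            for (StoredEvent event : batch) {
                domain.handle(event);           // parsed event -> domain component
                checkpoint = event.position();  // advance the local copy
            }
            checkpoints.save(checkpoint);       // persist once per batch
            if (batch.isEmpty()) Thread.sleep(POLL_INTERVAL_MS);
        }
    }
}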

Azure notification hubs + APNS result in missed topic notifications. Must I archive all sent notifications for iOS devices?

Azure Notification Hubs has a feature that allows subscribing to various topics in a many-to-many relationship (many devices to many declared topic strings).
Suppose I take these steps:
I send an iOS device a notification, "test 1".
The device goes offline.
I send "test 2"
I send "test 3".
The device comes back online.
APNS only delivers "test 3"; "test 2" was dropped.
Not only will APNS deliver just the most recent alert ("test 3"), it can also drop additional alerts on iOS 11 if I exceed the maximum of 30 per day.
One of the things I like about the Azure Hub service is that I can manage that subscription "state" in external storage. Now, however, it seems I have to track the subscriptions myself, rebuilding part of the Azure Hub architecture: archiving the subscriptions, topics, etc., so the device can query the server for all missed events.
Question
How do I reconcile the features of Azure Hub and topic subscription with the issue of dropped APNS pushes?
You're correct that there's nothing ANH (or you as a developer) can do about the dropped notifications, because that's the way APNS is designed. This means the solution to your problem really depends on what kind of application you're building and which architecture and user scenarios you're targeting.
A couple of ideas I have in mind, which may or may not work for you depending on what you're trying to do:
Send a silent push to the topic once in a while that would trigger the app to query the server on whether something has been missed (see the sketch below)
If the nature of the app is such that people open it often anyway, then you could do a background check at the time they open the app
Of course, in both of these scenarios you'll have to build some additional infrastructure on your end to keep track of which devices received or missed certain notifications. One thing that might help you avoid rebuilding parts of NH that are already there is Per Message Telemetry (PMT). I haven't experimented with dropped notifications, but hopefully there's a way to tell a dropped message from a delivered one using PMT (the Dropped value of the PnsErrorDetailsUri field looks like something close to what you need). Having that might help you simplify and reduce the amount of data you need to keep on your end to tell whether someone has missed a notification.
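For the silent-push idea, here's a minimal sketch. The payload is the standard APNS background-push shape (content-available with no visible alert); HubClient is a hypothetical stand-in for whichever Notification Hubs SDK or REST wrapper you use, and the tag name is arbitrary:

interface HubClient {
    // Hypothetical: send a raw APNS payload to all registrations matching a tag.
    void sendAppleNotification(String apnsPayload, String tagExpression);
}

class MissedNotificationNudger {
    private final HubClient hub;

    MissedNotificationNudger(HubClient hub) { this.hub = hub; }

    void nudgeTopic(String topicTag) {
        // content-available:1 with no alert/sound/badge wakes the app in the
        // background so it can ask your server what it missed.
        String silentPayload = "{\"aps\":{\"content-available\":1}}";
        hub.sendAppleNotification(silentPayload, topicTag);
    }
}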

Persistent subscriber in Firebase

Is there built-in support or any way of implementing a persistent subscription in Firebase?
I need to set up a backend which reacts to certain events in my Firebase database. If the backend has crashed or is being restarted I need it to catch up with anything that has happened while it was down.
For example, I want to re-index certain objects in ElasticSearch when they change. If the backend is down I need to re-index any changed objects when the backend comes back up again.
Nothing is built in for that, although you can definitely build it on top of Firebase by adding an isIndexed or isDirty property to the items.
But the more common approach is to stuff the items that need to be re-indexed into a queue and use a worker process that removes them from the queue once they've been handled. I highly recommend using firebase-queue for that.
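As a sketch of that queue pattern: firebase-queue itself is a Node.js library, so the snippet below only shows the same idea in the shape of the Firebase Admin Java SDK, with /reindexQueue as an arbitrary path. Because the queue lives in the database, onChildAdded also fires for every item that accumulated while the worker was down, which gives you the catch-up behavior you're after.

import com.google.firebase.database.*;

public class ReindexWorker {
    public void start() {
        DatabaseReference queue =
                FirebaseDatabase.getInstance().getReference("reindexQueue");
        queue.addChildEventListener(new ChildEventListener() {
            @Override
            public void onChildAdded(DataSnapshot task, String previousChildName) {
                // The queue entry holds the key of the changed object.
                reindex(task.getValue(String.class));
                task.getRef().removeValueAsync(); // remove once handled
            }
            @Override public void onChildChanged(DataSnapshot snap, String prev) {}
            @Override public void onChildRemoved(DataSnapshot snap) {}
            @Override public void onChildMoved(DataSnapshot snap, String prev) {}
            @Override public void onCancelled(DatabaseError error) {}
        });
    }

    private void reindex(String objectKey) {
        // ... fetch the object and index it into ElasticSearch
    }
}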

Push and pull with Parse

So what I need is a kind of push-and-pull web service mechanism: certain devices will be sending data to my Parse backend, and some others should be able to receive the newly added data as it's being added. Think of it as a restaurant environment where customers send their orders via their phones and the restaurant manager receives the orders on his PC in real time.
I know I can use push notifications, but I want to target specific users (in this case, the manager alone). I guess I could have a specific push notification channel to which only the manager is subscribed, but I'm not sure whether I can send proper JSON data in bulk or just simple strings. Maybe there's a smarter way of going about it.
Any suggestions?
Many thanks,
Polis
You can use Parse Cloud Code for this purpose. Certain devices (you can differentiate them in the cloud or on the client side) can call a cloud method, and that cloud method can make an HTTP request to your server (the manager's PC, in real time). From there, your server side can deliver the incoming message to your manager in real time. In this solution, I assume that you have your own server for web users (like the manager) and a mobile application for client users (customers).
Hope this gives you an idea. Regards.
You can use push notifications for this purpose. In my opinion that would be your best option.
When registering for push notifications on the client side, you can set an owner column to the user's pointer. Then, when sending a push notification from one user to another, you can query the Installation class for the other user's pointer. You can send the push notification either from the client side or by writing Cloud Code for an afterSave trigger; Cloud Code is the better option.
The downside of this approach is that if the other user did not allow push notifications, this would fail. The second user would still be able to get the data when they open the app, but won't get push notifications.
I built a chat app using this approach on Parse.com.
You don't need a complicated channel setup; before you save your installation, add a line like this:
[installation setObject:[PFUser currentUser] forKey:@"owner"];
[installation saveInBackground]; // ... completion or whatever
Then, just query:
PFQuery *installationQuery = [PFInstallation query];
[installationQuery whereKey:@"owner" equalTo:userImLookingFor];
Then it's PFPush with that query, something like [PFPush sendPushMessageToQueryInBackground:installationQuery withMessage:message].
(I'm typing from memory, so some of these might need to be slightly tweaked)
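For what it's worth, the same owner-pointer approach on the Android side might look roughly like this (Parse Android SDK; a sketch, with the method and variable names being my own):

import com.parse.*;

class OrderPush {

    // Registering: tag this installation with its owning user.
    static void tagInstallation() {
        ParseInstallation installation = ParseInstallation.getCurrentInstallation();
        installation.put("owner", ParseUser.getCurrentUser());
        installation.saveInBackground();
    }

    // Sending: push only to installations owned by the target user (the manager).
    static void notifyManager(ParseUser manager, String text) {
        ParseQuery<ParseInstallation> query = ParseInstallation.getQuery();
        query.whereEqualTo("owner", manager);

        ParsePush push = new ParsePush();
        push.setQuery(query);
        push.setMessage(text);
        push.sendInBackground();
    }
}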

Exchange EWS MessageId -> Available in ActiveSync too?

Is there any way to get the same "MessageId" you can get in Exchange EWS when using ActiveSync?
I thought this was an Exchange way to identify each message uniquely, but I can't seem to find a way to retrieve it using ActiveSync.
EDIT: I've got two applications, one that stores info using ActiveSync and one that stores info using EWS, and I want them to be able to work separately on the same message. To do this, I was hoping to use the EWS MessageId, which seems to be a GUID-type identifier for each individual message. (Note: this doesn't appear to be the same Message-ID found in email headers.)
Sadly, you're mostly out of luck.
ActiveSync is not an integration protocol; it's a mobile synchronization protocol designed for low-bandwidth communication devices like smartphones. A lot of capabilities in EWS will not exist in EAS.
Long-term message identification and correlation isn't as important for mobile devices. They simply get told what messages are in each folder, and allow the user to manipulate them. At any time the Exchange server may tell its EAS-connected clients to "re-sync" which causes them to forget the messages they have on the device and pull them cleanly from the server. That happens a lot with EAS, sometimes a couple of times an hour, depending on what is happening with that mailbox. For example, deleting a folder via Outlook causes a FolderSync to happen, and that forces connected devices to cleanly re-sync again.
Therefore EAS appears to have left behind the notion of GUIDs or other long-term IDs for messages. Instead, the server will assign ephemeral IDs that are valid only until the next big resync is forced (which could happen at any time). You'll probably see Exchange give very simple IDs like 7:45 (which means message ID 45 within folder 7, IIRC). However, after a resync that message might have the ID 7:32 (if the user deletes other messages in that folder) or something like 4:22 (if the message gets moved to another folder entirely).
Other EAS servers like Zimbra, Kerio or Notes Traveler might assign GUIDs, but from memory this is how Exchange behaves. Your only option might be to put a hidden correlation ID of your own into the body or subject of messages you're interested in. That will allow you to track the lifecycle of the items you're interested in, at the expense of some odd stuff being visible to users in their message contents.
@Brian is correct: there are no globally unique identifiers for ActiveSync items that can be used to correlate with EWS (with some exceptions; for instance, a meeting invite has a UID, as do events, which can be used with some hackery to retrieve an EWS ID for the related EWS calendar event), and there are no fields hidden from the user that can be hijacked for adding your own correlation data. This is most apparent in email, contacts, tasks, notes, etc.
However, if you are syncing both, it is possible to use the metadata in the objects to match them. For instance, for contacts, write a hashing algorithm that combines the data from the first name, last name, company name, etc. fields and produces a result. This can be run on the data from both sides and will produce relatively few collisions (and the items that do collide will have exactly the same user-visible data anyway, so in most cases it won't matter that you didn't get an exact alignment).
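A minimal sketch of such a fingerprint; the field choice and normalization here are illustrative only:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public final class ContactFingerprint {

    // Build the same fingerprint from an item's user-visible fields on both
    // the EWS and ActiveSync sides, then correlate items by fingerprint.
    public static String of(String firstName, String lastName, String company) {
        String canonical = normalize(firstName) + "\u0000"
                + normalize(lastName) + "\u0000"
                + normalize(company);
        try {
            MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
            byte[] hash = sha256.digest(canonical.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    private static String normalize(String value) {
        return value == null ? "" : value.trim().toLowerCase();
    }
}

Two items that collide under this scheme have identical visible data anyway, which, as noted above, usually doesn't matter.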
