I have been testing the events API against a relatively new account (created June 13, 2012). Until today, calls to the events API with a stream position of 0 would return the very first events that occurred on the account. This allowed me to track all the files that had been added to my account without traversing the folder hierarchy. As of this week, I am no longer getting events for uploads made during the first few days after I opened my account. My call is as follows:
curl 'https://www.box.com/api/2.0/events?stream_type=changes&limit=100&stream_position=0' -L -H 'Authorization: BoxAuth api_key=xxxxx&auth_token=xxxxx'
Do older events get dropped from the events queue from time to time? If so, is there any way to know how far back the queue goes? (Assuming old events do get dropped, just checking the earliest item in the queue would be unreliable since, if that event is more recent than the last one I processed, it's not possible to tell whether the account has just been inactive or whether interim events have been dropped.)
Any guidance would be very much appreciated. Hopefully I'm just doing something wrong and the older events are still accessible one way or another.
Events aren't maintained for the lifetime of the user's account. The specific length of time for which events are saved hasn't been specified yet, as we're still looking at usage of the API, but we will indicate this before the v2 API is made GA.
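Until that retention window is documented, one defensive pattern (not an official Box recommendation) is to checkpoint the timestamp of the last event you processed and compare it against the earliest event the stream still returns; if the earliest retained event is newer than your checkpoint, interim events may have been purged and you should fall back to traversing the folder hierarchy. A minimal Java sketch, assuming the v2 response fields of the time (`entries`, `created_at`) and that the oldest event comes first at `stream_position=0`:

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.time.OffsetDateTime;

import org.json.JSONArray;   // org.json, just for brevity here
import org.json.JSONObject;

public class EventGapCheck {

    public static void main(String[] args) throws Exception {
        String apiKey = "xxxxx";     // same credentials as in the curl call above
        String authToken = "xxxxx";
        // Hypothetical checkpoint: the created_at of the last event we processed.
        Instant lastProcessed = Instant.parse("2012-06-14T00:00:00Z");

        URL url = new URL("https://www.box.com/api/2.0/events"
                + "?stream_type=changes&limit=100&stream_position=0");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization",
                "BoxAuth api_key=" + apiKey + "&auth_token=" + authToken);

        String body;
        try (InputStream in = conn.getInputStream()) {
            body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }

        JSONArray entries = new JSONObject(body).getJSONArray("entries");
        if (entries.length() == 0) {
            System.out.println("No events retained at all.");
            return;
        }

        // Assumes the oldest retained event is first when asking for position 0.
        Instant earliestRetained = OffsetDateTime
                .parse(entries.getJSONObject(0).getString("created_at"))
                .toInstant();

        if (earliestRetained.isAfter(lastProcessed)) {
            // Anything between the checkpoint and this event may have been purged;
            // fall back to walking the folder tree to resynchronize.
            System.out.println("Possible gap: earliest retained event is " + earliestRetained);
        }
    }
}
```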
I installed Auditbeat today and want to view audit logs that are older than a month in Elasticsearch. Is this achievable somehow?
TL;DR
Auditbeat is a data shipper, so it just moves data around. That data is mainly events, and Auditbeat listens for them as they happen rather than reading them from a file, so sadly I don't think so.
To understand why
Auditbeat is composed of 3 modules:
AuditD
Gets events from the kernel as they come.
This module establishes a subscription to the kernel to receive the events as they occur.
So no way to get the data from earlier this month.
File integrity
This module uses features of the operating system to monitor file changes in realtime.
It will not be able to get events from a month back.
system
Event information is sent as the events occur (e.g. a process starts or stops).
Yet again, not possible.
So you won't be able to access data from a month ago.
I've been thinking about how to design a system that supports user-created scheduled alerts. My problem is that once the alerts are created and inserted into a database, I don't know the best way to go about scheduling them. Polling the database to see which alerts need to go out next doesn't seem entirely right to me.
What are some ways this could be handled at a scale where, say, a million users could create their own custom alerts, like "change baby diaper at 3pm every day"?
This problem is very suitable for cloud platforms. For example, you could use GCP Cloud Scheduler to invoke a cloud function when the alert is supposed to be sent out. The cloud function then calls some API to alert the user.
If cloud platforms are not an option, you could have your application spawn a new thread when an alert is created, and sleep that thread for a certain duration. When it wakes up, it sends the alert. Less elegant and less scalable than the first solution, but it would still work.
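As a refinement of that second approach, a single ScheduledExecutorService scales much further than one sleeping thread per alert, since each scheduled task is just a cheap entry in a delay queue. A minimal sketch; the class and method names are illustrative, not from any particular framework:

```java
import java.time.Duration;
import java.time.ZonedDateTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class AlertScheduler {

    // One small shared pool instead of one thread per alert: scheduled tasks
    // sit in a delay queue, so this scales to very large alert counts.
    private final ScheduledExecutorService pool = Executors.newScheduledThreadPool(4);

    // Schedule a one-shot alert. For recurring alerts ("3pm every day"),
    // compute the next occurrence and reschedule from inside the task.
    public void schedule(long alertId, ZonedDateTime fireAt) {
        long delayMs = Duration.between(ZonedDateTime.now(), fireAt).toMillis();
        pool.schedule(() -> deliver(alertId), Math.max(delayMs, 0), TimeUnit.MILLISECONDS);
    }

    private void deliver(long alertId) {
        // Hypothetical delivery: look the alert up in the database and notify the user.
        System.out.println("Delivering alert " + alertId);
    }
}
```

Either way, the database stays the source of truth: reload pending alerts into the scheduler on startup, since in-memory timers don't survive a restart.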
Azure Notification Hubs has a feature that allows subscribing to various topics in a many-to-many relationship (many devices to many declared topic strings).
Suppose I take these steps:
I send an iOS device a notification, "test 1".
The device goes offline.
I send "test 2"
I send "test 3".
The device comes back online.
APNS only sends "test 3"; "test 2" was dropped.
Not only will APNS deliver just the most recent event ("test 3"); on iOS 11 it can also drop additional alerts if I exceed the maximum of 30 per day.
One of the things I like about the Azure Hub service is that I can manage that subscription "state" in external storage. Now, however, it seems I have to track the subscriptions myself, rebuilding part of the Azure Hub architecture: archiving the subscriptions, topics, etc., so the device can query the server for all missing events.
Question
How do I reconcile the features of Azure Hub and topic subscription with the issue of dropped APNS pushes?
You're correct that there's nothing ANH (or you as a developer) can do about the dropped notifications, because that's the way APNS is designed. That means the solution to your problem really depends on what kind of application you're building and the architecture and user scenarios you're targeting.
A couple of ideas that may or may not work for you, depending on what you're trying to do:
Send a silent push to the topic once in a while that would trigger the app to query the server on whether something has been missed
If the nature of the app is such that people open it often anyway, then you could do a background check at the time they open the app
Of course, in both of these scenarios, you'll have to build some additional infrastructure on your end to keep track of which device received or missed which notifications. One thing that might help you avoid rebuilding parts of NH that are already there is Per Message Telemetry (PMT). I haven't experimented with dropped notifications, but hopefully there's a way to tell a dropped message from a delivered one using PMT (it looks like the Dropped value of the PnsErrorDetailsUri field is close to what you need). Having that might help you simplify and reduce the amount of data you need to keep on your end to tell whether someone missed a notification or not.
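To sketch what "query the server on whether something has been missed" could look like (every name here is hypothetical; none of it is a Notification Hubs API): give each notification in a topic a monotonically increasing sequence number, embed it in the push payload, and let the device ask for everything past the last number it saw:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Server-side sketch: the durable per-topic log that APNS doesn't give you.
public class TopicLog {

    private final Map<String, List<String>> messagesByTopic = new ConcurrentHashMap<>();

    // Append a message and return its sequence number. The same number would be
    // embedded in the push payload so the device knows what it has seen.
    public synchronized long append(String topic, String message) {
        List<String> log =
                messagesByTopic.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>());
        log.add(message);
        return log.size() - 1;
    }

    // Called by the device (after a silent push, or on app open) with the last
    // sequence number it actually received; everything after that was missed.
    public List<String> missedSince(String topic, long lastSeen) {
        List<String> log = messagesByTopic.getOrDefault(topic, List.of());
        int from = (int) Math.min(lastSeen + 1, log.size());
        return log.subList(from, log.size());
    }
}
```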
I'd like some high-level input on how I'm approaching a watch face I'm building as a teaching tool. It's quite simple - grabbing trending topics from Twitter once an hour and displaying them on the watch. Basically, my flow is as follows:
I have an AlarmManager on the paired phone that polls the Twitter API and pulls down fresh data, set to fire a BroadcastReceiver
When invoked, the BroadcastReceiver uses the Data Layer API to send a one-way message to the wearable containing the downloaded data
Upon receipt, the wearable app saves the inbound data to SharedPreferences as a persistent data store and redraws the topics to the watch face from SharedPreferences
Presumably this all happens in the background, whether the watch face is actively being displayed or not.
Alternatively, I'm thinking I could save the Twitter data to SharedPreferences in the phone app and then use a more elegant Data Layer API syncing job between the phone and watch, triggered when the watch face becomes visible. Or I could run things on the wearable via the Lollipop JobScheduler API.
Anyone see any glaring areas I could design better? Thanks!
Keep in mind that the Data API already gives you persistence, so you don't need to involve SharedPreferences. The way you should approach this is to use only the Data API. Your phone is the producer of data items, and the wearable is the consumer, which displays information based on what it receives in data items.
The way it works is like this:
your phone fetches trends from Twitter;
it goes through the existing data items, deleting old ones and creating new ones;
the watch face eventually receives the updates about the data items; it starts showing data from the new ones and stops showing data from the deleted ones.
You might also consider what the watch face does when it starts. This is simple: it just reads all the existing data items and displays data based on them.
In general: if you store the data in the data items, you don't need to copy them to any other persistent storage. Data items are persistent storage that is shared between connected devices.
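To make that concrete, here is a minimal sketch using the GoogleApiClient-era Wearable Data Layer API; the "/trends" path and the "topics"/"fetchedAt" keys are arbitrary choices for illustration:

```java
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.DataEvent;
import com.google.android.gms.wearable.DataEventBuffer;
import com.google.android.gms.wearable.DataMap;
import com.google.android.gms.wearable.DataMapItem;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.Wearable;
import com.google.android.gms.wearable.WearableListenerService;

// Phone side: publish the fresh trends as a data item. Data items are keyed
// by path, so re-putting "/trends" replaces the previous set.
class TrendsPublisher {
    static void publish(GoogleApiClient client, String[] topics) {
        PutDataMapRequest request = PutDataMapRequest.create("/trends");
        request.getDataMap().putStringArray("topics", topics);
        // Including a timestamp guarantees the data item actually changes,
        // even if the trending topics happen to be identical to last hour's.
        request.getDataMap().putLong("fetchedAt", System.currentTimeMillis());
        Wearable.DataApi.putDataItem(client, request.asPutDataRequest());
    }
}

// Watch side: a WearableListenerService receives changes even when the
// watch face isn't currently visible.
public class TrendsListenerService extends WearableListenerService {
    @Override
    public void onDataChanged(DataEventBuffer events) {
        for (DataEvent event : events) {
            if (event.getType() == DataEvent.TYPE_CHANGED
                    && "/trends".equals(event.getDataItem().getUri().getPath())) {
                DataMap map = DataMapItem.fromDataItem(event.getDataItem()).getDataMap();
                String[] topics = map.getStringArray("topics");
                // hand `topics` to the watch face for redrawing
            }
        }
    }
}
```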
Is there any way to get the same "MessageId" you can get in Exchange EWS when using ActiveSync?
I thought this was an Exchange way to identify each message uniquely, but I can't seem to find a way to retrieve it using ActiveSync.
EDIT: I've got 2 applications, one that stores info using ActiveSync and one that stores info using EWS, and I want them to be able to work separately on the same message. To do this, I was hoping to use the EWS MessageId, which seems to be a GUID-type identifier for each individual message. (Note: this doesn't appear to be the same Message-ID as is found in email headers.)
Sadly, you're mostly out of luck.
ActiveSync is not an integration protocol, it's a mobile synchronization protocol designed for low-bandwidth communication devices like smart phones. A lot of capabilities in EWS will not exist in EAS.
Long-term message identification and correlation isn't as important for mobile devices. They simply get told what messages are in each folder, and allow the user to manipulate them. At any time the Exchange server may tell its EAS-connected clients to "re-sync" which causes them to forget the messages they have on the device and pull them cleanly from the server. That happens a lot with EAS, sometimes a couple of times an hour, depending on what is happening with that mailbox. For example, deleting a folder via Outlook causes a FolderSync to happen, and that forces connected devices to cleanly re-sync again.
Therefore EAS appears to have left behind the notion of GUIDs or other long-term IDs for messages. Instead, the server assigns ephemeral IDs that are valid only until the next big resync is forced (which could happen at any time). You'll probably see Exchange give very simple IDs like 7:45 (which means message ID 45 within folder 7, IIRC). However, after a resync the same message might have the number 7:32 (if the user deletes other messages in that folder) or something like 4:22 (if the message gets moved to another folder entirely).
Other EAS servers like Zimbra, Kerio or Notes Traveler might assign GUIDs, but from memory this is how Exchange behaves. Your only option might be to put a hidden correlation ID of your own into the body or subject of messages you're interested in. That will allow you to track the lifecycle of the items you're interested in, at the expense of some odd stuff being visible to users in their message contents.
@Brian is correct: there are no globally unique identifiers for ActiveSync items that can be used to correlate with EWS (with some exceptions; for instance, a meeting invite has a UID, as do events, which can be used with some hackery to retrieve an EWS ID for the related EWS calendar event), and there are no fields invisible to the user that can be hijacked for adding your own correlation data. This is most apparent in email, contacts, tasks, notes, etc.
However, if you are syncing both, it is possible to use the metadata in the objects to match them. For instance, for contacts, write a hashing algorithm that combines the data from the first name, last name, company name, etc. fields and produces a result. This can be run on the data from both sides and will produce relatively few collisions when matching (and the items that do collide will have exactly the same user-visible data anyway, so in most cases it won't matter that you didn't get an exact alignment).
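For illustration, a minimal sketch of such a matching fingerprint, assuming first name, last name, and company name are synced faithfully on both sides (the normalization step is the part that usually needs tuning, since EWS and EAS may differ in case or whitespace):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ContactFingerprint {

    // Build a stable key from the user-visible fields that both protocols expose.
    public static String fingerprint(String firstName, String lastName, String company)
            throws NoSuchAlgorithmException {
        String normalized = normalize(firstName) + "|" + normalize(lastName)
                + "|" + normalize(company);
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(normalized.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    // Normalize so that trivial formatting differences between the two
    // protocols don't break the match.
    private static String normalize(String s) {
        return s == null ? "" : s.trim().toLowerCase();
    }
}
```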