How can I download a CSV of an event together with its attached data, in this case the ID the app sends to identify which article was read? At the moment I can only download a CSV containing how many events are triggered per day, without any additional information.
You should be able to do this with Parse Explorer: navigate to the Analytics tab, select "Explorer", specify a query, and export your data to either CSV or JSON. The only limitation is that the number of rows is capped at 1,000, but you should be able to retrieve all your data by running multiple queries over different time periods.
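Since each export caps out at 1,000 rows, the practical workaround is to slice the overall date range into smaller windows and run one Explorer query per window. As a rough sketch (the seven-day window size is an arbitrary assumption; shrink it until each window returns fewer than 1,000 rows):

```python
from datetime import date, timedelta

def date_windows(start, end, days=7):
    """Split [start, end] into consecutive windows of at most `days` days.

    Each window is a (window_start, window_end) pair; run one Explorer
    query per window and concatenate the exported CSVs afterwards.
    """
    windows = []
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=days - 1), end)
        windows.append((cursor, window_end))
        cursor = window_end + timedelta(days=1)
    return windows

# Example: split January into weekly windows.
for lo, hi in date_windows(date(2015, 1, 1), date(2015, 1, 31)):
    print(lo, "->", hi)
```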
Apparently it's not possible at the moment, and it won't be possible in the near future.
Source of the information: https://groups.google.com/forum/#!topic/parse-developers/TMWC1v5Doik
I am validating the data from Eloqua Insights against the data I pulled using the Eloqua API, and there are some differences in the metrics. Are there any known issues with pulling data via the API versus exporting a .csv file from Eloqua Insights?
Absolutely. Besides any undocumented data discrepancies that may exist, Insights can aggregate, calculate, and expose hidden relationships between data in Eloqua that are not accessible through an API export definition.
Think of the API as the raw data, with the ability to pick and choose fields and apply a general filter on them, and Insights/OBIEE as a way to compute over that data, create relationships across tables of raw data, and present the result in a consumable manner to the end user. A user has little use for a 1-gigabyte CSV of individual unsubscribes for the past year, but present that in several graphs on a dashboard with running totals, averages, and time series, and it suddenly becomes actionable.
Does anyone know if there is a way to generate a report that details how many search requests the GSA has handled over a given timeframe?
On the admin console: Reports > Search Logs.
Select the appropriate collection or All Collections.
Select the desired date range for the Report Timeframe.
From memory, this only has access to a maximum of 90 days of historical data, so if you haven't been regularly exporting this data then you'll need to extrapolate the values from what is available.
As pointed out by #BigMikeW, the logs only retain data for 90 days. Unless you download them every 90 days, you won't get the full history.
The other way is to integrate with Google Analytics and pass all search data into GA's search behavior reports. That way you can use GA to explore the data and export a year or even more. This is what I use.
We have billions of records indexed in an ES cluster. Each document contains fields like account id, transaction id, user name, and so on (a few free-text string fields).
My application queries ES based on user search params (e.g. return transactions for user 'A' between X and Y dates, plus some other filters), and I want to store/export the response data to a CSV/Excel file.
For my use case, the number of documents returned from ES might be in the hundreds of thousands or millions. My question is: what are the various ways to export a "large" amount of data from ES?
These requests are "real-time" requests, not batch processing (i.e. the requesting user is waiting for the exported file to be created).
I have read about pagination (size/from) and the scroll approach, but I'm not sure these are the best ways to export a large dataset from ES. (If I read correctly, the size/from approach is limited to 10,000 results by default, and the scroll option is not recommended for real-time use cases.)
I would like to hear from experts.
If your users need to export a large quantity of data, you need to educate them not to expect that export to be done in real-time (for the sake of the well-being of your other users and your systems).
That's definitely a batch-processing job. The user triggers the export via your UI, and some process then wakes up and does it asynchronously. When it's done, you notify the user that the export is available for download at some location, or you send the file via email.
Just to name an example, when you want to export your data from Twitter, you trigger a request and you'll be notified later (even if you have just a few tweets in your account) that your data has been exported.
If you decide to proceed that way, then nothing prevents you from using the scan/scroll approach.
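Once the export runs asynchronously, a background worker can stream the entire result set with scan/scroll and write CSV rows as they arrive, so memory stays flat even for millions of documents. A minimal sketch using the official `elasticsearch` Python client's `helpers.scan` wrapper; the index name, query, and field list are placeholders for your own:

```python
import csv

def write_hits_csv(hits, fields, fh):
    """Stream ES hits (dicts carrying a `_source`) to an open file as CSV."""
    writer = csv.DictWriter(fh, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    count = 0
    for hit in hits:
        writer.writerow({f: hit["_source"].get(f, "") for f in fields})
        count += 1
    return count

def export_transactions(es, index, query, fields, out_path):
    """Background export job using the scroll API.

    `es` is an elasticsearch.Elasticsearch client; helpers.scan hides the
    scroll bookkeeping and yields one hit at a time, so the full result
    set is never held in memory.
    """
    from elasticsearch import helpers  # third-party client, assumed installed
    with open(out_path, "w", newline="") as fh:
        return write_hits_csv(helpers.scan(es, index=index, query=query),
                              fields, fh)
```

When the job finishes, notify the user with the download location, as described above.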
Is it possible to track search results that don't result in click-throughs: situations where the returned results are not very helpful or interesting and none of them is clicked?
Possible yes. Out of the box, no.
The GSA does have a click-tracking feedback loop, but the data you are asking for is not collected. It collects searches and clicks, but you can't get a report on "failed searches". In order to do this, you would have to add a custom ct (click type), export the data, and run your own reports. Or use your favorite analytics tool to do the same.
What you are asking for is something the GSA does automatically. The component is called ASR, Advanced Search Reporting. If enabled, it will monitor user activity and rank the results based on usage. It basically works like a metrics system.
You can read more about it here https://www.google.com/support/enterprise/static/gsa/docs/admin/74/gsa_doc_set/xml_reference/advanced_search_reporting.html
We are trying to crawl all messages of every group on Yammer (including the All Company group) using https://www.yammer.com/api/v1/messages.json?group_id=<>&access_token=<>, but it's giving me duplicates and I am also not getting the complete set of messages. Is there any way to do this?
Is there any way to get the users who joined Yammer after a specific date?
Any help is appreciated.
The best way to get this information is to use the Data Export API. This API is available to paid networks and outputs a ZIP file of CSV files containing all messages and the list of users. You can pass a parameter called "since" to this API and it will only provide data from after a particular time. The users.csv file also includes a joined-at date.
If you attempt to iterate over messages you will hit some limits. These limits are technical in nature, and you would need to fall back to the search API to find much older messages. Unfortunately you will have to put up with these limitations if you are dealing with the free version of Yammer, as the data export is only available with the paid version.
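For illustration, building the export request with the "since" parameter could look like the sketch below. This is not reference code: the endpoint follows the export URL used elsewhere in this thread, and the ISO-8601 timestamp format for "since" is an assumption.

```python
from urllib.parse import urlencode

EXPORT_ENDPOINT = "https://export.yammer.com/api/v1/export"

def export_url(access_token, since=None, model=None):
    """Build a Data Export API URL (sketch).

    `since` is assumed to be an ISO-8601 timestamp so only data after that
    point is included; `model` (e.g. "Group") restricts the export. The
    response is a ZIP of CSV files covering messages, users, and so on.
    """
    params = {"access_token": access_token}
    if since:
        params["since"] = since
    if model:
        params["model"] = model
    return EXPORT_ENDPOINT + "?" + urlencode(params)
```

Fetching that URL with your HTTP client of choice then downloads the ZIP described above.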
I achieved this a different way. I used the export API to get a list of all of the groups.
https://export.yammer.com/api/v1/export?model=Group&access_token=
Then I looped through the list of groups, pulled all of the message data for each group, and combined it into one *.json file.
https://www.yammer.com/api/v1/messages/in_group/###.json
Where ### is the group ID extracted from the groups export data.
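That per-group loop could be sketched roughly as follows. Treat this as an illustration under assumptions, not Yammer reference code: the `older_than` paging parameter belongs to the messages endpoint, but the exact JSON shape is assumed here, and the id-based dedup is my own addition to deal with the duplicate messages mentioned in the question.

```python
import json
from urllib.request import urlopen

BASE = "https://www.yammer.com/api/v1"

def fetch_group_messages(group_id, access_token, fetch=None):
    """Pull all messages for one group, paging backwards with `older_than`.

    Deduplicates by message id, since overlapping pages are the likely
    source of the duplicates described above. `fetch` is injectable for
    testing; by default it performs a real HTTP GET.
    """
    if fetch is None:
        def fetch(url):
            with urlopen(url) as resp:  # real network call
                return json.load(resp)
    seen, messages, older_than = set(), [], None
    while True:
        url = f"{BASE}/messages/in_group/{group_id}.json?access_token={access_token}"
        if older_than is not None:
            url += f"&older_than={older_than}"
        batch = fetch(url).get("messages", [])
        new = [m for m in batch if m["id"] not in seen]
        if not new:
            break  # nothing older left (or only repeats): stop paging
        seen.update(m["id"] for m in new)
        messages.extend(new)
        older_than = min(m["id"] for m in new)
    return messages
```

Running this for each group id from the groups export, then concatenating the results, gives the combined JSON described above; you would still need to respect Yammer's rate limits between requests.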