We are working with WebEngage and have defined some events and funnels. To analyze this data we currently have to manually go to each event and export some data.
Is there a way to get all WebEngage data in a form we can analyze automatically?
We have the Application Performance Index (APDEX) available in the JMeter dashboard.
How can we programmatically get the APDEX values to analyze this data further?
I am not sure I understand your question.
But if you want the data from the HTML report in a usable form, you should know that JMeter also generates, in the same output folder, a JSON file called statistics.json that you can easily parse to automate your analysis.
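As a minimal sketch of that approach, here is one way to parse statistics.json into per-transaction rows. The field names (`transaction`, `errorPct`, `meanResTime`) match what recent JMeter versions write, but check against your own file before relying on them:

```python
import json

def load_statistics(path):
    """Parse JMeter's statistics.json (written next to the HTML dashboard)
    into a list of per-transaction dicts, sorted worst error rate first.
    The "Total" summary row is skipped so aggregates aren't double-counted."""
    with open(path, encoding="utf-8") as fh:
        stats = json.load(fh)
    rows = [row for name, row in stats.items() if name != "Total"]
    return sorted(rows, key=lambda r: r["errorPct"], reverse=True)
```

From there you can feed the rows into whatever analysis you automate, e.g. `for row in load_statistics("report-output/statistics.json"): print(row["transaction"], row["errorPct"])`.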
My data source is a WebSocket API that provides a channel to listen to.
The final destination is for use in PowerBI for near Real-Time reporting.
Ideally I need to first write this data to an Oracle DB for data transformation before using DirectQuery in PowerBI.
I also have Talend at my disposal for ETL.
What would a best-practice solution look like?
I don't know if this is the best practice, but here's how I would do it with Talend:
tRESTClient (API) --> format/extract data (JSON, XML, etc.) --> tDBOutput (Oracle)
Afterwards, depending on the amount of data to be processed, a first step could simply collect the data and save it in the DB.
In a second step, we prepare the desired data for PowerBI, either with Talend in new tables or as views in the DB.
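The collect-then-transform flow above can be sketched in a few lines of Python. This is only an illustration under assumptions: the message fields (`ts`, `symbol`, `price`) and the staging table/columns are hypothetical, and the actual WebSocket read and Oracle execute (e.g. via the `websockets` and `python-oracledb` packages, or the equivalent Talend components) are left as comments:

```python
import json
from datetime import datetime, timezone

# Hypothetical staging-table columns; adjust to your Oracle schema.
STAGING_COLUMNS = ("event_time", "symbol", "price", "received_at")

def message_to_row(raw: str) -> dict:
    """Flatten one raw JSON WebSocket message into bind variables for an
    INSERT into the staging table. Field names here are assumptions."""
    msg = json.loads(raw)
    return {
        "event_time": msg["ts"],
        "symbol": msg["symbol"],
        "price": float(msg["price"]),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

def insert_sql(table: str = "WS_EVENTS_STG") -> str:
    """Build the parameterised INSERT for the staging step. In practice you
    would execute this with python-oracledb's cursor.executemany() inside a
    loop that receives WebSocket messages, or map the same flow onto
    Talend components as described above."""
    cols = ", ".join(STAGING_COLUMNS)
    binds = ", ".join(f":{c}" for c in STAGING_COLUMNS)
    return f"INSERT INTO {table} ({cols}) VALUES ({binds})"
```

Keeping the transform pure like this makes it easy to test, and PowerBI's DirectQuery then reads from views built on the staging table in the second step.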
I am validating the data from Eloqua Insights against the data I pulled using the Eloqua API. There are some differences in the metrics. So, are there any known issues with pulling the data via the API versus exporting a .csv file from Eloqua Insights?
Absolutely. Besides undocumented data discrepancies that might exist, Insights can aggregate, calculate, and expose various hidden relations between data in Eloqua that are not accessible through an API export definition.
Think of the API as the raw data, with the ability to pick and choose fields and apply a general filter on them, whereas Insights/OBIEE calculates on that data, creates relationships across tables of raw data, and then presents it in a consumable manner to the end user. A user has little use for a 1-gigabyte CSV of individual unsubscribes from the past year, but present that in several dashboard graphs with running totals, averages, and time series, and it suddenly becomes actionable.
How can I download the CSV of an event with the attached data, in this case the ID sent by the app related to a specific article read? At the moment I can only download a CSV containing how many events are triggered per day, without any additional information.
You should be able to do this with Parse Explorer: navigate to the Analytics tab, select "Explorer", specify a query, and export your data to either CSV or JSON. The only limitation is that the number of rows is capped at 1,000, but you should be able to get all your data by running multiple queries over different time periods.
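One way to script that "multiple queries over different time periods" workaround is to pre-compute the time windows and the matching `createdAt` filter for each one. This is only a sketch: the 24-hour default window size is an assumption you would tune until each window stays under the 1,000-row cap, and the `where` clause follows the date-constraint format from the Parse REST API documentation:

```python
import json
from datetime import datetime, timedelta

ROW_LIMIT = 1000  # per-query export cap mentioned above

def time_windows(start: datetime, end: datetime, step_hours: int = 24):
    """Split [start, end) into consecutive windows; shrink step_hours
    until every window returns fewer than ROW_LIMIT rows."""
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(hours=step_hours), end)
        yield cur, nxt
        cur = nxt

def where_clause(start: datetime, end: datetime) -> str:
    """Parse-style 'where' filter on createdAt for one window, suitable
    for a REST query or the Explorer query field."""
    def iso(d):
        return d.strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return json.dumps({"createdAt": {
        "$gte": {"__type": "Date", "iso": iso(start)},
        "$lt":  {"__type": "Date", "iso": iso(end)},
    }})
```

You would then run one export per window and concatenate the resulting CSV/JSON files.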
Apparently it's not possible at the moment, and it won't be possible in the near future.
Source of the information: https://groups.google.com/forum/#!topic/parse-developers/TMWC1v5Doik
I know that my overall problem is generally approached with two of the more common solutions, a joined data set or a sub-table/sub-report. I have looked at those and I am not sure they will work effectively.
Background:
The JDBC data source has local data which includes a series of IDs that reference records in a master data repository accessed via a web service. This is where the need for a scripted data source arises. The data can be filtered on attributes within the local JDBC data and/or the extended data from the web service. The complication is that my only interface to the web service is its id argument.
Ideal Solution:
Aside from creating a reporting table or other truly desirable scenarios, I am looking to create a unified data source through a single scripted data source that handles all the complexity. This leaves report generation and parameter creation a bit cleaner, hopefully. The idea is to leverage the JDBC query as well as the web service queries inside the scripted data source, do the filtering and joins there, and expose that single unified view.
I tried using the following code as a reference for executing the query through the existing JDBC connection in the BIRT report definition. However, my breakdown of what should go in open vs. fetch (given the reference came from beforeFactory and served a completely different purpose) may be causing problems... in truth I see no errors, it just returns 0 records.
a link
I have also found a code snippet for dynamically loading a JDBC connection, but that seems a bit obtuse and a ton of overhead for what I need to do. a link
In short: how, in all that is holy, do you simply run a query against a database from within a scripted data source, if you wanted to? The merit of doing so is another issue, but technically, how?
Thanks in Advance!