How do I sort my values/measure in descending order on my crosstab? I'm using IBM Cognos Analytics Dashboard, by the way.
Related
What do you think about using Elasticsearch as a BI platform? Is it possible to have features like drill-down, aggregates, and historical data, as in a traditional DW environment? What is your opinion? I am currently a satisfied Qlik and Power BI analyst and user :-). I would like to know whether it is a good idea to change my environment for a new project. Thanks!
In some cases Elasticsearch can be considered for BI purposes. It is good at aggregate queries, and especially good if you need to filter by 'like' criteria. However, some drawbacks are also present:
you cannot join data from different documents as you can in SQL; only very limited join functionality is present.
maintaining an ES cluster may not be as simple as you might expect.
aggregate queries on sub-collections (nested queries) might be very inefficient, or not supported by the BI tool at all.
ES is good for ad-hoc reporting with a 'live' connection, but many popular BI tools cannot connect to ES in this way (for example, Power BI doesn't support a DirectQuery connection to ES). For dashboards you don't really have a choice: it's Kibana only. If you are interested in tabular reports like pivot tables, you can also check SeekTable.
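To illustrate the kind of query ES handles well, here is a minimal sketch of an aggregate query combined with a 'like'-style filter. The index and field names (`sales`, `product`, `region`, `amount`) are hypothetical; the request body shape follows the standard Elasticsearch search API.

```python
import json

# Hypothetical example: total "amount" per "region" for documents whose
# "product" field matches a wildcard ('like') pattern.
query = {
    "query": {"wildcard": {"product": "*phone*"}},  # 'like' style filter
    "aggs": {
        "by_region": {
            "terms": {"field": "region"},
            "aggs": {"total": {"sum": {"field": "amount"}}},
        }
    },
    "size": 0,  # return only aggregation results, no raw hits
}
print(json.dumps(query, indent=2))
# POST this body to http://localhost:9200/sales/_search
```

Note that the nested `aggs` here run over top-level fields; as mentioned above, aggregations over nested sub-collections are where performance and BI-tool support get shaky.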
In case you want to make a time-series data dashboard, it is a very good idea to build it with Kibana. You can make different dashboards, and even manipulate the data and derive new data properties in Kibana. You can also embed Kibana charts in other applications using an iframe.
I am new to developing with the ELK stack, and my idea is to build some dashboards in Kibana over my Elasticsearch data. I have tried it, and we are able to create dashboards like pie charts, graphs, etc. However, they are all based on counts and averages. I want to display the whole data as a table, with a way to form a query that generates that table. Is this possible in Kibana?
I am trying to visualize time-series data stored in Elasticsearch using Grafana.
I have the legend set up to show 2 decimal places, but this is not reflected in the UI.
The decimal places show up for other dashboard panels with a TSDB datasource, so this issue is specific to using Grafana with Elasticsearch. Is there any other configuration step I am missing here that will help me achieve this?
Just found out that Elasticsearch does not allow displaying values without some sort of aggregation, and in my case the aggregation results in the values getting rounded.
There was a related request that did not seem to get much traction in Kibana:
https://github.com/elastic/kibana/issues/3572
In short, not feasible as of Elasticsearch 2.x.
I have a many-to-many relationship in my DB. When the records are indexed into Elasticsearch, I will be using an application-side join. Given that the two result sets from Elasticsearch have to be merged at the application level, I lose the out-of-the-box pagination that Elasticsearch offers (on the merged result set) and, most importantly, aggregations. Is there a way to leverage ES aggregations and pagination while doing an application-side join and merging the results on the application side?
I have been working on ways to import Google Analytics raw data without having to use a premium account. So far this is the nearest link to what I want to do:
How to extract data from Google Analytics and build a data warehouse (webhouse) from it?
I want to load that data into Elasticsearch and display it using Kibana. What is the best ETL approach for this? Has anyone tried to display GA data using the ELK stack?
You should do it in two steps.
First, get the data. A very useful site is https://developers.google.com/webmaster-tools/v3/how-tos/search_analytics but you first need a Google Webmaster Tools account and OAuth credentials created on https://console.developers.google.com/apis
Then, once you have your data, find a way to import it into Elasticsearch. I'm still looking for the best way to do so; maybe transform the result table into CSV and then use https://www.elastic.co/guide/en/logstash/current/plugins-filters-csv.html
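If you go the CSV route, an alternative to the Logstash CSV filter is to build an Elasticsearch `_bulk` payload yourself. The sketch below assumes a tiny inline CSV and an index name `ga-data`; both are placeholders for your actual export.

```python
import csv
import io
import json

# Hypothetical GA export as CSV; in practice this would be read from a file.
csv_text = "date,pageviews\n2016-01-01,120\n2016-01-02,95\n"

# Build the newline-delimited _bulk payload: one action line,
# then one document line, per CSV row.
bulk_lines = []
for row in csv.DictReader(io.StringIO(csv_text)):
    bulk_lines.append(json.dumps({"index": {"_index": "ga-data"}}))
    bulk_lines.append(json.dumps({"date": row["date"],
                                  "pageviews": int(row["pageviews"])}))
payload = "\n".join(bulk_lines) + "\n"  # _bulk requires a trailing newline

# POST payload to http://localhost:9200/_bulk
# with header Content-Type: application/x-ndjson
print(payload)
```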
Have a look at this:
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-http_poller.html
You can use this to poll an endpoint, in this case GA, and load the response data into Elasticsearch. You may want to filter the response with the Split and/or Mutate plugins as well.
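A rough sketch of such a pipeline config is below. The GA URL is a placeholder (a real call needs OAuth, which http_poller does not handle for you), and the polling interval and index name are assumptions.

```
input {
  http_poller {
    urls => {
      ga => "https://example.invalid/your-ga-endpoint"  # placeholder URL
    }
    schedule => { "every" => "1h" }
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ga-data"
  }
}
```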
I have done this same setup.
Extracted data from Google Analytics with 7 dimensions and 6 metrics, of which 2 dimensions formed the primary key (timestamp and ID). This was done using R.
Did some transformations on the data using the Linux awk and sed commands.
Loaded the data into Apache Hive with row/column formatting, creating 9 tables in total.
Joined all 9 tables in Hive using Hive join queries on the 2 primary keys.
Used the elasticsearch-hadoop connector to load the final resulting table into Elasticsearch. Had to do a few data transformations to match Hive and Elasticsearch data types.
Used Kibana to visualize the data in Elasticsearch.
Now I am planning to avoid all the manual steps and somehow automate everything above.
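One simple way to automate a pipeline like the one above is a driver script that runs each stage in order and stops on the first failure. All script and file names below are placeholders for the actual R, shell, Hive, and loader jobs.

```python
import subprocess

# Each entry is one manual stage from the write-up above; the file names
# are hypothetical stand-ins for the real jobs.
steps = [
    ["Rscript", "extract_ga.R"],          # pull GA dimensions/metrics
    ["bash", "transform.sh"],             # awk/sed clean-up
    ["hive", "-f", "load_and_join.hql"],  # load the 9 tables and join them
    ["bash", "es_load.sh"],               # elasticsearch-hadoop load step
]

def run_pipeline(steps):
    """Run each stage in order; check=True aborts on the first failure."""
    for cmd in steps:
        subprocess.run(cmd, check=True)

# run_pipeline(steps)  # uncomment once the scripts above exist
```

From there, a cron entry (or a scheduler like Airflow) can run the driver on whatever cadence the dashboards need.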