Microsoft business intelligence platform vs QlikView

I have no experience at all with QlikView, but I have read great reviews about it. According to the 2012 Gartner research report, QlikView is in the top quadrant together with Microsoft, Oracle, SAP and IBM (leaders and visionaries).
I am interested in hearing from the community how QlikView stacks up against Microsoft's business intelligence (BI) platform. In my company they are choosing between Microsoft and QlikView for a future solution to be built. We are basically a Microsoft shop, but I read that QlikView is designed for user friendliness, super intuitive, etc.
I also read that some cons for choosing QlikView are:
High hardware requirements
Technical resources (people who know QlikView) are very rare
Licensing costs are high
Scalability issues
Any insight in this matter will be greatly appreciated.

I'm an ex-Microsoft employee who sold their BI toolset - I'm currently with Tableau.
To address your initial question: a "Power+" user can generally get Qlik up and running fairly quickly -- faster than if that user needed to learn a BI platform like Microsoft / IBM / SAP / Oracle. Everything you need is in one place, and you don't need to worry about installing / configuring / caring for multiple components like you do with Microsoft - SQL Server, SQL Server Analysis Services, and SharePoint (at least).
In my opinion, Qlik needs fewer "tech people" than the same solution created with the Microsoft platform. Qlik's scripting / importing capabilities are a little "techy / clunky", but once you get past that, things aren't very difficult - I figured it all out in an afternoon.
If you're dealing with large data sets, you will have to invest some $$ in RAM, because Qlik must load the entire dataset into memory. Yes, RAM is relatively cheap, but data sizes are growing faster than the cost of RAM is coming down - so it's something to think about.
Since Qlik needs to "own all the data" inside its in-memory database, it may be challenged down the road. For example, you're going to see very few people running "big data" stores like Teradata / Greenplum, etc. putting Qlik on top because of the HW requirements to re-store all the data. Microsoft is more flexible in that regard, as it CAN either store data in memory (the new Tabular model of SSAS) or go directly to the data in the original database.
You'll find user acceptance is generally much higher for the "Data Discovery" tools like Qlik, Tableau and Spotfire - Users just tend to "get" these tools and feel more in control of what's going on. They can get their work done with less assistance from IT.
I completely agree with the previous poster - you really should consider additional tools in the "Data Discovery" space when doing a bake-off against Microsoft - there are free / eval versions of Tableau, Spotfire, and Qlik - try them each out. I'm, of course, partial to Tableau - it's friendlier still than Qlik, allows for in-memory storage or can connect "live" to the data source, and has very high-end visualizations.
Good luck and have fun!

Microsoft Power BI has a fully featured, FISMA/HIPAA-compliant secure cloud for about $144 per user per year, and its desktop client is upgraded monthly and is completely free.
More importantly, SSRS will host Power BI on-premises as a first-class citizen, productionizing the entire server component for FREE in Q2 2017.
Power BI also has Revolution R statistical analytics deeply integrated, beating every data discovery tool on the market in statistical integration, period.
Qlik and Tableau have no answer to this: comparable TCO and feature rollouts are incredibly expensive across the rest of the data discovery cohort, and Qlik does not ship monthly iterative releases.
In 2017 it will become apparent that Power BI is the best solution for data discovery without spending seven million dollars on a five-year TCO to accomplish it.
Gartner 2016 has Microsoft top-ranked in nearly every data category, which cannot be said for any of the other products listed on this page (or any other company).

Why just those two? Have a look at Tableau, Altosoft and Spotfire. They are certainly easier to use than MS and don't have QV's troubles.

A poster above embedded the 2016 Gartner Magic Quadrant for Modern BI, but that firm's more analytical review of product strengths is their "Critical Capabilities" report. I'm biased myself, of course, but you can see TIBCO Spotfire outscores Tableau, Qliktech and Microsoft on critical capabilities.
TIBCO's TERR engine is unique in the market in that it is the only high-performance, commercially supported engine for R that is tightly coupled to the visual experience in Spotfire. Revolution has great packages, but they are based on the original open-source R engine, so they inherit its flaws.
Here's a key summary table:
[2016 Gartner Critical Capabilities summary scores table]
You can see the whole reprint here:
https://www.gartner.com/doc/reprints?id=1-2RVRL5R&ct=151110&st=sb

Related

ETL Tool Which is most configurable

I am looking for the ETL tool best suited to the following criteria.
Supports MongoDB
Accepts metadata as input (or accepts a file and builds its metadata on the fly)
Provides configurable mapping (the mapping can be defined outside development, using some file or table - see the hypothetical sketch below)
Please suggest a tool which caters to the above needs.
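To illustrate the kind of externally defined mapping I mean, a driver table (all names here are hypothetical, just an example) might look like:
create table etl_field_map (
    source_name   varchar(100),   -- field in the incoming file / collection
    target_table  varchar(100),   -- destination table in the warehouse
    target_column varchar(100),   -- destination column
    transform     varchar(200)    -- optional expression, e.g. 'upper(value)'
)
The ETL job would read this table at run time, so a mapping change would need configuration only, not redevelopment.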
Hmm, so you're looking for the most configurable ETL tool. From past years of experience with ETL processes, I can tell you that you will never find a tool that meets all your demands. Especially when you have an enterprise-level data warehouse (needed because of high and complex reporting needs), the only complete solution is to build your own custom, project-based ETL software, which is often a thankless job.
But (big BUT), you can achieve at least 80% of your needs with existing tools. Plugins, smart usage of scripts, good data-flow design and (if needed) small custom programs paired with scheduling can help you realize the process you have in mind. The ETL process is no different from any other work - 80% of the work is done in 20% of the time, and the remaining 20% of the work takes 80% of the time.
My suggestion for you:
Pentaho Data Integration - free and open source
PDI is a powerful ETL tool and can surely meet your demands. There are plenty of plugins, a solid community, and a fine API if you're going to develop more plugins.
Pentaho Data Integration + Integration Server - Enterprise Edition - "cheap enough" for almost every medium-sized project
The Enterprise Edition has everything the free edition has, plus more plugins (a JMS producer, for example), a version control system, Instaview, etc.
Besides, it has its own server, so scheduling is software-based (not OS-based), along with logging, better management and, most importantly, support!
Informatica or Microsoft SSIS - expensive and brilliant
I won't waste many words on these tools. Informatica is a primarily ETL-oriented company, and using Informatica at a high level requires a deep understanding of DB/DWH design, ETL processes, PL/SQL, dimensional modeling, etc.
SSIS is primarily built for SQL Server, so I don't see much point in it if neither your source databases nor your target database (DWH) runs on SQL Server.
Conclusion
This just scratches the surface of the many tools the market provides. Someone else would probably mention entirely different ones, so have a look at one of the published tool lists as well.
Almost every BI system has its own ETL tool. It may be a good choice to use them together; that way you can get the maximum out of both.
Note: a good ETL project manager or ETL developer can extend a tool's advantages to a level that better/more expensive tools have out of the box!

purpose of IBM Cognos Business Intelligence and Financial Performance Management?

I've been mandated to find out what IBM Cognos does, and I cannot find useful information on the subject apart from what I can read on the IBM website and Wikipedia.
What I'm after is some concrete examples of what Cognos can do for businesses and organisations that intend to use it.
Financial Performance Management I have no idea about, but we use BI 8.4/10.1 quite a bit. The Cognos product line is actually quite large and we only really use the baseline BI stuff with Framework Manager, but I'll try and help you out, based on how we use it.
Think of BI itself as an application that lets you view your data in many different ways. Now so far, it's no different to Jasper Reports or BIRT (which, despite its name, appears to provide very little BI stuff).
It does this by modelling the data (models are created with Framework Manager hence why we use it over and above the standard reporting interface) to translate raw data into business data and also relational to dimensional data if your database isn't already dimensional.
It's this business view of the data combined with the dimensionality which allows really neat manipulation within Cognos BI.
You can create reports in a truly multi-dimensional way, aggregating data in various ways across things like dates, products, geographical regions, stores, divisions and so on (depending on your dimensional setup).
All of the reports are really dynamic in that you can collapse or expand individual dimensions at will so, if for example you want to drill down on a poorly-performing state to see which individual stores in that state are causing problems, it's a simple click on an icon.
No re-querying of the data, everything just happens in very quick time. And the charts and data that can be produced are very nice.
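In plain SQL terms (the schema below is invented purely for illustration), that drill-down is essentially the same aggregation re-run at a finer grain - Cognos just answers it from the dimensional model instead of issuing a fresh query:
-- state-level rollup
select s.state, sum(f.sales_amount) as total_sales
from sales_fact f
join store_dim s on s.store_key = f.store_key
group by s.state
-- after drilling into one state: the same measure per store
select s.store_name, sum(f.sales_amount) as total_sales
from sales_fact f
join store_dim s on s.store_key = f.store_key
where s.state = 'WA'
group by s.store_name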
And, on top of that, Cognos BI comes with the inbuilt Query Studio and Report Studio, which allow the creation of ad-hoc reports in the exact same interface the user sees when running standard reports. No more of the Eclipse-Designer/Web-App separation that we had to endure with BIRT.
Sorry if this sounds evangelistic but we're transitioning from BIRT to Cognos BI, and the difference is substantial.
Now you may not find a lot of information outside of the IBM website, although we did find a couple of dedicated sites when we first started examining the transition. Unfortunately, I don't have them available any more since the IBM information is more than adequate.
We also make a lot of use of the IBM developerWorks forums (we use Tivoli Common Reporting which ships with the Cognos runtimes) and the microsite as well. As well as the forums, there's a whole section of developerWorks dedicated to Cognos.
A bit late, but for the benefit of anyone browsing... Cognos BI is essentially web-based reporting / dashboarding / analytics. Historically it connected to relational databases only; from v8.4 onwards (and more so from v10) it also connects to OLAP cube data sources. It's designed for end-user self-service reporting and includes mobile as well as web connectivity.
Cognos FPM provides in-memory OLAP cube modelling on the server (based on the TM1 engine). A key point of difference is that it permits end-user writeback, and it is generally used for budgeting and 'what-if' scenario modelling. Modelling is facilitated by rules, which enable data modification. It also scales to the max. As noted above, it can be integrated with Cognos BI (as well as run stand-alone), which means that a single dashboard may include reports from both relational and OLAP sources, and provide planning. So it's very powerful.
Note that Cognos Express provides essentially the same tools for the midmarket.
A little late, but in case anyone else comes here looking for information, I would like to enhance #paxdiablo's answer. He was talking only about the modeling and reporting tools, which are the best-known parts of Cognos.
There is also a powerful tool named Metric Studio, which can easily track how the business is performing. This tool is IMHO the best of the Cognos suite, since it is truly BI for upper management.
Another thing that I love about Cognos (I've been using it since 2004) is the administration. From an IT perspective it is way easier to make things happen in Cognos than in any other tool I've seen (BO included).
Just to name a few: you can link row-filtering with LDAP information (e.g. roles and customers); you can burst reporting through Cognos content or email... the possibilities are huge.
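To make the row-filtering idea concrete (this is generic SQL, not Cognos' own filter/macro syntax, and the names are invented): the usual pattern is a security mapping table populated from LDAP that the fact data is joined against, so each user only ever sees their own rows:
select f.*
from sales_fact f
join user_customer_map m on m.customer_id = f.customer_id
where m.user_name = current_user   -- the authenticated report user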

Is Pentaho ETL and Data Analyzer good choice?

I was looking for an ETL tool and on Google found a lot about Pentaho Kettle.
I also need a data analyzer to run on a star schema so that business users can play around and generate any kind of report or matrix. Again, Pentaho Analyzer is looking good.
The other part of the application will be developed in Java, and the application should be database-agnostic.
Is Pentaho good enough, or are there other tools I should check?
Pentaho seems to be pretty solid, offering the whole suite of BI tools, with improved integration reportedly on the way. But... the chances are that companies wanting to go the open source route for their BI solution are also most likely to end up using open source database technology... and in that sense "database agnostic" can easily be a double-edged sword. For instance, you can develop a cube in Microsoft's Analysis Services in the comfortable knowledge that whatever MDX/XMLA your cube sends to the database will be interpreted consistently, holding very little in the way of nasty surprises.
Compare that to the Pentaho stack, which will typically end up interacting with PostgreSQL or MySQL. I can't vouch for how PostgreSQL performs in the OLAP realm, but I do know from experience that MySQL - for all its undoubted strengths - has "issues" with the types of SQL that typically crop up all over the place in an OLAP solution (you can't get far in a cube without using GROUP BY or COUNT DISTINCT). So part of what you save in licence costs will almost certainly be used to solve issues arising from the fact that Pentaho doesn't always know which database it is talking to - robbing Peter to (at least partially) pay Paul, so to speak.
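As a concrete example of the kind of query I mean (the schema here is made up for illustration), a bread-and-butter OLAP rollup leans on exactly those constructs:
select d.year, p.category,
       sum(f.amount) as revenue,
       count(distinct f.customer_key) as customers
from sales_fact f
join date_dim d on d.date_key = f.date_key
join product_dim p on p.product_key = f.product_key
group by d.year, p.category
Multi-way joins plus COUNT DISTINCT aggregates are precisely where MySQL has historically struggled.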
Unfortunately, more info is needed. For example:
will you need to exchange data with well-known apps (Oracle Financials, Remedy, etc)? If so, you can save a ton of time & money with an ETL solution that has support for that interface already built-in.
what database products (and versions) and file types do you need to talk to?
do you need to support querying of web-services?
do you need near real-time trickling of data?
do you need rule-level auditing & counts to account for every single row?
do you need delta processing? (one common approach is sketched after this list)
what kinds of machines do you need this to run on? linux? windows? mainframe?
what kind of version control, testing and build processes will this tool have to comply with?
what kind of performance & scalability do you need?
do you mind if the database ends up driving the transformations?
do you need this to run in userspace?
do you need to run parts of it on various networks disconnected from the rest? (not uncommon for extract processes)
how many interfaces and of what complexity do you need to support?
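To illustrate the delta-processing point above, a common high-water-mark approach (all table and column names here are hypothetical) looks like:
-- pull only rows changed since the last successful run
select *
from source_orders
where last_updated > (select max(loaded_through) from etl_run_log)
-- after the load succeeds, record the new high-water mark
insert into etl_run_log (loaded_through, run_date)
select max(last_updated), current_timestamp
from source_orders
Whether a given tool lets you express this cleanly (and restart it safely) is exactly the sort of thing to test up front.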
You can spend a lot of time deploying and learning an ETL tool - only to discover that it really doesn't meet your needs very well. You're best off taking a couple of hours to figure that out first.
I've used Talend before with some success. You create your transformation by chaining operations together in a graphical designer. There were definitely some WTFs, and it was difficult to deal with multi-line records, but it worked well otherwise.
Talend also generates Java and you can access the ETL processes remotely. The tool is also free, although they provide enterprise training and support.
There are lots of choices. Look at BIRT, Talend and Pentaho, if you want free tools. If you want much more robustness, look at Tableau and BIRT Analytics.

Are there any good free or cheap tools for building an Oracle Database diagram?

I need to diagram an Oracle database and I am hoping to find some good tools that are either cheap or free.
Ideally the tool should allow me to draw the relationships between the tables, as well as remove unwanted tables from the diagram.
I already have access to MS Visual Studio 2008 as well as SSMS 2008, but I don't believe either will provide much help with Oracle.
I asked this question here on Server Fault and got several answers. However, after I tried most of the tools, I ran into problems with all of them.
I prefer SQL Server over Oracle, but I have one legacy Oracle system to manage, and I find myself fighting an uphill battle against the numerous errors Oracle throws at you on a minute-by-minute basis.
Have a look at TOADSoft and especially Toad Data Modeler (Toad is a very famous tool).
Another well known commercial tool is PL/SQL Developer. This is a more integrated solution (not only graphical modeling).
In both cases, I didn't check the pricing, but I'm sure they are worth it (and the prices must be insignificant in comparison to Oracle's license).
Like your friends over on serverfault, I had a really good experience with PowerArchitect. And it's free. . . .
Maybe I don't understand, but it's only a diagram, in which case any UML tool will do the job - even Visio, which you should be able to get for free or next to nothing, not to mention the tools in that blog. And there's always pencil and paper.
Visio professional will let you reverse engineer the database schema and I've done this with Oracle before. It's actually quite good for this as you can organise the diagram into subject areas (i.e. separate pages). You can also annotate the diagrams with missing foreign keys; this is quite a useful feature for making sense of vendors' databases.
'Enterprise Architect' versions will also allow you to generate DDL from the diagrams, and you can often get VSEA2002 or VSEA2003 quite cheaply; these versions come with the EA version of Visio bundled.
I think Visio has a feature called "Reverse Engineering", with which you can specify a database connection and it will automagically draw the Diagram for you. The database connection can be anything accessible via ODBC.
(MySQL also offers such a feature in its MySQL Workbench, though I don't remember if it was possible to specify a database system other than MySQL itself.)
As long as we're mentioning pencil and paper, I'll throw in the next step up from pencil and paper. It's MS Access.
If you have MS Access on your PC, and if you can set up table links from MS access to Oracle, you can use MS Access to generate relationship diagrams, which you can then print.
You have to do a fair amount of manual work, compared to some of the pricier tools.
Set up an empty MS Access database. Then set up a table link to each of the tables in your Oracle database. Then use the Access relationship tool to draw relationship lines between each foreign key and the key it references. Classify each relationship as many to one. This creates the lines between the boxes. You can use the Access interface to drag the boxes around on the diagram until you like the visual layout. You can print the resulting diagram.
Oh, and by the way, you can create local tables to act as snapshots of some of the data, and MS queries to reload the local tables from the Oracle data. That way you can mess around with the local copies without writing to the Oracle database. You can even set up table links to a SQL server database, and move data across, bit by bit.
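For the snapshot idea, Access SQL can materialize a local copy from a linked Oracle table in a single make-table query (the table names here are just examples; delete or rename the old snapshot before re-running, since Access won't overwrite it silently):
select * into local_orders
from ora_orders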
Depending on the complexity of your Oracle DB, and your diagramming needs, this could be enough of a tool for you.
Oracle's own SQL Developer Data Modeler has a "Free to download, free to learn, unlimited evaluation", whatever that means.

Test Reporting

We are migrating our test report data (unit, regression, integration, etc.) from an XML format to a database format for better analysis. Right now the majority of our test analysis is done using the CruiseControl.NET dashboard, but this is limited to primarily the most recent test data. Older test data can be accessed but not easily compared to new test data. We want to pinpoint problem components and better narrow down bugs. With the onset of tons of information brought on by our newly implemented regression and integration testing, I would like to see some better metrics generated (possibly performance and the like). Have you worked with any business intelligence systems that provide a framework for accurately and easily implementing some sort of analysis and reporting?
I have looked into JasperReports and Pentaho, but I'm struggling with the implementation of Pentaho at the moment. Should I continue my fight with the system? Is this what I'm looking for?
You could always just use SQL Server Reporting Services and Report Builder (MS's web based designer) or Report Designer (component of Visual studio). It's pretty easy to get this set up too.
Report Builder: http://msdn.microsoft.com/en-us/library/ms155933.aspx
Report Designer: http://msdn.microsoft.com/en-us/library/ms157166.aspx
Tutorial: http://www.simple-talk.com/sql/learn-sql-server/beginning-sql-server-2005-reporting-services-part-1/
How to add Reporting Services to an existing SQL Server: http://www.mssqltips.com/tip.asp?tip=1444
There are a few end user reporting solutions around as well that make it easier to dynamically create reports, if you're willing to invest a bit of cash.
My company produces one: http://www.rsinteract.com has a very cheap standard edition with a limited number of reports (30 day free trial). It reports directly off SQL Server with Reporting Services installed. It won best of TechEd 2006 - http://windowsitpro.com/article/articleid/53944/best-of-tech-ed-2006-winners.html
We actually use ours to analyse the support requests from clients i.e. which component is failing most, who reports the most bugs etc. Not tried it on test data.
There's also Proclarity, ApexSQL Report, and Tableau all of which are good.
You could try rolling your own (if you know what you're looking for) using Processing, written by Ben Fry. It's best accompanied by his book "Visualizing Data".
The tool is free, and I guess you can get a free 45-day trial of O'Reilly Books Online to get a head start and see if it's right for you. I do know there are chapters on reading and crunching data from all kinds of sources (including XML and databases) and then making meaningful and useful visualisations from them.
I'm currently using it to get my head around the dependency complexities of an inherited code base, and it's been massively useful.
Which part of Pentaho?
The Kettle project has stuff to convert your CruiseControl info and load it into a relational database. That's probably a good module to get working properly, especially if you're almost done figuring it out. I hope you'll share this stuff - I could use it too.
The Platform will autoschedule stuff once Kettle has it loading.
To make Mondrian really useful you'll need to work out a fact / dimension organization for your test data. That may or may not be worth your trouble at this point.
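For instance (purely a sketch with invented column names), a minimal star for test results might separate the test itself from its runs:
create table test_dim (
    test_key  integer primary key,
    testno    varchar(50),
    component varchar(100),
    suite     varchar(50)        -- unit / regression / integration
)
create table test_run_fact (
    test_key    integer references test_dim,
    run_date    date,
    failed      char(3),         -- 'yes' / 'no'
    duration_ms integer
)
Mondrian would then treat failure counts and durations as measures sliced by component, suite and date.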
Once you have your data loaded you'll probably be able to get a lot of benefit out of simple SQL queries like this...
select *
from test
where failed='yes'
order by testno, date desc
and this...
select max(date), min(date), testno
from test
where failed='yes'
group by testno
order by testno
and stuff like that. You might consider creating views in your table server for your favorite queries.
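For example, the first query above could be wrapped up like this (the view name is arbitrary; note that most servers ignore or reject ORDER BY inside a view definition, so keep the ordering in the query that selects from the view):
create view test_failures as
select testno, date
from test
where failed = 'yes'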
There are myriad ways to convert your SQL queries into reports, including the Pentaho reporting module, BIRT (an Eclipse plugin), Crystal Reports, and all kinds of PHP or JSP stuff you could put together.
