Qlik Sense: issues loading data - business-intelligence

I have a Qlik Sense application with an ODBC connection. When I try to select the columns in the Data Load Editor, it fails with the error shown in the second image. I can't find what is causing this error; my other apps are working fine. If I load only a few rows (1,000) it works, but loading the whole table (80,000 rows) fails. Thanks in advance.
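One way to narrow this down is to pull the table in slices through the same connection and see which range breaks; a rough sketch in plain SQL, assuming the backend supports ANSI OFFSET/FETCH (table and column names are placeholders):

-- Load the table in fixed-size slices to find the range that fails.
-- Older Oracle versions would need ROWNUM instead of OFFSET/FETCH.
SELECT *
FROM   my_table
ORDER  BY my_key                -- any stable ordering column
OFFSET 0 ROWS FETCH NEXT 10000 ROWS ONLY;
-- then repeat with OFFSET 10000, 20000, ... until the failing slice appears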

Related

Tableau (Desktop & Server) - How to fix Error "ORA-01406: fetched column value was truncated" when building large tables?

I am developing dashboards in Tableau Desktop, for which we retrieve data via a custom SQL query (live connection to an Oracle database).
I am able to load my data. For now, we are building tables to display the data.
I unfortunately cannot provide images, but the tables have several dimensions on the rows shelf (for example, name of the product, code for this product, country where it is produced, ...), and we then have Measure Names (KPIs) on the columns shelf. The layout of the table is fixed as it is legally defined.
However, when I drag and drop fields to build the view, I encounter this error at some point:
Error "ORA-01406: fetched column value was truncated".
When I am developing the reports in the acceptance environment of my database, it doesn't happen. But as soon as I switch to production data, the error appears.
The reports will need to be published to Tableau Server, which cannot be done with this error.
The tables for which it happens are quite large, but we were able to build larger tables without this issue.
Do you have any idea on how to solve this issue?
Thanks in advance!
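One workaround sometimes suggested for ORA-01406 is to bound the width of long character columns explicitly in the custom SQL, so the fetched value always fits the client buffer; a minimal sketch with placeholder table and column names:

-- Cap long character columns so fetched values fit the client-side buffer.
SELECT product_name,
       product_code,
       country,
       SUBSTR(long_description, 1, 4000) AS long_description
FROM   production_table;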

Oracle Apex Interactive Report bad performance while loading

I have an interactive report in one of my APEX applications. The SQL query used in the IR runs fine when executed in SQL Developer.
But at times in the application it gets stuck and takes longer than usual to load the IR (usually less than 5 seconds, but at times more than 50).
What might be the possible reasons for it loading slowly?
The query is well tuned and the IR has default settings with no modifications. I have also checked the stats on the tables and they are fresh.
The SQL query used in the IR fetches 10k records.
If you go into Component View and then click Interactive Report under Regions, there is a setting near the bottom under the Performance heading called Maximum Rows To Process. Also limiting the number of rows to display sped things up for me.
Sorry, but I can't write comments. Is there a database view in your query?
I had a similar situation where a query against a database view with 6 million records took around 3 minutes to complete in an Oracle APEX IR and 10-15 seconds in SQL Developer. After some research I tried putting the SQL from the view directly into the IR, and the result was almost the same as in SQL Developer.
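For example, instead of selecting from the view, paste its underlying statement into the IR region source (names here are illustrative, not from the original app):

-- Instead of: SELECT * FROM customer_orders_v
SELECT o.order_id,
       o.order_date,
       c.customer_name
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;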
You can also remove pagination from the IR, or change it from "X to Y of Z" to just "X to Y".
I hope this can help you.
Query response time in SQL Developer versus a web browser cannot be compared directly. Some of the reasons for the sluggishness could be related to server setup, server load, current user traffic, page load processes, page and region rendering, the number of regions, components and plugins, the navigation menu query, the report query, the number of columns and rows being displayed, row content length, APEX items (especially LOVs with SQL queries), etc.
From your question it looks like the performance issue is not consistent, so I think it may be related to server setup or traffic. Check whether you see any difference in load time after bouncing the server, if that's an option. Try to isolate the problem: if the issue is specific to the interactive report, build a classic report and compare times.
Another thing that has helped me in the past is to compare and verify compute times using the APEX Debugger.
Also look at the Network and Timeline tabs in the Chrome debugger. A few more things to try:
Implement indexes on your tables (a sketch follows this list)
Verify with your DBA whether there are database locks
Verify the amount of logging in the database
Switch to classic reports.
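A minimal sketch of the index suggestion, assuming the IR query filters or joins on specific columns (names are placeholders):

-- Index the columns the report query filters and sorts on.
CREATE INDEX orders_status_ix ON orders (status, order_date);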
Regards

SSIS does not pull the whole source data rows

I have an SSIS package whose source is an Oracle view.
SELECT * FROM VwWrkf
When I execute it, I get only a third of the data. The view has about 1.5 million rows, but only about 450K of them load into Tabular.
Any reason why that could be?
Use fast load on the destination OLE DB task, which clears the buffer faster and allows all records to be processed. It may be that, as the buffer fills up and records go unprocessed, the rest of the records are never fetched, or the connection times out.
The issue was the date format of a particular section of the report. It did something that Microsoft did not like.
A related document can be found here.
This is not a case of "SSIS does not pull the whole source data rows".
If you preview the table data, it shows only sample data, right? The same goes for select count(*). If you run the data flow, it will pick up all the data from the source and load it into the target table.
If you still have doubts, then instead of checking the SSIS preview, can you load the data into a destination temp table and check whether all the data gets loaded there?
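A minimal verification sketch, Oracle-style syntax with placeholder names (adjust for your destination platform):

-- Create an empty staging copy, run the data flow into it, then compare counts.
CREATE TABLE stg_wrkf AS SELECT * FROM VwWrkf WHERE 1 = 0;
SELECT COUNT(*) FROM VwWrkf;   -- source count
SELECT COUNT(*) FROM stg_wrkf; -- destination count after the load; should match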

Delphi: ClientDataSet is not working with big tables in Oracle

We have a TDBGrid connected to a TClientDataSet via a TDataSetProvider, in Delphi 7 with an Oracle database.
It works fine for showing the contents of small tables, but the program hangs when you try to open a table with many rows (for example 2 million), because TClientDataSet tries to load the whole table into memory.
I tried setting "FetchOnDemand" to True on our TClientDataSet and "poFetchDetailsOnDemand" to True in the Options of the TDataSetProvider, but it does not solve the problem. Any ideas?
Update:
My solution is:
// Typical component names; FetchOnDemand plus a finite PacketRecords
// makes the ClientDataSet fetch 500 rows at a time instead of the whole table.
ClientDataSet1.FetchOnDemand := True;
DataSetProvider1.Options := DataSetProvider1.Options + [poFetchDetailsOnDemand];
ClientDataSet1.PacketRecords := 500;
I succeeded in solving the problem by setting the "PacketRecords" property of TCustomClientDataSet. This property indicates the number or type of records in a single data packet. PacketRecords defaults to -1, meaning that a single packet should contain all records in the dataset, but I changed it to 500 rows.
When working with an RDBMS, and especially with large datasets, trying to access a whole table is exactly what you shouldn't do. That's a typical newbie mistake, or a habit borrowed from old file-based small database engines.
When working with an RDBMS, you should load only the rows you're interested in, display/modify/update/insert them, and send the changes back to the database. That means a SELECT with a proper WHERE clause, and also an ORDER BY - remember that row ordering is never guaranteed when you issue a SELECT without an ORDER BY; a database engine is free to retrieve rows in whatever order it sees fit for a given query.
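A minimal sketch of such a query, with placeholder names:

-- Fetch only the slice of rows the user actually needs, in a defined order.
SELECT order_id, customer_name, order_date
FROM   orders
WHERE  order_date >= DATE '2024-01-01'
ORDER  BY order_date, order_id;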
If you have to perform bulk changes, do them in SQL and have them processed on the server; don't load a whole table client side, modify it, and send the changes back row by row.
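A set-based sketch of such a server-side bulk change, again with placeholder names:

-- One statement on the server instead of millions of client-side row edits.
UPDATE orders
SET    status = 'ARCHIVED'
WHERE  order_date < DATE '2020-01-01';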
Loading large datasets client side may fail for several reasons: lack of memory (especially in 32-bit applications), memory fragmentation, and so on. You will probably flood the network with data you don't need, force the database to perform a full scan, and maybe flood the database cache as well.
That is why client datasets are not designed to handle millions or billions of rows. They are designed to cache the rows you need client side, apply your changes there, and then send them back to the remote data. You need to change your application logic.

Rhomobile inserting into local database using CSV or XML from external web server

I am currently developing a Rhomobile application. I have a backend database which holds customer information. From the web server I get a CSV string (or XML - I am able to parse the XML using REXML) which contains all the customers. Each time I sync the device, I am going to reset the customer table on the device and re-insert all the data from the backend database. I am not using RhoSync, and the device will be using a property bag.
Is it possible to use the CSV or XML data to insert into the customers table? If so, how would I go about it?
At the moment the only option I can see that would work is to manually loop through the CSV/XML and insert into the database; this isn't very elegant.
Any help will be much appreciated. Sorry if this is a dumb question; I'm still relatively new to this framework.
I have come to the conclusion that the only way is to loop through the CSV/XML, which, with the help of a database transaction, doesn't take long.
Using a fixed schema also increases performance a lot, since a property bag has to do per-column inserts (so if you have lots of columns, there are lots of inserts per record).
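The transaction point, sketched as the plain SQL that ends up running against the local database (table and columns are placeholders; in Rhomobile you would issue the equivalent through its database API):

-- One transaction around all the inserts avoids a commit per row,
-- which is what makes the sync loop fast.
BEGIN TRANSACTION;
DELETE FROM customers;  -- reset the table on each sync
INSERT INTO customers (id, name) VALUES (1, 'Alice');
INSERT INTO customers (id, name) VALUES (2, 'Bob');
-- ... one INSERT per parsed CSV/XML row ...
COMMIT;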
Also, in Rhomobile garbage collection is turned off, so if you are trying to process large data sets your device will quickly run out of memory:
GC.enable
The above solves this issue
