OBIEE12C: NULL appearing as option in dashboard prompt choice list

I have a dashboard prompt with various fields to allow the client to filter the data by year, quarter, and month.
I have my own development environment, and when I test the prompt on my OBI server everything looks fine. But when I deploy the prompt in the client environment something weird happens: a NULL value appears as an option in the choice list.
Any clue as to why this could happen?

Your prompt is populated by a query. That query accesses the database. If your query returns a NULL and the query is the same in both environments that means that your database is the place where there's a difference.
a) Grab the two queries from both environments
b) Run both queries against their RESPECTIVE databases
c) Compare the results
d) Run both queries against the OTHER database
e) Compare the results
If that doesn't show you where the source of the error is, then your prompts are different: you made a mistake while configuring things in the front-end or the RPD, and they are not, as you say, identical. A quick check for the database side is sketched below.
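As a concrete (hypothetical) check: once you have pulled the prompt's physical SQL from the session log, find the column it selects and run something like this against each database (the table and column names here are placeholders, not from the question):

-- does the prompt's source column actually contain NULLs?
SELECT COUNT(*) AS null_rows
FROM   time_dim
WHERE  month_name IS NULL;

If one database returns a non-zero count and the other returns zero, the data, not the prompt, is where the difference lives.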

Related

Accessing report output table in Oracle APEX

I have written an SQL query to return an Interactive report in APEX, and I would like to create another (classic) report on the same page that summarizes that report. The report gives test results, and I want to summarize it by grade level (so x # of As, y Bs, etc)
It's a complicated query that takes a while to run, so I'd like to avoid just running it again and using aggregation functions.
Does APEX store that report output in a variable or table, the way it exposes page items (something like :TABLE_NAME), that I could just run a query on? I haven't been able to find anything on that in the documentation or by searching.
Thanks!
One option is to:
a) enter the parameters you use to filter data in that complex query
b) instead of running the report directly, create a push button and a process on it
c) have that process insert the result into a global temporary table (GTT) which you would, of course, have to create first (a minimal sketch follows below); why a GTT? Because it can be shared by multiple users while everyone sees only their own data
d) have the interactive report fetch & display data from the GTT
e) have the classic report summarize data from the GTT
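A minimal sketch of the GTT, with made-up column names (adapt them to whatever your complex query returns):

-- Oracle: rows are visible only to the session that inserted them
CREATE GLOBAL TEMPORARY TABLE test_results_gtt (
  student_id  NUMBER,
  grade       VARCHAR2(2),
  score       NUMBER
) ON COMMIT PRESERVE ROWS;

The button's process would then run INSERT INTO test_results_gtt SELECT ... (your complex query), and both the interactive and classic reports would simply SELECT from test_results_gtt.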

Long Running Query on MSSQL

In my team, we need to connect to Oracle, Sybase and MSSQL very frequently. We use Oracle's SQLDeveloper 3.3.2 to connect to all three (using third-party libs). This tool often has a problem where select queries never end: even if we get the results, the query keeps on running, and because of this we receive database alerts for long-running queries.
E.g.
Select * from products
If products has a million records, then SQLDeveloper will show the top records, but in the background the query will keep on running.
How can this problem be solved? Or is there a better product which can fulfill our need?
Your query - select * from products - is asking the database engine to send millions of records to your client application (SQLDeveloper in this case).
While SQLDeveloper (and many other GUIs of a similar design) will show you the first 30 (or 50, or 100, etc) rows, as far as the database engine is concerned you're still asking to see millions of rows hence your query continues to 'run' in the database engine.
For example, in Sybase ASE the query will show up with a status of 'send sleep' meaning the database engine is waiting for the client application to request the next batch of records to send down the connection.
To 'solve' this issue you have a few options:
- using SQLDeveloper: scroll through (ie, display on your monitor) the rest of the multi-million row result set [likely not what you want to do; you probably don't have the time/desire to hit the 'Next' button hundreds of thousands of times]
- kill off your query after you've received/viewed the first set of records [not recommended, as there will likely be times when you 'forget' to kill off your query, thus earning the wrath of your DBA]
- write your query to pull back only the records you REALLY want/need to see (eg, add a WHERE clause or a row limit; see the sketch below)
- see if SQLDeveloper has any sort of configuration option to auto-kill 'long running' queries [I have no idea if this is even doable in a client application]
- see if the DBA can configure your login with a resource limit (eg, auto-kill queries if they run for more than XX seconds)
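A sketch of the row-limiting option, reusing the products table from the question (the syntax varies per engine):

-- MSSQL / Sybase ASE:
SELECT TOP 100 * FROM products;

-- Oracle (pre-12c):
SELECT * FROM products WHERE ROWNUM <= 100;

-- Oracle 12c and later:
SELECT * FROM products FETCH FIRST 100 ROWS ONLY;

Because the engine knows only 100 rows were requested, the statement completes instead of sitting in 'send sleep' waiting for the client to page through millions of rows.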

Analysis Services database returns oddly formatted numbers on MDX queries

I have a SQL Server 2008 database server hosting two Analysis Services databases. When, from the same machine, I run an MDX query in SSMS against both of these DBs (e.g. SELECT { [Gender].[M], [Gender].[W] } ON 0, { [Area].[Town].[3101000], [Area].[Town].[3152007] } ON 1 FROM [Population] WHERE ([Time].[Years].[2005], [Population])), one DB returns a table of floating-point values formatted like 123,23 while the other returns values formatted like 123.23.
In Visual Studio, measures from both projects are defined as double values, in the source databases values are taken from NUMERIC columns.
I populated these two DBs from two different relational databases (both, however, share the same collation settings), and the Visual Studio solutions used to deploy the AS DBs were also different (though for the second one I just took a copy of my first VS solution and basically only removed some unnecessary dimensions and cubes).
My question is: What are the suspects to look for that can cause the difference in the formatting of MDX results?
I already checked the collation settings of the source DBs, which seem to be identical.
I checked for language settings in the AS DBs but did not find any suitable setting (did I maybe overlook something?).
I checked the FormatString properties in the cubes, which were unset in both solutions.
Any further ideas here?
Is there anything in the "Calculations" tab of the cube in VS/BIDS? You can override format strings for measures there.
Also, are you setting the language in the connection string to SSAS? This is possible, and is used by clients to automatically format numbers in the locale of the user. It seems unlikely if you're running this in SSMS, but I thought I'd ask.
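One way to see what each cube is actually applying, cell by cell, is to ask for the relevant cell properties explicitly; a sketch reusing the cube from the question:

SELECT { [Gender].[M] } ON 0
FROM [Population]
WHERE ([Time].[Years].[2005], [Population])
CELL PROPERTIES VALUE, FORMATTED_VALUE, FORMAT_STRING, LANGUAGE

If the LANGUAGE (or FORMAT_STRING) property differs between the two DBs, that is where the 123,23 vs 123.23 difference is coming from.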

How to monitor web application DB query execution plans?

Is there a way in TOAD or some other tool to monitor queries being executed by your web app?
I'd like to examine the explain/execution plans for the web app queries.
I'm debugging why the webapp queries are slower than when run from sqlplus.
Generally you can track and analyse from three points.
Firstly SQL, mostly through the v$sql view.
Secondly through session (starting with v$session).
Finally through time (measuring, normally at either a system or session level, for a period of time).
If a particular SQL statement, such as SELECT * FROM table WHERE type = :val, is executed then the database will make a quick hash of it and see if there is a matching statement in the cache. The statement not only has to match on the text, but on certain environmental settings too (such as Parsing user, Optimizer Goal, bind variable types, NLS settings...).
If there is no matching statement, then the database will feed it to the optimizer to come up with a query plan. If there is a match, then the plan already determined for that statement will be used.
So I would suggest your first step is to take an SQL statement which has been executed by both the web app and from sqlplus and see if it is using the same plan. You should be able to look in v$sql for the statement of interest and see how many occurrences it has.
If you have multiple occurrences, especially with different MODULE/ACTION/SERVICE values, then you can look at the plans to see if they differ (DBMS_XPLAN.DISPLAY_CURSOR). If you have only one occurrence then the SQL is being shared and you need to take a different approach to isolating the web-app executions from the sqlplus executions.
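As a concrete sketch (the SQL text filter is a placeholder for your own statement):

-- find the cursor(s) for the statement of interest
SELECT sql_id, child_number, plan_hash_value, module, action, executions
FROM   v$sql
WHERE  sql_text LIKE 'SELECT * FROM table WHERE type = :val%';

-- then display the plan for each child cursor
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR('&sql_id', NULL));

Two child cursors with different plan_hash_value values would confirm that the web app and sqlplus are getting different plans.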
One way to do that would be to trace the execution of the SQL through both a web-app session and sqlplus session (DBMS_MONITOR). Then tkprof or similar on the trace files and look for differences.
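A sketch of the tracing route (the sid/serial# values are placeholders you'd look up in v$session):

-- enable extended tracing for the web-app session, including waits and binds
EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => 123, serial_num => 456, waits => TRUE, binds => TRUE);

-- ... let the web app run its query, then switch the trace off
EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 456);

Repeat for the sqlplus session, then run tkprof over both trace files and compare the plans and bind values.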
I can't help you with doing it through TOAD, but you can't go wrong in getting an understanding of the underlying tools and techniques.
Yes, there is a way in TOAD for Oracle to monitor a web app's calls to the database:
START -> All Programs -> Quest Software -> TOAD for Oracle -> Tools -> SQL Monitor
With this tool you select the process to watch (TOAD itself, or the web/dev process; I don't remember the exact name of the one to debug). The tool then shows which stored procedures or functions the app is calling.

Fast query runs slow in SSRS

I have an SSRS report that calls out to a stored procedure. If I run the stored procedure directly from a query window, it returns in under 2 seconds. However, the same query run from a 2005 SSRS report takes up to 5 minutes to complete. This is not just happening on the first run; it happens every time. Additionally, I don't see this same problem in other environments.
Any ideas on why the SSRS report would run so slow in this particular environment?
Thanks for the suggestions provided here. We found a solution, and it did turn out to be related to the parameters. SQL Server was producing a convoluted execution plan when the procedure was executed from the SSRS report, due to 'parameter sniffing'. The workaround was to declare variables inside the stored procedure and assign the incoming parameters to them; the query then used the variables rather than the parameters. This made the query perform consistently whether called from SQL Server Management Studio or through the SSRS report.
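A minimal sketch of that workaround, with made-up procedure, parameter, and table names:

-- hypothetical procedure illustrating the local-variable workaround
CREATE PROCEDURE dbo.GetResults
    @StartDate datetime
AS
BEGIN
    -- copy the parameter into a local variable so the optimizer
    -- can't sniff the caller's value when compiling the plan
    DECLARE @LocalStartDate datetime;
    SET @LocalStartDate = @StartDate;

    SELECT r.student_id, r.grade, r.score
    FROM   dbo.results AS r
    WHERE  r.test_date >= @LocalStartDate;  -- use the variable, not the parameter
END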
I will add that I had the same problem with a non-stored procedure query - just a plain select statement. To fix it, I declared a variable within the dataset SQL statement and set it equal to the SSRS parameter.
What an annoying workaround! Still, thank you all for getting me close to the answer!
Add this to the end of the query in your proc: OPTION (RECOMPILE)
This will make the report run almost as fast as the stored procedure
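For instance (table and column names are made up), the hint is appended to the statement inside the procedure:

SELECT r.student_id, r.score
FROM   dbo.results AS r
WHERE  r.test_date >= @StartDate
OPTION (RECOMPILE);  -- compile a fresh plan for the actual parameter values each run

The trade-off is a recompile on every execution, which is usually negligible next to minutes lost to a badly sniffed plan.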
I had the same problem; here is my description of it:
"I created a stored procedure which would generate 2200 rows and execute in almost 2 seconds; however, after calling the stored procedure from SSRS 2008 and running the report, it never actually finished, and ultimately I had to kill BIDS (Business Intelligence Development Studio) from Task Manager."
What I tried: I ran the SP under the reportuser login, but the SP ran normally for that user as well; I checked Profiler, but nothing worked out.
Solution:
Actually the problem is that even though the SP generates the result quickly, the SSRS engine takes time to read that many rows and render them back.
So I added the WITH RECOMPILE option to the SP and ran the report... this is when the miracle happened and my problem got resolved.
I had the same scenario occurring. Very basic report; the SP (which only takes in 1 param) was taking 5 seconds to bring back 10K records, yet the report would take 6 minutes to run. According to Profiler and the RS ExecutionLogStorage table, the report was spending all its time on the query. Brian S.'s comment led me to the solution: I simply added WITH RECOMPILE before the AS statement in the SP, and now the report time pretty much matches the SP execution time.
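For reference, a sketch of where the option goes (the procedure and parameter names are hypothetical):

CREATE PROCEDURE dbo.GetReportData
    @RegionId int
WITH RECOMPILE  -- recompile the whole procedure on every call
AS
BEGIN
    SELECT *
    FROM   dbo.report_data
    WHERE  region_id = @RegionId;
END

Unlike the statement-level OPTION (RECOMPILE) above, WITH RECOMPILE applies to the entire procedure.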
I simply deselected 'Repeat header columns on each page' within the Tablix Properties.
If your stored procedure uses linked servers or openquery, they may run quickly by themselves but take a long time to render in SSRS. Some general suggestions:
Retrieve the data directly from the server where the data is stored by using a different data source instead of using the linked server to retrieve the data.
Load the data from the remote server to a local table prior to executing the report, keeping the report query simple.
Use a table variable to first retrieve the data from the remote server and then join it with your local tables, instead of directly returning a join with a linked server (a sketch follows below).
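A hedged sketch of that last approach; the linked server name, columns, and local table are placeholders:

-- pull the remote rows once, into a table variable
DECLARE @remote TABLE (product_id int, product_name varchar(100));

INSERT INTO @remote (product_id, product_name)
SELECT product_id, product_name
FROM   OPENQUERY(REMOTE_SRV, 'SELECT product_id, product_name FROM products');

-- join locally, so SSRS never waits on the linked server while rendering
SELECT r.product_name, o.qty
FROM   @remote AS r
JOIN   dbo.local_orders AS o ON o.product_id = r.product_id;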
I see that the question has been answered, I'm just adding this in case someone has this same issue.
I had HTML output trouble on a report retrieving 32,000 rows. The query ran fast, but the output into the web browser was very slow. In my case I had to activate “Interactive Paging” to let the user see the first page and still be able to generate an Excel file. The pro of this solution is that the first page appears fast and the user can generate an export to Excel or PDF; the con is that the user can scroll only the current page. If the user wants to see more content, he/she must use the navigation buttons above the grid. In my case the users accepted this behavior because the export to Excel was more important.
To activate “Interactive Paging” you must click on the free area in the report pane and change the “InteractiveSize” / “Height” property at the report level in the Properties pane. Set this property to something other than 0; I set it to 8.5 inches in my case. Also ensure that you uncheck the “Keep together on one page if possible” property at the Tablix level (right-click on the Tablix, then “Tablix Properties”, then “General” / “Page Break Options”).
I came across a similar issue of my stored procedure executing quickly from Management Studio but very slowly from SSRS. After a long struggle I solved the issue by physically deleting the stored procedure and recreating it. I am not sure of the logic behind it, but I assume it was because of a change in the structure of a table used in the stored procedure.
I faced the same issue. For me the fix was just to uncheck the option
Tablix Properties => Page Break Options => Keep together on one page if possible
in the SSRS report. It was trying to put all the records on the same page instead of creating many pages.
Aside from the parameter-sniffing issue, I've found that SSRS is generally slower at client-side processing than (in my case) Crystal Reports. The SSRS engine just doesn't seem as capable when it has a lot of rows to locally filter or aggregate. Granted, these are result-set design problems which can frequently be addressed (though not always, if the details are required for drilldown), but the more um...mature...reporting engine is more forgiving.
In my case, I just had to disconnect and reconnect SSMS. I profiled the query and the duration of execution was showing 1 minute, even though the query itself runs in under 2 seconds. I restarted the connection and ran it again, and this time the duration showed the correct execution time.
I was able to solve this by removing the [&TotalPages] built-in field from the bottom of the report. The time went down from minutes to less than a second.
Something odd that I could not determine was having an impact on the calculation of total pages.
I was using SSRS 2012.
A couple of things you can do: without executing the actual report, just run the sproc from within the Data tab of Reporting Services. Does it still take time?
Another option is to use SQL Profiler and determine what is coming in and out of the database system.
Another thing you can do to test it is to recreate a simple report without any parameters. Run the report and see if it makes a difference. It could be that your RS report is corrupted or badly formed, which may cause the rendering to be really slow.
Had the same problem, and fixed it by giving the shared dataset a default parameter and updating that dataset in the reporting server.
Do you use "group by" in the SSRS table?
I had a report with 3 grouped-by fields, and I noticed that the report ran very slowly despite having a light query, to the point where I couldn't even type values into the search field.
Then I removed the groupings, and now the report comes up in seconds and everything works in an instant.
In our case, no code was required.
Note from our Help Desk: "Clearing out your Internet Setting will fix this problem."
Maybe that means "clear cache."
