I have a Crystal Report that calls a function in Oracle. This function uses a SELECT that includes a view as one of its tables. The view uses multiple links to other databases (the view is a union of several queries). Whenever this function runs, it appears as though the query for the view is run, and every link the view uses seems to go into enq: DX - contention.
Is this related to this known issue?
http://surachartopun.com/2008/12/dbink-hangs-enq-dx-contention.html
The reason I ask is that my research seems to indicate that this problem should only happen when linking between different versions of Oracle, but all of the databases I am using are 10.2.
Has this method ever worked? It does not sound like a good way to do things. I can just imagine the function causing an enormous number of round trips as it processes each row.
Some other ways of doing this include putting the function as close to the biggest data table as possible and keeping the processing on that database. Only return what you need from the function.
You could create a table representing your desired data set from a query every morning and use Crystal to query that directly. Just refresh your data as required using stored procedures or advanced queues.
A materialized view on one of the database sources with a refresh would be another approach.
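For illustration, here is a minimal sketch of that approach, assuming a hypothetical remote table SALES reached over a database link called REMOTE_DB and a nightly complete refresh:
-- Hypothetical names: SALES@REMOTE_DB is the remote table; refreshed nightly at 02:00.
CREATE MATERIALIZED VIEW SALES_SUMMARY_MV
BUILD IMMEDIATE
REFRESH COMPLETE
START WITH SYSDATE
NEXT TRUNC(SYSDATE) + 1 + 2/24
AS
SELECT REGION, PRODUCT_ID, SUM(AMOUNT) AS TOTAL_AMOUNT
FROM SALES@REMOTE_DB
GROUP BY REGION, PRODUCT_ID;
Crystal would then query SALES_SUMMARY_MV locally and never touch the database link at report time.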
EDIT: Yes, using database links can be resource intensive and cause the kind of problems you have seen. Has this ever worked, or is there a new requirement you need a solution for?
If it used to work, what changed?
If this is new, I humbly suggest one of the different approaches above.
I have recently been working with an Oracle database to generate some reports. What I need is to get result sets of specific records (SELECT statements only), sometimes quite large, to be used for generating the report in an Excel file.
At first, the reports were queried from views, but some of them are slow (they have some complex subqueries). I was asked to improve the performance and also fix some field mappings. I also want to tidy things up, because when I query against a view, I must specifically reference the right column names. I want to keep the data work in the database, with the web app just passing parameters and calling the right result set.
I'm new to Oracle, so which is better for this kind of task: a stored procedure or a function? Or under what conditions might a view be better?
Makes no difference whether you compile your SQL in a view, SP or function. It is the SQL itself that matters.
As long as you are able to meet your requirements with views, they are a good option. If you intend to break your queries up into multiple steps to achieve better performance, then you should go for stored procedures. If you decide to go for stored procedures, it is advisable to create a package and bundle all of them together in it, as in the sketch below. If your problem is performance, there may not be a silver-bullet solution; you will have to work on your queries and on your design.
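As a rough sketch (all names here are invented), a package wrapping the report queries might look like this:
CREATE OR REPLACE PACKAGE REPORT_PKG AS
  -- Returns one report's result set as a cursor the web app can read.
  FUNCTION GET_SALES_REPORT(P_FROM DATE, P_TO DATE) RETURN SYS_REFCURSOR;
END REPORT_PKG;
/

CREATE OR REPLACE PACKAGE BODY REPORT_PKG AS
  FUNCTION GET_SALES_REPORT(P_FROM DATE, P_TO DATE) RETURN SYS_REFCURSOR IS
    L_CUR SYS_REFCURSOR;
  BEGIN
    OPEN L_CUR FOR
      SELECT ORDER_ID, CUSTOMER_NAME, TOTAL_AMOUNT  -- table and columns are assumptions
      FROM ORDERS
      WHERE ORDER_DATE BETWEEN P_FROM AND P_TO;
    RETURN L_CUR;
  END GET_SALES_REPORT;
END REPORT_PKG;
/
The web app then just passes the parameters and reads the cursor, which matches the separation you described.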
If the problem is performance due to complex SELECT queries, you can consider tuning them. Often you will find queries written 15-20 years ago which do not use functionality and techniques introduced by Oracle in more recent versions (even if the organization spent the big bucks to buy those versions - making that a waste of money). Honestly, that may be too much of a task for you if you are new at Oracle; also, some slow queries may have been written by people just like you, many years ago - before they had a chance to learn a lot about Oracle and gain experience with it.
Another thing: if the reports don't need to use the absolutely current state of the underlying tables (for example, if "what was in the tables at the end of the business day yesterday" is acceptable), you can create a materialized view. It will not run any faster than a regular view, but it can be refreshed overnight (say), or every six hours, or whatever - so that the subsequent report processing will not have to wait for the queries to complete. This is one of the main uses of materialized views.
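For example (the materialized view name REPORT_MV is made up), an overnight refresh can be scheduled like this:
-- Assumes a materialized view named REPORT_MV already exists; 'C' means complete refresh.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    JOB_NAME        => 'REFRESH_REPORT_MV',
    JOB_TYPE        => 'PLSQL_BLOCK',
    JOB_ACTION      => 'BEGIN DBMS_MVIEW.REFRESH(''REPORT_MV'', ''C''); END;',
    START_DATE      => SYSTIMESTAMP,
    REPEAT_INTERVAL => 'FREQ=DAILY; BYHOUR=2',
    ENABLED         => TRUE);
END;
/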
Good luck!
I'm learning about Oracle views, and I get the concept of views but am a little confused about performance.
I was watching a video in which I heard that an Oracle view can increase performance. Suppose I have created a view like the one below.
CREATE VIEW SALES_MAN
AS
SELECT * FROM EMP WHERE JOB='SALESMAN';
OK, now I have executed a query to get the SALES_MAN details.
SELECT * FROM SALES_MAN
OK, now the confusion starts.
The video said that once the above query SELECT * FROM SALES_MAN is executed, the data/records are placed into cache memory after hitting the Oracle DB, and if I execute the same query again (in the current session/login) the Oracle engine will not hit the database but will return the records from the cached memory. Is that right?
But I have read on many websites that views add nothing to SQL performance.
Here is another reference that says views do not help performance.
So do views increase performance or not?
A view is just a kind of virtual table defined by a query. It has no data of its own, and performance will depend on the underlying table(s) and the query definition.
But you also have materialized views, which store the results of the view's query definition. The data is synchronized when the view is refreshed, and you can add indexes to the view. In this case, you can achieve better performance. The costs to pay are:
more space (the view contains duplicated data),
no access to up-to-date data from the underlying tables.
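A minimal sketch of that (the object names are invented, reusing the EMP table from the question):
-- The materialized view stores the query result physically...
CREATE MATERIALIZED VIEW EMP_BY_DEPT_MV
REFRESH COMPLETE ON DEMAND
AS
SELECT DEPTNO, COUNT(*) AS EMP_COUNT, AVG(SAL) AS AVG_SAL
FROM EMP
GROUP BY DEPTNO;

-- ...so ordinary indexes can be created on it, unlike on a plain view.
CREATE INDEX EMP_BY_DEPT_MV_IX ON EMP_BY_DEPT_MV (DEPTNO);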
You (and perhaps the creator of the un-cited video) are confusing two different concepts. The reason the data that was in the cache was used on the second execution was NOT because it was the second execution of the view. As long as data remains in the cache, it is available for ANY query that needs it. The very fact that the first execution of the view had to get the data from disk should be a strong clue. And what if your second use of the view wasn't until hours or days later when the data was no longer in the cache? The answer remains. A view does NOT improve performance.
I'm working on migrating an application from Access to Oracle and have run into a strange issue. We have a regular Oracle schema - nothing fancy. On top of that schema I created a number of views - approximately 15. These views use each other, and the dependency tree can be deep - I'd say up to 6-8 levels.
Now I am facing an issue where I cannot create another view - CPU on the Oracle server goes to 50% when I execute the 'create or replace view' statement, and it takes forever. The views are currently in such a state that selecting data from them may take time, but the issue appears in the 'create' statement itself. I'm not using 'select * ...' in the views, and the problematic view depends on just two others.
I'm using Oracle 10g Enterprise v 10.2
In SQL Server I'm familiar with Profiler and would do a trace, view schema locks, but I don't know Oracle that much.
Will appreciate any hints. Thank you.
Views referencing views referencing views strikes me as highly unnecessary. I know we're all supposed to be in favour of "don't repeat yourself" but DRY is a guideline, not a cast-iron rule. It's certainly not intended to be applied so compactly in a database context that nothing compiles.
So try separating out all the views, so that each one references only tables in the FROM clause. That should solve your problem and allow you to make progress with your code.
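For instance (the table and view names are made up), instead of stacking one view on another:
-- Nested style: a view built on top of another view.
CREATE OR REPLACE VIEW ACTIVE_ORDERS_V AS
SELECT ORDER_ID, CUSTOMER_ID, AMOUNT
FROM ORDERS
WHERE STATUS = 'ACTIVE';

CREATE OR REPLACE VIEW ACTIVE_ORDER_TOTALS_V AS
SELECT CUSTOMER_ID, SUM(AMOUNT) AS TOTAL
FROM ACTIVE_ORDERS_V
GROUP BY CUSTOMER_ID;

-- Flattened style: the same view rewritten to reference only the base table.
CREATE OR REPLACE VIEW ACTIVE_ORDER_TOTALS_V AS
SELECT CUSTOMER_ID, SUM(AMOUNT) AS TOTAL
FROM ORDERS
WHERE STATUS = 'ACTIVE'
GROUP BY CUSTOMER_ID;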
You can always review the situation later. The neat thing about a view is that it is just an interface. If you subsequently want to refactor some views, replacing tables with views, then you will be able to with the minimum of inconvenience (unless you re-introduce the compilation hang).
Looking for a bit of advice on how to optimise one of our projects. We have an ASP.NET/C# system that retrieves data from a SQL Server 2008 database and presents it on a DevExpress ASPxGridView. The data that's retrieved can come from one of a number of databases - all of which are slightly different and are being added and removed regularly. The user is presented with a list of live "companies", and the data is retrieved from the corresponding database.
At the moment, data is being retrieved using a standard SqlDataSource and a dynamically-created SQL SELECT statement. There are a few JOINs in the statement, as well as optional WHERE constraints, again dynamically-created depending on the database and the user's permission level.
All of this works great (honest!), apart from performance. When it comes to some databases, there are several hundred thousand rows, and retrieving and paging through the data is quite slow (the databases are already properly indexed). I've therefore been looking at ways of speeding the system up, and it seems to boil down to two choices: XPO or LINQ.
LINQ seems to be the popular choice, but I'm not sure how easy it will be to implement with a system that is so dynamic in nature - would I need to create "definitions" for each database that LINQ could access? I'm also a bit unsure about creating the LINQ queries dynamically too, although looking at a few examples that part at least seems doable.
XPO, on the other hand, seems to allow me to create an XPO Data Source on the fly. However, I can't find much information on how to JOIN to other tables.
Can anyone offer any advice on which method - if any - is the best to try and retro-fit into this project? Or is the dynamic SQL model currently used fundamentally different from LINQ and XPO and best left alone?
Before you go and change the whole way that your app talks to the database, have you had a look at the following:
Run your code through a performance profiler (such as Redgate's performance profiler); the results are often surprising.
If you are constructing the SQL string on the fly, are you using .NET best practices such as String.Concat("str1", "str2") instead of "str1" + "str2"? Remember, multiple small gains add up to big gains.
Have you thought about having a summary table or database that is periodically updated (say every 15 minutes; you might need to run a service to update this data automatically) so that you are only hitting one database? New connections to databases are quite expensive.
Have you looked at the query plans for the SQL that you are running? Today, I moved a dynamically created SQL string into a sproc (only one parameter changed) and shaved 5-10 seconds off the running time (it was being called 100-10,000 times depending on some conditions).
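As a rough illustration only (the object names are invented, not the actual code), the end result of that kind of move looks something like this on the SQL Server side, where the plan can be reused between calls:
CREATE PROCEDURE dbo.GetCompanyOrders
    @CompanyId INT
AS
BEGIN
    -- The same SELECT that used to be built as a dynamic string, now parameterised.
    SELECT o.OrderId, o.OrderDate, c.CompanyName
    FROM dbo.Orders o
    JOIN dbo.Companies c ON c.CompanyId = o.CompanyId
    WHERE o.CompanyId = @CompanyId;
END;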
Just a warning if you do use LINQ: I have seen some developers who decided to use LINQ write more inefficient code because they did not know what they were doing (pulling 36,000 records when they only needed to check for one, for example). These things are very easily overlooked.
Just something to get you started on and hopefully there is something there that you haven't thought of.
Cheers,
Stu
As far as I understand, you are talking about the so-called server mode, where all data manipulation is done on the DB server instead of fetching the rows to the web server and processing them there. In this mode the grid works very fast with data sources that can contain hundreds of thousands of records. If you want to use this mode, you should create either the corresponding LINQ classes or XPO classes. If you decide to use LINQ-based server mode, the LINQServerModeDataSource provides the Selecting event, which can be used to set a custom IQueryable and KeyExpression. I would suggest that you use LINQ in your application. I hope this information will be helpful to you.
I guess there are two points where performance might be tweaked in this case. I'll assume that you're accessing the database directly rather than through some kind of secondary layer.
First, you don't say how you're displaying the data itself. If you're loading thousands of records into a grid, that will take time no matter how fast everything else is. Obviously the trick here is to show a subset of the data and allow the user to page, etc. If you're not doing this then that might be a good place to start.
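As a rough sketch (the table and column names are invented), a paged query in SQL Server 2008 can be written with ROW_NUMBER so that only one page of rows ever leaves the database:
-- Returns rows 101-150 of the ordered result instead of the whole set.
SELECT OrderId, CustomerName, OrderDate
FROM (
    SELECT OrderId, CustomerName, OrderDate,
           ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS rn
    FROM dbo.Orders
) AS paged
WHERE rn BETWEEN 101 AND 150;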
Second, you say that the tables are properly indexed. If this is the case, and assuming that you're not loading 1,000 records into the page at once but retrieving only subsets at a time, then you should be OK.
But if you're only doing an ExecuteQuery() against a SQL connection to get a dataset back, I don't see how LINQ or anything else will help you. I'd say that the problem is obviously on the DB side.
So to solve the problem with the database you need to profile the different SELECT statements you're running against it, examine the query plan and identify the places where things are slowing down. You might want to start by using the SQL Server Profiler, but if you have a good DBA, sometimes just looking at the query plan (which you can get from Management Studio) is usually enough.
If I create thousands of views, does it hamper database performance? I mean, is there any problem with creating thousands of views in Oracle? Please explain, as I am new in this area... I am using Oracle...
The simple existence of these views shouldn't harm performance at all. However, once those views start being used it's possible that there will be some negative performance impact. Oracle tries to "remember" the plan for each statement that it sees, but it compares statements by comparing the source code (the SQL). Your thousands of views will all be named differently since you can't have multiple views with the same name, and thus each time one of them is used Oracle is going to have to do a full parse of the SQL, even if it's something as basic as
SELECT * FROM VIEW_1;
and
SELECT * FROM VIEW_2;
All these re-parses will certainly take some time.
What's different about each of these views? I think it might be a good idea to step back and consider other possibilities. Questions I'd ask include
1. What is to be accomplished here?
2. Why are thousands of different views needed?
3. Is there some other way to accomplish what needs to be done without creating all these views?
I don't know the answers to 1 and 2, but I'm reasonably sure that the answer to #3 is "Yes".
Good luck.
Oracle views are an encapsulation of a complex query and must be used with care. Here are the key facts to remember: Views are not intended to improve SQL performance. When you need to encapsulate SQL, you should place it inside a stored procedure rather than use a view. Views hide the complexity of the underlying query, making it easier for inexperienced programmers and end users to formulate queries. Views can be used to tune queries with hints, provided that the view is always used in the proper context.
source: Guard against performance issues when using Oracle hints and views
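For example (the table and index names are hypothetical), a hint can be baked into the view definition so that every query through the view picks it up:
-- The hint applies whenever the view is queried, which is only safe
-- if the view is always used in that same context.
CREATE OR REPLACE VIEW RECENT_ORDERS_V AS
SELECT /*+ INDEX(O ORDERS_DATE_IX) */
       O.ORDER_ID, O.CUSTOMER_ID, O.ORDER_DATE
FROM ORDERS O
WHERE O.ORDER_DATE > SYSDATE - 30;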
A view is as heavy to run as the SELECT that defines it, but Oracle balances the load, and a single SELECT can't harm the DB. If you have thousands of concurrent SELECTs going, then you might have a problem. The number of views is not important; what matters is how heavy they are and how much you use them.
You would actually need to show the view code and explain what you are actually trying to do.
A view should not affect performance if the optimizer is smart enough. I remember cases with other DB engines where views did harm performance. As with many performance questions, I suggest measuring your particular case.