VB6 Recordset "Open" taking more time to show results than the backend - vb6

I am using a query to find data.
The same query takes 2 seconds to execute directly on the backend, but in code the same query takes 30 seconds in Recordset.Open.
Database : Sybase
Thanks
Code Sample :
Dim rsRoute As New ADODB.Recordset
---------------------------------------------
If rsRoute.State = 1 Then rsRoute.Close
Set rsRoute = New ADODB.Recordset
Set rsRoute.ActiveConnection = con
rsRoute.CursorLocation = adUseClient
rsRoute.CursorType = adOpenKeyset
rsRoute.LockType = adLockBatchOptimistic
strCmd = " select * from Table where CoumnVal =1 "
con.Errors.Clear
On Error Resume Next
rsRoute.Open strCmd

There are several CursorType options and two different CursorLocation varieties. On the Sybase database (ASE back in the day) the performance differs wildly depending on what you choose. Try both client-side and server-side cursors and see what happens.
If you just need to loop through the result once, select the adOpenForwardOnly cursor type. It usually results in the best performance.
EDIT: Based on the code you posted, try a) not locking anything (adLockReadOnly instead of adLockBatchOptimistic), b) using an adOpenForwardOnly cursor, and c) keeping the cursor on the server (adUseServer), as sketched below.
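A minimal sketch of that combination, reusing the con connection and strCmd query from the question:
Dim rsRoute As ADODB.Recordset
Set rsRoute = New ADODB.Recordset
rsRoute.CursorLocation = adUseServer   ' keep the cursor on the server
' Forward-only, read-only recordset: no keyset or lock bookkeeping
rsRoute.Open strCmd, con, adOpenForwardOnly, adLockReadOnly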

Related

Any way to iterate quickly over a collection of UDTs?

Let's say I have a collection of UDTs. I populate it as below:
public type udtEmp
Id as long
Name as string
end type
dim col as new Collection
dim empRec as udtEmp, empDummy as udtEmp
for n = 1 to 100000
empRec = empDummy ' reset record
empRec.Id = n
empRec.Name = "Name " & n
col.Add empRec, CStr(empRec.Id)
next
Now I want to loop through it. I am using a Long data type as the index to .Item()
dim n as long
For n = 1 To 100000
empRec = col.Item(n)
Next
The code above works, but it's really slow - it takes 10,000 milliseconds to iterate. If I access the collection via a key, it's much faster - 78 milliseconds.
For n = 1 To 100000
empRec = col.Item(CStr(n))
Next
The problem is that when I iterate over the collection, I don't have the keys. If I had a collection of objects instead of UDTs, I could do For Each obj In col, but with UDTs it won't let me iterate that way.
One of my thoughts was to have a secondary collection of indexes and keys to point to the main collection, but I am trying not to complicate the code unless I absolutely have to.
So what are my options?
The elegance of the code versus its performance is a serious decision you have to make, and the choice should be based on the impact of the results. For Each is elegant but slow and goes with objects and classes; if speed matters, use UDTs and arrays.
In your case, I think an array of UDTs is best suited to your situation (see the sketch below). To gain more speed, try accessing the arrays through the underlying SAFEARRAY structure (which you can google for); the results are impressive.
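A minimal sketch of the array-of-UDT approach, reusing the udtEmp type from the question:
Dim emps() As udtEmp
Dim s As String
Dim n As Long

ReDim emps(1 To 100000)
For n = 1 To 100000                     ' load: direct writes into a contiguous array
    emps(n).Id = n
    emps(n).Name = "Name " & n
Next

For n = LBound(emps) To UBound(emps)    ' iterate by index: plain memory access, no Collection lookup
    s = emps(n).Name
Next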
You can use a collection class based on a user-defined class. It provides For Each iteration with great performance.
The easiest way to make that happen is through the Class Builder Utility (https://msdn.microsoft.com/en-us/library/aa442930(v=vs.60).aspx). You might need to run the Add-in Manager first and load the Class Builder Utility. (There were install options covering these features when you installed VB6/VS6, so if you don't see the Class Builder Utility in the Add-in Manager, that could be why.)
To match your udt sample, using the Class Builder Utility, first add a class (eg: Employee), with two properties (eg: EmpId and EmpName, long and string types respectively). Then add a collection (eg: Employees) based on the Employee class. Save it to the project (that will create two new class modules) and close the Utility.
Now you can create the new Employees collection, load it up, and iterate through it via index, key or for-each. (note: don't use a pure number for the key - requesting an item by a key that is a pure number, even as a string, will be interpreted as an index request, it'll be slow and you probably won't get the desired item)
Also - once the new classes have been created, you can add customized properties and methods to them to handle whatever kinds of fancy stuff you may have requirements for.
Dim i As Long
Dim Emp As Employee
Dim colEmp As New Employees
Dim name As String
' Loading
For i = 1 To 100000
colEmp.Add i, "name" & CStr(i), "key" & CStr(i)
Next i
' iterate with index
For i = 1 To 100000
Set Emp = colEmp(i)
name = Emp.EmpName
Next i
' iterate with key
For i = 1 To 100000
Set Emp = colEmp("key" & i)
name = Emp.EmpName
Next i
'iterate with for-each
For Each Emp In colEmp
name = Emp.EmpName
Next Emp
Timings
On my system for the above code:
Loading time: 1 second
Index time: 20 seconds
Key time: 0.29 seconds
For-each time: 0.031 seconds

"Multiple- Step operation generated errors. check each status value." error in VB6 Application

When I try to insert a value into the recordset's 'Description' field, it shows an error like:
Run-time error '-2147217887 (80040e21)'
Multiple-step operation generated errors. Check each status value.
sql = "SELECT * FROM vePODetail WHERE vePOID=" & Str(ado_veReceive.Recordset("vePOID")) & " ORDER BY vePODetailID"
rs.ActiveConnection = g_cnnCompany
rs.Open sql
Do While Not rs.EOF
ado_veReceiveDetailWF.Recordset.AddNew
ado_veReceiveDetailWF.Recordset("vePODetailID") = rs("vePODetailID")
ado_veReceiveDetailWF.Recordset("prMasterID") = rs("prMasterID")
ado_veReceiveDetailWF.Recordset("Description") = rs("Description")
ado_veReceiveDetailWF.Recordset("QuantityReceived") = rs("QuantityOrdered") -rs("QuantityReceived")
ado_veReceiveDetailWF.Recordset.Update
rs.MoveNext
Loop
rs.Close
The field in the recordset accepts only 50 characters.
Please tell me how to increase the size/length of the field in the recordset.
If the field is 50 chars long, you must change the DB's definition of the field from 50 to whatever you need. You cannot do that through a recordset.
Assuming you're using SQL Server, you can change your query using a CAST operation:
sql = "SELECT vePODetailID,prMasterID,CAST(Description as VARCHAR(100)) AS Description, QuantityReceived FROM vePODetail WHERE vePOID=" & Str(ado_veReceive.Recordset("vePOID")) & " ORDER BY vePODetailID"
That should set the length of the Description field in the recordset to 100 characters. You can do this in other db platforms as well, but the syntax may be different for the CAST.

Does ADO Recordset need to be closed?

I have an ADO recordset (not ADO.NET) that I populate in every iteration of a loop.
My question is: do I need to close the recordset at the end of every iteration so that it gets populated with fresh data in the next iteration, or can I just reuse the unclosed recordset to populate it with new data in the next iteration? Please look at the code sample below.
set rs=Server.CreateObject("ADODB.recordset")
for count = 0 to 3
rs.Open "Select * from Customers where CustomerId = " & count, conn
'do some processing of data in recordset
'rs.Close 'NOT VERY SURE IF I NEED TO DO THIS
next
You cannot open a recordset that is already open:
Error 3705 : Operation is not allowed when the object is open
So given the sample above, which requires a different selection of data on each pass, you must close the recordset, as in the sketch below.
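A minimal sketch of the corrected loop from the question, closing the recordset on each pass:
set rs = Server.CreateObject("ADODB.Recordset")
for count = 0 to 3
    rs.Open "Select * from Customers where CustomerId = " & count, conn
    'do some processing of data in recordset
    rs.Close   'release this result set so the next rs.Open succeeds
next
set rs = nothing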

Picking query based on parameter in Oracle PL/SQL

Ok, say I have a query:
SELECT * FROM TABLE_AWESOME WHERE YEAR = :AMAZINGYEAR;
Which works very nicely. But say I want to be able to return either just those results or all results based on a drop down. (e.g., the drop down would have 2008, 2009, ALL YEARS)
I decided to tackle said problem with PL/SQL with the following format:
DECLARE
the_year VARCHAR(20) := &AMAZINGYEAR;
BEGIN
IF the_year = 'ALL' THEN
SELECT * FROM TABLE_AWESOME;
ELSE
SELECT * FROM TABLE_AWESOME WHERE YEAR = the_year;
END IF;
END;
Unfortunately, this fails. I get errors like "an INTO clause is expected in this SELECT statement".
I'm completely new to PL/SQL, so I think I'm just expecting too much of it. I have looked over the documentation but haven't found any reason why this wouldn't work the way I have it. The query I'm actually using is much, much more complicated than this, but I want to keep this simple so I'll get answers quickly.
Thanks in advance :)
There is a real danger in the queries offered by Jim and Alex.
Assume you have 20 years of data in there, so a query on a single YEAR returns 5% of the blocks. I say blocks and not rows because I assume the data is added by date, so the rows for a given year are physically clustered together.
If you want 1 year, you want the optimizer to use an index on year to find those 5% of rows.
If you want all years, you want the optimizer to use a full table scan to get every row.
Are we good so far?
Once you put this into production, the first time Oracle loads the query it peeks at the bind variable and formulates a plan based on that.
So let's say the first load is 'ALL'.
Great, the plan is a Full table scan (FTS) and that plan is cached and you get all the rows back in 5 minutes. No big deal.
The next run you say 1999. But the plan is cached, so it uses a FTS to get just 5% of the rows and it takes 5 minutes. "Hmmm," the user says, "that was many fewer rows and the same time." But that's fine... it's just a 5 minute report... life is a little slow when it doesn't have to be, but no one is yelling.
That night the batch jobs blow that query out of the cache and in the morning the first user asks for 2001. Oracle checks the cache, not there, peeks at the variable, 2001. Ah, the best plan for that is an index scan. and THAT plan is cached. The results come back in 10 seconds and blows the user away. The next person, who is normally first, does the morning "ALL" report and the query never returns.
WHY?
Because it's getting every single row by looking through the index.... horrible nested loops. The 5 minute report is now at 30 and counting.
Your original post has the best answer: two queries. That way both will ALWAYS get the best plan, and bind variable peeking won't kill you.
The problem you're having is just a fundamental Oracle issue. You run a query from a tool and get the results back INTO the tool. If you put a select statement into a pl/sql block you have to do something with it. You have to load it into a cursor, or array, or variable. It's nothing to do with you being wrong and them being right... it's just a lack of pl/sql skills.
You could do it with one query, something like:
SELECT * FROM TABLE_AWESOME WHERE (? = 'ALL' OR YEAR = ?)
and pass it the argument twice.
In PL/SQL you have to SELECT ... INTO something, which you need to be able to return to the client; that could be a ref cursor as tanging demonstrates. This can complicate the client.
You can do this in SQL instead with something like:
SELECT * FROM TABLE_AWESOME WHERE :AMAZINGYEAR = 'ALL' OR YEAR = :AMAZINGYEAR;
... although you may need to take care about indexes; I'd look at the execution plan with both argument types to check it isn't doing something unexpected.
Not sure about using a SqlDataSource, but you can definitely do this via System.Data.OracleClient or the Oracle clients.
You would do this via an anonymous block in ASP.NET:
VAR SYS1 REFCURSOR;
VAR SYS2 REFCURSOR;
DECLARE
FUNCTION CURSORCHOICE(ITEM IN VARCHAR2) RETURN SYS_REFCURSOR IS
L_REFCUR SYS_REFCURSOR;
returnNum VARCHAR2(50);
BEGIN
IF upper(item) = 'ALL' THEN
OPEN L_REFCUR FOR
SELECT level FROM DUAL
CONNECT BY LEVEL < 15 ;
ELSE
OPEN L_REFCUR FOR
SELECT 'NONE' FROM DUAL ;
END IF;
RETURN L_REFCUR;
END ;
BEGIN
:SYS1 := CURSORCHOICE('ALL');
:SYS2 := CURSORCHOICE('NOT ALL');
end ;
/
PRINT :SYS1 ;
PRINT :SYS2 ;
whereas you would simply create an output param (of type ref cursor) -- instead of the VAR SYS# ref cursors -- and pretty much just amend the above code.
I answered a similar question about getting an anonymous block ref cursor here:
How to return a RefCursor from Oracle function?
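A hedged VB.NET sketch of that amendment, using System.Data.OracleClient (sql is assumed to hold the anonymous block rewritten with :ITEM and :SYS1 binds, and conn an open OracleConnection; with ODP.NET the parameter type would be OracleDbType.RefCursor instead):
' Requires Imports System.Data and Imports System.Data.OracleClient
Dim cmd As New OracleCommand(sql, conn)
cmd.CommandType = CommandType.Text
cmd.Parameters.Add("ITEM", OracleType.VarChar).Value = "ALL"
Dim curOut As OracleParameter = cmd.Parameters.Add("SYS1", OracleType.Cursor)
curOut.Direction = ParameterDirection.Output
' ExecuteReader surfaces the REF CURSOR output parameter as a data reader.
Using reader As OracleDataReader = cmd.ExecuteReader()
    While reader.Read()
        ' consume whichever result set CURSORCHOICE opened
    End While
End Using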
This kind of parameter should be handled from within your code so that your OracleCommand object executes only one of the two queries.
using (var connection = new OracleConnection(connString)) {
    connection.Open();
    // Only append the WHERE clause (and its parameter) when a specific year is requested.
    string sql = "select * from table_awesome";
    bool filterByYear = !theYear.Equals("ALL");
    if (filterByYear) sql = string.Concat(sql, " where year = :pYear");
    using (var command = connection.CreateCommand()) {
        command.CommandText = sql;
        command.CommandType = CommandType.Text;
        if (filterByYear) {
            var parameter = command.CreateParameter();
            parameter.ParameterName = "pYear";
            parameter.Direction = ParameterDirection.Input;
            parameter.Value = theYear;
            command.Parameters.Add(parameter);
        }
        using (var reader = command.ExecuteReader()) {
            if (!reader.HasRows) return;
            while (reader.Read()) {
                // Extract your data from the OracleDataReader instance here.
            }
        }
    }
}

GridView (RadGrid) and Custom Paging

Ok, so I'm trying to get my custom paging going on the Telerik RadGrid (similar to the asp:Gridview), but I'm still hitting a wall. (the first part of my question was answered here)
So I have implemented the suggestion. I use the following Stored Proc
ALTER PROCEDURE [dbo].[bt_HealthMonitor_GetAll]
(
@StartRowIndex int,
@MaximumRows int
)
AS
SET NOCOUNT ON
Select
RowNum,
[ID],
[errEx],
[errURL],
[errSource],
[errUser],
[errMessage],
[errIP],
[errBrowser],
[errOS],
[errStack],
[errDate],
[errNotes]
From
(
Select
[ID],
[errEx],
[errURL],
[errSource],
[errUser],
[errMessage],
[errIP],
[errBrowser],
[errOS],
[errStack],
[errDate],
[errNotes],
Row_Number() Over(Order By [ID]) As RowNum
From dbo.[bt_HealthMonitor] t
)
As DerivedTableName
Where RowNum Between @StartRowIndex And (@StartRowIndex + @MaximumRows)
Order By [ID] Desc
Then another stored procedure to get the record count
ALTER PROCEDURE [dbo].[bt_HealthMonitor_GetRecordCount]
AS
SET NOCOUNT ON
return (Select Count(ID) As TotalRecords From bt_HealthMonitor)
And I'm using LINQ to SQL to bind to my RadGrid
Protected Sub RadGrid1_NeedDataSource(ByVal source As Object, ByVal e As Telerik.Web.UI.GridNeedDataSourceEventArgs)
Dim startRowIndex As Integer = (RadGrid1.CurrentPageIndex * RadGrid1.PageSize)
Dim maximumRows As Integer = RadGrid1.PageSize
Dim HealthMonitorDC As New DAL.HealthMonitorDataContext
Dim r = HealthMonitorDC.bt_HealthMonitor_GetAll(startRowIndex, maximumRows)
RadGrid1.DataSource = r
End Sub
Protected Sub Page_PreInit(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.PreInit
Dim HealthMonitorDC As New DAL.HealthMonitorDataContext
Dim count = HealthMonitorDC.bt_HealthMonitor_GetRecordCount()
RadGrid1.MasterTableView.VirtualItemCount = count.ReturnValue
RadGrid1.VirtualItemCount = count.ReturnValue
End Sub
But the problem I'm experiencing is that the grid only grabs the first 10 rows (as expected), yet I need it to recognize that there are 200 rows in the table so that the paging icons show up.
If I use the dropdownlist to display 50 records, then 50 show up, but still no paging icons to get me to the next 50.
What am I doing wrong?
You need to tell the grid how many records there are in total. This is done by setting the grid's VirtualItemCount property (you will have to query the total number of records).
For details, have a look at the documentation page or refer to the online demo for custom paging.
Martin is correct regarding VirtualItemCount. The easiest place to implement this is in the NeedDataSource event.
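For example, a minimal sketch reusing the DataContext calls from the question (custom paging also has to be enabled on the grid, e.g. AllowCustomPaging="True"):
Protected Sub RadGrid1_NeedDataSource(ByVal source As Object, ByVal e As Telerik.Web.UI.GridNeedDataSourceEventArgs) Handles RadGrid1.NeedDataSource
    Dim dc As New DAL.HealthMonitorDataContext
    ' Tell the grid the total row count first so it renders the pager.
    RadGrid1.VirtualItemCount = dc.bt_HealthMonitor_GetRecordCount().ReturnValue
    ' Then bind only the rows for the current page.
    Dim startRowIndex As Integer = RadGrid1.CurrentPageIndex * RadGrid1.PageSize
    RadGrid1.DataSource = dc.bt_HealthMonitor_GetAll(startRowIndex, RadGrid1.PageSize)
End Sub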
Remember that you'll need to put some logic in there to account for fewer records on the last page. That means that if you have 14 records with 5 per page, you want to make sure your logic only tries to retrieve 4 records on the last call.
Here's how I did it (using a generic list):
If gridRecords.Count < (grid.pagesize * (grid.pageIndex + 1)) Then
gridRecords.GetRange(grid.pageIndex * grid.pagesize, gridRecords.Count - (grid.pagesize * grid.pageIndex))
Else
gridRecords.GetRange(grid.pageIndex * grid.pagesize, grid.pagesize)
End If
Obviously, you'll want to do this as part of your data access call if you're only retrieving the records from the database as you go.
You can also implement this using an ObjectDataSource:
http://www.unboxedsolutions.com/sean/archive/2005/12/28/818.aspx
i.e. using RadGrid with an ObjectDataSource and custom paging, where the paging logic needs to be implemented on your own.
ObjectDataSource also has two relevant properties (a sketch follows below):
1. SelectMethod (where you specify the method which returns the data)
2. SelectCountMethod (where you specify the method which returns the total count)
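A hedged sketch of that wiring; the HealthMonitorData class, its method names, and the control IDs are illustrative (not a prescribed API), and the DataContext calls are reused from the question:
<asp:ObjectDataSource ID="odsHealth" runat="server"
    TypeName="HealthMonitorData"
    SelectMethod="GetPage" SelectCountMethod="GetTotalCount"
    EnablePaging="true"
    StartRowIndexParameterName="startRowIndex"
    MaximumRowsParameterName="maximumRows" />
<telerik:RadGrid ID="RadGrid1" runat="server" DataSourceID="odsHealth"
    AllowPaging="true" PageSize="10" />

' Requires Imports System.Collections and Imports System.Linq
Public Class HealthMonitorData
    ' Called by the ObjectDataSource's SelectMethod once per page.
    Public Function GetPage(ByVal startRowIndex As Integer, ByVal maximumRows As Integer) As IEnumerable
        Dim dc As New DAL.HealthMonitorDataContext
        Return dc.bt_HealthMonitor_GetAll(startRowIndex, maximumRows).ToList()
    End Function

    ' Called by the SelectCountMethod so the grid can size its pager.
    Public Function GetTotalCount() As Integer
        Dim dc As New DAL.HealthMonitorDataContext
        Return CInt(dc.bt_HealthMonitor_GetRecordCount().ReturnValue) ' mirrors the question's ReturnValue usage
    End Function
End Class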
