How to know in advance if data will fit in a DataSet - Oracle

I'm maintaining an old Delphi 7 app that uses Allround Automations' DOA 4 to access an Oracle DB.
The app has an ad-hoc configurable reporter unit that sometimes fails because the data selected from the DB won't fit in the TOracleDataSet. My question is how to know in advance whether the data will fit.
I tried running a count(*) query first, which, multiplied by DataSet.RecordSize, gives me the volume of data I would have to allocate. But I don't know how much memory is available to receive the data, or whether that memory is limited by the dataset component.
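The check I tried looks roughly like this; :record_size is bound from DataSet.RecordSize and the inner query stands in for the report's SELECT (the table name is just an example):

SELECT COUNT(*) * :record_size AS estimated_bytes
FROM (
  SELECT * FROM report_source_table
);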
Greetings, Marcelo.

Related

High volume data storage and processing

I am building a new application where I expect a high volume of geolocation data: moving objects sending geo-coordinates every 5 seconds. This data needs to be stored in a database so that it can be used to track the moving objects on a map at any time. I expect about 250 coordinates per moving object per route, each object can run about 50 routes a day, and I have 900 such objects to track. That comes to about 11.25 million geo-coordinates to store per day, and I have to keep at least one week of data in the database.
This data will basically be used for simple queries, such as finding all the geo-coordinates for a particular object and a particular route, so the queries are not very complicated and the data will not be used for any analysis.
So, my question is: should I just go with a normal Oracle database such as 12c distributed over two VMs, or should I consider big-data technologies such as NoSQL or Hadoop?
One of the key requirements is high performance: each query has to respond within 1 second.
Since you know the volume of data (about 11 million points per day), you can easily simulate your whole scenario in an Oracle DB and test it thoroughly beforehand.
My suggestion is to use day-level partitions, subpartitioned by object and route, and make sure all your business SQL always hits the right partitions.
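A minimal sketch of one such layout; all names are made up, and since Oracle composite partitioning only allows two levels, this variant uses daily interval partitions with hash subpartitions on the object, leaving the route to an index or filter:

CREATE TABLE geo_points (
  object_id   NUMBER      NOT NULL,
  route_id    NUMBER      NOT NULL,
  captured_at TIMESTAMP   NOT NULL,
  lat         NUMBER(9,6),
  lon         NUMBER(9,6)
)
PARTITION BY RANGE (captured_at) INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
SUBPARTITION BY HASH (object_id) SUBPARTITIONS 16
( PARTITION p_first VALUES LESS THAN (TIMESTAMP '2016-01-01 00:00:00') );

-- The tracking query then prunes to one day's partition and one subpartition:
SELECT captured_at, lat, lon
FROM   geo_points
WHERE  object_id = :obj
AND    route_id  = :route
AND    captured_at >= TRUNC(SYSDATE)
AND    captured_at <  TRUNC(SYSDATE) + 1;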
You will also probably need to purge older days' data, for example by dropping whole day partitions (ALTER TABLE ... DROP PARTITION), or by building an aggregate over past days and then deleting the raw data.
It is well doable in 12c.

Minimize round-trips between client and database with Oracle Forms 10

Our clients work in particular conditions and their latency is high.
All our Oracle Forms 10 forms call a procedure that translates all the labels and other text displayed in the form. So for every label to translate, a call is made from the client to the database (the procedure translating the labels is in a database package and the data is in a table).
We would like to reduce all these round trips, and I was thinking that if I could store the whole dictionary locally (in a .jar? it is around 1 MB when exported as text), I could improve the performance. Around 300 round trips are currently made every time we open a new form.
So my goal is to
- store some data locally
- be able to read that data locally (via an attached library?)
I am a bit lost. How can I do it (store the data) and how can I access it?
Many thanks in advance
What you could do is create an attached library containing a package with global constants for all your translations. You can then reference these directly; because it is a Forms package and not a database package, no database calls are made.
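A minimal sketch of such a package in an attached library (.pll); the package and label names are made up:

PACKAGE pkg_labels IS
  lbl_customer CONSTANT VARCHAR2(100) := 'Client';
  lbl_invoice  CONSTANT VARCHAR2(100) := 'Facture';
  lbl_save     CONSTANT VARCHAR2(100) := 'Enregistrer';
END pkg_labels;

-- In a form trigger such as WHEN-NEW-FORM-INSTANCE, labels are then set
-- without touching the database, e.g.:
-- SET_ITEM_PROPERTY('BLK.CUSTOMER', PROMPT_TEXT, pkg_labels.lbl_customer);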

Ruby - updating large amounts of data into Postgres DB on Heroku

I have a Ruby application with a Postgres DB on Heroku.
Some of the data was migrated incorrectly.
I want to update the data in about 5 thousand rows. I do not want to blow the DB away and re-migrate.
What would be the best way to do this?
I have updated small amounts of data using ActiveRecord-style SQL, but I am not sure about a large amount.
thanks in advance
Maggs
It really depends on how many fields, and what type of data, you are trying to change.
Is the new value different for each row, or can you run a loop that iterates through the rows and updates each value? Please be more specific about the data and the kind of editing you are trying to do.
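If the bad values follow a pattern, a single set-based UPDATE handles 5,000 rows trivially, with no need to loop at all. A sketch with made-up table and column names, runnable from heroku pg:psql or via ActiveRecord::Base.connection.execute:

-- Hypothetical fix: the migration left country_code empty on older rows.
UPDATE customers
SET    country_code = 'AU'
WHERE  country_code IS NULL
  AND  created_at < '2014-06-01';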

Rhomobile inserting into local database using CSV or XML from external web server

I am currently developing a Rhomobile application. I have a backend database which holds customer information. From the web server I get a CSV string (or XML, which I am able to parse using REXML) containing all the customers. Each time the device syncs, I am going to reset the customer table on the device and re-insert all the data from the backend database. I am not using RhoSync, and the device will be using property-bag models.
Is it possible to use the CSV or XML data to insert into the customers table? If so, how would I go about it?
At the moment the only option I can see is to manually loop through the CSV/XML and insert each row into the database, which isn't very elegant.
Any help will be much appreciated; sorry if this is a dumb question, I am still relatively new to this framework.
I have come to the conclusion that the only way is to loop through the CSV/XML, which, with the help of a database transaction, doesn't take long.
Using a fixed schema also improves performance a lot, because a property bag has to do one insert per attribute (so if you have lots of columns, there are lots of inserts per record).
Also, in Rhomobile garbage collection is turned off, so if you try to process large data sets your device will quickly run out of memory. Re-enabling it solves this issue:
GC.enable
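For reference, the work the sync loop drives against the device's local SQLite database boils down to something like the following; the single transaction around the whole batch is what keeps it fast (table and column names are made up):

BEGIN TRANSACTION;
DELETE FROM customers;  -- reset the table on each sync
INSERT INTO customers (id, name, email) VALUES (1, 'Ann Example', 'ann@example.com');
INSERT INTO customers (id, name, email) VALUES (2, 'Bob Example', 'bob@example.com');
-- ...one INSERT per row parsed from the CSV/XML...
COMMIT;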

Slow filling of a .NET DataSet from Oracle 9i

This is a problem that my friend asked me about over the phone. The C# 3.5 program he has written fills a DataSet from a patient master table which has 350,000 records, using the Microsoft ADO.NET driver for Oracle. The ExecuteQuery method takes over 30 seconds to fill the dataset; however, the same query (fetching about 20K records) takes less than 3 seconds in Toad. He is not using any transactions within the program, and there is an index on the column (Name) being used to search.
These are some alternatives I suggested:
1) Use a DataReader to populate a DataTable and pass it to the form to bind to the combo box (probably not a good idea, since it is likely to take the same time).
2) Try Oracle's own ADO.NET driver.
3) Use ANTS Profiler to see if you can pinpoint any particular ADO.NET call.
Has anyone faced similar problems, and what are some ways of resolving this?
Thanks,
Chak.
You really need to do an extended SQL trace to see where the slowness is coming from. Here is a paper from Cary Millsap (of Method R, formerly of Hotsos) that details how to do this:
http://method-r.com/downloads/doc_details/10-for-developers-making-friends-with-the-oracle-database-cary-millsap
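For what it's worth, switching the extended trace on for a test session usually looks like this (the identifier is just an example; level 12 includes bind values and wait events):

ALTER SESSION SET tracefile_identifier = 'patient_fill';
ALTER SESSION SET events '10046 trace name context forever, level 12';
-- run the slow fill here, then:
ALTER SESSION SET events '10046 trace name context off';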
Toad typically fetches only the first x rows (500 in my setup), so double-check that the comparison is valid.
Then you should try to separate the DB work from the form work, if possible, to see whether the DB is taking up the time.
If that's the case, try the Oracle libraries to see if they are any faster; we've seen 50% improvements between the latest Oracle driver and the standard Microsoft driver.
Without knowing the actual code he uses to accomplish his tasks, and without knowing the number of rows he's actually fetching (I hope he doesn't read all 350K of them?), it's impossible to say anything that's going to help him.
Have him add a code snippet to the question for clarity.
