Oracle not showing all records

I am having issues getting the correct number of rows. I ran this code a while back and, according to the data, this is the number of rows I should be getting:
[screenshot: original result]
But I just ran it again after probably a few weeks, and I am sure the data has not been tampered with because the tables hold the same data as before:
[screenshot: result from the second run]

Related

Unable to Export Data in Toad for Oracle

My Toad for Oracle recently updated to version 14, and since then I have been having an issue with exporting data. This applies to data sets of a few thousand records as well as results with a few million records.
For example, I'm currently attempting to export a data set with around 50k rows, but when I try to export it, I get the following error message:
Qry: Cannot perform this operation on a closed dataset
I also get the above error when trying to skip to the last row of the results in Toad.
I never had this issue in the past and was able to export as many rows as I wanted. Is there some sort of setting I need to enable in Toad to allow this? I have a coworker who is able to export the data from the same query results, so I'm not sure why it is specifically me who is having this issue.

Oracle data insertion strange behavior: data is truncated while inserting

I am using an Oracle database as the back end of one of my projects.
I have a table with an ADDRESS_TYPE column declared as NVARCHAR2(3).
In some scenarios the system tries to insert the text 'Business' into the ADDRESS_TYPE column. When I try it locally, it shows an error that the value is too large for the column, which is the expected result. But with the same code deployed in QA and Production, the data gets inserted as 'Bus': the text 'Business' is truncated and only 'Bus' is stored. My local instance and the QA instance point to the same database.
var data = new MYTABLE();
data.ADDRESS_TYPE = "Business";  // string literal: double quotes in C#
context.MYTABLE.AddObject(data);
context.SaveChanges();
Note: I am using Entity Framework.
I have tried to insert/update the data directly in the database, and there I get the "value too large for column" error. This behavior only happens on the hosted sites. I am planning to add logging to the code to get more information, but before that I thought of asking here, in case anyone has faced the same situation and can help.
I have also tried the same operation locally in release mode, and I get the exception there as well.
Does anyone have any ideas or suggestions on how to investigate this?
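Before adding logging, it may be worth confirming that the environments really match. A minimal data-dictionary check, assuming the MYTABLE/ADDRESS_TYPE names from the code above (run it in each database):

-- Compare the column's declared length across environments.
SELECT owner, data_type, char_length
FROM   all_tab_columns
WHERE  table_name = 'MYTABLE' AND column_name = 'ADDRESS_TYPE';

-- Look for triggers that could be silently shortening the value on insert.
SELECT trigger_name, status
FROM   all_triggers
WHERE  table_name = 'MYTABLE';

If the hosted schema reports a longer column, or an enabled trigger applies something like SUBSTR to ADDRESS_TYPE, that could explain why only the hosted sites accept (and truncate) the value.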

Postgres reltuples seems to return twice the number of rows

I am using this query to get an approximate row count:
SELECT reltuples AS approximate_row_count FROM pg_class WHERE relname = 'table_name';
This was recommended by this article for getting fast approximate row counts: https://wiki.postgresql.org/wiki/Count_estimate
But sometimes it seems to give twice the number of rows. This started happening only after upgrading to 9.6.8; it used to work correctly all the time.
The problem seems to fix itself when I run ANALYZE table_name, but after some time it returns.
I am not exactly sure why this is happening. How can I fix this problem?
My guess right now, after writing to the Postgres mailing list, is that this has to do with a bug in the ANALYZE/VACUUM functionality in Postgres 9.6.8. It is expected to be fixed in the next minor update of Postgres.
Source: https://www.postgresql.org/message-id/CAFWmNu8SfSgBWcMCaWJfDLbcFUN3riC9jDuzOd08QsJgAqv%2B4A%40mail.gmail.com
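For what it's worth, the same wiki page also shows a variant that rescales reltuples by the table's current page count, which should be less sensitive to stale statistics. A sketch, keeping the table_name placeholder; the NULLIF guard against relpages = 0 is an addition here, so the query returns NULL until the table has been analyzed at least once:

SELECT (reltuples / NULLIF(relpages, 0)) *
       (pg_relation_size('table_name') /
        current_setting('block_size')::integer) AS approximate_row_count
FROM pg_class
WHERE relname = 'table_name';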

Why would ActiveRecord::Base.execute_query stop returning all the results after the 20th row?

I'm working on a test suite that requires me to pull data out of an Oracle database. I was using a query that was already written, since that was easier than setting up all the ActiveRecord objects. I'm not using Rails; I'm using the activerecord gem 3.2.6.
I've found that, of the 14 columns of data in the result, starting with the 21st row only 12 of them actually have data. This query returns full data for rows past 20 when used in Oracle SQL Developer, so I know it's not the query.
Could it be that I need to give it more time to finish returning all the data? That doesn't seem likely, since it's always the 21st row where the problem first appears. Is it an issue with the fact that some of the data can be NULL, and ActiveRecord screws up once it finds the first NULL?
I really have no clue what to do here, and the only thing I can think to try is installing different versions of ActiveRecord and hoping that one of them works properly.
"This doesn't really seem likely since it's always the 21st row that first has the problem." Is it an issue with the fact that some of the data can be NULL and once it finds the first NULL activerecord screws up?
Perhaps you can try making one of the rows < 21 (say row 8) have some NULL data and see if the query chokes there?
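If the NULL theory holds, another way to probe it from the SQL side is to coalesce the nullable columns so the client never receives a NULL at all. A sketch; col_13, col_14, and source_table are placeholders, not the asker's schema:

-- If rows past 20 come back complete with this version of the query,
-- the problem is in how NULLs are fetched, not in the data itself.
SELECT NVL(col_13, 'missing') AS col_13,
       NVL(col_14, 'missing') AS col_14
FROM   source_table;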

WebUtil's CLIENT_TEXT_IO.PUT_LINE when writing CSV files

We're migrating our Oracle Forms and Oracle Reports from 6i to 10g on Windows 7. When we changed to the new PCs with Windows 7, users reported that several reports and some forms that generate CSV files were producing incomplete data or blank files (no records, just headers).
Looking around, we found out that when we use a BETWEEN clause like this:
SELECT id, name, lastname FROM employee WHERE date_start BETWEEN :P_INIT_DATE AND :P_FINAL_DATE
the resulting file was blank or contained records with mismatched dates, so we deduced there was a problem between Windows 7's date handling and the Oracle database, or something else we don't know yet. We could work around all of this with a double conversion, TO_DATE(TO_CHAR(:P_DATE)). But now, when we generate a CSV file with Forms 10g using CLIENT_TEXT_IO.PUT_LINE, we're experiencing strange behavior: WebUtil starts writing the file, but when it reaches a certain number of lines it overwrites the same file, starting again at the beginning of the CSV. So when you open the file in Excel you only see the last X lines.
I would really appreciate any help fixing these problems. There is no specific question; I'm just explaining the problem we have, looking for help.
CLIENT_TEXT_IO caches records before writing them to your file. I've seen several different thresholds in the range you cite. If your Forms code issues a SYNCHRONIZE; every so many records written, the cache will be flushed on each SYNCHRONIZE. I'm not writing large files at the moment, but in the past 100 records per SYNCHRONIZE has worked well. Check your timings carefully; 100 may be too few records per SYNCHRONIZE. Since the number I've seen varies from shop to shop, I'd wager it's NOT related solely to the number of records, but to how many bytes you stuff into the cache.
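A minimal sketch of that pattern in Forms PL/SQL; the file path, query, and 100-record threshold below are illustrative values, not tested ones:

DECLARE
  out_file  CLIENT_TEXT_IO.FILE_TYPE;
  rec_count PLS_INTEGER := 0;
BEGIN
  out_file := CLIENT_TEXT_IO.FOPEN('C:\temp\export.csv', 'w');
  FOR rec IN (SELECT id, name, lastname FROM employee) LOOP
    CLIENT_TEXT_IO.PUT_LINE(out_file, rec.id || ',' || rec.name || ',' || rec.lastname);
    rec_count := rec_count + 1;
    IF MOD(rec_count, 100) = 0 THEN
      SYNCHRONIZE;  -- flush WebUtil's cache to the client-side file
    END IF;
  END LOOP;
  CLIENT_TEXT_IO.FCLOSE(out_file);
END;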
