ADO showing "hidden columns" with SQL Native Client - VB6

I'm working on a legacy application using VB6 and Classic ASP. We're using disconnected ADO recordsets to pass data back and forth. Normally this works fine. But what has started happening recently is that for any query with an inner/outer join, ADO includes the joined tables' key columns in the fields available to choose from. So when we specify a column to update (in the cases where it errors out, the primary key column), it in turn updates the wrong column (one with the same name from a joined table). I know it's normal for ADO to pull the primary keys of any joined tables, but by default ADO hides them. In our case ADO isn't hiding them.
What I've narrowed it down to is that the SQL Native Client driver is not behaving correctly. If I go back to the SQL Server driver (SQL 2000) it works great, but as soon as I switch back to SQL Native Client, it exhibits the behavior above. I've checked the properties on the open connection and the properties of the recordsets themselves; they match in every instance except one: the count of hidden columns (which makes sense, since SQL Native Client isn't hiding them).
I've tried everything from deleting the MSADC folder from IIS and re-adding it, to uninstalling SQL Native Client and reinstalling it (and subsequently upgrading it to the newest version). I've also recreated the ODBC connection several times in the process of troubleshooting. At this point I'm at a loss.
One more thing to add: SQL Native Client appears to work fine on our other servers, and no one else is having this issue. Does anyone have an idea of what could be happening? Thanks!
Edit: Example of what's happening (this occurs for any query with one or more joins of any kind; we're calling stored procedures, if that matters)
select temp_id, temp_value from temp_test
inner join another_table on another_table.temp_id = temp_test.temp_id
inner join yet_another_table on yet_another_table.another_id = another_table.another_id
This produces the following fields in the ADO recordset:
SQL Native Client
(0) temp_id
(1) temp_value
(2) temp_id (primary key of another_table)
(3) another_id (primary key of yet_another_table)
SQL Server driver
(0) temp_id
(1) temp_value
Running the query directly against SQL Server 2005 shows it as it should be: temp_id, temp_value
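For reference, here is roughly how we list the fields from VB6 to see the difference, along with the "Unique Table" dynamic property that is sometimes suggested so that batch updates only target a single base table. This is just a sketch, not a confirmed fix; the DSN, procedure, and table names are placeholders based on the example above.
' Sketch only (requires a reference to Microsoft ActiveX Data Objects).
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset
Dim i As Integer

Set cn = New ADODB.Connection
cn.Open "DSN=MyDsn"                          ' placeholder DSN

Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient              ' client cursor, as for disconnected use
rs.Open "exec dbo.my_proc", cn, adOpenStatic, adLockBatchOptimistic

' Sometimes-suggested workaround: pin edits/updates to one base table.
rs.Properties("Unique Table").Value = "temp_test"

' List every field ADO exposes. With SQL Native Client the joined tables'
' key columns show up here; with the SQL Server driver they do not.
For i = 0 To rs.Fields.Count - 1
    Debug.Print i & ": " & rs.Fields(i).Name
Next i

Set rs.ActiveConnection = Nothing            ' disconnect as usual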

It's not the issue described here, is it?
If a change in the connection string changes the behavior, I would suppose that you have two different schemas, and thus two versions of the same stored procedure; and the one that is executed with SQL Native Client is the incorrect one.

I have exactly the same scenario, and have had it for over a year on our servers and servers at our client. I never found a solution and as a result we simply have to use the SQL Server driver, which is a shame as SQL Native seems to connect significantly faster.
It's nothing to do with different schemas or different versions of the same stored procedure as suggested above. I use a file DSN, and simply changing the driver name (see the two connection strings below) changes the behaviour to/from that mentioned above. It seems to happen with all views (and probably stored procedures too, as indicated).
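For clarity, the only difference between the two cases is the driver name; everything else stays the same (the server and database names here are placeholders):
' Identical except for the driver name; this alone flips the behaviour described above.
Const CONN_SQL_SERVER = "Driver={SQL Server};Server=MyServer;Database=MyDb;Trusted_Connection=Yes"
Const CONN_NATIVE_CLIENT = "Driver={SQL Native Client};Server=MyServer;Database=MyDb;Trusted_Connection=Yes"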
If anyone does find a solution I'd be keen to hear about it.
Warwick

Related

Azure Data Studio "unknown property IsColumnSet"

I'm trying to use Azure Data Studio to connect to a SQL Server database.
I am able to connect to the database and view the list of objects (tables, stored procedures, etc.).
When I click to expand the "Columns" group inside a table, I get the following error from Azure Data Studio:
unknown property IsColumnSet
I am able to expand the Keys, Constraints, Triggers, Indexes, etc. but I can't expand the Columns.
I am able to right-click and query the table to see all the columns, but I'd like to know what to do to get the columns list expanded in the server side bar tree.
I tried uninstalling and reinstalling Azure Data Studio. No luck.
FWIW, while this error persists in the UI when drilling down a table to view the columns, you can still get this information via SQL like so:
select * from INFORMATION_SCHEMA.COLUMNS where TABLE_NAME = 'someTable';
Verified this works with ADS on Mac, connected to SQL Server 2005.
Apparently Azure Data Studio "only officially supports SQL Server 2014+", and this problem would seem to be due to lack of support for older servers.
https://github.com/Microsoft/azuredatastudio/issues/3774
So although they might improve the error message to indicate lack of support, I think otherwise we're out of luck. A long-overdue server upgrade is in order methinks...

JetBrains DataGrip cannot connect to SQL Server without specifying a database

I'm trying to use DataGrip as my primary SQL Server query tool, but I've hit a problem that I can't get past.
When I set up the project data source, I have to choose a database, otherwise it goes to the default tempdb. How can I work like in SQL Server Management Studio, where I can see the list of all databases?
I have tried both the jTDS and Microsoft drivers; neither works.
Or can I choose multiple databases? I don't want to create one data source per database.
If I connect to MySQL, it works as expected.
Thanks
It is possible to connect without specifying the database. Just leave this field blank. After entering the other settings (host, port, user, password), go to the Schemas tab in the Data Source properties and select all the databases and schemas you want to work with. Then invoke the Synchronize action for this Data Source in the Database tool window.
DataGrip 2016.2 EAP claims to have support for showing multiple databases. See https://blog.jetbrains.com/datagrip/ and search for "Database View". Unfortunately, as of the date I'm writing this, it doesn't seem to work at all.
Unfortunately, Andrey's suggestion did not work for me.

Linking Oracle tables to Access 2007 File (Performance)

I'm running into a problem while linking some tables and views present on an Oracle 11g database to an Access 2007 file.
I'm using the Oracle Client (SQORA32.DLL) version 11.02.00.03.
If the view/table returns a small amount of data, there's no problem. The problem happens when the view or table returns a "large" amount of data. I've tried increasing the buffer size on the driver (the default is 64000) to see if that helps. I've also removed the "Enable query timeout" option - otherwise I would get a "Query cancelled by user" or an "ODBC - Call Failed" error.
In order to link the tables/views, I've used the "native tool" (External Data -> ODBC Database -> Link to data source by creating a linked table).
I was wondering if I could retrieve the data from the tables/views using VBA. Sometimes, I (you should read "I" as "the users") may need to update data in some tables (control tables).
Please let me know your thoughts.
EDIT: Our goal with this project was to migrate from SQL Server 2005 to Oracle 11gR2. After analyzing the behaviour of the Access files against SQL Server, I've concluded that the results are displayed like a "cursor": if you scroll down in the result window, it loads more.
I think that this may be the issue because, AFAIK, Oracle (driver, maybe?) pulls everything from the DB and, only then, populates MS Access.
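In case the VBA route is acceptable, here is a rough sketch of pulling a view through ADO with a forward-only, read-only cursor instead of a linked table; the DSN, credentials, and view name are placeholders:
' Rough sketch: read an Oracle view over the existing ODBC DSN with a
' forward-only, read-only cursor, processing rows as they stream in.
Sub PullOracleView()
    Const adOpenForwardOnly = 0
    Const adLockReadOnly = 1

    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "DSN=MyOracleDsn;UID=myuser;PWD=secret"   ' placeholder DSN and credentials

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM my_schema.my_view", cn, adOpenForwardOnly, adLockReadOnly

    Do Until rs.EOF
        ' ... write rs.Fields(...) into a local table, or process the row here ...
        rs.MoveNext
    Loop

    rs.Close
    cn.Close
End Sub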
It's a long time after this, so here goes the solution: MS Access has a flag on the ODBC connection, "Treat Float as Numeric". That did the trick.

How to retrieve Oracle server collation in order to set up linked server

After searching here and on the web, I finally decided to post the question. I am running an SQL 2000 server, and linked an Oracle 9i server. Everything works fine when I run queries, and even updates and inserts from and to the Oracle Linked server (using both Microsoft OLE DB driver and Oracle OLE DB driver) using the OPENQUERY approach.
The problem is that, in order to clean code a bit, I want to use four part names in my queries. I am doing this also when querying other SQL linked servers.
But when I run the queries against Oracle using four part names, I get this error (I am translating the error message from Spanish; the original message in English is probably different):
ERROR: OLE DB 'MSDAORA' returned an invalid column definition.
Error Code: 7318
Digging a bit, I learned that this is probably related to not having the right collation name set in the linked server properties.
Now... I am not an Oracle expert, so I need to find out what collation the schema I connect to in Oracle (apps... yes, I know, I know...) is using.
So, the plain question is... How do I find out what collation Oracle is using? I have access to the Oracle server via Toad... is there any query I can run in order to find this out?
Thanks!
I think you're looking for the NLS settings, which you can find from these views:
V$NLS_PARAMETERS — "Current values"
NLS_DATABASE_PARAMETERS — What the database was created with.
NLS_INSTANCE_PARAMETERS — From changes by ALTER SYSTEM
NLS_SESSION_PARAMETERS — Combined, plus ALTER SESSION
V$PARAMETER — System parameters, where a lot of this is changed (contains all kinds of stuff)
The documentation is rather lacking here (or, quite likely, I couldn't find the right document), but this should get you what you want; there's an example query at the end.
You will probably have to log in as SYSDBA to read some of these views.
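For example, the character set and sort settings can be read with a simple SELECT against NLS_DATABASE_PARAMETERS. You can run it directly in Toad, or from VB/ADO as in this rough sketch (the connection string is a placeholder):
' Rough sketch: list the database character set / sort settings relevant
' to the linked server's collation mapping. Connection details are placeholders.
Sub ShowOracleNls()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=OraOLEDB.Oracle;Data Source=MyTns;User Id=apps;Password=secret"

    Set rs = cn.Execute( _
        "SELECT parameter, value FROM nls_database_parameters " & _
        "WHERE parameter IN ('NLS_CHARACTERSET','NLS_NCHAR_CHARACTERSET','NLS_SORT')")

    Do Until rs.EOF
        Debug.Print rs.Fields("parameter").Value & " = " & rs.Fields("value").Value
        rs.MoveNext
    Loop
End Sub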

Linq to Sql Update not working

I have somewhere around 20 tables that I am working with. I can update the User table just fine, however, when I try to update my Address table, nothing happens. I don't receive an exception and the method looks like it executes ok but when I check my data, the values are still the same.
I'm thinking that it has to do with the fact that I moved my database out from under a server and onto my local SQL lite instance. I did change the connection strings in the config and thought that would take care of the problem (as I stated, I can still select from all of the tables using LINQ). Has anyone encountered this before or have some idea of what might be going on?
Edit 1 - I'm not very familiar with relocating databases with LINQ. I do know that SqlMetal, when I run it, removes all of the customization that I have done inside of my data context. Does just changing the connection in the config work, or do I actually have to run SqlMetal every time the DB moves (the structure doesn't change)?
When I did this, I had to hand-modify the constructor in the file that Visual Studio generates for your instance of the database so that it uses a different connection string. I had the same issue as you did, and this fixed the problem for me.
