We are migrating an Oracle database from 12c on Windows to an 18c installation on Linux. Some of the columns (derived columns) in views have become Unicode and are no longer usable by downstream SSIS. Any reason why these views are giving errors in SSIS? SSIS is complaining that in 18c the columns have become Unicode.
12c view definition: [screenshot in original post]
18c view definition: [screenshot in original post]
I think your problem matches this:
Oraoledb: Cannot Convert Between Unicode And Non-Unicode String Data Types (Doc ID 960508.1)
This is due to a difference in the metadata reported by the OleDb provider depending on whether the NLS_LANG environment variable is set. Typically this behaviour is observed when the SSIS package is developed in an environment that has NLS_LANG set and then deployed to an environment that does not have it set. The difference in metadata results in the error, and is being investigated under Oracle bug number 7836009.
To resolve this issue, set NLS_LANG to the same setting as the box the package was developed on, which results in the same metadata being reported in both cases. The NLS_LANG environment variable should be set in the registry under
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\KEY_<homename>\NLS_LANG
Export the NLS_LANG variable in Linux with the same value as the original database
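For example (a sketch only, assuming the original database character set is WE8MSWIN1252; substitute whatever your source database actually uses), on the Linux box:
# NLS_LANG must match the character set of the original database (example value only)
export NLS_LANG=AMERICAN_AMERICA.WE8MSWIN1252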
The behavior can also be worked around by refreshing the SSIS package after deployment, which will refresh the metadata.
Hope it helps.
Regards.
Related
A coworker made an SSIS package that pulls data from Oracle and transfers it to a nearly identical SQL Server database, so it is mostly data flow tasks with an OLE DB Source (Oracle) feeding an OLE DB Destination (SQL Server). When I open it on my computer, I get the error "Column [column name] cannot convert between unicode and non-unicode string data types" on all the source tasks. If I add a data conversion task to convert the Unicode columns to non-Unicode, all works fine, but I really want this to work the way he has it, because it runs like that on the production server. My best guess is it has to do with the install of the Oracle client or drivers, or the NLS_LANG variable, but I can't seem to solve it. My environment variable NLS_LANG = AMERICAN_AMERICA.WE8ISO8859P1
I was worried something went wrong with my Oracle client installation because of my registry values. Now I have 3 clients installed since I went through the install process again. These are the third client's reg values and I added NLS_LANG myself and rebooted. I'm more of a SQL Server developer, so I'm possibly saying something wrong here.
The solution was to set the NLS_LANG environment variable value to AMERICAN_AMERICA.WE8MSWIN1252 to match what my coworker had and what my registry has, because I somehow didn't notice they were different! However, neither NLS_LANG was set to start with, so the real solution was to add this in. I rebooted, and when I reopened the package I got zero errors.
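For reference, a minimal sketch of setting that value machine-wide from an elevated command prompt (the value is the one from this answer; you need to reopen BIDS/SSDT and restart any services afterwards):
rem make NLS_LANG a system-wide environment variable
setx NLS_LANG AMERICAN_AMERICA.WE8MSWIN1252 /M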
My package runs fine from both my desktop and my ETL server when I RDP into it. However, when running as part of a job, I get the following error on all my string columns: "Error: Column "***STATUS" cannot convert between unicode and non-Unicode string data types."
The error occurs on an OLE DB Command component that updates a table in an Oracle database. None of my columns on either the SQL/SSIS side nor the Oracle side are Unicode. Here's the metadata directly leading into my OLE DB Command component.
I verified that the External Columns on the OLE DB Command component in question exactly match that metadata. I've also tried explicitly converting the columns to Unicode before inserting in case they were Unicode (I know they're not) on the Oracle side, but that leads to a hard error (red X) and the same message.
Here's the Oracle schema:
Command:
Anyone have any idea on how to get this to run from the agent?
Based on the following Oracle support note:
Oraoledb: Cannot Convert Between Unicode And Non-Unicode String Data Types
The cause of this error message is:
Developing an SSIS package that uses the Oracle OLEDB Provider on a 32 bit operating system and then deploying to a 64 bit SQL Server installation
Possible Workarounds
Note that I haven't tested these workarounds myself.
(1) Try running the package in 32-bit mode:
From Visual Studio
Go to Project properties >> Debugging >> Run64BitRuntime = False
From SQL Agent
Check the following link; see also the dtexec sketch after workaround (2) below:
SSIS Package Not Running as 32bit in SQL Server 2012
(2) Install Oracle x64 oledb provider
64-bit Oracle Data Access Components (ODAC) Downloads
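As a follow-up to workaround (1): if the Agent job step can't use the "Use 32 bit runtime" option, one hedged alternative is to call the 32-bit dtexec directly from a CmdExec step (the paths below assume a default SQL Server 2012 install and a hypothetical package location; adjust both):
rem the 32-bit dtexec lives under "Program Files (x86)" on a 64-bit machine
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /FILE "C:\SSIS\MyPackage.dtsx"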
On the OLE DB Source ensure you have delay validation set to true.
I set the AlwaysUseDefaultCodePage property to False and kept the code page as 1252.
I have tried this with both 32-bit and 64-bit.
You also need to set the ValidateExternalMetadata flag to False.
We have an SSIS package downloading data from an Oracle database to a SQL Server data warehouse. For this data warehouse, several environments are set up: Development, Test and Production. Dev and Test share a machine; Prod is stand-alone.
When the SSIS package is run on the PROD machine, it downloads the Varchar2 columns from our Oracle source database to MSSQL in DT_WSTR format and saves this to an NVarchar column, i.e. all steps involved support Unicode.
When this same package is run against the same source database on the DEV/Test box, it somehow sees the external columns as being Varchar, derives this to DT_STR in the data flow, and refuses to store this in an NVarchar column.
All OSes are Win2K8R2 with MSSQL 2008 64-bit. The package is run in 32-bit mode; the same behaviour is seen when run from BIDS or from SQL Agent.
Anyone care to guess why? I've already seen the suggestion to disable validating external metadata (https://stackoverflow.com/a/18383598/2903056), but that's not a practical suggestion for our situation.
An old question I know, but seems to still be relevant. And since I could not find a suitable answer in the last 3 months I have been searching, I figure now is as good a time as any to post my findings.
I have had the same curious behaviour and have finally been able to resolve it.
My layout looked like this:
Oracle 10g R2 database on Windows 2003 Server (let's call it ORA)
Dev machine with Windows 8, Visual Studio 2012 + SSDT, SQL Express 2012, ODT 12.1.0.21 (let's call that DEV)
SQL Server 2012 on Windows 2012 Server, Oracle Client 11.2 (let's call that TEST)
Both DEV and TEST were connecting to ORA. DEV was reporting VARCHAR2 columns as DT_WSTR while TEST would insist that they are DT_STR.
I then installed ODT 12.1.0.21 on TEST and the problem was solved. Notably, I used the "machine wide" option during the install. I am not sure how much of an impact that had.
There seems to be a difference in the datatypes that are returned by the Oracle OleDb providers across the different versions of the client side components.
Check the value of the NLS_LANG in the registry.
reg query HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ORACLE\KEY_<orahome> /f NLS_LANG
If it matches the server's character set, OraOLEDB will use the regular (non-Unicode) datatype DBTYPE_STR; otherwise it uses the Unicode datatype DBTYPE_WSTR.
If the NLS_LANG value is missing, it defaults to US7ASCII, which almost certainly will not match your database, and you will be using Unicode datatypes.
To get the server's characterset, do:
SELECT parameter, value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';
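As a hedged example, if that query returned WE8MSWIN1252 and your client home key were KEY_OraClient11g_home1 (both are placeholders; use your own values), making the client match could look like:
rem set NLS_LANG on the client to the character set the server reported
reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ORACLE\KEY_OraClient11g_home1" /v NLS_LANG /t REG_SZ /d "AMERICAN_AMERICA.WE8MSWIN1252"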
Check the ValidateExternalMetadata property value; if it is True, set it to False.
We were handed a dump file by another team who have gone away for the rest of the year. When we try to import the dump into our own database, we get the following error:
IMP-00038: Could not convert to environment character set's handle
Upon some research, our speculation is that we have a mismatch between the NLS_LANG setting of the source machine and our local machine. We currently don't have any means to check what the value of NLS_LANG is on the source machine.
So, having just a dump file in our hands, is there a way to figure out the NLS_LANG value with which the export was done? From the looks of it, we should be able to override the NLS_LANG environment variable before running the import client (imp).
Another thing is, the dump was done from an 11g instance and our imp version is 10. I read that imp is not forward compatible. Could this be the issue here (instead of the NLS_LANG mismatch)?
Ates, try impdp - sometimes that could help :-)
The easiest way on Unix is:
#>imp username/password file=test.dmp show=y
Import: Release 10.2.0.3.0 - Production on Fri Nov 26 08:38:47 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Export file created by EXPORT:V10.02.01 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
The exp/imp version mismatch is a problem:
I usually use the V10 export program and make it connect to the V11 database.
Make sure you have an alias for dev11 in your tnsnames.ora in dev10's oracle_home.
hostname{oracle}# . oraenv
ORACLE_SID = [oracle] ? dev10
hostname{oracle}#
hostname{oracle}#>exp username/password@dev11 full=y file=dump.exp
Even though the file is a binary garble, there are some human-readable text excerpts. I saw the following strings in there, and I think this answers my question:
<CHARSET>AL32UTF8</CHARSET><NCHARSET>AL16UTF16</NCHARSET>
...
NLS_LANGUAGE='AMERICAN' NLS_TERRITORY='AMERICA'
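Based on those strings, one common approach (a sketch, reusing the file name from the show=y example above; adjust credentials and file name to your own) is to point NLS_LANG at the character set recorded in the dump before running the real import:
# match the client character set to what the dump reports, then import
export NLS_LANG=AMERICAN_AMERICA.AL32UTF8
imp username/password file=test.dmp full=y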
Another thing is, the dump was done from an 11g instance and our imp version is 10. I read that imp is not forward compatible. Could this be the issue here (instead of the NLS_LANG mismatch)?
A: You're right; you cannot import a dump file created by a newer Oracle Client when the Oracle Client of your Target Oracle Database is older.
Although it is not recommended, you can export a newer Source Oracle Database (i.e. 10g+) with an older Oracle Client (i.e. 10g), provided you use that same Oracle Client version to import into your older Target Oracle Database (i.e. 10g).
Assumption: the Oracle Client version of your Source Database is the same as, or newer than, your Target Oracle Database version. Note that mixing the Data Pump (11g) and imp (10g and earlier) import utilities does not work.
An interesting link: Using Different Releases and Versions of Export
Maybe it was exported using expdp. Try impdp; that is what I found when searching Google, and it truly worked for me for this same issue.
We are in the process of moving from the Microsoft .NET Oracle driver to the ODP.NET driver.
One of the problems we have had is this error:
ORA-12705: Cannot access NLS data files or invalid environment specified
We were able to stop the error by modifying the registry and changing the setting (see this question)
In our case we changed
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\NLS_LANG
which was set to NA
to be the same as
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\HOME0\NLS_LANG
which was set correctly.
My question is why would there be different NLS_LANG settings in the registry, and might there be any knock on effects of changing this value?
Update:
I've just found in the Oracle NLS FAQ the following
For Oracle version 7:
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE
For Oracle Database versions 8, 8i and 9i:
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\HOMEx\
where "x" is the unique number identifying the Oracle home. HOME0 is the first installation.
For Oracle Database 10g:
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\KEY_<oracle_home_name>
There you have an entry with the name NLS_LANG.
OK, so there are different registry settings for different versions...
Note:
Some people are confused by finding an NLS_LANG set to "NA" in HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE when no version 7 was installed. This is used for backwards compatibility, and can be ignored.
I have Oracle 9i, so now I'm even more confused - why is the ODP.NET dll looking at the Oracle 7 registry setting?
I had a similar problem with the
ORA-12705: Cannot access NLS data files or invalid environment specified
error. The ODP.NET DLLs / instant client were reading the registry value
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\NLS_LANG=NA
The value NA caused the error.
This was because I already had a client installation, but I wanted to use the Oracle instant client via a network drive for a VB.NET app with ODP.NET.
My simple fix in my VB.NET solution was, for example, to adjust the environment for the application via:
' Set NLS_LANG for this process only, leaving the registry untouched
Environment.SetEnvironmentVariable("NLS_LANG", _
    "AMERICAN_AMERICA.WE8MSWIN1252", _
    EnvironmentVariableTarget.Process)
NB: the Oracle "NLS FAQ" link is no longer valid (2012).
Per Oracle notes on the 11g ODP release, the following can cause this error:
HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\NLS_LANG=NA <--- this value does in fact cause the error when set to NA.
You can try DELETING the key if not needed or setting it to a valid NLS_LANG setting for your locale.
For us we set it to AMERICAN_AMERICA.WE8MSWIN1252.
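In sketch form, from an elevated prompt (pick one of the two options; the value shown is the one we used, yours may differ):
rem either remove the stray value entirely...
reg delete HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE /v NLS_LANG /f
rem ...or overwrite it with a valid locale
reg add HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE /v NLS_LANG /t REG_SZ /d AMERICAN_AMERICA.WE8MSWIN1252 /f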
In our case we did not want to make any potentially breaking changes to the Oracle registry because we were installing our web service on a production Oracle 9i server.
The solution was simply to prevent ODAC from being able to see any ORACLE registry keys by denying all access to that key for the user ID our web service was running as.
Start -> Run... regedit (as an administrator)
Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE
Right click on the ORACLE key -> Permissions...
Click the Add... button.
Add the web service user name configured in your web service's application pool identity (e.g. IUSR_MyWebService); this is the user name that appears against your w3wp.exe process in Task Manager.
Press OK.
For the new user permissions, check "Deny" against the Full Control permission and press OK.
This worked just fine and as a bonus we have ensured that our application is isolated from any future changes to the ORACLE registry keys.
Tip: you can prove to yourself that the user in question has no access to the keys in question by closing any running instances of the Registry Editor, start a CMD prompt as that user (using Run As...) and then launching regedit from the command prompt.
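Roughly, assuming the app pool runs as IUSR_MyWebService (substitute your own account, which must be allowed to log on this way):
rem open a command prompt as the web service account...
runas /user:IUSR_MyWebService cmd
rem ...then launch regedit from inside that prompt and try to open the ORACLE key
regedit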
The Oracle Client (ORACLE_HOME\bin\ora*.dll) looks for a file named "oracle.key" in the same directory. This file contains the name of the registry key which belongs to this Oracle client installation (e.g. "Software\ORACLE\HOME3").
hth
Andreas
This was all resolved in the end by installing the ODAC 11 client components (downloaded from the Oracle website). I think the system was getting confused because we had copied the ODAC dlls across rather than fully installing the client. ODP.NET is expecting an Oracle 11 client and didn't know where to find the Oracle Home.
NB: if you are installing the ODAC components using xCopy deployment, do not install them to an existing Oracle Home directory (e.g. c:\oracle\ora92 for the 9i client). This causes a 'Provider is not compatible with the version of Oracle Client' error.