PDO cannot find ODBC driver - Oracle

Trying to migrate to PDO for an ODBC connection to a remote database.
My system is Windows 7 Pro, with a FortiClient VPN connection to the remote domain, and Tunnel Mode connected. (Added in edit: Running PHP 5.4, so PDO should be installed by default.)
The data source configuration in the Windows 7 ODBC Data Source Administrator is as follows (some names changed for security):
Under System DSN:
System Data Source Name = 'TheBigDB'
Driver = 'Oracle in instantclient_11_2' (64-bit)
TNS Service Name = '10.10.1.20:1521/BIGDB'
UserID = 'BigDBUser'
All other settings are defaults for installation of driver
The following piece of non-PDO PHP code is able to create a connection usable for queries:
if (!($myConn = odbc_connect('TheBigDB', 'BigDBUser', 'myPwd'))) {
    echo "No ODBC connection<br />";
}
So I can connect to the database. The problem is doing it in PDO. The following PHP...
try {
    $odbcConn = new PDO('odbc:Driver={Oracle in instantclient_11_2};Server=10.10.1.20:1521;Database=BIGDB;Uid=BigDBUser;Pwd=myPwd');
} catch (PDOException $e) {
    echo 'PDO connection failed: ' . $e->getMessage();
}
... results in a "PDO connection failed: could not find driver" message. My first try relied more heavily on the Data Sources Administrator; it looked like this:
try {
    $odbcConn = new PDO('MyBigDB','BigDBUser','myPwd');
} catch (PDOException $e) {
    echo 'PDO connection failed: ' . $e->getMessage();
}
Same thing -- could not find driver.
So here's my question: Given a Windows 7 system with ODBC through a VPN connection to an Oracle database that all seems to work without PDO, how do I migrate the connection parameters to create a PDO connection? Maybe a better question is why mess with PDO for the connection, but I keep reading that PDO is more secure, so I'm trying to use it.

More digging got me an answer. The correct syntax for creating a PDO object that uses a Windows ODBC Data Source connection is not what I showed, but instead this (minus the try/catch):
$myConn = new PDO('odbc:TheBigDB','BigDBUser','myPwd');
BUT... with PHP 5.4.12 using ODBC to touch an Oracle 11 DB, you need to edit the correct php.ini file (see this WampServer forum topic for a discussion of which of the three php.ini files you need to edit). Under Dynamic Extensions, uncomment extension=php_pdo_odbc.dll and extension=php_pdo_oci.dll (I did both; I'm not sure which one did the trick).
Another tip: in my scripts, neither odbc_connect() nor new PDO() worked for me unless the connection was created in the Windows ODBC Data Source Administrator under the System DSN tab; it did not work under User DSN.

Related

Teradata Oracle Wire Protocol ODBC connection test error, specified driver could not be loaded

I am trying to test my ODBC connection in Teradata on UNIX using the adhoc sample file:
teradata/client/17.10/odbc_64/samples/C/adhoc
When trying to connect to the Oracle DSN, I get the following error:
adhoc: (SQL Diagnostics) STATE=IM003, CODE=0, MSG=[DataDirect][ODBC lib] Specified driver could not be loaded
No driver entry found under the DSN:
I have the following entries in odbc.ini:
[BIC_DEV_ORA_TW]
Description=Teradata 8.0 Oracle Wire Protocol
Driver=/opt/teradata/client/17.10/tbuild/odbc/lib64/_Tora28.so
AlternateServers=
ApplicationUsingThreads=1
AccountingInfo=
Action=
ApplicationName=
ArraySize=60000
AuthenticationMethod=1
BulkBinaryThreshold=32
BulkCharacterThreshold=-1
BulkLoadBatchSize=1024
BulkLoadFieldDelimiter=
BulkLoadRecordDelimiter=
CachedCursorLimit=32
CachedDescLimit=0
CatalogIncludesSynonyms=1
CatalogOptions=0
ClientHostName=
ClientID=
ClientUser=
ConnectionReset=0
ConnectionRetryCount=0
ConnectionRetryDelay=3

How to install JDBC driver on Databricks Cluster?

I'm trying to get the data from my Oracle Database to a Databricks Cluster. But I think I'm doing it wrong:
In the cluster libraries I just installed ojdbc8.jar, and then I opened a notebook and did this to connect:
CREATE TABLE oracle_table
USING org.apache.spark.sql.jdbc
OPTIONS (
  dbtable 'table_name',
  driver 'oracle.jdbc.driver.OracleDriver',
  user 'username',
  password 'password',
  url 'jdbc:oracle:thin://@<hostname>:1521/<db>')
And it says:
java.sql.SQLException: Invalid Oracle URL specified
Can someone help? I've been reading documentation, but there are no clear instructions on how I should actually install this jar step by step. Might I be using the wrong jar? Thanks!
I have managed to set this up in Python/PySpark as follows:
jdbcUrl = "jdbc:oracle:thin:@//hostName:port/databaseName"
connectionProperties = {
    "user" : username,
    "password" : password,
    "driver" : "oracle.jdbc.driver.OracleDriver"
}
query = "(select * from mySchema.myTable )"
df = spark.read.jdbc(url=jdbcUrl, table=query, properties=connectionProperties)
I am using the Oracle JDBC Thin Driver instantclient-basic-linux.x64-21.5.0.0.0, as available on the Oracle webpages. The current version is 21.7 I think, but it should work the same way.
Check this link to understand the two different notations for JDBC URLs.
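If you still want to query the data with SQL, as in the original CREATE TABLE attempt, one option is to register the DataFrame as a temporary view. A minimal sketch, assuming the df created by the spark.read.jdbc call above; the view name and the example query are just illustrations:
# Register a temporary view so the JDBC data can be queried with Spark SQL in the notebook.
df.createOrReplaceTempView("oracle_table")
# Example query against the registered view.
spark.sql("SELECT COUNT(*) FROM oracle_table").show()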

RobotFramework Oracle DB Connection Issue

I'm trying to check if the Oracle DB connection works, using the keyword:
Connect To Database Using Custom Params
The following libraries are imported:
Database Library
JayDeBe API
This is the connection string used:
'oracle.jdbc.driver.OracleDriver', 'jdbc:oracle:thin:@//DBHostname:Port/DBName', ['user', 'pass']
We do not get any response that can identify if the connection is established or rejected.
We see this message in RIDE:
'oracle.jdbc.driver.OracleDriver', 'jdbc:oracle:thin:@//DBHostName:Port/DBName', ['user', 'pass']
20170927 17:07:54.438 : INFO : Executing : Connect To Database Using Custom Params : jaydebeapi.connect(db_api_2.connect('oracle.jdbc.driver.OracleDriver', 'jdbc:oracle:thin:@//DBHostName:Port/DBName', ['user', 'pass']))
Can someone please help us?
We found the solution. In our case we have Python 2.7 (32-bit), so we need the following:
1. Oracle Instant Client Downloads for Microsoft Windows (32-bit). You can download it from here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
2. Set the path to the Oracle Instant Client in the environment variables.
3. Download the cx_Oracle API (32-bit: cx_Oracle-6.0.2-cp27-cp27m-win32.whl (md5)) from: https://pypi.python.org/pypi/cx_Oracle/
4. Open a command prompt and run: pip install cx_Oracle --upgrade
5. Open RIDE and use the following keyword:
Connect To Database Using Custom Params cx_Oracle 'user', 'password', 'DBHOSTNAME:PORT/DBNAME'
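For reference, that DatabaseLibrary keyword essentially passes its trailing arguments on to the driver's connect() call. A rough plain-Python sketch of the equivalent cx_Oracle call, using the same placeholders as in the keyword above (the test query is just an illustration):
import cx_Oracle
# Roughly what "Connect To Database Using Custom Params cx_Oracle ..." does under the hood:
# the positional arguments are user, password, and an EZConnect-style DSN string.
conn = cx_Oracle.connect('user', 'password', 'DBHOSTNAME:PORT/DBNAME')
cursor = conn.cursor()
cursor.execute("SELECT 1 FROM dual")  # simple query to confirm the connection works
print(cursor.fetchone())
cursor.close()
conn.close()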

cx_Oracle connection with Python 3.5

I've made several attempts to connect to the Oracle DB but am still unable to connect. The following is my code. However, I can connect to the Oracle DB through the terminal like this:
$ sqlplus64 uid/passwd@192.168.0.5:1521/WSVC
My environment: Ubuntu 16.04 / 64-bit / Python 3.5
I would appreciate any knowledge or experience you can share about this issue. Thank you.
import os
os.chdir("/usr/lib/oracle/12.2/client64/lib")
import cx_Oracle
# 1st attempt
ip = '192.168.0.5'
port = 1521
SID = 'WSVC'
dsn_tns = cx_Oracle.makedsn(ip, port, SID)
# dsn_tns = cx_Oracle.makedsn(ip, port, service_name=SID)
db = cx_Oracle.connect('uid', 'passwd', dsn_tns)
cursor = db.cursor()
-------------------------------------------------
# 2nd attempt
conn = "uid/passwd#(DESCRIPTION=(SOURCE_ROUTE=OFF)(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=192.168.0.5)(PORT=1521)))(CONNECT_DATA=(SID=WSVC)(SRVR=DEDICATED)))"
db = cx_Oracle.connect(conn)
cursor = db.cursor()
------------------------------------------------------
# ERROR Description
cx_Oracle.InterfaceError: Unable to acquire Oracle environment handle
The error "unable to acquire Oracle environment handle" is due to your Oracle configuration being incorrect. A few things that should help you uncover the source of the problem:
when using Instant Client, do NOT set the environment variable ORACLE_HOME; that should only be set when using a full Oracle Client or Oracle Database installation
the value of LD_LIBRARY_PATH should contain the path which contains libclntsh.so; the value you selected looks like it is incorrect and should be /usr/lib/oracle/12.2/client64/lib instead
you can verify which Oracle Client libraries are being loaded by using the ldd command as in ldd cx_Oracle.cpython-35m-x86_64-linux-gnu.so
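Putting those points together, here is a minimal connection sketch, assuming the Instant Client libraries live in /usr/lib/oracle/12.2/client64/lib (the path from the question) and that WSVC is a service name rather than a SID; the script name and the sanity-check query are just illustrations:
# Run with ORACLE_HOME unset and the Instant Client directory on the library path, e.g.:
#   LD_LIBRARY_PATH=/usr/lib/oracle/12.2/client64/lib python3 connect_test.py
import cx_Oracle

# Build the DSN; use service_name= if WSVC is a service name, or sid= if it is a SID.
dsn = cx_Oracle.makedsn('192.168.0.5', 1521, service_name='WSVC')

db = cx_Oracle.connect('uid', 'passwd', dsn)
cursor = db.cursor()
cursor.execute("SELECT * FROM v$version")  # quick sanity check
print(cursor.fetchone())
cursor.close()
db.close()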

Not able to connect to Hive on AWS EMR using Java

I have set up an AWS EMR cluster with Hive. I want to connect to the Hive Thrift server from my local machine using Java. I tried the following code:
Class.forName("com.amazon.hive.jdbc3.HS2Driver");
con = DriverManager.getConnection("jdbc:hive2://ec2XXXX.compute-1.amazonaws.com:10000/default","hadoop", "");
As mentioned in the developer guide (http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/HiveJDBCDriver.html), I added the jars related to the Hive JDBC driver to the class path.
But I am getting an exception when trying to get the connection.
I was able to connect to the Hive server on a plain Hadoop cluster using the above code (with a different JDBC driver).
Can someone please suggest what I am missing?
Is it possible to connect to the Hive server on AWS EMR from a local machine using Hive JDBC?
(Merged Answer from the comments)
Hive is running on port 10000, but only locally; you have to create an SSH tunnel to the EMR master node.
The following is from the documentation for Hive 0.13.1.
Create Tunnel
ssh -o ServerAliveInterval=10 -i path-to-key-file -N -L 10000:localhost:10000 hadoop@master-public-dns-name
Connect to JDBC
jdbc:hive2://localhost:10000/default
You can set up the tunnel from code using the JSch library:
public static void portForwardForHive() {
    try {
        if (session != null && session.isConnected()) {
            return;
        }
        JSch jsch = new JSch();
        jsch.addIdentity(PATH_TO_SSH_KEY_PEM);
        String host = REMOTE_HOST;
        session = jsch.getSession(USER, host, 22);
        // username and password will be given via UserInfo interface.
        UserInfo ui = new MyUserInfo();
        session.setUserInfo(ui);
        session.connect();
        int assignedPort = session.setPortForwardingL(LPORT, RHOST, RPORT);
        System.out.println("Port forwarding done for the port: " + assignedPort);
    } catch (Exception e) {
        System.out.println(e);
    }
}
Not sure if you've resolved this yet, but it's a bug in EMR that has just bitten me.
For direct JDBC connectivity like you are doing, you must include the JDBC drivers in your shaded uber-jar. For JDBC access from within DataFrames, you cannot use the jar in your uber-jar (another unrelated bug); instead, you must specify it on the command line (S3 is a convenient place to keep them):
--files s3://mybucketJAR/postgresql-9.4-1201.jdbc4.jar
However, even after this you will run into another problem if you are specifically trying to access Hive. Amazon has built its own JDBC drivers with a different class hierarchy from the normal Hive driver (com.amazon.hive.jdbc41.HS2Driver); however, the EMR cluster includes the standard Hive JDBC driver in its standard path (org.apache.hive.jdbc.HiveDriver).
This is automatically registered as being capable of handling the jdbc:hive and jdbc:hive2 URLs, so when you try to connect to a Hive URL it finds this one first and uses it, even if you specifically register the Amazon one. Unfortunately, this one is not compatible with Amazon's EMR build of Hive.
There are two possible solutions:
1: Find the offending driver and unregister it:
Scala example:
val jdbcDrv = Collections.list(DriverManager.getDrivers)
for (i <- 0 until jdbcDrv.size) {
  val drv = jdbcDrv.get(i)
  val drvName = drv.getClass.getName
  if (drvName == "org.apache.hive.jdbc.HiveDriver") {
    log.info(s"Deregistering JDBC Driver: ${drvName}")
    DriverManager.deregisterDriver(drv)
  }
}
Or
2: As I found out later, you can specify the driver as part of the connect properties when you attempt to connect:
Scala example:
val hiveCredentials = new java.util.Properties
hiveCredentials.setProperty("user", hiveDBUser)
hiveCredentials.setProperty("password", hiveDBPassword)
hiveCredentials.setProperty("driver", "com.amazon.hive.jdbc41.HS2Driver")
val conn = DriverManager.getConnection(hiveDBURL, hiveCredentials)
This is a more "correct" version as it should override any preregistered handlers even if they have completely different class hierarchies.
