I am new to ESB and have the Talend Runtime installed on my local machine, and I am able to run it partially. By partially I mean the child job of the ESB service does everything EXCEPT perform an insert into an Oracle database. The error is the following:
Exception in component tOracleConnection_1
java.lang.ClassNotFoundException: oracle.jdbc.OracleDriver cannot be found
I'm a bit confused because, prior to attempting the Oracle connection, the job successfully connects to a Greenplum database. So my question is: how do I point the ESB job, or the child job itself, to the connection driver needed for Oracle?
I've found this link https://help.talend.com/reader/AskO0G1x~W7LBNnA0laezg/qEeVYD~sI3lhozl8PjJxRA and have attempted both the "bundle" method and the "simple copy to the deploy folder" method, with no success.
Any help will be appreciated!
Thanks.
You can copy the required jars into [talend install directory]\esb\ContainerESB\lib\ext.
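If you want to confirm the driver is visible before rerunning the job, a minimal sketch (the class name is taken from the error above):

public class OracleDriverCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException here if the ojdbc jar is not on the classpath
        Class.forName("oracle.jdbc.OracleDriver");
        System.out.println("oracle.jdbc.OracleDriver is on the classpath");
    }
}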
I'm trying to connect to my on-premises Oracle database in order to migrate and copy some tables over to Azure SQL, but I am not able to, despite making sure all the connection parameters match the values provided in tnsnames.
Am I missing something? The error says the socket is closed, but I haven't found any useful information beyond this prior issue, which doesn't contain a solution. I currently use Oracle 11.2.0.3, so the ADF connector should support this version.
Not sure what else I need to check. Any thoughts would be greatly appreciated!
Your screenshot shows you are using the AutoResolveIntegrationRuntime, but since your Oracle db is on-premises you need a Self-hosted Integration Runtime (SHIR), as per this article. You would still need a SHIR for an IaaS Oracle db. Ideally the SHIR should be 'close' to the data source, so probably on-premises in the same network.
Do you have any proxy or firewall configured?
Have you tried creating the linked service and then testing the connection? It has sometimes happened to me that the connection test failed for a new linked service, but after creating it and retesting, the connection succeeded ...
I have SonarQube version 5.6 installed with an RDS PostgreSQL DB connected to it on AWS. I have had this setup for a long time, and many projects run on a daily schedule in SonarQube without any issues or errors. But it looks like my database configuration may not be correct, because when I look into the database I don't see much activity or anything stored there. I have updated the conf/sonar.properties file with the database endpoint and credentials, and it looks like it's connected. How can I make sure of this, i.e., that my database is actually being used by SonarQube?
I ask because the SonarQube documentation says that no database is required after version 5.2.
Can someone please explain the architecture to me? What is the right way to set this up?
I am getting the following error:
INFO web[o.sonar.db.Database] Create JDBC data source for jdbc:postgresql:sonarprod.cyfa9ycgfky0.us-east-1.rds.amazonaws.com
2017.02.24 19:54:03 ERROR web[o.a.c.c.C.[.[.[/]] Exception sending context initialized event to listener instance of class org.sonar.server.platform.PlatformServletContextListener java.lang.IllegalStateException: Can not connect to database. Please check connectivity and settings (see the properties prefixed by 'sonar.jdbc.').
I have checked the connection string, username, and password; everything looks correct to me, and the specific ports are open for communication. What does this error mean? What am I missing?
Thanks in Advance.
You can see which database SonarQube is using by having a look at the server's log. (As for "no database required after 5.2": since that version it is only the scanners that no longer talk to the database directly; the server itself still requires one.)
For instance, here's the entry you'll find when PostgreSQL is used:
2017.02.15 16:46:39 INFO web[][o.sonar.db.Database] Create JDBC data source for jdbc:postgresql://localhost:5432/sonar
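Note also that the URL in your log reads jdbc:postgresql:sonarprod... without the //host:port/database part shown above, which is worth double-checking in sonar.properties. To verify connectivity outside SonarQube, here is a minimal sketch to run with the PostgreSQL JDBC jar on the classpath; the host is taken from your log, while the port, database name, and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class SonarDbCheck {
    public static void main(String[] args) throws Exception {
        // Same shape sonar.jdbc.url should have: jdbc:postgresql://host:port/database
        String url = "jdbc:postgresql://sonarprod.cyfa9ycgfky0.us-east-1.rds.amazonaws.com:5432/sonar";
        try (Connection c = DriverManager.getConnection(url, "sonarUser", "sonarPassword")) {
            System.out.println("Connected to " + c.getMetaData().getDatabaseProductVersion());
        }
    }
}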
Has anyone been able to use the new JDBC drivers for BigQuery in JetBrains DataGrip?
I've followed these steps:
Created a driver in DataGrip with all the jar files
Created a data source with a connection string using a service account file
The connection test says successful, but once I try to query something I receive an error:
java.lang.ClassNotFoundException: com.google.api.client.json.JsonFactory
I've added the following files from the Simba ZIP into the DataGrip driver:
GoogleBigQueryJDBC42.jar
jackson-core-2.1.3.jar
google-api-client-1.22.0.jar
google-api-services-bigquery-v2-rev320-1.22.0.jar
google-http-client-1.22.0.jar
google-http-client-jackson2-1.22.0.jar
google-oauth-client-1.22.0.jar
So I'm not sure what to do next. I tried changing their order in DataGrip, but it didn't seem to make a difference.
My connection string also looks OK I think:
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=...;OAuthType=0;OAuthPvtKeyPath=...;OAuthServiceAcctEmail=...;
You may get this error when the driver JAR files are not referenced correctly in the tool. I have listed out the steps I used to connect to BigQuery via DataGrip.
Add a new driver by adding all the JAR files from the ZIP. The correct class name should be selected from the "Class" drop-down in this step.
Add a new data source by selecting the newly created BigQuery JDBC driver. Provide the correct connection URL in this step.
If the test connection succeeds, create a new query for the same datasource.
Make sure your query uses the correct format "dataset.tablename" and is running on the data source you just tested.
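If the query still fails with the ClassNotFoundException, it can help to try the same driver and URL outside DataGrip to isolate a classpath problem. A minimal sketch, assuming the Simba JDBC 4.2 driver class name; the project ID, service account, and key path are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BigQueryJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Fails here if any of the Google API jars are missing from the classpath
        Class.forName("com.simba.googlebigquery.jdbc42.Driver");
        String url = "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
                + "ProjectId=my-project;OAuthType=0;"
                + "OAuthServiceAcctEmail=svc@my-project.iam.gserviceaccount.com;"
                + "OAuthPvtKeyPath=/path/to/key.json";
        try (Connection c = DriverManager.getConnection(url);
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}

Run it with all the jars from the Simba ZIP on the classpath; if it reproduces the same error, one of the jars is missing or the wrong version.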
For me, replacing the P12 key with a JSON key worked. But I still cannot use DataGrip, or JDBC in general, to access BigQuery because of various query/incompatibility issues.
To use the new Simba JDBC drivers for BigQuery in JetBrains DataGrip, you can refer to this video: https://www.youtube.com/watch?v=r9l2c_aQPoQ&ab_channel=JetBrainsTV
It covers all the steps for a working setup, one by one.
Here is the blog post that references this video: https://blog.jetbrains.com/datagrip/2018/07/10/using-bigquery-from-intellij-based-ide/
Drivers can be downloaded at: https://cloud.google.com/bigquery/providers/simba-drivers
Note: Make sure to go through the comments on the blog post to see how to authenticate without creating a service account on GCP.
Hope this is helpful!
I'm trying to get LibreOffice Base v5.1.4.2, running on Ubuntu v16.04, to connect to a Hive v1.2.1 database via JDBC. I added the following jars, downloaded from Maven Central, to LibreOffice's class path ('Tools -> LibreOffice -> Advanced -> Class Path'):
hive-common-1.2.1.jar
hive-jdbc-1.2.1.jar
hive-metastore-1.2.1.jar
hive-service-1.2.1.jar
hadoop-common-2.6.2.jar
httpclient-4.4.jar
httpcore-4.4.jar
libthrift-0.9.2.jar
commons-logging-1.1.3.jar
slf4j-api-1.7.5.jar
I then restarted LibreOffice, opened Base, selected 'Connect to an existing database' -> 'JDBC', and set the connection properties.
I entered the credentials and clicked the 'Test Connection' button, which returned a "the connection was established successfully" message. Great!
In the LibreOffice Base UI, the options under the 'Tables' panel were grayed out. The options in the Queries tab were not, so I tried querying Hive from there.
The 'Use Wizard to Create Query' option prompts for a password and then returns "The field names from 'airline.on_time_performance' could not be retrieved."
The JDBC connection is able to connect to Hive and list the tables, though it seems to have problems retrieving the columns. When I try to execute a simple select statement, the 'Create Query in SQL View' option returns a somewhat cryptic "Method not supported" message.
The error message is a bit vague. I suspect that I may be missing a dependency since I am able to connect to Hive from Java using JDBC.
I'm curious to know if anyone in the community has LibreOffice Base working with Hive. If so, what am I missing?
The Apache JDBC driver reports "Method not supported" for most features, simply because the Apache committers did not bother to implement the long list of simple yes/no API calls. Duh.
If you want to see by yourself, just download DBVisualizer Free, configure the Apache Hive driver, open a connection, and check the Database Info tab.
Now, DBVis is quite permissive with lame drivers, but it seems that LibreOffice is not.
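You can also see the behavior from plain Java. In this hedged sketch (host, port, and credentials are placeholders), the connection and basic calls work, while one of the unimplemented metadata calls throws the exception Base trips over:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.SQLException;

public class HiveMetaDataProbe {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection c = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "user", "")) {
            DatabaseMetaData md = c.getMetaData();
            System.out.println(md.getDatabaseProductName()); // implemented
            try {
                // One of the many yes/no calls the Apache driver leaves unimplemented
                md.supportsIntegrityEnhancementFacility();
            } catch (SQLException e) {
                System.out.println(e.getMessage()); // "Method not supported"
            }
        }
    }
}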
You can try the Cloudera Hive JDBC driver as an alternative. You just have to "register" -- i.e. leave your e-mail address -- to access the download URL; it's simpler to deploy than the Apache thing (it's based on the Simba SDK, and all Hive-specific JARs are bundled) and it works with just about any BI tool. So hopefully it works with LibreThing too.
Disclaimer: I wish the Apache distro had a proper JDBC driver, so that anyone could use it instead of relying on "free" commercial software. But for now it's just a wish.
When creating a DB-based MDS connection in JDeveloper, the list of partitions is empty.
I have tried to install Oracle SOA Suite 11g on both Oracle and SQL Server and have this issue in JDeveloper with different JDBC drivers.
Of course, the MDS schemas in the database were created using the Oracle Repository Creation Utility, and both the sys/sa and DEV_MDS users were tried.
I looked into JDeveloper's Messages tab and see this error:
WARNING: Error reading db partitions for connection name Connection1. Reason : MDS-00003: error connecting to the database
Unable to start the Universal Connection Pool: oracle.ucp.UniversalConnectionPoolException: Error during pool creation in Universal Connection Pool Manager MBean: oracle.ucp.UniversalConnectionPoolException: Error during pool creation in Universal Connection Pool Manager: java.sql.SQLException: Invalid Universal Connection Pool configuration: java.sql.SQLException: Unable to create factory class instance with provided factory class name: java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerXADataSource
This is strange, because that class comes from sqljdbc4.jar, which I have specified as the JDBC driver (Microsoft SQL Server JDBC Driver 3.0).
So I tried the jTDS SQL Server driver and received this error:
Apr 26, 2011 9:52:01 PM oracle.tip.tools.ide.common.resourcepalette.adapter.mds.DBConnectionInfo
WARNING: Error reading db partitions for connection name Connection2. Reason :
This answer is coming a "bit" late, but hopefully it will be of some use to the next coder who stumbles upon this.
I'm currently banging my head against the wall trying to get a simple SOA/BPM/ADF application built and deployed using MS SQL Server as the backend DB containing the MDS data.
I was able to create a DB connection to the SQL Server instance with JDeveloper, but I ran into the same problem as Denys when I tried to create a new MDS Connection: The list of partitions was empty and after several hours (or days) I discovered the same error message in the Messages tab:
java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerXADataSource
even though the actual DB connection was working properly.
Also, whenever I tried to build my application using JDeveloper's build command or Maven or Ant tasks, I received the same error.
All of the errors pointed in the direction of a missing JDBC driver, which was actually not missing.
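One way to confirm the jar itself is fine is to load the class directly from it, bypassing the IDE's classpath entirely. A hypothetical standalone check (the jar path matches the snippets below):

import java.net.URL;
import java.net.URLClassLoader;

public class XaClassCheck {
    public static void main(String[] args) throws Exception {
        // null parent: resolve only against the driver jar (plus the JDK's bootstrap classes)
        URLClassLoader cl = new URLClassLoader(
                new URL[] { new URL("file:///c:/dev/jdbc/mssql/sqljdbc4.jar") }, null);
        System.out.println(cl.loadClass("com.microsoft.sqlserver.jdbc.SQLServerXADataSource"));
    }
}

If this prints the class, the ClassNotFoundException comes from the tool's classloader setup, not from the jar.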
I was finally able to at least partially solve the issue, although I had to use very dirty hacks.
Solution to create MDS connection in JDeveloper:
To get this to work I had to make the MDS module realize that a JDBC driver for SQL Server actually exists, so I added the driver's jar to the module's classpath in ${jdev.home}/extensions/oracle.mds.dt.jar#META-INF/extension.xml:
<classpath>c:/dev/jdbc/mssql/sqljdbc4.jar</classpath>
In my opinion, it should have been enough to just have the driver in the project's library settings, but somehow that just wouldn't cut it.
Solution to get the ant scac task working:
I got the build a bit further by doing essentially the same thing: I added the JDBC driver's jar to scac's classpath by modifying ${jdev.home}/bin/ant-sca-compile.xml:
<path id="scac.tasks.class.path">
  <!-- existing pathelement entries unchanged -->
  <!-- Added this line -->
  <pathelement path="c:/dev/jdbc/mssql/sqljdbc4.jar"/>
</path>
All in all, these are not the kind of solutions I was hoping for, but maybe someone else will benefit from them.
Now I'm at the point where my composite.xml validation fails because of missing and/or broken wsdl files:
[scac] Validating composite "C:\install\fod\CompositeServices\OrderBookingComposite\bin/..//composite.xml"
[scac] error: location {/ns:composite/ns:import[#location='oramds:/apps/FusionOrderDemoShared/services/orderbooking/OrderBookingProcessor.wsdl']}(15,125): Load of wsdl "oramds:/apps/FusionOrderDemoShared/services/orderbooking/OrderBookingProcessor.wsdl" failed
[scac] error: location {/ns:composite/ns:import[#location='oramds:/apps/FusionOrderDemoShared/services/partnersupplier/PartnerSupplierComposite.wsdl']}(25,30): Load of wsdl "oramds:/apps/FusionOrderDemoShared/services/partnersupplier/PartnerSupplierComposite.wsdl" failed
[scac] error: location {/ns:composite/ns:import[#location='oramds:/apps/FusionOrderDemoShared/services/oracle/fodemo/storefront/store/service/common/serviceinterface/StoreFrontService.wsdl']}(29,30): Load of wsdl "oramds:/apps/FusionOrderDemoShared/services/oracle/fodemo/storefront/store/service/common/serviceinterface/StoreFrontService.wsdl" failed
... continues with errors for everything else
This error occurs when trying to execute the compile-build-all task in Oracle's Fusion Order Demo application. Any advice regarding this is most welcome.