Error creating driver 'Hadoop / Apache Drill' instance in DBeaver 5.3.4 - jdbc

I am getting the error below while trying to connect to a drillbit instance running on my RHEL box from DBeaver installed on my Windows machine.
Can't create driver instance
Error creating driver 'Hadoop / Apache Drill' instance.
Most likely required jar files are missing.
You should configure jars in driver settings.
Reason: can't load driver class 'org.apache.drill.jdbc.Driver'
I had previously downloaded the file drill-jdbc-all-1.17.0.jar, added it in the Edit Driver settings (Libraries tab), and it had worked for me. I had also kept drill-jdbc-all-1.17.0.jar inside the DBeaver installation path, in the DBeaver/plugins directory. Strangely, this is not working now.

I clicked "Add File" in Edit Connection (Libraries tab) to add the single jar file drill-jdbc-all-.jar. Depending on the situation, you can also use "Add Folder" to add a folder with Java classes/resources, or "Add Artifact" to add a Maven artifact.
After I added the jar files I was able to find all JDBC driver classes present in these jars: just click the "Find Class" button and DBeaver will show all of them. In most cases there is just one driver class in the driver. Then I clicked OK, entered the port number on the other page, tried to reconnect, and was able to connect fine. This link could also be helpful:
https://dbeaver.com/docs/wiki/Database-drivers/
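If you want to double-check a jar outside DBeaver, roughly the same check as "Find Class" can be done in a few lines of Java. This is a minimal sketch, assuming the jar sits in the current directory and declares its driver in META-INF/services/java.sql.Driver:

import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Paths;
import java.sql.Driver;
import java.util.ServiceLoader;

public class ListJdbcDrivers {
    public static void main(String[] args) throws Exception {
        // Path to the driver jar; adjust to wherever you keep drill-jdbc-all-1.17.0.jar.
        URL jar = Paths.get("drill-jdbc-all-1.17.0.jar").toUri().toURL();
        try (URLClassLoader loader = new URLClassLoader(new URL[] { jar },
                ListJdbcDrivers.class.getClassLoader())) {
            // Prints every java.sql.Driver implementation the jar advertises,
            // e.g. org.apache.drill.jdbc.Driver.
            for (Driver d : ServiceLoader.load(Driver.class, loader)) {
                System.out.println(d.getClass().getName());
            }
        }
    }
}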

One thing to add...
If you're having difficulty connecting, try the following connection string with your hostname:
jdbc:drill:drillbit=localhost
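If DBeaver still refuses to connect, a standalone test against the same jar can help isolate the problem. This is a minimal sketch, assuming drill-jdbc-all-1.17.0.jar is on the classpath and a drillbit is reachable on its default port; the sys.version query is just an example:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DrillConnectTest {
    public static void main(String[] args) throws Exception {
        // The same driver class DBeaver tries to resolve.
        Class.forName("org.apache.drill.jdbc.Driver");
        // Replace "localhost" with the hostname of your drillbit.
        try (Connection conn = DriverManager.getConnection("jdbc:drill:drillbit=localhost");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT version FROM sys.version")) {
            while (rs.next()) {
                System.out.println("Connected to Drill " + rs.getString("version"));
            }
        }
    }
}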

Related

Apache Zeppelin configuration for connect to Hive on HDP Virtualbox

I've been struggling with the Apache Zeppelin notebook version 0.10.0 setup for a while.
The idea is to be able to connect it to a remote Hortonworks 2.6.5 server that runs locally on Virtualbox in Ubuntu 20.04.
I am using an image downloaded from:
https://www.cloudera.com/downloads/hortonworks-sandbox.html
Of course, the image has pre-installed Zeppelin, which works fine on port 9995, but this is an old 0.7.3 version that doesn't support the Helium plugins I would like to use. I know that HDP version 3.0.1 has the updated Zeppelin version 0.8 on board, but using it is impossible at the moment due to my hardware resources. Additionally, from what I remember, there was a problem enabling the Leaflet Map plugin there as well.
The first thought was to update the notebook on the server, but after updating according to the instructions on the Cloudera forums (which are unfortunately unavailable at the moment, so I cannot provide a link or check any other solution), it failed to start correctly.
A simpler solution now seemed to be to connect the newer notebook version to the virtual server, but unfortunately, despite many attempts and solutions from threads here with various configurations, I was not able to connect to Hive via JDBC. I am also using Zeppelin with local Spark 3.0.3, but I have some geodata in Hive that I would like to visualize this way.
I used, among others, the description on the Zeppelin website:
https://zeppelin.apache.org/docs/latest/interpreter/jdbc.html#apache-hive
This is my current JDBC interpreter configuration:
hive.driver org.apache.hive.jdbc.HiveDriver
hive.url jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
hive.user hive
Artifact org.apache.hive:hive-jdbc:3.1.2
Depending on the driver version, there were different errors, but this time after typing:
%jdbc(hive)
SELECT * FROM mydb.mytable;
I get the following error:
Could not open client transport for any of the Server URI's in
ZooKeeper: Could not establish connection to
jdbc:hive2://sandbox-hdp.hortonworks.com:10000/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;hive.server2.proxy.user=hive;?tez.application.tags=paragraph_1645270946147_194101954;mapreduce.job.tags=paragraph_1645270946147_194101954;:
Required field 'client_protocol' is unset!
Struct:TOpenSessionReq(client_protocol:null,
configuration:{set:hiveconf:mapreduce.job.tags=paragraph_1645270946147_194101954,
set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000,
hive.server2.proxy.user=hive, use:database=default,
set:hiveconf:tez.application.tags=paragraph_1645270946147_194101954})
I will be very grateful to everyone for any help. Regards.
So, after many hours and trials, here's a working solution. First of all, the most important thing is to use drivers that match your version of Hadoop; the "Required field 'client_protocol' is unset" error above typically indicates a version mismatch between the Hive JDBC client and HiveServer2. You need jar files such as 'hive-jdbc-standalone' and 'hadoop-common' in their respective versions, and to avoid adding all of them in the 'Artifact' field of the %jdbc interpreter in Zeppelin it is best to use one complete jar containing all required dependencies.
Thanks to Tim Veil, such a jar is available in his GitHub repository below:
https://github.com/timveil/hive-jdbc-uber-jar/
Here are my complete Zeppelin %jdbc interpreter settings:
default.url jdbc:postgresql://localhost:5432/
default.user gpadmin
default.password
default.driver org.postgresql.Driver
default.completer.ttlInSeconds 120
default.completer.schemaFilters
default.precode
default.statementPrecode
common.max_count 1000
zeppelin.jdbc.auth.type SIMPLE
zeppelin.jdbc.auth.kerberos.proxy.enable false
zeppelin.jdbc.concurrent.use true
zeppelin.jdbc.concurrent.max_connection 10
zeppelin.jdbc.keytab.location
zeppelin.jdbc.principal
zeppelin.jdbc.interpolation false
zeppelin.jdbc.maxConnLifetime -1
zeppelin.jdbc.maxRows 1000
zeppelin.jdbc.hive.timeout.threshold 60000
zeppelin.jdbc.hive.monitor.query_interval 1000
hive.driver org.apache.hive.jdbc.HiveDriver
hive.password
hive.proxy.user.property hive.server2.proxy.user
hive.splitQueries true
hive.url jdbc:hive2://sandbox-hdp.hortonworks.com:10000/default
hive.user hive
Dependencies
Artifact
/opt/zeppelin/interpreter/jdbc/hive-jdbc-uber-2.6.5.0-292.jar
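Before (or instead of) wiring this into Zeppelin, a quick standalone check with the same uber jar and URL can rule out interpreter issues. This is a minimal sketch, assuming hive-jdbc-uber-2.6.5.0-292.jar is on the classpath and mirroring the hive.* settings above; the SHOW DATABASES query is just an example:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveConnectTest {
    public static void main(String[] args) throws Exception {
        // Same driver class and URL as in the hive.* interpreter settings above.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://sandbox-hdp.hortonworks.com:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}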
The next step is to go to Ambari at http://localhost:8080/ and log in as admin. To do that, you must first log in to the Hadoop root account via SSH:
ssh root@127.0.0.1 -p 2222
root@127.0.0.1's password: hadoop
After a successful login, you will be prompted to change your password immediately; please do that, and then set the Ambari admin password with the command:
[root@sandbox-hdp ~]# ambari-admin-password-reset
After that you can use the admin account in Ambari (log in and click the Hive link in the left panel):
Ambari -> Hive -> Configs -> Advanced -> Custom hive-site
Click Add Property
Insert the following into the window that opens:
hive.security.authorization.sqlstd.confwhitelist.append=tez.application.tags
After saving, restart all Hive services in Ambari. Everything should be working now, provided you set the proper Java path in 'zeppelin-env.sh' and the port in 'zeppelin-site.xml' (you must copy and rename 'zeppelin-env.sh.template' and 'zeppelin-site.xml.template' in the Zeppelin conf directory; please remember that Ambari also uses port 8080!).
In my case, the only thing left to do was to add or uncomment the fragment responsible for the Helium plugin repository (in 'zeppelin-site.xml'):
<property>
  <name>zeppelin.helium.registry</name>
  <value>helium,https://s3.amazonaws.com/helium-package/helium.json</value>
  <description>Enable helium packages</description>
</property>
Now you can go to the Helium tab in the top right corner of the Zeppelin sheet and install the plugins of your choice; in my case it is the 'zeppelin-leaflet' visualization. And voilà! Here is a sample visualization of this Kaggle dataset from Hive:
https://www.kaggle.com/kartik2112/fraud-detection
Have a nice day!

Unable to configure server for deploying rules using IBM ODM

I followed this YouTube video:
https://www.youtube.com/watch?v=dIJ-VLkuw0s
At 10:15 min, they added "http://localhost:9080/res" as the server URL.
When I tried the same, I got the error below:
The location "http://localhost::9080/" is not a valid file system path: : is an invalid character in resource name 'localhost::9080'.
I downloaded WebSphere Liberty from https://www.ibm.com/support/pages/websphere-liberty-developers (WebSphere Liberty Web Profile 8 21.0.0.4, a 93 MB file), unzipped the downloaded file, and started the server by running the bat file at:
C:\Users\vaageesh\wlp-webProfile8-21.0.0.4\wlp\bin\server.bat
Still, when I add "http://localhost:9080/res" as the server URL in the Target servers tab of the "Create Deployment configuration" step, it throws the error.
Could you guide me on exactly how to configure the server, and the prerequisites needed to deploy rules on it, with the configuration we have done?
You need to unselect "Local Rule ...." to get the wizard that relates to web deployment.
The screenshot you are mentioning is about deployment to a local directory folder.
See the picture.
Best
Emmanuel

Remote Debugging via Dbeaver Client

I have a JDBC driver and am using DBeaver as the client application. How should I debug the jar?
I am using IntelliJ IDEA, which has an option for remote debugging (shown below), but I am not sure how to use it with DBeaver.
-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=7777
Add it to the dbeaver.ini file located in the DBeaver installation directory (after the -vmargs line, since it is a JVM argument), then attach the IntelliJ remote debugger to port 7777.

The Server Instance cannot be started because the Integrated Weblogic domain was not built successfully.-jdeveloper

I have JDeveloper 11.1.1.6.
I created a domain named "jdev_domain" successfully.
I created a simple application in JDeveloper and deployed it using the Integrated WebLogic Server. The deployment finished successfully, but when I run that application it throws the following error:
The Server Instance cannot be started because the Integrated Weblogic domain was not built successfully.
The extensions log is:
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\com.ibm.ws.admin.client_7.0.0.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\com.ibm.ws.ejb.thinclient_7.0.0.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\com.ibm.ws.jpa.thinclient_7.0.0.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\com.ibm.ws.orb_7.0.0.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\ejb3exceptions.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\ibmorb.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\oracle.webservices.standalone.client.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\tools.jar not found.
Warning: Classpath entry C:\Oracle\Middleware\jdeveloper\was\wsclient_extended.jar not found.
Even though there is an error log, as a beginner I don't know how to solve it. Please help me solve this issue.
Can you clarify how you created the domain? (You don't need to do this manually).
Are you able to start the integrated WebLogic at all?
If not, try to remove the defaultDomain directory from the JDeveloper system folder.
I uninstalled JDeveloper and installed it again. I did not create the domain using the configuration wizard, and it is working now.
The site below gives the required information about configuring the integrated WebLogic server:
https://docs.oracle.com/middleware/12212/lcm/SOAQS/SOA-INTEGRATING.htm#SOAQS475
Make sure the environment variables are set properly, for example:
setenv JAVA_HOME /ade_autofs/gd29_3rdparty/JDK6_MAIN_LINUX.X64.rdd/121114.1.6.0.38.0B05/jdk6
If it doesn't work, try removing the default domain directory. It is generally located inside the view, in .jdev_user_home, with a name like systemxx.x.
I too had the same problem and tried different methods, but they failed. In the worst case, delete the folder containing JDeveloper: go to AppData -> Roaming -> JDeveloper and delete the folder "system11.1.1.7.40.66.61.3", then go to Control Panel and uninstall "Oracle Fusion Middleware 11.1.1.x.x".
Freshly install JDeveloper again and start the server instance.

weblogic 12cR2. How to add library before run config.sh

This is maybe not an issue for anyone who knows how config.sh works when configuring WebLogic after installing it on disk.
My question is: after I installed WebLogic 12cR2, how can I add a custom JDBC driver jar file to WebLogic 12cR2 before I run config.sh, since the configuration wizard needs it to connect to my database?
I tried to put it in:
$ORACLE_HOME/oracle_common/modules/db2jcc4.jar
and then I added the full driver jar file path to the Java classpath in this file:
$ORACLE_HOME/oracle_common/common/bin/commExtEnv.sh
Then I restarted my Mac and re-ran config.sh to set up the component datasource (the em module needs it) for RCU. WebLogic complained that the driver was not found.
However, if I start the WebLogic server using $DOMAIN_HOME/startWebLogic.sh, WebLogic can find that driver on the path.
What did I do wrong? Please advise!
You don't have to change "commExtEnv.sh". In WebLogic 12c you can create a file called "setUserOverrides.sh" in order to customize the WebLogic startup parameters, such as the classpath. In your case you could have something like this:
# add custom libraries to the WebLogic Server system classpath
if [ "${POST_CLASSPATH}" != "" ] ; then
  POST_CLASSPATH="${POST_CLASSPATH}${CLASSPATHSEP}$ORACLE_HOME/oracle_common/modules/db2jcc4.jar"
  export POST_CLASSPATH
else
  POST_CLASSPATH="$ORACLE_HOME/oracle_common/modules/db2jcc4.jar"
  export POST_CLASSPATH
fi
The "setUserOverrides.sh" must be placed in the bin folder of your domain, where setDomainEnv.sh, setStartupEnv.sh and startWebLogic.sh reside.
You can find more information here in the Oracle documentation: http://docs.oracle.com/middleware/1212/wls/START/overview.htm#START250
