Hive ODBC "Error from Hive: ETIMEDOUT" - JDBC

I am getting an error while connecting to HiveServer2. HiveServer2 and Hadoop are up and running on my local machine.
I am trying the code below:
try {
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}
Connection con = DriverManager.getConnection("jdbc:odbc://dsn=hello;driver=Microsoft Hive ODBC Driver;Host=127.0.0.1;Port=5000;HiveServerType=2;AuthMech=0");
Statement stmt = con.createStatement();
ResultSet res = stmt.executeQuery("SHOW TABLES");
ResultSetMetaData rsmdData = res.getMetaData();
int count = rsmdData.getColumnCount();
while (res.next()) {
    for (int i = 1; i <= count; i++) {
        System.out.print(res.getString(i) + "\t");
    }
    System.out.println();
}
I am getting the error below:
Exception in thread "main" java.sql.SQLException: [Microsoft][HiveODBC] (34) Error from Hive: ETIMEDOUT.
at sun.jdbc.odbc.JdbcOdbc.createSQLException(Unknown Source)
at sun.jdbc.odbc.JdbcOdbc.standardError(Unknown Source)
at sun.jdbc.odbc.JdbcOdbc.SQLDriverConnect(Unknown Source)
at sun.jdbc.odbc.JdbcOdbcConnection.initialize(Unknown Source)
at sun.jdbc.odbc.JdbcOdbcDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at hivedev.HiveODBC.main(HiveODBC.java:20)
However, I am able to connect through the Beeline CLI.
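Since Beeline talks to HiveServer2 over Hive's native JDBC driver rather than ODBC, one way to rule out the ODBC layer is to try the same query with that driver directly. A minimal sketch, assuming HiveServer2 is listening on its default Thrift port 10000 and the hive-jdbc standalone jar is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Only needed on older JDKs; JDBC 4 drivers register themselves automatically.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // HiveServer2's Thrift port defaults to 10000 (hive.server2.thrift.port).
        Connection con = DriverManager.getConnection("jdbc:hive2://127.0.0.1:10000/default", "", "");
        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("SHOW TABLES");
        while (res.next()) {
            System.out.println(res.getString(1));
        }
        con.close();
    }
}

If this connects, the ETIMEDOUT most likely points at the ODBC DSN settings rather than HiveServer2 itself (note the Port=5000 in the connection string above versus HiveServer2's default of 10000).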

Related

My password is always rejected when trying to connect to the ClickHouse server

I have an issue with my ClickHouse server. I run the server in Docker and am trying to connect to it from Java, but my password is always rejected even though it is correct. What can I do? Where is my mistake?
public static void main(String[] args) throws Exception {
    Properties prop = new Properties();
    prop.setProperty("ssl", "false");
    prop.setProperty("sslmode", "NONE"); // NONE to trust all servers; STRICT for trusted only
    prop.setProperty("username", "default");
    prop.setProperty("password", "AufsdfFMdfsdfPffdsdBIgxasdzqqfdoıfksdfz");
    Connection conn = DriverManager.getConnection("jdbc:ch:https://localhost:18123/", prop);
    Statement stmt = conn.createStatement();
    stmt.execute("CREATE TABLE IF NOT EXISTS jdbc_test(idx Int8, str String) ENGINE = MergeTree ORDER BY idx");
    try (PreparedStatement ps = conn.prepareStatement(
            "insert into jdbc_test select col1, col2 from input('col1 Int8, col2 String')")) {
        for (int i = 0; i < 10; i++) {
            ps.setInt(1, i);              // col1
            ps.setString(2, "test:" + i); // col2
            // parameters are written to the buffered stream immediately, in binary format
            ps.addBatch();
        }
        // stream everything on hand into ClickHouse
        ps.executeBatch();
    }
    ResultSet rs = stmt.executeQuery("select * from jdbc_test");
    while (rs.next()) {
        System.out.println(String.format("idx: %s str: %s", rs.getString(1), rs.getString(2)));
    }
}
My error is below:
Exception in thread "main" java.sql.SQLException: Code: 516. DB::Exception: default: Authentication failed: password is incorrect, or there is no user with such name.
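One thing worth double-checking (an assumption on my side, not something confirmed in the thread): JDBC drivers take the login from the standard user property, so a username key may simply be ignored, in which case the driver falls back to the default user with an empty password. A minimal sketch passing the credentials through the standard DriverManager overload, assuming the com.clickhouse clickhouse-jdbc driver and plain HTTP on port 18123:

// "yourPassword" is a placeholder for the real password
Connection conn = DriverManager.getConnection(
        "jdbc:ch:http://localhost:18123/default", // http here to match ssl=false
        "default",
        "yourPassword");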

Randomly getting error: Caused by: org.hibernate.ObjectNotFoundException: No row with the given identifier exists

In my program the code and entities work fine, but at some random point in time this error appears.
If I restart the servers it starts working again, but after some days the same issue comes back.
Below are the logs of the error:
com.pb.gpp.backoffice.job.Exception.GPPBackOfficeException: Error occured during select operation in following method and tablegetFileTypeDetail, transactionfiletype and transactionfiledetail
at com.pb.gcs.backoffice.transaction.controller.DaoImpl.getFileTypeDetail(DaoImpl.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy178.getFileTypeDetail(Unknown Source)
at com.pb.gcs.backoffice.transaction.filecreate.writer.TransactionFileWriter.getCompletePath(TransactionFileWriter.java:178)
at com.pb.gcs.backoffice.transaction.filecreate.writer.TransactionFileWriter.initializeItemWriter(TransactionFileWriter.java:163)
at com.pb.gcs.backoffice.transaction.filecreate.writer.TransactionFileWriter.beforeStep(TransactionFileWriter.java:61)
... 16 more
Caused by: org.hibernate.ObjectNotFoundException: No row with the given identifier exists: [com.pb.gcs.backoffice.transaction.model.configuration.TransactionfiledetailEntity#311]
at org.hibernate.internal.SessionFactoryImpl$1$1.handleEntityNotFound(SessionFactoryImpl.java:244)
at org.hibernate.event.internal.DefaultLoadEventListener.load(DefaultLoadEventListener.java:212)
at org.hibernate.event.internal.DefaultLoadEventListener.proxyOrLoad(DefaultLoadEventListener.java:262)
at org.hibernate.event.internal.DefaultLoadEventListener.onLoad(DefaultLoadEventListener.java:150)
at org.hibernate.internal.SessionImpl.fireLoad(SessionImpl.java:1098)
at org.hibernate.internal.SessionImpl.internalLoad(SessionImpl.java:1025)
at org.hibernate.type.EntityType.resolveIdentifier(EntityType.java:671)
at org.hibernate.type.EntityType.resolve(EntityType.java:489)
at org.hibernate.type.ComponentType.resolve(ComponentType.java:667)
at org.hibernate.type.ComponentType.nullSafeGet(ComponentType.java:349)
at org.hibernate.type.ManyToOneType.hydrate(ManyToOneType.java:190)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2926)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1673)
at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1605)
at org.hibernate.loader.Loader.getRow(Loader.java:1505)
at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:713)
at org.hibernate.loader.Loader.processResultSet(Loader.java:943)
at org.hibernate.loader.Loader.doQuery(Loader.java:911)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(
public FileTypeDetail getFileTypeDetail(String partnerId, String carrier) throws GPPBackOfficeException {
    try {
        Session session = configurationSessionFactory.getCurrentSession();
        session.clear();
        String query = " Select tft.*, tfd.* " +
                "from transactionfiletype tft " +
                "join transactionfiledetail tfd " +
                "on tft.filetype=tfd.filetype " +
                "where tfd.activeflag=1 and tfd.filetype ='transactionfile' and tfd.partnerid =:partnerId and tfd.carrier =:carrier";
        Query queryFileTypeMapping = session.createSQLQuery(query)
                .addEntity("transactionfiletype", TransactionfiletypeEntity.class)
                .addEntity("transactionfiledetail", TransactionfiledetailEntity.class);
        queryFileTypeMapping.setParameter("partnerId", partnerId);
        queryFileTypeMapping.setParameter("carrier", carrier);
        List<Object[]> resultList = queryFileTypeMapping.list();
        session.flush();
        List<FileTypeDetail> fileTypeDetailList = new ArrayList<FileTypeDetail>();
        FileTypeDetail fileTypeDetail;
        for (Object[] fileDetail : resultList) {
            fileTypeDetail = new FileTypeDetail();
            fileTypeDetail.setTransactionfiletypeEntity((TransactionfiletypeEntity) fileDetail[0]);
            fileTypeDetail.setTransactionfiledetailEntity((TransactionfiledetailEntity) fileDetail[1]);
            fileTypeDetailList.add(fileTypeDetail);
        }
        return fileTypeDetailList.get(0);
    } catch (Exception ex) {
        GPPBackOfficeException gppBackOfficeException = new GPPBackOfficeException(Constants.SYSTEM_PREFIX, Constants.ERROR_CODE_0019,
                helper.getExceptionInfo(Constants.SYSTEM_PREFIX, Constants.ERROR_CODE_0019).getErrorclientmessage()
                + "getFileTypeDetail, transactionfiletype and transactionfiledetail", ex);
        throw gppBackOfficeException;
    }
}
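For what it is worth, and purely as an inference from the stack trace rather than anything stated in the thread: the failure happens while Hibernate hydrates a many-to-one association, i.e. some row points at a TransactionfiledetailEntity (id 311) that no longer exists. If rows in that table can legitimately be deleted, one common way to tolerate the dangling reference is Hibernate's @NotFound mapping; a hypothetical sketch (the owning entity and column name below are made up for illustration):

import org.hibernate.annotations.NotFound;
import org.hibernate.annotations.NotFoundAction;
import javax.persistence.Entity;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;

@Entity
public class OwningEntity { // hypothetical owner of the association
    @ManyToOne
    @NotFound(action = NotFoundAction.IGNORE) // resolve missing rows to null instead of throwing ObjectNotFoundException
    @JoinColumn(name = "transactionfiledetail_id") // hypothetical column name
    private TransactionfiledetailEntity transactionfiledetail;
}

If those rows should never disappear, the intermittent nature of the error usually points at stale cached references or another process deleting rows underneath the application.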

Not able to access Hive Tables through JDBC in Hortonworks Sandbox 2.0

I am using Hortonworks Sandbox 2.0. I tried the following program in the Eclipse IDE, but was not able to access my Hive tables and got the following errors. What do I have to do to resolve this?
I also tried hive --service hiveserver, and got "Not able to connect". I am using VMware.
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
            System.exit(1);
        }
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        stmt.executeQuery("drop table " + tableName);
        ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");
        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }
        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
        String filepath = "/tmp/a.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }
        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}
I got this error:
Exception in thread "main" java.sql.SQLException: Could not establish connection to 172.31.153.71:10000/default: java.net.ConnectException: Connection refused: connect
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:117)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at com.coe.convert.hive.temp.htw.HiveJdbcClient.main(HiveJdbcClient.java:28)
Run HiveServer on another port, and then use that new port number in the connection URL. Use the following command to change the port from 10000 to 10001:
hive --service hiveserver -p 10001
Then use:
172.31.153.71:10001/default
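With the old Thrift HiveServer driver used in the code above (org.apache.hadoop.hive.jdbc.HiveDriver), the connection line would then look roughly like this, assuming the server was started with -p 10001:

// connect to the HiveServer instance started on port 10001
Connection con = DriverManager.getConnection("jdbc:hive://172.31.153.71:10001/default", "", "");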

PIG UDF throwing error

I am getting an error in a Pig script.
PIG SCRIPT :
REGISTER /var/lib/hadoop-hdfs/udf.jar;
REGISTER /var/lib/hadoop-hdfs/udf2.jar;
INPUT_LINES = Load 'hdfs:/Inputdata/DATA_GOV_US_Farmers_Market_DataSet.csv' using PigStorage(',') AS (FMID:chararray, MarketName:chararray, Website:chararray, Street:chararray, City:chararray, County:chararray, State:chararray, Zip:chararray, Schedule:chararray, X:chararray, Y:chararray, Location:chararray, Credit:chararray, WIC:chararray, WICcash:chararray, SFMNP:chararray, SNAP:chararray, Bakedgoods:chararray, Cheese:chararray, Crafts:chararray, Flowers:chararray, Eggs:chararray, Seafood:chararray, Herbs:chararray, Vegetables:chararray, Honey:chararray, Jams:chararray, Maple:chararray, Meat:chararray, Nursery:chararray, Nuts:chararray, Plants:chararray, Poultry:chararray, Prepared:chararray, Soap:chararray, Trees:chararray, Wine:chararray);
FILTERED_COUNTY = FILTER INPUT_LINES BY County=='Los Angeles';
REQUIRED_COLUMNS = FOREACH FILTERED_COUNTY GENERATE FMID,MarketName,$12..;
PER = FOREACH REQUIRED_COLUMNS GENERATE FMID,MarketName,fm($2..) AS Percentage;
STATUS = FOREACH PER GENERATE FMID,MarketName,Percentage,status(Percentage) AS Stat;
UDF1 :
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class fm extends EvalFunc<Integer> {
    String temp;
    int per;
    int count = 0;

    public Integer exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return -1;
        try {
            for (int i = 0; i < 25; i++) {
                if (input.get(i) == "" || input.get(i) == null)
                    return -1;
                temp = (String) input.get(i);
                if (temp.equals("Y"))
                    count++;
            }
            per = count * 4;
            count = 0;
            return per;
        } catch (Exception e) {
            throw new IOException("Caught exception processing input row ", e);
        }
    }
}
UDF2 :
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class status extends EvalFunc<String> {
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        try {
            String str = (String) input.get(0);
            int i = Integer.parseInt(str);
            if (i >= 60)
                return "HIGH";
            else if (i <= 40)
                return "LOW";
            else
                return "MEDIUM";
        } catch (Exception e) {
            throw new IOException("Caught exception processing input row ", e);
        }
    }
}
Dataset :
https://onedrive.live.com/redir?resid=7F81451078F4DBE8%21113
ERROR :
Pig Stack Trace
ERROR 2078: Caught error from UDF: status [Caught exception processing input row ]
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias STATUS. Backend error : Caught error from UDF: status [Caught exception processing input row ]
at org.apache.pig.PigServer.openIterator(PigServer.java:828)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:538)
at org.apache.pig.Main.main(Main.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2078: Caught error from UDF: status [Caught exception processing input row ]
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:365)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:434)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.getNext(PhysicalOperator.java:340)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(POForEach.java:372)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(POForEach.java:297)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:283)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:278)
It appears that your problem may be that you are casting your input to a String in your status UDF. Your fm UDF actually returns an Integer. So instead you should have:
Integer i = (Integer)input.get(0);
This definitely will cause a problem unless you fix it. Without the original error message I can't say whether or not there is some other problem that occurs earlier.
I would have expected your stack trace to include the original exception message, which would help you debug this issue. Strange that it doesn't. Without it all you have to go off of is analyzing the code.
This might help with debugging in the future:
throw new IOException("Caught exception processing input row " + e.getMessage(), e);
For the fm UDF, I also recommend making the variables temp, per, and count local to the exec method instead of fields of the class, because they don't need to be. This probably won't cause an error, but it is better coding practice.
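Putting those suggestions together, a minimal sketch of a corrected status UDF (still assuming fm returns an Integer) could look like this:

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class status extends EvalFunc<String> {
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        try {
            // fm returns an Integer, so read it as an Integer instead of casting to String
            Integer i = (Integer) input.get(0);
            if (i == null)
                return null;
            if (i >= 60)
                return "HIGH";
            else if (i <= 40)
                return "LOW";
            else
                return "MEDIUM";
        } catch (Exception e) {
            // keep the original message so the backend error is easier to diagnose
            throw new IOException("Caught exception processing input row: " + e.getMessage(), e);
        }
    }
}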

GlassFish 3.1.2 - ResultSetWrapper40 cannot be cast to oracle.jdbc.OracleResultSet

I recently migrated from GlassFish 3.1.1 to 3.1.2, and I got the following error
java.lang.ClassCastException: com.sun.gjc.spi.jdbc40.ResultSetWrapper40 cannot be cast to oracle.jdbc.OracleResultSet
at the line
oracle.sql.BLOB bfile = ((OracleResultSet) rs).getBLOB("filename");
in the following routine:
public void fetchPdf(int matricola, String anno, String mese, String tableType, ServletOutputStream os) {
    byte[] buffer = new byte[2048];
    String query = "SELECT filename FROM "
            + tableType + " where matricola = " + matricola
            + " and anno = " + anno
            + ((tableType.equals("gf_blob_ced") || tableType.equals("gf_blob_car")) ? " and mese = " + mese : "");
    InputStream ins = null;
    //--------
    try {
        Connection conn = dataSource.getConnection();
        //Connection conn = DriverManager.getConnection(connection, "glassfish", pwd);
        java.sql.Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(query);
        if (rs.next()) {
            logger.info("select ok " + query);
            oracle.sql.BLOB bfile = ((OracleResultSet) rs).getBLOB("filename");
            ins = bfile.getBinaryStream();
            int length;
            while ((length = (ins.read(buffer))) >= 0) {
                os.write(buffer, 0, length);
            }
            ins.close();
        } else {
            logger.info("select Nok " + query);
        }
        rs.close();
        stmt.close();
        //conn.close();
    } catch (IOException ex) {
        logger.warn("blob file non raggiungibile: " + query); // "blob file unreachable"
    } catch (SQLException ex) {
        logger.warn("connessione non riuscita"); // "connection failed"
    }
}
I'm using the GlassFish connection pool:
@Resource(name = "jdbc/ape4")
private DataSource dataSource;
and the jdbc/ape4 resource belongs to an Oracle connection pool with the following parameters:
NetworkProtocol tcp
LoginTimeout 0
PortNumber 1521
Password xxxxxxxx
MaxStatements 0
ServerName server
DataSourceName OracleConnectionPoolDataSource
URL jdbc:oracle:thin:@server:1521:APE4
User glassfish
ExplicitCachingEnabled false
DatabaseName APE4
ImplicitCachingEnabled false
The oracle driver is ojdbc6.jar, oracle DB is 10g.
Could anyone help me understand what is happening? On GlassFish 3.1.1 it was working fine.
There is no need to use a non-standard JDBC API in this code. You are not using any Oracle-specific functionality, so rs.getBlob("filename").getBinaryStream() will work just as well.
If you insist on keeping the cast, turn off the JDBC object wrapping option for your datasource.
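A minimal sketch of the relevant part of fetchPdf rewritten against the plain JDBC API (no cast to OracleResultSet), which works unchanged behind GlassFish's wrapped result sets:

ResultSet rs = stmt.executeQuery(query);
if (rs.next()) {
    logger.info("select ok " + query);
    // java.sql.Blob instead of oracle.sql.BLOB, so no driver-specific cast is needed
    java.sql.Blob blob = rs.getBlob("filename");
    InputStream ins = blob.getBinaryStream();
    int length;
    while ((length = ins.read(buffer)) >= 0) {
        os.write(buffer, 0, length);
    }
    ins.close();
}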
