How to use the same CQ in two different threads - chronicle

Let's say I have two threads that want to communicate using the same Chronicle Queue.
writer:
SingleChronicleQueue queue_en = SingleChronicleQueueBuilder.binary(path).build();
ExcerptAppender appender = queue_en.acquireAppender();
....
reader:
SingleChronicleQueue queue_en = SingleChronicleQueueBuilder.binary(path).build();
ExcerptTailer tailer = queue_en.createTailer();
...
Now when I start my application I get:
[Thread-1] n.o.c.q.i.t.SingleTableStore Failed to acquire a lock on the table store file. Retrying
Exception in thread "Thread-1" java.lang.NoSuchMethodError: net.openhft.chronicle.core.util.Time.sleep(JLjava/util/concurrent/TimeUnit;)V
at net.openhft.chronicle.queue.impl.table.SingleTableStore.doWithExclusiveLock(SingleTableStore.java:153)
at net.openhft.chronicle.queue.impl.table.SingleTableBuilder.build(SingleTableBuilder.java:111)
at net.openhft.chronicle.queue.impl.single.SingleChronicleQueueBuilder.initializeMetadata(SingleChronicleQueueBuilder.java:430)
at net.openhft.chronicle.queue.impl.single.SingleChronicleQueueBuilder.preBuild(SingleChronicleQueueBuilder.java:995)
at net.openhft.chronicle.queue.impl.single.SingleChronicleQueueBuilder.build(SingleChronicleQueueBuilder.java:328)
at
....
May I ask how to do this correctly?

This means you are using a version of Chronicle Core that is incompatible with your version of Chronicle Queue.
I suggest you use the chronicle-bom to ensure all the versions have been tested together.
Here is an example from
https://github.com/OpenHFT/Chronicle-Queue-Demo/tree/master/order-processor
<dependencyManagement>
<dependencies>
<dependency>
<groupId>net.openhft</groupId>
<artifactId>third-party-bom</artifactId>
<type>pom</type>
<version>3.6.15</version>
<scope>import</scope>
</dependency>
<dependency>
<groupId>net.openhft</groupId>
<artifactId>chronicle-bom</artifactId>
<version>2.17.482</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>net.openhft</groupId>
<artifactId>chronicle-queue</artifactId>
</dependency>
</dependencies>
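Once the versions are aligned, the usual pattern for two threads in the same process is to build the queue once and give each thread its own appender or tailer, since appenders and tailers are not meant to be shared across threads. A minimal sketch, assuming a compatible chronicle-queue on the classpath (the class name, queue path and messages are illustrative, not from the original post):

import net.openhft.chronicle.queue.ExcerptAppender;
import net.openhft.chronicle.queue.ExcerptTailer;
import net.openhft.chronicle.queue.impl.single.SingleChronicleQueue;
import net.openhft.chronicle.queue.impl.single.SingleChronicleQueueBuilder;

public class TwoThreadQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        String path = "queue-en"; // illustrative queue directory
        // Build the queue once and share the instance between both threads.
        try (SingleChronicleQueue queue = SingleChronicleQueueBuilder.binary(path).build()) {
            Thread writer = new Thread(() -> {
                // acquire the appender inside the writing thread; appenders are not thread-safe
                ExcerptAppender appender = queue.acquireAppender();
                appender.writeText("hello from writer");
            });
            Thread reader = new Thread(() -> {
                // likewise, create the tailer inside the reading thread
                ExcerptTailer tailer = queue.createTailer();
                String msg;
                while ((msg = tailer.readText()) == null) {
                    Thread.yield(); // nothing published yet
                }
                System.out.println(msg);
            });
            writer.start();
            reader.start();
            writer.join();
            reader.join();
        }
    }
}

Both threads work against the same SingleChronicleQueue instance; only the ExcerptAppender and ExcerptTailer are per-thread.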

Related

SpringBoot Upgrade 2.3.7 to 2.5.4 - Issue with KafkaStreamsAutoConfiguration

These are the dependencies in the pom.xml; spring-cloud.version is 2020.0.3.
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>${spring-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-function-context</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-sleuth-zipkin</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>io.zipkin.brave</groupId>
<artifactId>brave-instrumentation-spring-web</artifactId>
<version>5.6.10</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-function-kotlin</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
...Rest omitted
The application fails to start due to:
Description:
Parameter 0 of method kafkaStreamsFunctionProcessorInvoker in org.springframework.cloud.stream.binder.kafka.streams.function.KafkaStreamsFunctionAutoConfiguration required a bean named '&getRestTemplate_registration' that could not be found.
Action:
Consider defining a bean named '&getRestTemplate_registration' in your configuration.
This is the weird part: I do have a bean defined
/**
 * Functional bean which gets / builds the RestTemplate for the given request
 * @return the rest template or null if no configuration exists
 */
@Bean
fun getRestTemplate(): (RestCallRequest) -> RestTemplate? {
    return { restCallRequest -> getRestTemplateNameForUrl(restCallRequest.url)?.let(::getOrBuildRestTemplate) }
}
and it is autowired in another class as shown below:
@Autowired
private lateinit var getRestTemplate: (RestCallRequest) -> RestTemplate?
This works in Spring Boot 2.3.7 but not after the update. Can anyone please give me a hint on how to resolve this?
Why is the bean getting prefixed with an & and suffixed with _registration? I'm not using that term anywhere in the application.
Out of desperation I defined a bean with that name, as suggested. The outcome was:
Consider defining a bean named '&getRestTemplate_registration_registration' in your configuration.
After a lot of debugging, the only workaround we were able to find is:
@SpringBootApplication(exclude = [KotlinLambdaToFunctionAutoConfiguration::class])
A bug report containing a small sample project that reproduces the problem has been filed: https://github.com/spring-cloud/spring-cloud-function/issues/735.

Keycloak admin-client: Unable to find a MessageBodyReader of content-type application/json

I'm new to Keycloak and Spring. I'm trying to use the Keycloak admin-client to create a user in my Spring Boot project like this:
Keycloak kc = Keycloak.getInstance(
"http://localhost:8080/auth",
"master", // the realm to log in to
"admin", "password", // the user
"admin-cli");
CredentialRepresentation credential = new CredentialRepresentation();
credential.setType(CredentialRepresentation.PASSWORD);
credential.setValue("test123");
UserRepresentation user = new UserRepresentation();
user.setUsername("testuser");
user.setFirstName("Test");
user.setLastName("User");
user.setCredentials(Arrays.asList(credential));
kc.realm("master").users().create(user);
but I keep getting this error:
Exception in thread "main" javax.ws.rs.client.ResponseProcessingException: javax.ws.rs.ProcessingException: RESTEASY003145: Unable to find a MessageBodyReader of content-type application/json and type class org.keycloak.representations.AccessTokenResponse
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.extractResult(ClientInvocation.java:158)
at org.jboss.resteasy.client.jaxrs.internal.proxy.extractors.BodyEntityExtractor.extractEntity(BodyEntityExtractor.java:60)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invoke(ClientInvoker.java:107)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientProxy.invoke(ClientProxy.java:76)
at com.sun.proxy.$Proxy18.grantToken(Unknown Source)
at org.keycloak.admin.client.token.TokenManager.grantToken(TokenManager.java:89)
at org.keycloak.admin.client.token.TokenManager.getAccessToken(TokenManager.java:69)
at org.keycloak.admin.client.token.TokenManager.getAccessTokenString(TokenManager.java:64)
at org.keycloak.admin.client.resource.BearerAuthFilter.filter(BearerAuthFilter.java:52)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.invoke(ClientInvocation.java:443)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientInvoker.invoke(ClientInvoker.java:105)
at org.jboss.resteasy.client.jaxrs.internal.proxy.ClientProxy.invoke(ClientProxy.java:76)
at com.sun.proxy.$Proxy26.create(Unknown Source)
at me.phuongtm.KeycloakAdminClientDemoApplication.main(KeycloakAdminClientDemoApplication.java:68)
Caused by: javax.ws.rs.ProcessingException: RESTEASY003145: Unable to find a MessageBodyReader of content-type application/json and type class org.keycloak.representations.AccessTokenResponse
at org.jboss.resteasy.core.interception.jaxrs.ClientReaderInterceptorContext.throwReaderNotFound(ClientReaderInterceptorContext.java:42)
at org.jboss.resteasy.core.interception.jaxrs.AbstractReaderInterceptorContext.getReader(AbstractReaderInterceptorContext.java:80)
at org.jboss.resteasy.core.interception.jaxrs.AbstractReaderInterceptorContext.proceed(AbstractReaderInterceptorContext.java:53)
at org.jboss.resteasy.client.jaxrs.internal.ClientResponse.readFrom(ClientResponse.java:266)
at org.jboss.resteasy.client.jaxrs.internal.ClientResponse.readEntity(ClientResponse.java:196)
at org.jboss.resteasy.specimpl.BuiltResponse.readEntity(BuiltResponse.java:212)
at org.jboss.resteasy.client.jaxrs.internal.ClientInvocation.extractResult(ClientInvocation.java:122)
... 13 more
It seems to be a problem with RESTEasy, so I added resteasy-jackson2-provider as suggested by the Keycloak docs, but no luck.
Here are my pom.xml dependencies:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.keycloak</groupId>
<artifactId>keycloak-admin-client</artifactId>
<version>3.2.0.Final</version>
</dependency>
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>javax.ws.rs-api</artifactId>
<version>2.0</version>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-client</artifactId>
<version>3.1.4.Final</version>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-multipart-provider</artifactId>
<version>3.1.4.Final</version>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-jackson2-provider</artifactId>
<version>3.1.4.Final</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
I spent hours and hours trying to find an answer; I even tried MatteoM's solution:
https://stackoverflow.com/a/40462534/7441720
But no luck.
I appreciate all the help I can get.
UPDATE:
It appears to be an issue with providers:
10:06:45.027 [main] WARN org.jboss.resteasy.resteasy_jaxrs.i18n -
RESTEASY002145: NoClassDefFoundError: Unable to load builtin provider
org.jboss.resteasy.plugins.providers.jackson.ResteasyJackson2Provider from
jar:file:/C:/Users/hp/.m2/repository/org/jboss/resteasy/resteasy-jackson2-provider/3.1.4.Final/resteasy-jackson2-provider-3.1.4.Final.jar!/META-INF/services/javax.ws.rs.ext.Providers
I can't seem to find a solution. I checked the jar, and the class ResteasyJackson2Provider exists!
UPDATE:
From what I can tell, it was an issue with dependencies. I switched to Gradle (but it should work for Maven as well); my new dependencies are:
compile('org.keycloak:keycloak-admin-client:3.2.1.Final')
compile('org.jboss.resteasy:resteasy-client:3.0.14.Final')
compile('org.jboss.resteasy:resteasy-multipart-provider:3.0.14.Final')
compile('org.jboss.resteasy:resteasy-jackson2-provider:3.0.14.Final')
compile('org.jboss.resteasy:resteasy-jaxb-provider:3.0.14.Final')
compile('org.jboss.resteasy:resteasy-jettison-provider:3.0.14.Final')
compile('org.jboss.spec.javax.ws.rs:jboss-jaxrs-api_2.0_spec:1.0.0.Final')
As for Keycloak, I created a configuration class with a bean:
@Bean
public Keycloak keycloakBeanConfig() {
    return KeycloakBuilder.builder().serverUrl(keycloakUrl).realm(realm)
            .grantType(OAuth2Constants.CLIENT_CREDENTIALS).clientId(clientId)
            .clientSecret(clientSecret)
            .build();
}
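The bean above assumes keycloakUrl, realm, clientId, and clientSecret are available as fields. A minimal sketch of how such a configuration class could wire them from application properties (the property names are illustrative, not from the original post):

import org.keycloak.OAuth2Constants;
import org.keycloak.admin.client.Keycloak;
import org.keycloak.admin.client.KeycloakBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KeycloakAdminConfig {

    // Illustrative property names; adjust to whatever your application properties use.
    @Value("${keycloak.server-url}")
    private String keycloakUrl;

    @Value("${keycloak.realm}")
    private String realm;

    @Value("${keycloak.client-id}")
    private String clientId;

    @Value("${keycloak.client-secret}")
    private String clientSecret;

    @Bean
    public Keycloak keycloak() {
        return KeycloakBuilder.builder()
                .serverUrl(keycloakUrl)
                .realm(realm)
                .grantType(OAuth2Constants.CLIENT_CREDENTIALS)
                .clientId(clientId)
                .clientSecret(clientSecret)
                .build();
    }
}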
Try to use resteasy-jackson-provider and remove resteasy-jackson2-provider. You can find the dependency below:
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-jackson-provider</artifactId>
<version>3.1.3.Final</version>
</dependency>

java.lang.NoClassDefFoundError: Could not initialize class com.datastax.driver.core.Cluster

I have a Maven project A.
pom.xml for project A:
<dependencies>
<dependency>
<groupId>com.app.cops</groupId>
<artifactId>cassandra-logging</artifactId>
<version>0.0.1-SNAPSHOT</version>
</dependency>
</dependencies>
pom.xml of the project cassandra-logging:
<dependencies>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
<version>3.9.0.Final</version>
</dependency>
<dependency>
<groupId>com.codahale.metrics</groupId>
<artifactId>metrics-core</artifactId>
<version>3.0.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.3.1</version>
</dependency>
</dependencies>
The cops-logging project has the following code:
CassandraCopsComponentLogger.instance = new CassandraCopsComponentLogger();
String hosts = CassandraClientUtil.getHost();
String localDC = CassandraClientUtil.getLocalDC();
Cluster cluster;
if (StringUtils.isNotEmpty(localDC))
{
cluster = Cluster.builder().addContactPoints(hosts.split(","))
.withCredentials(CassandraCopsComponentLogger.USER_NAME, CassandraCopsComponentLogger.AUTH_CODE)
.withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE))
.withLoadBalancingPolicy(new TokenAwarePolicy(DCAwareRoundRobinPolicy.builder().withLocalDc(localDC).build())).build();
}
else
{
cluster = Cluster.builder().addContactPoints(hosts.split(","))
.withCredentials(CassandraCopsComponentLogger.USER_NAME, CassandraCopsComponentLogger.AUTH_CODE)
.withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE)).build();
}
Session session = cluster.connect();
CassandraCopsComponentLogger.mappingManager = new MappingManager(session);
I keep getting an exception on the following line:
cluster = Cluster.builder().addContactPoints(hosts.split(","))
I have a unit test in the cassandra-logging project which works fine with this code. But when I call the same code from project A I get:
java.lang.NoClassDefFoundError: Could not initialize class com.datastax.driver.core.Cluster
I was able to resolve this by downgrading the Cassandra driver version to <version>2.1.9</version>. I'm not sure why that helped, but I was able to move forward.
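As a side note (not from the original answers): "Could not initialize class" usually means the class's static initializer already failed once, often because of a conflicting Guava or Netty version pulled in by project A. A small sketch that forces the class to load in isolation so the original error surfaces:

public class ClusterInitCheck {
    public static void main(String[] args) {
        try {
            // Forces static initialization; the first failure shows up as an
            // ExceptionInInitializerError with the real root cause attached.
            Class.forName("com.datastax.driver.core.Cluster");
            System.out.println("Cluster initialized fine");
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}

Running this from project A's classpath (or checking mvn dependency:tree) should point at the dependency that actually conflicts.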
Update the version of cassandra-driver-core and cassandra-driver-mapping to 3.3.0 - this worked for me:
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.3.0</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>3.3.0</version>
</dependency>
I ran into a similar issue. The error I got was:
Factory method 'sqlSession' threw exception;
nested exception is java.lang.NoClassDefFoundError: com/datastax/oss/protocol/internal/SegmentCodec
relevant pom.xml segment:
<properties>
..
<oss-java-driver.version>4.9.0</oss-java-driver.version>
..
</properties>
..
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>java-driver-core</artifactId>
<version>${oss-java-driver.version}</version>
</dependency>
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>java-driver-query-builder</artifactId>
<version>${oss-java-driver.version}</version>
</dependency>
Solution that worked for me: I had to explicitly declare a dependency on native-protocol to override its version.
<properties>
..
<cassandra-native-protocol.version>1.4.11</cassandra-native-protocol.version>
..
</properties>
..
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>native-protocol</artifactId>
<version>${cassandra-native-protocol.version}</version>
</dependency>
Reference: https://community.datastax.com/answers/8185/view.html

Pointing HiveServer2 to MiniMRCluster for Hive Testing

I've been wanting to do Hive integration testing for some of the code that I've been developing. The two major requirements for the testing framework I need are:
1. It needs to work with a Cloudera version of Hive and Hadoop (preferably 2.0.0-cdh4.7.0).
2. It needs to be all local, meaning the Hadoop cluster and Hive server should start at the beginning of the test, run a few queries, and tear down after the test is over.
So I broke this problem down into three parts:
1. Getting code for the HiveServer2 part (I decided to use a JDBC connector over a Thrift service client)
2. Getting code for building an in-memory MapReduce cluster (I decided to use MiniMRCluster for this)
3. Setting up both (1) and (2) above to work with each other.
I was able to get (1) out of the way by looking at many resources. Some that were very useful are:
Cloudera Hadoop Google User Group
Hive JDBC Client Wiki
For (2), I followed this excellent post in StackOverflow:
Integration Testing Hive Jobs
So far, so good. At this point, after including both of the above, the pom.xml in my Maven project looks something like this:
<repositories>
<repository>
<id>cloudera</id>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.1</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<!-- START: dependencies for getting MiniMRCluster to work -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-test</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<!-- END: dependencies for getting MiniMRCluster to work -->
<!-- START: dependencies for getting Hive JDBC to work -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-builtins</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-cli</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-metastore</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-serde</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-common</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.thrift</groupId>
<artifactId>libfb303</artifactId>
<version>0.9.1</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.15</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr-runtime</artifactId>
<version>3.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.10.1.1</version>
</dependency>
<dependency>
<groupId>javax.jdo</groupId>
<artifactId>jdo2-api</artifactId>
<version>2.3-ec</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox</artifactId>
<version>1.1.9-1</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox-rdbms</artifactId>
<version>1.2.0-beta-5</version>
</dependency>
<!-- END: dependencies for getting Hive JDBC to work -->
</dependencies>
Now I'm on step (3). I tried running the following code:
@Test
public void testHiveMiniDFSClusterIntegration() throws IOException, SQLException {
Configuration conf = new Configuration();
/* Build MiniDFSCluster */
MiniDFSCluster miniDFS = new MiniDFSCluster.Builder(conf).build();
/* Build MiniMR Cluster */
System.setProperty("hadoop.log.dir", "/Users/nishantkelkar/IdeaProjects/" +
"nkelkar-incubator/hive-test/target/hive/logs");
int numTaskTrackers = 1;
int numTaskTrackerDirectories = 1;
String[] racks = null;
String[] hosts = null;
MiniMRCluster miniMR = new MiniMRCluster(numTaskTrackers, miniDFS.getFileSystem().getUri().toString(),
numTaskTrackerDirectories, racks, hosts, new JobConf(conf));
System.setProperty("mapred.job.tracker", miniMR.createJobConf(
new JobConf(conf)).get("mapred.job.tracker"));
try {
String driverName = "org.apache.hive.jdbc.HiveDriver";
Class.forName(driverName);
} catch (ClassNotFoundException e) {
e.printStackTrace();
System.exit(1);
}
Connection hiveConnection = DriverManager.getConnection(
"jdbc:hive2:///", "", "");
Statement stm = hiveConnection.createStatement();
// now create test tables and query them
stm.execute("set hive.support.concurrency = false");
stm.execute("drop table if exists test");
stm.execute("create table if not exists test(a int, b int) row format delimited fields terminated by ' '");
stm.execute("create table dual as select 1 as one from test");
stm.execute("insert into table test select stack(1,4,5) AS (a,b) from dual");
stm.execute("select * from test");
}
My hope was that (3) would be solved by the following line of code from the above method:
Connection hiveConnection = DriverManager.getConnection(
"jdbc:hive2:///", "", "");
However, I'm getting the following error:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:161)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:150)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:207)
at com.ask.nkelkar.hive.HiveUnitTest.testHiveMiniDFSClusterIntegration(HiveUnitTest.java:54)
Can anyone please let me know what I need to do in addition, or what I'm doing wrong, to get this to work?
P.S. I looked at HiveRunner and hive_test projects as options, but I wasn't able to get these to work with Cloudera versions of Hadoop.
Your test is failing at the first create table statement. Hive is unhelpfully suppressing the following error message:
file:/user/hive/warehouse/test is not a directory or unable to create one
Hive is attempting to use the default warehouse directory /user/hive/warehouse which doesn't exist on your filesystem. You could create the directory, but for testing you'll likely want to override the default value. For example:
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars;
...
System.setProperty(ConfVars.METASTOREWAREHOUSE.toString(), "/Users/nishantkelkar/IdeaProjects/" +
"nkelkar-incubator/hive-test/target/hive/warehouse");

Arquillian ShrinkWrap

I'm having trouble creating a JUnit test using a pom.xml dependency.
The tests are being run with Arquillian:
@RunWith(Arquillian.class)
In this method
@Deployment
public static JavaArchive createDeployment() {
First, I create a JavaArchive with the packages of the project I'm testing:
JavaArchive merge = ShrinkWrap.create(JavaArchive.class).
addPackages(true,
"migrazioneGeaPersistenzaTampone",
"migrazioneGeaPersistenza",
"it.**.mistral.importGEA4.task",
"migrazioneGeaPersistenzaAccess"
).
addClasses(java.sql.Connection.class)
.addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
Then when I launch the test, there are some missing dependencies:
Unable to resolve any beans for Types: [class it.**.**.be.service.EnvironmentRootService]
This class is present in this dependency:
<dependency>
<groupId>it.**.mistral</groupId>
<artifactId>mistral-be</artifactId>
<version>0.1.0</version>
<scope>compile</scope>
</dependency>
I have tried lots of different things to add these dependencies; the best one seems to be using the ShrinkWrap Resolvers (https://github.com/shrinkwrap/resolver/blob/master/README.asciidoc, particularly this paragraph: https://github.com/shrinkwrap/resolver/blob/master/README.asciidoc#resolution-of-artifacts-defined-in-pom-files).
a)
JavaArchive[] archives = Maven.resolver().loadPomFromFile(path_to_pom_file).
importDependencies(ScopeType.TEST,ScopeType.COMPILE).
resolve().withTransitivity().as(JavaArchive.class);
or
Maven.resolver().loadPomFromFile("/path/to/pom.xml").importRuntimeDependencies()
.resolve().withTransitivity().asFile();
The dependency is ignored either way (am I missing something?).
b)
JavaArchive[] mistral_be = Maven.configureResolver().workOffline().
resolve("it.**.mistral:mistral-be:0.1.0").withTransitivity().as(JavaArchive.class);
for (int i = 0; i < mistral_be.length ; i++) {
merge = merge.merge(mistral_be[i]);
}
With a simple
System.out.println(merge.toString(true));
I can see that all the files from the dependencies are present! (Anyway, I use the local repository via workOffline().)
Maybe I am missing some dependencies?
But exiting the method with a
return merge;
throws a "java.lang.NoClassDefFoundError: com/google/protobuf/GeneratedMessage....Caused by: java.lang.ClassNotFoundException: com.google.protobuf.GeneratedMessage" error.
Then again I tried to add the missing dependencies:
// Trying to add protobuf dependencies
JavaArchive[] prto_buf = Maven.configureResolver().withMavenCentralRepo(true).resolve("com.google.protobuf:protobuf-java:2.3.0").withTransitivity().as(JavaArchive.class);
for (int i = 0; i < prto_buf.length ; i++) {
projectPackages = projectPackages.merge(prto_buf[i]);
}
It throws the same exception...
Again, with a simple
System.out.println(merge.toString(true));
I can see that com.google.protobuf.GeneratedMessage is present.
Extract of dependencies:
<dependency>
<groupId>org.jboss.spec</groupId>
<artifactId>jboss-javaee-6.0</artifactId>
<version>1.0.0.Final</version>
<type>pom</type>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.junit</groupId>
<artifactId>arquillian-junit-container</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.container</groupId>
<artifactId>arquillian-weld-ee-embedded-1.1</artifactId>
<version>1.0.0.CR3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.shrinkwrap.resolver</groupId>
<artifactId>shrinkwrap-resolver-depchain</artifactId>
<version>2.1.0</version>
<scope>test</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.jboss.arquillian</groupId>
<artifactId>arquillian-bom</artifactId>
<version>1.1.5.Final</version>
<scope>import</scope>
<type>pom</type>
</dependency>
</dependencies>
</dependencyManagement>
Can anybody help me solve this issue?
This smells to me like a classloading issue, especially considering that you are using an embedded container, which basically takes everything from your project classpath.
What is the target container you are deploying your application to? If you try running your test there instead of in the embedded container, you will at least see whether that is really the problem, or whether there is still something wrong with your deployment artifact.
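If it helps as a further experiment: a common alternative to merging the resolved JARs into the JavaArchive is to deploy a WebArchive and attach the resolved dependencies as libraries, so each dependency keeps its own JAR. A rough sketch, reusing the resolver call from the question (the pom path and package names are illustrative, and whether it changes anything still depends on the container, as noted above):

import java.io.File;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.jboss.shrinkwrap.resolver.api.maven.Maven;
import org.jboss.shrinkwrap.resolver.api.maven.ScopeType;

public class DeploymentSketch {

    @Deployment
    public static WebArchive createDeployment() {
        // Resolve the project's dependencies (including mistral-be and its
        // transitive protobuf dependency) straight from the pom.
        File[] libs = Maven.resolver().loadPomFromFile("pom.xml")
                .importDependencies(ScopeType.COMPILE, ScopeType.TEST)
                .resolve().withTransitivity().asFile();

        return ShrinkWrap.create(WebArchive.class, "test.war")
                .addPackages(true, "migrazioneGeaPersistenza")
                .addAsLibraries(libs)
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }
}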
