Pointing HiveServer2 to MiniMRCluster for Hive Testing - jdbc

I've been wanting to do Hive integration testing for some of the code that I've been developing. I have two major requirements for the testing framework:
1. It needs to work with a Cloudera version of Hive and Hadoop (preferably 2.0.0-cdh4.7.0).
2. It needs to be entirely local, meaning the Hadoop cluster and Hive server should start at the beginning of the test, run a few queries, and tear down after the test is over.
So I broke this problem down into three parts:
1. Getting code for the HiveServer2 part (I decided to use a JDBC connector over a Thrift service client)
2. Getting code for building an in-memory MapReduce cluster (I decided to use MiniMRCluster for this)
3. Setting up both (1) and (2) above to work with each other
I was able to get (1) out of the way by looking at many resources. Some that were particularly useful:
Cloudera Hadoop Google User Group
Hive JDBC Client Wiki
For (2), I followed this excellent post in StackOverflow:
Integration Testing Hive Jobs
So far, so good. At this point, with both of the above pieces included, the pom.xml of my Maven project looks something like this:
<repositories>
<repository>
<id>cloudera</id>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.1</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<!-- START: dependencies for getting MiniMRCluster to work -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-test</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<!-- END: dependencies for getting MiniMRCluster to work -->
<!-- START: dependencies for getting Hive JDBC to work -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-builtins</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-cli</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-metastore</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-serde</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-common</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.thrift</groupId>
<artifactId>libfb303</artifactId>
<version>0.9.1</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.15</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr-runtime</artifactId>
<version>3.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.10.1.1</version>
</dependency>
<dependency>
<groupId>javax.jdo</groupId>
<artifactId>jdo2-api</artifactId>
<version>2.3-ec</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox</artifactId>
<version>1.1.9-1</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox-rdbms</artifactId>
<version>1.2.0-beta-5</version>
</dependency>
<!-- END: dependencies for getting Hive JDBC to work -->
</dependencies>
Now I'm on step (3). I tried running the following code:
@Test
public void testHiveMiniDFSClusterIntegration() throws IOException, SQLException {
Configuration conf = new Configuration();
/* Build MiniDFSCluster */
MiniDFSCluster miniDFS = new MiniDFSCluster.Builder(conf).build();
/* Build MiniMR Cluster */
System.setProperty("hadoop.log.dir", "/Users/nishantkelkar/IdeaProjects/" +
"nkelkar-incubator/hive-test/target/hive/logs");
int numTaskTrackers = 1;
int numTaskTrackerDirectories = 1;
String[] racks = null;
String[] hosts = null;
MiniMRCluster miniMR = new MiniMRCluster(numTaskTrackers, miniDFS.getFileSystem().getUri().toString(),
numTaskTrackerDirectories, racks, hosts, new JobConf(conf));
System.setProperty("mapred.job.tracker", miniMR.createJobConf(
new JobConf(conf)).get("mapred.job.tracker"));
try {
String driverName = "org.apache.hive.jdbc.HiveDriver";
Class.forName(driverName);
} catch (ClassNotFoundException e) {
e.printStackTrace();
System.exit(1);
}
Connection hiveConnection = DriverManager.getConnection(
"jdbc:hive2:///", "", "");
Statement stm = hiveConnection.createStatement();
// now create test tables and query them
stm.execute("set hive.support.concurrency = false");
stm.execute("drop table if exists test");
stm.execute("create table if not exists test(a int, b int) row format delimited fields terminated by ' '");
stm.execute("create table dual as select 1 as one from test");
stm.execute("insert into table test select stack(1,4,5) AS (a,b) from dual");
stm.execute("select * from test");
}
My hope was that (3) would be solved by the following line of code from the above method:
Connection hiveConnection = DriverManager.getConnection(
"jdbc:hive2:///", "", "");
However, I'm getting the following error:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:161)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:150)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:207)
at com.ask.nkelkar.hive.HiveUnitTest.testHiveMiniDFSClusterIntegration(HiveUnitTest.java:54)
Can anyone please let me know what I need to do in addition/what I'm doing wrong to get this to work?
P.S. I looked at HiveRunner and hive_test projects as options, but I wasn't able to get these to work with Cloudera versions of Hadoop.

Your test is failing at the first create table statement. Hive is unhelpfully suppressing the following error message:
file:/user/hive/warehouse/test is not a directory or unable to create one
Hive is attempting to use the default warehouse directory /user/hive/warehouse which doesn't exist on your filesystem. You could create the directory, but for testing you'll likely want to override the default value. For example:
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars;
...
System.setProperty(ConfVars.METASTOREWAREHOUSE.toString(), "/Users/nishantkelkar/IdeaProjects/" +
"nkelkar-incubator/hive-test/target/hive/warehouse");

Related

import cucumber.api.DataTable; cannot be resolved

I am trying to automate DataTables in Cucumber and have written the appropriate feature and step definition for it.
Eclipse suggests importing io.cucumber.datatable.DataTable, and when I use the raw() method, Eclipse throws an error saying "The method raw() is undefined for the type DataTable".
Feature : Then user enters username and password
| mngr193115 | edytadA |
Step Definition :
@Then("^user enters username and password$")
public void user_enters_username_and_password(DataTable credentials) {
//driver.findElement(By.linkText("ACCOUNT")).click();
List<List<String>> data = credentials.raw();
driver.findElement(By.xpath("//input[@type='text']")).sendKeys();
driver.findElement(By.name("password")).sendKeys(password);
}
Below is my pom.xml file:
<dependencies>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-java -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-jvm -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-jvm</artifactId>
<version>4.3.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-junit -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-jvm-deps -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-jvm-deps</artifactId>
<version>1.0.6</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/net.masterthought/cucumber-reporting -->
<dependency>
<groupId>net.masterthought</groupId>
<artifactId>cucumber-reporting</artifactId>
<version>4.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-core -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>3.0.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/gherkin -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>gherkin</artifactId>
<version>5.1.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.141.59</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-picocontainer -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
Expected: to resolve the import issues and import cucumber.api.DataTable;
Actual: Eclipse suggests importing io.cucumber.datatable.DataTable; for DataTable, and when I import it, I am not able to use the raw() method.
Main point: people have been facing a few errors (mentioned below) because they mix direct and transitive dependencies. So do not mix direct and transitive dependencies, especially their versions! Doing so can cause unpredictable outcomes.
The import cucumber.api.junit cannot be resolved
java.lang.NoClassDefFoundError: gherkin/IGherkinDialectProvider
import cucumber.api.DataTable; cannot be resolved
Solution: please remove cucumber-java, cucumber-core, cucumber-jvm-deps, gherkin, and junit. They are transitive dependencies and will be provided by your remaining dependencies. You can add the minimal set of Cucumber dependencies below.
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
If you are using the io.cucumber import instead of cucumber.api, then use the cells() method, which is the io.cucumber alternative to raw().
Example:
List<List<String>> testData = data.cells();
System.out.println(testData.get(0).get(0)); // displays the first element of the dataTable (row 0, column 0)
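Putting the dependency and import advice together, a rough sketch of how the step definition from the question could look with io.cucumber.datatable.DataTable and cells(); the driver field and the XPath/name selectors are carried over from the question, the row-0 indexing simply mirrors the single-row table in the feature, and the cucumber.api.java.en.Then import assumes the classic annotation package that cucumber-java 4.x still ships:
import java.util.List;
import org.openqa.selenium.By;
import cucumber.api.java.en.Then;
import io.cucumber.datatable.DataTable;

@Then("^user enters username and password$")
public void user_enters_username_and_password(DataTable credentials) {
    // cells() is the io.cucumber replacement for the old raw() method
    List<List<String>> data = credentials.cells();
    driver.findElement(By.xpath("//input[@type='text']")).sendKeys(data.get(0).get(0)); // username
    driver.findElement(By.name("password")).sendKeys(data.get(0).get(1)); // password
}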
Option I:
https://mvnrepository.com/artifact/io.cucumber/datatable-dependencies/1.1.12
https://mvnrepository.com/artifact/io.cucumber/datatable/1.0.3
<!-- https://mvnrepository.com/artifact/io.cucumber/datatable-dependencies -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable-dependencies</artifactId>
<version>1.1.12</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/datatable -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable</artifactId>
<version>1.0.3</version>
</dependency>
Option II:
Try updating the cucumber-core and cucumber-java dependencies to the latest versions:
https://mvnrepository.com/artifact/io.cucumber/cucumber-core/4.3.1
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>4.3.1</version>
</dependency>
https://mvnrepository.com/artifact/io.cucumber/cucumber-java
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-java -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.3.1</version>
</dependency>
After that, do a Maven reimport.
Hope this helps.
I also had an issue while using raw(), so I changed it to cells() instead, which worked fine:
@And("^I enter following for login$")
public void iEnterFollowingForLogin(DataTable table) throws Throwable{
List<List<String>> data = table.cells();
System.out.println("Username: "+data.get(1).get(0));
System.out.println("Password: "+data.get(1).get(1));
}
Feature : Then user enters username and password
| mngr193115 | edytadA |
From your question, I notice that the above is the feature file you used.
I agree that you have to add the import for DataTable, but before that, please change the feature file to the following if it isn't like this already:
Feature : To check the UN and Pwd
Scenario : ScenarioName
Then user enters username and password
| mngr193115 | edytadA |

Spring boot mongodb autoconfigure throws exception "Cannot determine embedded database driver class for database type NONE"

I am using Spring Boot to develop a Spring Batch application. The application will eventually need to write its data to MongoDB, and thus needs to configure org.springframework.data.mongodb.core.MongoTemplate for org.springframework.batch.item.data.MongoItemWriter.
My pom.xml dependency section looks like this:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-autoconfigure</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-batch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-test</artifactId>
<version>${spring.batch.version}</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.projectlombok/lombok -->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.16.18</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jongo/jongo -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>${slf4j.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jvnet.jaxb2_commons/jaxb2-basics-runtime -->
<dependency>
<groupId>org.jvnet.jaxb2_commons</groupId>
<artifactId>jaxb2-basics-runtime</artifactId>
<version>1.11.1</version>
</dependency>
<!--<dependency>-->
<!--<groupId>de.flapdoodle.embed</groupId>-->
<!--<artifactId>de.flapdoodle.embed.mongo</artifactId>-->
<!--<version>1.50.5</version>-->
<!--<scope>test</scope>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>cz.jirutka.spring</groupId>-->
<!--<artifactId>embedmongo-spring</artifactId>-->
<!--<version>RELEASE</version>-->
<!--<scope>test</scope>-->
<!--</dependency>-->
<!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.6.0</version>
</dependency>
</dependencies>
The application.properties file looks like this:
spring.data.mongodb.host=mongohost
spring.data.mongodb.port=27017
spring.data.mongodb.authentication-database=authdb
spring.data.mongodb.username=user
spring.data.mongodb.password=pwd
spring.datasource.driver-class-name=<< I don't know what to put here >>
The main class is also simple enough and looks like this:
@SpringBootApplication
public class Main {
public static void main(String[] args) {
SpringApplication.run(Main.class, args);
}
}
Now, whenever I try to run my Main class, it gives this error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Cannot determine embedded database driver class for database type NONE
Action:
If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to activate it (no profiles are currently active).
After researching a lot about this problem, I figured out that I need to let Spring know about my data store by providing the value of spring.datasource.driver-class-name in application.properties
spring.datasource.driver-class-name=com.mongodb.Server
If I provide com.mongodb.Server as the driver class name, it isn't found on the classpath and isn't recognised, even though I have the mongo-java-driver dependency on my classpath.
What value should I put for MongoDB's driver-class-name, given that I want to use mongo-java-driver?
If the driver class name is not the cause of this issue, what is the resolution of the error "Cannot determine embedded database driver class for database type NONE" mentioned in the title of this question?
Try excluding DataSourceAutoConfiguration.class in your main class:
@SpringBootApplication
@EnableAutoConfiguration(exclude={DataSourceAutoConfiguration.class})
public class Main {
public static void main(String[] args) {
SpringApplication.run(Main.class, args);
}
}
Also, you don't need this:
spring.datasource.driver-class-name
unless you need JPA configuration as well.
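As a minor variant, the same exclusion can be expressed directly on @SpringBootApplication, which also supports an exclude attribute; a sketch based on the Main class from the question:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

// keep the MongoDB auto-configuration, but skip the DataSource auto-configuration
@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}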

not found: value FlumeUtils

This is my code for integrating Spark Streaming with Flume:
val conf = new SparkConf()
.setAppName("File Count")
.setMaster("local[2]")
val sc = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(10))
val flumeStream = FlumeUtils.createPollingStream(ssc, "192.168.1.31", 8020)
But I get an error: not found: value FlumeUtils
This is my pom.xml:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.10.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-flume-sink_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.3.2</version>
</dependency>
My Spark version is 1.5.0.
Any help would be appreciated, and thanks in advance.
FlumeUtils is a class under org.apache.spark.streaming.flume. In your pom.xml, the artifact you are trying to import is spark-streaming-flume-sink_2.10, which doesn't contain the FlumeUtils class.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-flume-sink_2.10</artifactId>
<version>1.5.0</version>
</dependency>
Instead import the below artifact:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-flume_2.10</artifactId>
<version>1.5.0</version>
</dependency>
Hope this helps.

java.lang.NoClassDefFoundError: Could not initialize class com.datastax.driver.core.Cluster

I have a Maven project A.
pom.xml for project A:
<dependencies>
<dependency>
<groupId>com.app.cops</groupId>
<artifactId>cassandra-logging</artifactId>
<version>0.0.1-SNAPSHOT</version>
</dependency>
</dependencies>
pom.xml of the project cassandra-logging:
<dependencies>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
<version>3.9.0.Final</version>
</dependency>
<dependency>
<groupId>com.codahale.metrics</groupId>
<artifactId>metrics-core</artifactId>
<version>3.0.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.3.1</version>
</dependency>
</dependencies>
The cops-logging project has the following code:
CassandraCopsComponentLogger.instance = new CassandraCopsComponentLogger();
String hosts = CassandraClientUtil.getHost();
String localDC = CassandraClientUtil.getLocalDC();
Cluster cluster;
if (StringUtils.isNotEmpty(localDC))
{
cluster = Cluster.builder().addContactPoints(hosts.split(","))
.withCredentials(CassandraCopsComponentLogger.USER_NAME, CassandraCopsComponentLogger.AUTH_CODE)
.withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE))
.withLoadBalancingPolicy(new TokenAwarePolicy(DCAwareRoundRobinPolicy.builder().withLocalDc(localDC).build())).build();
}
else
{
cluster = Cluster.builder().addContactPoints(hosts.split(","))
.withCredentials(CassandraCopsComponentLogger.USER_NAME, CassandraCopsComponentLogger.AUTH_CODE)
.withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE)).build();
}
Session session = cluster.connect();
CassandraCopsComponentLogger.mappingManager = new MappingManager(session);
I keep getting an exception on the following line:
cluster = Cluster.builder().addContactPoints(hosts.split(","))
I have a unit test in the cassandra-logging project that works fine with this code. But when I call the same code from project A, I get:
java.lang.NoClassDefFoundError: Could not initialize class com.datastax.driver.core.Cluster
I was able to resolve this by downgrading the Cassandra dependency version to <version>2.1.9</version>. I'm not sure why that helped, but I was able to move forward.
Update the version of cassandra-driver-core and cassandra-driver-mapping to 3.3.0 - this worked for me:
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.3.0</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>3.3.0</version>
</dependency>
I ran into a similar issue. The error I got was:
Factory method 'sqlSession' threw exception;
nested exception is java.lang.NoClassDefFoundError: com/datastax/oss/protocol/internal/SegmentCodec
relevant pom.xml segment:
<properties>
..
<oss-java-driver.version>4.9.0</oss-java-driver.version>
..
</properties>
..
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>java-driver-core</artifactId>
<version>${oss-java-driver.version}</version>
</dependency>
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>java-driver-query-builder</artifactId>
<version>${oss-java-driver.version}</version>
</dependency>
Solution that worked for me: I had to explicitly declare a dependency on native-protocol to override the version of native-protocol.
<properties>
..
<cassandra-native-protocol.version>1.4.11</cassandra-native-protocol.version>
..
</properties>
..
<dependency>
<groupId>com.datastax.oss</groupId>
<artifactId>native-protocol</artifactId>
<version>${cassandra-native-protocol.version}</version>
</dependency>
Reference : https://community.datastax.com/answers/8185/view.html

arquillian @Drone injections always returning "about:blank" page

I have been having this problem for 2 days now, and I am starting to think that something in my configuration is off. I'll post my code first and then explain:
public class MyTest extends Arquillian {
@Deployment(name = "MyPlatform", testable = false)
public static WebArchive createDeployment() {
WebArchive war;
war = ShrinkWrap
.create (WebArchive.class, "MyPlatform.war")
.merge (Maven
.resolver()
.loadPomFromFile("pom.xml")
.resolve("MyPlatform:My.Platform:war:0.0.1-SNAPSHOT")
.withoutTransitivity()
.asSingle(WebArchive.class));
return war;
}
@Drone
private PhantomJSDriver browser;
@ArquillianResource
private URL deploymentUrl;
@Test(dataProvider = Arquillian.ARQUILLIAN_DATA_PROVIDER)
@RunAsClient
public void should_login_successfully(@InitialPage LoginPage loginPage) {
System.out.println ("ACTUAL: " + browser.getCurrentUrl ());
System.out.println ("DEPLOYMENT URL: " + deploymentUrl.toExternalForm ());
loginPage.login ("demo", "demo");
Assert.assertEquals (deploymentUrl.toExternalForm () + "index.tm", "https://127.0.0.1:8443/MyPlatform/index.tm");
}
The @ArquillianResource injection works fine and shows the correct URL. However, the @Drone injection shows "about:blank". After some testing I found something weird:
If my war file is called something like MyPlatform.blabla.war, then Drone truncates after the first dot and I get "http://127.0.0.1:8080/MyPlatform/login.tm", which is not what I deployed... so for some reason @Drone is always truncating my deployment URL and can't seem to find the root of it.
Here is my POM, just in case:
<dependency>
<groupId>org.jboss.shrinkwrap.resolver</groupId>
<artifactId>shrinkwrap-resolver-api-maven</artifactId>
<version>2.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.shrinkwrap</groupId>
<artifactId>shrinkwrap-api</artifactId>
<version>1.2.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.protocol</groupId>
<artifactId>arquillian-protocol-servlet</artifactId>
<version>1.1.2.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.shrinkwrap</groupId>
<artifactId>shrinkwrap-impl-base</artifactId>
<version>1.2.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.shrinkwrap.resolver</groupId>
<artifactId>shrinkwrap-resolver-impl-maven</artifactId>
<version>2.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian</groupId>
<artifactId>arquillian-bom</artifactId>
<version>1.1.2.Final</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.extension</groupId>
<artifactId>arquillian-drone-bom</artifactId>
<version>1.2.0.Final</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.graphene</groupId>
<artifactId>graphene-webdriver</artifactId>
<version>2.0.1.Final</version>
<type>pom</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.graphene</groupId>
<artifactId>graphene-webdriver-spi</artifactId>
<version>2.0.1.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.graphene</groupId>
<artifactId>graphene-webdriver-impl</artifactId>
<version>2.0.1.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.as</groupId>
<artifactId>jboss-as-arquillian-container-remote</artifactId>
<version>7.1.1.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.3.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.testng</groupId>
<artifactId>arquillian-testng-container</artifactId>
<version>1.1.2.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>bsh</groupId>
<artifactId>bsh</artifactId>
<version>2.0b4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.3.1</version>
<scope>test</scope>
</dependency>
I would be grateful if someone could help me solve this pickle!
Typical. After I posted the question I found the problem, and it was simply that my application is running over SSL and PhantomJS is not redirecting from 8080 -> 8443...
Now to figure out how to do this ...
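For anyone stuck on the same step: PhantomJS has command-line switches for relaxing SSL handling, --ignore-ssl-errors=true and --ssl-protocol=any. Below is a rough sketch of passing them through the WebDriver capabilities when constructing the driver manually (assuming the phantomjsdriver/GhostDriver bindings are on the test classpath); with a Drone-injected driver the equivalent settings would have to go into the Drone/webdriver configuration instead, which is not shown here:
import org.openqa.selenium.phantomjs.PhantomJSDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriverService;
import org.openqa.selenium.remote.DesiredCapabilities;

DesiredCapabilities caps = new DesiredCapabilities();
// relax SSL handling so the https://127.0.0.1:8443 deployment is reachable
caps.setCapability(PhantomJSDriverService.PHANTOMJS_CLI_ARGS,
        new String[] { "--ignore-ssl-errors=true", "--ssl-protocol=any" });
PhantomJSDriver browser = new PhantomJSDriver(caps);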
