Eclipse unable to find the raw() method while using a Cucumber DataTable

In my feature file I am using a data table.
Feature File
And verify for incorrect or incomplete Address
|/api/ |
|/api/2020-05-30|
|/api/20200404 |
|/api/abcfghj |
I am using Eclipse, and in the step definition file, when I try to call the raw() method on the DataTable, Eclipse cannot resolve the method. Hovering over DataTable shows that the imported package is io.cucumber.datatable.DataTable.
@And("^verify for incorrect or incomplete Address$")
public void verify_for_incorrect_or_incomplete_url(DataTable address) throws Throwable {
    List<List> data = address.
I am not sure what is missing in my dependencies; please guide.
pom.xml
<dependencies>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>5.5.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable-dependencies</artifactId>
<version>1.1.12</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable</artifactId>
<version>1.0.3</version>
</dependency>
</dependencies>

The raw() method is not available on io.cucumber.datatable.DataTable; it belonged to the older cucumber.api.DataTable, which was removed in Cucumber 5.
So you can instead do something like this:
List<List<String>> data = address.asLists(String.class);
The result will be:
[[/api/], [/api/2020-05-30], [/api/20200404], [/api/abcfghj]]
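For completeness, a minimal step definition sketch using asLists() could look like this; it assumes cucumber-java 5.x is on the classpath for the @And annotation, and the endpoint check is only a placeholder:

import io.cucumber.datatable.DataTable;
import io.cucumber.java.en.And;
import java.util.List;

public class AddressSteps {

    @And("^verify for incorrect or incomplete Address$")
    public void verify_for_incorrect_or_incomplete_url(DataTable address) {
        // Each row of the single-column table becomes a one-element list
        List<List<String>> data = address.asLists(String.class);
        for (List<String> row : data) {
            String endpoint = row.get(0).trim();
            // placeholder: call the service with this endpoint and assert the expected error here
            System.out.println("Checking endpoint: " + endpoint);
        }
    }
}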


import cucumber.api.DataTable; cannot be resolved

I am trying to automate DataTables in Cucumber, and I have written the appropriate feature and step definition for it.
Eclipse suggests importing io.cucumber.datatable.DataTable;, and when I use the raw() method, Eclipse throws an error saying "The method raw() is undefined for the type DataTable".
Feature : Then user enters username and password
| mngr193115 | edytadA |
Step Definition :
@Then("^user enters username and password$")
public void user_enters_username_and_password(DataTable credentials) {
    //driver.findElement(By.linkText("ACCOUNT")).click();
    List<List<String>> data = credentials.raw();
    driver.findElement(By.xpath("//input[@type='text']")).sendKeys();
    driver.findElement(By.name("password")).sendKeys(password);
}
Below is my pom.xml file:
<dependencies>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-java -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-jvm -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-jvm</artifactId>
<version>4.3.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-junit -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-jvm-deps -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-jvm-deps</artifactId>
<version>1.0.6</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/net.masterthought/cucumber-reporting -->
<dependency>
<groupId>net.masterthought</groupId>
<artifactId>cucumber-reporting</artifactId>
<version>4.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-core -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>3.0.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/gherkin -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>gherkin</artifactId>
<version>5.1.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.141.59</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-picocontainer -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
Expected: to resolve the import issues and import cucumber.api.DataTable;
Actual: Eclipse suggests importing io.cucumber.datatable.DataTable; for DataTable, and when I import that, I am not able to use the raw() method.
Main point: people run into a few errors (mentioned below) when they mix direct and transitive dependencies. Do not mix direct and transitive dependencies, especially their versions; doing so can cause unpredictable results.
The import cucumber.api.junit cannot be resolved
java.lang.NoClassDefFoundError: gherkin/IGherkinDialectProvider
import cucumber.api.DataTable; cannot be resolved
Solution: remove cucumber-java, cucumber-core, cucumber-jvm-deps, gherkin and junit. They are transitive dependencies and will be pulled in by your direct dependencies. You can add the minimal set of Cucumber dependencies below.
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>4.3.0</version>
<scope>test</scope>
</dependency>
If you are using the io.cucumber import instead of cucumber.api, use the cells() method, which is the alternative to raw() in the io.cucumber package.
Example:
List<List<String>> testData = data.cells();
System.out.println(testData.get(0).get(0)); // prints the data table element at row 0, column 0
Option I:
https://mvnrepository.com/artifact/io.cucumber/datatable-dependencies/1.1.12
https://mvnrepository.com/artifact/io.cucumber/datatable/1.0.3
<!-- https://mvnrepository.com/artifact/io.cucumber/datatable-dependencies -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable-dependencies</artifactId>
<version>1.1.12</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/datatable -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable</artifactId>
<version>1.0.3</version>
</dependency>
Option II:
Try updating the cucumber-core and cucumber-java dependencies to the latest versions:
https://mvnrepository.com/artifact/io.cucumber/cucumber-core/4.3.1
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>4.3.1</version>
</dependency>
https://mvnrepository.com/artifact/io.cucumber/cucumber-java
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-java -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.3.1</version>
</dependency>
After that, do a Maven reimport.
Hope this helps.
I also had an issue while using raw(), so I changed it to cells(), which worked fine:
@And("^I enter following for login$")
public void iEnterFollowingForLogin(DataTable table) throws Throwable {
    List<List<String>> data = table.cells();
    System.out.println("Username: " + data.get(1).get(0));
    System.out.println("Password: " + data.get(1).get(1));
}
Feature : Then user enters username and password
| mngr193115 | edytadA |
From your question, what I noticed is that the snippet above is the feature file you used.
You do need the DataTable import, but before that, please change the feature file to the following if it is not already structured this way:
Feature: To check the UN and Pwd

  Scenario: ScenarioName
    Then user enters username and password
      | mngr193115 | edytadA |

Spring boot mongodb autoconfigure throws exception "Cannot determine embedded database driver class for database type NONE"

I am using Spring Boot to develop a Spring Batch application. The application needs to write its final output to MongoDB, so I need to configure org.springframework.data.mongodb.core.MongoTemplate for org.springframework.batch.item.data.MongoItemWriter.
My pom.xml dependencies section looks like this:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-autoconfigure</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-batch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-test</artifactId>
<version>${spring.batch.version}</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.projectlombok/lombok -->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.16.18</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jongo/jongo -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>${slf4j.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jvnet.jaxb2_commons/jaxb2-basics-runtime -->
<dependency>
<groupId>org.jvnet.jaxb2_commons</groupId>
<artifactId>jaxb2-basics-runtime</artifactId>
<version>1.11.1</version>
</dependency>
<!--<dependency>-->
<!--<groupId>de.flapdoodle.embed</groupId>-->
<!--<artifactId>de.flapdoodle.embed.mongo</artifactId>-->
<!--<version>1.50.5</version>-->
<!--<scope>test</scope>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>cz.jirutka.spring</groupId>-->
<!--<artifactId>embedmongo-spring</artifactId>-->
<!--<version>RELEASE</version>-->
<!--<scope>test</scope>-->
<!--</dependency>-->
<!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.6.0</version>
</dependency>
</dependencies>
The application.properties file looks like this
spring.data.mongodb.host=mongohost
spring.data.mongodb.port=27017
spring.data.mongodb.authentication-database=authdb
spring.data.mongodb.username=user
spring.data.mongodb.password=pwd
spring.datasource.driver-class-name=<< I don't know what to put here >>
The main class is also simple enough and looks like this:
@SpringBootApplication
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}
Now, whenever I try to run my Main class, it fails with this error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Cannot determine embedded database driver class for database type NONE
Action:
If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to activate it (no profiles are currently active).
After researching a lot about this problem, I figured out that I need to let Spring know about my data store by providing the value of spring.datasource.driver-class-name in application.properties
spring.datasource.driver-class-name=com.mongodb.Server
If I provide com.mongodb.Server as the driver class name, it is not found on the classpath and isn't recognised, even though I have the mongo-java-driver dependency on my classpath.
What value should I put for MongoDB's driver-class-name, given that I want to use mongo-java-driver?
If the driver class name is not the cause of this issue, what is the resolution of the error "Cannot determine embedded database driver class for database type NONE" mentioned in the title of this question?
Try excluding DataSourceAutoConfiguration.class in your main class:
@SpringBootApplication
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}
Also, you don't need this property:
spring.datasource.driver-class-name
unless you need JPA configuration as well.
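Since @SpringBootApplication already implies @EnableAutoConfiguration, a slightly cleaner variant (same effect, sketched here under the assumption that the application has no JDBC DataSource at all) is to pass the exclusion directly:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}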

Maven Module with Spring Boot

I am trying to configure Maven to use Spring Boot with multiple modules; this is my structure:
- Parent
------ WebClient
------------ src/main/java/config ---> Config Files
------------ src/main/java/resources/WEB-INF ---> Template Files
------ Core
------ Services
------ Server
I have a config file where I put my ViewResolver:
@Configuration
@EnableWebMvc
public class MvcConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public ViewResolver setupViewResolver() {
        // View resolver for JSPs under /WEB-INF/
        UrlBasedViewResolver viewResolver = new UrlBasedViewResolver();
        viewResolver.setPrefix("/WEB-INF/");
        viewResolver.setSuffix(".jsp");
        viewResolver.setViewClass(JstlView.class);
        return viewResolver;
    }
}
Here are the modules and dependencies from my parent pom.xml:
<modules>
<module>core</module>
<module>services</module>
<module>server</module>
<module>webclient</module>
</modules>
<dependencyManagement>
<dependencies>
<!-- =========================== Spring ======================= -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>1.3.3.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.ws</groupId>
<artifactId>spring-ws-core</artifactId>
<version>2.2.4.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.session</groupId>
<artifactId>spring-session</artifactId>
<version>1.1.0.RELEASE</version>
</dependency>
<!-- Spring Security and MVC dependences -->
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-web</artifactId>
<version>4.1.0.RC1</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-config</artifactId>
<version>4.1.0.RC1</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>javax.servlet.jsp-api</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
<version>8.5.0</version>
</dependency>
</dependencies>
</dependencyManagement>
In my Server module's pom.xml I have these dependencies:
<dependencies>
<dependency>
<groupId>mygroup</groupId>
<artifactId>core</artifactId>
</dependency>
<dependency>
<groupId>mygroup</groupId>
<artifactId>services</artifactId>
</dependency>
<dependency>
<groupId>mygroup</groupId>
<artifactId>webclient</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.session</groupId>
<artifactId>spring-session</artifactId>
</dependency>
My SpringBootApplication looks like this:
@SpringBootApplication
@ComponentScan({"config"})
public class ServerRunner {
    public static void main(String[] args) {
        SpringApplication.run(ServerRunner.class, args);
    }
}
When I start my application, I see my mapping working and everything seems fine, but the problem is that Spring does not find my templates:
Whitelabel Error Page
This application has no explicit mapping for /error, so you are seeing this as a fallback.
There was an unexpected error (type=Not Found, status=404).
/WEB-INF/index.jsp
What is wrong with my config? Where should I put my template files?
Thanks for your help!
EDIT:
If I put my template files in the Server project (server\src\main\webapp), it works! So... how do I make the server read the webclient project's templates? I need the templates in that submodule.
EDIT 2:
Solved: avoid .jsp files; see the answer and comments.
I think this is related to the JSP limitations described in the Spring Boot reference documentation: you can't just read JSPs from anywhere on the classpath (whereas this works with other templating engines).
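If moving away from JSP is acceptable (as the asker eventually did), templates served by a classpath-based engine such as Thymeleaf can live in src/main/resources/templates of any module, including webclient. A minimal sketch, assuming spring-boot-starter-thymeleaf is added to the module and a templates/index.html exists there (the controller name is hypothetical):

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class HomeController {

    // Resolves src/main/resources/templates/index.html from whichever
    // module contributes it to the classpath; no /WEB-INF lookup is involved.
    @RequestMapping("/")
    public String home() {
        return "index";
    }
}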

Pointing HiveServer2 to MiniMRCluster for Hive Testing

I've been wanting to do Hive integration testing for some of the code that I've been developing. The two major requirements of the testing framework I need:
It needs to work with a Cloudera version of Hive and Hadoop (preferably 2.0.0-cdh4.7.0).
It needs to be all local: the Hadoop cluster and Hive server should start at the beginning of the test, run a few queries, and tear down after the test is over.
So I broke this problem down into three parts:
Getting code for the HiveServer2 part (I decided to use a JDBC connector over a Thrift service client)
Getting code for building an in-memory MapReduce cluster (I decided to use MiniMRCluster for this)
Setting up both (1) and (2) above to work with each other.
I was able to get (1) out of the way by looking at many resources. Some of these that were very useful are:
Cloudera Hadoop Google User Group
Hive JDBC Client Wiki
For (2), I followed this excellent post in StackOverflow:
Integration Testing Hive Jobs
So far, so good. At this point, having included both of the above, the pom.xml of my Maven project looks something like this:
<repositories>
<repository>
<id>cloudera</id>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.1</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<!-- START: dependencies for getting MiniMRCluster to work -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-test</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.0.0-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.0.0-mr1-cdh4.7.0</version>
<classifier>tests</classifier>
</dependency>
<!-- END: dependencies for getting MiniMRCluster to work -->
<!-- START: dependencies for getting Hive JDBC to work -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-builtins</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-cli</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-metastore</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-serde</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-common</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>${hive.version}</version>
</dependency>
<dependency>
<groupId>org.apache.thrift</groupId>
<artifactId>libfb303</artifactId>
<version>0.9.1</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.15</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr-runtime</artifactId>
<version>3.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.10.1.1</version>
</dependency>
<dependency>
<groupId>javax.jdo</groupId>
<artifactId>jdo2-api</artifactId>
<version>2.3-ec</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox</artifactId>
<version>1.1.9-1</version>
</dependency>
<dependency>
<groupId>jpox</groupId>
<artifactId>jpox-rdbms</artifactId>
<version>1.2.0-beta-5</version>
</dependency>
<!-- END: dependencies for getting Hive JDBC to work -->
</dependencies>
Now I'm on step (3). I tried running the following code:
@Test
public void testHiveMiniDFSClusterIntegration() throws IOException, SQLException {
    Configuration conf = new Configuration();

    /* Build MiniDFSCluster */
    MiniDFSCluster miniDFS = new MiniDFSCluster.Builder(conf).build();

    /* Build MiniMR Cluster */
    System.setProperty("hadoop.log.dir", "/Users/nishantkelkar/IdeaProjects/" +
            "nkelkar-incubator/hive-test/target/hive/logs");
    int numTaskTrackers = 1;
    int numTaskTrackerDirectories = 1;
    String[] racks = null;
    String[] hosts = null;
    MiniMRCluster miniMR = new MiniMRCluster(numTaskTrackers, miniDFS.getFileSystem().getUri().toString(),
            numTaskTrackerDirectories, racks, hosts, new JobConf(conf));
    System.setProperty("mapred.job.tracker", miniMR.createJobConf(
            new JobConf(conf)).get("mapred.job.tracker"));

    try {
        String driverName = "org.apache.hive.jdbc.HiveDriver";
        Class.forName(driverName);
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
        System.exit(1);
    }

    Connection hiveConnection = DriverManager.getConnection(
            "jdbc:hive2:///", "", "");
    Statement stm = hiveConnection.createStatement();

    // now create test tables and query them
    stm.execute("set hive.support.concurrency = false");
    stm.execute("drop table if exists test");
    stm.execute("create table if not exists test(a int, b int) row format delimited fields terminated by ' '");
    stm.execute("create table dual as select 1 as one from test");
    stm.execute("insert into table test select stack(1,4,5) AS (a,b) from dual");
    stm.execute("select * from test");
}
My hope was that (3) would be solved by the following line of code from the above method:
Connection hiveConnection = DriverManager.getConnection(
"jdbc:hive2:///", "", "");
However, I'm getting the following error:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:161)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:150)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:207)
at com.ask.nkelkar.hive.HiveUnitTest.testHiveMiniDFSClusterIntegration(HiveUnitTest.java:54)
Can anyone please let me know what I need to do in addition, or what I'm doing wrong, to get this to work?
P.S. I looked at HiveRunner and hive_test projects as options, but I wasn't able to get these to work with Cloudera versions of Hadoop.
Your test is failing at the first create table statement. Hive is unhelpfully suppressing the following error message:
file:/user/hive/warehouse/test is not a directory or unable to create one
Hive is attempting to use the default warehouse directory /user/hive/warehouse which doesn't exist on your filesystem. You could create the directory, but for testing you'll likely want to override the default value. For example:
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars;
...
System.setProperty(ConfVars.METASTOREWAREHOUSE.toString(), "/Users/nishantkelkar/IdeaProjects/" +
"nkelkar-incubator/hive-test/target/hive/warehouse");

Mockito with static methods in util classes

I tried to search but couldn't find what I was looking for. Is it possible to do something like the following in Mockito?
when(TestServiceUtil.getTestItem()).thenReturn(someItem);
In your pom.xml, add the following dependencies:
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-module-junit4</artifactId>
<version>1.5.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-api-mockito</artifactId>
<version>1.5.6</version>
<scope>test</scope>
</dependency>
Above your test class, add the runner and tell PowerMock which class to prepare:
@RunWith(PowerMockRunner.class)
@PrepareForTest(TestServiceUtil.class)
public class YourClassName {
[...]
    @Before
    public void beforeTest() throws SQLException {
        PowerMockito.mockStatic(TestServiceUtil.class);
    }
Now you can use (as you had it):
when(TestServiceUtil.getTestItem()).thenReturn(someItem);
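Putting it all together, a minimal end-to-end sketch could look like this (TestServiceUtil, TestItem and someItem stand in for the classes from the question):

import static org.junit.Assert.assertSame;
import static org.mockito.Mockito.when;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
@PrepareForTest(TestServiceUtil.class) // tells PowerMock which static class to instrument
public class TestServiceUtilTest {

    private TestItem someItem;

    @Before
    public void beforeTest() {
        someItem = new TestItem();
        PowerMockito.mockStatic(TestServiceUtil.class);
    }

    @Test
    public void returnsStubbedItem() {
        when(TestServiceUtil.getTestItem()).thenReturn(someItem);

        assertSame(someItem, TestServiceUtil.getTestItem());
    }
}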
Last words: don't overuse PowerMockito. Focus on clean, object-oriented code.
