Override getSchemaFile and getSolrConfigFile - maven

I'm migrating my Solr test project to Solr 4.1 and I can't override the methods getSchemaFile() and getSolrConfigFile().
I'm getting the following errors:
[ERROR] my_path/SolrConfigTest.java:[63,15] error: getSchemaFile() in SolrConfigTest cannot override getSchemaFile() in SolrTestCaseJ4
[ERROR] overridden method is static
[ERROR] my_path/SolrConfigTest.java:[62,1] error: method does not override or implement a method from a supertype
[ERROR] my_path/SolrConfigTest.java:[68,15] error: getSolrConfigFile() in SolrConfigTest cannot override getSolrConfigFile() in SolrTestCaseJ4
[ERROR] overridden method is static
[ERROR] my_path/SolrConfigTest.java:[67,1] error: method does not override or implement a method from a supertype
The file looks as follows:
import org.apache.log4j.Logger;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;
import org.apache.solr.client.solrj.request.CoreAdminRequest;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.util.NamedList;
import org.apache.solr.core.CoreContainer;
import org.apache.solr.util.AbstractSolrTestCase;
import org.junit.Before;
public class SolrConfigTest extends AbstractSolrTestCase {
String container = "mycore";
CoreContainer coreContainer;
SolrServer server;
@Override
public String getSolrHome() {
return System.getProperty("user.dir") + "/resources/";
}
@Override
public String getSchemaFile() {
return getSolrHome() + container + "/conf/schema.xml";
}
@Override
public String getSolrConfigFile() {
return getSolrHome() + container + "/conf/solrconfig.xml";
}
@Before
@Override
public void setUp() throws Exception {
super.setUp();
CoreContainer.Initializer initializer = new CoreContainer.Initializer();
coreContainer = initializer.initialize();
server = new EmbeddedSolrServer(coreContainer, "mycore");
}
}
And for maven, my pom.xml file has the following dependencies:
<dependencies>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>4.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-core</artifactId>
<version>4.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-solrj</artifactId>
<version>4.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-test-framework</artifactId>
<version>4.1.0</version>
</dependency>
</dependencies>
Does anybody have any idea what I'm missing?

As far as I can see, the AbstractSolrTestCase class in Solr 4.1 no longer offers the getSchemaFile() and getSolrConfigFile() methods to override.
Once getSolrHome() points to the folder where Solr is defined (the one containing the solr.xml file), the schema.xml and solrconfig.xml files will be picked up from mycore/conf.

Or you can call the static initCore(String config, String schema, String solrHome) method of the SolrTestCaseJ4 class instead, so that you can point it at your config file and schema file explicitly; see the sketch below.
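A minimal sketch of that approach, assuming the same resources layout as in the question (the "id" field and the exact path handling are assumptions; whether the config/schema arguments are taken as full paths or as names relative to the core's conf directory can differ between 4.x versions):
import org.apache.solr.SolrTestCaseJ4;
import org.junit.BeforeClass;
import org.junit.Test;
public class SolrConfigTest extends SolrTestCaseJ4 {
    static final String SOLR_HOME = System.getProperty("user.dir") + "/resources/";
    @BeforeClass
    public static void beforeClass() throws Exception {
        // initCore(config, schema, solrHome) is a static helper on SolrTestCaseJ4
        initCore(SOLR_HOME + "mycore/conf/solrconfig.xml",
                 SOLR_HOME + "mycore/conf/schema.xml",
                 SOLR_HOME);
    }
    @Test
    public void coreStartsAndAcceptsDocuments() throws Exception {
        assertU(adoc("id", "1")); // assumes the schema defines an "id" field
        assertU(commit());
    }
}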

Related

Injection of CommandGateway does not work in Quarkus using AxonIq

I have a microservice in Quarkus which implements CQRS/Event Sourcing using the AxonIq Framework. I already built it with Spring Boot and everything worked fine. I would like to migrate it to Quarkus, but I get an error during the Maven build, probably because of the IoC. When CDI tries to create the service, I think it cannot inject the Axon CommandGateway and QueryGateway.
[ERROR] Failed to execute goal io.quarkus.platform:quarkus-maven-plugin:2.7.1.Final:build (default) on project domain: Failed to build quarkus application: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
[ERROR] [error]: Build step io.quarkus.arc.deployment.ArcProcessor#validate threw an exception: javax.enterprise.inject.spi.DeploymentException: Found 2 deployment problems:
[ERROR] [1] Unsatisfied dependency for type org.axonframework.Commandhandling.CommandGateway and qualifiers [@Default]
[ERROR] - java member: com.omb.commands.MyAggregateCommandService().commandGateway
[ERROR] - declared on CLASS bean [types=[com.omb.commands.MyAggregateCommandService, java.lang.Object], qualifiers=[@Default, @Any], target=com.omb.commands.MyAggregateCommandService]
Configuration
package com.omb..configuration;
import com.omb..MyAggregate;
import com.omb..commands.MyAggregateCommandService;
import com.omb..mongo.MongoMyAggregateProjector;
import com.omb..queries.MyAggregateQueryService;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.axonframework.config.Configurer;
import org.axonframework.config.DefaultConfigurer;
import org.axonframework.eventsourcing.eventstore.EventStorageEngine;
import org.axonframework.extensions.mongo.DefaultMongoTemplate;
import org.axonframework.extensions.mongo.eventsourcing.tokenstore.MongoTokenStore;
import org.axonframework.serialization.xml.XStreamSerializer;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
@ApplicationScoped
public class AxonConfiguration {
@Produces
public org.axonframework.config.Configuration getAxonConfiguration(MongoMyAggregateProjector MyAggregateProjector, MongoDatabase database, EventStorageEngine eventStorageEngine) {
XStreamSerializer serializer = XStreamSerializer.defaultSerializer();
Configurer configurer = DefaultConfigurer
.defaultConfiguration()
.configureAggregate(MyAggregate.class)
.eventProcessing(conf -> conf
.registerTokenStore(config -> MongoTokenStore.builder()
.mongoTemplate(
DefaultMongoTemplate.builder()
// optionally choose collection names here
.mongoDatabase(database)
.build())
.serializer(serializer)
.build()))
.registerEventHandler(conf -> MyAggregateProjector)
.registerQueryHandler(conf -> MyAggregateProjector)
.configureEmbeddedEventStore(conf -> eventStorageEngine);
return configurer.start();
}
@Produces
public MongoDatabase mongoDatabase(MongoClient client) {
return client.getDatabase("MyAggregate");
}
@Produces
@ApplicationScoped
public MyAggregateQueryService queryService(org.axonframework.config.Configuration configuration) {
return new MyAggregateQueryService(configuration.queryGateway());
}
@Produces
@ApplicationScoped
public MyAggregateCommandService commandService(org.axonframework.config.Configuration configuration) {
return new MyAggregateCommandService(configuration.commandGateway());
}
}
Service :
package com.omb..commands;
import com.omb..models.MyAggregateDTO;
import org.axonframework.commandhandling.gateway.CommandGateway;
import javax.enterprise.context.ApplicationScoped;
import java.util.concurrent.CompletableFuture;
@ApplicationScoped
public class MyAggregateCommandService {
CommandGateway commandGateway;
public MyAggregateCommandService(CommandGateway commandGateway) {
this.commandGateway = commandGateway;
}
public CompletableFuture<String> createMyAggregate(final MyAggregateDTO MyAggregate) {
return commandGateway.send(new CreateMyAggregateCommand(MyAggregate.id(), MyAggregate.name()));
}
public CompletableFuture<MyAggregateDTO> updateMyAggregate(final String MyAggregateId, final MyAggregateDTO MyAggregate) {
if(MyAggregate.id().equals(MyAggregateId)) {
return commandGateway.send(new UpdateMyAggregateCommand(MyAggregate.id(), MyAggregate.name()));
} else {
throw new IllegalArgumentException("Identifiers are not the same, does not update");
}
}
}
Controller :
package com.omb;
import com.omb..commands.MyAggregateCommandService;
import com.omb..models.MyAggregateDTO;
import javax.validation.Valid;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
@Path("/commands/MyAggregate")
public class MyAggregateCommandController {
private MyAggregateCommandService MyAggregateCommandService;
public MyAggregateCommandController(MyAggregateCommandService MyAggregateCommandService) {
this.MyAggregateCommandService = MyAggregateCommandService;
}
@POST
@Path("/create")
public CompletableFuture<String> createMyAggregate(@Valid MyAggregateCreateInput MyAggregate) {
MyAggregateDTO MyAggregateDTO = new MyAggregateDTO(UUID.randomUUID().toString(), MyAggregate.name());
return MyAggregateCommandService.createMyAggregate(MyAggregateDTO);
}
@PUT
@Path("/{MyAggregateId}")
public CompletableFuture<MyAggregateDTO> updateMyAggregate(@PathParam("MyAggregateId") String MyAggregateId, MyAggregateDTO MyAggregate) {
MyAggregateDTO MyAggregateDTO = new MyAggregateDTO(MyAggregate.id(), MyAggregate.name());
return MyAggregateCommandService.updateMyAggregate(MyAggregateId,MyAggregateDTO);
}
/* @ExceptionHandler(value = Exception.class)
public ResponseEntity<String> handle(Exception e) {
return ResponseEntity.badRequest().body(e.getMessage());
}*/
}
Dependencies
<dependency>
<groupId>org.axonframework.extensions.mongo</groupId>
<artifactId>axon-mongo</artifactId>
<version>4.5</version>
</dependency>
<dependency>
<groupId>org.axonframework</groupId>
<artifactId>axon-test</artifactId>
<scope>test</scope>
<version>4.5.6</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-mongodb-panache</artifactId>
</dependency>
<dependency>
<groupId>org.axonframework</groupId>
<artifactId>axon-configuration</artifactId>
</dependency>
<dependency>
<groupId>org.axonframework</groupId>
<artifactId>axon-modelling</artifactId>
</dependency>
<dependency>
<groupId>org.axonframework</groupId>
<artifactId>axon-messaging</artifactId>
</dependency>
<!-- Quarkus -->
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-core</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-junit5</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy-reactive-jackson</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-validator</artifactId>
</dependency>
I had the same issue. One of the reasons can be that your bean is brought in by a dependency; to fix it, you need to add an empty beans.xml under src/main/resources/META-INF in that dependency so that Quarkus discovers its beans, as indicated by the documentation (see the sketch after the extract below).
Relevant extract:
The bean archive is synthesized from:
the application classes,
dependencies that contain a beans.xml descriptor (content is ignored),
dependencies that contain a Jandex index - META-INF/jandex.idx,
dependencies referenced by quarkus.index-dependency in application.properties,
and Quarkus integration code.
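For illustration, a minimal sketch of the two most common fixes from that extract; the module coordinates (com.omb:domain-commands) are hypothetical placeholders for the dependency that actually contains the beans.
Option 1: ship an empty marker file inside that dependency at src/main/resources/META-INF/beans.xml (the content is ignored, an empty file is enough).
Option 2: index the dependency from the Quarkus application's application.properties:
# application.properties of the Quarkus app (hypothetical coordinates)
quarkus.index-dependency.commands.group-id=com.omb
quarkus.index-dependency.commands.artifact-id=domain-commands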

Getting error when running console application on eclipse : SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"

I am a beginner to Spring MVC, trying to get the SLF4J logger to work when running the following app.
package com.chris.springdemo;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.stereotype.Component;
import com.chris.springdemo.dao.OrganizationDao;
import com.chris.springdemo.daoimpl.OrganizationDaoImpl;
import com.chris.springdemo.domain.Organization;
@Component
public class LoggingApp {
@Autowired
private OrganizationDao dao;
@Autowired
private DaoUtils daoUtils;
public void actionMethod() {
// Creating seed data
daoUtils.createSeedData(dao);
// List organizations
List<Organization> orgs = dao.getAllOrganizations();
daoUtils.printOrganizations(orgs, daoUtils.readOperation);
// Create a new organization record
Organization org = new Organization("General electric", 1991, "", 28, "Howdy");
boolean isCreated = dao.create(org);
daoUtils.printSuccessFailure(daoUtils.createOperation, isCreated);
daoUtils.printOrganizations(dao.getAllOrganizations(), daoUtils.readOperation);
// Get a single organization
Organization org2 = dao.getOrganization(1);
daoUtils.printOrganization(org2, "getOrganization");
// Updating a slogan for an organization
Organization org3 = dao.getOrganization(2);
org3.setSlogan("We build **awesome** driving machines!");
boolean isUpdated = dao.update(org3);
daoUtils.printSuccessFailure(daoUtils.updateOperation, isUpdated);
daoUtils.printOrganization(dao.getOrganization(2), daoUtils.updateOperation);
// Delete an organization
boolean isDeleted = dao.delete(dao.getOrganization(3));
daoUtils.printSuccessFailure(daoUtils.deleteOperation, isDeleted);
daoUtils.printOrganizations(dao.getAllOrganizations(), daoUtils.deleteOperation);
// Clean up
dao.cleanup();
daoUtils.printOrganizationCount(dao.getAllOrganizations(), daoUtils.cleanupOperation);
}
public static void main(String[] args) {
// creating the application context
ApplicationContext ctx = new ClassPathXmlApplicationContext("beans-cp.xml");
LoggingApp mainApp = ctx.getBean(LoggingApp.class);
mainApp.actionMethod();
// close the application context
((ClassPathXmlApplicationContext) ctx).close();
// Create the bean
// OrganizationDao dao = (OrganizationDao) ctx.getBean("orgDao");
}
}
DaoUtils.java contains log statements with Logger
package com.chris.springdemo;
import java.util.ArrayList;
import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import com.chris.springdemo.dao.OrganizationDao;
import com.chris.springdemo.domain.Organization;
@Service
public class DaoUtils {
private static Logger LOGGER = LoggerFactory.getLogger("Logging Tester");
public final String createOperation = "CREATE";
public final String readOperation = "READ";
public final String updateOperation = "UPDATE";
public final String deleteOperation = "DELETE";
public final String cleanupOperation = "TRUNCATE";
public void printOrganizations(List<Organization> orgs, String operation){
LOGGER.info("\n********* printing organizations after " + operation + " operation *********");
for (Organization org : orgs) {
LOGGER.info(org.toString());
}
}
public void printOrganization( Organization org, String operation ) {
LOGGER.info("\n********* printing organization after invoking " + operation + " operation *********\n" + org);
}
public void printSuccessFailure(String operation, boolean param){
if(param)
LOGGER.info("\nOperation " + operation + " successful");
else
LOGGER.info("\nOperation " + operation + " failed");
}
public void createSeedData(OrganizationDao dao){
Organization org1 = new Organization("Amazon", 1994, "65656", 8765, "Work hard, have fun, make history");
Organization org2 = new Organization("BMW", 1929, "45454", 5501, "We build ultimate Driving machines");
Organization org3 = new Organization("Google", 1996, "57575", 4567, "Don't be evil");
List<Organization> orgs = new ArrayList<Organization>();
orgs.add(0, org1); orgs.add(1, org2); orgs.add(2, org3);
//int orgCount = orgs.size();
int createCount = 0;
for(Organization org : orgs){
boolean isCreated = dao.create(org);
if(isCreated)
createCount += 1;
}
LOGGER.info("Created "+ createCount + " organizations");
}
public void printOrganizationCount(List<Organization> orgs, String operation){
LOGGER.info("\n*********Currently we have " + orgs.size()+ " organizations after " + operation + " operation" + " *********");
}
}
My pom.xml is below.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.chris</groupId>
<artifactId>spring-named-jdbc-template-demo-1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>5.0.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>5.0.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>5.0.1.RELEASE</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>6.0.6</version>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.4</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>5.0.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-dao</artifactId>
<version>2.0.8</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.25</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
When I run the app, I get the following error
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
When I click the URL in the error, it tells me
This warning message is reported when the org.slf4j.impl.StaticLoggerBinder class could not be loaded into memory. This happens when no appropriate SLF4J binding could be found on the class path. Placing one (and only one) of slf4j-nop.jar, slf4j-simple.jar, slf4j-log4j12.jar, slf4j-jdk14.jar or logback-classic.jar on the class path should solve the problem. SINCE 1.6.0 As of SLF4J version 1.6, in the absence of a binding, SLF4J will default to a no-operation (NOP) logger implementation.
But I have only one SLF4J implementation in my pom.xml, which is
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.25</version>
<scope>test</scope>
</dependency>
I deleted all repositories, updated the Maven project, cleaned the project, etc.; none of this seems to resolve the issue. I would appreciate any help.
Thanks
Cris
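For what it's worth, one detail that stands out in the pom above: the slf4j-log4j12 binding is declared with test scope, so it is only on the class path during test execution. When main() runs from Eclipse, SLF4J finds no binding and falls back to the NOP logger, which matches the warning. A sketch of the same dependency without the test scope (a log4j.properties on the class path is also needed for actual output):
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.25</version>
</dependency>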

How to enable basic caching with Spring Data JPA

I am trying to enable basic caching with Spring Data JPA. But I cannot understand why the DAO methods are still querying the database instead of using the cache.
Given the following Spring Boot 1.5.1 application
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;
@SpringBootApplication
@EnableCaching
public class Server{
public static void main(String[] args) {
SpringApplication.run(Server.class, args);
}
}
Controller
@Controller
public class PasswordsController {
@Autowired
private PasswordService service;
@SuppressWarnings("unchecked")
@RequestMapping("/passwords.htm")
public void passwords(Map model,
HttpServletRequest request) {
model.put("passwords", service.getPasswords(request));
}
...
Service
@Service
@Transactional
public class PasswordService extends BaseService {
@Autowired
private PasswordJpaDao passwordDao;
public Collection<Password> getPasswords(HttpServletRequest request) {
Collection<Password> passwords = passwordDao.getPasswords(params);
return passwords;
}
...
Interface
@Transactional
public interface PasswordJpaDaoCustom {
public Collection<Password> getPasswords(PasswordSearchParameters params);
}
and implementation
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
import javax.transaction.Transactional;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Repository;
import com.crm.entity.Password;
import com.crm.search.PasswordSearchParameters;
@Transactional
@Repository
public class PasswordJpaDaoImpl implements PasswordJpaDaoCustom {
@PersistenceContext
private EntityManager em;
@Override
@Cacheable("passwords")
public Collection<Password> getPasswords(PasswordSearchParameters params) {
System.err.println("got here");
return em.createQuery(hql, Password.class).getResultList();
}
...
Maven Dependencies
<!-- Spring Boot start -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-freemarker</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-security</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-cache</artifactId>
</dependency>
<!-- Spring Boot end -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.javassist</groupId>
<artifactId>javassist</artifactId>
</dependency>
I understand that Spring Boot will implicitly use a ConcurrentHashMap for caching, without any specific configuration necessary?
But the getPasswords() dao method is always called instead of using the cache. Why is this?
Yes, Spring Boot by default uses a ConcurrentHashMap for caching. The issue with your code is that you did not set any key for your passwords cache, so it is hitting the database every time to fetch the data.
So you need to set a key (any unique identifier) derived from the params object's fields, as shown below:
@Cacheable(value="passwords", key="#params.id") // any unique identifier
public Collection<Password> getPasswords(PasswordSearchParameters params) {
System.err.println("got here");
return em.createQuery(hql, Password.class).getResultList();
}
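Note that the key expression above assumes PasswordSearchParameters exposes an id property; that accessor is illustrative rather than taken from the question. If you prefer to rely on Spring's default key generation instead, the parameter type should implement equals() and hashCode() so that identical searches resolve to the same cache entry. A minimal sketch:
// Hypothetical sketch of a search-parameters class usable as a cache key.
import java.util.Objects;
public class PasswordSearchParameters {
    private final Long id;
    public PasswordSearchParameters(Long id) {
        this.id = id;
    }
    public Long getId() { // referenced by key = "#params.id"
        return id;
    }
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PasswordSearchParameters)) return false;
        return Objects.equals(id, ((PasswordSearchParameters) o).id);
    }
    @Override
    public int hashCode() {
        return Objects.hash(id);
    }
}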

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.CanSetDropBehind issue in Eclipse

I have the below Spark word count program:
package com.sample.spark;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFlatMapFunction;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import scala.Tuple2;
public class SparkWordCount {
public static void main(String[] args) {
SparkConf conf = new SparkConf().setAppName("wordcountspark").setMaster("local").setSparkHome("/Users/hadoop/spark-1.4.0-bin-hadoop1");
JavaSparkContext sc = new JavaSparkContext(conf);
//SparkConf conf = new SparkConf();
//JavaSparkContext sc = new JavaSparkContext("hdfs", "Simple App","/Users/hadoop/spark-1.4.0-bin-hadoop1", new String[]{"target/simple-project-1.0.jar"});
JavaRDD<String> textFile = sc.textFile("hdfs://localhost:54310/data/wordcount");
JavaRDD<String> words = textFile.flatMap(new FlatMapFunction<String, String>() {
public Iterable<String> call(String s) { return Arrays.asList(s.split(" ")); }
});
JavaPairRDD<String, Integer> pairs = words.mapToPair(new PairFunction<String, String, Integer>() {
public Tuple2<String, Integer> call(String s) { return new Tuple2<String, Integer>(s, 1); }
});
JavaPairRDD<String, Integer> counts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
public Integer call(Integer a, Integer b) { return a + b; }
});
counts.saveAsTextFile("hdfs://localhost:54310/data/output/spark/outfile");
}
}
I get the Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.CanSetDropBehind exception when I run the code from Eclipse; however, if I export it as a runnable jar and run it from the terminal as below, it works:
bin/spark-submit --class com.sample.spark.SparkWordCount --master local /Users/hadoop/spark-1.4.0-bin-hadoop1/finalJars/SparkJar-v2.jar
The Maven pom looks like:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sample.spark</groupId>
<artifactId>SparkRags</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>SparkRags</name>
<url>http://maven.apache.org</url>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>0.23.11</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
<scope>compile</scope>
</dependency>
</dependencies>
</project>
When you run in Eclipse, the referenced jars are the only source for your program to run. So the hadoop-core jar (that's where CanSetDropBehind is present) is not being added properly to your Eclipse classpath from the local repository for some reason. You need to identify whether this is a proxy issue or some other problem with the pom.
When you run the jar from the terminal, it can work because the required jar is present on the referenced classpath. While running from the terminal, you could also choose to build a fat jar (one that bundles hadoop-core); I hope you are not using this option while creating your jar, because then the reference would be picked up from inside your jar without depending on the classpath.
Verify each step, and it will help you identify the cause. Happy coding.
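If you do want the fat-jar route mentioned above, a minimal sketch using the maven-shade-plugin could look like the following (the plugin version is illustrative; for Spark jobs you would typically also mark spark-core as provided so it is not bundled):
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>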
Found that this was caused because the hadoop-common jar for version 0.23.11 did not have the class. I changed the version to 2.7.0 and also added the dependency below:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.0</version>
</dependency>
That got rid of the error, but I am still seeing the error below:
Exception in thread "main" java.io.EOFException: End of File Exception between local host is: "mbr-xxxx.local/127.0.0.1"; destination host is: "localhost":54310; : java.io.EOFException; For more details see: http://wiki.apache.org/hadoop/EOFException

NoClassDefFoundError: org/json/JSONObject - Hadoop MapReduce

I'm trying to do a MapReduce job using JSON as input. I imported the JSON dependency in the pom.xml and Maven clean install runs properly. But when I run the jar in Hadoop I get a "NoClassDefFoundError: org/json/JSONObject" error in the Mapper class. (I also tried with the external JSON Java jar, but it doesn't work.)
This is my test mapper class:
package com.andrew.hadoopNBA.NbaJob1;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.json.*;
public class PointsRankingMapper extends Mapper<Object, Text, Text, IntWritable> {
public void map(Object key, Text value, Context context)
throws IOException, InterruptedException {
try {
JSONObject jsn = new JSONObject(value.toString());
System.out.println("printing JSON " + jsn);
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
And this is Maven dependency:
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20140107</version>
</dependency>
Any ideas?
You can use Jackson to process JSON; a usage sketch follows the dependencies below.
<repositories>
<repository>
<id>codehaus</id>
<url>http://repository.codehaus.org/org/codehaus</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-mapper-asl</artifactId>
<version>1.8.5</version>
</dependency>
</dependencies>
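For illustration only, here is roughly how the mapper from the question might parse each input line with the Jackson 1.x API (this sketch is not from the original post):
package com.andrew.hadoopNBA.NbaJob1;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.map.ObjectMapper;
public class PointsRankingMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final ObjectMapper mapper = new ObjectMapper();
    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Parse one JSON record per input line into a tree of JsonNodes
        JsonNode json = mapper.readTree(value.toString());
        System.out.println("printing JSON " + json);
    }
}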
or Google's Gson:
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>1.7.1</version>
</dependency>
Try:
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<version>2.4</version>
</dependency>
