I am testing HBase, using a standalone instance without Hadoop. With HBase 0.90.6 the code worked fine, but after upgrading to the latest version, 0.94.0, it fails with the exception below when I try to put data into a table.
Exception
Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: DoNotRetryIOException: 1 time, servers with issues: xxxx:36601,
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1591)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1367)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:945)
at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:801)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:776)
at com.hhase.Hbase.main(Hbase.java:22)
I am using the below code.
package com.hhase;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class Hbase {
    public static void main(String[] args) throws IOException {
        Configuration hConf = HBaseConfiguration.create();
        HTable table = new HTable(hConf, "myLittleHBaseTable");

        // Write one cell: row "myLittleRow", family "myLittleFamily", qualifier "someQualifier".
        Put put = new Put(Bytes.toBytes("myLittleRow"));
        put.add(Bytes.toBytes("myLittleFamily"),
                Bytes.toBytes("someQualifier"), Bytes.toBytes("Some"));
        table.put(put);

        table.close();
    }
}
Libraries used:
commons-cli-1.2.jar hadoop-core-1.0.2.jar
commons-codec-1.4.jar hbase-0.94.0.jar
commons-collections-3.2.1.jar httpclient-4.1.2.jar
commons-configuration-1.6.jar httpcore-4.1.4.jar
commons-httpclient-3.1.jar log4j-1.2.16.jar
commons-io-2.1.jar protobuf-java-2.4.0a.jar
commons-lang-2.5.jar slf4j-api-1.5.8.jar
commons-logging-1.1.1.jar slf4j-log4j12-1.5.8.jar
commons-net-1.4.1.jar stax-api-1.0.1.jar
guava-r09.jar zookeeper-3.4.3.jar
I encountered the same error while inserting data into HBase.
In my case, it was caused by an incorrect column family name.
Please refer to this conversation.
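For anyone who wants to rule that out quickly, here is a small check (a sketch against the 0.94 client API, reusing the table and family names from the code above) that compares the family the Put writes to against the table's descriptor:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;

public class CheckFamily {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        // Fetch the table's descriptor and verify that the column family used by the Put exists.
        HTableDescriptor desc = admin.getTableDescriptor(Bytes.toBytes("myLittleHBaseTable"));
        System.out.println("has family 'myLittleFamily': "
                + desc.hasFamily(Bytes.toBytes("myLittleFamily")));
        admin.close();
    }
}

If it prints false, the put fails on the server side with a DoNotRetryIOException, which matches the error above.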
Related
TestNG Error in Maven Project
Hi Davis and others: below are my project screenshot and code. I am getting this TestNG error when running it as a Maven project; please help me with this. I have also attached a few screenshots for reference.
GenerateReports.java
package ExtentReports.GenerateHTMLReports;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;

public class GenerateReports {

    ExtentReports E2;

    @BeforeTest
    public void config() {
        String s1 = System.getProperty("user.dir") + "\\reports\\index1.html";
        ExtentSparkReporter E1 = new ExtentSparkReporter(s1);
        E1.config().setReportName("MyAutomationReport");
        E1.config().setDocumentTitle("GoogleHomePage");
        E2 = new ExtentReports();
        E2.attachReporter(E1);
        E2.setSystemInfo("Tester", "G");
    }

    @Test
    public void initializeDriver() {
        E2.createTest("MyHTMlReportTest");
        System.setProperty("webdriver.chrome.driver",
                "F:\\Selenium_Training\\Ganesh_Project\\Browser_Drivers\\chromedriver.exe");
        WebDriver driver = new ChromeDriver();
        driver.get("https://www.google.com");
        driver.getTitle();
        driver.close();
        E2.flush();
    }
}
Pom.xml
I am getting a TestNG error while running the Maven project from its XML suite; here is the error message:
org.testng.TestNGException:
TestNG by default disables loading DTD from unsecured Urls. If you need to explicitly load the DTD from a http url, please do so by using the JVM argument [-Dtestng.dtd.http=true]
at org.testng.xml.TestNGContentHandler.resolveEntity(TestNGContentHandler.java:115)
at com.sun.org.apache.xerces.internal.util.EntityResolverWrapper.resolveEn
Also, I tried the possibilities below and then got a different error message:
I added the JVM argument -Dtestng.dtd.http=true under Run -> Run Configurations -> Arguments -> VM arguments in Eclipse.
After adding that JVM argument and running the Maven project, I get the following new error in the console and do not know how to solve it:
org.testng.TestNGException: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.testng.TestNG.parseSuite(TestNG.java:354)
at org.testng.TestNG.initializeSuitesAndJarFile(TestNG.java:374)
at org.testng.remote.AbstractRemoteTestNG.run(AbstractRemoteTestNG.java:103)
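In case it helps, the same property can also be set programmatically before TestNG parses the suite, instead of through the Eclipse VM arguments (a sketch, assuming a plain main-class runner; the property name is the one from the error message, and "testng.xml" is a placeholder for your own suite file):

import java.util.Collections;

import org.testng.TestNG;

public class RunSuite {
    public static void main(String[] args) {
        // Same effect as passing -Dtestng.dtd.http=true as a JVM argument.
        System.setProperty("testng.dtd.http", "true");

        TestNG testng = new TestNG();
        testng.setTestSuites(Collections.singletonList("testng.xml")); // placeholder suite path
        testng.run();
    }
}

This only changes how the property is supplied; the later certificate error looks like a separate TLS trust issue when the DTD is fetched over HTTPS.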
I am trying to set up a very basic example:
Push random data to an Output Port in NiFi.
Use a Spark streaming context to print the received data.
Versions (all on a single instance):
HDF - 3.1.1.0-35
HDP - 2.6.5.0-292
nifi-spark-receiver & site-to-site-client - 1.7.1
I have configured spark-defaults.conf as follows:
spark.driver.extraClassPath /usr/hdf/3.1.1.0-35/nifi/work/META-INF/bundled-dependencies/nifi-client-dto-1.5.0.3.1.1.0-35.jar:/opt/spark-receiver/httpcore-nio-4.0-alpha6.jar:/opt/spark-receiver/nifi-site-to-site-client-1.5.0.3.1.2.0-7.jar:/opt/spark-receiver/nifi-spark-receiver-1.5.0.3.1.2.0-7.jar:/usr/hdf/3.1.1.0-35/nifi/lib/nifi-api-1.5.0.3.1.1.0-35.jar:/usr/hdf/3.1.1.0-35/nifi/lib/bootstrap/nifi-utils-1.5.0.3.1.1.0-35.jar:/usr/hdf/3.1.1.0-35/nifi/lib/nifi-framework-api-1.5.0.3.1.1.0-35.jar
I am running the following commands in spark-shell:
import org.apache.nifi._
import java.nio.charset._
import org.apache.nifi.spark._
import org.apache.nifi.remote.client._
import org.apache.spark._
import org.apache.nifi.events._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.nifi.remote._
import org.apache.nifi.remote.client._
import org.apache.nifi.remote.protocol._
import org.apache.spark.storage._
import org.apache.spark.streaming.receiver._
import java.io._
import org.apache.spark.serializer._
val conf = new SiteToSiteClient.Builder().url("http://10.140.0.2:9090/nifi").portName("toSpark").buildConfig()
val ssc = new StreamingContext(sc, Seconds(10))
val lines = ssc.receiverStream(new NiFiReceiver(conf, StorageLevel.MEMORY_ONLY))
val text = lines.map(dataPacket => new String(dataPacket.getContent, StandardCharsets.UTF_8))
text.print()
ssc.start()
After running this code, I get the following error:
Exception in thread "NiFi Receiver" java.lang.NoClassDefFoundError:
org/apache/http/nio/protocol/HttpAsyncResponseConsumer
at org.apache.nifi.remote.client.SiteInfoProvider.createSiteToSiteRestApiClient(SiteInfoProvider.java:104)
at org.apache.nifi.remote.client.SiteInfoProvider.refreshRemoteInfo(SiteInfoProvider.java:68)
at org.apache.nifi.remote.client.SiteInfoProvider.getPortIdentifier(SiteInfoProvider.java:220)
at org.apache.nifi.remote.client.SiteInfoProvider.getOutputPortIdentifier(SiteInfoProvider.java:204)
at org.apache.nifi.remote.client.socket.SocketClient.getPortIdentifier(SocketClient.java:79)
at org.apache.nifi.remote.client.socket.SocketClient.createTransaction(SocketClient.java:121)
at org.apache.nifi.spark.NiFiReceiver$ReceiveRunnable.run(NiFiReceiver.java:149)
at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.ClassNotFoundException:
org.apache.http.nio.protocol.HttpAsyncResponseConsumer
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Please help.
I'm guessing your inclusion of httpcore-nio-4.0-alpha6.jar is the issue: that version doesn't contain the class, and it appears to be shadowing the httpcore-nio version that nifi-spark-receiver pulls in transitively (currently 4.4.6).
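If you want to confirm which jar actually wins, a small diagnostic run with the same classpath as the Spark driver can help (a sketch; the class name is the one from the stack trace, and WhichJar is just an illustrative name):

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Print the jar the class is loaded from, which makes a version clash easy to spot.
        Class<?> clazz = Class.forName("org.apache.http.nio.protocol.HttpAsyncResponseConsumer");
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        System.out.println(src != null ? src.getLocation() : "loaded from the bootstrap classpath");
    }
}

If it points at the old httpcore-nio jar, dropping that entry from spark.driver.extraClassPath should let the newer version from nifi-spark-receiver take over.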
I am trying to build a web application using Spring and AngularJS together with Gradle. My problem is that when I run my project or build with Gradle, this error comes out:
"contextLoads FAILED"
"What went Wrong:"
> There were failing tests: /build/reports/tests/test/index.html
Caused by:
org.springframework.beans.factory.UnsatisfiedDependencyException
Caused by:
org.springframework.beans.factory.NoSuchBeanDefinitionException
contextLoads leads me to this class:
package com.linkedin.learning.linkedinlearningfullstackappangularspringboot;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class LinkedInLearningFullStackAppAngularSpringBootApplicationTests {

    @Test
    public void contextLoads() {
    }
}
Here's the screenshot to make it clearer:
I'm trying a simple program from the "Hadoop in Action" book to merge a series of files from the local file system into one file in HDFS. The code snippet is the same as the one provided in the book.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutMerge {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        Path inputDir = new Path(args[0]);  // first argument: local input directory
        Path hdfsFile = new Path(args[1]);  // second argument: concatenated HDFS file name

        try {
            FileStatus[] inputFiles = local.listStatus(inputDir);   // list of local files
            FSDataOutputStream out = hdfs.create(hdfsFile);         // target file creation

            for (int i = 0; i < inputFiles.length; i++) {
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                byte[] buff = new byte[256];
                int bytesRead;
                while ((bytesRead = in.read(buff)) > 0) {
                    out.write(buff, 0, bytesRead);
                }
                in.close();
            }
            out.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The program compiles successfully, but when I try to run it I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:217)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
at org.apache.hadoop.security.KerberosName.(KerberosName.java:79)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:210)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:468)
at org.apache.hadoop.fs.FileSystem$Cache$Key.(FileSystem.java:1519)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1420)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
at PutMerge.main(PutMerge.java:16) Caused by: java.lang.ClassNotFoundException:
org.apache.commons.configuration.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 17 more
Based on inputs from some of the posts, I added the commons packages. My classpath definition is:
/usr/java/jdk1.7.0_21:/data/commons-logging-1.1.2/commons-logging-1.1.2.jar:/data/hadoop-1.1.2/hadoop-core-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-adapters-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-api-1.1.2.jar:.
Any clue on why this is not working?
You didn't include Apache commons-configuration (the jar that provides org.apache.commons.configuration.Configuration) in your classpath.
Really, though, you shouldn't need to include much besides Hadoop itself. Make sure you run your jar with Hadoop itself:
> hadoop jar myJar.jar
I have created a simple MapReduce driver that implements the Tool interface. But when I try to run the job in Eclipse, I get a NoClassDefFoundError before the run() method is invoked.
I am running Hadoop 0.20.2 on Ubuntu 10.04 LTS. The source code and stack trace are provided below. Any assistance will be greatly appreciated.
Source code
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.*;

public class MyTestDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.printf("Usage: %s [generic options] <input> <output>\n",
                    getClass().getSimpleName());
            ToolRunner.printGenericCommandUsage(System.err);
            return -1;
        }
        // Code here to submit Hadoop Job ...
        return 0;
    }

    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MyTestDriver(), args);
        System.exit(exitCode);
    }
}
Stack trace

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/cli/ParseException
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at MaxTemperatureDriver.main(MaxTemperatureDriver.java:44)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.cli.ParseException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
    ... 3 more
Are all of Hadoop's dependencies in your Eclipse build path? Make sure all the jars in the hadoop/lib directory are in your build path.
The error means that a particular class was not found. Classes are stored within .jar files, so you need to check whether all the required jar files are available. Jar files are searched based on the CLASSPATH variable. If you are using Eclipse as your development environment, check that the build dependencies are satisfied (right-click your project -> configure build path -> add external JARs).
Also, if you are not sure which class is missing, you can check the class name (Hadoop is open source, so you can find the class name) and then search for it on findjar.
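As a quick complement to the build-path check, a tiny runtime probe can confirm whether the class is visible at all (a sketch; ClasspathCheck is just an illustrative name, and the class name is the one from the stack trace above):

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.commons.cli.ParseException");
            System.out.println("commons-cli is on the classpath.");
        } catch (ClassNotFoundException e) {
            // Missing: add commons-cli (it ships in hadoop/lib) to the Eclipse build path.
            System.out.println("commons-cli is NOT on the classpath.");
        }
    }
}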