Pig permission denied - hadoop

I wrote a simple Pig script, and it runs fine from the terminal.
However, when I try to run the script from the browser through the Apache server, it throws the following error:
[main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: dummy (Permission denied)
[main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2997: Encountered IOException. dummy (Permission denied)
Any ideas where to fix the problem? If it helps, nowhere in my script do I use anything called "dummy".
P.S.: I changed the permissions on the /tmp HDFS directory to 777; it didn't help.
I checked that the local /tmp directory already has 777 permissions; that didn't matter either.
I tried pointing -Dpig.temp.dir at a directory that Apache has permission to write to; that didn't help either.
Please help; did I miss something?
Edit: It is worth mentioning that this error occurs while checking the script for compilation, i.e., pig -c ScriptName:
2013-09-06 10:41:19,344 [main] INFO org.apache.pig.Main - Apache Pig version 0.10.0 (r1328203) compiled Apr 19 2012, 22:54:12
2013-09-06 10:41:19,344 [main] INFO org.apache.pig.Main - Logging error messages to: /data/storage/pig-0.10.0/logs/pig_1378444279340.log
2013-09-06 10:41:19,658 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://[master]:9000
2013-09-06 10:41:19,743 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to map-reduce job tracker at: [master]:9001
2013-09-06 10:41:19,866 [main] INFO org.apache.pig.scripting.jython.JythonScriptEngine - created tmp python.cachedir=/tmp/pig_jython_4266457116882300725
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/lib/tools.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/hadoop-core-0.20.203.0.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/aspectjrt-1.6.5.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/aspectjtools-1.6.5.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-beanutils-1.7.0.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-beanutils-core-1.8.0.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-cli-1.2.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-codec-1.4.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-collections-3.2.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-configuration-1.6.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-configurationhadoop-1.6.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-daemon-1.0.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-digester-1.8.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-el-1.0.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-httpclient-3.0.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-lang-2.4.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-logging-1.1.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-logging-api-1.0.4.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-math-2.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/commons-net-1.4.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/core-3.1.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/guava-r09.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/hadoop-fairscheduler-0.20.203.0.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/hbase-0.94.4.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/hsqldb-1.8.0.10.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/hstreaming-all.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jackson-core-asl-1.0.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jackson-mapper-asl-1.0.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jasper-compiler-5.5.12.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jasper-runtime-5.5.12.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jets3t-0.6.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jetty-6.1.26.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jetty-util-6.1.26.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jsch-0.1.42.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/junit-4.5.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/kfs-0.2.2.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/log4j-1.2.15.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/mockito-all-1.8.5.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/oro-2.0.8.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/protobuf-java-2.4.0a.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/servlet-api-2.5-20081211.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/slf4j-api-1.4.3.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/slf4j-log4j12-1.4.3.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/xmlenc-0.52.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/zookeeper-3.4.5.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jsp-2.1/jsp-2.1.jar'
*sys-package-mgr*: processing new jar, '/home/hadoop/hadoop-0.20.203.0/lib/jsp-2.1/jsp-api-2.1.jar'
*sys-package-mgr*: processing new jar, '/data/storage/pig-0.10.0/lib/automaton.jar'
*sys-package-mgr*: processing new jar, '/data/storage/pig-0.10.0/lib/jython-2.5.0.jar'
*sys-package-mgr*: processing new jar, '/data/storage/pig-0.10.0/pig-0.10.0-withouthadoop.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/resources.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/rt.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/jsse.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/jce.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/charsets.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/sunpkcs11.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/sunec.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/sunjce_provider.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/zipfs.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/localedata.jar'
*sys-package-mgr*: processing new jar, '/usr/java/jdk1.7.0_03/jre/lib/ext/dnsns.jar'
2013-09-06 10:41:28,086 [main] INFO org.apache.pig.scripting.jython.JythonScriptEngine - Register scripting UDF: pythonDefpyUDF.[UDF1]
2013-09-06 10:41:28,087 [main] INFO org.apache.pig.scripting.jython.JythonScriptEngine - Register scripting UDF: pythonDefpyUDF.[UDF2]
2013-09-06 10:41:28,266 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2013-09-06 10:41:28,421 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2013-09-06 10:41:28,578 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2013-09-06 10:41:29,665 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2013-09-06 10:41:29,712 [main] WARN org.apache.pig.tools.grunt.GruntParser - 'rm/rmf' statement is ignored while processing 'explain -script' or '-check'
2013-09-06 10:41:29,774 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2013-09-06 10:41:29,814 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: dummy (Permission denied)
2013-09-06 10:41:29,815 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2997: Encountered IOException. dummy (Permission denied)
Details at logfile: /data/storage/pig-0.10.0/logs/pig_1378444279340.log

It seems to be a permission issue for the apache user.
While running the command pig -c ScriptName, compare the output from both places:
Output from the shell
Output from the script when executed through the Apache server
Also check the permissions on the log files Pig is trying to write to. This should help you resolve the issue.
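To narrow this down, a small sketch like the following can report whether a given directory is writable by the invoking user; run it as the user Apache executes Pig under. The default log path is taken from the log output above, and `check_writable` is an illustrative helper, not a Pig or Hadoop utility.

```shell
#!/bin/sh
# Minimal writability check; run as the user Apache runs Pig under.
# The default path below is an assumption taken from the log output.
PIG_LOG_DIR=${PIG_LOG_DIR:-/data/storage/pig-0.10.0/logs}

check_writable() {
    # Prints whether the given directory is writable by the current user.
    if [ -w "$1" ]; then
        echo "writable: $1"
    else
        echo "NOT writable: $1"
    fi
}

check_writable "$PIG_LOG_DIR"
check_writable /tmp
```

For example, `sudo -u www-data sh check_dirs.sh` (the `www-data` user name is an assumption; the Apache user varies by distribution).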

Related

Not able to export Hbase table into CSV file using HUE Pig Script

I have installed Apache Ambari and configured Hue. I want to export HBase table data into a CSV file using a Pig script, but I am getting the following error:
2017-06-03 10:27:45,518 [ATS Logger 0] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Exception caught by TimelineClientConnectionRetry, will try 30 more time(s).
Message: java.net.ConnectException: Connection refused
2017-06-03 10:27:45,703 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-06-03 10:27:45,709 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 101: file '/usr/lib/hbase/lib/hbase-common-1.2.0-cdh5.11.0.jar' does not exist.
2017-06-03 10:27:45,899 [main] INFO org.apache.pig.Main - Pig script completed in 4 seconds and 532 milliseconds (4532 ms)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.PigMain], exit code 2
Oozie Launcher failed, finishing Hadoop job gracefully
Please help me figure out where I am going wrong.

Getting error while loading file in Pig

I am trying to execute a Pig script in the terminal and I am getting the following error:
INFO [Thread-13] org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
WARN [Thread-13] org.apache.hadoop.mapred.JobClient - No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
INFO [Thread-13] org.apache.hadoop.mapred.JobClient - Cleaning up the staging area file:/tmp/hadoop-biadmin/mapred/staging/biadmin-341199244/.staging/job_local_0001
ERROR [Thread-13] org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:biadmin cause:org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path does not exist: file:/home/biadmin/PIGData/books.csv
ERROR [main] org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backend error: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path does not exist: file:/home/biadmin/PIGData/books.csv
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:285)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
at org.apache.hadoop.mapred.jobcontrol.Job.submit(Job.java:378)
at org.apache.hadoop.mapred.jobcontrol.JobControl.startReadyJobs(JobControl.java:247)
at org.apache.hadoop.mapred.jobcontrol.JobControl.run(JobControl.java:279)
at java.lang.Thread.run(Thread.java:738)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/biadmin/PIGData/books.csv
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:273)
... 15 more
ERROR [main] org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
ERROR [main] org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b
Details at logfile: /opt/ibm/biginsights/pig/bin/pig_1487413261020.log
Can anybody help me resolve this? The code:
data = LOAD '/home/biadmin/PIGData/books.csv';
b = FOREACH data GENERATE $0;
DUMP b;
Based on the above exception, the input file is not present at the given path file:/home/biadmin/PIGData/books.csv (which is a local file system path).
Pig has two execution modes:
1. Local mode (to process local file system files):
$ pig -x local
2. MapReduce mode (to process HDFS files):
$ pig or $ pig -x mapreduce
Make sure you are running the Pig script in the appropriate mode.
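One quick way to see which mode applies is to check whether the LOAD path exists on the local file system at all. A minimal sketch, using the path from the question (`check_input` is an illustrative helper, not part of Pig):

```shell
#!/bin/sh
# check_input: report whether a LOAD path exists on the local file system.
# If it does, "pig -x local" can read it; if it only exists in HDFS,
# run the script in mapreduce mode instead.
check_input() {
    if [ -f "$1" ]; then
        echo "local file exists: $1 (pig -x local can read it)"
    else
        echo "not on local disk: $1 (is it in HDFS? use mapreduce mode)"
    fi
}

check_input /home/biadmin/PIGData/books.csv
```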

Cannot write to log - pig error in local mode

Folks,
I am new to the Hadoop ecosystem and was trying to install Pig. I got stuck with the error below while trying to execute the tutorial script.
While installing Pig on the Hadoop cluster I did not change any permissions; please let me know whether I need to, or whether the error below relates to that.
Any information is highly appreciated.
2016-01-08 07:19:20,655 [main] WARN org.apache.pig.Main - Cannot write to
log file: /usr/lib/pig-0.6.0/tutorial/scripts/pig_1452266360654.log
2016-01-08 07:19:20,723 [main] WARN org.apache.pig.Main - Cannot write to log file: /usr/lib/pig-0.6.0/tutorial/scripts//script1-hadoop.pig1452266360723.log
2016-01-08 07:19:20,740 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. null
2016-01-08 07:19:20,747 [main] WARN org.apache.pig.Main - There is no log file to write to.
2016-01-08 07:19:20,747 [main] ERROR org.apache.pig.Main - java.lang.NullPointerException
at java.util.Hashtable.put(Hashtable.java:394)
at java.util.Properties.setProperty(Properties.java:143)
at org.apache.pig.Main.main(Main.java:373)

Using Local File in Spring Yarn application

I need to deploy a WAR file in a YARN container. For that I am using Spring Batch to create a YARN application that uses an embedded Jetty server to deploy the WAR file. When I use a custom handler for the request, it works fine and the application gets deployed on YARN; but when I use the WAR file as the handler it doesn't work and I get an error:
java.io.FileNotFoundException: /root/jettywebapps/webapps/Login.war
The file is actually present at the given location; I am stuck on how to access the WAR file from the YARN container.
@OnContainerStart
public void publicVoidNoArgsMethod() throws Exception {
    String jetty_home = "/root/jettywebapps";
    Server server = new Server(9090);
    WebAppContext webapp = new WebAppContext();
    webapp.setContextPath("/Login");
    webapp.setWar(jetty_home + "/webapps/Login.war");
    server.setHandler(webapp);
    server.start();
    server.join();
}
Here is the Stack Trace
[2015-04-17 06:05:14.972] boot - 26920 INFO [main] --- ContainerLauncherRunner: Running YarnContainer with parameters []
[2015-04-17 06:05:14.972] boot - 26920 INFO [main] --- ContainerLauncherRunner: Container requested that we wait state, setting up latch
[2015-04-17 06:05:14.975] boot - 26920 INFO [main] --- DefaultYarnContainer: Processing 1 #YarnComponent handlers
[2015-04-17 06:05:15.038] boot - 26920 INFO [main] --- Server: jetty-8.0.4.v20111024
[2015-04-17 06:05:15.080] boot - 26920 WARN [main] --- WebInfConfiguration: Web application not found /root/jettywebapps/webapps/Login.war
[2015-04-17 06:05:15.081] boot - 26920 WARN [main] --- WebAppContext: Failed startup of context o.e.j.w.WebAppContext{/Login,null},/root/jettywebapps/webapps/Login.war
java.io.FileNotFoundException: /root/jettywebapps/webapps/Login.war
at org.eclipse.jetty.webapp.WebInfConfiguration.unpack(WebInfConfiguration.java:479)
at org.eclipse.jetty.webapp.WebInfConfiguration.preConfigure(WebInfConfiguration.java:52)
at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:416)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:452)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59)
at org.eclipse.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:89)
at org.eclipse.jetty.server.Server.doStart(Server.java:262)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59)
at hello.container.HelloPojo.publicVoidNoArgsMethod(HelloPojo.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.expression.spel.support.ReflectiveMethodExecutor.execute(ReflectiveMethodExecutor.java:112)
at org.springframework.expression.spel.ast.MethodReference.getValueInternal(MethodReference.java:129)
at org.springframework.expression.spel.ast.MethodReference.access$000(MethodReference.java:49)
at org.springframework.expression.spel.ast.MethodReference$MethodValueRef.getValue(MethodReference.java:342)
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:88)
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:131)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:330)
at org.springframework.yarn.support.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:126)
at org.springframework.yarn.container.ContainerMethodInvokerHelper.processInternal(ContainerMethodInvokerHelper.java:229)
at org.springframework.yarn.container.ContainerMethodInvokerHelper.process(ContainerMethodInvokerHelper.java:115)
at org.springframework.yarn.container.MethodInvokingYarnContainerRuntimeProcessor.process(MethodInvokingYarnContainerRuntimeProcessor.java:51)
at org.springframework.yarn.container.ContainerHandler.handle(ContainerHandler.java:99)
at org.springframework.yarn.container.DefaultYarnContainer.getContainerHandlerResults(DefaultYarnContainer.java:174)
at org.springframework.yarn.container.DefaultYarnContainer.runInternal(DefaultYarnContainer.java:77)
Please help.
Thanks.
The container can fetch local or HDFS files using code along these lines:
import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Local file system:
Configuration conf = new Configuration();
FileSystem localFS = FileSystem.get(URI.create("file:///"), conf);
OutputStream outATXT = localFS.create(new Path("/home/walterchen/a.txt"));
// Or HDFS:
FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
OutputStream out = fs.create(new Path("/home/a.txt"));

ERROR 2998: Unhandled internal error. Run the code

Executing the command pig -x local -f /Hbase/load_hbase.pig, I get the following error:
2014-11-08 23:36:47,455 [main] INFO org.apache.pig.Main - Apache Pig version 0.12.1 (r1585011) compiled Apr 05 2014, 01:41:34
2014-11-08 23:36:47,455 [main] INFO org.apache.pig.Main - Logging error messages to: /home/eduardo/pig_1415497007452.log
2014-11-08 23:36:47,817 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/eduardo/.pigbootup not found
2014-11-08 23:36:47,918 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2014-11-08 23:36:48,436 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org/apache/hadoop/hbase/filter/WritableByteArrayComparable
Here is the code that I run:
raw_data = LOAD '/data/QCLCD201211/201201hourly.txt' USING PigStorage(',');
weather_data = FOREACH raw_data GENERATE $1, $10;
ranked_data = RANK weather_data;
final_data = FILTER ranked_data BY $0 IS NOT NULL;
STORE final_data INTO 'hbase://weather'
USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:date info:temp');
I wonder what I'm doing wrong. Here are the versions of Hadoop, HBase, and Pig:
Hadoop: hadoop-1.2.1
Hbase: hbase-0.96.2-hadoop1
Pig: pig-0.12.1
Copy the Pig and HBase jars into the Hadoop library:
1) Copy these files to the Hadoop library:
sudo cp /usr/lib/pig/lib/pig-common-0.8.0-cdh3u0.jar /usr/lib/hadoop/lib/
sudo cp /usr/lib/pig/lib/hbase-0.96.2-cdh3u0.jar /usr/lib/hadoop/lib/
2) Stop Hadoop and HBase using the following commands:
/usr/lib/hadoop/bin/stop-all.sh
/usr/lib/hbase/bin/stop-hbase.sh
3) Restart Hadoop and HBase:
/usr/lib/hadoop/bin/start-all.sh
/usr/lib/hbase/bin/start-hbase.sh
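As an alternative to copying jars into Hadoop's lib directory, the HBase jars can usually be put on Pig's classpath through the PIG_CLASSPATH environment variable, which the pig launcher script reads. The exact jar names and paths below are assumptions based on the versions mentioned in the question; adjust them to your install.

```shell
#!/bin/sh
# Hypothetical classpath setup; jar names/paths are assumptions, adjust
# them to your actual HBase install before running Pig.
HBASE_HOME=${HBASE_HOME:-/usr/lib/hbase}
export PIG_CLASSPATH="$HBASE_HOME/hbase-0.96.2-hadoop1.jar:$HBASE_HOME/lib/*:$PIG_CLASSPATH"
echo "$PIG_CLASSPATH"
```

This avoids polluting Hadoop's lib directory and surviving the next Hadoop upgrade is easier, at the cost of having to export the variable wherever Pig is launched.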
