I am using spring-data-rest and Hibernate to expose a table "File(Id, Name)" from a SAP IQ (Sybase IQ) database. The error below occurs when I do a GET on the "File" table using "curl http://localhost:8080/files/1".
2022-07-20 20:01:03 WARN [http-nio-8081-exec-1] o.h.e.jdbc.spi.SqlExceptionHelper - SQL Error: 0, SQLState: JZ0SA
2022-07-20 20:01:03 ERROR [http-nio-8081-exec-1] o.h.e.jdbc.spi.SqlExceptionHelper - JZ0SA: Prepared Statement: Input parameter not set, index: 0.
2022-07-20 20:01:03 INFO [http-nio-8081-exec-1] o.h.e.i.DefaultLoadEventListener - HHH000327: Error performing load command
org.hibernate.exception.GenericJDBCException: could not extract ResultSet
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:99)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.extract(ResultSetReturnImpl.java:67)
I have the same table in "Sybase 16", and it works perfectly with the same code base.
Has anyone faced a similar issue?
Thanks in advance.
I have been facing an issue while handling the datatype of the field qty and doing a SUM on that field. The code is below. I converted qty to double but am still getting the error shown below. Can someone please help me understand this issue and, if possible, suggest a solution?
A_test1 = load 'EXT_OO_IMP' USING PigStorage('\u0001') AS (it: chararray,loc: chararray,qty: chararray,scheddate: chararray,udc_cta_no: chararray,udc_imp_pack_qty: chararray,udc_imp_ready_dt: chararray,udc_imp_ref_no: chararray,udc_ord_sys_cd: chararray,udc_source: chararray,udc_sply_typ: chararray,udc_vend_pack_id: chararray,udc_purch_stg: chararray,srs_pack_flow_indicator_cd: chararray,it_type_cd: chararray,source_owner_cd: chararray,nks_id: chararray,alloc_replen_cd: chararray);
----- ext_oo_import: {it: chararray,loc: chararray,qty: chararray,scheddate: chararray,udc_cta_no: chararray,udc_imp_pack_qty: chararray,udc_imp_ready_dt: chararray,udc_imp_ref_no: chararray,udc_ord_sys_cd: chararray,udc_source: chararray,udc_sply_typ: chararray,udc_vend_pack_id: chararray,udc_purch_stg: chararray,srs_pack_flow_indicator_cd: chararray,it_type_cd: chararray,source_owner_cd: chararray,nks_id: chararray,alloc_replen_cd: chararray}
----- ############## ############## ##############
import_on_order =
FOREACH A_test1
GENERATE
loc,
it,
nks_id,
(double)(qty is NULL ? 0 : qty) as qty:double,
scheddate,
' ' AS order_source,
' ' AS chs_it_type_cd;
describe import_on_order;
----- import_on_order: {loc: chararray,it: chararray,nks_id: chararray,qty: int,scheddate: chararray,order_source: chararray,chs_it_type_cd: chararray}
grp_import_on_order = GROUP import_on_order BY (loc,it,nks_id,scheddate,order_source,chs_it_type_cd);
describe grp_import_on_order;
----- grp_import_on_order: {group: (loc: chararray,it: chararray,nks_id: chararray,scheddate: chararray,order_source: chararray,chs_it_type_cd: chararray),import_on_order: {(loc: chararray,it: chararray,nks_id: chararray,qty: int,scheddate: chararray,order_source: chararray,chs_it_type_cd: chararray)}}
------------------------------- STORE TO FILE ---------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------
work__idrp_import_on_order =
FOREACH grp_import_on_order
GENERATE group.loc AS loc,
group.it AS it,
group.nks_id AS nks_id,
SUM(import_on_order.qty) AS qty,
group.scheddate AS scheddate,
group.order_source AS order_source,
group.chs_it_type_cd AS chs_it_type_cd;
describe work__idrp_import_on_order;
----- work__idrp_import_on_order: {loc: chararray,it: chararray,nks_id: chararray,qty: int,scheddate: chararray,order_source: chararray,chs_it_type_cd: chararray}
import_on_order_rp =
FOREACH ext_oo_import
GENERATE
it AS chs_it,
loc AS chs_loc,
(qty is NULL ? 0 : qty) as qty:double,
scheddate AS current_due_dt,
' ' AS order_source,
'V' AS source_type_cd,
udc_sply_typ AS sply_typ,
udc_ord_sys_cd AS ord_sys_cd;
2019-01-31 09:03:30,819 [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 0: Exception while executing (Name: grp_import_on_order: Local Rearrange[tuple]{tuple}(false) - scope-1095 Operator Key: scope-1095): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: work__idrp_import_on_order: New For Each(false,false)[bag] - scope-1078 Operator Key: scope-1078): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: Pre Combiner Local Rearrange[tuple]{Unknown} - scope-1097 Operator Key: scope-1097): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: import_on_order: New For Each(false,false,false,false,false,false,false)[bag] - scope-977 Operator Key: scope-977): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: ext_oo_import: New For Each(false,false,false,false,false)[bag] - scope-957 Operator Key: scope-957): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: New For Each(false,false,false,false,false)[bag] - scope-945 Operator Key: scope-945): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POCast (Name: Cast[double] - scope-926 Operator Key: scope-926) children: [[POProject (Name: Project[chararray][2] - scope-925 Operator Key: scope-925) children: null at []]] at []]: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
Details at logfile: /logs/hdidrp/pig/pig_1548942743751.log
2019-01-31 09:03:30,849 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2019-01-31 09:03:31,012 [main] WARN org.apache.pig.PigServer - Encountered Warning IMPLICIT_CAST_TO_DOUBLE 1 time(s).
import_on_order_rp: {shc_item: chararray,shc_loc: chararray,qty: double,current_due_dt: chararray,order_source: chararray,source_type_cd: chararray,sply_typ: chararray,ord_sys_cd: chararray}
2019-01-31 09:03:31,179 [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 0: Exception while executing (Name: grp_import_on_order: Local Rearrange[tuple]{tuple}(false) - scope-1095 Operator Key: scope-1095): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: work__idrp_import_on_order: New For Each(false,false)[bag] - scope-1078 Operator Key: scope-1078): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: Pre Combiner Local Rearrange[tuple]{Unknown} - scope-1097 Operator Key: scope-1097): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: import_on_order: New For Each(false,false,false,false,false,false,false)[bag] - scope-977 Operator Key: scope-977): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: ext_oo_import: New For Each(false,false,false,false,false)[bag] - scope-957 Operator Key: scope-957): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: New For Each(false,false,false,false,false)[bag] - scope-945 Operator Key: scope-945): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POCast (Name: Cast[double] - scope-926 Operator Key: scope-926) children: [[POProject (Name: Project[chararray][2] - scope-925 Operator Key: scope-925) children: null at []]] at []]: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
After looking at the code: in the first statement you load the data and perform the transformation steps, but in the last statement you transform the first data set again. That data set is still of string type, and working on it is what throws the exception.
import_on_order_rp = FOREACH ext_oo_import GENERATE
    it AS chs_it, loc AS chs_loc,
    (qty is NULL ? 0 : qty) as qty:double,
    scheddate AS current_due_dt, ' ' AS order_source,
    'V' AS source_type_cd, udc_sply_typ AS sply_typ,
    udc_ord_sys_cd AS ord_sys_cd;
See if that is the case.
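If it helps, here is a minimal sketch of the problematic FOREACH (untested, and assuming qty actually holds numeric strings) with the cast moved inside the bincond so both branches are already double, instead of mixing the int literal 0 with the chararray field:

import_on_order =
FOREACH A_test1
GENERATE
loc,
it,
nks_id,
-- cast qty before the bincond so both branches are double,
-- avoiding the Integer/String cast confusion at runtime
(qty is NULL ? 0.0 : (double)qty) AS qty:double,
scheddate,
' ' AS order_source,
' ' AS chs_it_type_cd;

With qty genuinely double, the later SUM(import_on_order.qty) should also come out as double rather than int.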
I am running Cassandra and have about 20k records in it to play with. I am trying to run a filter in Pig on this data, but I am getting the following message back:
2015-07-23 13:02:23,559 [Thread-4] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local_0001
java.lang.RuntimeException: com.datastax.driver.core.exceptions.InvalidQueryException: Expected 8 or 0 byte long (1)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:260)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:205)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:532)
at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Expected 8 or 0 byte long (1)
at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:35)
at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:263)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:179)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:44)
at org.apache.cassandra.hadoop.cql3.CqlRecordReader$RowIterator.<init>(CqlRecordReader.java:259)
at org.apache.cassandra.hadoop.cql3.CqlRecordReader.initialize(CqlRecordReader.java:151)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:256)
... 7 more
You would think this is an obvious error, and believe me, there are a ton of results on Google for it. It's clear that some piece of my data isn't conforming to the expected type of a given column. What I don't understand is 1) why this is happening, and 2) how to debug it. If I try to insert invalid data into Cassandra from my Node.js app, it throws this kind of error when my data type doesn't match the column's data type, which suggests this situation shouldn't even be possible. I've read that data validation using UTF8 is wonky and that setting a different kind of validation is the answer, but I don't know how to do that. Here are my steps to reproduce:
grunt> define CqlNativeStorage org.apache.cassandra.hadoop.pig.CqlNativeStorage();
grunt> test = load 'cql://blah/blahblah' USING CqlNativeStorage();
grunt> describe test;
13:09:54.544 [main] DEBUG o.a.c.hadoop.pig.CqlNativeStorage - Found ksDef name: blah
13:09:54.544 [main] DEBUG o.a.c.hadoop.pig.CqlNativeStorage - partition keys: ["ad_id"]
13:09:54.544 [main] DEBUG o.a.c.hadoop.pig.CqlNativeStorage - cluster keys: []
13:09:54.544 [main] DEBUG o.a.c.hadoop.pig.CqlNativeStorage - row key validator: org.apache.cassandra.db.marshal.UTF8Type
13:09:54.544 [main] DEBUG o.a.c.hadoop.pig.CqlNativeStorage - cluster key validator: org.apache.cassandra.db.marshal.CompositeType(org.apache.cassandra.db.marshal.UTF8Type)
blahblah: {ad_id: chararray,address: chararray,city: chararray,date_created: long,date_listed: long,fireplace: bytearray,furnished: bytearray,garage: bytearray,neighbourhood: chararray,num_bathrooms: int,num_bedrooms: int,pet_friendly: bytearray,postal_code: chararray,price: double,province: chararray,square_feet: int,url: chararray,utilities_included: bytearray}
grunt> query1 = FILTER blahblah BY city == 'New York';
grunt> dump query1;
Then it runs for a while, dumps out tons of logs, and the error appears.
Discovered my problem: the Pig partitioner did not match the one CQL3 was using, and therefore the data was being parsed incorrectly. Previously the environment variable was PIG_PARTITIONER=org.apache.cassandra.dht.RandomPartitioner. After I changed it to PIG_PARTITIONER=org.apache.cassandra.dht.Murmur3Partitioner it started working.
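For reference, the change amounts to setting the variable before launching Pig (assuming a bash-style shell; adjust to however your environment exports variables):

export PIG_PARTITIONER=org.apache.cassandra.dht.Murmur3Partitioner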
I'm trying to run a script, but I'm getting an error. Here is my script:
X = load 'hdfs://localhost:54310/testing/abcd.txt' USING PigStorage('\t') AS (user, time, query);
Y = LIMIT X 10;
dump Y;
This is the error I'm getting while executing the above script.
2014-06-15 14:31:42,438 [main] INFO org.apache.spark.scheduler.DAGScheduler - Failed to run saveAsNewAPIHadoopFile at StoreConverter.java:58
2014-06-15 14:31:42,491 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2043: Unexpected error during execution.
Pig script
base = load 'u.base' as (uid:long, gid:long, pref:double);
sim1 = mapreduce 'mahout-core-0.7-job.jar'
store base into 'input'
load 'output' as (gid1:long, gid2:long, sim:double)
`org.apache.mahout.cf.taste.hadoop.similarity.item.ItemSimilarityJob -i input -o output -s SIMILARITY_EUCLIDEAN_DISTANCE`;
sim2 = foreach sim1 generate gid2 as gid1, gid1 as gid2, sim;
sim3 = union sim1,sim2;
dump sim3;
Pig output
2013-03-28 09:21:32,564 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNION,NATIVE
2013-03-28 09:21:32,676 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2013-03-28 09:21:32,699 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 4
2013-03-28 09:21:32,702 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2127: Internal Error: Cloning of plan failed for optimization.
Details at logfile: /home/chenwl/logs/pig_1364433685680.log
Pig log
Pig Stack Trace
---------------
ERROR 2127: Internal Error: Cloning of plan failed for optimization.
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias sim3
at org.apache.pig.PigServer.openIterator(PigServer.java:836)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
at org.apache.pig.tools.grunt.GruntParser.loadScript(GruntParser.java:531)
at org.apache.pig.tools.grunt.GruntParser.processScript(GruntParser.java:480)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.Script(PigScriptParser.java:804)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:449)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:538)
at org.apache.pig.Main.main(Main.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias sim3
at org.apache.pig.PigServer.storeEx(PigServer.java:935)
at org.apache.pig.PigServer.store(PigServer.java:898)
at org.apache.pig.PigServer.openIterator(PigServer.java:811)
... 16 more
Caused by: org.apache.pig.impl.plan.optimizer.OptimizerException: ERROR 2127: Internal Error: Cloning of plan failed for optimization.
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer.mergeDiamondMROper(MultiQueryOptimizer.java:304)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer.visitMROp(MultiQueryOptimizer.java:219)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceOper.visit(MapReduceOper.java:273)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceOper.visit(MapReduceOper.java:46)
at org.apache.pig.impl.plan.ReverseDependencyOrderWalker.walk(ReverseDependencyOrderWalker.java:71)
at org.apache.pig.impl.plan.PlanVisitor.visit(PlanVisitor.java:46)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer.visit(MultiQueryOptimizer.java:94)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.compile(MapReduceLauncher.java:617)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:146)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1264)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1249)
at org.apache.pig.PigServer.storeEx(PigServer.java:931)
... 18 more
Caused by: java.lang.CloneNotSupportedException: Unable to find clone for op 1-36: Native('hadoop jar mahout-core-0.7-job.jar org.apache.mahout.cf.taste.hadoop.similarity.item.ItemSimilarityJob -i input -o output -s SIMILARITY_EUCLIDEAN_DISTANCE ') - scope-12
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.plans.PhysicalPlan.clone(PhysicalPlan.java:273)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer.mergeDiamondMROper(MultiQueryOptimizer.java:298)
... 29 more
================================================================================
Environment
OS: Ubuntu 12.04
Hadoop: 1.0.4 Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1393290
Pig: 0.11.0 (r1446324)
P.S.:
It works if sim1 is loaded from HDFS, e.g. sim1 = load 'sim' as (gid1:long, gid2:long, sim:double);
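Based on that P.S., a possible workaround is sketched below (untested; the path 'sim' and the alias sim1_hdfs are illustrative): materialize the result of the native job and reload it from HDFS, so the union no longer references the mapreduce operator that the multi-query optimizer fails to clone.

-- store the native job's result, then reload it before the union
store sim1 into 'sim';
sim1_hdfs = load 'sim' as (gid1:long, gid2:long, sim:double);
sim2 = foreach sim1_hdfs generate gid2 as gid1, gid1 as gid2, sim;
sim3 = union sim1_hdfs, sim2;
dump sim3;

Pig should recognize a store followed by a load of the exact same path string and order the jobs accordingly, so this ought to work within a single script.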