Unable to execute import-hive.sh - hadoop

I am getting the error below while running import-hive.sh. Could you please help me out with this?
hadoop@0.0.0.0:~/apache-atlas-2.1.0/hook/apache-atlas-hive-hook-2.1.0/hook-bin$ ./import-hive.sh
Using Hive configuration directory [/home/hadoop/hive/conf]
Log file for import is /home/hadoop/apache-atlas-2.1.0/hook/apache-atlas-hive-hook-2.1.0/logs/import-hive.log
2021-07-13T15:43:21,449 INFO [main] org.apache.atlas.ApplicationProperties - Looking for atlas-application.properties in classpath
2021-07-13T15:43:21,452 INFO [main] org.apache.atlas.ApplicationProperties - Loading atlas-application.properties from file:/home/hadoop/hive/conf/atlas-application.properties
2021-07-13T15:43:21,505 INFO [main] org.apache.atlas.ApplicationProperties - Using graphdb backend 'janus'
2021-07-13T15:43:21,505 INFO [main] org.apache.atlas.ApplicationProperties - Using storage backend 'hbase2'
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Using index backend 'solr'
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Atlas is running in MODE: PROD.
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Setting solr-wait-searcher property 'true'
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Setting index.search.map-name property 'false'
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Setting atlas.graph.index.search.max-result-set-size = 150
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache = true
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache-clean-wait = 20
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache-size = 0.5
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.tx-cache-size = 15000
2021-07-13T15:43:21,506 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.tx-dirty-size = 120
Enter username for atlas :- admin
Enter password for atlas :-
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/client/ConnectionConfigurator
at org.apache.atlas.AtlasBaseClient.getClient(AtlasBaseClient.java:287)
at org.apache.atlas.AtlasBaseClient.initializeState(AtlasBaseClient.java:454)
at org.apache.atlas.AtlasBaseClient.initializeState(AtlasBaseClient.java:449)
at org.apache.atlas.AtlasBaseClient.<init>(AtlasBaseClient.java:132)
at org.apache.atlas.AtlasClientV2.<init>(AtlasClientV2.java:94)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:134)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.security.authentication.client.ConnectionConfigurator
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 6 more
Failed to import Hive Meta Data!!!
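No answer was recorded for this one, but the stack trace is diagnosable: org.apache.hadoop.security.authentication.client.ConnectionConfigurator ships in Hadoop's hadoop-auth jar, so the hook script is apparently running without the Hadoop client jars on its classpath. A minimal sketch of a fix, assuming a local Hadoop install under /home/hadoop/hadoop (both paths below are assumptions for this setup):

# Put the Hadoop client jars on the classpath the hook script uses, then retry.
export HADOOP_HOME=/home/hadoop/hadoop                      # assumed install location
export HADOOP_CLASSPATH="$("$HADOOP_HOME"/bin/hadoop classpath)"
# Alternatively, copy the specific jar next to the hook's own libraries
# (directory name is an assumption based on the hive-hook tarball layout):
cp "$HADOOP_HOME"/share/hadoop/common/lib/hadoop-auth-*.jar \
   ~/apache-atlas-2.1.0/hook/apache-atlas-hive-hook-2.1.0/hook/hive/atlas-hive-plugin-impl/
./import-hive.sh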

Why are so many connections used by Spring Reactive with Mongo

I got a 'MongoWaitQueueFullException' and realized how many connections my application is using. I use the default configuration of Spring Boot (2.2.7.RELEASE) with reactive MongoDB (4.2.8). Transactions are used.
Even when running an integration test that basically creates a bit more than 200 elements and then groups them (200 groups), 10 connections are used. When this algorithm is executed over a real data set, the exception is thrown because the default limit of the waiting queue (500) is reached. This does not make the application scalable.
My question is: is there a way to design a reactive application that helps to reduce the number of connections?
This is what my test executes. Basically, it scans all translations of bundle files and then groups them per translation key. An element is persisted per translation key.
return Flux
        .fromIterable(bundleFile.getFiles())
        .map(ScannedBundleFileEntry::getLocale)
        .flatMap(locale ->
                handler
                        .scanTranslations(bundleFileEntity.toLocation(), locale, context)
                        .index()
                        .map(indexedTranslation ->
                                createTranslation(
                                        workspaceEntity,
                                        bundleFileEntity,
                                        locale.getId(),
                                        indexedTranslation.getT1(),           // index
                                        indexedTranslation.getT2().getKey(),  // bundle key
                                        indexedTranslation.getT2().getValue() // translation
                                )
                        )
                        .flatMap(bundleKeyTemporaryRepository::save)
        )
        .thenMany(groupIntoBundleKeys(bundleFileEntity))
        .then(bundleKeyTemporaryRepository.deleteByBundleFile(bundleFileEntity.getId()))
        .then(Mono.just(bundleFileEntity));
The grouping function:
private Flux<BundleKeyEntity> groupIntoBundleKeys(BundleFileEntity bundleFile) {
    return this
            .findBundleKeys(bundleFile)
            .groupBy(BundleKeyGroupKey::new)
            .flatMap(bundleKeyGroup ->
                    bundleKeyGroup
                            .collectList()
                            .map(bundleKeys -> {
                                final BundleKeyGroupKey key = bundleKeyGroup.key();
                                final BundleKeyEntity entity = new BundleKeyEntity(key.getWorkspace(), key.getBundleFile(), key.getKey());
                                bundleKeys.forEach(entity::mergeInto);
                                return entity;
                            })
            )
            .flatMap(bundleKeyEntityRepository::save);
}
The test output:
560 [main] INFO o.s.b.t.c.SpringBootTestContextBootstrapper - Neither @ContextConfiguration nor @ContextHierarchy found for test class [be.sgerard.i18n.controller.TranslationControllerTest], using SpringBootContextLoader
569 [main] INFO o.s.t.c.s.AbstractContextLoader - Could not detect default resource locations for test class [be.sgerard.i18n.controller.TranslationControllerTest]: no resource found for suffixes {-context.xml, Context.groovy}.
870 [main] INFO o.s.b.t.c.SpringBootTestContextBootstrapper - Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener, org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener, org.springframework.boot.test.autoconfigure.restdocs.RestDocsTestExecutionListener, org.springframework.boot.test.autoconfigure.web.client.MockRestServiceServerResetTestExecutionListener, org.springframework.boot.test.autoconfigure.web.servlet.MockMvcPrintOnlyOnFailureTestExecutionListener, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverTestExecutionListener, org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DirtiesContextBeforeModesTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener, org.springframework.test.context.event.EventPublishingTestExecutionListener, org.springframework.security.test.context.support.WithSecurityContextTestExecutionListener, org.springframework.security.test.context.support.ReactorContextTestExecutionListener]
897 [main] INFO o.s.b.t.c.SpringBootTestContextBootstrapper - Using TestExecutionListeners: [org.springframework.test.context.support.DirtiesContextBeforeModesTestExecutionListener@4372b9b6, org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener@232a7d73, org.springframework.boot.test.autoconfigure.SpringBootDependencyInjectionTestExecutionListener@4b41e4dd, org.springframework.test.context.support.DirtiesContextTestExecutionListener@22ffa91a, org.springframework.test.context.transaction.TransactionalTestExecutionListener@74960bfa, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener@42721fe, org.springframework.test.context.event.EventPublishingTestExecutionListener@40844aab, org.springframework.security.test.context.support.WithSecurityContextTestExecutionListener@1f6c9cd8, org.springframework.security.test.context.support.ReactorContextTestExecutionListener@5b619d14, org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener@66746f57, org.springframework.boot.test.autoconfigure.restdocs.RestDocsTestExecutionListener@447a020, org.springframework.boot.test.autoconfigure.web.client.MockRestServiceServerResetTestExecutionListener@7f36662c, org.springframework.boot.test.autoconfigure.web.servlet.MockMvcPrintOnlyOnFailureTestExecutionListener@28e8dde3, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverTestExecutionListener@6d23017e]
1551 [background-preinit] INFO o.h.v.i.x.c.ValidationBootstrapParameters - HV000006: Using org.hibernate.validator.HibernateValidator as validation provider.
1677 [main] INFO b.s.i.c.TranslationControllerTest - Starting TranslationControllerTest on sgerard with PID 538 (started by sgerard in /home/sgerard/sandboxes/github-oauth/server)
1678 [main] INFO b.s.i.c.TranslationControllerTest - The following profiles are active: test
3250 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
3747 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 493ms. Found 9 Reactive MongoDB repository interfaces.
5143 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.config.annotation.method.configuration.ReactiveMethodSecurityConfiguration' of type [org.springframework.security.config.annotation.method.configuration.ReactiveMethodSecurityConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
5719 [main] INFO org.mongodb.driver.cluster - Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
5996 [cluster-ClusterId{value='5f42490f1c60f43aff9d7d46', description='null'}-localhost:27017] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:1, serverValue:4337}] to localhost:27017
6010 [cluster-ClusterId{value='5f42490f1c60f43aff9d7d46', description='null'}-localhost:27017] INFO org.mongodb.driver.cluster - Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 2, 8]}, minWireVersion=0, maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=12207332, setName='rs0', canonicalAddress=4802c4aff450:27017, hosts=[4802c4aff450:27017], passives=[], arbiters=[], primary='4802c4aff450:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000013, setVersion=1, lastWriteDate=Sun Aug 23 12:46:30 CEST 2020, lastUpdateTimeNanos=384505436362981}
6019 [main] INFO org.mongodb.driver.cluster - Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
6040 [cluster-ClusterId{value='5f42490f1c60f43aff9d7d47', description='null'}-localhost:27017] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:2, serverValue:4338}] to localhost:27017
6042 [cluster-ClusterId{value='5f42490f1c60f43aff9d7d47', description='null'}-localhost:27017] INFO org.mongodb.driver.cluster - Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 2, 8]}, minWireVersion=0, maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=1727974, setName='rs0', canonicalAddress=4802c4aff450:27017, hosts=[4802c4aff450:27017], passives=[], arbiters=[], primary='4802c4aff450:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000013, setVersion=1, lastWriteDate=Sun Aug 23 12:46:30 CEST 2020, lastUpdateTimeNanos=384505468960066}
7102 [nioEventLoopGroup-2-2] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:3, serverValue:4339}] to localhost:27017
11078 [main] INFO o.s.b.a.e.web.EndpointLinksResolver - Exposing 1 endpoint(s) beneath base path ''
11158 [main] INFO o.h.v.i.x.c.ValidationBootstrapParameters - HV000006: Using org.hibernate.validator.HibernateValidator as validation provider.
11720 [main] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:4, serverValue:4340}] to localhost:27017
12084 [main] INFO o.s.s.c.ThreadPoolTaskScheduler - Initializing ExecutorService 'taskScheduler'
12161 [main] INFO b.s.i.c.TranslationControllerTest - Started TranslationControllerTest in 11.157 seconds (JVM running for 13.532)
20381 [nioEventLoopGroup-2-3] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:5, serverValue:4341}] to localhost:27017
20408 [nioEventLoopGroup-2-2] INFO b.s.i.s.w.WorkspaceManagerImpl - Synchronize, there is no workspace for the branch [master], let's create it.
20416 [nioEventLoopGroup-2-3] INFO b.s.i.s.w.WorkspaceManagerImpl - The workspace [master] alias [e3cea374-0d37-4c57-bdbf-8bd14d279c12] has been created.
20421 [nioEventLoopGroup-2-3] INFO b.s.i.s.w.WorkspaceManagerImpl - Initializing workspace [master] alias [e3cea374-0d37-4c57-bdbf-8bd14d279c12].
20525 [nioEventLoopGroup-2-2] INFO b.s.i.s.i18n.TranslationManagerImpl - A bundle file has been found located in [server/src/main/resources/i18n] named [exception] with 2 file(s).
20812 [nioEventLoopGroup-2-4] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:6, serverValue:4342}] to localhost:27017
21167 [nioEventLoopGroup-2-8] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:10, serverValue:4345}] to localhost:27017
21167 [nioEventLoopGroup-2-6] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:8, serverValue:4344}] to localhost:27017
21393 [nioEventLoopGroup-2-5] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:7, serverValue:4343}] to localhost:27017
21398 [nioEventLoopGroup-2-7] INFO org.mongodb.driver.connection - Opened connection [connectionId{localValue:9, serverValue:4346}] to localhost:27017
21442 [nioEventLoopGroup-2-2] INFO b.s.i.s.i18n.TranslationManagerImpl - A bundle file has been found located in [server/src/main/resources/i18n] named [validation] with 2 file(s).
21503 [nioEventLoopGroup-2-2] INFO b.s.i.s.i18n.TranslationManagerImpl - A bundle file has been found located in [server/src/test/resources/be/sgerard/i18n/service/i18n/file] named [file] with 2 file(s).
21621 [nioEventLoopGroup-2-2] INFO b.s.i.s.i18n.TranslationManagerImpl - A bundle file has been found located in [front/src/main/web/src/assets/i18n] named [i18n] with 2 file(s).
22745 [SpringContextShutdownHook] INFO o.s.s.c.ThreadPoolTaskScheduler - Shutting down ExecutorService 'taskScheduler'
22763 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:4, serverValue:4340}] to localhost:27017 because the pool has been closed.
22766 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:9, serverValue:4346}] to localhost:27017 because the pool has been closed.
22767 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:6, serverValue:4342}] to localhost:27017 because the pool has been closed.
22768 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:8, serverValue:4344}] to localhost:27017 because the pool has been closed.
22768 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:5, serverValue:4341}] to localhost:27017 because the pool has been closed.
22769 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:10, serverValue:4345}] to localhost:27017 because the pool has been closed.
22770 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:7, serverValue:4343}] to localhost:27017 because the pool has been closed.
22776 [SpringContextShutdownHook] INFO org.mongodb.driver.connection - Closed connection [connectionId{localValue:3, serverValue:4339}] to localhost:27017 because the pool has been closed.
Process finished with exit code 0
Spring Reactive is asynchronous. Imagine you have 3 items in your dataset. It opens a connection to save the first item, but it won't wait for that save to finish before saving the second one. Instead, it opens a second connection as soon as possible. Thus you'll end up exhausting all the available connections in the pool.
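One concrete way to act on this, as a sketch rather than a definitive fix: Reactor's Flux.flatMap accepts a concurrency argument that caps how many inner publishers are subscribed at once, which in turn caps how many connections a stage can demand at the same time. The snippet below is self-contained and uses a stand-in for the repository save (the value 4 is an arbitrary example):

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class ConcurrencyCapDemo {

    public static void main(String[] args) {
        Flux.range(1, 200)
                // flatMap(mapper, concurrency): at most 4 inner Monos are
                // subscribed at a time, so this stage never demands more
                // than 4 connections concurrently.
                .flatMap(ConcurrencyCapDemo::simulateSave, 4)
                .blockLast();
    }

    // Stand-in for a reactive repository save such as bundleKeyTemporaryRepository::save.
    private static Mono<Integer> simulateSave(int item) {
        return Mono.just(item);
    }
}

The driver-side limit can also be tuned through the connection string, e.g. spring.data.mongodb.uri=mongodb://localhost/test?maxPoolSize=10 (the exact value is an assumption; pick it to match your workload).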

Apache Pig, Elephant Bird JSON Loader

I'm trying to parse the input below (there are 2 records in this input) using the Elephant Bird JSON loader.
[{"node_disk_lnum_1":36,"node_disk_xfers_in_rate_sum":136.40000000000001,"node_disk_bytes_in_rate_22":
187392.0, "node_disk_lnum_7": 13}]
[{"node_disk_lnum_1": 36, "node_disk_xfers_in_rate_sum":
105.2,"node_disk_bytes_in_rate_22": 123084.8, "node_disk_lnum_7":13}]
Here is my syntax:
register '/home/data/Desktop/elephant-bird-pig-4.1.jar';

a = LOAD '/pig/tc1.log' USING com.twitter.elephantbird.pig.load.JsonLoader('-nestedLoad') as (json:map[]);

b = FOREACH a GENERATE flatten(json#'node_disk_lnum_1') AS node_disk_lnum_1,
    flatten(json#'node_disk_xfers_in_rate_sum') AS node_disk_xfers_in_rate_sum,
    flatten(json#'node_disk_bytes_in_rate_22') AS node_disk_bytes_in_rate_22,
    flatten(json#'node_disk_lnum_7') AS node_disk_lnum_7;
DESCRIBE b;
Result of DESCRIBE b:
b: {node_disk_lnum_1: bytearray, node_disk_xfers_in_rate_sum: bytearray, node_disk_bytes_in_rate_22: bytearray, node_disk_lnum_7: bytearray}
c = FOREACH b GENERATE node_disk_lnum_1;
DESCRIBE c;
c: {node_disk_lnum_1: bytearray}
DUMP c;
Expected Result:
36, 136.40000000000001, 187392.0, 13
36, 105.2, 123084.8, 13
It throws the below error:
2017-02-06 01:05:49,337 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2017-02-06 01:05:49,386 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2017-02-06 01:05:49,387 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-02-06 01:05:49,390 [main] INFO org.apache.pig.newplan.logical.rules.ColumnPruneVisitor - Map key required for a: $0->[node_disk_lnum_1, node_disk_xfers_in_rate_sum, node_disk_bytes_in_rate_22, node_disk_lnum_7]
2017-02-06 01:05:49,395 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-02-06 01:05:49,398 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-02-06 01:05:49,398 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-02-06 01:05:49,425 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-02-06 01:05:49,426 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-02-06 01:05:49,428 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. com/twitter/elephantbird/util/HadoopCompat
Please help: what am I missing?
You do not have any nested data in your JSON, so remove '-nestedLoad':
a = LOAD '/pig/tc1.log' USING com.twitter.elephantbird.pig.load.JsonLoader() as (json:map[]);
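Separately, the ERROR 2998 text (com/twitter/elephantbird/util/HadoopCompat) is a missing-class error rather than a parse error; it usually means the Elephant Bird hadoop-compat jar was never registered. A hedged sketch, assuming the extra jars were downloaded next to elephant-bird-pig-4.1.jar (paths and versions are assumptions):

register '/home/data/Desktop/elephant-bird-hadoop-compat-4.1.jar';
register '/home/data/Desktop/elephant-bird-pig-4.1.jar';
register '/home/data/Desktop/json-simple-1.1.jar'; -- JsonLoader's JSON parser dependency (assumed local copy)

a = LOAD '/pig/tc1.log' USING com.twitter.elephantbird.pig.load.JsonLoader() as (json:map[]);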

JMeter Magento Error

I'm trying to run JMeter for performance testing on a Magento 2 website. So far, I've been able to integrate the benchmark.jmx file provided by Magento into JMeter. But when I try to run it, it starts and ends immediately. This is the error I get:
2016/09/01 09:43:43 WARN - jmeter.testbeans.BeanInfoSupport: Localized strings not available for bean class kg.apc.jmeter.config.redis.RedisDataSet java.util.MissingResourceException: Can't find bundle for base name kg.apc.jmeter.config.redis.RedisDataSetResources, locale en_US
at java.util.ResourceBundle.throwMissingResourceException(ResourceBundle.java:1499)
at java.util.ResourceBundle.getBundleImpl(ResourceBundle.java:1322)
at java.util.ResourceBundle.getBundle(ResourceBundle.java:795)
at org.apache.jmeter.testbeans.BeanInfoSupport.<init>(BeanInfoSupport.java:126)
at kg.apc.jmeter.config.redis.RedisDataSetBeanInfo.<init>(RedisDataSetBeanInfo.java:69)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at com.sun.beans.finder.InstanceFinder.instantiate(InstanceFinder.java:96)
at com.sun.beans.finder.InstanceFinder.find(InstanceFinder.java:66)
at java.beans.Introspector.findExplicitBeanInfo(Introspector.java:438)
at java.beans.Introspector.<init>(Introspector.java:388)
at java.beans.Introspector.getBeanInfo(Introspector.java:163)
at org.apache.jmeter.testbeans.gui.TestBeanGUI.<init>(TestBeanGUI.java:168)
at org.apache.jmeter.gui.util.MenuFactory.initializeMenus(MenuFactory.java:488)
at org.apache.jmeter.gui.util.MenuFactory.<clinit>(MenuFactory.java:160)
at org.apache.jmeter.control.gui.TestPlanGui.createPopupMenu(TestPlanGui.java:93)
at org.apache.jmeter.gui.tree.JMeterTreeNode.createPopupMenu(JMeterTreeNode.java:156)
at org.apache.jmeter.gui.action.EditCommand.doAction(EditCommand.java:47)
at org.apache.jmeter.gui.action.ActionRouter.performAction(ActionRouter.java:80)
at org.apache.jmeter.gui.action.ActionRouter.access$000(ActionRouter.java:40)
at org.apache.jmeter.gui.action.ActionRouter$1.run(ActionRouter.java:62)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:312)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:745)
at java.awt.EventQueue.access$300(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:706)
at java.awt.EventQueue$3.run(EventQueue.java:704)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:77)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:715)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
2016/09/01 09:43:44 INFO - jmeter.util.BSFTestElement: Registering JMeter version of JavaScript engine as work-round for BSF-22
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for text/html is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for application/xhtml+xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for application/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for text/xml is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for text/vnd.wap.wml is org.apache.jmeter.protocol.http.parser.RegexpHTMLParser
2016/09/01 09:43:45 INFO - jmeter.protocol.http.sampler.HTTPSamplerBase: Parser for text/css is org.apache.jmeter.protocol.http.parser.CssParser
2016/09/01 09:43:45 INFO - jorphan.exec.KeyToolUtils: keytool found at 'keytool'
2016/09/01 09:43:45 INFO - jmeter.protocol.http.proxy.ProxyControl: HTTP(S) Test Script Recorder SSL Proxy will use keys that support embedded 3rd party resources in file /home/yassar/Downloads/jmeter/apache-jmeter-3.0/bin/proxyserver.jks
2016/09/01 09:43:45 INFO - jmeter.gui.util.MenuFactory: Skipping org.apache.jmeter.protocol.mongodb.config.MongoSourceElement
2016/09/01 09:43:45 INFO - jmeter.gui.util.MenuFactory: Skipping org.apache.jmeter.protocol.mongodb.sampler.MongoScriptSampler
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_qos]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_at_most_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_at_least_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_exactly_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_client_types]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_blocking_client]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_async_client]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_message_input_type]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_message_input_type_text]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_message_input_type_file]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_qos]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_at_most_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_at_least_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_exactly_once]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_client_types]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_blocking_client]
2016/09/01 09:43:45 WARN - jmeter.util.JMeterUtils: ERROR! Resource string not found: [mqtt_async_client]
2016/09/01 09:43:46 INFO - jmeter.gui.util.MenuFactory: Skipping org.apache.jmeter.visualizers.DistributionGraphVisualizer
2016/09/01 09:43:46 INFO - jmeter.samplers.SampleResult: Note: Sample TimeStamps are START times
2016/09/01 09:43:46 INFO - jmeter.samplers.SampleResult: sampleresult.default.encoding is set to ISO-8859-1
2016/09/01 09:43:46 INFO - jmeter.samplers.SampleResult: sampleresult.useNanoTime=true
2016/09/01 09:43:46 INFO - jmeter.samplers.SampleResult: sampleresult.nanoThreadSleep=5000
2016/09/01 09:43:46 INFO - jmeter.gui.util.MenuFactory: Skipping org.apache.jmeter.visualizers.SplineVisualizer
2016/09/01 10:25:41 INFO - jmeter.engine.StandardJMeterEngine: Running the test!
2016/09/01 10:25:41 INFO - jmeter.samplers.SampleEvent: List of sample_variables: []
2016/09/01 10:25:41 INFO - jmeter.samplers.SampleEvent: List of sample_variables: []
2016/09/01 10:25:41 INFO - jmeter.gui.util.JMeterMenuBar: setRunning(true,*local*)
2016/09/01 10:25:42 INFO - jmeter.engine.StandardJMeterEngine: No enabled thread groups found
2016/09/01 10:25:42 INFO - jmeter.engine.StandardJMeterEngine: Notifying test listeners of end of test
2016/09/01 10:25:42 INFO - jmeter.services.FileServer: Default base='/home/yassar/Downloads/jmeter/apache-jmeter-3.0/bin'
2016/09/01 10:25:42 INFO - jmeter.gui.util.JMeterMenuBar: setRunning(false,*local*)
2016/09/01 10:25:50 INFO - jmeter.gui.action.Load: Loading file: /home/yassar/Downloads/benchmark.jmx
2016/09/01 10:25:50 INFO - jmeter.services.FileServer: Set new base='/home/yassar/Downloads'
2016/09/01 10:25:50 INFO - jmeter.save.SaveService: Testplan (JMX) version: 2.2. Testlog (JTL) version: 2.2
2016/09/01 10:25:50 INFO - jmeter.save.SaveService: Using SaveService properties file encoding UTF-8
2016/09/01 10:25:50 INFO - jmeter.save.SaveService: Using SaveService properties version 2.9
2016/09/01 10:25:50 INFO - jmeter.save.SaveService: All converter versions present and correct
2016/09/01 10:25:50 INFO - jmeter.save.SaveService: Loading file: /home/yassar/Downloads/benchmark.jmx
2016/09/01 10:25:50 WARN - jmeter.gui.action.Load: Unexpected error java.lang.IllegalArgumentException: Problem loading XML from:'/home/yassar/Downloads/benchmark.jmx', cannot determine class for element: com.thoughtworks.xstream.mapper.CannotResolveClassException: is-copy-enabled is-u2f-enabled
at org.apache.jmeter.save.SaveService.readTree(SaveService.java:533)
at org.apache.jmeter.save.SaveService.loadTree(SaveService.java:503)
at org.apache.jmeter.gui.action.Load.loadProjectFile(Load.java:130)
at org.apache.jmeter.gui.action.Load.loadProjectFile(Load.java:102)
at org.apache.jmeter.gui.action.Load.doAction(Load.java:89)
at org.apache.jmeter.gui.action.ActionRouter.performAction(ActionRouter.java:80)
at org.apache.jmeter.gui.action.ActionRouter.access$000(ActionRouter.java:40)
at org.apache.jmeter.gui.action.ActionRouter$1.run(ActionRouter.java:62)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:312)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:745)
at java.awt.EventQueue.access$300(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:706)
at java.awt.EventQueue$3.run(EventQueue.java:704)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:77)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:715)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: com.thoughtworks.xstream.mapper.CannotResolveClassException: is-copy-enabled is-u2f-enabled
at com.thoughtworks.xstream.mapper.DefaultMapper.realClass(DefaultMapper.java:79)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.DynamicProxyMapper.realClass(DynamicProxyMapper.java:55)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.PackageAliasingMapper.realClass(PackageAliasingMapper.java:88)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.ClassAliasingMapper.realClass(ClassAliasingMapper.java:79)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.ArrayMapper.realClass(ArrayMapper.java:74)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.SecurityMapper.realClass(SecurityMapper.java:71)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at org.apache.jmeter.save.SaveService$XStreamWrapper$1.realClass(SaveService.java:98)
at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:30)
at com.thoughtworks.xstream.mapper.CachingMapper.realClass(CachingMapper.java:47)
at com.thoughtworks.xstream.core.util.HierarchicalStreams.readClassType(HierarchicalStreams.java:31)
at com.thoughtworks.xstream.core.TreeUnmarshaller.start(TreeUnmarshaller.java:133)
at com.thoughtworks.xstream.core.AbstractTreeMarshallingStrategy.unmarshal(AbstractTreeMarshallingStrategy.java:32)
at com.thoughtworks.xstream.XStream.unmarshal(XStream.java:1206)
at com.thoughtworks.xstream.XStream.unmarshal(XStream.java:1190)
at com.thoughtworks.xstream.XStream.fromXML(XStream.java:1061)
at org.apache.jmeter.save.SaveService.readTree(SaveService.java:524)
... 21 more
2016/09/01 10:26:12 INFO - jmeter.gui.action.Load: Loading file: /home/yassar/Projects/m205/setup/performance-toolkit/benchmark.jmx
2016/09/01 10:26:12 INFO - jmeter.services.FileServer: Set new base='/home/yassar/Projects/m205/setup/performance-toolkit'
2016/09/01 10:26:12 INFO - jmeter.save.SaveService: Loading file: /home/yassar/Projects/m205/setup/performance-toolkit/benchmark.jmx
2016/09/01 10:26:12 INFO - jmeter.protocol.http.control.CookieManager: Settings: Delete null: true Check: true Allow variable: true Save: false Prefix: COOKIE_
2016/09/01 10:26:13 INFO - jmeter.services.FileServer: Set new base='/home/yassar/Projects/m205/setup/performance-toolkit'
2016/09/01 10:27:08 INFO - jmeter.services.FileServer: Set new base='/home/yassar/Projects/m205/setup/performance-toolkit'
2016/09/01 10:27:11 INFO - jmeter.engine.StandardJMeterEngine: Running the test!
2016/09/01 10:27:11 INFO - jmeter.samplers.SampleEvent: List of sample_variables: []
2016/09/01 10:27:11 INFO - jmeter.gui.util.JMeterMenuBar: setRunning(true,*local*)
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Starting setUp thread groups
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Starting setUp ThreadGroup: 1 : setUp Thread Group
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Starting 1 threads for group setUp Thread Group.
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Test will stop on error
2016/09/01 10:27:12 INFO - jmeter.threads.ThreadGroup: Starting thread group number 1 threads 1 ramp-up 1 perThread 1000.0 delayedStart=false
2016/09/01 10:27:12 INFO - jmeter.threads.ThreadGroup: Started thread group number 1
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Waiting for all setup thread groups to exit
2016/09/01 10:27:12 INFO - jmeter.threads.JMeterThread: Thread started: setUp Thread Group 1-1
2016/09/01 10:27:12 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: eval Sourced file: inline evaluation of: ``Boolean stopTestOnError (String error) { log.error(error); System.out.pr . . . '' : Method Invocation path.substring
2016/09/01 10:27:12 WARN - jmeter.protocol.java.sampler.BeanShellSampler: org.apache.jorphan.util.JMeterException: Error invoking bsh method: eval Sourced file: inline evaluation of: ``Boolean stopTestOnError (String error) { log.error(error); System.out.pr . . . '' : Method Invocation path.substring
2016/09/01 10:27:12 INFO - jmeter.threads.JMeterThread: Stop Test detected by thread: setUp Thread Group 1-1
2016/09/01 10:27:12 INFO - jmeter.threads.JMeterThread: Thread finished: setUp Thread Group 1-1
2016/09/01 10:27:12 INFO - jmeter.threads.JMeterThread: Stopping: setUp Thread Group 1-1
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: All Setup Threads have ended
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: No enabled thread groups found
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Starting tearDown thread groups
2016/09/01 10:27:12 INFO - jmeter.engine.StandardJMeterEngine: Notifying test listeners of end of test
2016/09/01 10:27:12 INFO - jmeter.gui.util.JMeterMenuBar: setRunning(false,*local*)
It seems you are missing some plug-ins that are used along with this JMX. You need to copy these plug-ins into the JMeter /lib/ext folder and it should work.
Check which plug-ins are being used by the benchmark.jmx.
Success, finally!
I have been able to launch it. Actually, the issue was mostly with URLs. I don't know why, but the 'host' and 'admin_path' variables work in funny ways with Magento. But I found a workaround by manually going through the HTTP requests and adding the required variables. Now it is running.
I think you're missing the Redis Data Set plugin also.
Please take a look at: https://jmeter-plugins.org/wiki/RedisDataSet/
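For completeness, a sketch of the install step (artifact name and version are assumptions; the page above lists the current download):

# Unpack the Redis Data Set plugin bundle into the JMeter install directory so
# its jars land under lib/ and lib/ext/, then restart JMeter.
cd /home/yassar/Downloads/jmeter/apache-jmeter-3.0
wget https://jmeter-plugins.org/files/packages/jpgc-redis-0.2.zip   # assumed artifact
unzip -o jpgc-redis-0.2.zip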

Table not found exception (E0729) while executing Hive query from Oozie workflow

Script_SusRes.q
select * from ufo_session_details limit 5
Workflow_SusRes.xml
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="hive-wf">
    <start to="hive-node"/>
    <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>default</value>
                </property>
            </configuration>
            <script>Script_SusRes.q</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
SusRes.properties
oozieClientUrl=http://zltv5636.vci.att.com:11000/oozie
nameNode=hdfs://zltv5635.vci.att.com:8020
jobTracker=zltv5636.vci.att.com:50300
queueName=default
userName=wfe
oozie.use.system.libpath=true
oozie.libpath = ${nameNode}/tmp/nt283s
oozie.wf.application.path=/tmp/nt283s/workflow_SusRes.xml
Error Log
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher ends
stderr logs
Logging initialized using configuration in file:/opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/hive-log4j.properties
FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'ufo_session_details'
Intercepting System.exit(10001)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001]
syslog logs
2015-11-03 00:26:20,599 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
2015-11-03 00:26:20,902 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/8045442539840332845_326451332_1282624021/zltv5635.vci.att.com/tmp/nt283s/Script_SusRes.q <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/Script_SusRes.q
2015-11-03 00:26:20,911 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/3435440518513182209_187825668_1219418250/zltv5635.vci.att.com/tmp/nt283s/Script_SusRes.sql <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/Script_SusRes.sql
2015-11-03 00:26:20,913 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/-5883507949569818012_2054276612_1203833745/zltv5635.vci.att.com/tmp/nt283s/lib <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/lib
2015-11-03 00:26:20,916 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/6682880817470643170_1186359172_1225814386/zltv5635.vci.att.com/tmp/nt283s/workflow_SusRes.xml <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/workflow_SusRes.xml
2015-11-03 00:26:21,441 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2015-11-03 00:26:21,448 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@698cdde3
2015-11-03 00:26:21,602 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://zltv5635.vci.att.com:8020/user/wfe/oozie-oozi/0000088-151013062722898-oozie-oozi-W/hive-node--hive/input/dummy.txt:0+5
2015-11-03 00:26:21,630 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
2015-11-03 00:26:21,635 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
2015-11-03 00:26:21,652 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library is available
2015-11-03 00:26:21,652 INFO org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library loaded
2015-11-03 00:26:21,663 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 0
2015-11-03 00:26:22,654 INFO SessionState:
Logging initialized using configuration in file:/opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/hive-log4j.properties
2015-11-03 00:26:22,910 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=Driver.run>
2015-11-03 00:26:22,911 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=TimeToSubmit>
2015-11-03 00:26:22,912 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=compile>
2015-11-03 00:26:22,998 INFO hive.ql.parse.ParseDriver: Parsing command: select * from ufo_session_details limit 5
2015-11-03 00:26:23,618 INFO hive.ql.parse.ParseDriver: Parse Completed
2015-11-03 00:26:23,799 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Starting Semantic Analysis
2015-11-03 00:26:23,802 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Completed phase 1 of Semantic Analysis
2015-11-03 00:26:23,802 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Get metadata for source tables
2015-11-03 00:26:23,990 INFO org.apache.hadoop.hive.metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-11-03 00:26:24,031 INFO org.apache.hadoop.hive.metastore.ObjectStore: ObjectStore, initialize called
2015-11-03 00:26:24,328 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2015-11-03 00:26:28,112 INFO org.apache.hadoop.hive.metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-11-03 00:26:28,169 INFO org.apache.hadoop.hive.metastore.ObjectStore: Initialized ObjectStore
2015-11-03 00:26:30,767 INFO org.apache.hadoop.hive.metastore.HiveMetaStore: 0: get_table : db=default tbl=ufo_session_details
2015-11-03 00:26:30,768 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: ugi=wfe ip=unknown-ip-addr cmd=get_table : db=default tbl=ufo_session_details
2015-11-03 00:26:30,781 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2015-11-03 00:26:30,782 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2015-11-03 00:26:33,319 ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler: NoSuchObjectException(message:default.ufo_session_details table not found)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1380)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
at com.sun.proxy.$Proxy11.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:836)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
at com.sun.proxy.$Proxy12.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:945)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:887)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1083)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1059)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8680)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:278)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:433)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:261)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:238)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:491)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:365)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
1677 [main] INFO org.apache.hadoop.hive.ql.Driver - <PERFLOG method=Driver.run>
1679 [main] INFO org.apache.hadoop.hive.ql.Driver - <PERFLOG method=TimeToSubmit>
1680 [main] INFO org.apache.hadoop.hive.ql.Driver - <PERFLOG method=compile>
1771 [main] INFO hive.ql.parse.ParseDriver - Parsing command: select * from ufo_session_master limit 5
2512 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
2683 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
2686 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
2686 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
2831 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
2952 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
2952 [main] INFO hive.metastore - Waiting 1 seconds before next connection attempt.
3952 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
3959 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
3960 [main] INFO hive.metastore - Waiting 1 seconds before next connection attempt.
4960 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
4967 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
4967 [main] INFO hive.metastore - Waiting 1 seconds before next connection attempt.
5978 [main] ERROR org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table ufo_session_master
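No accepted answer appears in this dump, but the tail of the log points at the cause: the launcher keeps retrying "Failed to connect to the MetaStore Server..." before the semantic analysis fails. When the Hive action is not handed a hive-site.xml, Hive falls back to a default (empty, local) metastore, where the table does not exist. A hedged sketch of the usual fix, assuming a valid hive-site.xml (with hive.metastore.uris pointing at thrift://zltv5636.vci.att.com:9083) is uploaded to the workflow directory on HDFS:

<hive xmlns="uri:oozie:hive-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <!-- Assumption: hive-site.xml sits next to the workflow on HDFS -->
    <job-xml>hive-site.xml</job-xml>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>default</value>
        </property>
    </configuration>
    <script>Script_SusRes.q</script>
</hive>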

Pig "Max" command for pig-0.12.1 and pig-0.13.0 with Hadoop-2.4.0

I have a Pig script I got from Hortonworks that works fine with pig-0.9.2.15 on Hadoop-1.0.3.16, but when I run it with pig-0.12.1 (recompiled with -Dhadoopversion=23) or pig-0.13.0 on Hadoop-2.4.0, it won't work.
It seems the following line is where the problem is.
max_runs = FOREACH grp_data GENERATE group as grp, MAX(runs.runs) as max_runs;
Here's the whole script.
batting = load 'pig_data/Batting.csv' using PigStorage(',');
runs = FOREACH batting GENERATE $0 as playerID, $1 as year, $8 as runs;
grp_data = GROUP runs by (year);
max_runs = FOREACH grp_data GENERATE group as grp, MAX(runs.runs) as max_runs;
join_max_run = JOIN max_runs by ($0, max_runs), runs by (year,runs);
join_data = FOREACH join_max_run GENERATE $0 as year, $2 as playerID, $1 as runs;
STORE join_data INTO './join_data';
And here's the hadoop error info:
2014-07-29 18:03:02,957 [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: grp_data: Local Rearrange[tuple]{bytearray}(false) - scope-34 Operator Key: scope-34): org.apache.pig.backend.executionengine.ExecException: ERROR 2106: Error executing an algebraic function
2014-07-29 18:03:02,958 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
How can I fix this if I still want to use the "MAX" function? Thank you!
Here's the complete information:
14/07/29 17:50:11 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
14/07/29 17:50:11 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
14/07/29 17:50:11 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
2014-07-29 17:50:12,104 [main] INFO org.apache.pig.Main - Apache Pig version 0.13.0 (r1606446) compiled Jun 29 2014, 02:27:58
2014-07-29 17:50:12,104 [main] INFO org.apache.pig.Main - Logging error messages to: /root/hadooptestingsuite/scripts/tests/pig_test/hadoop2/pig_1406677812103.log
2014-07-29 17:50:13,050 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /root/.pigbootup not found
2014-07-29 17:50:13,415 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2014-07-29 17:50:13,415 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:13,415 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://namenode.cmda.hadoop.com:8020
2014-07-29 17:50:14,302 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to map-reduce job tracker at: namenode.cmda.hadoop.com:8021
2014-07-29 17:50:14,990 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:15,570 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:15,665 [main] WARN org.apache.pig.newplan.BaseOperatorPlan - Encountered Warning IMPLICIT_CAST_TO_DOUBLE 1 time(s).
2014-07-29 17:50:15,705 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.textoutputformat.separator is deprecated. Instead, use mapreduce.output.textoutputformat.separator
2014-07-29 17:50:15,791 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: HASH_JOIN,GROUP_BY
2014-07-29 17:50:15,873 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier]}
2014-07-29 17:50:16,319 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2014-07-29 17:50:16,377 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.CombinerOptimizer - Choosing to move algebraic foreach to combiner
2014-07-29 17:50:16,410 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler$LastInputStreamingOptimizer - Rewrite: POPackage->POForEach to POPackage(JoinPackager)
2014-07-29 17:50:16,417 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 3
2014-07-29 17:50:16,418 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - Merged 1 map-reduce splittees.
2014-07-29 17:50:16,418 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - Merged 1 out of total 3 MR operators.
2014-07-29 17:50:16,418 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 2
2014-07-29 17:50:16,493 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:16,575 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at namenode.cmda.hadoop.com/10.0.3.1:8050
2014-07-29 17:50:16,973 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2014-07-29 17:50:17,007 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2014-07-29 17:50:17,007 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2014-07-29 17:50:17,007 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2014-07-29 17:50:17,020 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Reduce phase detected, estimating # of required reducers.
2014-07-29 17:50:17,020 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Using reducer estimator: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator
2014-07-29 17:50:17,064 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator - BytesPerReducer=1000000000 maxReducers=999 totalInputFileSize=6398990
2014-07-29 17:50:17,067 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting Parallelism to 1
2014-07-29 17:50:17,067 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2014-07-29 17:50:17,068 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - This job cannot be converted run in-process
2014-07-29 17:50:17,068 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - creating jar file Job2337803902169382273.jar
2014-07-29 17:50:20,957 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - jar file Job2337803902169382273.jar created
2014-07-29 17:50:20,957 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2014-07-29 17:50:21,001 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up multi store job
2014-07-29 17:50:21,036 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2014-07-29 17:50:21,036 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2014-07-29 17:50:21,046 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Setting key [pig.schematuple.classes] with classes to deserialize []
2014-07-29 17:50:21,310 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-07-29 17:50:21,311 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2014-07-29 17:50:21,332 [JobControl] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at namenode.cmda.hadoop.com/10.0.3.1:8050
2014-07-29 17:50:21,366 [JobControl] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:22,606 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2014-07-29 17:50:22,606 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2014-07-29 17:50:22,629 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2014-07-29 17:50:22,729 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2014-07-29 17:50:22,745 [JobControl] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-07-29 17:50:23,026 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_1406677482986_0003
2014-07-29 17:50:23,258 [JobControl] INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1406677482986_0003
2014-07-29 17:50:23,340 [JobControl] INFO org.apache.hadoop.mapreduce.Job - The url to track the job: http://namenode.cmda.hadoop.com:8088/proxy/application_1406677482986_0003/
2014-07-29 17:50:23,340 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_1406677482986_0003
2014-07-29 17:50:23,340 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases batting,grp_data,max_runs,runs
2014-07-29 17:50:23,340 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: batting[3,10],runs[5,7],max_runs[7,11],grp_data[6,11] C: max_runs[7,11],grp_data[6,11] R: max_runs[7,11]
2014-07-29 17:50:23,340 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://namenode.cmda.hadoop.com:50030/jobdetails.jsp?jobid=job_1406677482986_0003
2014-07-29 17:50:23,357 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-07-29 17:50:23,357 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Running jobs are [job_1406677482986_0003] 2014-07-29 17:51:15,564 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- 50% complete 2014-07-29 17:51:15,564 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Running jobs are [job_1406677482986_0003] 2014-07-29 17:51:18,582 [main] WARN
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure. 2014-07-29 17:51:18,582 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- job job_1406677482986_0003 has failed! Stop running all dependent jobs 2014-07-29 17:51:18,582 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- 100% complete 2014-07-29 17:51:18,825 [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR 0:
org.apache.pig.backend.executionengine.ExecException: ERROR 0:
Exception while executing (Name: grp_data: Local
Rearrange[tuple]{bytearray}(false) - scope-73 Operator Key: scope-73):
org.apache.pig.backend.executionengine.ExecException: ERROR 2106:
Error executing an algebraic function 2014-07-29 17:51:18,825 [main]
ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map
reduce job(s) failed! 2014-07-29 17:51:18,826 [main] INFO
org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script
Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.4.0 0.13.0 root 2014-07-29 17:50:16 2014-07-29 17:51:18 HASH_JOIN,GROUP_BY
Failed!
Failed Jobs: JobId Alias Feature Message Outputs
job_1406677482986_0003 batting,grp_data,max_runs,runs MULTI_QUERY,COMBINER Message:
Job failed!
Input(s): Failed to read data from
"hdfs://namenode.cmda.hadoop.com:8020/user/root/pig_data/Batting.csv"
Output(s):
Counters: Total records written : 0 Total bytes written : 0 Spillable
Memory Manager spill count : 0 Total bags proactively spilled: 0 Total
records proactively spilled: 0
Job DAG: job_1406677482986_0003 -> null, null
2014-07-29 17:51:18,826 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Failed! 2014-07-29 17:51:18,827 [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 2106: Error executing
an algebraic function Details at logfile:
/root/hadooptestingsuite/scripts/tests/pig_test/hadoop2/pig_1406677812103.log
2014-07-29 17:51:18,828 [main] ERROR
org.apache.pig.tools.grunt.GruntParser - ERROR 2244: Job scope-58
failed, hadoop does not return any error message Details at logfile:
/root/hadooptestingsuite/scripts/tests/pig_test/hadoop2/pig_1406677812103.log
Try casting the result of the MAX function:

max_runs = FOREACH grp_data GENERATE group as grp, (int)MAX(runs.runs) as max_runs;

ERROR 2106 usually points to an algebraic function (here MAX) being fed untyped bytearray data, so the explicit cast may help. Hope it will work.
You should declare data types for the fields you use, either in your LOAD statement or when projecting them:

runs = FOREACH batting GENERATE $0 as playerID:chararray, $1 as year:int, $8 as runs:int;

If this doesn't help for some reason, try casting the result explicitly (note that the cast has to go on the result of MAX; casting the bag runs.runs itself is not valid):

max_runs = FOREACH grp_data GENERATE group as grp, (int)MAX(runs.runs) as max_runs;
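If you're not sure which fields are untyped, DESCRIBE (a standard Pig diagnostic operator) prints the schema Pig currently has for an alias; the aliases below are the ones from the failing script:

DESCRIBE batting;  -- with no AS clause in the LOAD, every field is reported as bytearray
DESCRIBE runs;     -- after the typed FOREACH above, year and runs should report as int

Any field that still reports as bytearray is a candidate for the casting fix above.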
Thanks to both @BigData and @Mikko Kupsu for the hint. The issue does indeed have to do with datatype casting.

After specifying the data type of each column as follows, everything runs great:

batting =
    LOAD '/user/root/pig_data/Batting.csv' USING PigStorage(',')
    AS (playerID: CHARARRAY, yearID: INT, stint: INT, teamID: CHARARRAY, lgID: CHARARRAY,
    G: INT, G_batting: INT, AB: INT, R: INT, H: INT, two_B: INT, three_B: INT, HR: INT, RBI: INT,
    SB: INT, CS: INT, BB: INT, SO: INT, IBB: INT, HBP: INT, SH: INT, SF: INT, GIDP: INT, G_old: INT);
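For reference, here is a minimal end-to-end sketch of the corrected script, reconstructed from the aliases the log reports (batting, runs, grp_data, max_runs); the grouping key and the final DUMP are assumptions, since the original script is not shown:

batting = LOAD '/user/root/pig_data/Batting.csv' USING PigStorage(',')
    AS (playerID: CHARARRAY, yearID: INT, stint: INT, teamID: CHARARRAY, lgID: CHARARRAY,
    G: INT, G_batting: INT, AB: INT, R: INT, H: INT, two_B: INT, three_B: INT, HR: INT, RBI: INT,
    SB: INT, CS: INT, BB: INT, SO: INT, IBB: INT, HBP: INT, SH: INT, SF: INT, GIDP: INT, G_old: INT);
runs = FOREACH batting GENERATE playerID, yearID AS year, R AS runs;  -- fields already typed by the LOAD schema
grp_data = GROUP runs BY year;  -- assumption: one group per season
max_runs = FOREACH grp_data GENERATE group AS grp, MAX(runs.runs) AS max_runs;  -- no cast needed, runs is an int
DUMP max_runs;

With every column typed at load time, MAX receives real ints instead of bytearrays, so the combiner-side algebraic evaluation that failed with ERROR 2106 succeeds.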