JMeter - Resource string not found error

I am using JMeter 3.2 r1790748 and I am getting the messages below in the log.
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_equation_healthy]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_equation_active]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_equation_warning]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_equation_dead]
2017-08-25 11:41:52,208 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_equation_load]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_tab_title]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_legend_health]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_legend_load]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_legend_memory_per]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_legend_thread_per]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_left_top]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_left_middle]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_left_bottom]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_right_healthy]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_right_dead]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_performance_title]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_performance_servers]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_performance_tab_title]
2017-08-25 11:41:52,224 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_label_prefix]
2017-08-25 11:41:52,239 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,239 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,427 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,427 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,427 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,427 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,427 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,443 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
2017-08-25 11:41:52,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [monitor_health_title]
Does anyone know what these errors are about and how to resolve them?

You appear to have a component of a 3rd-party plugin that relies on resources of the Monitor Results listener, which was removed in JMeter 3.2.
See:
https://bz.apache.org/bugzilla/show_bug.cgi?id=60423
and the JMeter 3.2 release notes, section "Removed elements or functions":
http://jmeter.apache.org/changes.html#Incompatible%20changes
Bug 60423 - Drop Monitor Results listener
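To find which plugin jar is pulling in the removed Monitor classes, you can scan JMeter's `lib/ext` directory (the usual location for 3rd-party plugins) for jar entries whose names mention "Monitor". A minimal sketch, assuming the path `lib/ext` relative to the JMeter install directory; the class name and output format are my own:

```java
import java.io.File;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class FindMonitorPlugins {
    public static void main(String[] args) throws Exception {
        // JMETER_HOME/lib/ext is where third-party plugin jars live; the
        // relative path here is an assumption - pass the real one as args[0].
        File ext = new File(args.length > 0 ? args[0] : "lib/ext");
        File[] jars = ext.listFiles((dir, name) -> name.endsWith(".jar"));
        int hits = 0;
        if (jars != null) {
            for (File jar : jars) {
                try (JarFile jf = new JarFile(jar)) {
                    for (Enumeration<JarEntry> e = jf.entries(); e.hasMoreElements(); ) {
                        // The removed Monitor Results listener's classes contain
                        // "Monitor" in their entry names.
                        if (e.nextElement().getName().contains("Monitor")) {
                            System.out.println(jar.getName());
                            hits++;
                            break; // one match per jar is enough
                        }
                    }
                }
            }
        }
        System.out.println("Jars referencing Monitor classes: " + hits);
    }
}
```

Removing (or upgrading) any jar this reports should make the warnings go away.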

Related

Sqoop import into Hive - ERROR ("javax.management.MBeanTrustPermission" "register")

I'm facing this error when running a Sqoop import command into Hive and HDFS. The HDFS job runs without problems, but I can't make the same import into Hive due to these Java errors.
log:
[sga-dl#localhost ~]$ sqoop import --connect jdbc:mysql://localhost:3306/v3_abrasivos --username hiveusr --password DATPASS --table History_Demand --m 1 --target-dir /raw_imports/History_Demand --hive-import --create-hive-table --hive-table sqoop_v3_abrasivos.History_Demand
...
20/02/04 09:50:01 INFO mapreduce.ImportJobBase: Transferred 794.1428 MB in 58.9304 seconds (13.476
20/02/04 09:50:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `History_Demand` AS t LIMIT 1
20/02/04 09:50:01 WARN hive.TableDefWriter: Column Dia had to be cast to a less precise type in Hive
20/02/04 09:50:01 INFO hive.HiveImport: Loading uploaded data into Hive
20/02/04 09:50:01 INFO conf.HiveConf: Found configuration file file:/opt/hive/conf/hive-site.xml
2020-02-04 09:50:02,957 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
at org.apache.logging.log4j.core.jmx.Server.register(Server.java:380)
at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:165)
at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:138)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:507)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:249)
at org.apache.logging.log4j.core.async.AsyncLoggerContext.start(AsyncLoggerContext.java:86)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:239)
at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:157)
at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:130)
at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:100)
at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:187)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:154)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:90)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:82)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:65)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:702)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Logging initialized using configuration in jar:file:/usr/local/apache-hive-2.3.6-bin/lib/hive-common-2.3.6.jar!/hive-log4j2.properties Async: true
20/02/04 09:50:03 INFO SessionState:
Logging initialized using configuration in jar:file:/usr/local/apache-hive-2.3.6-bin/lib/hive-common-2.3.6.jar!/hive-log4j2.properties Async: true
20/02/04 09:50:03 INFO session.SessionState: Created HDFS directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Created local directory: /tmp/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Created HDFS directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6/_tmp_space.db
20/02/04 09:50:03 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Updating thread name to 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 main
20/02/04 09:50:03 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO ql.Driver: Compiling command(queryId=sga-dl_20200204095003_9b2a8fbf-d798-4146-b57c-ffcb5920891b): CREATE TABLE `sqoop_v3_abrasivos.History_Demand` ( `idHistory_Demand` BIGINT, `Entidade_SalesOrg` STRING, `Cod_Material` BIGINT, `Material` STRING, `Plan_Group` STRING, `Linha_Produto` STRING, `MTS_MTO` STRING, `Cod_Tipo_OV` STRING, `Tipo_OV` STRING, `OV_Numero` BIGINT, `OV_Linha` INT, `Cod_Cliente` STRING, `Cliente` STRING, `Cod_Cliente_Corp` STRING, `Cliente_Corp` STRING, `Pais_Cliente` STRING, `UF_Cliente` STRING, `Estado_Cliente` STRING, `Cidade_Cliente` STRING, `Bairro_Cliente` STRING, `Sigla_Moeda` STRING, `OV_Moeda` STRING, `Cod_Sell_Plant` INT, `Sell_Plant` STRING, `Pais_SalesOrg` STRING, `Dia` STRING, `Periodo` STRING, `UN_Estoque` STRING, `Qtd_UN_Estoque` STRING, `Vendas_LOC` STRING) COMMENT 'Imported by sqoop on 2020/02/04 09:50:01' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
20/02/04 09:50:05 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
20/02/04 09:50:05 INFO metastore.ObjectStore: ObjectStore, initialize called
20/02/04 09:50:06 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
20/02/04 09:50:06 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
20/02/04 09:50:06 ERROR bonecp.BoneCP: Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:521)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:550)
at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:405)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:303)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:628)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:594)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:588)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:431)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6902)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:236)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:215)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:362)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:267)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:318)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:484)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
20/02/04 09:50:06 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
20/02/04 09:50:08 ERROR bonecp.BoneCP: Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionProviderPriorityList.getConnection(ConnectionProviderPriorityList.java:57)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:402)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:361)
at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:316)
at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:84)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:347)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:310)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:591)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
at org.datanucleus.store.query.Query.execute(Query.java:1726)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:181)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:144)
at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:410)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:303)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:628)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:594)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:588)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:431)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6902)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:236)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:215)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:362)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:267)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:318)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:484)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
20/02/04 09:50:09 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
20/02/04 09:50:09 INFO metastore.ObjectStore: Initialized ObjectStore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: Added admin role in metastore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: Added public role in metastore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: get_all_functions
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl ip=unknown-ip-addr cmd=get_all_functions
20/02/04 09:50:09 INFO parse.CalcitePlanner: Starting Semantic Analysis
20/02/04 09:50:09 INFO parse.CalcitePlanner: Creating table sqoop_v3_abrasivos.History_Demand position=13
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: get_database: sqoop_v3_abrasivos
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl ip=unknown-ip-addr cmd=get_database: sqoop_v3_abrasivos
20/02/04 09:50:09 WARN metastore.ObjectStore: Failed to get database sqoop_v3_abrasivos, returning NoSuchObjectException
FAILED: SemanticException [Error 10072]: Database does not exist: sqoop_v3_abrasivos
20/02/04 09:50:09 ERROR ql.Driver: FAILED: SemanticException [Error 10072]: Database does not exist: sqoop_v3_abrasivos
org.apache.hadoop.hive.ql.parse.SemanticException: Database does not exist: sqoop_v3_abrasivos
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1687)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1676)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.addDbAndTabToOutputs(SemanticAnalyzer.java:12171)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:12024)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11020)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
20/02/04 09:50:09 INFO ql.Driver: Completed compiling command(queryId=sga-dl_20200204095003_9b2a8fbf-d798-4146-b57c-ffcb5920891b); Time taken: 6.541 seconds
20/02/04 09:50:09 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:09 INFO session.SessionState: Resetting thread name to main
20/02/04 09:50:09 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:09 INFO session.SessionState: Deleted directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 on fs with scheme hdfs
20/02/04 09:50:09 INFO session.SessionState: Deleted directory: /tmp/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 on fs with scheme file
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
20/02/04 09:50:09 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive CliDriver exited with status=10072
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:355)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
The data from MySQL reaches HDFS, but the Hive part just doesn't run. To my mind the main offender is this error:
2020-02-04 09:50:02,957 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
How can I fix this?
I encountered this problem when trying to get a KafkaConsumer using KafkaClient.
I had set a SecurityManager on the System to block System.exit calls (in org.apache.spark.deploy.SparkSubmit.main), which would otherwise make the service exit:
import java.security.Permission

// Intercepts the "exitVM" runtime permission so that System.exit calls
// (e.g. from SparkSubmit.main) throw instead of killing the JVM.
// ExitTrappedException is a user-defined exception type.
def forbidSystemExitCall(): Unit = {
  val securityManager = new SecurityManager {
    override def checkPermission(perm: Permission): Unit = {
      if (perm.getName.startsWith("exitVM")) {
        throw new ExitTrappedException
      }
    }
  }
  System.setSecurityManager(securityManager)
}
However, this SecurityManager does not handle "javax.management.MBeanTrustPermission", which is where the problem comes from. My solution (works like a charm in my app):
import java.security.{Permission, Policy, ProtectionDomain}

Policy.setPolicy(new MBeanRegistryPolicy)

// Grants MBeanTrustPermission unconditionally and delegates every other
// permission check to the default policy.
class MBeanRegistryPolicy extends Policy {
  val defaultPolicy = Policy.getPolicy()

  override def implies(domain: ProtectionDomain, permission: Permission): Boolean = {
    if (permission.isInstanceOf[javax.management.MBeanTrustPermission]) true
    else defaultPolicy.implies(domain, permission)
  }
}
By default the JVM doesn't use a SecurityManager, so as long as I don't invoke forbidSystemExitCall (which attaches a user-defined SecurityManager to the JVM), the problem does not occur.
I hope this helps with your problem.

JMeter 5.0 / "JMeterUtils: ERROR! Resource string not found" # Aggregate Report

JMeter 5.0 r1840935 on Windows 7 64-bit
Every time I click on the Aggregate Report I get about 10 error message lines in the log window.
Is this a problem just on my PC, or does it happen on yours too?
Everything else in JMeter works fine.
Maybe it is related to this entry, but I'm not experienced enough to understand it:
JMeter - Resource string not found error
I appreciate your help.
Kind regards
Joe
=======================================
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [99%line]
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [min]
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [error%]
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [throughput]
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [received_kb/sec]
2019-01-22 20:47:16,976 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [sent_kb/sec]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [#_samples]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [median]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [90%_line]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [95%_line]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [99%line]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [min]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [error%]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [throughput]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [received_kb/sec]
2019-01-22 20:58:54,822 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [sent_kb/sec]
=======================================
This is a bug:
https://bz.apache.org/bugzilla/show_bug.cgi?id=62896
It is already fixed in the nightly build and will be included in 5.1:
https://ci.apache.org/projects/jmeter/nightlies/
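Until you can upgrade, a possible workaround (a sketch, assuming a default JMeter 5.0 install where logging is configured in bin/log4j2.xml) is to raise the log level for the JMeterUtils logger so these cosmetic warnings stop cluttering the log:

```xml
<!-- bin/log4j2.xml: add inside the <Loggers> section; hides the spurious
     "Resource string not found" WARN lines, which are cosmetic in 5.0 -->
<Logger name="org.apache.jmeter.util.JMeterUtils" level="ERROR" />
```

This only suppresses the messages; the underlying bug is what the linked report fixes.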

Unable to display aggregate graph within Jmeter

I am receiving the following error when trying to output an aggregate graph within JMeter:
2018-10-11 09:15:00,731 ERROR o.a.j.JMeter: Uncaught exception:
java.lang.IndexOutOfBoundsException: Index -1 out-of-bounds for length 13
at jdk.internal.util.Preconditions.outOfBounds(Preconditions.java:64) ~[?:?]
at jdk.internal.util.Preconditions.outOfBoundsCheckIndex(Preconditions.java:70) ~[?:?]
at jdk.internal.util.Preconditions.checkIndex(Preconditions.java:248) ~[?:?]
at java.util.Objects.checkIndex(Objects.java:372) ~[?:?]
at java.util.ArrayList.get(ArrayList.java:440) ~[?:?]
at org.apache.jorphan.gui.ObjectTableModel.getValueAt(ObjectTableModel.java:187) ~[jorphan.jar:5.0 r1840935]
at org.apache.jmeter.visualizers.StatGraphVisualizer.getData(StatGraphVisualizer.java:638) ~[ApacheJMeter_components.jar:5.0 r1840935]
at org.apache.jmeter.visualizers.StatGraphVisualizer.makeGraph(StatGraphVisualizer.java:594) ~[ApacheJMeter_components.jar:5.0 r1840935]
at org.apache.jmeter.visualizers.StatGraphVisualizer.actionMakeGraph(StatGraphVisualizer.java:790) ~[ApacheJMeter_components.jar:5.0 r1840935]
at org.apache.jmeter.visualizers.StatGraphVisualizer.actionPerformed(StatGraphVisualizer.java:702) ~[ApacheJMeter_components.jar:5.0 r1840935]
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1967) ~[?:?]
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2308) ~[?:?]
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:405) ~[?:?]
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:262) ~[?:?]
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:270) ~[?:?]
at java.awt.Component.processMouseEvent(Component.java:6589) ~[?:?]
at javax.swing.JComponent.processMouseEvent(JComponent.java:3342) ~[?:?]
at java.awt.Component.processEvent(Component.java:6354) ~[?:?]
at java.awt.Container.processEvent(Container.java:2261) ~[?:?]
at java.awt.Component.dispatchEventImpl(Component.java:4966) ~[?:?]
at java.awt.Container.dispatchEventImpl(Container.java:2319) ~[?:?]
at java.awt.Component.dispatchEvent(Component.java:4798) ~[?:?]
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4914) ~[?:?]
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4543) ~[?:?]
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4484) ~[?:?]
at java.awt.Container.dispatchEventImpl(Container.java:2305) ~[?:?]
at java.awt.Window.dispatchEventImpl(Window.java:2772) ~[?:?]
at java.awt.Component.dispatchEvent(Component.java:4798) ~[?:?]
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:772) ~[?:?]
at java.awt.EventQueue.access$600(EventQueue.java:97) ~[?:?]
at java.awt.EventQueue$4.run(EventQueue.java:721) ~[?:?]
at java.awt.EventQueue$4.run(EventQueue.java:715) ~[?:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:87) ~[?:?]
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:97) ~[?:?]
at java.awt.EventQueue$5.run(EventQueue.java:745) ~[?:?]
at java.awt.EventQueue$5.run(EventQueue.java:743) ~[?:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:87) ~[?:?]
at java.awt.EventQueue.dispatchEvent(EventQueue.java:742) ~[?:?]
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203) [?:?]
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124) [?:?]
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113) [?:?]
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109) [?:?]
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101) [?:?]
at java.awt.EventDispatchThread.run(EventDispatchThread.java:90) [?:?]
2018-10-11 09:15:04,438 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [#_samples]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [median]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [90%_line]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [95%_line]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [99%_line]
2018-10-11 09:15:04,521 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [min]
2018-10-11 09:15:04,522 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [error_%]
2018-10-11 09:15:04,522 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [throughput]
2018-10-11 09:15:04,524 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [received_kb/sec]
2018-10-11 09:15:04,524 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [sent_kb/sec]
2018-10-11 09:15:04,551 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2018-10-11 09:15:04,552 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2018-10-11 09:15:04,552 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
2018-10-11 09:15:04,552 WARN o.a.j.u.JMeterUtils: ERROR! Resource string not found: [label]
This means the graph cannot be generated, and I am unsure how to fix the issue.
These are the results and settings of my aggregate graph. Can anybody help me with this issue, please?
Thanks
This is a known bug of JMeter 5.0:
https://bz.apache.org/bugzilla/show_bug.cgi?id=62770
It is fixed in the nightly build:
https://builds.apache.org/job/JMeter-trunk/lastSuccessfulBuild/artifact/trunk/dist/
and will be available in 5.1.
In any case, the best way to analyze your test results is the Web Report (HTML dashboard), which has existed since version 3.0:
https://jmeter.apache.org/usermanual/generating-dashboard.html
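For reference, that dashboard is typically produced from a non-GUI run; a minimal sketch, assuming jmeter is on your PATH and your test plan is named test.jmx:

```shell
# Non-GUI run: execute test.jmx, write samples to results.jtl,
# and generate the HTML dashboard into the report/ directory
jmeter -n -t test.jmx -l results.jtl -e -o report
```

The -e flag generates the report at the end of the run and -o names the (must-be-empty) output directory; you can also generate a report afterwards from an existing .jtl file with -g.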

Fetch the Data from DB2 and save in MongoDB using Spring Batch without metadata tables

I need to fetch all the data from a Db2 table and copy it to MongoDB. I am using Spring Batch without metadata tables, with a JpaPagingItemReader to read from Db2. The first fetch (sized by the pageSize attribute) is read from Db2 and saved to MongoDB successfully, but fetching the next page with pagination fails with the error below.
Below given the Error:
15:17:36,085 INFO [stdout] (default task-2) 2017-12-06 15:17:36.084 INFO 7088 --- [ default task-2] c.t.controller.ItemController : About to copy All Item
15:17:36,135 INFO [stdout] (default task-2) 2017-12-06 15:17:36.135 INFO 7088 --- [ default task-2] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=readDB2]] launched with the following parameters: [{}]
15:17:36,180 INFO [stdout] (default task-2) 2017-12-06 15:17:36.180 INFO 7088 --- [ default task-2] o.s.batch.core.job.SimpleStepHandler : Executing step: [step1]
15:17:36,215 INFO [stdout] (default task-2) Hibernate: select itm0_.itm_nbr as t1_4_, itm0_.itm_des_txt as itm_des13_4_ from itm itm0_ fetch first 10 rows only
15:17:36,884 INFO [stdout] (default task-2) 2017-12-06 15:17:36.884 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:38,621 INFO [stdout] (default task-2) 2017-12-06 15:17:38.621 INFO 7088 --- [ default task-2] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:1998}] to lxmdbpmdmdev001:27017
15:17:38,894 INFO [stdout] (default task-2) 2017-12-06 15:17:38.894 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:39,139 INFO [stdout] (default task-2) 2017-12-06 15:17:39.139 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:39,386 INFO [stdout] (default task-2) 2017-12-06 15:17:39.386 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:39,632 INFO [stdout] (default task-2) 2017-12-06 15:17:39.632 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:39,881 INFO [stdout] (default task-2) 2017-12-06 15:17:39.881 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:40,123 INFO [stdout] (default task-2) 2017-12-06 15:17:40.123 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:40,377 INFO [stdout] (default task-2) 2017-12-06 15:17:40.377 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:40,619 INFO [stdout] (default task-2) 2017-12-06 15:17:40.619 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:40,865 INFO [stdout] (default task-2) 2017-12-06 15:17:40.865 DEBUG 7088 --- [ default task-2] o.s.data.mongodb.core.MongoTemplate : Saving DBObject containing fields: [_class, itmDesTxt, itemId]
15:17:41,109 INFO [stdout] (default task-2) Hibernate: select * from ( select inner2_.*, rownumber() over(order by order of inner2_) as rownumber_ from ( select itm0_.itm_nbr as t1_4_, itm0_.itm_des_txt as itm_des13_4_ from itm itm0_ fetch first 20 rows only ) as inner2_ ) as inner1_ where rownumber_ > 10 order by rownumber_
15:17:41,346 WARN [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) SQL Error: -199, SQLState: 42601
15:17:41,347 ERROR [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) DB2 SQL Error: SQLCODE=-199, SQLSTATE=42601, SQLERRMC=OF;;??( [ DESC ASC NULLS RANGE CONCAT || / MICROSECONDS MICROSECOND, DRIVER=4.18.60
15:17:41,347 WARN [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) SQL Error: -516, SQLState: 26501
15:17:41,347 ERROR [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) DB2 SQL Error: SQLCODE=-516, SQLSTATE=26501, SQLERRMC=null, DRIVER=4.18.60
15:17:41,347 WARN [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) SQL Error: -514, SQLState: 26501
15:17:41,347 ERROR [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] (default task-2) DB2 SQL Error: SQLCODE=-514, SQLSTATE=26501, SQLERRMC=SQL_CURLN200C1, DRIVER=4.18.60
15:17:41,362 INFO [stdout] (default task-2) 2017-12-06 15:17:41.362 ERROR 7088 --- [ default task-2] o.s.batch.core.step.AbstractStep : Encountered an error executing step step1 in job readDB2
15:17:41,362 INFO [stdout] (default task-2)
15:17:41,362 INFO [stdout] (default task-2) javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not extract ResultSet
15:17:41,362 INFO [stdout] (default task-2) at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1692) ~[hibernate-entitymanager-5.0.12.Final.jar:5.0.12.Final]
15:17:41,362 INFO [stdout] (default task-2) at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1602) ~[hibernate-entitymanager-5.0.12.Final.jar:5.0.12.Final]
15:17:41,362 INFO [stdout] (default task-2) at org.hibernate.jpa.internal.QueryImpl.getResultList(QueryImpl.java:492) ~[hibernate-entitymanager-5.0.12.Final.jar:5.0.12.Final]
15:17:41,362 INFO [stdout] (default task-2) at org.springframework.batch.item.database.JpaPagingItemReader.doReadPage(JpaPagingItemReader.java:225) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,362 INFO [stdout] (default task-2) at org.springframework.batch.item.database.AbstractPagingItemReader.doRead(AbstractPagingItemReader.java:108) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,362 INFO [stdout] (default task-2) at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:88) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,362 INFO [stdout] (default task-2) at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:91) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,362 INFO [stdout] (default task-2) at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:157) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:116) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:374) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:110) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:69) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133) ~[spring-tx-4.3.12.RELEASE.jar:4.3.12.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:272) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,363 INFO [stdout] (default task-2) at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:81) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:374) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:200) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:392) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.job.SimpleJob.doExecute(SimpleJob.java:135) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:306) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50) [spring-core-4.3.12.RELEASE.jar:4.3.12.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128) [spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
15:17:41,376 INFO [stdout] (default task-2) at com.test.service.impl.ItemServiceImpl.itemCopy(ItemServiceImpl.java:266) [classes:na]
15:17:41,376 INFO [stdout] (default task-2) at com.test.controller.ItemController.allItemCopy(ItemController.java:48) [classes:na]
....
15:17:41,410 INFO [stdout] (default task-2) 2017-12-06 15:17:41.410 INFO 7088 --- [ default task-2] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=readDB2]] completed with the following parameters: [{}] and the following status: [FAILED]
Batch Configuration:
@Configuration
@EnableBatchProcessing
public class ItemBatch {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private StepBuilderFactory stepBuilderFactory;
    @Autowired
    EntityManagerFactory entityManagerFactory;
    @Autowired
    MongoTemplate mongoTemplate;

    @Bean
    public Job readDB2() {
        return jobBuilderFactory.get("readDB2").start(step1()).build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Itm, com.test.model.mongodb.Itm> chunk(5)
                .reader(reader())
                .writer(writer())
                .build();
    }

    @Bean
    public ItemStreamReader<Itm> reader() {
        JpaPagingItemReader<Itm> databaseReader = new JpaPagingItemReader<>();
        try {
            databaseReader.setEntityManagerFactory(entityManagerFactory);
            JpaQueryProviderImpl<Itm> jpaQueryProvider = new JpaQueryProviderImpl<>();
            jpaQueryProvider.setQuery("Itm.findAll");
            databaseReader.setQueryProvider(jpaQueryProvider);
            databaseReader.setPageSize(10);
            databaseReader.afterPropertiesSet();
        } catch (Exception e) {
            System.err.println("Error :" + e);
        }
        return databaseReader;
    }

    @Bean
    public MongoItemWriter<com.test.model.mongodb.Itm> writer() {
        MongoItemWriter<com.test.model.mongodb.Itm> writer = new MongoItemWriter<>();
        try {
            writer.setTemplate(mongoTemplate);
        } catch (Exception e) {
            e.printStackTrace();
        }
        writer.setCollection("itm");
        return writer;
    }
}
In-Memory Configuration instead of metadata:
@Configuration
public class SpringBatchTestConfiguration {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
            throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
        return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
                factory.getStepExecutionDao(), factory.getExecutionContextDao());
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }
}
QueryProvider:
public class JpaQueryProviderImpl<E> extends AbstractJpaQueryProvider {

    private Class<E> entityClass;
    private String query;

    @Override
    public Query createQuery() {
        return getEntityManager().createNamedQuery(query, entityClass);
    }

    public void setQuery(String query) {
        this.query = query;
    }

    public void setEntityClass(Class<E> entityClazz) {
        this.entityClass = entityClazz;
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        Assert.isTrue(StringUtils.hasText(query), "Query cannot be empty");
        Assert.notNull(entityClass, "Entity class cannot be NULL");
    }
}
Entity:
@Entity
@Table(name = "ITM")
@NamedQueries({ @NamedQuery(name = "Itm.findAll", query = "select i from Itm i") })
public class Itm implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @Column(name = "ITEM_NBR")
    private int itemId;

    @Column(name = "ITM_DES_TXT")
    private String itmDesTxt;

    // getters and setters
}
The root cause is this error:
DB2 SQL Error: SQLCODE=-199, SQLSTATE=42601, SQLERRMC=OF;;??( [ DESC ASC NULLS RANGE CONCAT || / MICROSECONDS MICROSECOND, DRIVER=4.18.60
which points you to the OF keyword being unexpected.
ORDER BY ORDER OF in the OLAP specification is not valid in Db2 for z/OS v11, as can be seen in the manual. You'll need to rewrite your statement to order by some explicit expression -- or skip ORDER BY altogether since you're not ordering the inner2_ subselect anyway.
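To illustrate, a hedged rewrite of the generated statement from the log that avoids ORDER BY ORDER OF (the table and column names are taken from the log above; choosing itm_nbr as the ordering key is an assumption, since any deterministic column works):

```sql
-- Paginate rows 11-20 ordering by an explicit column (itm_nbr assumed unique)
SELECT inner1_.itm_nbr, inner1_.itm_des_txt
FROM (
    SELECT itm0_.itm_nbr, itm0_.itm_des_txt,
           ROW_NUMBER() OVER (ORDER BY itm0_.itm_nbr) AS rownumber_
    FROM itm itm0_
) AS inner1_
WHERE inner1_.rownumber_ > 10 AND inner1_.rownumber_ <= 20
ORDER BY inner1_.rownumber_
```

Ordering by an explicit column also makes the pagination stable across pages, which ORDER BY ORDER OF of an unordered subselect would not guarantee anyway.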
This is not an issue with Spring Batch; it is an issue with Spring's pagination SQL against Db2 for z/OS. To fix it, we have to go with a different Hibernate dialect. I am still facing issues with the new dialect and need to find the proper dialect configuration.
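One avenue for that dialect configuration, offered as a sketch rather than a confirmed fix: Hibernate 5.x ships a dialect aimed specifically at Db2 for z/OS (DB2390Dialect), which generates different pagination SQL than the generic DB2Dialect. Whether it resolves this particular SQLCODE -199 depends on the Db2 version in use:

```properties
# application.properties sketch: point Hibernate at the Db2-for-z/OS dialect.
# The spring.jpa.properties prefix assumes Spring Boot; with plain JPA, set
# hibernate.dialect in persistence.xml instead.
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.DB2390Dialect
```

If the built-in dialect still emits unsupported SQL, subclassing it and overriding its limit handling is the usual escape hatch.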

Resolve dao.DuplicateKeyException in Spring MVC

When I run the code below I get the error shown. How can I resolve it? Thanks.
StringBuffer contactQuery = new StringBuffer();
contactQuery.append("Insert into rescontact (");
contactQuery.append("ContactSeq,ResID,LastName,FirstName,ContactLabel,Phone1,Phone2)");
contactQuery.append("Values( ?,?,?,?,?,?,?)");
logClient.debug("Insert Query " + contactQuery.toString());
System.out.println(" Contact Insertion Query " + contactQuery.toString());
try {
    JdbcTemplate jdbcTemplate = this.getJdbcTemplate();
    return jdbcTemplate.update(contactQuery.toString(), new Object[] { contactSeq, resId, lastName,
            firstName, "WEB", Long.parseLong(contactresult), Long.parseLong(alternatecontactresult) });
Exception:
ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
org.springframework.dao.DuplicateKeyException: PreparedStatementCallback; SQL [Insert into rescontact (ContactSeq,ResID,LastName,FirstName,ContactLabel,Phone1,Phone2)Values( ?,?,?,?,?,?,?)]; ORA-00001: unique constraint (DOR_RESTINSP.SYS_C0011900) violated
09:31:59,851 ERROR [stderr] (http-localhost/127.0.0.1:8080-2) ; nested exception is java.sql.SQLIntegrityConstraintViolationException: ORA-00001: unique constraint (DOR_RESTINSP.SYS_C0011900) violated
09:31:59,852 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
09:31:59,852 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:239)
09:31:59,852 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73)
09:31:59,852 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:660)
09:31:59,852 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:909)
09:31:59,853 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:970)
09:31:59,853 ERROR [stderr] (http-localhost/127.0.0.1:8080-2)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:980)
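ORA-00001 means the row being inserted collides with a key that already exists under the unique constraint DOR_RESTINSP.SYS_C0011900 (a system-generated name, most plausibly covering ContactSeq or a ResID/ContactSeq combination). Rather than passing a hand-computed contactSeq, one common fix is to let an Oracle sequence assign it; the sequence name below is hypothetical:

```sql
-- Hypothetical sequence; create it once, then use NEXTVAL in the insert so
-- every row gets a fresh, never-reused key value.
CREATE SEQUENCE rescontact_seq START WITH 1 INCREMENT BY 1;

INSERT INTO rescontact (ContactSeq, ResID, LastName, FirstName, ContactLabel, Phone1, Phone2)
VALUES (rescontact_seq.NEXTVAL, ?, ?, ?, ?, ?, ?);
```

To confirm which columns the constraint actually covers, query ALL_CONS_COLUMNS for constraint name SYS_C0011900 before choosing the fix.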
