I get the following exception when I try to run Grails from the terminal in OSX:
| Loading Grails 2.3.6
| Error java.lang.reflect.InvocationTargetException
| Error at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
| Error at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
| Error at java.lang.reflect.Method.invoke(Method.java:597)
| Error at org.codehaus.groovy.grails.cli.support.GrailsStarter.rootLoader(GrailsStarter.java:235)
| Error at org.codehaus.groovy.grails.cli.support.GrailsStarter.main(GrailsStarter.java:263)
| Error Caused by: java.lang.IllegalAccessError: class sun.reflect.GeneratedConstructorAccessor2 cannot access its superclass sun.reflect.ConstructorAccessorImpl
| Error at sun.misc.Unsafe.defineClass(Native Method)
| Error at sun.reflect.ClassDefiner.defineClass(ClassDefiner.java:45)
| Error at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:381)
| Error at java.security.AccessController.doPrivileged(Native Method)
| Error at sun.reflect.MethodAccessorGenerator.generate(MethodAccessorGenerator.java:377)
| Error at sun.reflect.MethodAccessorGenerator.generateConstructor(MethodAccessorGenerator.java:76)
| Error at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:30)
| Error at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
| Error at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
| Error at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
| Error at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:202)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor.addRuleForModuleId(EnhancedDefaultDependencyDescriptor.groovy:135)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor$addRuleForModuleId$0.callCurrent(Unknown Source)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor.excludeForMap(EnhancedDefaultDependencyDescriptor.groovy:113)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor.this$3$excludeForMap(EnhancedDefaultDependencyDescriptor.groovy)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor$this$3$excludeForMap.callCurrent(Unknown Source)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor.<init>(EnhancedDefaultDependencyDescriptor.groovy:76)
| Error at org.codehaus.groovy.grails.resolve.EnhancedDefaultDependencyDescriptor.<init>(EnhancedDefaultDependencyDescriptor.groovy:80)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies.registerDependency(GrailsIvyDependencies.groovy:69)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies.registerDependencies(GrailsIvyDependencies.groovy:58)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
| Error at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
| Error at java.lang.reflect.Method.invoke(Method.java:597)
| Error at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
| Error at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite$StaticMetaMethodSiteNoUnwrapNoCoerce.invoke(StaticMetaMethodSite.java:148)
| Error at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite.call(StaticMetaMethodSite.java:88)
| Error at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:124)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies$_createDeclaration_closure1_closure3.doCall(GrailsIvyDependencies.groovy:117)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies$_createDeclaration_closure1_closure3.doCall(GrailsIvyDependencies.groovy)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
| Error at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
| Error at java.lang.reflect.Method.invoke(Method.java:597)
| Error at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
| Error at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:64)
| Error at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
| Error at org.codehaus.groovy.grails.resolve.config.DependencyConfigurationConfigurer.dependencies(DependencyConfigurationConfigurer.groovy:150)
| Error at org.codehaus.groovy.grails.resolve.config.DependencyConfigurationConfigurer$dependencies$1.call(Unknown Source)
| Error at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies$_createDeclaration_closure1.doCall(GrailsIvyDependencies.groovy:102)
| Error at org.codehaus.groovy.grails.resolve.GrailsIvyDependencies$_createDeclaration_closure1.doCall(GrailsIvyDependencies.groovy)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
| Error at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
| Error at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
| Error at java.lang.reflect.Method.invoke(Method.java:597)
| Error at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
| Error at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
| Error at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1086)
| Error at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1110)
| Error at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:910)
| Error at groovy.lang.Closure.call(Closure.java:411)
| Error at groovy.lang.Closure.call(Closure.java:405)
| Error at org.codehaus.groovy.grails.resolve.AbstractIvyDependencyManager.doParseDependencies(AbstractIvyDependencyManager.java:676)
| Error at org.codehaus.groovy.grails.resolve.AbstractIvyDependencyManager.parseDependencies(AbstractIvyDependencyManager.java:577)
| Error at org.codehaus.groovy.grails.resolve.DependencyManager$parseDependencies.call(Unknown Source)
| Error at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
| Error at org.codehaus.groovy.grails.resolve.DependencyManagerConfigurer.configureIvy(DependencyManagerConfigurer.groovy:157)
| Error at grails.util.BuildSettings.configureDependencyManager(BuildSettings.groovy:1281)
| Error at grails.util.BuildSettings.postLoadConfig(BuildSettings.groovy:1219)
| Error at grails.util.BuildSettings.loadConfig(BuildSettings.groovy:1075)
| Error at grails.util.BuildSettings.loadConfig(BuildSettings.groovy)
| Error at grails.util.BuildSettings$loadConfig$0.callCurrent(Unknown Source)
| Error at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
| Error at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:141)
| Error at grails.util.BuildSettings.loadConfig(BuildSettings.groovy:1053)
| Error at org.codehaus.groovy.grails.cli.GrailsScriptRunner.loadConfigEnvironment(GrailsScriptRunner.java:249)
| Error at org.codehaus.groovy.grails.cli.GrailsScriptRunner.main(GrailsScriptRunner.java:210)
| Error ... 6 more
I can run it fine from within IntelliJ. I know it's something with my environment configuration, but I haven't been able to figure out what yet. Does anyone have any ideas?
I'm running Java:
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)
OSX: 10.8.4
Are you exporting JAVA_HOME? I know on Linux I have to every time I want to use grails from the command line.
Take a look here; maybe try updating to the JDK, since it does say JRE (not a Mac expert, though). Also, a few things to try:
http://liberalsprouts.blogspot.co.uk/2012/12/how-to-install-jdk-7-and-set-up.html
echo $PATH
echo $JAVA_HOME
java -version
and
$JAVA_HOME/bin/java -version
and see if they all report the same thing. I guess they will; if you follow the instructions, maybe see how the latest JDK treats it.
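As a quick sanity check, those comparisons can be combined into one snippet. This is only a sketch; the paths printed depend entirely on your machine, and on OS X the usual way to derive JAVA_HOME is `export JAVA_HOME=$(/usr/libexec/java_home)`:

```shell
# Sketch: compare the java binary found on PATH with the one under $JAVA_HOME.
# (On OS X, JAVA_HOME is typically set with: export JAVA_HOME=$(/usr/libexec/java_home))
path_java=$(command -v java || echo "none")
home_java="${JAVA_HOME:-unset}/bin/java"
echo "PATH java:      $path_java"
echo "JAVA_HOME java: $home_java"
# Note: symlinks (e.g. /usr/bin/java on OS X) can make these strings differ
# even when they resolve to the same install, so treat this as a heuristic.
if [ "$path_java" != "$home_java" ]; then
  echo "warning: PATH and JAVA_HOME point at different paths"
fi
```

If the two genuinely disagree, exporting JAVA_HOME and prepending $JAVA_HOME/bin to PATH in your shell profile usually sorts it out.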
To figure out what your IntelliJ is using:
jps
3735
7588 GrailsStarter
7660 ForkedTomcatServer
7783 Jps
In my case it's GGTS. If I now run:
lsof -p 7588|grep -i java|grep jdk|head -n 3
java 7588 mx1 txt REG 8,1 38568 18745123 /usr/lib/jvm/java-6-openjdk-i386/jre/bin/java
java 7588 mx1 mem REG 8,1 71084 12982357 /usr/lib/jvm/java-6-openjdk-i386/jre/lib/i386/libj2pkcs11.so
java 7588 mx1 mem REG 8,1 85518 18745490 /usr/lib/jvm/java-6-openjdk-common/jre/lib/jce.jar
That's lsof -p {process id}. I piped it into head to minimise the output, but it gives me an idea of which JDK it is using; maybe you can trace what is going on, your exports, etc., using this method.
SonarQube version: 9.7.1.62043 (Developer Edition)
Database: PostgreSQL
Installed using: Docker
Integration with: GitLab Community Edition 15.5.2
Cannot go to the project dashboard in SonarQube; I get the exceptions mentioned below.
Issue: after clicking on a project, a pop-up appears with the message 'An error has occurred. Please contact your administrator'.
We have configured GitLab CI to run SonarScanner on every pull request.
In the Docker logs I can see entries like:
web.log
2023.01.30 07:24:06 ERROR web[AYX0AdNMD90rem68AAMM][o.s.s.w.WebServiceEngine] Fail to process request http://MY_SONAR_URL/api/project_pull_requests/list?project=MY_PROJECT_ID
java.lang.NullPointerException: Pull request data should be available for branch type PULL_REQUEST
at java.base/java.util.Objects.requireNonNull(Objects.java:246)
ce.log
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.s.ComputationStepExecutor] Compute new coverage | status=FAILED | time=99ms
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.p.a.p.PostProjectAnalysisTasksExecutor] Webhooks | globalWebhooks=0 | projectWebhooks=0 | status=SUCCESS | time=2ms
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][c.s.G.D.F.B] GitLab's instance URL from the scanner ('https://gitlab.MYDOMAIN/api/v4') is overridden by the settings ('https://gitlab.MYDOMAIN/api/v4')
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][c.s.G.D.F.B] GitLab's project ID from the scanner ('74') is overridden by the settings ('74')
2023.01.31 13:14:45 ERROR ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.p.a.p.PostProjectAnalysisTasksExecutor] Execution of task class com.sonarsource.G.D.d failed
java.lang.NullPointerException: null
at org.sonar.db.protobuf.DbProjectBranches$PullRequestData$Builder.mergeFrom(DbProjectBranches.java:864)
at org.sonar.db.protobuf.DbProjectBranches$PullRequestData.newBuilder(DbProjectBranches.java:697)
at com.sonarsource.G.D.d.A(Unknown Source)
at java.base/java.util.Optional.ifPresent(Optional.java:183)
at com.sonarsource.G.D.d.A(Unknown Source)
at com.sonarsource.G.D.d.B(Unknown Source)
at org.sonar.ce.async.SynchronousAsyncExecution.addToQueue(SynchronousAsyncExecution.java:27)
at com.sonarsource.G.D.d.A(Unknown Source)
at java.base/java.util.Optional.ifPresent(Optional.java:183)
at com.sonarsource.G.D.d.finished(Unknown Source)
at org.sonar.ce.task.projectanalysis.api.posttask.PostProjectAnalysisTasksExecutor.executeTask(PostProjectAnalysisTasksExecutor.java:102)
at org.sonar.ce.task.projectanalysis.api.posttask.PostProjectAnalysisTasksExecutor.finished(PostProjectAnalysisTasksExecutor.java:93)
at org.sonar.ce.task.step.ComputationStepExecutor.executeListener(ComputationStepExecutor.java:89)
at org.sonar.ce.task.step.ComputationStepExecutor.execute(ComputationStepExecutor.java:61)
at org.sonar.ce.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(ReportTaskProcessor.java:75)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.executeTask(CeWorkerImpl.java:212)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.run(CeWorkerImpl.java:194)
at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(CeWorkerImpl.java:160)
at org.sonar.ce.taskprocessor.CeWorkerImpl$TrackRunningState.get(CeWorkerImpl.java:135)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:87)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:53)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:74)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.p.a.p.PostProjectAnalysisTasksExecutor] Pull Request decoration | status=FAILED | time=10ms
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.p.a.p.PostProjectAnalysisTasksExecutor] Report branch Quality Gate status to devops platforms | status=SUCCESS | time=0ms
2023.01.31 13:14:45 ERROR ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.CeWorkerImpl] Failed to execute task AYYH9x3NGuaUdN9D31Cm
org.sonar.ce.task.projectanalysis.component.VisitException: Visit failed for Component {key=KEY:SOME_PATH/PricingServiceImpl.java,type=FILE} located KEY:SOME_PATH(type=DIRECTORY)->KEY:FILE_PATH(type=DIRECTORY)->KEY:FILE_PATH(type=DIRECTORY)->KEY:quote(type=DIRECTORY)->KEY(type=PROJECT)
at org.sonar.ce.task.projectanalysis.component.VisitException.rethrowOrWrap(VisitException.java:44)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:52)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitChildren(PathAwareCrawler.java:87)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:70)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitChildren(PathAwareCrawler.java:87)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:70)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitChildren(PathAwareCrawler.java:87)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:70)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitChildren(PathAwareCrawler.java:87)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:70)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitChildren(PathAwareCrawler.java:87)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:70)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
at org.sonar.ce.task.projectanalysis.step.NewCoverageMeasuresStep.execute(NewCoverageMeasuresStep.java:90)
at org.sonar.ce.task.step.ComputationStepExecutor.executeStep(ComputationStepExecutor.java:79)
at org.sonar.ce.task.step.ComputationStepExecutor.executeSteps(ComputationStepExecutor.java:70)
at org.sonar.ce.task.step.ComputationStepExecutor.execute(ComputationStepExecutor.java:57)
at org.sonar.ce.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(ReportTaskProcessor.java:75)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.executeTask(CeWorkerImpl.java:212)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.run(CeWorkerImpl.java:194)
at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(CeWorkerImpl.java:160)
at org.sonar.ce.taskprocessor.CeWorkerImpl$TrackRunningState.get(CeWorkerImpl.java:135)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:87)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:53)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:74)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.NullPointerException: null
at org.sonar.ce.task.projectanalysis.scm.ScmInfoRepositoryImpl.removeAuthorAndRevision(ScmInfoRepositoryImpl.java:104)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:550)
at java.base/java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.base/java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:517)
at org.sonar.ce.task.projectanalysis.scm.ScmInfoRepositoryImpl.removeAuthorAndRevision(ScmInfoRepositoryImpl.java:99)
at org.sonar.ce.task.projectanalysis.scm.ScmInfoRepositoryImpl.generateAndMergeDb(ScmInfoRepositoryImpl.java:119)
at org.sonar.ce.task.projectanalysis.scm.ScmInfoRepositoryImpl.getScmInfoForComponent(ScmInfoRepositoryImpl.java:74)
at java.base/java.util.HashMap.computeIfAbsent(HashMap.java:1134)
at org.sonar.ce.task.projectanalysis.scm.ScmInfoRepositoryImpl.getScmInfo(ScmInfoRepositoryImpl.java:65)
at org.sonar.ce.task.projectanalysis.source.NewLinesRepository.computeNewLinesFromScm(NewLinesRepository.java:72)
at org.sonar.ce.task.projectanalysis.source.NewLinesRepository.getNewLines(NewLinesRepository.java:64)
at org.sonar.ce.task.projectanalysis.step.NewCoverageMeasuresStep$NewCoverageCounter.initialize(NewCoverageMeasuresStep.java:205)
at org.sonar.ce.task.projectanalysis.formula.FormulaExecutorComponentVisitor.processLeaf(FormulaExecutorComponentVisitor.java:148)
at org.sonar.ce.task.projectanalysis.formula.FormulaExecutorComponentVisitor.process(FormulaExecutorComponentVisitor.java:125)
at org.sonar.ce.task.projectanalysis.formula.FormulaExecutorComponentVisitor.visitFile(FormulaExecutorComponentVisitor.java:105)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitNode(PathAwareCrawler.java:102)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visitImpl(PathAwareCrawler.java:73)
at org.sonar.ce.task.projectanalysis.component.PathAwareCrawler.visit(PathAwareCrawler.java:50)
... 35 common frames omitted
2023.01.31 13:14:45 INFO ce[AYYH9x3NGuaUdN9D31Cm][o.s.c.t.CeWorkerImpl] Executed task | project=KEY | type=REPORT | pullRequest=8837 | id=AYYH9x3NGuaUdN9D31Cm | submitter=sonarqube11609 | status=FAILED | time=2908ms
2023.01.31 13:26:06 INFO ce[][o.s.c.t.CeWorkerImpl] Execute task | project=KEY | type=REPORT | pullRequest=8837 | id=AYYIAY0jGuaUdN9D31Co | submitter=sonarqube11609
2023.01.31 13:26:07 INFO ce[AYYIAY0jGuaUdN9D31Co][o.s.c.t.s.ComputationStepExecutor] Extract report | status=SUCCESS | time=876ms
2023.01.31 13:26:07 INFO ce[AYYIAY0jGuaUdN9D31Co][o.s.c.t.s.ComputationStepExecutor] Persist scanner context | status=SUCCESS | time=35ms
2023.01.31 13:26:07 INFO ce[AYYIAY0jGuaUdN9D31Co][o.s.c.t.s.ComputationStepExecutor] Propagate analysis warnings from scanner report | status=SUCCESS | time=6ms
This does not fail for all pull requests, only for a few.
I'm new to Kafka but had Debezium Connect/MySQL up and running in Docker just fine. All of a sudden all the Docker containers were gone, and upon restart, the attempt to reconnect to MySQL fails: the JDBC connection produces this response from the Connect REST API:
Using ZOOKEEPER_CONNECT=0.0.0.0:2181
Using KAFKA_LISTENERS=PLAINTEXT://172.19.0.5:9092 and KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://172.19.0.5:9092
HTTP/1.1 400 Bad Request
Date: Fri, 30 Apr 2021 20:39:56 GMT
Content-Type: application/json
Content-Length:
Server: Jetty(9.4.33.v20201020)
{"error_code":400,"message":"Connector configuration is invalid
and contains the following 1 error(s): \nUnable to connect: The server time
zone value 'EDT' is unrecognized or represents more than one time zone. You
must configure either the server or JDBC driver (via the 'serverTimezone'
configuration property) to use a more specifc time zone value if you want
to utilize time zone support.\nYou can also find the above list of errors
at the endpoint `/connector-plugins/{connectorType}/config/validate`"}
in response to running this:
docker run -it --rm --net mynet_default debezium/kafka \
curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" \
mynet_connect_1:8083/connectors/ \
-d '{
"name": "my-connector-001",
"config": {
"connector.class": "io.debezium.connector.mysql.MySqlConnector",
"tasks.max": "1",
"database.hostname": "my.domain.com",
"database.port": "3306",
"database.user": "myuser",
"database.password": "mypassword",
"database.server.id": "6400",
"database.server.name": "dbserver001",
"database.include.list": "mydb",
"database.history.kafka.bootstrap.servers": "kafka:9092",
"database.history.kafka.topic": "dbhistory.metrics.connector004",
"table.include.list":"mydb.users,mydb.applications"
} }'
This worked fine for a couple of hours. Then, as I was watching updates mostly straight from the Debezium tutorial, all the Kafka containers disappeared, and ever since (with the exact same config) it will no longer connect, citing the timezone issue. I can connect with the same credentials in the mysql client (via the Docker network), and the MySQL permissions and grants did not change. There's a Gitter mention from last July that this error is itself an erroneous indication of some other connection failure, and there are multiple reports of it being a bug in JDBC. Is there any other possibility besides someone having changed something on our database?
This is the Connect log:
connect_1 | 2021-04-30 20:28:47,254 INFO || [Worker clientId=connect-1, groupId=1] Session key updated [org.apache.kafka.connect.runtime.distributed.DistributedHerder]
connect_1 | 2021-04-30 20:39:56,996 ERROR || Failed testing connection for jdbc:mysql://my.domain.com:3306/?useInformationSchema=true&nullCatalogMeansCurrent=false&useSSL=false&useUnicode=true&characterEncoding=UTF-8&characterSetResults=UTF-8&zeroDateTimeBehavior=CONVERT_TO_NULL&connectTimeout=30000 with user 'myuser' [io.debezium.connector.mysql.MySqlConnector]
connect_1 | java.sql.SQLException: The server time zone value 'EDT' is unrecognized or represents more than one time zone. You must configure either the server or JDBC driver (via the 'serverTimezone' configuration property) to use a more specifc time zone value if you want to utilize time zone support.
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:73)
connect_1 | at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:76)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:836)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:456)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:246)
connect_1 | at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:197)
connect_1 | at io.debezium.jdbc.JdbcConnection.lambda$patternBasedFactory$1(JdbcConnection.java:231)
connect_1 | at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:872)
connect_1 | at io.debezium.connector.mysql.MySqlConnection.connection(MySqlConnection.java:79)
connect_1 | at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:867)
connect_1 | at io.debezium.jdbc.JdbcConnection.connect(JdbcConnection.java:413)
connect_1 | at io.debezium.connector.mysql.MySqlConnector.validateConnection(MySqlConnector.java:98)
connect_1 | at io.debezium.connector.common.RelationalBaseSourceConnector.validate(RelationalBaseSourceConnector.java:52)
connect_1 | at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:375)
connect_1 | at org.apache.kafka.connect.runtime.AbstractHerder.lambda$validateConnectorConfig$1(AbstractHerder.java:326)
connect_1 | at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
connect_1 | at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
connect_1 | at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
connect_1 | at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
connect_1 | at java.base/java.lang.Thread.run(Thread.java:834)
connect_1 | Caused by: com.mysql.cj.exceptions.InvalidConnectionAttributeException: The server time zone value 'EDT' is unrecognized or represents more than one time zone. You must configure either the server or JDBC driver (via the 'serverTimezone' configuration property) to use a more specifc time zone value if you want to utilize time zone support.
connect_1 | at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
connect_1 | at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
connect_1 | at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
connect_1 | at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
connect_1 | at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:61)
connect_1 | at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:85)
connect_1 | at com.mysql.cj.util.TimeUtil.getCanonicalTimezone(TimeUtil.java:132)
connect_1 | at com.mysql.cj.protocol.a.NativeProtocol.configureTimezone(NativeProtocol.java:2120)
connect_1 | at com.mysql.cj.protocol.a.NativeProtocol.initServerSession(NativeProtocol.java:2143)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.initializePropsFromServer(ConnectionImpl.java:1310)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:967)
connect_1 | at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:826)
connect_1 | ... 17 more
connect_1 | 2021-04-30 20:39:56,998 INFO || AbstractConfig values:
connect_1 | [org.apache.kafka.common.config.AbstractConfig]
Is there a parameter I could pass in via the Connect REST API to specify the timezone in the JDBC string (shown near the top of this log)? I'm using the Debezium (1.5) Docker images per this tutorial.
I think EDT is not in the mysql.time_zone_name table:
SELECT * FROM mysql.time_zone_name;
Adding the configuration "database.serverTimezone" solves this issue on my end, e.g.
"config": {
...
"database.serverTimezone": "America/Los_Angeles",
...
}
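To connect this back to the original curl payload: here is a hypothetical full config with the database.serverTimezone line added. The hostname, credentials, and zone are placeholders copied from the question and from this answer; use the zone your server actually runs in.

```shell
# Hypothetical connector config: the payload from the question with the
# database.serverTimezone override added. All values are placeholders.
cat > /tmp/my-connector-001.json <<'EOF'
{
  "name": "my-connector-001",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "my.domain.com",
    "database.port": "3306",
    "database.user": "myuser",
    "database.password": "mypassword",
    "database.serverTimezone": "America/Los_Angeles",
    "database.server.id": "6400",
    "database.server.name": "dbserver001",
    "database.include.list": "mydb"
  }
}
EOF
echo "wrote /tmp/my-connector-001.json"
```

You can then POST it exactly as in the question, e.g. with curl -d @/tmp/my-connector-001.json against the Connect REST API.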
In my case, using Kafka Connect, I had to modify the connector config's MySQL connection string (the connection.url property) like this:
jdbc:mysql://<server ip or dns>:<port>?serverTimezone=GMT%2b8:00&<more parameters>
Note: %2b is the URL-encoded + character.
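A quick way to produce the encoded value (a sketch; the host, port, and offset are placeholders):

```shell
# Sketch: URL-encode the '+' in a GMT offset before embedding it in the JDBC URL.
tz='GMT+8:00'
encoded=$(printf '%s' "$tz" | sed 's/+/%2B/g')
echo "jdbc:mysql://db.example.com:3306?serverTimezone=$encoded"
```

Without the encoding, the + would be interpreted as a space in the URL and the offset would be mangled.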
I'm new to OpenDaylight and I was following a tutorial to build a simple Hello World project (this tutorial), but when I run the project with ./karaf and check whether the module is initialized with log:display | grep hello, I get this error:
2017-08-04 12:43:57,159 | INFO | Event Dispatcher | YangTextSchemaContextResolver | 55 - org.opendaylight.yangtools.yang-parser-impl - 1.0.2.Boron-SR2 | Provided module name /META-INF/yang/hello.yang#0000-00-00.yang does not match actual text hello#2015-01-05.yang, corrected
2017-08-04 12:44:01,928 | INFO | Event Dispatcher | YangTextSchemaContextResolver | 55 - org.opendaylight.yangtools.yang-parser-impl - 1.0.2.Boron-SR2 | Provided module name /META-INF/yang/hello.yang#0000-00-00.yang does not match actual text hello#2015-01-05.yang, corrected
2017-08-04 12:44:08,295 | INFO | Event Dispatcher | BlueprintBundleTracker | 148 - org.opendaylight.controller.blueprint - 0.5.2.Boron-SR2 | Creating blueprint container for bundle org.opendaylight.hello.impl_0.1.0.SNAPSHOT [174] with paths [bundleentry://174.fwk592688102/org/opendaylight/blueprint/impl-blueprint.xml]
2017-08-04 12:44:08,318 | ERROR | Event Dispatcher | BlueprintContainerImpl | 15 - org.apache.aries.blueprint.core - 1.6.1 | Unable to start blueprint container for bundle org.opendaylight.hello.impl/0.1.0.SNAPSHOT
With the diag command I get this output:
opendaylight-user#root>diag
hello-impl (174)
----------------
Status: Failure
Blueprint
4/08/17 14:12
Exception:
Unable to validate xml
org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to validate xml
at org.apache.aries.blueprint.parser.Parser.validate(Parser.java:349)
at org.apache.aries.blueprint.parser.Parser.validate(Parser.java:336)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:343)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:276)
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:300)
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:269)
at org.apache.aries.blueprint.container.BlueprintExtender.access$900(BlueprintExtender.java:68)
at org.apache.aries.blueprint.container.BlueprintExtender$BlueprintContainerServiceImpl.createContainer(BlueprintExtender.java:602)
at org.opendaylight.controller.blueprint.BlueprintBundleTracker.modifiedBundle(BlueprintBundleTracker.java:178)
at org.opendaylight.controller.blueprint.BlueprintBundleTracker.addingBundle(BlueprintBundleTracker.java:159)
at org.opendaylight.controller.blueprint.BlueprintBundleTracker.addingBundle(BlueprintBundleTracker.java:51)
at org.osgi.util.tracker.BundleTracker$Tracked.customizerAdding(BundleTracker.java:467)
at org.osgi.util.tracker.BundleTracker$Tracked.customizerAdding(BundleTracker.java:414)
at org.osgi.util.tracker.AbstractTracked.trackAdding(AbstractTracked.java:256)
at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:229)
at org.osgi.util.tracker.BundleTracker$Tracked.bundleChanged(BundleTracker.java:443)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:847)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEventPrivileged(Framework.java:1568)
at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEvent(Framework.java:1504)
at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEvent(Framework.java:1499)
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:391)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390)
at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340)
Caused by: org.xml.sax.SAXParseException: cvc-complex-type.2.3: Element 'blueprint' cannot have character [children], because the type's content type is element-only.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at org.apache.xerces.util.ErrorHandlerWrapper.error(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator$XSIErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.reportSchemaError(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.elementLocallyValidComplexType(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.elementLocallyValidType(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.processElementContent(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.handleEndElement(Unknown Source)
at org.apache.xerces.impl.xs.XMLSchemaValidator.endElement(Unknown Source)
at org.apache.xerces.jaxp.validation.DOMValidatorHelper.finishNode(Unknown Source)
at org.apache.xerces.jaxp.validation.DOMValidatorHelper.validate(Unknown Source)
at org.apache.xerces.jaxp.validation.DOMValidatorHelper.validate(Unknown Source)
at org.apache.xerces.jaxp.validation.ValidatorImpl.validate(Unknown Source)
at javax.xml.validation.Validator.validate(Unknown Source)
at org.apache.aries.blueprint.parser.Parser.validate(Parser.java:346)
... 32 more
As I've said, I was following the tutorial, so my files are exactly the same as the ones in the OpenDaylight link (this is the repository I've created: GitHub).
I think it's also important to say how I generated the project. This is the command:
mvn archetype:generate -DarchetypeGroupId=org.opendaylight.controller -DarchetypeArtifactId=opendaylight-startup-archetype -DarchetypeRepository=https://nexus.opendaylight.org/content/repositories/public/ -DarchetypeCatalog=remote -DarchetypeVersion=1.2.2-Boron-SR2
In the logs you provided:
Element 'blueprint' cannot have character [children], because the type's content type is element-only.
So I think there are simply some errors (invalid XML) in your blueprint file.
If it is exactly like the one in the link you provided [here], there are some extra characters on line 19.
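For reference, a minimal impl-blueprint.xml in the shape the startup archetype generates looks roughly like the following (a sketch; the class name and odl: attributes are assumptions based on the archetype and may differ in your project). The validator rejects any stray text that appears directly between the blueprint tags, which is exactly what cvc-complex-type.2.3 is complaining about:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:odl="http://opendaylight.org/xmlns/blueprint/v1.0.0"
           odl:use-default-for-reference-types="true">

  <!-- Only child elements are allowed inside <blueprint>; loose
       characters between tags fail schema validation. -->
  <reference id="dataBroker"
             interface="org.opendaylight.controller.md.sal.binding.api.DataBroker"
             odl:type="default"/>

  <bean id="provider"
        class="org.opendaylight.hello.impl.HelloProvider"
        init-method="init" destroy-method="close">
    <argument ref="dataBroker"/>
  </bean>

</blueprint>
```

Diff your file against the tutorial's copy and delete any characters that sit outside an element.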
I have loaded JSON data into my HDFS, and I created a table with the required columns in a MySQL database as follows.
How do I create the table with a row format that accepts JSON?
My HDFS data
{
"Employees" : [
{
"userId":"rirani",
"jobTitleName":"Developer",
"firstName":"Romin",
"lastName":"Irani",
"preferredFullName":"Romin Irani",
"employeeCode":"E1",
"region":"CA",
"phoneNumber":"408-1234567",
"emailAddress":"romin.k.irani#gmail.com"
},
{
"userId":"nirani",
"jobTitleName":"Developer",
"firstName":"Neil",
"lastName":"Irani",
"preferredFullName":"Neil Irani",
"employeeCode":"E2",
"region":"CA",
"phoneNumber":"408-1111111",
"emailAddress":"neilrirani#gmail.com"
},
{
"userId":"thanks",
"jobTitleName":"Program Directory",
"firstName":"Tom",
"lastName":"Hanks",
"preferredFullName":"Tom Hanks",
"employeeCode":"E3",
"region":"CA",
"phoneNumber":"408-2222222",
"emailAddress":"tomhanks#gmail.com"
}
]
}
My SQL table structure
mysql> create table employee(userid int,jobTitleName varchar(20),firstName varchar(20),lastName varchar(20),preferrredFullName varchar(20),employeeCode varchar(20),region varchar(20),phoneNumber varchar(20), emailAddress varchar(20),modifiedDate timestamp DEFAULT CURRENT_TIMESTAMP);
mysql> desc employee;
+--------------------+-------------+------+-----+-------------------+-------+
| Field | Type | Null | Key | Default | Extra |
+--------------------+-------------+------+-----+-------------------+-------+
| userid | int(11) | YES | | NULL | |
| jobTitleName | varchar(20) | YES | | NULL | |
| firstName | varchar(20) | YES | | NULL | |
| lastName | varchar(20) | YES | | NULL | |
| preferrredFullName | varchar(20) | YES | | NULL | |
| employeeCode | varchar(20) | YES | | NULL | |
| region | varchar(20) | YES | | NULL | |
| phoneNumber | varchar(20) | YES | | NULL | |
| emailAddress | varchar(20) | YES | | NULL | |
| modifiedDate | timestamp | NO | | CURRENT_TIMESTAMP | |
+--------------------+-------------+------+-----+-------------------+-------+
10 rows in set (0.00 sec)
I am trying to load the data from HDFS into MySQL for the above table using sqoop export as follows:
sqoop export --connect jdbc:mysql://localhost/emp_scheme --username root --password adithyan --table employee --export-dir /user/adithyan/filesystem/employee.txt
It ended up with the following exception:
17/02/18 19:35:35 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/02/18 19:35:35 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/02/18 19:35:35 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/02/18 19:35:35 INFO tool.CodeGenTool: Beginning code generation
17/02/18 19:35:36 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
17/02/18 19:35:36 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
17/02/18 19:35:36 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/adithyan/hadoop_dir/hadoop-1.2.1
Note: /tmp/sqoop-adithyan/compile/35afadf151a1dd1626a3658577cbc2dd/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/18 19:35:41 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-adithyan/compile/35afadf151a1dd1626a3658577cbc2dd/employee.jar
17/02/18 19:35:41 INFO mapreduce.ExportJobBase: Beginning export of employee
17/02/18 19:35:45 INFO input.FileInputFormat: Total input paths to process : 1
17/02/18 19:35:45 INFO input.FileInputFormat: Total input paths to process : 1
17/02/18 19:35:45 INFO util.NativeCodeLoader: Loaded the native-hadoop library
17/02/18 19:35:45 WARN snappy.LoadSnappy: Snappy native library not loaded
17/02/18 19:35:46 INFO mapred.JobClient: Running job: job_201702181051_0002
17/02/18 19:35:47 INFO mapred.JobClient: map 0% reduce 0%
17/02/18 19:36:17 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '"firstName":"Tom"'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: ""firstName":"Tom""
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:569)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
... 12 more
17/02/18 19:36:18 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000001_0, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '{'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: "{"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
... 12 more
17/02/18 19:36:29 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000000_1, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '"firstName":"Tom"'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: ""firstName":"Tom""
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:569)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
... 12 more
17/02/18 19:36:29 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000001_1, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '{'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: "{"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
... 12 more
17/02/18 19:36:42 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000000_2, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '"firstName":"Tom"'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: ""firstName":"Tom""
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:569)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
... 12 more
17/02/18 19:36:42 INFO mapred.JobClient: Task Id : attempt_201702181051_0002_m_000001_2, Status : FAILED
java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: Can't parse input data: '{'
at employee.__loadFromFields(employee.java:596)
at employee.parse(employee.java:499)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException: For input string: "{"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.valueOf(Integer.java:766)
at employee.__loadFromFields(employee.java:548)
Can somebody help me with this?
You may have to look at multiple options. Sqoop does not parse JSON itself, and MySQL's JSON functions (JSON_SET/REPLACE/INSERT) are not something sqoop can invoke for you during an export.
One option is to pre-process the data (for example with Pig) into plain delimited records, stage that in HDFS, and then sqoop it into the RDBMS.
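As a sketch of that pre-processing step (a hypothetical standalone script, not part of sqoop; the field names come from your sample file), this flattens the nested Employees array into one comma-delimited line per employee, which is the row shape sqoop export expects:

```python
import csv
import io
import json

def flatten_employees(json_text):
    """Turn the nested Employees JSON into one comma-delimited
    line per employee, matching the MySQL column order."""
    fields = ["userId", "jobTitleName", "firstName", "lastName",
              "preferredFullName", "employeeCode", "region",
              "phoneNumber", "emailAddress"]
    records = json.loads(json_text)["Employees"]
    out = io.StringIO()
    writer = csv.writer(out)
    for rec in records:
        # Missing keys become empty fields rather than crashing.
        writer.writerow([rec.get(f, "") for f in fields])
    return out.getvalue()

sample = '''{"Employees": [
  {"userId": "rirani", "jobTitleName": "Developer", "firstName": "Romin",
   "lastName": "Irani", "preferredFullName": "Romin Irani",
   "employeeCode": "E1", "region": "CA", "phoneNumber": "408-1234567",
   "emailAddress": "romin.k.irani@gmail.com"}
]}'''

print(flatten_employees(sample))
```

Note also that your MySQL userid column is int(11), while the userId values in the JSON ("rirani", "nirani", ...) are strings, so that column type would need revisiting regardless of how you flatten the data.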
I am trying to run the RandomWalkWithRestart example https://github.com/apache/giraph/blob/release-1.0/giraph-examples/src/main/java/org/apache/giraph/examples/RandomWalkWithRestartVertex.java
My input data is
12 34 56
34 78
56 34 78
78 34
and I am running
hadoop jar giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar GiraphRunner -Dgiraph.zkList=<host>:port -libjars giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar
org.apache.giraph.examples.RandomWalkWithRestartComputation
-mc org.apache.giraph.examples.RandomWalkVertexMasterCompute
-wc org.apache.giraph.examples.RandomWalkWorkerContext
-vof org.apache.giraph.examples.VertexWithDoubleValueDoubleEdgeTextOutputFormat
-vif org.apache.giraph.examples.LongDoubleDoubleTextInputFormat
-vip giraph_algorithms/personalized_pr/input/graph.txt
-op giraph_algorithms/personalized_pr/out1 -w 1
But I am getting this error:
Error: java.lang.IllegalStateException: run: Caught an unrecoverable exception For input string: "PK�uE META-INF/��PKPK�uEMETA-INF/MANIFEST.MF�M��LK-.�"
at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:101)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.NumberFormatException: For input string: "PK�uE META-INF/��PKPK�uEMETA-INF/MANIFEST.MF�M��LK-.�"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:441)
at java.lang.Long.parseLong(Long.java:483)
at org.apache.giraph.examples.RandomWalkWorkerContext.initializeSources(RandomWalkWorkerContext.java:131)
at org.apache.giraph.examples.RandomWalkWorkerContext.setStaticVars(RandomWalkWorkerContext.java:160)
at org.apache.giraph.examples.RandomWalkWorkerContext.preApplication(RandomWalkWorkerContext.java:146)
at org.apache.giraph.graph.GraphTaskManager.workerContextPreApp(GraphTaskManager.java:815)
at org.apache.giraph.graph.GraphTaskManager.prepareGraphStateAndWorkerContext(GraphTaskManager.java:451)
at org.apache.giraph.graph.GraphTaskManager.execute(GraphTaskManager.java:266)
at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:91)
... 7 more
Why is it reading the manifest file when I specifically told it to read a file, not even a directory?
Because you passed the jar from the -libjars argument in the position where the vertex class file is expected, so Giraph tried to parse the jar's contents as input.
Like the other arguments, you need to say: -D libjars=your_jar.jar.
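A sketch of the corrected invocation under that assumption (host, port, and the HDFS paths are placeholders taken from the question; the backslashes keep the shell from splitting the command across lines, and the exact option names should be checked against your Giraph version):

```shell
hadoop jar giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar GiraphRunner \
    -Dgiraph.zkList=<host>:<port> \
    -D libjars=giraph-examples-1.1.0-for-hadoop-2.2.0-jar-with-dependencies.jar \
    org.apache.giraph.examples.RandomWalkWithRestartComputation \
    -mc org.apache.giraph.examples.RandomWalkVertexMasterCompute \
    -wc org.apache.giraph.examples.RandomWalkWorkerContext \
    -vof org.apache.giraph.examples.VertexWithDoubleValueDoubleEdgeTextOutputFormat \
    -vif org.apache.giraph.examples.LongDoubleDoubleTextInputFormat \
    -vip giraph_algorithms/personalized_pr/input/graph.txt \
    -op giraph_algorithms/personalized_pr/out1 -w 1
```

This way the first positional argument after the options is the computation class, not the jar.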