I have this sshagent code block:
sshagent(['ssh_key.hashed']) {
sh """
ssh -o StrictHostKeyChecking=no -l user example.com <<EOF
today=`date +%Y-%m-%d`
drush -r /var/www/html/example.com sql-dump --gzip > /var/www/html/example.com/backups/example_prodDB-jenkins-${today}.sql.gz
EOF
""".stripIndent()
}
As you can see, the real intention is to take a DB dump of a Drupal database.
This works when run as a regular shell script.
I need to write it this way because I will reuse the $today shell variable in another line of code.
Within the Jenkinsfile, however, ${today} seems to be interpreted as a Groovy variable, based on the error:
groovy.lang.MissingPropertyException: No such property: today for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:270)
at org.kohsuke.groovy.sandbox.impl.Checker$7.call(Checker.java:353)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:357)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:333)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:333)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:333)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:29)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at WorkflowScript.run(WorkflowScript:11)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.get(PropertyishBlock.java:74)
at com.cloudbees.groovy.cps.LValueBlock$GetAdapter.receive(LValueBlock.java:30)
at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.fixName(PropertyishBlock.java:66)
at jdk.internal.reflect.GeneratedMethodAccessor305.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
at com.cloudbees.groovy.cps.Next.step(Next.java:83)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:51)
at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:185)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:400)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$400(CpsThreadGroup.java:96)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:312)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:276)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:67)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Finished: FAILURE
Any hints are appreciated.
Thanks.
After a lot of searching online and some trial and error, I found this helpful reference: https://code-maven.com/jenkins-pipeline-environment-variables
I don't know if it's the right way to do it, but it works.
In summary, I had to define an environment variable and use it within my sshagent block.
Posting a sample Jenkinsfile that works:
pipeline {
    agent any
    stages {
        stage('Dump Prod DB') {
            environment {
                today = sh(script: 'date +%Y-%m-%d', returnStdout: true).trim()
            }
            steps {
                sshagent(['promet_key.hashed']) {
                    sh """
ssh -o StrictHostKeyChecking=no -l user example.com <<EOF
drush -r /var/www/html/example.com sql-dump --gzip > /var/www/html/example.com/backups/example_prodDB-jenkins-${today}.sql.gz
EOF
""".stripIndent()
                }
            }
        }
    }
}
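Because today is defined in the stage's environment block, Jenkins exports it to every sh step in that stage, so the same value can be reused later. A minimal hypothetical sketch (the echo step below is mine, purely to illustrate the reuse):

steps {
    // Single-quoted Groovy string: Groovy does not interpolate ${today},
    // so the shell reads the exported 'today' environment variable instead.
    sh 'echo "Backup written: example_prodDB-jenkins-${today}.sql.gz"'
}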
Related
I have a Django project with some files that have Cyrillic characters in their names. SonarQube gives this error:
java.nio.file.InvalidPathException: Malformed input or input contains unmappable characters: src/apps/orders/sap_data_examples/ZLO_CANCEL_ORDERS/?????????? ?????????????? ??????????????.json
at java.base/sun.nio.fs.UnixPath.encode(Unknown Source)
at java.base/sun.nio.fs.UnixPath.<init>(Unknown Source)
at java.base/sun.nio.fs.UnixFileSystem.getPath(Unknown Source)
at java.base/java.nio.file.Path.resolve(Unknown Source)
at org.sonar.scm.git.IncludedFilesRepository.indexFiles(IncludedFilesRepository.java:65)
at org.sonar.scm.git.IncludedFilesRepository.<init>(IncludedFilesRepository.java:40)
at org.sonar.scm.git.GitIgnoreCommand.init(GitIgnoreCommand.java:37)
at org.sonar.scanner.scan.filesystem.ProjectFileIndexer.index(ProjectFileIndexer.java:104)
at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:352)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:137)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:123)
at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:150)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:137)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:123)
at org.sonar.batch.bootstrapper.Batch.doExecute(Batch.java:72)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:66)
at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:46)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.base/java.lang.reflect.Method.invoke(Unknown Source)
at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
at com.sun.proxy.$Proxy0.execute(Unknown Source)
at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:189)
at org.sonarsource.scanner.api.EmbeddedScanner.execute(EmbeddedScanner.java:138)
at org.sonarsource.scanner.cli.Main.execute(Main.java:112)
at org.sonarsource.scanner.cli.Main.execute(Main.java:75)
at org.sonarsource.scanner.cli.Main.main(Main.java:61)
SonarQube in gitlab-ci is being started as follows:
analyze:
  stage: analyze
  needs: ["tests"]
  dependencies:
    - tests
  image:
    name: docker-sre.site.ru/sonarqube/scanner:${SONAR_TAG}
    entrypoint: [""]
  before_script:
    - echo code analysis will be executed on ${SONAR_HOST}
  script:
    - /entrypoint.sh ${SONAR_HOST} ${SONAR_LOGIN} -X
  allow_failure: true
  artifacts:
    when: always
    reports:
      junit: report.xml
      coverage_report:
        coverage_format: cobertura
        path: coverage.xml
I suppose the error is related to the Cyrillic characters in the file names. If so, how can I fix this? Maybe by specifying an encoding somehow?
Please help me resolve this issue.
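One direction that might help (an assumption on my part, not a verified fix): the scanner JVM may be running with a non-UTF-8 default locale, so it cannot map the Cyrillic file names. You could try forcing a UTF-8 locale and file encoding in the job; LANG/LC_ALL and SONAR_SCANNER_OPTS are standard names, but whether the custom /entrypoint.sh passes them through is an assumption:

analyze:
  variables:
    LANG: "C.UTF-8"        # force a UTF-8 locale inside the container
    LC_ALL: "C.UTF-8"
    SONAR_SCANNER_OPTS: "-Dfile.encoding=UTF-8"   # JVM flags for the scanner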
I am getting an issue while executing the commands below under steps in my Jenkinsfile.
steps {
    sh "curl -s -L https://github.com/stedolan/jq/releases/download/jq-1.5/jq-linux64 > ./jq"
    sh "chmod +x ./jq"
    sh "curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' --header 'X-API-Token: xxxxxxxx' 'https://api.appcenter.ms/v0.1/apps/raghu/${app_name}/release_uploads' >> output_file.json"
    sh "cat output_file.json"
    script {
        upload_id=$(cat output_file.json | jq -r '.upload_id')
        echo "${upload_id}"
    }
}
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: output for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:264)
at org.kohsuke.groovy.sandbox.impl.Checker$6.call(Checker.java:289)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:293)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:269)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:269)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:269)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:29)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at WorkflowScript.run(WorkflowScript:25)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.get(PropertyishBlock.java:74)
at com.cloudbees.groovy.cps.LValueBlock$GetAdapter.receive(LValueBlock.java:30)
at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.fixName(PropertyishBlock.java:66)
at sun.reflect.GeneratedMethodAccessor8670.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
at com.cloudbees.groovy.cps.Next.step(Next.java:83)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$101(SandboxContinuable.java:34)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.lambda$run0$0(SandboxContinuable.java:59)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:136)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:58)
at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:174)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:347)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$200(CpsThreadGroup.java:93)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:259)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:247)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Finished: FAILURE
script{
sh """
upload_id=\$(cat output_file.json | jq -r '.upload_id')
echo "\${upload_id}"
"""
}
You need to modify the script block as above and also escape the $ signs, so that the shell rather than Groovy expands them; otherwise Groovy tries to resolve ${...} as a pipeline variable, which is what causes the MissingPropertyException. I don't know where you are using upload_id or where you are getting release_url, though.
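If you need upload_id back as a Groovy variable (for example, to build a release_url later), one option is sh with returnStdout; a minimal sketch, assuming ./jq and output_file.json are in the workspace:

script {
    // Capture the shell command's stdout into a Groovy variable
    def uploadId = sh(script: "./jq -r '.upload_id' output_file.json", returnStdout: true).trim()
    echo "upload_id is ${uploadId}"
}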
RUN THIS COMMAND
/bin/nifi.sh stateless RunFromRegistry Once --file ./test/stateless_test1.json
LOG
Note: Use of this command is considered experimental. The commands and approach used may change from time to time.
Java home (JAVA_HOME): /home/deltaman/software/jdk1.8.0_211
Java options (STATELESS_JAVA_OPTS): -Xms1024m -Xmx1024m
13:48:39.835 [main] INFO org.apache.nifi.StatelessNiFi - Unpacking 100 NARs
13:50:51.513 [main] INFO org.apache.nifi.StatelessNiFi - Finished unpacking 100 NARs in 131671 millis
Exception in thread "main" java.lang.reflect.InvocationTargetException
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.StatelessNiFi.main(StatelessNiFi.java:103)
... 5 more
Caused by: java.nio.file.NoSuchFileException: ./test/stateless_test1.json
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.Files.readAllBytes(Files.java:3152)
at org.apache.nifi.stateless.runtimes.Program.runLocal(Program.java:119)
at org.apache.nifi.stateless.runtimes.Program.launch(Program.java:67)
... 10 more
It seems the file does not exist, but I can find the file as follows:
$ cat ./test/stateless_test1.json
{
  "registryUrl": "http://10.148.123.12:9991",
  "bucketId": "ec1b291e-c3f1-437c-a4e4-c069bd2f6ed1",
  "flowId": "b1f73fe8-2874-47a5-970c-6b25eea19497",
  "parameters": {
    "text" : "xixixixi"
  }
}
I don't know what the problem is. Any suggestion is appreciated!
/bin/nifi.sh stateless RunFromRegistry Once --file ./test/stateless_test1.json
Here --file is given a relative path; it must be a full path, since the path is likely resolved against the NiFi process's working directory rather than the directory you launched the command from. For example:
/home/NiFi/nifi-1.10.0/bin/nifi.sh stateless RunFromRegistry Once --file /home/NiFi/nifi-1.10.0/test/stateless_test1.json
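If you would rather not hard-code the location, one alternative (a sketch, assuming a POSIX shell and that you run it from the directory containing test/) is to let the shell build the absolute path:

/home/NiFi/nifi-1.10.0/bin/nifi.sh stateless RunFromRegistry Once --file "$(pwd)/test/stateless_test1.json"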
For example, I have two Hive jobs, where the output of one job is used as an argument/variable in the second job. I can successfully run the following command on the terminal to get my result on the master node of the EMR cluster.
[hadoop@ip-10-6-131-223 ~]$ hive -f s3://MyProjectXYZ/bin/GetNewJobDetails_SelectAndOverwrite.hql --hivevar LatestLastUpdated=$(hive -f s3://MyProjectXYZ/bin/GetNewJobDetails_LatestLastUpdated.hql)
However, it seems I cannot add a Hive step that runs GetNewJobDetails_SelectAndOverwrite.hql with the Arguments textbox set to --hivevar LatestLastUpdated=$(hive -f s3://MyProjectXYZ/bin/GetNewJobDetails_LatestLastUpdated.hql).
The error is:
Details : FAILED: ParseException line 7:61 cannot recognize input near '$' '(' 'hive' in expression specification
JAR location : command-runner.jar
Main class : None
Arguments : hive-script --run-hive-script --args -f s3://MyProjectXYZ/bin/GetNewJobDetails_SelectAndOverwrite.hql --hivevar LatestLastUpdated=$(hive -f s3://MyProjectXYZ/bin/GetNewJobDetails_LatestLastUpdated.hql)
Action on failure: Cancel and wait
I also tried it with command-runner.jar to run the first hive command. It still does not work:
NoViableAltException(15@[412:1: atomExpression : ( constant | ( intervalExpression )=> intervalExpression | castExpression | extractExpression | floorExpression | caseExpression | whenExpression | ( subQueryExpression )=> ( subQueryExpression ) -> ^( TOK_SUBQUERY_EXPR TOK_SUBQUERY_OP subQueryExpression ) | ( function )=> function | tableOrColumn | expressionsInParenthesis[true] );])
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser$DFA36.specialStateTransition(HiveParser_IdentifiersParser.java:31808)
at org.antlr.runtime.DFA.predict(DFA.java:80)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.atomExpression(HiveParser_IdentifiersParser.java:6746)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceFieldExpression(HiveParser_IdentifiersParser.java:6988)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceUnaryPrefixExpression(HiveParser_IdentifiersParser.java:7324)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceUnarySuffixExpression(HiveParser_IdentifiersParser.java:7380)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceBitwiseXorExpression(HiveParser_IdentifiersParser.java:7542)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceStarExpression(HiveParser_IdentifiersParser.java:7685)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedencePlusExpression(HiveParser_IdentifiersParser.java:7828)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceConcatenateExpression(HiveParser_IdentifiersParser.java:7967)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceAmpersandExpression(HiveParser_IdentifiersParser.java:8177)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceBitwiseOrExpression(HiveParser_IdentifiersParser.java:8314)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceSimilarExpressionPart(HiveParser_IdentifiersParser.java:8943)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceSimilarExpressionMain(HiveParser_IdentifiersParser.java:8816)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceSimilarExpression(HiveParser_IdentifiersParser.java:8697)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceEqualExpression(HiveParser_IdentifiersParser.java:9537)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceNotExpression(HiveParser_IdentifiersParser.java:9703)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceAndExpression(HiveParser_IdentifiersParser.java:9812)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceOrExpression(HiveParser_IdentifiersParser.java:9953)
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.expression(HiveParser_IdentifiersParser.java:6686)
at org.apache.hadoop.hive.ql.parse.HiveParser.expression(HiveParser.java:42062)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.searchCondition(HiveParser_FromClauseParser.java:6446)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.whereClause(HiveParser_FromClauseParser.java:6364)
at org.apache.hadoop.hive.ql.parse.HiveParser.whereClause(HiveParser.java:41844)
at org.apache.hadoop.hive.ql.parse.HiveParser.atomSelectStatement(HiveParser.java:36755)
at org.apache.hadoop.hive.ql.parse.HiveParser.selectStatement(HiveParser.java:36987)
at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:36504)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:35822)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:35710)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2284)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1333)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:208)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:77)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:70)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
FAILED: ParseException line 7:61 cannot recognize input near '$' '(' 'hive' in expression specification
You should execute the two Hive commands as two different steps on the EMR cluster. Also, the arguments should be passed as a list instead of a string: you can split your Hive command on spaces (' '), which returns a list, and pass that list as the arguments of the EMR step.
Reference : https://docs.aws.amazon.com/cli/latest/reference/emr/add-steps.html
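As a sketch with the AWS CLI (the cluster ID is a placeholder, and <value-from-step-1> stands for the output you capture from the first step with whatever orchestration you use):

# Step 1: run the query whose output you need, as its own step
aws emr add-steps --cluster-id j-XXXXXXXXXXXXX --steps \
  Type=HIVE,Name=GetLatestLastUpdated,ActionOnFailure=CONTINUE,Args=[-f,s3://MyProjectXYZ/bin/GetNewJobDetails_LatestLastUpdated.hql]
# Step 2: pass the captured value in explicitly; note each argument is a separate list element
aws emr add-steps --cluster-id j-XXXXXXXXXXXXX --steps \
  Type=HIVE,Name=SelectAndOverwrite,ActionOnFailure=CONTINUE,Args=[-f,s3://MyProjectXYZ/bin/GetNewJobDetails_SelectAndOverwrite.hql,--hivevar,LatestLastUpdated=<value-from-step-1>]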
I use the Naive Bayes classifier in Mahout to do classification.
After I trained the model, I used one document to test it.
I transferred the data into vectors:
./bin/mahout seqdirectory -i /home/d/mahoutTest/ -o /home/d/seqMahout1
./bin/mahout seq2sparse -i /home/d/seqMahout1/ -o /home/d/vecMahout1 -lnorm -nv -ow -wt tfidf
But when I try to test the data, there is an error:
./bin/mahout testnb -i /home/d/vecMahout1/tfidf-vectors/ -m /tmp/mahout-work-d/model/ -l /tmp/mahout-work-d/labelindex -o /home/d/out10/
SLF4J: Failed toString() invocation on an object of type [org.apache.mahout.classifier.ResultAnalyzer]
org.apache.commons.math3.exception.MathIllegalArgumentException: weigth array must contain at least one non-zero value
at org.apache.commons.math3.stat.descriptive.AbstractUnivariateStatistic.test(AbstractUnivariateStatistic.java:309)
at org.apache.commons.math3.stat.descriptive.AbstractUnivariateStatistic.test(AbstractUnivariateStatistic.java:245)
at org.apache.commons.math3.stat.descriptive.moment.Mean.evaluate(Mean.java:211)
at org.apache.commons.math3.stat.descriptive.moment.Mean.evaluate(Mean.java:254)
at org.apache.mahout.classifier.ConfusionMatrix.getWeightedPrecision(ConfusionMatrix.java:143)
at org.apache.mahout.classifier.ResultAnalyzer.toString(ResultAnalyzer.java:114)
at org.slf4j.helpers.MessageFormatter.safeObjectAppend(MessageFormatter.java:304)
at org.slf4j.helpers.MessageFormatter.deeplyAppendParameter(MessageFormatter.java:276)
at org.slf4j.helpers.MessageFormatter.arrayFormat(MessageFormatter.java:230)
at org.slf4j.helpers.MessageFormatter.format(MessageFormatter.java:152)
at org.slf4j.impl.Log4jLoggerAdapter.info(Log4jLoggerAdapter.java:345)
at org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver.run(TestNaiveBayesDriver.java:107)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver.main(TestNaiveBayesDriver.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
This can happen if the "true" label for your test item is not in the training set. Do you see a warning like:
15/07/07 15:20:12 WARN ConfusionMatrix: Label YOUR TEST LABEL HERE did not appear in the training examples