In my Ruby code, I have to search lines from a file and then process them. Below is the code that runs zgrep on the file and then processes the output. The code is not running; I get the error below every time.
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(20):INFO:{"scanid"=>"110718", "rawrev"=>"SC_FLD_IODB0", "srchpattern"=>".*pie_mcawrp.FatalErrDetected", "limit"=>"20"}
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(23):INFO:zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(45):ERROR:problem in signal retrieval command: zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz, error : undefined method `flush' for ScanView::ReportScriptingHelper::ScanViewScript, searchedline :
Here is my code:
items["srchpattern"] = items["srchpattern"].gsub("?", ".*")
$logger.info("#{items}")
command = "zgrep -i -m" + items["limit"] + ' "' + items["srchpattern"] + '" ' + $ascii_file_path + '/' + items["scanid"] +'.txt.gz'
#command='zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz'
$logger.info("#{command}")
ll = `#{command}`
$logger.info("found lines : #{ll}")
searchedlines = ll.split("\n")
signals = []
for line in searchedlines do
  tokens = line.split(",")
  signals.push({
    "signalNameString": tokens[0],
    "signalHexValueString": tokens[1],
    "signalDecimalValue": tokens[1].to_i(16)
  })
end
return JSON.dump({
  "statusmsg": "passed",
  "data": {
    "scanOrSystemIdString": items["scanid"],
    "rtlSignalDatApisList": signals
  }
})
This code is executed inside the instant_eval; if run outside of instant_eval, it runs fine. It also runs fine on Windows, but it creates problems on Linux.
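For debugging, one option is to run the same command through Open3 from the Ruby standard library instead of backticks, so that stderr and the exit status are captured as well. This is only a minimal sketch, assuming the same command string built above:
require 'open3'
# Capture stdout, stderr and the exit status explicitly instead of relying on backticks.
# Note: zgrep/grep exit non-zero when nothing matches, so log rather than raise here.
stdout, stderr, status = Open3.capture3(command)
$logger.info("zgrep exit status: #{status.exitstatus}, stderr: #{stderr}") unless status.success?
searchedlines = stdout.split("\n")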
Related
So in my Jenkins pipeline I run a couple of curl commands across different stages. I store the output of Stage1 in a file, and for every item in that list I run another curl command and use its output to extract some values using jq.
However, from the second stage I can't seem to store the jq-extracted values in variables to echo them later. What am I doing wrong?
{Stage1}
.
.
.
{Stage2}
def lines = stageOneList.readLines()
lines.each { line ->
    println line
    stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
    pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}' ")
    pfKey = sh(script: "jq -r '.component.key' <<< '${stageTwoList}' ")
    echo "Component Names and Keys\n | $pfName | $pfKey |"
}
In the end, Stage2 returns:
[Pipeline] sh
+ jq -r .component.name
digital-hot-wallet-gateway
[Pipeline] sh
+ jq -r .component.key
dhwg
[Pipeline] echo
Component Names and Keys
| null | null |
Any help in the right direction appreciated!
You passed true for the returnStdout argument of the sh step when assigning stageTwoList, but then forgot to use the same argument for the next two variable assignments that parse the JSON:
def lines = stageOneList.readLines()
lines.each { line ->
    println line
    stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
    pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}' ", returnStdout: true)
    pfKey = sh(script: "jq -r '.component.key' <<< '${stageTwoList}' ", returnStdout: true)
    echo "Component Names and Keys\n | $pfName | $pfKey |"
}
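One more thing to watch for: sh(returnStdout: true) returns the command output with a trailing newline, so you may also want to call .trim() on the two jq results before echoing them, e.g. pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}'", returnStdout: true).trim().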
Note you can also make this much easier on yourself by doing the JSON parsing natively in Groovy and with Jenkins Pipeline step methods:
String stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
Map stageTwoListData = readJSON(text: stageTwoList)
pfName = stageTwoListData['component']['name']
pfKey = stageTwoListData['component']['key']
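Note that readJSON comes from the Pipeline Utility Steps plugin, so that plugin needs to be installed for the native parsing approach to work.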
I am trying to execute a shell script on a Windows node using Jenkins.
The bash script uses the sort -u flag in one of its steps to filter out unique elements from an existing array:
list_unique=($(echo "${list[@]}" | tr ' ' '\n' | sort -u | tr '\n' ' '))
Note: the shebang used in the script is #!/bin/bash.
On calling the script from the command prompt as bash test.sh $arg1, I got the following error:
-uThe system cannot find the file specified.
I understand the issue was that, with the above call, Windows' sort.exe was being used and not the Unix sort command. To get around this, I changed the Path variable in the Windows system variables and moved \cygwin\bin ahead of \Windows\System32.
This fixed the issue and the above call gave me the expected results.
However, when the same script is called on this node using Jenkins, I get the same error again:
-uThe system cannot find the file specified.
Jenkins stage calling the script
stage("Run Test") {
options {
timeout(time: 5, unit: 'MINUTES')
}
steps {
script {
if(fileExists("${Test_dir}")){
dir("${Test_dir}"){
if(fileExists("test.sh")){
def command = 'bash test.sh ${env.arg1}'
env.output = sh(returnStdout: true , script : "${command}").trim()
if (env.output == "Invalid"){
def err_msg = "Error Found."
sh "echo -n '" + err_msg + " ' > ${ERR_MSG_FILE}"
error(err_msg)
}
sh "echo Running tests for ${env.output}"
}
}
}
}
}
}
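One workaround that might be worth trying is forcing the PATH precedence inside the stage itself with withEnv instead of relying on the node's global settings. A sketch, untested, assuming cygwin is installed under C:\cygwin\bin:
// Hypothetical: prepend cygwin's bin dir so its sort wins over \Windows\System32\sort.exe
withEnv(['PATH+CYGWIN=C:\\cygwin\\bin']) {
    env.output = sh(returnStdout: true, script: "bash test.sh ${env.arg1}").trim()
}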
Kindly Help
I'm trying to write an npm script that will execute an ssh shell command. Currently it's working by executing an osascript command to open a Terminal window and run the command.
I'd like to change this to execute the command in the current terminal. The script includes both shelljs and executive. The script ends without anything happening when I use executive. With shelljs I get:
Pseudo-terminal will not be allocated because stdin is not a terminal.
the input device is not a TTY
The command being executed is: ssh -i [ssh-key] -t ubuntu@[ip-address] eval $(base64 -D <<< [command in base64])
The base64-encoded command is sudo docker exec -i -t $(sudo docker ps -aqf "ancestor=' + containerName + '") /bin/bash
If I output the command and copy and paste it, it will work as expected, sshing into a remote machine and running a docker exec command.
If I remove the -t option I don't get the warning messages but there's no output in the console and the script hangs (I assume it's running the command in the background with no output). If I remove the eval ... part I get an output that looks like what you'd see when sshing into a server but without the input terminal.
What can I do to execute this command in the same terminal or in a new tab? If I have to use an osascript command to do this, that's fine as well. I'll be executing this command from the terminal in PhpStorm, though.
Edit
Here's the block of code:
var execCommand = 'sudo docker exec -i -t $(sudo docker ps -aqf "ancestor=nginx") /bin/bash';
var buffer = new Buffer(execCommand);
var encoded = buffer.toString('base64');
var cmd = "ssh -i " + this.keyPath + " -t ubuntu#" + ip + " eval $(base64 -D <<< " + encoded + ") ";
shell.exec(cmd);
Edit 2
I can ssh into the machine successfully and get a command prompt, but now I'm getting a "the input device is not a TTY" error when I add the eval command.
var docker_exec = 'sudo docker exec -it $(sudo docker ps -aqf "ancestor=' + containerName + '") /bin/bash';
var encoded = new Buffer(docker_exec).toString('base64');
var sshTerm = spawn('ssh', [
  '-i',
  this.keyPath,
  'ubuntu@' + ip,
  'eval',
  'eval $(base64 -D <<< ' + encoded + ')'
], {
  stdio: 'inherit',
  shell: true
});
sshTerm.on('exit', function() {
  process.exit(0);
});
I checked, and shelljs.exec is good for non-TTY commands. For TTY-based commands you can use the normal spawn method. Below is sample code that works great for me:
var spawn = require('child_process').spawn;
var sshTerm = spawn('ssh', ["vagrant@192.168.33.100", ""], {
  stdio: 'inherit'
});
// listen for the 'exit' event
// which fires when the process exits
sshTerm.on('exit', function(code, signal) {
  if (code === 0) {
    // process completed successfully
  } else {
    // handle error
  }
});
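For the docker exec case from the question, the same spawn approach could presumably be used without the base64 round-trip, since the remote shell can expand the $(...) itself, and -tt forces pseudo-terminal allocation even when stdin is not a terminal. A sketch, untested, using the question's containerName, keyPath and ip values:
var spawn = require('child_process').spawn;
// Command to run on the remote host; the $(...) is expanded by the remote shell.
var dockerExec = 'sudo docker exec -it $(sudo docker ps -aqf "ancestor=' + containerName + '") /bin/bash';
var sshTerm = spawn('ssh', [
  '-tt',            // force TTY allocation so docker exec -it gets a terminal
  '-i', keyPath,
  'ubuntu@' + ip,
  dockerExec
], {
  stdio: 'inherit'  // attach the current terminal's stdin/stdout/stderr
});
sshTerm.on('exit', function() {
  process.exit(0);
});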
I would like to get the last build output in a Jenkins pipeline job and attach it to an email (using the emailext plugin). curl works fine and gives the proper build output, but I can't store it in a variable to attach to the email. I'm using the latest Jenkins version.
I can see there are a couple of related posts about a simple sh command, but that doesn't work for storing the curl response.
Tried code:
1.
def consoleOutput = sh(returnStdout: true, script: 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '@' + jenkinsServer + ':8080/job/' + 'myJob/lastBuild/consoleText').trim()
echo consoleOutput
2.
sh 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '@' + jenkinsServer + ':8080/job/' + "${env.JOB_NAME}" + '/lastBuild/consoleText; echo $? > status'
def consoleOutput = readFile('status').trim()
3.
def consoleOutput = sh(script: 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '@' + jenkinsServer + ':8080/job/' + '/myJob/lastBuild/consoleText', returnStatus: true).split("\r?\n")
echo consoleOutput
It looks like you're missing the inner array, some double quotes and escaped double quotes for running the script, and the returnStdout argument (without it the sh step returns nothing to trim):
sh([script: "curl \"http://${jenkinsUser}:${jenkinsUserToken}@${jenkinsServer}:8080/job/myJob/lastBuild/consoleText\"", returnStdout: true]).trim()
Also, there are multiple ways to run shell scripts, and it depends on the type of Jenkins pipeline you are using.
In a Jenkins declarative pipeline you need to include a script {...} block for all script-type code and variable setting, and that would look like this:
pipeline {
    agent {
        ...
    }
    parameters {
        ...
    }
    environment {
        ...
    }
    stages {
        stage('Run Required Scripts') {
            steps {
                ...
                script {
                    NOTIFIER_BUILD_NAME = sh([script: "./getNotifier.sh", returnStdout: true]).trim()
                    EMAIL_TEXT = sh([script: "./printEmailText.sh ${CURRENT_BUILD} ${PREVIOUS_BUILD}", returnStdout: true]).trim()
                    BODY = sh([script: "curl \"http://${jenkinsUser}:${jenkinsUserToken}@${jenkinsServer}:8080/job/myJob/lastBuild/consoleText\"", returnStdout: true]).trim()
                }
            }
        }
        stage('Send Email') {
            when {
                expression {
                    // Only send when there is text.
                    "${EMAIL_TEXT}" != "";
                }
            }
            steps {
                emailext(
                    to: 'software@company.com',
                    subject: "You have mail - ${EMAIL_TEXT}",
                    body: """${NOTIFIER_BUILD_NAME} - ${EMAIL_TEXT}:
...
${BODY}
""",
                    attachLog: false
                )
            }
        }
    }
}
In a Jenkins scripted pipeline, you don't need a script{} block; you can actually put this in most places. Mostly I've put it in stage blocks stage('some stage'){...}, and I've done it like this:
V5_DIR = WORKSPACE + '/' + sh([script: "basename ${V5_GIT_URL} .git", returnStdout: true]).trim()
Although I've also used curl commands (for scripted pipelines) and didn't need the inner array...
lastSuccessfulCommit = sh(
    script: "curl -sL --user ${JENKINS_API_USER}:${JENKINS_API_PSW} \"${lastSuccessfulCommitUrl}\" | sed -e 's/<[^>]*>//g'",
    returnStdout: true
)
And for reference, echoing vars looks like this in both
sh([script: "echo \"Value: ${someVariable}\""])
Hopefully this documentation helps a bit too, but I know Jenkins documentation has been pretty spotty recently, so I also found a great gist about what not to do in Jenkins declarative pipelines. Good luck!
My build script has the following task:
task editProjectArtificat (type:Exec) {
    executable "sed"
    args "-e '/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert' < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
gradle build fails when the above task executes, with this error message:
sed: 1: " '/myInsertionPattern/r ...": invalid command code
FAILURE: Build failed with an exception.
What went wrong:
Execution failed for task ':MyProject:editProjectArtificat'.
Process 'command 'sed'' finished with non-zero exit value 1
However, when I change the build.gradle script to make the task look like this:
task editProjectArtificat (type:Exec) {
    executable "sed"
    args "-e /myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
Now that both of the single quotes are removed from the args line, we no longer get Gradle build errors; however, sed does not produce the MyChangedOutputFile file as expected when the Gradle build finishes.
Typing the sed command with both single quotes in a shell produces the expected output; sed fails when the single quotes are removed in the shell. My understanding is that sed needs the single quotes around the matching pattern and commands.
I don't know Gradle, but it seems like args needs a list with each argument separated. However, you are using redirection (< and >), and that has to be done by the shell, so you shouldn't be executing sed but bash. You want to have something like bash -c "sed -e '/.../r ...' < ... > ...", so something like this might work:
task editProjectArtificat (type:Exec) {
    executable "bash"
    args "-c", "sed -e '/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert' < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
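Alternatively, if you'd rather avoid involving a shell at all, Gradle's Exec task can take the arguments as a separate list and handle the output redirection itself through its standardOutput property. A sketch, untested, using the same paths as above:
task editProjectArtificat (type:Exec) {
    executable "sed"
    // Pass the sed expression and the input file as separate arguments, so no shell quoting is needed.
    args("-e",
         "/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert",
         projectDir.toString() + "/build/scripts/MyOriginalInputFile")
    // Let Gradle capture stdout instead of using shell redirection.
    standardOutput = new FileOutputStream(projectDir.toString() + "/build/scripts/MyChangedOutputFile")
}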