Chainlink Node - Job Run: requestData: while unmarshalling JSON: invalid character '$' looking for beginning of value - bad input for task - chainlink

It appears that the fetch task is not working. I haven't found any bugs, and I've compared this task to a job that worked.
Fetch task that succeeded ->
fetch [type=bridge name="armanino-trust-token" requestData="{\"id\": $(jobSpec.externalJobID), \"data\": { \"tokenName\": $(decode_cbor.tokenName)}}"]
Fetch task that is failing ->
fetch [type=bridge name="rasp-pi-cpu" requestData="{\"id\": $(jobSpec.externalJobID), \"data\": { \"pi-temp\": $(decode_cbor.pi-temp)}}"]
This is the error that appears in my logs ->
requestData: while unmarshalling JSON: invalid character '$' looking for beginning of value; js: {"id": { "__chainlink_key_path__": "jobSpec.externalJobID" }, "data": { "pi-temp": $(decode_cbor.pi-temp)}}: bad input for task

Dashes are not allowed in pipeline variable names. Replace pi-temp with pi_temp and it should work.
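For example, a minimal corrected version of the failing task (a sketch: it assumes the on-chain request and the decode_cbor output are renamed so the decoded field is pi_temp, and that the bridge accepts the same payload shape):
fetch [type=bridge name="rasp-pi-cpu" requestData="{\"id\": $(jobSpec.externalJobID), \"data\": { \"pi_temp\": $(decode_cbor.pi_temp)}}"]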

Related

Access HTTP response data saved in a variable with colon

I'm using Google Cloud Workflows to call, via http.get, a Cloud Run app that returns an XML document converted to JSON. The JSON below is returned successfully to the workflow in step 2, with the converted XML in the body:
{
  "body": {
    "ResponseMessage": {
      "@xmlns": "http://someurl.com/services",
      "Response": {
        "@xmlns:a": "http://someurl.com/data",
        "@xmlns:i": "http://www.w3.org/2001/XMLSchema-instance",
        "a:ReferenceNumber": {
          "@i:nil": "true"
        },
        "a:DateTime": "2023-01-01T00:17:38+0000",
        "a:TransactionId": "154200432",
        "a:Environment": "Development",
        "a:RequestDateTime": "2023-01-01T11:17:39"
      }
    }
  },
  "code": 200,
  "headers": {
    "Alt-Svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"",
    "Content-Length": "1601",
    "Content-Type": "application/json",
    "Date": "Sun, 01 Jan 2023 00:17:39 GMT",
    "Server": "Google Frontend",
    "X-Cloud-Trace-Context": "931754ab82102397eb07775171485850"
  }
}
The full YAML of the workflow is below; without step3/step4 it works. In step3 I try to access an element of the JSON returned from step2, as per https://cloud.google.com/workflows/docs/http-requests#access-data
main:
  params: [args]
  steps:
    - step1:
        assign:
          - searchURL: ${"https://myfunction.a.run.app/search/" + args.type + "/" + args.serial}
    - step2:
        call: http.get
        args:
          url: ${searchURL}
        result: s2result
    - step3:
        assign:
          - resubmitURL: '${https://myfunction.a.run.app/resubmit/" + ${s2result.body.ResponseMessage.Response.a:TransactionId} }'
    - step4:
        call: http.get
        args:
          url: ${resubmitURL}
        result: s4result
    - returnOutput:
        return: ${s4result}
However, due to the colon (:) in the field I'm trying to access, there are YAML parsing errors when I attempt to save another variable assignment. How can I access HTTP response data saved in a variable when there are colons in the property name?
The errors in the console are similar to:
Could not deploy workflow: main.yaml:14:25: parse error: in workflow 'main', step 'step3': token recognition error at: ':'
- resubmitURL: '${"https://myfunction.a.run.app/resubmit/" + ${s2result.body.ResponseMessage.Response.a:TransactionId}'
^
main.yaml:14:25: parse error: in workflow 'main', step 'step3': mismatched input '+' expecting {'[', LOGICAL_NOT, '(', '-', TRUE, FALSE, NULL, NUM_FLOAT, NUM_INT, STRING, IDENTIFIER}
- resubmitURL: '${"https://myfunction.a.run.app/resubmit/" + ${s2result.body.ResponseMessage.Response.a:TransactionId}'
Two techniques are required to reference map keys with special characters like this:
As recommended in the documentation, all expressions should be wrapped in single quotes to avoid YAML parsing errors (i.e. '${...}').
When referencing keys with special characters, you can use array notation to wrap the key name in quotes (i.e. var["KEY"]).
Together, it looks like this:
main:
  steps:
    - init:
        assign:
          - var:
              key:
                "co:lon": bar
    - returnOutput:
        return: '${"foo" + var.key["co:lon"]}'
In your code you are using an expression inside an expression (note the nested ${ ... }):
- resubmitURL: '${"https://myfunction.a.run.app/resubmit/" + ${s2result.body.ResponseMessage.Response.a:TransactionId}'
In this sample from your error message, you're not even closing the expressions correctly.
If you pack everything into one expression and use the hint from Kris with the key, it should deploy:
- resubmitURL: '${"https://myfunction.a.run.app/resubmit/" + s2result.body.ResponseMessage.Response["a:TransactionId"]}'
Here is my full test case:
main:
  params: [args]
  steps:
    - init_assign:
        assign:
          - input: ${args}
          - s2result:
              body:
                ResponseMessage:
                  Response:
                    "a:TransactionId": "Test"
          - resubmitURL: '${"https://myfunction.a.run.app/resubmit/" + s2result.body.ResponseMessage.Response["a:TransactionId"]}'
    - log1:
        call: sys.log
        args:
          text: ${resubmitURL}
          severity: INFO
With the log: 'https://myfunction.a.run.app/resubmit/Test'

Can a logstash filter error be forwarded to elastic?

I'm having these json parsing errors from time to time:
[2022-01-07T12:15:19,872][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>" { the invalid json }", :exception=>#<LogStash::Json::ParserError: Unrecognized character escape 'x' (code 120)
Is there a way to get the :exception field in the logstash config file?
I opened the exact same thread on the elastic forum and got a working solution there. Thanks to @Badger on the forum, I ended up using the following raw ruby filter:
ruby {
  code => '
    # read the raw JSON from the "message" field
    @source = "message"
    source = event.get(@source)
    return unless source
    begin
      parsed = LogStash::Json.load(source)
    rescue => e
      # keep the parser error so it can be shipped with the event
      event.set("jsonException", e.to_s)
      return
    end
    # store the parsed document under "jsonData"
    @target = "jsonData"
    if @target
      event.set(@target, parsed)
    end
  '
}
which extracts the info I needed:
"jsonException" => "Unexpected character (',' (code 44)): was expecting a colon to separate field name and value\n at [Source: (byte[])\"{ \"baz\", \"oh!\" }\r\"; line: 1, column: 9]",
Or as the author of the solution suggested, get rid of the @target part and use the normal json filter for the rest of the data.
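To actually get the captured error forwarded to Elasticsearch, the event just needs to reach an elasticsearch output as usual. A minimal sketch (the hosts and index names here are assumptions, not from the original thread) that routes events carrying jsonException to a dedicated index:
output {
  if [jsonException] {
    # events whose message could not be parsed go to a dedicated error index
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "json-parse-errors"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "parsed-events"
    }
  }
}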

Using actual regex in Ansible's search_regex parameter

I'm trying to use the wait_for module in Ansible to repeatedly check a log file and end when it finds either success (the word Successfully appears in the log) or failure (the string [FATAL] appears in the log).
- name: Wait for Logstash API Endpoint to be running
  wait_for:
    path: /var/log/logstash/logstash-plain.log
    search_regex: '(\\[FATAL\\]|Successfully)'
    delay: 30
    timeout: 120
I've tried various approaches to the search_regex parameter, including no escape characters, single escape characters, etc.
I've checked that the log output includes [FATAL], and it definitely does, but I can't get this module to work.
Where am I going wrong?
** EDIT **
Attempting to use the following code:
On the following log:
[2019-01-15T15:43:14,735][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.2"}
[2019-01-15T15:44:00,534][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashConfigIr::InvalidIRException", :message=>"Config has duplicate Ids: \nID: jsonfilter P[filter-json{\"id\"=>\"jsonfilter\", \"source\"=>\"message\", \"remove_field\"=>[\"_sourceUri\", \"_user\", \"sourceUri\", \"user\", \"pid\", \"v\"]}|[str]pipeline:41:7:```\njson {\n id => \"jsonfilter\"\n source => \"message\"\n # remove some irrelevant fields\n remove_field => [\"_sourceUri\", \"_user\", \"sourceUri\", \"user\", \"pid\", \"v\"]\n }\n```]\nP[filter-json{\"id\"=>\"jsonfilter\", \"source\"=>\"message\", \"remove_field\"=>[\"_sourceUri\", \"_user\", \"sourceUri\", \"user\", \"pid\", \"v\"]}|[str]pipeline:191:7:```\njson {\n id => \"jsonfilter\"\n source => \"message\"\n # remove some irrelevant fields\n remove_field => [\"_sourceUri\", \"_user\", \"sourceUri\", \"user\", \"pid\", \"v\"]\n }\n```]", :backtrace=>["org.logstash.config.ir.graph.Graph.validate(org/logstash/config/ir/graph/Graph.java:294)", "org.logstash.config.ir.PipelineIR.<init>(org/logstash/config/ir/PipelineIR.java:52)", "java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:423)", "org.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:246)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.compiler.compile_sources(/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:29)", "usr.share.logstash.logstash_minus_core.lib.logstash.compiler.RUBY$method$compile_sources$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/compiler.rb)", "org.jruby.RubyClass.finvoke(org/jruby/RubyClass.java:899)", "org.jruby.RubyBasicObject.callMethod(org/jruby/RubyBasicObject.java:372)", "org.logstash.config.ir.ConfigCompiler.configToPipelineIR(org/logstash/config/ir/ConfigCompiler.java:32)", "org.logstash.execution.AbstractPipelineExt.initialize(org/logstash/execution/AbstractPipelineExt.java:149)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$3$0$initialize.call(org/logstash/execution/AbstractPipelineExt$INVOKER$i$3$0$initialize.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.block in execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.block in exclusive(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92)", "org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:148)", "org.jruby.ext.thread.Mutex$INVOKER$i$0$0$synchronize.call(org/jruby/ext/thread/Mutex$INVOKER$i$0$0$synchronize.gen)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.agent.exclusive(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$method$exclusive$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/agent.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash/pipeline_action//usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:317)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)", "java.lang.Thread.run(java/lang/Thread.java:748)"]}
[2019-01-15T15:44:01,189][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::OrgLogstashConfigIr::InvalidIRException` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:103:in `create'", "org/logstash/execution/ConvergeResultExt.java:34:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:329:in `block in converge_state'"]}
[2019-01-15T15:44:02,220][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Results in this failure:
Further info:
Based on my testing you were soooooooooo close. This:
search_regex: "(\\[FATAL\\]|Successfully)"
works for me. (Note " rather than ')
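For reference, the full task with the working quoting would look like this (only the quoting of the search_regex value changes from the original):
- name: Wait for Logstash API Endpoint to be running
  wait_for:
    path: /var/log/logstash/logstash-plain.log
    search_regex: "(\\[FATAL\\]|Successfully)"
    delay: 30
    timeout: 120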
Also, check that Ansible has permission to read the log file; if it can't, wait_for gives you no indication that it failed to read the file.

Error reading project.json, Unterminated string error when performing dotnet restore

I'm using Visual Studio for Mac to perform a "dotnet restore", but the terminal shows this message:
error : Error reading
'/Users/abc/corefx/src/Microsoft.TargetingPack.Private.CoreCLR/ref/project.json' at line 8 column 1 : Unterminated string. Expected delimiter: ". Path 'frameworks', line 8, position 1.
error:Unterminated string. Expected delimiter: ". Path 'frameworks', line 8, position 1.
project.json:
{
  "dependencies": {
    "Microsoft.TargetingPack.Private.CoreCLR": "1.2.0-beta-24904-03"
  },
  "frameworks": {
    "netcoreapp1.1.0”: {}
  }
}
I have found the file but I cannot identify the cause of the problem. What could cause this error message?
It looks to me like the netcoreapp1.1.0 property name is terminated with a typographic right quote mark ” which is not valid JSON. It should be a standard straight quote " instead.
{
  "dependencies": {
    "Microsoft.TargetingPack.Private.CoreCLR": "1.2.0-beta-24904-03"
  },
  "frameworks": {
    "netcoreapp1.1.0": {}
  }
}
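If stray typographic quotes are hard to spot by eye, a quick check with GNU grep (assuming -P/PCRE support is available) flags every line containing non-ASCII characters, which catches the offending ” here:
grep -nP '[^\x00-\x7F]' project.json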

Error handling with curl and elasticsearch

I'm currently developing bash scripts that use Elasticsearch and I need good error handling.
In this situation I try to add a document to Elasticsearch and check whether the operation succeeded.
At first I naively tried this:
response=$(curl -XPOST 'http://localhost:9200/indexation/document' -d '
{
  "content":"'"$txt"'",,
  "date_treatment":"'"$(date +%Y-%m-%d)"'"
}') && echo ok || echo fail
But curl doesn't work that way and still returns success (exit code 0, which is actually logical) even though the JSON request is obviously incorrect (note the double comma on line 3) and Elasticsearch reports errors.
So the answer isn't there. Now I think I should analyze the variable $response to catch errors (grep?). I'm posting this question to get hints or solutions on how to do this reliably, and to make sure I'm not missing an obvious solution (maybe a curl option I don't know about?).
Additional useful things
Parsing JSON with Unix tools
Examples of the content of $response:
success:
{
  "_id": "AVQz7Fg0nF90YvJIX_2C",
  "_index": "indexation",
  "_shards": {
    "failed": 0,
    "successful": 1,
    "total": 1
  },
  "_type": "document",
  "_version": 1,
  "created": true
}
error:
{
  "error": {
    "caused_by": {
      "reason": "json_parse_exception: Unexpected character (',' (code 44)): was expecting either valid name character (for unquoted name) or double-quote (for quoted) to start field name\n at [Source: org.elasticsearch.common.io.stream.InputStreamStreamInput@139163f; line: 3, column: 17]",
      "type": "json_parse_exception"
    },
    "reason": "failed to parse",
    "root_cause": [
      {
        "reason": "json_parse_exception: Unexpected character (',' (code 44)): was expecting either valid name character (for unquoted name) or double-quote (for quoted) to start field name\n at [Source: org.elasticsearch.common.io.stream.InputStreamStreamInput@139163f; line: 3, column: 17]",
        "type": "json_parse_exception"
      }
    ],
    "type": "mapper_parsing_exception"
  },
  "status": 400
}
A simple workaround is to use the -f/--fail option.
As per the documentation:
(HTTP) Fail silently (no output at all) on server errors. This is mostly done to better enable scripts etc to better deal with failed attempts. In normal cases when an HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and return error 22.
This method is not fail-safe and there are occasions where non-successful response codes will slip through, especially when authentication is involved (response codes 401 and 407).
example:
response=$(curl -XPOST 'http://localhost:9200/indexation/document' -d '
{
  "content":"'"$txt"'",,
  "date_treatment":"'"$(date +%Y-%m-%d)"'"
}' -f ) && echo ok || echo fail
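Since -f is not fail-safe, another option is to capture the HTTP status code separately and inspect the response body. This is only a sketch of that idea (it assumes jq is installed; the temp-file path is arbitrary):
# capture the body in a file and the HTTP status code in a variable
http_code=$(curl -s -o /tmp/es_response.json -w '%{http_code}' \
  -XPOST 'http://localhost:9200/indexation/document' -d '
{
  "content":"'"$txt"'",
  "date_treatment":"'"$(date +%Y-%m-%d)"'"
}')
# succeed only on a 2xx status and when the body contains no "error" field
if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ] \
   && ! jq -e '.error' /tmp/es_response.json > /dev/null; then
  echo ok
else
  echo fail
fi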

Resources