Stopping nifi processor without wait/notify processor and curl commands - apache-nifi

I want to stop an InvokeHTTP processor as soon as it fails. For that I use an ExecuteStreamCommand processor; I have made a bat file with a command like this:
curl 'http://localhost:8080/nifi-api/controller/process-groups/root/processors/f511a6a1-015d-1000-970e-969eac1e6fc5' -X PUT -H 'Accept: application/json' -d @stop.json -vv
and I have related json file with code like this:
{
  "status": {
    "runStatus": "STOPPED"
  },
  "component": {
    "state": "STOPPED",
    "id": "f511a6a1-015d-1000-970e-969eac1e6fc5"
  },
  "id": "f511a6a1-015d-1000-970e-969eac1e6fc5",
  "revision": {
    "version": 30,
    "clientId": "0343f0b9-015e-1000-7cd8-570f8953ec11"
  }
}
I use my json file as an argument for the command inside the ExecuteStreamCommand processor, but it throws an exception.
What should I change?

All actions in NiFi that you can do through the web browser, you can also do through the nifi-api.
In Google Chrome, press F12 to activate DevTools (other browsers also have this option), then select the Network tab.
Do the required action in NiFi (for example, stop the processor).
Right-click the request and choose Copy -> Copy as cURL (bash).
Now you have in your clipboard a curl command that repeats the same NiFi action by calling the nifi-api.
You can remove all header parameters (-H) except one: -H 'Content-Type: application/json'
So the stop action for my processor will look like this:
curl 'http://localhost:8080/nifi-api/processors/d03bbf8b-015d-1000-f7d6-2f949d44cb7f' -X PUT -H 'Content-Type: application/json' --data-binary '{"revision":{"clientId":"09dbb50e-015e-1000-787b-058ed0938d0e","version":1},"component":{"id":"d03bbf8b-015d-1000-f7d6-2f949d44cb7f","state":"STOPPED"}}'
Beware! Every time you change a processor (even its state), its version changes.
So before sending the stop request you have to get the current version and state.
You have to send a GET request to the same URL as above, without any additional headers:
http://localhost:8080/nifi-api/processors/d03bbf8b-015d-1000-f7d6-2f949d44cb7f
where d03bbf8b-015d-1000-f7d6-2f949d44cb7f is the id of your processor.
You can just try this URL in a browser, replacing the processor id.
The response will be in JSON:
{
  "revision": {"clientId": "09dbb50e-015e-1000-787b-058ed0938d0e", "version": 4},
  "id": "d03bbf8b-015d-1000-f7d6-2f949d44cb7f",
  "uri":
  ...a lot of information here about this processor...
}
You can take clientId and version from the result and use those attributes to build the correct STOP request.
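The get-then-stop sequence described above can also be sketched without curl, using only Python's standard library (the host and processor id here are the example values from this answer; build_stop_body is a helper name chosen for illustration):

```python
import json
import urllib.request

NIFI = "http://localhost:8080/nifi-api"
PROCESSOR_ID = "d03bbf8b-015d-1000-f7d6-2f949d44cb7f"  # example id from above

def build_stop_body(client_id, version, processor_id):
    """Build the JSON body NiFi expects for a state-change PUT."""
    return {
        "revision": {"clientId": client_id, "version": version},
        "component": {"id": processor_id, "state": "STOPPED"},
    }

def stop_processor(processor_id):
    # 1) GET the current revision (the version changes on every update).
    with urllib.request.urlopen(f"{NIFI}/processors/{processor_id}") as resp:
        current = json.load(resp)
    rev = current["revision"]
    # 2) PUT the stop request using the fresh clientId/version.
    body = build_stop_body(rev.get("clientId", ""), rev["version"], processor_id)
    req = urllib.request.Request(
        f"{NIFI}/processors/{processor_id}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Calling stop_processor(PROCESSOR_ID) performs the same two requests as the GET plus the stop curl shown above.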
PS:
ExecuteStreamCommand passes the flow file to the executed command as an input stream, which can cause problems.
Use ExecuteProcess instead, because you are passing all the parameters to curl on the command line and not through an input stream.
You can also stop the NiFi processor without using curl; you just need to build the correct sequence of processors, like this:
InvokeHTTP (get current state) -> EvaluateJsonPath (extract version and clientId) -> ReplaceText (build JSON for stop using attributes from the previous step) -> InvokeHTTP (call stop)
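For the ReplaceText step in that chain, the replacement value could be a template like the one below, where ${clientId} and ${version} are flow-file attributes extracted by EvaluateJsonPath in the previous step (a sketch; the attribute names are whatever you configure in EvaluateJsonPath, and the processor id is the example value from above):

{"revision":{"clientId":"${clientId}","version":${version}},"component":{"id":"d03bbf8b-015d-1000-f7d6-2f949d44cb7f","state":"STOPPED"}}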
Still, try to avoid the logic of stopping processors from within NiFi. Sure, it's possible, but consider re-thinking your algorithm.

Here is a template which shows how to stop an InvokeHTTP processor:
https://www.dropbox.com/s/uv14kuvk2evy9an/StopInvokeHttpPoceesor.xml?dl=0

Related

ecobee API thermostat request (json) using bash and curl

I'm writing a bash script to interface with my ecobee (query info and change settings). I have the authorization all worked out (access token and refresh token) and am now trying to request info from the ecobee. This json parameter list is dynamically created. None of the curl examples in the Developers API Doc seem to work.
I've tried assigning the json to a variable (?json="$selection") and to a file (?json=@"$location"). My latest attempt (hard-coding the json and escaping the braces) at the curl is as follows:
response=$(curl -s "https://api.ecobee.com/1/thermostat?json=\{"selection":\{"selectionType":"registered","selectionMatch":null,"includeRuntime":true,"includeSettings":true\}\}" -H "Content-Type: application/json;charset=UTF-8" -H "Authorization: Bearer $access_token")
and I received a null response: declare -- response=""
If I have curl read from a file:
{"selection":{"selectionType":"registered","selectionMatch":null,"includeRuntime":true,"includeSettings":true}}
then I receive:
response='{
"status": {
"code": 4,
"message": "Serialization error. Malformed json. Check your request and parameters are valid."
}
}'
which I'm assuming is an escape issue?
Can anyone offer any insight? Again, I thought this would be the easy part. I'm using jq to build my request json. Are there alternatives to curl that can better deal with JSON? I'm limited to bash (which is what I know).
Thanks
To integrate (arbitrary) data into a URL (or URI in general) you need to prepare it using percent-encoding first.
As you have mentioned using jq to compose that data, you could add @uri to your jq filter to perform the encoding, and use the --raw-output (or -r) option to output it properly.
For example, jq -r '@uri' applied to your JSON
{
  "selection": {
    "selectionType": "registered",
    "selectionMatch": null,
    "includeRuntime": true,
    "includeSettings": true
  }
}
would output
%7B%22selection%22%3A%7B%22selectionType%22%3A%22registered%22%2C%22selectionMatch%22%3Anull%2C%22includeRuntime%22%3Atrue%2C%22includeSettings%22%3Atrue%7D%7D
which you can use within the query string ?json=….
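If jq is not available, the same percent-encoding can be produced with Python's standard library (a sketch, not part of the original answer; urllib.parse.quote with safe="" leaves the same unreserved characters unencoded as jq's @uri):

```python
import json
from urllib.parse import quote

selection = {
    "selection": {
        "selectionType": "registered",
        "selectionMatch": None,
        "includeRuntime": True,
        "includeSettings": True,
    }
}

# Compact JSON first (no spaces), then percent-encode every reserved
# character, matching what jq -r '@uri' produces for this input.
compact = json.dumps(selection, separators=(",", ":"))
encoded = quote(compact, safe="")
url = "https://api.ecobee.com/1/thermostat?json=" + encoded
```

The encoded value is exactly the %7B%22selection%22… string shown above and can be dropped into the ?json=… query string.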

How to automate a task on remote server. How can I take value from one response to be added in next command?

I want to transfer old db_backups to some cold storage (Azure Blob Storage). How can I automate transferring the files when their size reaches a specific limit? All of this should be done on a remote server which I access through ssh in a terminal.
Also, how do I read the response so that I can extract the access_token and use it in my next command, to be run automatically?
Can anyone give me an example of this, or an article to read?
How to add a curl POST request in a bash script?
curl -X POST 'https://server.domain.com/v2/jobs/28723316-9373-44ba-9229-7c796f21b099/runs?project_id=aff59748-260a-476e-9578-b4f4a93e7a92' -H 'Content-Type: application/json' -H "Authorization: Bearer $token" -d ''
output is something like this :
{
"keys": [
{
"keyName": "key1",
"value": "FD0o1y8eSPutze",
"permissions": "FULL"
},
{
"keyName": "key2",
"value": "RLQ59xAi8Eg6p8VpIYx",
"permissions": "FULL"
}
]
}
I want to know how the above command can be run in a shell script. I want to take the key1 value from the response and echo it.
Right now I am having trouble adding $token in the bearer authorization.
Can anyone tell me the right format for this in a shell script?
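As a sketch of the extraction step only (assuming the JSON response shown above is captured into a variable), the key1 value can be pulled out with Python's json module instead of string manipulation:

```python
import json

# The example response from the question, as a string.
response = """
{
  "keys": [
    {"keyName": "key1", "value": "FD0o1y8eSPutze", "permissions": "FULL"},
    {"keyName": "key2", "value": "RLQ59xAi8Eg6p8VpIYx", "permissions": "FULL"}
  ]
}
"""

data = json.loads(response)
# Find the entry whose keyName is "key1" and take its value.
key1_value = next(k["value"] for k in data["keys"] if k["keyName"] == "key1")
print(key1_value)  # FD0o1y8eSPutze
```

In a shell script the same idea is usually done by piping the curl output into a JSON parser rather than grepping the raw text.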

Teams message card

I am trying to post a message to a Microsoft Teams channel using a Windows batch script, but I could not make use of the Teams message card formats. I am able to post messages using the commands below, but only as plain text. Is there any way I can make use of the message card JSON formats?
I also have some command-line arguments which need to be used by the batch script so that the displayed message uses those arguments.
curl -H "Content-type: application/json" --data "{\"@type\": \"ActionCard\",\"title\": \"New Lab %2 deployed successfully\", \"text\": \"Status is %3\"}" %1
The above command worked just fine, but it doesn't satisfy my exact requirement, which is described above.
I also created a separate JSON file, called as below; this worked fine, but I couldn't use command-line arguments to format the JSON values.
curl --data @message.json webhook_url
message.json is as below
{
  "summary": "New Lab deployed",
  "sections": [
    {
      "activityTitle": "A <b>new lab</b> has been added!"
    },
    {
      "title": "Details:",
      "facts": [
        {
          "name": "Lab Name",
          "value": "REPLACE"
        },
        {
          "name": "Status",
          "value": "REPLACE"
        }
      ]
    }
  ]
}
cURL is able to read data from a file:
--data-binary "@message.json"
Do not forget to prepend the AT sign (@) to indicate that the double-quoted string is a filename, not the data itself.

Parse VMware REST API response

I'm trying to parse a JSON response from a REST API call. My awk is not strong. This is a bash shell script, and I use curl to get the response and write it to a file. My problem is solely in cutting the response up into useful parts.
The response is all run together on one line and looks like this:
{
  "value": {
    "summary": "Patch for VMware vCenter Server Appliance 6.5.0",
    "install_time": "2017-03-22T22:43:25 UTC",
    "product": "VMware vCenter Server Appliance",
    "build": "5178943",
    "releasedate": "March 14, 2017",
    "type": "vCenter Server with an external Platform Services Controller",
    "version": "6.5.0.5300"
  }
}
I'm interested in simply writing the type, version, and product strings into a log file. Ideally on 3 lines, but I really don't care; I simply need to be able to identify the build etc at the time this backup script ran, so if I need to rebuild & restore I can make sure I have a compatible build.
Your REST API gives you JSON, so it's best suited to a JSON parser like jq:
curl -s '/rest/endpoint' | jq -r '.value | .type,.version,.product' > config.txt
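If jq is unavailable on the box, the same extraction can be sketched with Python's standard library against the sample response above (a sketch, not part of the original answer):

```python
import json

# The sample response from the question.
response = """
{
  "value": {
    "summary": "Patch for VMware vCenter Server Appliance 6.5.0",
    "install_time": "2017-03-22T22:43:25 UTC",
    "product": "VMware vCenter Server Appliance",
    "build": "5178943",
    "releasedate": "March 14, 2017",
    "type": "vCenter Server with an external Platform Services Controller",
    "version": "6.5.0.5300"
  }
}
"""

value = json.loads(response)["value"]
# Same three fields, one per line, as the jq filter above produces.
lines = [value["type"], value["version"], value["product"]]
print("\n".join(lines))
```

Writing the three lines to config.txt then matches the jq pipeline's output.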

How to run _update elasticsearch with external script

I want to run the example update
curl -XPOST 'localhost:9200/test/type1/1/_update' -d '{
"script" : "ctx._source.text = \"some text\""
}'
(http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-update.html), but got error {"error":"ElasticsearchIllegalArgumentException[failed to execute script]; nested: ScriptException[dynamic scripting for [mvel] disabled]; ","status":400}.
From this page, http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/modules-scripting.html, I found out that I need to place my script in a file (I called it demorun.groovy) and run it by name.
I did that, and now I try to reference it as
curl -XPOST 'localhost:9200/test/type1/1/_update' -d '{
"script" : "demorun.groovy"
}'
But I still get the same error.
I suppose that I reference it wrong. How do I call _update with an external script?
My demorun.groovy:
ctx._source.text = \"some text\"
The error message you are receiving indicates that dynamic scripting is disabled, which is the default setting. You need to enable it to get scripting to work:
Enabling dynamic scripting
We recommend running Elasticsearch behind an application or proxy,
which protects Elasticsearch from the outside world. If users are
allowed to run dynamic scripts (even in a search request), then they
have the same access to your box as the user that Elasticsearch is
running as. For this reason dynamic scripting is allowed only for
sandboxed languages by default.
First, you should not run Elasticsearch as the root user, as this
would allow a script to access or do anything on your server, without
limitations. Second, you should not expose Elasticsearch directly to
users, but instead have a proxy application inbetween. If you do
intend to expose Elasticsearch directly to your users, then you have
to decide whether you trust them enough to run scripts on your box or
not. If you do, you can enable dynamic scripting by adding the
following setting to the config/elasticsearch.yml file on every node:
script.disable_dynamic: false
While this still allows execution of named scripts provided in the
config, or native Java scripts registered through plugins, it also
allows users to run arbitrary scripts via the API. Instead of sending
the name of the file as the script, the body of the script can be sent
instead.
There are three possible configuration values for the
script.disable_dynamic setting, the default value is sandbox:
true: all dynamic scripting is disabled, scripts must be placed in the
config/scripts directory.
false: all dynamic scripting is enabled, scripts may be sent as
strings in requests.
sandbox: scripts may be sent as strings for languages that are
sandboxed.
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/modules-scripting.html
The problem with the ES request above is that it's using the wrong format.
Dynamic scripting (i.e. inline scripting):
curl -XPOST 'localhost:9200/test/type1/1/_update' -d '{
"script" : "ctx._source.text = \"some text\""
}'
Static scripting (i.e. offline scripting):
curl -XPOST 'localhost:9200/test/type1/1/_update' -d '{
"script" : {
"lang": "groovy",
"script_file": "some-name",
"params": {
"foo": "some text"
}
}
}'
And you are supposed to put
ctx._source.text = foo
into .../elasticsearch/config/scripts/some-name.groovy to reproduce almost the same functionality. Actually, this is much better, because you don't need to open up ES to dynamic scripting, and you get argument passing.
In Elasticsearch 2.0, script.disable_dynamic: false does not work, because:
Exception in thread "main" java.lang.IllegalArgumentException: script.disable_dynamic is not a supported setting, replace with fine-grained script settings. Dynamic scripts can be enabled for all languages and all operations by replacing script.disable_dynamic: false with script.inline: on and script.indexed: on in elasticsearch.yml
As the error says, you need to set the following in elasticsearch.yml:
script.inline: on
script.indexed: on
I encountered this same issue and resolved it by adding the following to the elasticsearch.yml file in the config folder:
script.disable_dynamic: false
