Upload JSON file to InfluxDB - jenkins-pipeline

I am trying to collect Bitbucket API data as a JSON file and push the data to InfluxDB. I am doing this via a Jenkins scripted pipeline.
I am not sure how to convert my JSON data to points, as the data is dynamic. The script executes without any error, but I cannot see my data in InfluxDB.
How can I specify which measurement to use?
The code is as below:
import groovy.json.*
def result
influxSRV='x.x.x.x:8086'
influxDb='dbname'
measurement = 'ms'
node('master') {
    stage('collect') {
        sh "curl -XGET -u 'xx:yy' https://x.y.z > output.json"
        result = readJSON file: 'output.json'
        sh "curl -iX POST \'http://${influxSRV}/write?db=${influxDb}&precision=ms\' --data-binary \'${measurement},${result}\'"
    }
}
But the data is not uploaded. Can anyone let me know what I am missing?

Try this (keep the single quotes around the URL, so the shell does not treat the & in the query string as a background operator):
result = readJSON file: 'output.json'
sh "curl -iX POST 'http://${influxSRV}/write?db=${influxDb}&precision=ms' --data-binary '${measurement},${result}'"
You can then see the output of curl in the console output.
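Note that even with the quoting right, the request body still will not parse: InfluxDB's /write endpoint expects line protocol (measurement[,tag=value] field=value [timestamp]), not the toString() of a Groovy map. Below is a minimal sketch of building a line-protocol point from the parsed JSON, assuming result is a flat map of numeric fields (a real payload may need tags, escaping, quoted string values, and a timestamp):

// Sketch: flatten a simple JSON object into line protocol, e.g. "ms field1=1,field2=2".
// Assumes every value is numeric; string values would need surrounding double quotes.
def fields = result.collect { k, v -> "${k}=${v}" }.join(',')
def line = "${measurement} ${fields}"
sh "curl -iX POST 'http://${influxSRV}/write?db=${influxDb}&precision=ms' --data-binary '${line}'"

This also answers the measurement question: the measurement name is simply the first token of each line-protocol point.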

Related

jMeter: upload sequentially named files

I have several files containing POST body requests. I'd like to send those requests in parallel using JMeter.
Currently, I'm using a curl + parallel command approach.
The related curl command is:
curl -s -X POST $FHIR_SERVER/ -H "Content-Type: application/fhir+json" --data "@patient-bundle-01.json"
The request bodies are files named patient-bundle-xx.json, where xx is a number. Currently, I'd like to send up to 1500 requests using this incremental pattern.
My approach using parallel:
doit() {
  bundle="$1"
  curl -s -X POST $FHIR_SERVER/ -H "Content-Type: application/fhir+json" --data "@patient-bundle-$bundle.json"
}
export -f doit
export FHIR_SERVER
seq -w 99 | parallel -j77 doit
How could I get this behavior using jmeter?
Add a Counter configuration element and specify the desired start and end values.
In your HTTP Request sampler, add a ${bundle} JMeter variable reference to the file name.
More information: How to Use a Counter in a JMeter Test
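For example, the Counter and the sampler could be set up like this (a sketch; the field labels vary slightly between JMeter versions, and the two-digit Number format is an assumption to match the names produced by seq -w 99 — widen it if your files use more digits):

Counter (Config Element):
  Starting value: 1
  Maximum value: 1500
  Increment: 1
  Number format: 00
  Exported Variable Name: bundle

HTTP Request sampler:
  File path / Body Data: patient-bundle-${bundle}.json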
Another (easier) option could be using the Directory Listing Config plugin.

What is causing the Elasticsearch bulk load to fail?

I can't figure out why I can't bulk load Elasticsearch with JSON. I've done this before, but this time I am totally stumped.
I have processed a set of JSON documents into Elasticsearch bulk load format and am trying to bulk load the index I just created (verified: it exists, can be queried, and is empty).
{"create": {"_id": "ef68e997-c616-4b0b-b08e-dfc09f8cb08f"}}
{"id": "ef68e997-c616-4b0b-b08e-dfc09f8cb08f", "title": "My document"}
... repeats for all records
The command I run uses a list of paths to the JSON bulk files and a loop to curl/POST them to Elastic using credentials:
while IFS= read -r "path" < "${DOC_LIST_PATH}"
do
  echo "Submitting Elastic formatted docs at ${path} to Elastic index 'docs' ..."
  curl \
    -X POST \
    -H "Content-Type: application/x-ndjson" \
    "https://${ES_USER}:${ES_PASSWD}@${ES_HOSTNAME}:${ES_PORT}/docs/_bulk" \
    --data-binary "@${path}"
done
I've done all this before and it should work but... it doesn't. I get this error instead:
Submitting Elastic formatted docs at data/docs.json/part-00000.json to Elastic index 'docs' ...
Warning: Couldn't read data from file
Warning: "data/docs.json/part-00000.json",
Warning: this makes an empty POST.
{"error":{"root_cause":[{"type":"parse_exception","reason":"request body is required"}],"type":"parse_exception","reason":"request body is required"},"status":400}
... repeats for all files
I have found that the problem is with this bash code, not the data or the bulk load request:
--data-binary "#${path}"
If I replace that with this, it works:
--data-binary "#data/docs.json/part-00000.json"
Making the full working command for a single file:
curl -X POST -H "Content-Type: application/x-ndjson" "https://${ES_USER}:${ES_PASSWD}@${ES_HOSTNAME}:${ES_PORT}/docs/_bulk" --data-binary "@data/docs.json/part-00000.json"
But I need to script this, so this is still maddening. Please help!
This example is also in a gist here
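The three Warning lines are a single wrapped curl message: Couldn't read data from file "data/docs.json/part-00000.json", this makes an empty POST. Since the hard-coded path works but the variable does not, the variable is probably carrying an invisible character — most often a trailing carriage return when the list file has CRLF line endings. Here is a sketch of a hardened loop under that assumption (it also moves the input redirection to done; redirecting into read as written reopens the list file on every iteration and re-reads the first line forever):

while IFS= read -r path
do
  path="${path%$'\r'}"   # strip a trailing carriage return, in case the list file has CRLF endings
  echo "Submitting Elastic formatted docs at ${path} to Elastic index 'docs' ..."
  curl \
    -X POST \
    -H "Content-Type: application/x-ndjson" \
    "https://${ES_USER}:${ES_PASSWD}@${ES_HOSTNAME}:${ES_PORT}/docs/_bulk" \
    --data-binary "@${path}"
done < "${DOC_LIST_PATH}"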

Not able to extract Json Data in Apache-NiFi

I have written the POST command below and am using the "HandleHttpRequest" processor to receive the POST request in Apache NiFi:
curl -v -H "Accept: application/json" -H "Content-type: application/json" -X POST -d '{"employeeDetails":{"empid":"124","empname": "praveen"}}' http://localhost:7002
I am able to receive the JSON data in the "HandleHttpRequest" processor, as shown below; when I check the list queue, I can see the JSON data.
[screenshot: HandleHttpRequest processor details]
But I want to extract empid and check whether the empid in my JSON data is null or not. I tried the "ExtractText", "ReplaceText", "UpdateAttribute", and "EvaluateJsonPath" processors to fetch the employee details, but I was unable to do it.
[screenshot: EvaluateJsonPath processor details]
I am getting a "flowfile did not have a valid JSON content" error in the EvaluateJsonPath processor.
How do I extract empid and check whether it is null or not?
The problem is not related to NiFi. You should post the data with curl like this (change the double quotes to single quotes after -d):
curl -v -H "Accept: application/json" -H "Content-type: application/json" -X POST -d '{"employeeDetails":{"empid":"124","empname": "praveen"}}' http://localhost:7002
I have the exact same two processors you have. Additionally, I added a HandleHttpResponse processor so that my curl command exits cleanly without sending the extra buffer data that makes the EvaluateJsonPath processor fail with the invalid JSON error (my guess is this could have been your case as well).
[screenshot: HTTP request flow]
Also, as @Behrouz Seyedi mentioned, you need to use single quotes in your command. This is my curl command:
curl -v -H "Content-type: application/json" -X POST -d '{"employeeDetails":{"empid":"124","empname": "praveen"}}' http://localhost:7003
[screenshot: EvaluateJsonPath processor configuration]
[screenshot: EvaluateJsonPath response showing the extracted empid]
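To then check whether empid is null, one possible setup (a sketch; the property names are illustrative) is to have EvaluateJsonPath write the value into a flowfile attribute and route on it:

EvaluateJsonPath (Destination: flowfile-attribute):
  empid = $.employeeDetails.empid

RouteOnAttribute (Routing Strategy: Route to Property name):
  empid.present = ${empid:isEmpty():not()}

Flowfiles with a non-empty empid leave through the empid.present relationship; everything else goes to unmatched.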

Curl command not uploading file contents

Hi, I am using a cURL command to upload a file via a POST request to a service on my local machine.
I am using the following command to upload:
curl -i -X POST -H "Content-Type: multipart/form-data" -F "/Users/myName/Folder/file.csv" http://localhost:port/api/fileupload
On the application side, I am using Spring framework's web binding to receive the file.
Following is the code snippet:
public ResponseEntity importDimensions(@RequestBody MultipartFile file) {
    // the file variable is always null
}
What am I missing here?
You need an @ sign before the filename, like this: @/Users/myName/Folder/file.csv.
And if your server-side code is expecting a parameter named file, then you need to do this:
-F "file=@/Users/myName/Folder/file.csv"

Load GeoJSON file into Apache CouchDB

I am working on Windows 10 and tried to load a GeoJSON file into my CouchDB via the curl command and a POST request in cmd, which looks like this:
C:\Program Files\cURL\bin>curl -d @path-to-my-data\data.geojson -H "Content-type: application/json" -X POST http://127.0.0.1:5984/_utils/database.html?-dbName-
and then I get the following error:
{"error":"method_not_allowed","reason":"Only GET,HEAD allowed"}
On http://couchdb-13.readthedocs.org/en/latest/api-basics/ it is said that "If you use an unsupported HTTP request type with a URL that does not support the specified type, a 405 error will be returned, listing the supported HTTP methods."
When I try a PUT request, I get the same error.
I validated the JSON with jsonlint, so that should not be the problem.
I tried several tutorials like "Three Steps to CouchDB Heaven …" and "Export & Import a Database with CouchDB", but none of them seems to work.
So I am not sure where the problem is. Do I need to make changes to my GeoJSON file, or is it something else?
Thanks for your help.
The needed curl command just looks like this (note that it targets the database endpoint, not the _utils HTML page):
curl -H "Content-Type: application/json" -X POST http://localhost:5984/db -d @C:\Users\Name\Desktop\data.geojson
