Hi StackOverflow community,
I am working on implementing ELK for one of my projects. I have a continuous stream of logs being generated in the following format (a date at the start of each log):
03 Apr 2017 05:54:30,416 [INFO] Log text
I want to parse these logs, convert them to structured documents using Logstash, and then put them in Elasticsearch. Logs are going to Elasticsearch, but they arrive in a block (i.e. for one request, all generated logs are part of one Elasticsearch document) and each log line is not creating a new document.
My logstash's input configuration is like following:
input {
  file {
    path => "/Users/abc/logfile.txt"
    start_position => "beginning"
    codec => multiline {
      pattern => "^([0-9]{2} [A-Za-z]{3} [0-9]{4} [0-9:]{8},[0-9]+)"
      negate => true
      what => previous
    }
  }
}
Is there some issue with this configuration? The regex pattern in the multiline codec was tested with a regex tester and it correctly identifies the date at the start of each log line.
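For what it's worth, the pattern itself can be sanity-checked outside Logstash with a short script (a sketch; the continuation line is an invented example):

```python
import re

# Same pattern as in the multiline codec: a timestamp at the start of the line
pattern = re.compile(r"^([0-9]{2} [A-Za-z]{3} [0-9]{4} [0-9:]{8},[0-9]+)")

lines = [
    "03 Apr 2017 05:54:30,416 [INFO] Log text",   # matches: starts a new event
    "    stack trace or other continuation text",  # no match: merged into previous
]

for line in lines:
    print(bool(pattern.match(line)), "->", line)
```

If the pattern and the negate/what settings already behave like this, one other thing to check: the multiline codec holds the most recent event in its buffer until a new matching line arrives, and the file input skips files it has already read (recorded in its sincedb). Setting the codec's auto_flush_interval option and clearing the sincedb entry before re-testing may change what you see.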
We are sending HTTPS requests to ingest data. How can the request be formatted so that Elasticsearch dynamically maps the 'geo_point' field as type geo_point and not text or number? Thank you!
curl -X POST "https://in-https.URL" -H 'Content-Type: application/json' -d'
{
  "geo_point": [-71.34, 41.12]
}
'
geo_point dynamic mapping is not supported - https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic-field-mapping.html.
To index a geo_point field, precede your original request with a request to _mapping to add your geo_point field to the index mapping - https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html
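As a sketch of that two-step approach (the index name my-index and the document endpoint are assumptions; adjust them to your ingest URL):

```shell
# 1. Create the index with an explicit geo_point mapping
curl -X PUT "https://in-https.URL/my-index" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "geo_point": { "type": "geo_point" }
    }
  }
}
'
# 2. Index documents; as an array, geo_point expects [lon, lat] order
curl -X POST "https://in-https.URL/my-index/_doc" -H 'Content-Type: application/json' -d'
{
  "geo_point": [-71.34, 41.12]
}
'
```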
I have a Spring Boot application that writes logs to a file.
I also have Elasticsearch running (in Docker), and Kibana and Logstash (not in Docker).
This is my Logstash config:
input {
  file {
    type => "java"
    path => "C:\Users\user\Documents\logs\semblogs.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
Elasticsearch is up and running. When I check for data in the index that was created, like this:
http://localhost:9200/logstash-2019.11.04-000001/_search
it shows:
{
  "took": 0,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 0,
      "relation": "eq"
    },
    "max_score": null,
    "hits": []
  }
}
In Kibana I also can't create an index pattern; it says there is no data in Elasticsearch.
I suspect that Logstash is not sending anything to Elasticsearch, but I don't know why. There ARE logs in the log file from the app...
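One guess (an assumption, not a confirmed diagnosis): the file input is sensitive to Windows paths and to its saved read position. It expects forward slashes even on Windows, and it records how far it has read in a sincedb file, so a file it has already consumed is skipped on restart. A sketch of the input with both points addressed:

```
input {
  file {
    type => "java"
    # Forward slashes, even on Windows
    path => "C:/Users/user/Documents/logs/semblogs.log"
    start_position => "beginning"
    # For testing only: "NUL" on Windows disables persisting the read position
    sincedb_path => "NUL"
  }
}
```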
I want to send a POST request from Node-RED to the Composer REST server, but I get this error:
Error trying invoke business network. Error: No valid responses from any peers.Response from attempted peer comms was an error: Error: 2 UNKNOWN: error executing chaincode: transaction returned with failure: ValidationException: Instance org.acme.shipping.perishable.AccelReading#c8c829bfd738d7ec63180c5225ae85bd77fad29b4ad9d8ad4bc40a14362f1060 missing required field accel_x
Playground/Test
{
"$class": "org.acme.shipping.perishable.AccelReading",
"accel_x": 0,
"accel_y": 0,
"accel_z": 0,
"latitude": "",
"longitude": "",
"readingTime": "",
"shipment": "resource:org.acme.shipping.perishable.Shipment#4879"
}
Node-RED URL
http://...:31090/api/AccelReading?data=
{"$class":"org.acme.shipping.perishable.AccelReading",
"accel_x":23264,
"accel_y":-20960,
"accel_z":-2448,
"readingTime":"2018-02-14T15:16:44.284Z",
"latitude":"51",
"longitude":"11",
"shipment":"resource:org.acme.shipping.perishable.Shipment#320022000251363131363432"
}
Payload
Postman POST request with all parameters defined as key/value pairs in the body
Response
{
"error": {
"statusCode": 422,
"name": "ValidationError",
"message": "The `AccelReading` instance is not valid. Details: `shipment` can't be blank (value: undefined).",
"details": {
"context": "AccelReading",
"codes": {
"shipment": [
"presence"
]
},
"messages": {
"shipment": [
"can't be blank"
]
}
},
"stack": "ValidationError: The `AccelReading` instance is not valid. Details: `shipment` can't be blank (value: undefined).\n at /home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/dao.js:398:12\n at AccelReading.<anonymous> (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:578:11)\n at AccelReading.next (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/hooks.js:93:12)\n at AccelReading.<anonymous> (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:575:23)\n at AccelReading.trigger (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/hooks.js:83:12)\n at AccelReading.Validatable.isValid (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:541:8)\n at /home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/dao.js:394:9\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at Function.ObserverMixin._notifyBaseObservers 
(/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:178:5)\n at Function.ObserverMixin.notifyObserversOf (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:153:8)\n at Function.ObserverMixin._notifyBaseObservers (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:176:15)\n at Function.ObserverMixin.notifyObserversOf (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:153:8)\n at Function.ObserverMixin._notifyBaseObservers (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:176:15)"
}
}
Postman/JSON
The issue was that a refresh of the Composer REST server was required, so that the desired fields appear in Swagger. This means deleting the REST server container and re-creating it using the internal 192.x address. The steps in Kubernetes are:
bx cs cluster-config blockchain
export KUBECONFIG=/Users/<name>/.bluemix/plugins/container-service/clusters/blockchain/kube-config-mil01-blockchain.yml
./delete/delete_composer-rest-server.sh
./create/create_composer-rest-server.sh --business-network-card admin@perishable-network
Attaching a screenshot of what the POST should look like (after the refresh) for the IoT business network referred to.
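For reference, after the refresh the reading can also be sent as a JSON body rather than a data query parameter (a sketch; <host> stands in for the address elided above):

```shell
curl -X POST "http://<host>:31090/api/AccelReading" \
  -H "Content-Type: application/json" \
  -d '{
    "$class": "org.acme.shipping.perishable.AccelReading",
    "accel_x": 23264,
    "accel_y": -20960,
    "accel_z": -2448,
    "readingTime": "2018-02-14T15:16:44.284Z",
    "latitude": "51",
    "longitude": "11",
    "shipment": "resource:org.acme.shipping.perishable.Shipment#320022000251363131363432"
  }'
```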
I am trying to send an HTTP request using JMeter. In the request parameters I have to send an array of JSON objects.
Here is the piece of the request I have created:
{
"parameter1": "value1",
"parameter2": "value2",
"event[event_location_suggestions_attributes]": [
{
"latitude" : 13,
"longitude" : 13,
"address" : "asdsds"
}
]
}
But the third parameter is not being sent with the request. I may be mistaken about the format of the third parameter. Please help with this.
Is it doable?
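If the server expects JSON, the whole structure usually needs to go into the HTTP Request sampler's Body Data (with a Content-Type: application/json header set via an HTTP Header Manager) rather than into individual parameter fields, since a nested array cannot be expressed as a flat key/value parameter. As a sketch, the body itself is valid JSON and survives a round trip (field names taken from the question):

```python
import json

# The body from the question, built as a Python structure
body = {
    "parameter1": "value1",
    "parameter2": "value2",
    "event[event_location_suggestions_attributes]": [
        {"latitude": 13, "longitude": 13, "address": "asdsds"},
    ],
}

# This string is what would be pasted into JMeter's Body Data field
payload = json.dumps(body, indent=2)

# Round-trip to confirm the nested array survives serialization
parsed = json.loads(payload)
print(parsed["event[event_location_suggestions_attributes]"][0]["address"])  # prints asdsds
```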
I'm doing the first tests with the new YouTube Data API v3, to migrate my site from the old API v2 to the new API v3 as soon as possible.
I have the following problem: for a full request on a video id set to "private", such as:
https://www.googleapis.com/youtube/v3/videos?id=7J7tGINYazA&key=**************************&part=snippet,contentDetails,statistics,status
the result is the following:
{
  "kind": "youtube#videoListResponse",
  "etag": "\"yHwg34KvgIlW9-uBcSEkgasDbzI/T_9s-xed4wEGn3XBIbu1JsPGi2U\"",
  "pageInfo": {
    "totalResults": 0,
    "resultsPerPage": 0
  },
  "items": []
}
as in the case of a video that does not exist...
But according to what is reported in the documentation:
https://developers.google.com/youtube/v3/docs/videos#status.privacyStatus
shouldn't a snippet containing the status of the private video be returned, like this:
{
  "kind": "youtube#videoListResponse",
  "etag": "\"yHwg34KvgIlW9-uBcSEkgasDbzI/ULL6GjWjIQ4a7ruFwiAk1ExdLiw\"",
  "pageInfo": {
    "totalResults": 1,
    "resultsPerPage": 1
  },
  "items": [
    {
      "kind": "youtube#video",
      "etag": "\"yHwg34KvgIlW9-uBcSEkgasDbzI/CWIAg26CY5tX532HpkYrib52e0c\"",
      "id": "nemioqnQa0Y",
      "status": {
        "uploadStatus": "processed",
        "privacyStatus": "private",
        "license": "youtube",
        "embeddable": false,
        "publicStatsViewable": false
      }
    }
  ]
}
Shouldn't the privacyStatus parameter contain one of 3 possible values (private, public, unlisted), as indicated in the documentation?
How in the world does it not return the value "private"? Is this a bug?
Can you help? Thanks
This is happening because the video is private. Using the API key, anyone can request any video if they know the id. However, since your video is set to private, you need to use OAuth to authenticate.
Another way to think of it is like this: if I somehow gained access to your private video ID (perhaps I just got lucky and picked a random made-up id and got yours), I still should not be able to view it just because I know the id and have a key. I would need to authenticate first, to prove to YouTube that I am the owner of that private video.
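As a sketch (ACCESS_TOKEN is a placeholder for a token obtained via an OAuth flow authorized by the account that owns the video):

```shell
# Same request as before, but authenticated with a bearer token
# instead of relying on the API key alone
curl "https://www.googleapis.com/youtube/v3/videos?id=7J7tGINYazA&part=snippet,contentDetails,statistics,status" \
  -H "Authorization: Bearer ACCESS_TOKEN"
```

With a valid token for the owning account, the status object (including "privacyStatus": "private") comes back; with the API key alone, a private video is indistinguishable from one that does not exist, which matches the empty items list above.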