Escape character in index name when using elasticdump on cli - bash

elasticdump is a great tool for moving and saving indices. Copying data from one index to another is as simple as this command:
$ elasticdump --input=http://user:pass@localhost:9200/my-index --key=... --output=https://user:pass@my-cluster:9200/my-index --tlsAuth --type=data
In my application I dynamically create indices. Somehow I ended up with an unescaped character, ’, in some index names. One resulting index name is, for example, mens’s_clothing.
When I now want to copy the data from that index to another, elasticdump gives me this error:
$ elasticdump --input=https://...@my-cluster:9200/mens’s_clothing --key=... --output=https://...@localhost:9200/mens’s_clothing --tlsAuth --type=data
...
... Error Emitted => Request path contains unescaped characters
Fair enough. Let's URL-encode the index name:
$ elasticdump --input=https://...@my-cluster:9200/mens%E2%80%99s_clothing --key=... --output=https://...@localhost:9200/mens%E2%80%99s_clothing --tlsAuth --type=data
...
{
  _index: 'mens%E2%80%99s_clothing',
  _type: '_doc',
  _id: '367f9125_1_1',
  status: 400,
  error: {
    type: 'invalid_index_name_exception',
    reason: 'Invalid index name [mens%E2%80%99s_clothing], must be lowercase',
    index_uuid: '_na_',
    index: 'mens%E2%80%99s_clothing'
  }
}
Elasticsearch returns this error, so it seems elasticdump doesn't realize the index name is URL-encoded and passes the encoded string through literally, because creating the index with curl and the encoded name works:
$ curl -X PUT "https://...@localhost:9202/men%E2%80%99s_clothing"
{"acknowledged":true,"shards_acknowledged":true,"index":"men’s_clothing"}
Does anyone know how to tell elasticdump that the index name is URI-encoded?
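For reference, a quick way to sanity-check what the cluster sees for the encoded name is the _cat API (a minimal sketch; credentials and hosts are elided as in the commands above):
$ curl -s "https://user:pass@my-cluster:9200/_cat/indices/mens%E2%80%99s_clothing?v"   # Elasticsearch decodes the URL path, so this should list mens’s_clothing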

Related

What is the reason for this logstash error ("error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper)

Below is my filebeat config, to which I added a logID:
- type: log
  fields:
    source: 'filebeat2'
    logID: debugger
  fields_under_root: true
  enabled: true
  paths:
    - /var/log/path/*
And below is the output section of my logstash conf:
if "debugger" in [logID] and ("" not in [Exeption]) {
elasticsearch {
user => ""
password => ""
hosts => ["https://ip:9200"]
index => "debugger"
}
}
I put some log files (10 files) in that path and randomly got this error in logstash-plain.log:
{"index"=>{"_index"=>"debugger", "_type"=>"_doc", "_id"=>"9-DmvoIBPs8quoIM7hCa",
"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper
[request.dubugeDate] cannot be changed from type [text] to [long]"}}}}
And also this:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field
[debug.registrationDate] of type [long] in document with id 'Bt_YvoIBPs8quoIMXfwd'.
Preview of field's value: '2022-08-1707:37:08.256'", "caused_by"=>
{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"2022-08-
1707:37:08.256\""}}}}}
Can anybody help me?
It looks like, in the first case, the field request.dubugeDate is already mapped as text in the index, and some incoming documents carry a numeric (long) value, which would require changing the mapping; Elasticsearch refuses to change the type of an existing field.
In the second case, the field debug.registrationDate is mapped as long, and you are trying to ingest a string (a date).
You can check the mapping of your index with the GET /YOUR_INDEX/_mapping command from Kibana Dev Tools, or do the same via curl.
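For example (a minimal sketch; the credentials and host are placeholders taken from the config above):
curl -s -u USER:PASSWORD "https://ip:9200/debugger/_mapping?pretty"   # inspect the current field types of the debugger index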

How to compress elasticsearch data with 'best_compression'

How can I compress all elasticsearch data (existing as well as new data) with the "best_compression" option?
I know that since version 5.0 I can't put "index.codec: best_compression" in the elasticsearch.yml file. I've read the log, which indicates that it's deprecated there and that I should use
curl -XPUT 'http://localhost:9200/_all/_settings?preserve_existing=true' -d '{"index.codec" : "best_compression"}'
But when I use it, I get the following error:
{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Can't update non dynamic settings [[index.codec]] for open indices [[logstash-dns-2018.07.30/xHq6UfgsSD2M1dBZhV3cOg], [logstash-2018.07.27/7U7uUsEORFqXtJtrk4KvDw], [logstash-dns-2018.07.27/Xbx15QXOQ5KJAK7iop_54Q], [logstash-http-2018.07.27/q0Rs65a3TjW4NJfcljUHEw], [logstash-flow-2018.07.30/0Erbh2TcRgmFJLMLr8Ka8w], [logstash-2018.07.30/boOd8BdrQV2QoziKaZ_2lw], [logstash-alert-2018.07.27/o5yqwdNqR5yAcbJ-HCNVHw], [logstash-alert-2018.07.30/pp6ZWKLISECVzUiCDDeydQ], [logstash-tls-2018.07.30/rZi6KfC7RtqOVjUt7CCqDQ], [logstash-ssh-2018.07.27/wKi-p6slSqO0-vbwRqS1ZA], [.kibana/XaFQRcEXTW6jLUCmBijzKQ], [logstash-tls-2018.07.27/hbiXYCzjRumh3ND6up9vNw], [logstash-flow-2018.07.27/XfspJr1TS4y6MnCgAmRq1g], [logstash-fileinfo-2018.07.27/9VWyBHsqRmO4QsnN-gdt_w], [logstash-http-2018.07.30/U9JO9Cp-QQO7gvRNoHt7FQ], [logstash-fileinfo-2018.07.30/nlwHeDOsQ3ii8CLxcgE3Ag]]"}],"type":"illegal_argument_exception","reason":"Can't update non dynamic settings [[index.codec]] for open indices [[logstash-dns-2018.07.30/xHq6UfgsSD2M1dBZhV3cOg], [logstash-2018.07.27/7U7uUsEORFqXtJtrk4KvDw], [logstash-dns-2018.07.27/Xbx15QXOQ5KJAK7iop_54Q], [logstash-http-2018.07.27/q0Rs65a3TjW4NJfcljUHEw], [logstash-flow-2018.07.30/0Erbh2TcRgmFJLMLr8Ka8w], [logstash-2018.07.30/boOd8BdrQV2QoziKaZ_2lw], [logstash-alert-2018.07.27/o5yqwdNqR5yAcbJ-HCNVHw], [logstash-alert-2018.07.30/pp6ZWKLISECVzUiCDDeydQ], [logstash-tls-2018.07.30/rZi6KfC7RtqOVjUt7CCqDQ], [logstash-ssh-2018.07.27/wKi-p6slSqO0-vbwRqS1ZA], [.kibana/XaFQRcEXTW6jLUCmBijzKQ], [logstash-tls-2018.07.27/hbiXYCzjRumh3ND6up9vNw], [logstash-flow-2018.07.27/XfspJr1TS4y6MnCgAmRq1g], [logstash-fileinfo-2018.07.27/9VWyBHsqRmO4QsnN-gdt_w], [logstash-http-2018.07.30/U9JO9Cp-QQO7gvRNoHt7FQ], [logstash-fileinfo-2018.07.30/nlwHeDOsQ3ii8CLxcgE3Ag]]"},"status":400}
Solved:
Close all indices:
curl -XPOST 'http://localhost:9200/_all/_close'
Apply best_compression to all indices:
curl -XPUT 'http://localhost:9200/_all/_settings' -d '{"index.codec" : "best_compression"}'
Open all indices:
curl -XPOST 'http://localhost:9200/_all/_open'
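To verify the setting was applied, the settings API can be filtered by setting name (a sketch against the same localhost node). Note that best_compression only affects segments written after the change; existing segments keep the old codec until they are rewritten, for example by a _forcemerge.
curl -XGET 'http://localhost:9200/_all/_settings/index.codec?pretty'   # should show best_compression for every index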

Sematext Logagent Elasticsearch - Indexes not being created?

I'm trying to send data to Elasticsearch using Logagent. While there doesn't seem to be any error sending the data, the index isn't being created in ELK. I'm trying to find the index by creating a new index pattern via the Kibana GUI, but the index does not seem to exist. This is my logagent.conf right now:
input:
  # bro-start:
  #   module: command
  #   # store BRO logs in /tmp/bro in JSON format
  #   command: mkdir /tmp/bro; cd /tmp/bro; /usr/local/bro/bin/bro -i eth0 -e 'redef LogAscii::use_json=T;'
  #   sourceName: bro
  #   restart: 1
  # read the BRO logs from the file system ...
  files:
    - '/usr/local/bro/logs/current/*.log'
parser:
  json:
    enabled: true
    transform: !!js/function >
      function (sourceName, parsed, config) {
        var src = sourceName.split('/')
        // generate Elasticsearch _type out of the log file sourceName
        // e.g. "dns" from /tmp/bro/dns.log
        if (src && src[src.length-1]) {
          parsed._type = src[src.length-1].replace(/\.log/g,'')
        }
        // store log file path in each doc
        parsed.logSource = sourceName
        // convert Bro timestamps to JavaScript timestamps
        if (parsed.ts) {
          parsed['@timestamp'] = new Date(parsed.ts * 1000)
        }
      }
output:
  stdout: false
  elasticsearch:
    module: elasticsearch
    url: http://10.10.10.10:9200
    index: bro_logs
Maybe I have to create the index mappings manually? I don't know.
Thank you for any advice or insight!
I found out that there actually was an error. I was trying to send authentication via a field called "auth", but that doesn't exist. I can do url: https://USERNAME:PASSWORD@10.10.10.10:9200 though.
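A quick way to confirm the credentials work and that the index now exists (a sketch using the same placeholders as above):
curl -s "https://USERNAME:PASSWORD@10.10.10.10:9200/_cat/indices/bro_logs?v"   # should list bro_logs once documents arrive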

Can't get geo_point to work with Bonsai on Heroku

I'm trying to use a geo_point field on Heroku/Bonsai but it just doesn't want to work.
It works locally, but whenever I check the mapping for my index on Heroku/Bonsai it says my field is a string: "coordinates":{"type":"string"}
My mapping looks like this:
tire.mapping do
  ...
  indexes :coordinates, type: "geo_point", lat_lon: true
  ...
end
And my to_indexed_json looks like this:
def to_indexed_json
  {
    ...
    coordinates: map_marker.nil? ? nil : [map_marker.latitude, map_marker.longitude].join(','),
    ...
  }.to_json
end
In the console on Heroku I tried MyModel.mapping and MyModel.index.mapping and the first one correctly has :coordinates=>{:type=>"geo_point", :lat_lon=>true}.
Here's how I got this to work, with index name 'myindex' and type name 'myindextype'.
On the local machine
curl -XGET https://[LOCAL_ES_URL]/myindex/myindextype/_mapping
Save the output to a .json file, for example typedefinition.json (or hand-build one):
{
  "myindextype":{
    "properties":{
      "dataone":{"type":"string"},
      "datatwo":{"type":"double"},
      "location":{"type":"geo_point"},
      "datathree":{"type":"long"},
      "datafour":{"type":"string"}
    }
  }
}
On Heroku, enter the command
heroku config
and get the BONSAI_URL. Put it in the following commands in place of [BONSAI_URL]. (https://asdfasdfdsf:asdfadf@asdfasdfasdf.us-east-1.bonsai.io/myindex)
curl -XDELETE https://[BONSAI_URL]/myindex
curl -XPOST https://[BONSAI_URL]/myindex
curl -XPUT -d#typedefinition.json https://[BONSAI_URL]/myindex/myindextype/_mapping
curl -XGET https://[BONSAI_URL]/myindex/myindextype/_mapping
Deletes the index if it exists.
Creates an empty index.
Uses the .json file as the definition for the mapping.
Gets the new mapping to make sure it worked.
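As an optional smoke test (a sketch; the document id '1' and the lat,lon values are made up), index a document against the new mapping and confirm it is accepted with location as a geo_point:
curl -XPUT "https://[BONSAI_URL]/myindex/myindextype/1" -d '{"location": "40.7128,-74.0060"}'
curl -XGET "https://[BONSAI_URL]/myindex/myindextype/_mapping"   # location should still report geo_point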

CouchDB view queries

Could you please help me create a view? Below is the requirement:
select * from personaccount where name="srini" and user="pup" order by lastloggedin
I have to send name and user as input to the view and the data should be sorted by lastloggedin.
Below is the view I have created, but it is not working:
{
  "language": "javascript",
  "views": {
    "sortdatetimefunc": {
      "map": "function(doc) {
        emit({
          lastloggedin: doc.lastloggedin,
          name: doc.name,
          user: doc.user
        }, doc);
      }"
    }
  }
}
And this is the curl command I am using:
http://uta:password@localhost:5984/personaccount/_design/checkdatesorting/_view/sortdatetimefunc?key={\"name:srini\",\"user:pup\"}
My questions are:
Sorting will be done on the key, and I want it on lastloggedin, so I have included that in the emit function as well.
But I am passing only name and user as parameters. Do we need to pass all the parameters that we put in the key?
First of all, thank you for the reply. I have done the same and I am getting errors. Please help.
Could you please try this on your PC? I am posting all the commands:
curl -X PUT http://uta:password@localhost:5984/person-data
curl -X PUT http://uta:password@localhost:5984/person-data/srini -d '{"Name":"SRINI", "Idnum":"383896", "Format":"NTSC", "Studio":"Disney", "Year":"2009", "Rating":"PG", "lastTimeOfCall": "2012-02-08T19:44:37+0100"}'
curl -X PUT http://uta:password@localhost:5984/person-data/raju -d '{"Name":"RAJU", "Idnum":"456787", "Format":"FAT", "Studio":"VFX", "Year":"2010", "Rating":"PG", "lastTimeOfCall": "2012-02-08T19:50:37+0100"}'
curl -X PUT http://uta:password@localhost:5984/person-data/vihar -d '{"Name":"BALA", "Idnum":"567876", "Format":"FAT32", "Studio":"YELL", "Year":"2011", "Rating":"PG", "lastTimeOfCall": "2012-02-08T19:55:37+0100"}'
Here's the view I created as you said:
{
  "_id": "_design/persondestwo",
  "_rev": "1-0d3b4857b8e6c9e47cc9af771c433571",
  "language": "javascript",
  "views": {
    "personviewtwo": {
      "map": "function (doc) {\u000a emit([ doc.Name, doc.Idnum, doc.lastTimeOfCall ], null);\u000a}"
    }
  }
}
I have fired this command from curl:
curl -X GET http://uta:password@localhost:5984/person-data/_design/persondestwo/_view/personviewtwo?startkey=["SRINI","383896"]&endkey=["SRINI","383896",{}]descending=true&include_docs=true
I got this error:
[4] 3000
curl: (3) [globbing] error: bad range specification after pos 99
[5] 1776
[6] 2736
[3] Done descending=true
[4] Done(3) curl -X GET http://uta:password@localhost:5984/person-data/_design/persondestwo/_view/personviewtwo?startkey=["SRINI","383896"]
[5] Done endkey=["SRINI","383896"]
I do not know what this error is.
I have also tried passing the parameters the way below, and it is not helping:
curl -X GET http://uta:password@localhost:5984/person-data/_design/persondestwo/_view/personviewtwo?key={\"Name\":\"SRINI\",\"Idnum\": \"383896\"}&descending=true
But I get different errors about escape sequences.
Overall, I just want this query to be satisfied through the view:
select * from person-data where Name="SRINI" and Idnum="383896" order by lastTimeOfCall
My concern is how to pass multiple parameters from the curl command, as I get a lot of errors when I do it the way above.
First off, you need to use an array as your key. I would use:
function (doc) {
  emit([ doc.name, doc.user, doc.lastLoggedIn ], null);
}
This basically outputs all the documents in order by name, then user, then lastLoggedIn. You can use the following URL to query.
/_design/checkdatesorting/_view/sortdatetimefunc?startkey=["srini","pup"]&endkey=["srini","pup",{}]&include_docs=true
Second, notice I did not output doc as the value of your query. It takes up much more disk space, especially if your documents are fairly large. Just use include_docs=true.
Lastly, refer to the CouchDB wiki; it's pretty helpful.
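Put together as a full command (a sketch; the URL is quoted so the shell does not interpret the & characters, and the inner double quotes are escaped):
curl -X GET "http://uta:password@localhost:5984/personaccount/_design/checkdatesorting/_view/sortdatetimefunc?startkey=[\"srini\",\"pup\"]&endkey=[\"srini\",\"pup\",{}]&include_docs=true"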
I just stumbled upon this question. The errors you are getting are caused by not escaping this command:
curl -X GET http://uta:password@localhost:5984/person-data/_design/persondestwo/_view/personviewtwo?startkey=["SRINI","383896"]&endkey=["SRINI","383896",{}]descending=true&include_docs=true
The & character has a special meaning on the command-line and should be escaped when part of an actual parameter.
So you should put quotes around the big URL, and escape the quotes inside it:
curl -X GET "http://uta:password#localhost:5984/person-data/_design/persondestwo/_view/personviewtwo?startkey=[\"SRINI\",\"383896\"]&endkey=[\"SRINI\",\"383896\",{}]descending=true&include_docs=true"
