Kibana: Can't import shakespeare.json on Sense Web Plugin - Windows

I am trying to import shakespeare.json as per the Elasticsearch tutorial.
[Environment]
Elasticsearch 2.1
Sense extension for Chrome
[Background]
When I paste curl -XPUT localhost:9200/_bulk --data-binary @shakespeare.json
into the Sense tab (the extension opens a new tab with two panes),
It's converted to PUT /_bulk and the output is
{
  "error": {
    "root_cause": [
      {
        "type": "parse_exception",
        "reason": "Failed to derive xcontent"
      }
    ],
    "type": "parse_exception",
    "reason": "Failed to derive xcontent"
  },
  "status": 400
}
[My Findings]
I have downloaded shakespeare.json locally, but I think Sense is not
able to locate the path where the file resides (maybe I have the file in the wrong location).
I also tried to find Sense's working directory, but I am not sure where to find the index.html of this Chrome extension on Windows.
Whatever documentation I found is Linux-specific.
Any inputs appreciated.

You should not do this in Sense, but simply from the command line:
curl -XPUT localhost:9200/_bulk --data-binary @shakespeare.json
Make sure to point to the correct path of shakespeare.json if it is not located in your current directory:
curl -XPUT localhost:9200/_bulk --data-binary @/path/to/shakespeare.json
UPDATE
If you run on Windows and you don't have the cURL command available, you can download cURL at https://curl.haxx.se/download.html
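For reference, the _bulk endpoint expects newline-delimited JSON (NDJSON): an action line followed by a source line for each document. A minimal sketch of the shape of such a file (field names here are illustrative, modeled on the tutorial data, not the exact contents of shakespeare.json):

```shell
# Build a tiny two-document bulk file in the NDJSON shape _bulk expects:
# one action line, then one source line, per document, newline-terminated.
printf '%s\n' \
  '{"index":{"_index":"shakespeare","_type":"line","_id":"1"}}' \
  '{"line_id":1,"play_name":"Henry IV","text_entry":"So shaken as we are, so wan with care"}' \
  '{"index":{"_index":"shakespeare","_type":"line","_id":"2"}}' \
  '{"line_id":2,"play_name":"Henry IV","text_entry":"Find we a time for frighted peace to pant"}' \
  > mini_bulk.json
# To load it (requires a running cluster):
#   curl -XPUT 'localhost:9200/_bulk' --data-binary @mini_bulk.json
wc -l mini_bulk.json
```

If the payload is not in this alternating action/source format, _bulk fails to parse it, which is consistent with the "Failed to derive xcontent" error above when no file content reaches the server at all.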

In the latest Elasticsearch 6, the curl command to load shakespeare_6.0.json is:
curl -H 'Content-Type: application/x-ndjson' -XPUT 'localhost:9200/shakespeare/doc/_bulk' --data-binary @shakespeare_6.0.json

Related

How to create an index and type in Elasticsearch?

I have installed Elasticsearch version 2.3.2. I have to add an index and type to it. Previously I used the Sense plugin to achieve this, but the add-on was removed from the web store. Please give suggestions.
The Sense plugin is now a Kibana app. Please refer to the official reference for installation.
The answer to your question: you can create an index and type in Elasticsearch by running the curl command below (note that actual index names must be lowercase):
curl -XPUT "http://localhost:9200/IndexName/TypeName"
You can use a REST client like Postman to do this; Postman is available as a Chrome extension.
The other way is to SSH into one of the nodes in your cluster and run the POST command using curl.
curl -XPOST 'localhost:9200/bookindex/books' -H 'Content-Type: application/json' -d'
{
  "bookId" : "A00-3",
  "author" : "Sankaran",
  "publisher" : "Mcgrahill",
  "name" : "how to get a job"
}'
It will automatically create an index named 'bookindex' with type 'books' and index the data. If the index and type already exist, it will add the entry to the index.
All operations in Elasticsearch can be done via REST API calls.
To create an index, use the index API:
curl -XPUT 'localhost:9200/twitter?pretty' -H 'Content-Type: application/json' -d'{"settings" : {"index" : {"number_of_shards" : 3, "number_of_replicas" : 0 }}}'
To create the mapping, you can use the _mapping endpoint:
curl -XPUT 'http://localhost:9200/twitter/tweets/_mapping' -H 'Content-Type: application/json' -d @create_p4_schema_payload.json
Here, the mapping is provided via a JSON file named create_p4_schema_payload.json, which contains the following:
{
  "properties": {
    "user_name": {
      "type": "text"
    }
  }
}
All of these can be run from any terminal that supports curl. On Windows, you may install Cygwin to run Linux commands from the command prompt.
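Before sending a mapping file to the _mapping endpoint, it is worth checking that it is well-formed JSON; a malformed body produces the same kind of parse errors seen earlier. A quick local check (assumes python3 is on the PATH; any JSON validator works):

```shell
# Recreate the mapping file from the answer above, then pretty-print it;
# python3 -m json.tool exits non-zero if the JSON is malformed.
cat > create_p4_schema_payload.json <<'EOF'
{
  "properties": {
    "user_name": {
      "type": "text"
    }
  }
}
EOF
python3 -m json.tool create_p4_schema_payload.json
```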
As was said above, you can access it through REST API calls. The command you need to run is:
curl -XPUT 'http://localhost:9200/IndexName?include_type_name=true'
The curl command is raw text that can be imported into Postman, for example, or you can install the curl CLI and simply run it. Simply put:
It's a PUT API call to ElasticSearch/IndexName, adding the query parameter include_type_name (a boolean; when true, the mappings you supply may still contain a type name).
The reference guide is at: Elastic Search - Create index API
The Sense plugin has been removed from the Chrome web store. You can use Kibana, which has a Sense-like dev tool to perform Elasticsearch queries.
Follow this link to install Kibana.

Elastic data restore from S3

I have an Elasticsearch backup in S3, but I am not able to restore it using either of the commands mentioned below.
curl -XPOST http://localhost:9200/_snapshot/elasticsearch/snap-dev_1/_restore
curl -XPOST http://localhost:9200/_snapshot/snap-deliveryreports_june2016bk/elasticsearch/_restore
I can see the files in S3:
What is the command to restore the data shown in the image?
Update:
The following command is successful (returns acknowledged: true), which means the access key, secret key, bucket name and region are correct.
curl -XPUT 'http://localhost:9200/_snapshot/s3_repository?verify=true&pretty' -d'
{
  "type": "s3",
  "settings": {
    "bucket": "todel162",
    "region": "us-east-1"
  }
}'
I guess I only need to know how to use restore snapshot command.
You can use the cat recovery API to monitor your restore status: restoring just piggybacks on the regular recovery mechanism of Elasticsearch, so check whether you see anything using those APIs.

Deleting a type in Elastic Search using curl

I am trying to delete a type in Elasticsearch using a curl script in a bat file:
ECHO Running Curl Script
curl -XDELETE "http://localhost/testing/" -d''
pause
The response that I got was No handler found for uri. I looked into the Elasticsearch documentation and it says to use delete by query: https://www.elastic.co/guide/en/elasticsearch/reference/5.0/docs-delete-by-query.html
How can I modify my curl script to use this new API for ES 2.3?
Thanks
If you want to use the delete-by-query API to delete all documents of a given type, you can do it like this:
curl -XDELETE "http://localhost/testing/_query?q=_type:typename"
However, you're better off deleting the index and recreating it so you can modify the mapping type as you see fit.
curl -XDELETE "http://localhost/testing/"
curl -XPUT "http://localhost/testing/" -d '{"settings": {...}, "mappings": {...}}'

How to delete all documents from an elasticsearch index

I am trying to delete all documents from my index and I'm getting the following error on curl: No handler found for uri [/logstash-2016.03.11/logevent/] and method [DELETE]
Here is my delete command on Windows:
curl -XDELETE "http://localhost:9200/logstash-2016.03.11/logevent/"
Can anybody help?
curl -XPOST "http://localhost:9200/logstash-2016.03.11/logevent/_delete_by_query" -H 'Content-Type: application/json' -d'
{
  "query": {
    "match_all": {}
  }
}'
The delete-by-query API is new and should still be considered experimental. The API may change in ways that are not backwards compatible.
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-delete-by-query.html
You cannot delete a type from an index by executing a delete on the type.
To solve your problem, you have 2 solutions:
If you only have a single type in your Logstash index, just execute curl -XDELETE "http://localhost:9200/logstash-2016.03.11". It will delete the old index, but Logstash will recreate it when it processes the next event.
Or you install the delete-by-query plugin ( https://www.elastic.co/guide/en/elasticsearch/plugins/2.2/plugins-delete-by-query.html ) and run something like this:
curl -XDELETE 'http://localhost:9200/logstash-2016.03.11/logevent/_query' -d '
{
  "query": { "match_all": {} }
}'

Update issue in elasticsearch

When I try to update documents using the update API, the following error pops up:
{"error":"RemoteTransportException[[es-node9][inet[/10.130.89.220:9300]][indices:data/write/update]]; nested: VersionConflictEngineException[[newsentiments][3] [relevancy][abc#gmail.com]: version conflict, current [71], provided [70]]","status":409}
What is causing the above error and how can I resolve it?
Do you specify the version of the doc you want to update when you send your request? Something like:
curl -XPUT 'localhost:9200/myIndex/MyType/1?version=70' -d '{
  "content" : "here is my update"
}'
The issue is that someone (or you) has already updated version 70, so the current version is now 71.
To resolve the issue, just don't pass the version in the request:
curl -XPUT 'localhost:9200/myIndex/MyType/1' -d '{
  "content" : "here is my update"
}'
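If the document is being updated concurrently by several writers, another option worth checking against your ES version's docs is the _update endpoint's retry_on_conflict parameter, which refetches the document and retries instead of returning a 409. A sketch, reusing the index/type/id from the example above:

```shell
# Sketch: write the partial-update body to a file, then send it with
# retry_on_conflict so Elasticsearch retries on version conflicts
# instead of failing with a 409.
cat > update_body.json <<'EOF'
{
  "doc" : { "content" : "here is my update" }
}
EOF
# To apply (requires a running cluster):
#   curl -XPOST 'localhost:9200/myIndex/MyType/1/_update?retry_on_conflict=3' -d @update_body.json
python3 -m json.tool update_body.json
```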
