I tried to use the elasticdump module to copy my production Elasticsearch index to my local machine by running this command:
elasticdump \
--input=http://localhost:9200/prod-index \
--output=http://localhost:9200/local-index \
--type=data \
--ignore-errors \
--maxSockets=5
But it only dumped the first 100 objects and then emitted an error like this:
Wed, 08 Jun 2016 22:35:30 GMT | starting dump
Wed, 08 Jun 2016 22:35:30 GMT | got 100 objects from source elasticsearch (offset: 0)
Wed, 08 Jun 2016 22:35:30 GMT | sent 100 objects to destination file, wrote 100
Wed, 08 Jun 2016 22:35:31 GMT | Error Emitted => <html>
<head><title>404 Not Found</title></head>
<body bgcolor="white">
<center><h1>404 Not Found</h1></center>
<hr><center>nginx</center>
</body>
</html>
Wed, 08 Jun 2016 22:35:31 GMT | got 0 objects from source elasticsearch (offset: 100)
Wed, 08 Jun 2016 22:35:31 GMT | Total Writes: 100
Wed, 08 Jun 2016 22:35:31 GMT | dump complete
I searched online for a few solutions, such as configuring the max sockets, but it's still not working for me. Any other solutions? Also, my npm version is 0.10.32. Does that matter?
Related
I have an Elasticsearch Docker container from which I want to export data in a human-readable file format for further analysis. For that I have tried to use elasticdump (recommendations for other tools that do the same are also welcome). Exporting the mapping information works, but exporting the data does not.
My command (for data I just switch the type from mapping to data):
elasticdump --input=http://localhost:9200/sessions2-110125 --output=/usr/share/elasticsearch/dump/sessions2-110125.json --type=mapping
For mapping I get:
Tue, 16 Nov 2021 10:25:04 GMT | starting dump
Tue, 16 Nov 2021 10:25:04 GMT | got 1 objects from source elasticsearch (offset: 0)
Tue, 16 Nov 2021 10:25:04 GMT | sent 1 objects to destination file, wrote 1
Tue, 16 Nov 2021 10:25:04 GMT | got 0 objects from source elasticsearch (offset: 1)
Tue, 16 Nov 2021 10:25:04 GMT | Total Writes: 1
Tue, 16 Nov 2021 10:25:04 GMT | dump complete
For data I get:
Tue, 16 Nov 2021 10:24:42 GMT | starting dump
Tue, 16 Nov 2021 10:24:42 GMT | Error Emitted => {"error":"Content-Type header [] is not supported","status":406}
Tue, 16 Nov 2021 10:24:42 GMT | Total Writes: 0
Tue, 16 Nov 2021 10:24:42 GMT | dump ended with error (get phase) => Error: {"error":"Content-Type header [] is not supported","status":406}
I found out that I have to specify the header with the argument
--headers='{"Content-Type": "application/json"}'
but despite adding it, the error message doesn't change.
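For reference, this is the full data command with the header flag added (same index and output path as above):

```
elasticdump \
  --input=http://localhost:9200/sessions2-110125 \
  --output=/usr/share/elasticsearch/dump/sessions2-110125.json \
  --type=data \
  --headers='{"Content-Type": "application/json"}'
```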
I'm new to Elasticsearch, but I really need help solving this problem.
I'm trying to migrate all indices and data from ES 2.1.1 (port 6200) to ES 7.2.1 (port 9200), and when I run the commands below the problem appears.
Can anyone help?
Thanks.
Mac$ elasticdump \
> --input=http://localhost:6200/twitter \
> --output=http://localhost:9200/twitter \
> --type=analyzer
Fri, 24 Jan 2020 20:38:23 GMT | starting dump
Fri, 24 Jan 2020 20:38:24 GMT | got 1 objects from source elasticsearch (offset: 0)
Fri, 24 Jan 2020 20:38:56 GMT | sent 1 objects to destination elasticsearch, wrote 1
Fri, 24 Jan 2020 20:38:56 GMT | got 0 objects from source elasticsearch (offset: 1)
Fri, 24 Jan 2020 20:38:56 GMT | Total Writes: 1
Fri, 24 Jan 2020 20:38:56 GMT | dump complete
Mac$ elasticdump \
> --input=http://localhost:6200/twitter \
> --output=http://localhost:9200/twitter \
> --type=mapping
Fri, 24 Jan 2020 20:39:45 GMT | starting dump
Fri, 24 Jan 2020 20:39:45 GMT | got 1 objects from source elasticsearch (offset: 0)
Fri, 24 Jan 2020 20:39:46 GMT | Error Emitted => {"root_cause":[{"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [display_url]"}],"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [display_url]"}
Fri, 24 Jan 2020 20:39:46 GMT | Error Emitted => {"root_cause":[{"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [display_url]"}],"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [display_url]"}
Fri, 24 Jan 2020 20:39:46 GMT | Total Writes: 0
Fri, 24 Jan 2020 20:39:46 GMT | dump ended with error (get phase) => [object Object]
This is because the field display_url is of type string, a type that was replaced by text in ES 5.0. So you need to replace all occurrences of string with text in your mapping before sending it to ES 7.
You need to do it in two steps. First, dump the mapping to a file:
Mac$ elasticdump \
> --input=http://localhost:6200/twitter \
> --output=twitter-mapping.json \
> --type=mapping
Then change all occurrences of string to text in that file, and then you can send the mapping:
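The replacement step can be scripted. A minimal sketch with sed, using a hypothetical one-field mapping (the real dumped file will contain the full index mapping):

```shell
# Hypothetical one-field mapping as dumped from ES 2.x; the real
# twitter-mapping.json will be much larger.
echo '{"display_url":{"type":"string"}}' > twitter-mapping.json

# Rewrite every legacy "string" type to "text" so ES 7 accepts it;
# -i.bak keeps a backup of the original file. Adjust the pattern if
# your dump has whitespace around the colon.
sed -i.bak 's/"type":"string"/"type":"text"/g' twitter-mapping.json
```

A blanket replacement like this assumes "string" only appears as a type value; if your mapping contains the literal word "string" anywhere else (e.g. in analyzer settings), review the file by hand instead.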
Mac$ elasticdump \
> --input=twitter-mapping.json \
> --output=http://localhost:9200/twitter \
> --type=mapping
Whenever I start to import or export data, I face this error from elasticdump:
starting dump
Mon, 07 Oct 2019 06:15:25 GMT | got 5 objects from source file (offset: 0)
Mon, 07 Oct 2019 06:15:26 GMT | Error Emitted => Cannot read property 'body' of undefined
Mon, 07 Oct 2019 06:15:26 GMT | Total Writes: 0
Mon, 07 Oct 2019 06:15:26 GMT | dump ended with error (set phase) => TypeError: Cannot read property 'body' of undefined
Any help is highly appreciated.
I tried to import data from a JSON file, using a command like this:
elasticdump --input=2016-1-1-2016-7-31-2.json --output=http://localhost:9200/
The file follows this format: https://github.com/taskrabbit/elasticsearch-dump/blob/master/test/seeds.json
My backup file contains a few indices. But when I run the command above, I get a result like this:
Fri, 13 Apr 2018 13:36:44 GMT | starting dump
Fri, 13 Apr 2018 13:36:44 GMT | got 100 objects from source file (offset: 0)
Fri, 13 Apr 2018 13:36:44 GMT | sent 100 objects to destination elasticsearch, wrote 0
Fri, 13 Apr 2018 13:36:44 GMT | got 291 objects from source file (offset: 100)
Fri, 13 Apr 2018 13:36:44 GMT | sent 291 objects to destination elasticsearch, wrote 0
Fri, 13 Apr 2018 13:36:44 GMT | got 292 objects from source file (offset: 391)
Fri, 13 Apr 2018 13:36:45 GMT | sent 292 objects to destination elasticsearch, wrote 0
Fri, 13 Apr 2018 13:36:45 GMT | got 293 objects from source file (offset: 683)
If I set the index name in the URL or via --output-index={INDEX}, all data from the file goes into that index, separated by type.
I will be grateful for the help!
In your elasticdump command line, add the type argument --type=data and an index name, e.g.:
elasticdump --input=2016-1-1-2016-7-31-2.json --output=http://localhost:9200/myIndex --type=data
With Docker you can do the following (note that the input file must be mounted into the container, and that localhost in --output refers to the container itself, not the host; replace it with the host's address, or use --network host on Linux):
docker run --name es-dump --rm -ti \
-v "$PWD":/data \
elasticdump/elasticsearch-dump \
--input=/data/2016-1-1-2016-7-31-2.json \
--output=http://localhost:9200/2016-1-1-2016-7-31-2 \
--type=data \
--limit=10000
I have used:
elasticdump --input=2016-1-1-2016-7-31-2.json --output=http://localhost:9200/my_index
and got:
Fri, 18 Nov 2016 14:51:41 GMT | starting dump
Fri, 18 Nov 2016 14:51:42 GMT | got 1 objects from source file (offset: 0)
Fri, 18 Nov 2016 14:51:42 GMT | sent 1 objects to destination elasticsearch, wrote 0
Fri, 18 Nov 2016 14:51:42 GMT | got 0 objects from source file (offset: 1)
Fri, 18 Nov 2016 14:51:42 GMT | Total Writes: 0
Fri, 18 Nov 2016 14:51:42 GMT | dump complete
but http://localhost:9200/_cat/indices doesn't show anything.
What's wrong here?
btw: I plan to use elasticdump to import a folder of json files.
My JSON file looks like this:
{"response":
{
"status": "OK",
"results":[
{"id": "1"},
{"id": "2"}
]
}
}
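For comparison, elasticdump's data format (the seeds.json format from the elasticsearch-dump repository) is newline-delimited: one document per line, each wrapped in an envelope naming its index. A nested API response like the file above is not unpacked into documents. An illustrative sketch (index, type, and id values here are hypothetical):

```
{"_index":"my_index","_type":"mytype","_id":"1","_source":{"id":"1"}}
{"_index":"my_index","_type":"mytype","_id":"2","_source":{"id":"2"}}
```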