I am unable to migrate a date field in Elasticsearch 6. An example of my date values is:
Fri, 21 Apr 2017 01:58:20 GMT
I have tried this without success:
"date": {
"type": "date",
"format": "E, d MMM Y H:m:s z"
}
Please help.
I'm trying to build a grid with months as columns using WebDataRocks, and the problem is that the columns are sorted alphabetically (Apr 2020, Aug 2020, Dec 2020, ...). Is there an option to order the columns by date (Dec 2020, Nov 2020, Oct 2020, ...)?
An example is available here:
https://codesandbox.io/s/nifty-stonebraker-7mf56?file=/src/App.tsx
This is possible by adding an object to your data that defines the data types. Here is an explanation.
In your case, this object would look like this:
{
"CONTRACT": {
"type": "string"
},
"value": {
"type": "number"
},
"date": {
"type": "date string"
},
"name": {
"type": "string"
}
}, {
type: "CONTRACT",
value: 217,
date: "Dec 2020",
name: "24"
}, {
type: "CONTRACT",
value: 725.84,
date: "Dec 2020",
name: "3 "
}, ...
After this, the columns should be ordered by date. Note that the input dates should be formatted properly (compliant with ISO 8601).
The way dates are shown inside WebDataRocks can be modified with the help of the datePattern property from options.
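For reference, here is a minimal sketch of how the type-definition object and datePattern might be wired together; the container id #wdr-component, the data variable, and the "MMM yyyy" pattern are assumptions for illustration, not taken from your sandbox:
var pivot = new WebDataRocks({
  container: "#wdr-component", // assumed container id
  report: {
    dataSource: {
      data: data // array that starts with the type-definition object shown above
    },
    options: {
      datePattern: "MMM yyyy" // assumed pattern; controls how date values are displayed
    }
  }
});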
Given a table called alerts in a database called database, where each document has an array of objects called history, each with a date attribute, how can I pluck based on a date range on that date attribute?
With the following query:
r.db("database").table("alerts").pluck("history").limit(10000)
I get back something like the following:
{
"history": [
{
"text": "text1" ,
"updateTime": Thu Jun 20 2019 01:29:47 GMT+00:00 ,
},
{
"text": "text2" ,
"updateTime": Thu Jun 20 2019 01:24:59 GMT+00:00 ,
},
]
}
{
"history": [
{
"text": "text3" ,
"updateTime": Thu Jun 20 2018 01:29:47 GMT+00:00 ,
},
{
"text": "text4" ,
"updateTime": Thu Jun 20 2018 01:24:59 GMT+00:00 ,
},
]
}
How can I pluck the sub-object called history and only return history entries that fall in a specific range on the updateTime attribute, for example between jan/2/2009 and jan/3/2009?
You need to filter based on a time range and use pluck on a nested object. Here are some examples of how to do that from the official documentation:
r.table("users").filter(function (user) {
return user("subscriptionDate").during(
r.time(2012, 1, 1, 'Z'), r.time(2013, 1, 1, 'Z'));
}).run(conn, callback);
Source: https://www.rethinkdb.com/api/javascript/filter/
r.table('marvel').pluck({'abilities' : {'damage' : true, 'mana_cost' : true}, 'weapons' : true}).run(conn, callback)
Source: https://www.rethinkdb.com/api/javascript/pluck/
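Putting those two ideas together for your alerts table, an untested sketch (with the Jan 2009 range from your example hard-coded) could look something like this:
r.db("database").table("alerts")
  .merge(function(alert) {
    return {
      // keep only the history entries whose updateTime falls in the range
      history: alert("history").filter(function(h) {
        return h("updateTime").during(
          r.time(2009, 1, 2, 'Z'), r.time(2009, 1, 3, 'Z'));
      })
    };
  })
  .pluck("history")
  .limit(10000)
  .run(conn, callback);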
I have specified the mapping info in the template as below:
"createdDate": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis||EEE MMM dd HH:mm:ss Z YYYY||dd MMM yyyy HH:mm:ss zzz"
},
"expirationDate": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis||EEE MMM dd HH:mm:ss Z YYYY||dd MMM yyyy HH:mm:ss zzz"
}
And when I tried to insert a doc into the index as follows, I got an error.
PUT my_index/MyDocType/5c7ab034-5de3-4401-8b46-7f8158618b68
{
"uuid": "5c7ab034-5de3-4401-8b46-7f8158618b68",
"createdDate": "15 Apr 2019 14:10:10 EDT",
"expirationDate": "10 Oct 2019 00:00:00 EDT"
}
Here is the error I am getting:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failed to parse [createdDate]"
}
],
"type": "mapper_parsing_exception",
"reason": "failed to parse [createdDate]",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Invalid format: \"15 Apr 2019 14:10:10 EDT\" is malformed at \" Apr 2019 14:10:10 EDT\""
}
},
"status": 400
}
I can parse the string values using the following without any issues. Not sure why Elasticsearch is complaining while parsing the month part of the date string!
SimpleDateFormat sdf = new SimpleDateFormat("dd MMM yyyy HH:mm:ss zzz");
As mentioned in this link, below is what z and Z mean:
Symbol  Meaning               Presentation  Examples
z       time zone             text          Pacific Standard Time; PST
Z       time zone offset/id   zone          -0800; -08:00; America/Los_Angeles
It appears that your format is not a valid Joda DateTimeFormat pattern.
Adding the correct format dd MMM yyyy HH:mm:ss z to your mapping, as shown below, should solve your issue.
{
"createdDate":{
"type":"date",
"format":"yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis||EEE MMM dd HH:mm:ss Z YYYY||dd MMM yyyy HH:mm:ss zzz||dd MMM yyyy HH:mm:ss z"
}
}
Hope it helps!
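If you want to sanity-check a format before changing the real template, one option is to try it against a throwaway index first; the index name format_test below is just an example:
PUT format_test
{
  "mappings": {
    "MyDocType": {
      "properties": {
        "createdDate": {
          "type": "date",
          "format": "dd MMM yyyy HH:mm:ss z"
        }
      }
    }
  }
}

PUT format_test/MyDocType/1
{
  "createdDate": "15 Apr 2019 14:10:10 EDT"
}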
I need to make a query that returns a list of Incidents with their nested events ordered DESC by their startedAt and timestamp dates. By default ReQL returns the dates in ASC order. I've got the following structure:
{
"id": "87e14db8-1e15-4718-baac-f1c785e985cb" ,
"title": "Connection Error"
"startedAt": Mon Oct 26 2015 14:33:00 GMT+00:00 ,
"events": [{
"message": "Cannot connect to theserver.com",
"timestamp": Mon Oct 26 2015 14:33:00 GMT+00:00
},{
"message": "Cannot connect to theserver.com,"
"timestamp": Mon Oct 26 2015 14:33:20 GMT+00:00
},{
"message": "Cannot connect to theserver.com",
"timestamp": Mon Oct 26 2015 14:33:40 GMT+00:00
}]
},{
"id": "87e14db8-1e15-4718-baac-f1c785e985cb" ,
"title": "Other Connection Error"
"startedAt": Mon Oct 26 2015 14:34:20 GMT+00:00 ,
"events": [{
"message": "Connection rejected",
"timestamp": Mon Oct 26 2015 14:34:20 GMT+00:00
},{
"message": "Connection rejected",
"timestamp": Mon Oct 26 2015 14:34:41 GMT+00:00
}]
},{
... (several more)
}
If I run r.db('mydb').table('Incident').orderBy(r.desc('createdAt')), the Incidents are ordered by createdAt as expected, but the nested events are still ordered ASC.
How can I make a query that returns the nested events in DESC order by timestamp?
Something like this should do it:
r.table('Incident').orderBy(r.desc('createdAt')).merge(function(row) {
return {events: row('events').orderBy(r.desc('timestamp'))};
})
I think this is what you're looking for. It just took a little wizardry with the .map(...) method.
r.db("test").table("stackoverflow").orderBy(r.desc('startedAt')).map(function(d){
return {
"startedAt":d("startedAt"),
"title": d("title"),
"id": d("id"),
"events": d("events").orderBy(r.desc("timestamp"))
}
})
I'm enjoying learning ReQL so far, but I stumbled upon a problem.
This is the data I have stored in a table called events:
[{
"date": "Tue Mar 17 2015 00:00:00 GMT+00:00" ,
"id": "00dacebd-b27e-49b5-be4b-42c2578db4bb" ,
"event_name": "View page" ,
"total": 4 ,
"unique": 4
},
{
"date": "Mon Mar 16 2015 00:00:00 GMT+00:00" ,
"id": "09ac3579-960b-4a2b-95be-8e018d683494" ,
"event_name": "View page" ,
"total": 68 ,
"unique": 35
},
{
"date": "Tue Mar 17 2015 00:00:00 GMT+00:00" ,
"id": "0bb01050-e93d-4845-94aa-b86b1198338d" ,
"event_name": "Click" ,
"total": 17 ,
"unique": 8
},
{
"date": "Mon Mar 16 2015 00:00:00 GMT+00:00" ,
"id": "174dcf3e-7c77-47b6-a05d-b875c9f7e563" ,
"event_name": "Click" ,
"total": 113 ,
"unique": 35
}]
And I would like the end result to look like this:
[{
"date": "Mon Mar 16 2015 00:00:00 GMT+00:00",
"Click": 113,
"View Page": 68
},
{
"date": "Tue Mar 17 2015 00:00:00 GMT+00:00",
"Click": 17,
"View Page": 4
}]
The closest I got was with this query:
r.table("events").orderBy({index: r.desc('date')}).group('date').map(function(event) {
return r.object(event('event_name'), event('unique'));
}).reduce(function(a, b) {
return a.merge(b.keys().map(function(key) {
return [key, a(key).default(0).add(b(key))];}).coerceTo('object'));
})
The result is:
[{
"date": "Mon Mar 16 2015 00:00:00 GMT+00:00",
"reduction": {
"Click": 113,
"View page": 68
}
},
{
"group": "Tue Mar 17 2015 00:00:00 GMT+00:00",
"reduction": {
"Click": 17,
"View page": 4
}
}]
However, as you can see, the events are nested under reduction, and changefeeds won't work on this query either :(
Can anyone point me in the right direction?
Cheers,
You can change the group/reduction format like this:
query.ungroup().map(function(row){
return r.expr({date: row('group')}).merge(row('reduction'));
})
Unfortunately changefeeds on aggregations aren't supported as of 1.6, but that should be possible in 2.2 or 2.3 (so in a few months).
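For completeness, plugging that into the query from your question gives something along these lines; note it uses event_name, which is the field name in your sample data (an untested sketch):
r.table("events")
  .group('date')
  .map(function(event) {
    // one single-key object per row: {<event_name>: <unique count>}
    return r.object(event('event_name'), event('unique'));
  })
  .reduce(function(a, b) {
    // merge the per-row objects, summing values for duplicate keys
    return a.merge(b.keys().map(function(key) {
      return [key, a(key).default(0).add(b(key))];
    }).coerceTo('object'));
  })
  .ungroup()
  .map(function(row) {
    // flatten {group, reduction} into a single object keyed by date
    return r.expr({date: row('group')}).merge(row('reduction'));
  })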