I need logic for date conversion using JSONata

My JSON is like this:
{
  "Payload": {
    "Date": "",
    "Date value": "2018-12-20T00:00:00.000Z"
  }
}
I need logic that works whether the value is in ISO format or in another date format besides the one shown above. I wrote logic to get the epoch time using $toMillis, but it works only for ISO. Please help me resolve this problem.
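One possible approach, as a minimal JSONata sketch: $toMillis handles ISO 8601 directly, and (since JSONata 1.7) also accepts an optional picture string for other layouts. This assumes the non-ISO variant is a plain date such as 2018-12-20, which is my assumption since the sample shows only the ISO form; adjust the picture to whatever format you actually receive:
(
  $d := Payload.`Date value`;
  /* ISO 8601 strings parse directly; otherwise fall back to an
     explicit picture string (assumed plain-date layout) */
  $contains($d, "T")
    ? $toMillis($d)
    : $toMillis($d, "[Y0001]-[M01]-[D01]")
)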

Related

Google Calendar API timeZone attribute

I have a list of time zones that I am using, and I want to pass them to Google Calendar rather than converting them to offsets at runtime. I am already using a match table for these time zones, and I am adding the time zone offset at the end of the "dateTime" attribute for each of them.
CET
Europe/Prague
US/Eastern
PST
EST
US/Pacific
CTT
Asia/Tokyo
CST
Asia/Taipei
Europe/London
Europe/Amsterdam
Europe/Belgrade
America/Montreal
Australia/Melbourne
Europe/Oslo
Europe/Berlin
Europe/Zurich
Asia/Novosibirsk
Asia/Hong_Kong
Asia/Shanghai
ROK
Asia/Jerusalem
America/Fortaleza
IST
America/Belem
America/New_York
The problem is that this list will only continue to grow. What I am asking is whether there is a way for Google Calendar to interpret these time zone names as offsets from UTC. Here is a sample of the JSON I use:
{
  "Event" : {
    "description" : "Out of office. Reason: Doctor_visit. Status: APPROVED.",
    "end" : {
      "dateTime" : "2016-11-16T09:00:00+0200",
      "timeZone" : "Europe/Prague"
    },
    "start" : {
      "dateTime" : "2016-11-16T12:00:00+0200",
      "timeZone" : "Europe/Prague"
    },
    "summary" : "Out of office"
  },
  "parameters" : {
    "calendarId" : "xxx#group.calendar.google.com"
  }
}
Instead of the "+0200" offset, I am asking if I can pass in "timeZone" : "Europe/Prague" to be interpreted somehow. I have tried putting an offset of "+0000" and passing in the timezone name to the "timeZone" attribute, or passing in "+0000" and a "timeZone" attribute interpreted as "UTC+2:00", but the only way it seems to work for me is if I interpret the timezone at runtime and add an offset at the end of the "dateTime" attribute in the format above.
I am using Dell Boomi to communicate with the calendar API. Any help is greatly appreciated :)
Google Calendar's time zones are internal; you can't change them. If you want to add an event, then as stated in the documentation for events.insert:
The time, as a combined date-time value (formatted according to RFC3339). A time zone offset is required unless a time zone is explicitly specified in timeZone.
You must convert it to the correct format when you insert it. You may want to consider adding this as a feature request on the issue forum.
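Note that the quoted documentation implies the offset can be dropped when a time zone is given explicitly, so a start/end block along these lines should work (a sketch based solely on the quoted wording, untested):
"start" : {
  "dateTime" : "2016-11-16T12:00:00",
  "timeZone" : "Europe/Prague"
}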

UpdateRecord processor returns empty string: converting string date to long in NiFi

I am converting a string date (in the format shown in the sample below) to a number (long), but the output I get is an empty string.
I am using a JSON reader and writer, where in the input JSON the field is a string and in the output JSON it is of type long.
I also tried keeping the output JSON type as a string and evaluating the following expression, but that also produced an empty string:
${DATE1.value:toDate('yyyy-MM-dd HH:mm:ss'):toNumber():toString()}
Sample data I am trying to convert: {"DATE1" : "2018-01-17 00:00:00"}
I tried to follow the solution at this link but am still getting an empty string.
Method 1: Referring to the contents of the flowfile:
If you want to change the DATE1 value based on the field value from the content, then you need to refer to it as field.value.
Replacement Value Strategy: Literal Value
/DATE1: ${field.value:toDate('yyyy-MM-dd HH:mm:ss'):toNumber()}
This refers to the DATE1 value from the content, then applies the expression language to it.
Avro Schema Registry:
{
  "namespace": "nifi",
  "name": "balances",
  "type": "record",
  "fields": [
    { "name": "DATE1", "type": "string" }
  ]
}
This reads the DATE1 field value as a string from the content.
JsonRecordSetWriter:
{
  "namespace": "nifi",
  "name": "balances",
  "type": "record",
  "fields": [
    { "name": "DATE1", "type": "long" }
  ]
}
In the RecordSetWriter, configure DATE1 as long type.
Input:-
{"DATE1":"2018-01-17 00:00:00"}
Output:-
[{"DATE1":1516165200000}]
(or)
Method 2: Referring to an attribute of the flowfile:
If you have DATE1 as an attribute of the flowfile with the value 2018-01-17 00:00:00, use the DATE1 attribute instead of field.value (which refers to the contents of the flowfile).
Then the UpdateRecord configs would be:
Replacement Value Strategy: Literal Value
/DATE1: ${DATE1:toDate('yyyy-MM-dd HH:mm:ss'):toNumber()}
In this expression we are using the DATE1 attribute to update the contents of the flowfile.
Both methods will produce the same output.
Check the value before converting it to a date, using isEmpty() and ifElse():
${field.value:isEmpty():ifElse('', ${field.value:toDate('yyyy-MM-dd HH:mm:ss'):toNumber()})}

Access epoch from date field in groovy script in elastic search

My original question is here:
https://discuss.elastic.co/t/access-the-epoch-of-the-date-type-doc-in-groovy-script/53129
I need to access the stored millis (as in the index) of a date field in a Groovy script. Is this possible?
Original question:
As per my understanding, and from "understanding how elasticsearch stores dates internally", Elasticsearch stores dates internally in epoch format. Now consider that I need to access this epoch in a Groovy script, and our doc date format is date_optional_time. When I try to access it in the Groovy script, it gives me the formatted date (as at the time of input). Is there a way to access the epoch time here?
I have come up with three thoughts:
1) Convert the doc value to a date and get the millis in the script.
2) Create a new field with copy_to that stores the date in epoch format.
3) Or, if possible, directly access the epoch. But how?
Can somebody guide me on this? I need the epoch because I need to update another field based on it. For example, consider a mapping like this:
{
  "createdDate": {
    "type": "date",
    "store": true,
    "format": "dateOptionalTime"
  },
  "modifiedDate": {
    "type": "date",
    "store": true,
    "format": "dateOptionalTime"
  },
  "daysINBetween": {
    "type": "long"
  }
}
Now I need to run a script that stores (createdDate.millis - modifiedDate.millis) / (24 * 60 * 60 * 1000). I don't want to create a new date object each time; that's why I am trying to access the epoch in the script.
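A minimal Groovy sketch along the lines of thoughts 1) and 3), assuming an ES 1.x/2.x script field where doc values are available; on a date field, doc['field'].value already comes back as epoch millis, so no date object needs to be constructed:
// script_fields script (Groovy): doc values for a date field
// are stored as epoch millis and returned as a long
def created  = doc['createdDate'].value    // epoch millis
def modified = doc['modifiedDate'].value   // epoch millis
return (created - modified) / (24L * 60 * 60 * 1000)
Bear in mind that doc[...] is only available at search time; an update script sees ctx._source, which holds the formatted string rather than the epoch.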

Elasticsearch is giving error with date on bulk insert

I am trying to insert records into Elasticsearch using the bulk API and I am getting the error below:
"error": "MapperParsingException[failed to parse [created_date]]; nested: MapperParsingException[failed to parse date field [2015-07-18 13:00:22], tried both date format [dateOptionalTime], and timestamp number with locale []]; nested: IllegalArgumentException[Invalid format: \"2015-07-18 13:00:22\" is malformed at \" 13:00:22\"]; "
while I am passing the date below:
"created_date":"2015-07-18 13:00:22"
and the mapping below is used:
"created_date": {
"format": "yyyy-MM-DD HH:mm:ss",
"type": "date"
},
I can see that the date is correct and the mapping is also correct; the error occurs only for this particular record, and other records are inserted successfully. What could be the reason?
I suspect your mapping has not been applied to the field you are expecting.
The log says tried both date format [dateOptionalTime], and timestamp number with locale [].
It does not say that it tried yyyy-MM-DD HH:mm:ss.
Maybe your created_date is actually another created_date field?
use "created_date":"2015-07-18T13:00:22"
It may help You
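As an aside, even once the mapping is applied there is a subtle bug in it: in Joda-style date patterns DD is day-of-year, while dd is day-of-month. A corrected mapping would look like this:
"created_date": {
  "format": "yyyy-MM-dd HH:mm:ss",
  "type": "date"
}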

Logstash inserting dates as strings instead of dateOptionalTime

I have an Elasticsearch index with the following mapping:
"pickup_datetime": {
"type": "date",
"format": "dateOptionalTime"
}
Here is an example of a date contained in the file that is being read in
"pickup_datetime": "2013-01-07 06:08:51"
I am using Logstash to read and insert data into ES, with the following lines attempting to convert the date string into the date type:
date {
match => [ "pickup_datetime", "yyyy-MM-dd HH:mm:ss" ]
target => "pickup_datetime"
}
But the match never seems to occur.
What am I doing wrong?
It turns out the date filter was placed before the csv filter, where the columns get named; hence the date filter was not finding the pickup_datetime column, since it had not yet been named.
It might be a good idea for the documentation to clearly state that filters are applied sequentially, to avoid others running into similar problems in the future.
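In other words, the csv filter has to come first so the column exists by the time the date filter runs. A sketch of the corrected ordering (the column list here is illustrative, not the asker's actual schema):
filter {
  csv {
    # name the columns first; pickup_datetime must exist
    # before the date filter can match on it
    columns => ["pickup_datetime", "dropoff_datetime"]
  }
  date {
    match  => [ "pickup_datetime", "yyyy-MM-dd HH:mm:ss" ]
    target => "pickup_datetime"
  }
}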
