Google Calendar API timeZone attribute

I have a list of time zones that I want to pass to Google Calendar instead of converting them to offsets at runtime. At the moment I use a match table for these time zones and append the resolved offset to the end of the "dateTime" attribute. The time zones are:
CET
Europe/Prague
US/Eastern
PST
EST
US/Pacific
CTT
Asia/Tokyo
CST
Asia/Taipei
Europe/London
Europe/Amsterdam
Europe/Belgrade
America/Montreal
Australia/Melbourne
Europe/Oslo
Europe/Berlin
Europe/Zurich
Asia/Novosibirsk
Asia/Hong_Kong
Asia/Shanghai
ROK
Asia/Jerusalem
America/Fortaleza
IST
America/Belem
America/New_York
The problem is that this list will only continue to grow. What I am asking is whether there is a way for Google Calendar to interpret these time zone names as UTC offsets itself. Here is a sample of the JSON I use:
{
  "Event" : {
    "description" : "Out of office. Reason: Doctor_visit. Status: APPROVED.",
    "end" : {
      "dateTime" : "2016-11-16T09:00:00+0200",
      "timeZone" : "Europe/Prague"
    },
    "start" : {
      "dateTime" : "2016-11-16T12:00:00+0200",
      "timeZone" : "Europe/Prague"
    },
    "summary" : "Out of office"
  },
  "parameters" : {
    "calendarId" : "xxx#group.calendar.google.com"
  }
}
Instead of the "+0200" offset, can I pass "timeZone" : "Europe/Prague" and have it interpreted for me? I have tried an offset of "+0000" together with the time zone name in the "timeZone" attribute, and also "+0000" with a "timeZone" of "UTC+2:00", but the only way that works for me is resolving the time zone at runtime and appending the offset to the "dateTime" attribute in the format above.
I am using Dell Boomi to communicate with the Calendar API. Any help is greatly appreciated :)

Google Calendar's time zones are internal; you can't change them. If you want to add an event, then as stated in the documentation for events.insert:
The time, as a combined date-time value (formatted according to RFC3339). A time zone offset is required unless a time zone is explicitly specified in timeZone.
You must convert the value to the correct format when you insert it. You may want to consider raising this as a feature request on the issue forum.
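For what it's worth, the runtime conversion itself is small if your tool allows a scripting step. Here is a minimal sketch using Python's standard zoneinfo module (an illustration only, not Boomi-specific; note that short IDs in the list above such as CTT are Java-style abbreviations rather than IANA names and would need mapping first):

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, uses the IANA tz database

def to_rfc3339(local_time, tz_name):
    # Attach the named zone to the naive local time, then emit an
    # RFC3339 string carrying the correct numeric offset for that date.
    return datetime.fromisoformat(local_time).replace(tzinfo=ZoneInfo(tz_name)).isoformat()

print(to_rfc3339("2016-11-16T09:00:00", "Europe/Prague"))
# -> 2016-11-16T09:00:00+01:00 (CET; the +0200 in the sample would be the summer offset)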

Related

I need logic for date conversion using JSONata

My JSON is like this:
{
  "Payload" : {
    "Date" : "",
    "Date value" : "2018-12-20T00:00:00.000Z"
  }
}
I need logic that handles both cases: the ISO format of the date above, and epoch time. I wrote this logic for epoch time using $toMillis, but it only works for ISO input. Please help me resolve this problem.
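No answer was posted here, but the branching being asked for (detect ISO 8601 vs. epoch milliseconds, then normalize) can be sketched in Python for illustration; in JSONata the same split would hinge on testing the value's type before calling $toMillis:

from datetime import datetime

def to_millis(value):
    # Accept either epoch milliseconds (a number) or an ISO 8601 string,
    # and normalize both to epoch milliseconds.
    if isinstance(value, (int, float)):
        return int(value)
    dt = datetime.fromisoformat(str(value).replace("Z", "+00:00"))
    return int(dt.timestamp() * 1000)

print(to_millis("2018-12-20T00:00:00.000Z"))  # -> 1545264000000
print(to_millis(1545264000000))               # -> 1545264000000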

What does now/d mean in Elasticsearch?

What exactly is now-1d/d or now/d in Elasticsearch? Below is an example query:
GET /_search
{
  "query": {
    "range" : {
      "timestamp" : {
        "gte" : "now-1d/d",
        "lt" : "now/d"
      }
    }
  }
}
It takes the current timestamp (the time when your query reaches Elasticsearch), subtracts one day, and returns the documents in that range.
These kinds of queries are useful when you don't want to specify an exact time and just want the data of the last 1 day, 3 days, 7 days, 1 month, etc.
As mentioned in the official docs for the range query:
now is always the current system time in UTC.
Taking the examples from the official date math docs, assuming now is 2001-01-01 12:00:00:
now+1h: now in milliseconds plus one hour. Resolves to 2001-01-01 13:00:00
now-1h: now in milliseconds minus one hour. Resolves to 2001-01-01 11:00:00
now-1h/d: now in milliseconds minus one hour, rounded down to UTC 00:00. Resolves to 2001-01-01 00:00:00
2001.02.01||+1M/d: 2001-02-01 in milliseconds plus one month. Resolves to 2001-03-01 00:00:00
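The /d suffix is just a floor to midnight UTC. Sketched in Python, the range from the query above works out to:

from datetime import datetime, timedelta, timezone

# Pretend "now" is the moment from the examples above.
now = datetime(2001, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

midnight = dict(hour=0, minute=0, second=0, microsecond=0)
gte = (now - timedelta(days=1)).replace(**midnight)  # now-1d/d
lt = now.replace(**midnight)                         # now/d

print(gte.isoformat())  # 2000-12-31T00:00:00+00:00
print(lt.isoformat())   # 2001-01-01T00:00:00+00:00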

Date format conversion in Logstash ELK stack

I have a date column in my table that I fetch using the jdbc input in Logstash. The problem is that Logstash sends a wrong value to the Elasticsearch stack.
For example, if I have the date start_date="2018-03-01", in Elasticsearch I get the value "2018-02-28 23:00:00.000".
What I want is to keep the format of start_date, or at least to send the value "2018-03-01 00:00:00.000" to Elasticsearch.
I tried to use this filter:
date {
  timezone => "UTC"
  match => ["start_date", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
}
but it didn't work.
That is because you are trying to convert it to the UTC time zone. You need to change your configuration like this:
date {
  match => ["start_date", "yyyy-MM-dd"]
}
This should be enough to parse your date.
Let me know if that works.
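To see where the one-hour shift comes from, the conversion can be reproduced in Python (the UTC+1 zone is an assumption for illustration; the question doesn't name one):

from datetime import datetime
from zoneinfo import ZoneInfo

# Assumption: the pipeline runs in a UTC+1 zone such as Europe/Paris,
# so local midnight on 2018-03-01 lands an hour earlier in UTC.
local_midnight = datetime(2018, 3, 1, tzinfo=ZoneInfo("Europe/Paris"))
print(local_midnight.astimezone(ZoneInfo("UTC")))
# -> 2018-02-28 23:00:00+00:00, exactly the "wrong" value from the question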

How to add new timestamp field on logstash?

I'm parsing a log that I previously loaded on my localhost, and I would like to get the event date field of each row as a timestamp, but Kibana only reads it as a string.
Example:
I have this event
2016/09/27 13:33:49.701 GMT(09/27 15:33:49 +0200) INFO BILLINGGW ConvergysDelegate.getCustomer(): Calling getCustomerFromCache: 0001:606523
It was loaded on September 27th 2016, 16:04:53.222, but the logdate field (the event date) is: 2016/09/27 13:33:49.701.
On logstash filter I defined:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
I also tried:
(?<logdate>%{YEAR:year}/%{MONTHNUM:month}/%{MONTHDAY:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
And Kibana reads logdate as a string. How can I get Kibana to read it as a timestamp?
I tried with the date alone:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day})
and Kibana interpreted it properly as a timestamp, but the problem is how to correctly add the hours, minutes and seconds to the logdate field.
Could anyone help me?
Best regards.
You'll have to convert from a string to a timestamp, using the date filter.
date {
  match => [ "logdate", "yyyy/MM/dd HH:mm:ss" ]
}
This will attempt to parse the logdate field with this date pattern yyyy/MM/dd HH:mm:ss and, if successful, will replace the #timestamp field with the result. You can specify another field for the parsed date with the target option.
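One caveat: the grok SECOND pattern also captures the fractional part, so logdate will arrive as 2016/09/27 13:33:49.701, and the match pattern may need the milliseconds as well, i.e. "yyyy/MM/dd HH:mm:ss.SSS" (an assumption worth testing against your data). A quick Python sanity check of that millisecond-aware format:

from datetime import datetime

# The logdate the grok pattern would capture from the sample event, fraction included:
logdate = "2016/09/27 13:33:49.701"

# strptime equivalent of the pattern "yyyy/MM/dd HH:mm:ss.SSS":
parsed = datetime.strptime(logdate, "%Y/%m/%d %H:%M:%S.%f")
print(parsed.isoformat())  # -> 2016-09-27T13:33:49.701000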

Dynamic time zone offset in elasticsearch aggregation?

I'm aggregating documents that each have a timestamp. The timestamp is UTC, but the documents each also have a local time zone ("timezone": "America/Los_Angeles") that can be different across documents.
I'm trying to do a date_histogram aggregation based on local time, not UTC or a fixed time zone (e.g., using the option "time_zone": "America/Los_Angeles").
How can I convert the timezone for each document to its local time before the aggregation?
Here's the simple aggregation:
{
  "aggs": {
    "date": {
      "date_histogram": {
        "field": "created_timestamp",
        "interval": "day"
      }
    }
  }
}
I'm not sure if I fully understand it, but it seems like the time_zone property is meant for exactly that:
The zone value accepts either a numeric value for the hours offset, for example: "time_zone" : -2. It also accepts a format of hours and minutes, like "time_zone" : "-02:30". Another option is to provide a time zone accepted as one of the values listed here.
If you store another field that holds the local time without timezone information, it should work.
Take every timestamp you have (which is in UTC), convert it to a date in the local timezone (this will contain the timezone information). Now simply drop the timezone information from this datetime. Now you can perform actions on this new field.
Suppose you start with this time in UTC:
'2016-07-17T01:33:52.412Z'
Now, suppose you're in PDT you can convert it to:
'2016-07-16T18:33:52.412-07:00'
Now, hack off the end so you end up with:
'2016-07-16T18:33:52.412Z'
Now you can operate on this field.
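A sketch of that per-document conversion in Python, using the timezone field each document already stores (field names as in the question; this would run before indexing):

from datetime import datetime
from zoneinfo import ZoneInfo

def local_wall_clock(utc_iso, tz_name):
    # Parse the UTC timestamp, shift it into the document's own zone,
    # then drop the offset so only the local wall-clock time remains.
    utc = datetime.fromisoformat(utc_iso.replace("Z", "+00:00"))
    return utc.astimezone(ZoneInfo(tz_name)).replace(tzinfo=None).isoformat()

print(local_wall_clock("2016-07-17T01:33:52.412Z", "America/Los_Angeles"))
# -> 2016-07-16T18:33:52.412000, matching the worked example above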
