Trying to build a grid with months as columns using WebDataRocks, and the problem is that the columns are sorted alphabetically (Apr 2020, Aug 2020, Dec 2020, ...). Is there an option to order the columns by date (Dec 2020, Nov 2020, Oct 2020, ...)?
An example is available here: https://codesandbox.io/s/nifty-stonebraker-7mf56?file=/src/App.tsx
This is possible by adding an object to your data that defines the data types, as explained in the WebDataRocks documentation. In your case, this object would look like this:
{
  "type": {
    "type": "string"
  },
  "value": {
    "type": "number"
  },
  "date": {
    "type": "date string"
  },
  "name": {
    "type": "string"
  }
}, {
  "type": "CONTRACT",
  "value": 217,
  "date": "2020-12-01",
  "name": "24"
}, {
  "type": "CONTRACT",
  "value": 725.84,
  "date": "2020-12-01",
  "name": "3"
}, ...
After this, the columns should be ordered by date. Note that the input dates must be formatted properly (compliant with ISO 8601), which is why the example above uses "2020-12-01" rather than "Dec 2020".
The way dates are displayed inside WebDataRocks can then be changed with the datePattern option.
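For instance, a minimal sketch (the container id, the data variable, and the "MMM yyyy" pattern token are assumptions worth checking against the WebDataRocks docs) that feeds in the typed data and renders the dates back as month names:
var pivot = new WebDataRocks({
  container: "#wdr-component", // assumed container element
  toolbar: true,
  report: {
    dataSource: {
      data: data // the array shown above, with the types object first
    },
    options: {
      datePattern: "MMM yyyy" // display "2020-12-01" as "Dec 2020"
    }
  }
});
With this in place, WebDataRocks sorts the date field chronologically while still labeling the columns with month names.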
I am looking at the timestamp data for Logstash and it seems to be off by 4 hours. Likewise, during ingestion I have a datetime in the format yyyyMMdd HH:mm which is local to EST (New York) but is being conveyed as off by the same 4 hours.
I am not sure how Logstash determines the current time, but I was thinking it might be specific to the host machine. Looking at my machine, running date returns Mon Oct 19 17:32:25 UTC 2020, which is a 4-hour difference from my current time (13:32), but the machine's clock is accurate.
What I am thinking is that somehow there is a misinterpretation of the @timestamp field on this Logstash machine. My most recently ingested object shows Oct 19, 2020 @ 09:33:00.000, which is 4 hours off.
I presumed that the timestamp is set in Logstash and not in Elasticsearch, but I can see that somehow there may be some sort of misinterpretation.
I am currently using the most up-to-date Docker containers, which are all 7.9.2. The ingested data's timestamp is incorrect, and likewise I noticed that some data is being ingested in the above format but has no datetime set to adjust.
My desired end goal is to fix this discrepancy and then index the data on the timestamp reported, not on the time of the curl request.
Ingested Data:
// http://realtime.portauthority.org/bustime/api/v3/getvehicles?key=hC5Di7VSYU3hjmw2gAqHtKdec&rt=65,67,69,7,71,71A,71B,71C,71D,74&format=json
{
  "bustime-response": {
    "vehicle": [
      {
        "vid": "6141",
        "rtpidatafeed": "Port Authority Bus",
        "tmstmp": "20201019 11:53",
        "lat": "40.45320129394531",
        "lon": "-79.7513656616211",
        "hdg": "176",
        "pid": 7788,
        "rt": "67",
        "des": "Downtown",
        "pdist": 0,
        "dly": false,
        "spd": 0,
        "tatripid": "9333",
        "origtatripno": "11348066",
        "tablockid": "067 -066",
        "zone": "",
        "mode": 0,
        "psgld": "HALF_EMPTY"
      }
    ],
    "error": [
      {
        "rt": "65",
        "msg": "No data found for parameter"
      },
      {
        "rt": "7",
        "msg": "No data found for parameter"
      }
    ]
  }
}
JSON Entry from Kibana:
{
  "_index": "transit-pittsburgh-2020.10.19",
  "_type": "_doc",
  "_id": "y60WQnUBgX7z6iMwvAaJ",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-10-19T14:19:00.000Z",
    "bustime-response": {
      "error": [
        {
          "msg": "No data found for parameter",
          "rt": "65"
        },
        {
          "msg": "No data found for parameter",
          "rt": "7"
        },
        {
          "msg": "No data found for parameter",
          "rt": "71"
        }
      ],
      "vehicle": {
        "rtpidatafeed": "Port Authority Bus",
        "pdist": 72453,
        "tablockid": "067 -066",
        "hdg": "66",
        "vid": "6141",
        "lat": "40.433110918317524",
        "rt": "67",
        "dly": false,
        "origtatripno": "11348056",
        "bk_tmstmp": "20201019 14:19",
        "tatripid": "9249",
        "mode": 0,
        "tmstmp": "20201019T14:19",
        "pid": 7294,
        "psgld": "FULL",
        "lon": "-79.7984379359654",
        "spd": 20,
        "zone": "",
        "geo_location": "40.433110918317524,-79.7984379359654",
        "des": "CCAC Boyce"
      }
    },
    "@version": "1"
  },
  "fields": {
    "@timestamp": [
      "2020-10-19T14:19:00.000Z"
    ],
    "bustime-response.vehicle.tmstmp": [
      "2020-10-19T14:19:00.000Z"
    ]
  },
  "sort": [
    1603117140000
  ]
}
One thing I did notice was that the date conversion for bustime-response.vehicle.tmstmp is creating an ISO date as UTC, when the ingested date was a plain local yyyyMMdd HH:mm value that I need interpreted in the EST (New York) time zone.
If I understood correctly, you are using the date filter with the field tmstmp to create the @timestamp field.
The format yyyyMMdd HH:mm of the tmstmp field does not carry any information about the offset from UTC, so if you simply use the date filter on this field without specifying that the time has an offset, it will be treated as UTC.
Using your example of 20201019 11:53:
date {
  match => ["tmstmp", "yyyyMMdd HH:mm"]
}
Logstash will create the @timestamp field as 2020-10-19T11:53:00Z; in your time zone (UTC-4) that instant is 07:53 local time, which is wrong.
You need to tell logstash that your original time field is in a different timezone from UTC.
date {
  match => ["tmstmp", "yyyyMMdd HH:mm"]
  timezone => "America/New_York"
}
This way the @timestamp field will be created with the value 2020-10-19T15:53:00Z, which is the UTC time corresponding to your local time of 11:53.
You can also use timezone => "-0400", but the named zone is preferable because it tracks daylight saving time automatically.
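For completeness, a sketch of the full filter against the nested field shown in the Kibana document above; the exact field path depends on how your pipeline splits the vehicle array, so treat it as an assumption:
filter {
  date {
    # bracket syntax references the nested field produced by the JSON input;
    # adjust the path if your pipeline flattens or splits the vehicle array differently
    match => ["[bustime-response][vehicle][tmstmp]", "yyyyMMdd HH:mm"]
    timezone => "America/New_York"
    # @timestamp is the default target; shown explicitly for clarity
    target => "@timestamp"
  }
}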
So my dateFormat is "yyyy-MM-dd HH:mm:ss", which I think is causing my data points to be rounded to the day.
var chart = am4core.createFromConfig({
  // Date axis
  "xAxes": [{
    "type": "DateAxis",
    "renderer": {
      "labels": {
        "location": 0.0001
      }
    },
    "dateFormats": {
      "dateFormat": "yyyy-MM-dd HH:mm:ss"
    }
  }],
Image that shows the issue: https://i.stack.imgur.com/yhhbn.png
Timestamp format: 2016-12-08 14:00:43
You need to set the inputDateFormat to parse your date + time format. By default it is set to parse daily data (yyyy-MM-dd), which is why your data points are clustered together.
am4core.createFromConfig({
  // ...
  dateFormatter: {
    inputDateFormat: 'yyyy-MM-dd HH:mm:ss',
    // ...
  },
  // ...
}, 'chartdiv', am4charts.XYChart); // container id and chart type assumed here
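Note that the axis dateFormats only control how labels are rendered; parsing of the input strings is governed by dateFormatter.inputDateFormat. A minimal sketch (element id and sample data are assumptions):
var chart = am4core.createFromConfig({
  "dateFormatter": {
    "inputDateFormat": "yyyy-MM-dd HH:mm:ss" // parse date + time, not just the day
  },
  "data": [
    { "date": "2016-12-08 14:00:43", "value": 10 },
    { "date": "2016-12-08 14:05:12", "value": 12 }
  ],
  "xAxes": [{ "type": "DateAxis" }],
  "yAxes": [{ "type": "ValueAxis" }],
  "series": [{
    "type": "LineSeries",
    "dataFields": { "dateX": "date", "valueY": "value" }
  }]
}, "chartdiv", am4charts.XYChart);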
Currently I am creating a chatbot for Skype using Dialogflow. The main problem is that when I use the word "Now" in a Skype message, it uses my current time +1 hour, but when I ask for the time "Now" from the iOS application, it uses the correct time zone. Does anyone know where exactly Dialogflow takes the current time zone from for the word "Now"? From the iOS application it gets one value (the correct time zone value) and from Skype it gets another (time zone + 1 hour).
Raw Interaction Log(Dialogflow - Skype):
{
  "queryText": "what time is now?",
  "parameters": {
    "time": "StiDate [Thu Oct 18 12:38:16 CDT 2018]"
  },
  "fulfillmentText": "the time is 12:38:16",
  "fulfillmentMessages": [
    {
      "text": {
        "text": [
          "[{\"type\":0,\"speech\":\"the time is 12:38:16\"}]"
        ]
      }
    }
  ],
  "intent": {
    "id": "37524c80-a15a-4c04-aa9b-38986ff38993",
    "displayName": "A_Test_EventTime"
  },
  "languageCode": "en",
  "sentimentAnalysisResult": {},
  "id": "93ce9408-4b73-4f18-9ae0-b947a906afc8",
  "sessionId": "6b69769b-1ce7-4359-9018-c88d017485bf",
  "timestamp": "2018-10-18T17:38:16.164Z",
  "source": "agent"
}
Raw Interaction Log(Dialogflow - AppIOS):
{
  "queryText": "What time is now?",
  "parameters": {
    "time": "StiDate [Thu Oct 18 11:38:00 CST 2018]"
  },
  "fulfillmentText": "the time is 11:38:00",
  "fulfillmentMessages": [
    {
      "text": {
        "text": [
          "[{\"type\":0,\"speech\":\"the time is 11:38:00\"}]"
        ]
      }
    }
  ],
  "outputContexts": [
    {
      "name": "fa75fc39-7c68-47ac-bea5-12394f425855",
      "lifespanCount": 4,
      "parameters": {
        "time.original": "now?",
        "time": "StiDate [Thu Oct 18 11:38:00 CST 2018]"
      }
    }
  ],
  "intent": {
    "id": "37524c80-a15a-4c04-aa9b-38986ff38993",
    "displayName": "A_Test_EventTime"
  },
  "languageCode": "en",
  "sentimentAnalysisResult": {},
  "id": "58ade82b-c842-44b6-b0a2-d6cced4d6648",
  "sessionId": "dfe0efda53d11aa3d8d43e92a726f9e4",
  "timestamp": "2018-10-18T17:38:00.695Z",
  "source": "agent"
}
Dialogflow agents have a default time zone. You can change this time zone in your Dialogflow agent's settings in the console: https://dialogflow.com/docs/agents/create-manage#general
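If the agent-level setting is not enough (for example, because different channels serve users in different zones), the time zone can also be overridden per request via queryParams.timeZone in the detectIntent call. A sketch with the Dialogflow Node.js client, where the project and session ids are placeholders:
const dialogflow = require('dialogflow'); // V2 Node.js client

const sessionClient = new dialogflow.SessionsClient();
const sessionPath = sessionClient.sessionPath('my-project-id', 'my-session-id'); // assumed ids

const request = {
  session: sessionPath,
  queryInput: {
    text: { text: 'what time is now?', languageCode: 'en' }
  },
  queryParams: {
    timeZone: 'America/Chicago' // overrides the agent default for this request
  }
};

sessionClient.detectIntent(request).then(responses => {
  console.log(responses[0].queryResult.fulfillmentText);
});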
I am unable to migrate a date field in Elasticsearch 6. My example date is:
Fri, 21 Apr 2017 01:58:20 GMT
I have tried this mapping without success:
"date": {
"type": "date",
"format": "E, d MMM Y H:m:s z"
}
Please help.
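For reference, Elasticsearch 6 custom date formats use Joda-Time patterns, and Joda cannot parse time zone names matched with z; Y is also year-of-era rather than a plain yyyy year. One workaround, assuming every value really ends in the literal GMT, is a mapping sketch along these lines:
"date": {
  "type": "date",
  "format": "EEE, d MMM yyyy HH:mm:ss 'GMT'"
}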