Columns order of WebDataRocks pivot grid - webdatarocks

Trying to build a grid with months as columns using WebDataRocks, and the problem is that the columns are sorted alphabetically (Apr 2020, Aug 2020, Dec 2020, ...). Is there an option to order the columns by date (Dec 2020, Nov 2020, Oct 2020, ...)?
An example is available here:
https://codesandbox.io/s/nifty-stonebraker-7mf56?file=/src/App.tsx

This is possible by adding an object to your data that defines the data types; here is an explanation.
In your case, this object would look like this:
{
  "type": {
    "type": "string"
  },
  "value": {
    "type": "number"
  },
  "date": {
    "type": "date string"
  },
  "name": {
    "type": "string"
  }
}, {
  type: "CONTRACT",
  value: 217,
  date: "Dec 2020",
  name: "24"
}, {
  type: "CONTRACT",
  value: 725.84,
  date: "Dec 2020",
  name: "3 "
}, ...
After this, the columns should be ordered by date. Note that input dates should be properly formatted (compliant with ISO 8601).
The way dates are displayed inside WebDataRocks can be modified with the datePattern option.
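To see why ISO 8601 input helps: "MMM yyyy" labels sort alphabetically, while ISO date strings sort lexicographically in chronological order. A minimal Node.js sketch (the `toISO` helper is a hypothetical illustration, not part of WebDataRocks):

```javascript
// "MMM yyyy" labels sort alphabetically, not chronologically:
const labels = ["Apr 2020", "Aug 2020", "Dec 2020", "Nov 2020"];

// Convert a "MMM yyyy" label to an ISO 8601 date string ("yyyy-MM-dd").
// ISO strings sort lexicographically in chronological order,
// which is what makes the pivot columns come out in date order.
function toISO(label) {
  const [mon, year] = label.split(" ");
  const month = new Date(`${mon} 1, ${year}`).getMonth() + 1;
  return `${year}-${String(month).padStart(2, "0")}-01`;
}

console.log(labels.map(toISO).sort());
// ["2020-04-01", "2020-08-01", "2020-11-01", "2020-12-01"]
```

Feeding values like these into the "date string" field keeps the columns in chronological order, while datePattern controls how they are displayed.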

Related

Gmail API - How do I see an email's opening date?

I need to use the Google API to retrieve the first time an email was marked as viewed (specifically, when the email was opened).
I'm using GET https://gmail.googleapis.com/gmail/v1/users/{userId}/messages/{id}, requesting only the metadata, but the response looks like this:
{
  "id": "17a05bd8db1609b9",
  "threadId": "17a05bd8db1609b9",
  "labelIds": [
    "CATEGORY_PROMOTIONS",
    "UNREAD",
    "INBOX"
  ],
  "payload": {
    "partId": "",
    "headers": [
      {
        "name": "Delivered-To",
        "value": "{EMAIL ADDRESS}"
      },
      {
        "name": "Received",
        "value": "by 2002:a55:c51e:0:b029:e9:12c1:65a9 with SMTP id b30csp1876084egk; Sun, 13 Jun 2021 07:19:06 -0700 (PDT)"
      },
      {
        "name": "X-Google-Smtp-Source",
        "value": "ABdhPJzPcWJR1zsvAH654luf+agnL6i6CGj8S/jO1MDZVz3yPHcqE7y37chZ7euL02n40t6idUB/"
      },
      {
        "name": "X-Received",
        "value": "by 2002:a9d:62ce:: with SMTP id z14mr10328566otk.255.1623593946243; Sun, 13 Jun 2021 07:19:06 -0700 (PDT)"
      },
      {
        "name": "ARC-Seal",
        "value": "i=1; a=rsa-sha256; t=1623593946; cv=none; d=google.com; s=arc-20160816; b=L8L+Vz979TjsIDtXAyhnPBQUmW8Njjz+DiyScOHFvyHbmOC9sIyaH5AFOafzFou45N nTtpzyq9pSlZ8VWd6N9N+NYcdldf67A7/FarG9iIs6EvddVYcpbEqdTPOyMt6/mluVQO utRoX3ma1TFAIyXoQLvxfPZ5QZLZNQFpPwYWGIkB+/8r45OKkqhuWtX8d93InKgpoVIf NQjaI4Tnr2AJWWJjiALL8bLoCe1QvA3mV+I1sTbGRPZAIPcKfm+nB3smYgkH7f9C+1+8 iXR/45AWh+9Sxd1IFrHHokfTEOQvHEWDXm8BBagCFaRFJv45V+FIyWGJKKpL4UCI0oab 7ZRg=="
      },
      {
        "name": "ARC-Message-Signature",
        "value": "i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-20160816; h=list-unsubscribe-post:list-unsubscribe:mime-version:subject :message-id:to:reply-to:from:date:dkim-signature:dkim-signature; bh=3CtcdJdIFbH5c/+55q2hcIqL8foXcatroOK85FVqTxk=; b=0/F/Qn1AmdXlp7t9Or1qvUB+6xmvr2Ewxm33BtMBo956QCvgQQ5qilxt3ZI1Kqx+YB zuQZLKRcG7T1kRqvsq3ERdrAqAr6P8+I6j9yWw6XaI7uuU8crVbnEjbkUAheFjmNeXOP ZcuwtlUPlgDiyOmE6ND2HWLrpUcCKxx/TY17fYkR/H08yr44BqtTXSJVUG12n5Sjb8iA nnFyJHYBRg2Elw7vMnUl+wiO0k1EH9C7ltwTJCjVsDPe0LcvcjtDcr0R4i24sYbNTDgN fyOsKMfJnPAE/oLk6iZhd0NvWVkUUvop6b8kdZtLfVH1jLMIYPBOlNjeSct+yfmtrHWt Vrwg=="
      },
      {
        "name": "ARC-Authentication-Results",
        "value": "i=1; mx.google.com; dkim=pass header.i=#emails.waves-audio.com header.s=gears header.b=AAcEguTx; dkim=pass header.i=#d.messagegears.io header.s=gears header.b=jvFNHriZ; spf=pass (google.com: domain of 346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com designates 135.84.217.27 as permitted sender) smtp.mailfrom=346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com"
      },
      {
        "name": "Return-Path",
        "value": "\u003c346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com\u003e"
      },
      {
        "name": "Received",
        "value": "from mta0201-27.emails.waves-audio.com (mta0201-27.emails.waves-audio.com. [135.84.217.27]) by mx.google.com with ESMTPS id t22si9652818otl.163.2021.06.13.07.19.06 for \u003c{EMAIL ADDRESS}\u003e (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Sun, 13 Jun 2021 07:19:06 -0700 (PDT)"
      },
      {
        "name": "Received-SPF",
        "value": "pass (google.com: domain of 346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com designates 135.84.217.27 as permitted sender) client-ip=135.84.217.27;"
      },
      {
        "name": "Authentication-Results",
        "value": "mx.google.com; dkim=pass header.i=#emails.waves-audio.com header.s=gears header.b=AAcEguTx; dkim=pass header.i=#d.messagegears.io header.s=gears header.b=jvFNHriZ; spf=pass (google.com: domain of 346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com designates 135.84.217.27 as permitted sender) smtp.mailfrom=346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com"
      },
      {
        "name": "DKIM-Signature",
        "value": "v=1; a=rsa-sha256; c=relaxed/relaxed; s=gears; d=emails.waves-audio.com; h=Date:From:Reply-To:To:Message-ID:Subject:MIME-Version:Content-Type: List-Unsubscribe:List-Unsubscribe-Post; i=news#emails.waves-audio.com; bh=3CtcdJdIFbH5c/+55q2hcIqL8foXcatroOK85FVqTxk=; b=AAcEguTxQVhKb8tKVqR1lfLjeU7RxkHAe91vfNVg5UdOTOvGfi+oPi4wnn3dR/XUuFYxu47u9Cfo g+jeKSONECg68D/xEtQCnf0MfO71lKSLXDghlYhhaAh5Jjd2IH88b+2hM5fBFN7Fz7lDUp1+Bw/0 U9IH4Ei+w7E8RXs/D6E="
      },
      {
        "name": "DKIM-Signature",
        "value": "v=1; a=rsa-sha256; c=relaxed/relaxed; s=gears; d=d.messagegears.io; h=Date:From:Reply-To:To:Message-ID:Subject:MIME-Version:Content-Type: List-Unsubscribe:List-Unsubscribe-Post; bh=3CtcdJdIFbH5c/+55q2hcIqL8foXcatroOK85FVqTxk=; b=jvFNHriZksLExFsp7Br0sf598nLFywhbNS7N+70VY0zeKKLxvm0G4EKNUJ3Fe+3a5oWYWa7HBcJS vq8hAERdI6vQjNNZHYJifHodm4+B04CXCDev9Il3Sx3qB+CYDYymKyeiEycsHWejCdJilo8HN+GE Cxv+AtCFwq2s68gY/r0="
      },
      {
        "name": "Date",
        "value": "Sun, 13 Jun 2021 10:04:23 -0400 (EDT)"
      },
      {
        "name": "From",
        "value": "Waves Audio \u003cnews#emails.waves-audio.com\u003e"
      },
      {
        "name": "Reply-To",
        "value": "Waves Audio \u003cnews#emails.waves-audio.com\u003e"
      },
      {
        "name": "To",
        "value": "{EMAIL ADDRESS}"
      },
      {
        "name": "Message-ID",
        "value": "\u003c618286522.102678350.1623593063602.JavaMail.cloud#mta0201.messagegears.net\u003e"
      },
      {
        "name": "Subject",
        "value": "ENDS TODAY ⏰ ALL Compressors $29.99"
      },
      {
        "name": "MIME-Version",
        "value": "1.0"
      },
      {
        "name": "Content-Type",
        "value": "multipart/mixed; boundary=\"----=_Part_102678347_1640882403.1623593063602\""
      },
      {
        "name": "X-Original-To",
        "value": "{EMAIL ADDRESS}"
      },
      {
        "name": "List-Unsubscribe",
        "value": "\u003chttp://track.waves-audio.com/list-unsub/uc/2/1cla%3ANDYwNjQ2MzY%3AMDItYjIxMTY0LTgzZDZiZTJiZGM0NTQyZjdhNTc0OTI0MjQ1MDIwYzVl%3AYXJzZXJlZ0BnbWFpbC5jb20%3AMTY1NjkwNQ%3An%3An%3A_8LcuFe86CJ4F5wm08TiWA\u003e, \u003cmailto:unsub-346064636000c23595702-b21164-83d6be2bdc4542f7a574924245020c5e#emails.waves-audio.com\u003e"
      },
      {
        "name": "List-Unsubscribe-Post",
        "value": "List-Unsubscribe=One-Click"
      }
    ]
  },
  "sizeEstimate": 51137,
  "historyId": "6952408",
  "internalDate": "1623593063000"
}
And this doesn't show any field like "First opened", nor anything similar.
The other approach I was considering was checking the history of the message's labels: if I were able to retrieve the date at which the UNREAD labelId was added, I would be able to determine the time the email was viewed.
Issue:
In Gmail API, there's no direct way to retrieve the date a certain message was read.
In Gmail itself, there's the option of requesting a read receipt, but this doesn't automatically apply to all emails, and is not available to the API either (consider filing a feature request in Issue Tracker for this).
Using users.history you can track changes to labels (e.g. UNREAD), but the related History resource does not include information on the dates these label changes occurred, so it would not be useful for your situation either (apart from this, history records expire after a short time, typically around one week or a bit more - see Limitations).
Workaround:
Taking all this into account, I think currently the best approach would be to develop a Workspace Gmail add-on. With this, you can add a contextual trigger that fires a function every time a message is opened (as long as the add-on is being used).
The fired function could then be used to store the current date as well as the messageId (for example, using PropertiesService). This way, you could keep track of the dates each message was opened. In this case, the messageId (or the threadId, for that matter) could be retrieved in your function thanks to the Gmail event object.
More specifically, your contextual function could be something along the following lines:
function onGmailContextual(e) {
  const messageId = e.gmail.messageId;
  const userProps = PropertiesService.getUserProperties();
  let dateRead;
  const messageProp = userProps.getProperty(messageId);
  if (messageProp) { // If it exists, the message was read before
    dateRead = new Date(JSON.parse(messageProp)); // Retrieve the previously stored read date
  } else { // If it doesn't exist, the message was not read before (at least while the add-on was in use)
    dateRead = new Date(); // Get the current date
    userProps.setProperty(messageId, JSON.stringify(dateRead.getTime())); // Store the current date as the read date for this message
  }
  // ...
}
Please note that this will only work for messages opened while the add-on is in use, so you cannot use it to retrieve reading dates for old messages.
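For reference, a contextual trigger like the one described is declared in the add-on's appsscript.json manifest; a minimal sketch (the add-on name, logo URL, and trigger function name are placeholder assumptions):

```json
{
  "addOns": {
    "common": {
      "name": "Read Tracker",
      "logoUrl": "https://example.com/logo.png"
    },
    "gmail": {
      "contextualTriggers": [
        {
          "unconditional": {},
          "onTriggerFunction": "onGmailContextual"
        }
      ]
    }
  },
  "oauthScopes": [
    "https://www.googleapis.com/auth/gmail.addons.execute"
  ]
}
```

The unconditional trigger fires onGmailContextual every time a message is opened while the add-on is in use.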
Further reading:
Extending Gmail with Google Workspace add-ons
Build contextual message interfaces
Install an unpublished add-on

Microsoft Graph API: Why is creating a recurring event in the past returning a 400?

I am running into an issue creating recurring events in the past using Graph API. When I POST this data to /me/calendars/[calendarId]/events I get an ErrorPropertyValidationFailure error:
{
  "isAllDay": true,
  "start": {
    "timeZone": "America/New_York",
    "dateTime": "2000-09-02"
  },
  "end": {
    "timeZone": "America/New_York",
    "dateTime": "2000-09-03"
  },
  "subject": "Jimmy's birthday",
  "body": { "contentType": "text", "content": "" },
  "isCancelled": false,
  "recurrence": {
    "pattern": {
      "type": "absoluteYearly",
      "interval": 1,
      "dayOfMonth": 2,
      "month": 9
    },
    "range": { "startDate": "2000-09-02", "type": "noEnd" }
  },
  "showAs": "free",
  "type": "seriesMaster"
}
All of the data seems valid to me, and indeed, just changing the start and end dateTime values and the recurrence range's startDate to 2019 instead of 2000 makes it work.
But here's where it gets weird: keep those values in 2000, and change the dayOfMonth in the recurrence pattern to an incorrect value, like 5. Then when submitting to the API, it works! The recurrence will instead appear to begin on Sep 5 of 2000 and there is nothing on Sep 2 (also, the event seems to run "Tue 9/5/2000, 11:00 PM to Wed 9/6/2000, 11:00 PM" on the calendar, which is strange, since it also appears as an all-day event).
So my question is: is this a bug? Or what the heck's happening? It looks like correct data is getting a validation error, but incorrect data creates an event.
Updating to add the error body:
{
  "error": {
    "code": "ErrorPropertyValidationFailure",
    "message": "At least one property failed validation.",
    "innerError": {
      "date": "2020-10-15T22:48:50",
      "request-id": "c08b1d73-5b6d-46ac-8751-d1f17310f652",
      "client-request-id": "c08b1d73-5b6d-46ac-8751-d1f17310f652"
    }
  }
}

RethinkDB Query: How to pluck based on a date range

Given a table called alerts in a database called database, where each document has an array of objects called history with a date attribute, how can I pluck based on a date range on that date attribute?
with the following query,
r.db("database").table("alerts").pluck("history").limit(10000)
I get back something like the following
{
  "history": [
    {
      "text": "text1",
      "updateTime": Thu Jun 20 2019 01:29:47 GMT+00:00
    },
    {
      "text": "text2",
      "updateTime": Thu Jun 20 2019 01:24:59 GMT+00:00
    }
  ]
}
{
  "history": [
    {
      "text": "text3",
      "updateTime": Thu Jun 20 2018 01:29:47 GMT+00:00
    },
    {
      "text": "text4",
      "updateTime": Thu Jun 20 2018 01:24:59 GMT+00:00
    }
  ]
}
How can I pluck the sub-object called history and only return history entries that fall within a specific range of the updateTime attribute?
For example, between Jan 2, 2009 and Jan 3, 2009.
You need to filter based on a time range and use pluck on a nested object. Here are some examples of how to do that from the official documentation:
r.table("users").filter(function (user) {
  return user("subscriptionDate").during(
    r.time(2012, 1, 1, 'Z'), r.time(2013, 1, 1, 'Z'));
}).run(conn, callback);
Source: https://www.rethinkdb.com/api/javascript/filter/
r.table('marvel').pluck({'abilities' : {'damage' : true, 'mana_cost' : true}, 'weapons' : true}).run(conn, callback)
Source: https://www.rethinkdb.com/api/javascript/pluck/
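Combining the two ideas for the alerts table, the inner filtering that during() performs on each nested history entry can be sketched in plain JavaScript (filterHistory is a hypothetical helper for illustration, not a ReQL call; field names are taken from the question):

```javascript
// Keep only history entries whose updateTime falls within [start, end),
// mirroring what ReQL's during() would check for each nested entry.
function filterHistory(doc, start, end) {
  return {
    history: doc.history.filter(function (h) {
      const t = new Date(h.updateTime);
      return t >= start && t < end;
    })
  };
}

// A document shaped like the query output in the question.
const doc = {
  history: [
    { text: "text1", updateTime: "2019-06-20T01:29:47Z" },
    { text: "text3", updateTime: "2018-06-20T01:29:47Z" }
  ]
};

const inRange = filterHistory(
  doc,
  new Date("2019-01-01T00:00:00Z"),
  new Date("2020-01-01T00:00:00Z")
);
console.log(inRange.history);
// [{ text: "text1", updateTime: "2019-06-20T01:29:47Z" }]
```

In ReQL itself, the same shape can likely be achieved by mapping each document to an object whose history field is the nested array filtered with during(), rather than using pluck alone.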

Custom object array binding?

I've looked through the documentation, but perhaps I've overlooked what I assume to be a straightforward task. Is it possible to provide a custom binding function so that, in an array of objects, each object corresponds to one cell, rather than each object corresponding to a full row? Would this binding maintain the reference to the original object so that the data would change after being modified in the spreadsheet?
For example, I'd want to create the following sheet:
With JSON in this structure:
[
  {
    "name": "USA",
    "year": 2015,
    "sales": 1
  },
  {
    "name": "USA",
    "year": 2016,
    "sales": 2
  },
  {
    "name": "USA",
    "year": 2017,
    "sales": 3
  },
  {
    "name": "Canada",
    "year": 2015,
    "sales": 4
  },
  {
    "name": "Canada",
    "year": 2016,
    "sales": 5
  },
  {
    "name": "Canada",
    "year": 2017,
    "sales": 6
  }
]
You should look at the columns definition. There you can define the data source for each column so that the grid iterates through your objects and sets each column's value from that column's id. And yes, it uses references, so if you edit the cells, your objects get edited as well.
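As an illustration of the cell-per-object idea, here is a plain-JavaScript sketch (toGrid is a hypothetical helper, not the library's API) that pivots the JSON above into one row per name and one column per year:

```javascript
// Pivot a flat array of { name, year, sales } objects into a grid:
// one row per name, one column per year, one object per cell.
function toGrid(records) {
  const years = [...new Set(records.map(r => r.year))].sort();
  const names = [...new Set(records.map(r => r.name))];
  const rows = names.map(name => [
    name,
    ...years.map(year => {
      const rec = records.find(r => r.name === name && r.year === year);
      return rec ? rec.sales : null; // empty cell when no object matches
    })
  ]);
  return [["name", ...years], ...rows];
}

const data = [
  { name: "USA", year: 2015, sales: 1 },
  { name: "USA", year: 2016, sales: 2 },
  { name: "Canada", year: 2015, sales: 4 }
];
console.log(toGrid(data));
// [["name", 2015, 2016], ["USA", 1, 2], ["Canada", 4, null]]
```

This sketch copies values into the grid; a column-based binding as described in the answer would instead keep references to the original objects so edits flow back.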

Count the different variables in an array in my document

I'm trying out RethinkDB and playing around with some queries to see if it could fit my use case. So far, so good. However, I have a question regarding ReQL.
For example, in this case I store analytics events in RethinkDB such as:
[{
  "userId": "abdf213",
  "timestamp": "Sat Jan 17 2015 00:32:20 GMT+00:00",
  "action": "Page"
},
{
  "userId": "123abc",
  "timestamp": "Sat Jan 17 2015 00:42:20 GMT+00:00",
  "action": "Track"
},
{
  "userId": "abdf213",
  "timestamp": "Sat Jan 17 2015 00:45:20 GMT+00:00",
  "action": "Track"
},
{
  "userId": "123abc",
  "timestamp": "Sat Jan 17 2015 00:44:20 GMT+00:00",
  "action": "Page"
},
{
  "userId": "123abc",
  "timestamp": "Sat Jan 17 2015 00:48:20 GMT+00:00",
  "action": "Page"
}]
I'd like the end result of my query to look like this:
{
  "group": "123abc",
  "reduction": {
    "Page": 2,
    "Track": 1
  }
},
{
  "group": "abdf213",
  "reduction": {
    "Page": 1,
    "Track": 1
  }
}
Bear in mind that the action names are not known in advance.
TBH, I'm not quite sure how to achieve this with ReQL.
Right now I have this query (using the data explorer):
r.db('test').table('events').group('userId').map(function(event) {
  return event('action')
})
which returns docs like this one:
{
  "group": "-71omc5zdgdimpuveheqs6dvt5q6xlwenjg7m",
  "reduction": [
    "Identify",
    "Page",
    "Track"
  ]
}
Anyone can point me in the right direction here?
Cheers,
S
Try:
r.table('events').group('userId').map(function(event) {
  return r.object(event('action'), 1);
}).reduce(function(a, b) {
  return a.merge(b.keys().map(function(key) {
    return [key, a(key).default(0).add(b(key))];
  }).coerceTo('object'));
})
Here's my solution:
r.table("events").group("userId", "action").count().ungroup()
  .group(r.row("group")(0))
  .map([r.row("group")(1), r.row("reduction")])
  .coerceTo("object")
ReQL doesn't support nested grouping, but you can group by multiple fields at the same time and then perform further grouping on the output.
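What both ReQL queries compute can be sketched in plain JavaScript (groupCounts is a hypothetical helper for illustration; same input and output shapes as in the question):

```javascript
// Group events by userId and count occurrences of each action,
// producing { group, reduction } objects like the desired ReQL output.
function groupCounts(events) {
  const groups = {};
  for (const { userId, action } of events) {
    groups[userId] = groups[userId] || {};
    groups[userId][action] = (groups[userId][action] || 0) + 1;
  }
  return Object.keys(groups)
    .sort()
    .map(g => ({ group: g, reduction: groups[g] }));
}

const events = [
  { userId: "abdf213", action: "Page" },
  { userId: "123abc", action: "Track" },
  { userId: "abdf213", action: "Track" },
  { userId: "123abc", action: "Page" },
  { userId: "123abc", action: "Page" }
];
console.log(groupCounts(events));
// [ { group: "123abc", reduction: { Track: 1, Page: 2 } },
//   { group: "abdf213", reduction: { Page: 1, Track: 1 } } ]
```

No action names are hard-coded, matching the requirement that they are not known in advance.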
