I am new to Google Apps Script and learning JavaScript as I go on this project. Over the course of the introductory codelabs I noted the best practice of reading all the data into an array with one command, performing the operations, and then writing the result back with one command.
I understood how to do this with Google Sheets, but how do I achieve it with Google Calendar? I have come across a few links discussing batching with the Google Calendar API and Advanced Google Services, but I didn't understand how to make use of the information.
I basically hope to batch-edit events instead of accessing Google Calendar repeatedly in a for loop.
function deleteMonth() {
  // Set Date range to delete
  var today = new Date();
  var firstDay = new Date(today.getFullYear(), today.getMonth(), 1);
  var lastDay = new Date(today.getFullYear(), today.getMonth() + 1, 0);
  // read spreadsheet data and get User Info from ss
  var spreadsheet = SpreadsheetApp.getActiveSpreadsheet();
  var idSheet = spreadsheet.getSheetByName('User Info');
  // Get users from sheet in array of objects with properties from column headers in
  // 'User Info' (name, email, year, calName, calID, early, late)
  var userInfo = getSheetData(idSheet);
  var deletedNames = "";
  for (i = 0; i < userInfo.length; i++) {
    var calID = userInfo[i].calID;
    // if we have calID proceed to delete events
    if (calID) {
      console.time("get events");
      var calendar = CalendarApp.getCalendarById(calID);
      var events = calendar.getEvents(firstDay, lastDay);
      console.timeEnd("get events");
      // Delete events and add deleted name to string deletedNames
      for (i = 0; i < events.length; i++) {
        console.time("delete event");
        deletedNames += events[i].getTitle() + ", ";
        events[i].deleteEvent();
        console.timeEnd("delete event");
      }
    }
  }
  spreadsheet.toast("Deleted events: \n" + deletedNames);
}
Time output from console.time(): [screenshot of timing output]
Other related links which sounded relevant:
Using advanced google services (apps script resource)
Google Developer blog?
I believe your goal is as follows:
You want to delete all events in a month for several calendars using a batch process with Google Apps Script.
You want to reduce the processing cost of that operation.
For this situation, how about this answer?
Issue and workaround:
The Calendar API can be used with batch requests. A batch request can run up to 100 requests in one API call and processes them asynchronously, so the processing cost can be reduced. Unfortunately, in the current stage, several calendar IDs cannot be used in one batch: when a batch request includes several calendar IDs, the error Cannot perform operations on different calendars in the same batch request. occurs. So in your case, all events in a month for one calendar ID can be deleted in one batch request, and one batch request is required per calendar ID. I would like to propose this as the current workaround.
By the way, one modification point in your script: the variable i is used in both the first for loop and the second for loop. Because of this, not all values of userInfo are processed. Please be careful of this.
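For example, a minimal sketch of just that fix (the rest of your loop unchanged), using a different counter for the inner loop:

for (var i = 0; i < userInfo.length; i++) {
  // ... get calID, calendar and events as before ...
  for (var j = 0; j < events.length; j++) { // j instead of i, so the outer loop counter is not clobbered
    deletedNames += events[j].getTitle() + ", ";
    events[j].deleteEvent();
  }
}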
Sample script:
Before you run the script, please enable Calendar API at Advanced Google services.
function deleteMonth() {
  var today = new Date();
  var firstDay = new Date(today.getFullYear(), today.getMonth(), 1);
  var lastDay = new Date(today.getFullYear(), today.getMonth() + 1, 0);
  var spreadsheet = SpreadsheetApp.getActiveSpreadsheet();
  var idSheet = spreadsheet.getSheetByName('User Info');
  var userInfo = getSheetData(idSheet);
  var deletedNames = "";
  var requests = []; // One array of DELETE requests per calendar ID, for the batch requests.
  for (let i = 0; i < userInfo.length; i++) {
    var req = [];
    var calID = userInfo[i].calID;
    if (calID) {
      var calendar = CalendarApp.getCalendarById(calID);
      var events = calendar.getEvents(firstDay, lastDay);
      for (let j = 0; j < events.length; j++) {
        deletedNames += events[j].getTitle() + ", ";
        var e = events[j];
        req.push({
          method: "DELETE",
          endpoint: `https://www.googleapis.com/calendar/v3/calendars/${calID}/events/${e.getId().replace("@google.com", "")}`,
        });
      }
    }
    requests.push(req);
  }
  // Run batch requests (one batch per calendar, split into chunks of 100).
  requests.forEach(req => {
    const limit = 100;
    const split = Math.ceil(req.length / limit);
    const boundary = "xxxxxxxxxx";
    for (let i = 0; i < split; i++) {
      const object = {batchPath: "batch/calendar/v3", requests: req.splice(0, limit)};
      const payload = object.requests.reduce((s, e, i) => s += "Content-Type: application/http\r\nContent-ID: " + i + "\r\n\r\n" + e.method + " " + e.endpoint + "\r\nContent-Type: application/json; charset=utf-8\r\n\r\n" + JSON.stringify(e.requestBody) + "\r\n--" + boundary + "\r\n", "--" + boundary + "\r\n");
      const params = {method: "post", contentType: "multipart/mixed; boundary=" + boundary, payload: payload, headers: {Authorization: "Bearer " + ScriptApp.getOAuthToken()}, muteHttpExceptions: true};
      var res = UrlFetchApp.fetch("https://www.googleapis.com/" + object.batchPath, params);
      console.log(res.getContentText());
    }
  });
  spreadsheet.toast("Deleted events: \n" + deletedNames);
}
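One design note on the script above: deletedNames is built while the events are being collected, before the batch requests actually run, so if a request inside a batch fails, the toast can list events that were not in fact deleted. Inspecting res.getContentText(), which contains the per-request responses of each batch, would catch that.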
Note:
Please run this script with the V8 runtime enabled; it uses template literals and arrow functions, which are not supported by the older Rhino runtime.
References:
Advanced Google services
Sending Batch Requests
Events: delete
I get the following error message when running some code in Google Apps Script. I don't understand the Line/Column reference, Code:46:18. It appears to point to either a line with too few columns or a process with too few lines. I assume I am not interpreting the reference correctly.
TypeError: Cannot set property 'format' of undefined
at processInbox(processInbox Code:46:18)
Line 46 of all my code is this and certainly doesn't have 18 columns (and it closes a function that doesn't refer to format):
}
The process referred to by the error message, processInbox, is only 39 lines long.
The script is called by selecting "Run Script" from the "CiviSchedule" menu in the related Google Sheet, which triggers the doTasks function. This menu and trigger are created in the onOpen function.
How am I misinterpreting the error message? (Full code follows)
[screenshot of error]
[screenshot of lines 40-46]
The code for reference:
//General Info
//
// As detailed in Managing Scheduled Jobs URL method http://wiki.civicrm.org/confluence/display/CRMDOC/Managing+Scheduled+Jobs#ManagingScheduledJobs-URLmethod :
//
// a valid Username and Password (for a Drupal, Joomla or WordPress user who has adequate permissions
// for the job or jobs being run. The minimal permissions for that user are: “view all contacts”, “access
// CiviCRM”, “access CiviMail”). It also requires you to pass in the CIVICRM_SITE_KEY which is configured
// by editing your copy of civicrm.settings.php
//
// I’d recommend setting up a dedicated account for scheduling reports with only minimal permissions.
// Once you have a username/password setup open File > Project Properties and open the Script Properties
// tab. Click ‘Add row’ link and add your setup account name (username), pass (password), key (site key).
// Save the Script Properties and then in the script editor window enter the BASE_URL below of your Civi
// installation (in Drupal this looks like http://[SITEROOT]/sites/all/modules/civicrm/bin/cron.php?.
// File > Save your script
var BASE_URL = "https://www.fubar.org/sites/all/modules/civicrm/bin/cron.php?";
// To get this script to run automatically open Resources > Current project triggers
// and select doTasks to run as a day timer (we set reports to run between 7-8am)
// If you want to run earlier or later also adjust the RERUN_HOUR below which sets the next run time
var RERUN_HOUR = 1;
var PS = PropertiesService.getScriptProperties();
var param = PS.getProperties();
param.job = "mail_report";
// helper so we know which value is in which column
var COL = {report_id: 0,
type: 1,
last_run: 2,
next_run: 3,
format: 4,
ss_id: 5,
ss_sht: 6,
total: 7};
function onOpen(){
var ui = SpreadsheetApp.getUi();
ui.createMenu('CiviSchedule')
.addItem('Run Script', 'doTasks')
.addToUi();
}
function doTasks() {
var doc = SpreadsheetApp.getActiveSpreadsheet(); // get spreadsheet
var sheet = doc.getSheetByName("Tasks"); // get sheet
var data = sheet.getRange(3, 1, sheet.getLastRow(), COL.total).getValues(); // get values
var now = new Date(); // time now
// for each row of the sheet iterate across
for (var i = 0; i < data.length; i++){
if (data[i][COL.report_id] != ""){ // if there is instance id do something
// collect row values
var report_id = data[i][COL.report_id]
var type = data[i][COL.type];
var next_run = data[i][COL.next_run] || 0;
// check if it's time to run the report again
if (next_run < now && type != "never"){
// if it is ping the report trigger
var new_next_run = callUrl(report_id, type, {format: data[i][COL.format], ss_id: data[i][COL.ss_id], ss_sht: data[i][COL.ss_sht]} );
// ..and record when to run again
sheet.getRange(parseInt(i)+3, 3, 1, 2).setValues([[now, new_next_run]]);
}
}
}
}
function callUrl(report_id, type, optParam){
// build the url to trigger the report
param.format = optParam.format || "print";
if (optParam.ss_id && optParam.ss_sht){
// if we have a sheet name and id force csv
param.format = 'csv';
// make a search string to find our report
optParam.search_str = 'report/instance/'+report_id+'?reset=1 has:attachment is:unread';
// store our search for later
PS.setProperty('search_str_'+report_id, JSON.stringify(optParam));
// set the script to read the email run 15min later
ScriptApp.newTrigger("processInbox")
.timeBased()
.after(1 * 60 * 1000)
.create();
}
// make url
var qs = BASE_URL
for(var key in param) {
if (key.substring(0, 10) != "search_str"){
var value = param[key];
qs += key + "=" + value + "&";
}
}
qs += "instanceId="+report_id;
try {
//gg var resp = UrlFetchApp.fetch(qs); // hit the url
// now calculate when to run again
var d = new Date();
d.setHours(RERUN_HOUR);
d.setMinutes(0);
switch (type){
case "daily":
d.setDate(d.getDate() + 1);
break;
case "weekly":
d.setDate(d.getDate() + 7);
break;
case "monthly":
// Get the first Monday in the month
d.setDate(1);
d.setMonth(d.getMonth() + 1);
while (d.getDay() !== 1) {
d.setDate(d.getDate() + 1);
}
break;
}
return d;
} catch(e) {
return e.message;
}
}
function processInbox(){
var PS = PropertiesService.getScriptProperties();
var data = PS.getProperties();
for (var key in data) {
try { if (key.substring(0, 10) == "search_str"){
var param_raw = data[key];
var param = JSON.parse(param_raw);
// get last 20 message threads using serach term
var threads = GmailApp.search(param.search_str, 0, 20);
// assume last thread has our latest data
var last_thread = threads.length-1;
if (last_thread > -1){
// get message in the last thread
var msg = threads[last_thread].getMessages()[0];
// get the attachments
var attachments = msg.getAttachments();
for (var k = 0; k < attachments.length; k++) {
// get the attachment as a string
var csv_str = attachments[k].getDataAsString();
// parse string as csv
var csv = Utilities.parseCsv(csv_str);
// create destination object
var doc = SpreadsheetApp.openById(param.ss_id);
var sheet = doc.getSheetByName(param.ss_sht);
// clear any old data
sheet.clear();
// write new data
sheet.getRange(1, 1, csv.length, csv[0].length).setValues(csv);
// mark message are read and archive (you could also label or delete)
threads[last_thread].moveToArchive().markRead();
PS.deleteProperty(key);
}
}
}
} catch(e) {
SpreadsheetApp.getUi().alert('problem: ${e}');
}
}
}
at processInbox(processInbox Code:46:18)
The syntax is
at ${FUNCTION}(${FILE}:${LINE}:${COLUMN})
This would suggest that the code causing the error is elsewhere.
That is: at file processInbox Code, within function processInbox, at line 46, column 18.
You probably have another function named processInbox in a different file, named processInbox Code. In that file, at line 46, column 18, you'll find your error.
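As an illustration, a minimal sketch with hypothetical file contents (the file names and the exact failing statement are assumptions; the point is only which file the trace refers to):

// --- file "Code" ---
function doTasks() {
  processInbox(); // if both files define processInbox, one definition silently overrides the other
}

// --- file "processInbox Code" ---
function processInbox() {
  var param;            // never assigned
  param.format = "csv"; // throws: TypeError: Cannot set property 'format' of undefined
}                       // reported as: at processInbox(processInbox Code:<line>:<column>)

So the 46 and 18 refer to a line and column inside the file called processInbox Code, not inside your main Code file.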
I think that your problem may be that this line:
var data = sheet.getRange(3, 1, sheet.getLastRow(), COL.total).getValues();
should be like this:
var data = sheet.getRange(3, 1, sheet.getLastRow() - 2, COL.total).getValues();
The third argument of getRange is the number of rows to read, and your data starts on row 3, so reading getLastRow() rows picks up two extra blank rows past the end of your data.
I need to list the user data belonging to a specific group within the organization. The documentation does not specify whether this is possible. I was really hoping there could be some kind of query that would allow this, for example email in (1@domain.com, 2@domain.com). However, I don't see that being possible. The only way I could think of to accomplish this would be:
Get a list of all the members in the group (https://developers.google.com/admin-sdk/directory/reference/rest/v1/members/list)
Get each user data by email (https://developers.google.com/admin-sdk/directory/reference/rest/v1/users/get)
The problem with the above approach is that if a group contains 50+ members, I have to make that many requests, which is counterproductive. Imagine how long that would take.
Any ideas? Greatly appreciated.
Unfortunately I don't think you can skip this two-step process, but you can speed it up using batch requests, which allow you to bundle up to 1000 calls into a single HTTP request. The steps would be:
Make a batch request to get all the members of all the groups you want (using members.list).
Make a batch request to get all the user info that you need using their id (using user.get).
Notice that the data in the result won't be sorted, but each part of the response is tagged with the Content-ID of the corresponding request.
References
Sending Batch Requests (Directory API)
Method: members.list (Directory API)
Method: users.get (Directory API)
I thought about the batching request a couple of hours after I posted the question. The problem with Node.js is that it does not have built-in support for batch requests, unlike the PHP client library, for example; therefore, I had to spend some time implementing support for it on my own, since I was not able to find any example. I'll share the solution in case it helps someone else, or for my future reference.
async function getGroupMembersData(){
const groupEmail = "group@domain.com"; //google group email
const groupMembers = await getGroupMembers(groupEmail).catch(error=>{
console.error(`Error querying group members: ${error.toString()}`);
});
if(!groupMembers){ return; }
const url = "https://www.googleapis.com/batch/admin/directory_v1";
const scopes = ["https://www.googleapis.com/auth/admin.directory.user.readonly"];
const requests = [];
for(let i=0; i<groupMembers.length; ++i){
const user = groupMembers[i];
const request = {
email: user,
endpoint: `GET directory_v1/admin/directory/v1/users/${user}?fields=*`
};
requests.push(request);
}
const batchRequestData = await batchProcess(url, scopes, requests).catch(error=>{
console.error(`Error processing batch request: ${error.toString()}`);
});
if(!batchRequestData){ return; }
const usersList = batchRequestData.map(i=>{
return i.responseBody;
});
console.log(usersList);
}
//get group members using group email address
async function getGroupMembers(groupKey){
const scopes = ["https://www.googleapis.com/auth/admin.directory.group.member.readonly"]; //scope needed to list group members
const client = await getClient(scopes); //function to get an authorized client, you have to implement on your own
const service = google.admin({version: "directory_v1", auth: client});
const request = await service.members.list({
groupKey,
fields: "members(email)",
maxResults: 200
});
const members = !!request.data.members ? request.data.members.map(i=>i.email) : [];
return members;
}
//batch request processing in groups of 100
async function batchProcess(batchUrl, scopes, requests){
const client = await getClient(scopes); //function to get an authorized client, you have to implement on your own
let results = [];
const boundary = "foobar99998888"; //boundary line definition
let batchBody = ""; const nl = "\n";
const batchLimit = 100; //define batch limit (max supported = 100)
const totalRounds = Math.ceil(requests.length / batchLimit);
let batchRound = 1;
let batchItem = 0;
let roundLimit = batchLimit;
do{
batchBody = ""; //reset the multipart body for each round of (at most) batchLimit requests
roundLimit = roundLimit < requests.length ? roundLimit : requests.length;
//build the batch request body
for(batchItem; batchItem<roundLimit; batchItem++){
const requestData = requests[batchItem];
batchBody += `--${boundary}${nl}`;
batchBody += `Content-Type: application/http${nl}`;
batchBody += `Content-Id: <myapprequest-${requestData.email}>${nl}${nl}`;
batchBody += `${requestData.endpoint}${nl}`;
}
batchBody += `--${boundary}--`;
//send the batch request
const batchRequest = await client.request({
url: batchUrl,
method: "POST",
headers: {
"Content-Type": `multipart/mixed; boundary=${boundary}`
},
body: batchBody
}).catch(error=>{
console.log("Error processing batch request: " + error);
});
//parse the batch request response
if(!!batchRequest){
const batchResponseData = batchRequest.data;
const responseBoundary = batchRequest.headers["content-type"].split("; ")[1].replace("boundary=", "");
const httpResponses = batchResponseParser(batchResponseData, responseBoundary);
results.push(...httpResponses);
}
batchRound++;
roundLimit += batchLimit;
} while(batchRound <= totalRounds);
return results;
};
//batch response parser
function batchResponseParser(data, boundary){
const nl = "\r\n";
data = data.replace(`--${boundary}--`,"");
const responses = data.split(`--${boundary}`);
responses.shift();
const formattedResponses = responses.map(i=>{
const parts = i.split(`${nl}${nl}`);
const responseMetaParts = (parts[0].replace(nl, "")).split(nl);
let responseMeta = {};
responseMetaParts.forEach(part=>{
const objectParts = part.split(":");
responseMeta[objectParts[0].trim()] = objectParts[1].trim();
});
const responseHeadersParts = parts[1].split(nl);
let responseHeaders = {};
responseHeadersParts.forEach(part=>{
if(part.indexOf("HTTP/1.1") > -1){
responseHeaders.status = part;
} else {
const objectParts = part.split(":");
responseHeaders[objectParts[0].trim()] = objectParts[1].trim();
}
});
const reg = new RegExp(`${nl}`, "g");
const responseBody = JSON.parse(parts[2].replace(reg, ""));
const formatted = {
responseMeta: responseMeta,
responseHeaders: responseHeaders,
responseBody: responseBody
};
return formatted;
});
return formattedResponses;
}
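For completeness, getClient is left to the reader in the code above. One possible sketch, assuming a service account with domain-wide delegation and the googleapis Node client (the key file path and the impersonated admin address are placeholders):

const {google} = require("googleapis");
const key = require("./service-account-key.json"); //hypothetical service account key file

async function getClient(scopes){
  //JWT client that impersonates an admin user with the requested scopes
  const client = new google.auth.JWT({
    email: key.client_email,
    key: key.private_key,
    scopes: scopes,
    subject: "admin@yourdomain.com" //placeholder super-admin to impersonate
  });
  await client.authorize();
  return client;
}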
I just added a script to a Form/Google Spreadsheet. It grabs the edit-response URL from the Form and pushes it into a column in the response spreadsheet. I would like to have the URL linked to a button (in HTML I would of course anchor my image with the edit-response URL, but now I am a little confused, since I am not a very experienced script editor). How could I integrate that into my script?
function assignEditUrls() {
var form = FormApp.openById('1-Sxpvd9jktE-SVXV0_dfp018xwcIoa3aXMA_fdff9W8');
//enter form ID here
var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Form Responses 1');
//Change the sheet name as appropriate
var data = sheet.getDataRange().getValues();
var urlCol = 5; // column number where URL's should be populated; A = 1, B = 2 etc
var responses = form.getResponses();
var timestamps = [], urls = [], resultUrls = [];
for (var i = 0; i < responses.length; i++) {
timestamps.push(responses[i].getTimestamp().setMilliseconds(0));
urls.push(responses[i].getEditResponseUrl());
}
for (var j = 1; j < data.length; j++) {
resultUrls.push([urls[timestamps.indexOf(data[j][0].setMilliseconds(0))]]);
}
sheet.getRange(2, urlCol, resultUrls.length).setValues(resultUrls);
}
It's not possible to programmatically add buttons or images to spreadsheet cells.
What you can do is add the URL to those cells as a formula, =HYPERLINK(url, label), so it looks prettier.
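For example, a minimal sketch of the change in your assignEditUrls loop (the "Edit response" label is just an assumption, any text works); it writes =HYPERLINK(...) formulas instead of bare URLs:

for (var j = 1; j < data.length; j++) {
  var url = urls[timestamps.indexOf(data[j][0].setMilliseconds(0))];
  resultUrls.push(['=HYPERLINK("' + url + '", "Edit response")']);
}
sheet.getRange(2, urlCol, resultUrls.length).setFormulas(resultUrls);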
On my project, I would like to optimize an AJAX request and to know, on average, how many ms I have gained.
So, thanks to the Google Chrome Network tab, I have the time of a request, something like this: [screenshot of request timing]
Is there a feature to get some stats about these requests, for example the average time?
If not, how could I do that?
Thanks!
Not too difficult to roll your own code in JavaScript.
var times = [];
var sum = 0;
var tries = 10;
for (var i = 0; i < tries; i++) {
  var xhr = new XMLHttpRequest();
  // synchronous request, so the loop (and the average below) waits for each response
  xhr.open("GET", window.location.href, false);
  xhr.onload = (function() {
    var time = (Date.now() - this.start);
    times.push(time);
    sum += time;
    console.log("#" + this.number + " " + time + "ms");
  }).bind(xhr);
  xhr.number = (i + 1);
  xhr.start = Date.now();
  xhr.send(null);
}
console.log("avg: " + (sum / tries) + "ms");
Go to url: chrome://net-internals
Description here
For a data visualization request in my application, I am sending multiple AJAX requests to a servlet in order to get the data in chunks, and on the callback of each request the received data is rendered on a map.
For this request, I am trying to calculate:
Request Time (how much total time it took for client to get data from server)
Processing Time (how much total time it took for client to render the data on client side)
In order to do this, I am capturing start time of each request before sending it to server (using jquery "beforeSend") and "onSuccess" event of each request, the end time is captured.
Once the all requests are completed, I am deducting the "start time" of first request from the "end time" of last request in order to calculate the total time the client took for fetching records from server. (Similarly for Processing Time)
But somehow my calculation doesn't produce correct results. Could anyone please provide me some suggestions on this issue?
To explain my question in a better way, here is the code:
var dataProviderRequestsStartTime = [];
var dataProviderRequestsEndTime = [];
var dataParsingStartTime = [];
var dataParsingEndTime = [];
getResults(ids);
var getResults = function(totalIds) {
for(var i=0; i<10; i++) {
requestResultForOneChunk(totalIds[i]);
}
};
var requestResultForOneChunk = function(streetIds) {
$.ajax({
beforeSend: function() {
var requestStartTime = new Date().getTime();
dataProviderRequestsStartTime.push(requestStartTime);
},
type : 'POST',
url : "myServlet",
contentType : "application/x-www-form-urlencoded",
data : {
"ids" : streetIds,
},
success : function(response) {
//Request Finished
var dataProvideRequestEndTime = new Date().getTime();
dataProviderRequestsEndTime.push(dataProvideRequestEndTime);
addFeaturesToMap(response);
},
error : function(x, e) {
alert("Something went wrong in the request" + e);
}
});
};
var addFeaturesToMap = function(measurements) {
//Parsing Started
var featureParsingStartTime = new Date().getTime();
dataParsingStartTime.push(featureParsingStartTime);
doParsing(measurements);
//Parsing Finished
var featureParsingEndTime = new Date().getTime();
dataParsingEndTime.push(featureParsingEndTime);
};
$("#loading").bind(
"ajaxStart",
function(options) {
ajaxStartTime = options.timeStamp;
}).bind("ajaxStop", function(options) {
var ajaxEndTime = options.timeStamp;
var totalTime = (ajaxEndTime - ajaxStartTime);
calculateTimeBreakDown();
});
var calculateTimeBreakDown = function() {
var totalValues = dataProviderRequestsEndTime.length;
var lastValueIndex = totalValues - 1;
// Request Time calculation
var endTimeOfLastRequest = dataProviderRequestsEndTime[lastValueIndex];
var startTimeOfFirstRequest = dataProviderRequestsStartTime[0];
var totalRequestTime = (endTimeOfLastRequest - startTimeOfFirstRequest);
// Parsing Time Calculation
var endTimeOfLastParsing = dataParsingEndTime[lastValueIndex];
var startTimeOfFirstParsing = dataParsingStartTime[0];
var totalParsingTime = (endTimeOfLastParsing - startTimeOfFirstParsing);
};
Finally, I have the request time (totalRequestTime) and the parsing time (totalParsingTime). But the problem is that adding these together doesn't produce a value anywhere near the total time calculated using ajaxStart and ajaxStop.
Look at the .ajaxStart() and .ajaxStop() events for the "total time" (those are also great for progress bars):
http://api.jquery.com/ajaxStart/
http://api.jquery.com/ajaxStop/
and the .ajaxSend() and .ajaxComplete() events for "cumulative time" calculations:
http://api.jquery.com/ajaxSend/
http://api.jquery.com/ajaxComplete/
look at this code:
var totalTime = null;
var cachedTime = null;

function alertLoadingTime() {
  if (!totalTime) return;
  var loadingTime = totalTime / 1000;
  console.log("loaded " + loadingTime + " seconds");
}

function timingStart() {
  cachedTime = new Date;
}

function timingEnd() {
  var endTime = new Date;
  totalTime += endTime - cachedTime;
  cachedTime = null;
  alertLoadingTime();
}

$(document).ajaxStart(timingStart);
$(document).ajaxStop(timingEnd);
Note that this only accounts for time spent doing ajax calls and won't include the initial page loading time.
To time the parsing:
Use the same functions as before, but change totalTime to totalParsingTime. (You can achieve this by changing totalTime to reference some other variable.)
Call timingStart() right before you append the result of the ajax call to the DOM tree.
Have the server add timingEnd() to the end of every response.
totalTime will then be set to the time it took to add everything to the DOM tree.
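For the "cumulative time" mentioned above, a rough sketch with .ajaxSend() and .ajaxComplete() could look like this (it stamps each request's start time onto its jqXHR object; the property name is just a placeholder):

var cumulativeTime = 0;

$(document).ajaxSend(function(event, jqXHR, settings) {
  jqXHR.startTime = Date.now(); // remember when this particular request was sent
});

$(document).ajaxComplete(function(event, jqXHR, settings) {
  cumulativeTime += Date.now() - jqXHR.startTime;
  console.log("cumulative ajax time: " + cumulativeTime + "ms");
});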
A solution for you would be to rely on the jQuery global ajax callback methods:
ajaxStart: register a handler to be called when the first Ajax request begins.
ajaxStop: register a handler to be called when all Ajax requests have completed, including success and error callbacks.
I have used the code snippet below in my application and it works perfectly fine for reporting page rendering time including ajax calls.
var startTime = 0, endTime = 0;
$(document).ajaxStart(function(){
  startTime = new Date();
});
$(document).ajaxStop(function(){
  endTime = new Date();
  console.log("total time required : ", endTime - startTime); //Total time in milliseconds including time spent in success or error callbacks
});