Google Apps Script

Whilst recently in Berlin I was invited to GDG Berlin to give a presentation on Google Apps Script. In my experience developer groups usually have low awareness of Google Apps Script, so the challenge is to provide enough information to entice developers into finding out more and using this tool. Last year I did a similar talk for the London #DevFest on Google Apps Script: The authentic{ated} playground. That talk focused on how Apps Script is integrated into the Google ecosystem, for me one of the big selling points of this product. For GDG Berlin I decided to revisit my original slide deck thinking I could add some minor tweaks. In the end it turned into a major rewrite, in part to reflect changes in Apps Script over the last year. Here is how I initially pitched the session: ...continue reading

Last week I was fortunate to attend OEB 2015 (http://www.online-educa.com/) in Berlin. There were many great presentations and sessions at OEB and I particularly enjoyed Jenny Fisher and Chrissi Nerantzi’s Playing for Change – the Role of Creative Activities session, and I look forward to more similar sessions at the ALT Annual Conference next year. My contribution to OEB15 was a workshop on ‘Creating personal tutoring environments with Google Apps Script’. As it happened this workshop followed the opening plenary in which David Price was encouraging delegates to find their inner hacker and take a more autonomous approach to education (recording here). Below is my submitted session abstract followed by slides: ...continue reading

Later today I’ll be presenting at CiviCon London, the largest gathering of the CiviCRM community in Europe. CiviCRM is a big part of my day job at the Association for Learning Technology and this event is the perfect opportunity to hear about the latest developments and get some inspiration for what we can do with our own Civi system.

The talk I’ll be giving is based on earlier work I’ve already published: Custom regular reporting from #CiviCRM with Google Sheets and Google Apps Script and Tips on emailing inline Google Charts from Sheets using Apps Script. These outline the method we use to trigger the sending of .csv data attached to an email sent from CiviReport using a Google Sheet: ...continue reading


Google recently announced a new Google Apps Script service called the Execution API. The pitch:

Have you ever wanted a server API that modifies cells in a Google Sheet, to execute a Google Apps Script app from outside of Google Apps, or a way to use Apps Script as an API platform? Today, we’re excited to announce you can do all that and more with the Google Apps Script Execution API. The Execution API allows developers to execute scripts from any client (browser, server, mobile, or any device). You provide the authorization, and the Execution API will run your script.

Getting data into Google Sheets from external sources has been a long-held interest of mine. Whilst the Execution API can do much more than this, in this post I want to focus on how you can set up the Execution API to create an authenticated pipeline. Before I go into this it’s worth reminding ourselves of the other ways you can get data into Google Sheets, in particular publishing a script as a web app. ...continue reading
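As a quick reminder of the web app route: a script published as a web app exposes doGet/doPost functions that clients can call over HTTP. A minimal sketch, assuming a container-bound script and a sheet named 'Data' (both assumptions for illustration):

function doPost(e) {
  // append the posted 'value' parameter to a sheet named 'Data'
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  sheet.appendRow([new Date(), e.parameter.value]);
  // return a simple JSON acknowledgement
  return ContentService.createTextOutput(JSON.stringify({status: 'ok'}))
      .setMimeType(ContentService.MimeType.JSON);
}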

Photo credit: Chris Bull CC-BY-NC-SA ALT https://flic.kr/p/xx86Ua

Last week was #altc, one of the largest annual edtech events in the UK calendar. Each year we’ve seen a growth in the use of Twitter at the event and it makes sense for us to have the live Twitter stream displayed throughout the venue. In previous years we’ve relied on 3rd party services but this year I wanted to bring this in-house. Having looked around I came across a nice example by Remy Sharp for ‘A conference twitter wall with built in schedule & announcements’. This looked ideal as the schedule and announcements were a nice additional feature, plus it was coded in HTML/JavaScript, making it easier for me to modify. ...continue reading

Previously I’ve written about how we use Google Apps Script to extract data from one of the Association’s online systems. We use this technique to generate charts which give us some insight into our current standing. One limitation was that to view the charts we had to open the Google Sheet containing them. There are various techniques for automatically distributing these using Apps Script and in this post I’m going to highlight one method for emailing inline charts. Below is what the result looks like: an email with an embedded chart, no need to download or preview attachments. ...continue reading
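The gist of the method is to grab the chart as an image blob and reference it from the email’s HTML body. A minimal sketch, assuming the chart sits on the active sheet and using a placeholder address:

function emailChart() {
  // grab the first embedded chart on the active sheet as a PNG blob
  var chart = SpreadsheetApp.getActiveSheet().getCharts()[0];
  var blob = chart.getAs('image/png').setName('chart');
  // reference the blob from the html body via cid and attach it inline
  MailApp.sendEmail({
    to: 'someone@example.org', // placeholder address
    subject: 'Latest chart',
    htmlBody: '<p>Latest figures:</p><img src="cid:chart">',
    inlineImages: {chart: blob}
  });
}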

As of yesterday TED have uploaded 1,903 videos totaling 1,622,120 seconds of playtime, which have been viewed 428,117,012 times and received 4,360,903 likes.

If you’d like to play with the data you can find it in this YouTube Channel Summary – TEDtalksDirector Google Sheet … and if you would like similar data for your or someone else's channel make a copy of this YouTube Channel Summary Google Sheet and follow the setup instructions. ...continue reading
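For anyone curious how numbers like these are collected, channel-level statistics can be pulled with the YouTube Advanced Service in Apps Script (enabled via Resources > Advanced Google services). A rough sketch, not the exact code in the template:

function getChannelStats() {
  // requires the YouTube advanced service to be enabled for the project
  var response = YouTube.Channels.list('statistics', {forUsername: 'TEDtalksDirector'});
  var stats = response.items[0].statistics;
  Logger.log('Videos: %s, Views: %s', stats.videoCount, stats.viewCount);
}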

Last year I wrote about how you can use Google Apps Script to integrate with import.io. If you are not familiar with import.io, the service lets you:

transform any website into a table of data or an API in minutes without even writing any code

As part of my work at ALT we recently needed to extract data from our hosted Open Conference Systems (OCS). OCS has some data export options but none that fitted our exact needs. As a hosted solution we don’t have access to the backend so I turned to import.io to liberate our own data <sigh>. OCS uses basic authentication but the great thing about import.io is you can train it to enter your username and password and extract the data from the pages you need. Getting data behind an authentication layer with the import.io API is a two-step process:

Make sure you check out the docs before integrating authenticated sources!
Every time you pass in credentials you will be logged in; pass in credentials once or via a login call and subsequently pass through cookies.

It took me a while to get my head around the process because the two links in the support message just take you to the generic API docs. This is a better URL for the queryLogin method. It’s clear that import.io have put a lot of work into the developer experience, but unfortunately I struggled testing the queryLogin method. Using a valid id and model schema for the input just gave an ‘UnexpectedErrorException’. So I then turned to import.io’s own dataset tools. This was another dead end as I was struggling to get it to recognise my OCS login. Peeking under the hood, however, I discovered the dataset tool appeared to be calling import.io’s own API.

Looking for another ‘in’, a quick search came up with this post on Using import.io authenticated data sources with PHP and Go. Given I do a lot of coding in PHP, translating to JavaScript/Google Apps Script is relatively straightforward. I was still struggling, however, with the ‘shape’ of the login payload and the $connectorDomain. The breakthrough came from remembering that import.io looked like they were dogfooding their own API in their dataset tool.

Luke, use the log

With this I could see what the $connectorDomain should have been and could happily go off and liberate our data. Here’s my translation of the PHP example into Google Apps Script, also available as a gist:

function getResults() {
  var connector = {'username':'YOUR_SITE_USERNAME',
                   'password':'YOUR_SITE_PASSWORD',
                   'connectorDomain':'YOUR_CONNECTOR_DOMAIN',
                   'userGuid':'YOUR_USER_GUID',
                   'connectorGuid':'YOUR_CONNECTOR_GUID',
                   'apiKey':'YOUR_API_KEY'};
  
  // map the site credentials to the connector domain
  var creds = {};
  creds[connector.connectorDomain] = {
    "username": connector.username,
    "password": connector.password
  };
  
  var additionalInput = {};
  additionalInput[connector.connectorGuid] = {'domainCredentials':creds};
  // log in first to get session cookies (the last argument sets loginOnly)
  var login = query(connector.connectorGuid, false, connector.userGuid, connector.apiKey, additionalInput, true);
  // ...then pass the cookies through on subsequent queries
  additionalInput[connector.connectorGuid].cookies = login.cookies;  
  var result = query(connector.connectorGuid, {"webpage/url":"http://ocs.sfu.ca/alt/index.php/conferences/altc2015/director/submissionReview/799/1"}, connector.userGuid, connector.apiKey, additionalInput, false);
  // do something with result like write to a Google Sheet https://developers.google.com/apps-script/guides/sheets#writing_data
}
 
// http://blog.import.io/post/using-importio-authenticated-data-sources-with-php-and-go
function query(connectorGuid, input, userGuid, apiKey, additionalInput, login) {
 
  var url = "https://api.import.io/store/connector/" + connectorGuid + "/_query?_user=" + userGuid + "&_apikey=" + apiKey;
 
  var data = {};
  if (input) {
    data["input"] = input;
  }
  if (additionalInput) {
    data["additionalInput"] = additionalInput;
  }
  if (login) {
    data["loginOnly"] = true;
  }
 
  var options = {'method': 'post',
                 'contentType': 'application/json', // send the payload as JSON
                 'payload': JSON.stringify(data)};
  var response = UrlFetchApp.fetch(url, options);
  return JSON.parse(response.getContentText());
}

At the Association for Learning Technology (ALT) one of the core tools we use for membership management is CiviCRM. CiviCRM has a number of ‘out-of-the-box’ reports you can use to summarise and analyse memberships and contributions. Given the flexibility of Civi you can also, with a bit of know-how, create custom extensions and reporting options enabling some very sophisticated analytics. At ALT we are keen to make more use of the data we have on memberships but at the same time have limited resources and time. In this post I’ll highlight how, using Google Sheets, we’ve developed a framework that allows us to rapidly develop custom reporting.

Problem

If you are already a CiviCRM user you are probably aware of CiviReports and the ability to create basic custom reports which allow you to view and download data. As part of this you can also schedule reports to land in your inbox. This is great but has its limitations. In particular, additional setup is required if you don’t want to report on just a daily basis; you end up with tables of data with no graphical summaries; and combining current and historic data isn’t possible.

Solution

Scheduling reports at custom intervals

CiviCRM provides an interface to schedule a mail_report. The issue many people discover is this will send reports at set intervals, usually hourly or daily. You can schedule individual jobs to run at specific periods but you quickly find yourself in the world of command lines and cron jobs. Crons are scheduled tasks run on the server. If you have dedicated admin support this is a fairly standard task and the instructions are easy to follow. At ALT we have the option to open a support ticket with our host but this seems like a waste of time and money.

Our solution is to use a Google Sheet… well, a Google Sheet with a bit of ‘juice’. The sheet is shared with our team and anyone can add a CiviReport id to Column A and choose how often it runs in Column B using a data validation list option.

deciding what reports run when

But how does this trigger our Civi install to run the job? The juice is Google Apps Script, a cloud-based scripting language native to Google Drive. Apps Script is a free service from Google and, fortunately for us, has the ability to run scripts at configured time intervals. It also has the ability to call specific URLs using the built-in UrlFetchApp service (similar to cURL). I’ll give you a link to this Sheet so you can set up your own later, and when you do you’ll see the entire process is managed with the couple of lines of code included below:

// COL (defined elsewhere in the project) maps the sheet columns to zero-based
// indices, e.g. report_id, type, next_run, format, ss_id, ss_sht and total
function doTasks() {
  var doc = SpreadsheetApp.getActiveSpreadsheet(); // get spreadsheet
  var sheet = doc.getSheetByName("Tasks"); // get sheet
  var data = sheet.getRange(3, 1, sheet.getLastRow(), COL.total).getValues(); // get values
  var now = new Date(); // time now
  // iterate across each row of the sheet
  for (var i = 0; i < data.length; i++){
    if (data[i][COL.report_id] != ""){ // if there is a report id do something
      // collect row values
      var report_id = data[i][COL.report_id];
      var type = data[i][COL.type];
      var next_run = data[i][COL.next_run] || 0; 
      // check if it's time to run the report again
      if (next_run < now && type != "never"){
        // if it is, ping the report trigger
        var new_next_run = callUrl(report_id, type, {format: data[i][COL.format], ss_id: data[i][COL.ss_id], ss_sht: data[i][COL.ss_sht]} );
        // ..and record when it ran and when to run again
        sheet.getRange(i + 3, 3, 1, 2).setValues([[now, new_next_run]]);
      }
    }
  }
}
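For completeness, this is roughly how a time-driven trigger wiring doTasks to the clock can be created (triggers can also be added through the Script editor; the hourly interval here is just an example):

function createTimeTrigger() {
  // run doTasks() every hour on a time-driven trigger
  ScriptApp.newTrigger('doTasks')
      .timeBased()
      .everyHours(1)
      .create();
}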

What doTasks does is read the sheet data and then iterate across each row. If a report is overdue to run again it calls another custom function, callUrl, which runs the CiviReport and returns when next to run.
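callUrl itself is included in the Sheet template linked at the end of this post. As a rough sketch of its shape (the trigger URL and the interval lengths below are placeholder assumptions, not the production values):

// Sketch of callUrl: ping the URL that triggers the CiviReport mail_report
// job, then work out when the report is next due. CIVI_TRIGGER_URL is a
// placeholder; the real endpoint depends on your CiviCRM install.
function callUrl(report_id, type, options) {
  var CIVI_TRIGGER_URL = 'https://example.org/civicrm/mail_report?instanceId='; // hypothetical
  UrlFetchApp.fetch(CIVI_TRIGGER_URL + report_id);
  // assumed mapping of the Column B choices to days between runs
  var days = {'daily': 1, 'weekly': 7, 'monthly': 30};
  var next_run = new Date();
  next_run.setDate(next_run.getDate() + (days[type] || 1));
  return next_run;
}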

Creating graphical summaries and combining data

By this point you may be sensing that I’m partial to solving problems with Google Sheets. With Sheets it’s fairly straightforward to manually export different reports from Civi and analyse them using formulas and charts. The manual export of CiviReports can get tiresome, so how can we automate this? Again we return to Google Apps Script. One of the options in CiviReports is to attach the data to the emailed report as a .csv file. From the previous example we can see it is possible to read and write data to a Google Sheet. So if we can get the .csv file from our emailed report we can write it to the Sheet … right?

This is actually more straightforward than you may think, as another feature of Google Apps Script is the ability to interact with the script owner’s Gmail. As part of this we can search for messages and get associated attachments. Using this we can read the latest report from Civi, write the data to a sheet and, with a bit of clever formula building, automatically get the latest summary or custom chart. As Apps Script runs in a pre-authenticated environment (no OAuth handshakes here) the code is relatively straightforward:

function processInbox(){
  var PS = PropertiesService.getScriptProperties();
  var data = PS.getProperties();
  for (var key in data) {
    if (key.substring(0, 10) == "search_str"){
      var param_raw = data[key];
      var param = JSON.parse(param_raw);
      // get last 20 message threads using search term
      var threads = GmailApp.search(param.search_str, 0, 20); 
      // assume last thread has our latest data
      var last_thread = threads.length-1;
      if (last_thread > -1){
        // get message in the last thread        
        var msg = threads[last_thread].getMessages()[0];
        // get the attachments
        var attachments = msg.getAttachments();
        for (var k = 0; k < attachments.length; k++) {
          // get the attachment as a string
          var csv_str = attachments[k].getDataAsString();
          // parse string as csv
          var csv = Utilities.parseCsv(csv_str);
          // create destination object
          var doc = SpreadsheetApp.openById(param.ss_id);
          var sheet = doc.getSheetByName(param.ss_sht);
          // clear any old data
          sheet.clear();
          // write new data
          sheet.getRange(1, 1,  csv.length, csv[0].length).setValues(csv);
          // mark thread as read and archive (you could also label or delete)
          threads[last_thread].moveToArchive().markRead();
          PS.deleteProperty(key);
        }
      }
    }
  }
}

NB

Data protection

There are a couple of things worth noting here. Google Sheets is a fantastic collaborative environment and with this solution we can still share spreadsheets with selected people in our organisation and beyond. Something to remember, though, is that this script runs as the sheet owner, so when configuring the CiviReport it needs to go to the email address of the owner. At ALT we benefit from being a Google for Education user so our email and Drive access come as part of the Google Apps suite. This solution could also be set up on a regular Google account but there are data protection issues to consider when sending reports to a non-organisation contact. As such you might only want to re-use this solution as an easy way to schedule reports rather than schedule and process the data.

ARRAYFORMULA, FILTER, QUERY functions are your friends

As our solution dumps .csv data into a sheet, clearing any previous data, any formulas you have in that sheet will also be lost. We get around this by doing all data manipulation in a separate sheet which references the imported data. To save copying and pasting lots of formulas we make extensive use of the ARRAYFORMULA, FILTER and QUERY functions available in Google Sheets.

Comparing CiviEvent registrations

One scenario we have is monitoring the number of registrations to our annual conference. It’s easy for us to export registrations from previous years as static data into a spreadsheet. For the analysis we want to group the number of registrations by week number. To get a week number for the live data we create a second sheet that references it. Using ARRAYFORMULA you can enter a reference once and have it applied to all the values in the range. For example, we get all the registration dates in column A using =ArrayFormula('2015LiveData'!B:B) in cell A1 and then extract the week numbers in column C using the formula =ARRAYFORMULA(IF(ISBLANK(A2:A),"",WEEKNUM(A2:A))) in cell C2.
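From there a QUERY formula can do the grouping. Something along these lines would count registrations per week number (an illustrative formula, not the exact one from our sheet):

=QUERY(C1:C, "select C, count(C) where C is not null group by C label count(C) 'registrations'", 1)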

ArrayFormula is your friend

Setting up your own copy of Schedule CiviCRM Reports

If this is a project you’d like to use and/or build upon you can copy a blank version of our template below:

Schedule CiviCRM Reports
[While signed in File > Make a copy]

Once you have a copy open Tools > Script editor to see all of the project code and instructions for setting up.

I’ve hopefully given you enough to get set up, but feel free to leave a comment if you get stuck or have any questions.

Enjoy!