Is learning technology fit for purpose?

In my opinion, generally yes ... but are the education systems they support equally optimised? Today's education is focused on the administration of learning, and as such learning technology inevitably has a similar focus.

It'll be interesting to see how One Big Question: a #digped Discussion turns out:

Imagine that no educational technologies had yet been invented — no chalkboards, no clickers, no textbooks, no Learning Management Systems, no Coursera MOOCs. If we could start from scratch, what would we build?

The problem I see with that question is, as I recently highlighted:

the tyranny of the present is, it makes it very difficult to think of any change that's not incremental to it - Alan Kay

I'm personally really looking forward to seeing how the Lawrie Phillips and Donna Lanclos session goes at #altc this year, Reframing the debate? Learning Technologies part 2 (paper 881 in the programme).

Previously I’ve written about how we use Google Apps Script to extract data from one of the Association’s online systems. We use this technique to generate charts which give us some insight into our current standing. One limitation was that to view the charts you had to open the Google Sheet containing them. There are various techniques for automatically distributing these using Apps Script, and in this post I’m going to highlight one method for emailing inline charts. Below is what the result looks like: an email with an embedded chart, no need to download or preview attachments.

End product - inline charts

Below is the same chart in the Google Sheet. If we make changes to the chart appearance or update the data, when we send the chart we get the latest version.

Source chart in Google Sheet that will be emailed

I found the code snippet for doing this on Stack Overflow. It works well but there are two issues to be aware of.

Google Sheet needs to be shared with ‘anyone with the link’

As noted in the SO comments, for the chart to appear the Google Sheet containing it needs to be shared with ‘anyone with the link’. Without this all you get is a blank image. This potentially creates a data protection issue if you are using sensitive information. There are ways you can juggle this, such as using the IMPORTRANGE formula to publish only aggregated data. This looks like a bug, which I’ve noted in this issue ticket.

Missing numeric labels

The second issue we discovered was that axes with numeric values end up prefixed with ‘General’, e.g. General88000 as seen below.

Numeric label issue

The solution is fairly straightforward and actually opens the door to doing a lot more with charts. It uses the EmbeddedChartBuilder service to grab the chart object. The original Google announcement for the Charts service highlights that you can:

  • Add/Remove the ranges this chart represents via addRange and removeRange.
  • Set options that modify how the chart will be rendered as well as change the chart type with setOption and setChartType.
  • Change where the chart will be displayed (the cell and cell offset of the chart container) via setPosition.

Achieving proper axis formatting requires some cross-document reference lookup to find the correct setOption to use. In this case it means jumping to the Google Charts API reference for line charts to see the configuration options and setting the format:

builder.setOption('vAxis.format', '#');

Our modified function combines this setOption call with sending the chart as an inline image.
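A minimal sketch of that combination, assuming the chart sits on a sheet named 'Data' and using a placeholder recipient address (both are my assumptions rather than our production values):

function emailInlineChart() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data'); // assumed sheet name
  var chart = sheet.getCharts()[0];
  // rebuild the chart with a plain numeric axis format to avoid the 'General' prefix
  var blob = chart.modify()
      .setOption('vAxis.format', '#')
      .build()
      .getAs('image/png')
      .setName('chart.png');
  MailApp.sendEmail({
    to: 'someone@example.com', // placeholder recipient
    subject: 'Latest chart',
    htmlBody: 'Here is the latest chart:<br/><img src="cid:chartImage"/>',
    inlineImages: {chartImage: blob}
  });
}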

Missed a trick: as Alex Bellars points out, I should have called it #blideo

David Hopkins recently challenged me to a #blimage and as David explains:

if this is the first time you’ve come across #blimage, here’s a brief summary of what it is. In short, Steve Wheeler (@timbuckteeth), in conversation with Amy Burvall (@amyburvall) and Simon Ensor (@sensor63), started the #blimage challenge, which is:

“a confection of Blog-Image. (Yes, we are now in the age of blim!) You send an image or photograph to a colleague with the challenge that they have to write a learning related blog post based on it. Just make sure the images aren’t too rude. The permutations are blimmin’ endless.”

David provided a picture of a Lego stormtrooper at sunset to #blimage me with. You can’t beat a bit of Lego. My first thoughts weren’t about the image or what the image could be interpreted as, but the image as a digital artefact. As a digital image it can be endlessly replicated, distributed, remixed and more. As someone who until the late 90s predominantly learned from books, the paper kind, it’s interesting to reflect on how ‘digital’ has changed the way I learn and, in particular, the digital moving image. So in true academic style I’m not going to answer the question set, but answer my own question instead:

what #vidage has influenced my thoughts around education?

Sir Ken Robinson – RSA Animate: Changing Paradigms

With over 13 million views I’m guessing you’ve already seen this, but given the long tail on these things those new to the sector may have missed it. Back in 2010 when I first saw it, the combination of Sir Ken’s clear passion and the ‘live’ sketchnotes was powerful, planting the seed that education should be better.

Bret Victor – The future of programming

So if you think my first #vidage was clichéd, this is hopefully where I win you back. What does the ‘future of programming’ have to do with education? For this talk Victor presented the world view of programming à la 1973, highlighting projects from early computer science like Sketchpad, the massively parallel processor array and Doug Engelbart’s ‘mother of all demos’. Looking at what was achieved in this era you cannot but admire the great creativity and innovation. But how was so much achieved with relatively little? The answer … they were making it up as they went along. There was no defined architecture or notion of the best way to do things. Victor warns that the danger now is a fixed mindset, a dogmatic approach to computer science that is stifling development. This is where I see parallels with education, echoed by Sir Ken’s Changing Paradigms. I’ve embedded the punchline below but encourage you to watch the entire talk.

Alan Kay - The Future Doesn't Have to Be Incremental

Sticking with the computer science theme, the last #vidage comes from someone who was part of the 60s vanguard, Alan Kay. Kay has had, and continues to have, a big impact on personal computing and computer science. In Kay’s talk the focus is less on these achievements than on the processes and mindset behind them, in particular dreaming about a future reality, drawing on the past but ignoring the present:

the tyranny of the present is, it makes it very difficult to think of any change that's not incremental to it

This is quite a step change for education but the future reality is worth dreaming about. You can watch Kay’s talk below:

So that’s some of my inspirational #vidage. What’s yours?

tl;dr I recently made a WatchMaker Android Wear watch face based on the ‘squircle’ (download here). This post covers the maths behind squircles and getting (x,y) co-ordinates for any given polar angle. Finally I reveal the secrets of functions in WatchMaker Lua script.

A couple of months ago I started to play with the WatchMaker Watch Face app for Android. This app makes it relatively simple to design your own watch faces for Android Wear. After my first foray into becoming a “watch [face] maker” I’ve gone on to customise other watch faces and also put together R∆D∆R, shown below.

R∆D∆R

As part of the WatchMaker app there is a Google+ Community which includes a ‘request a watch face’ thread. A lot of the time people are just asking for copies of existing ‘real’ watches, which can result in beautiful replicas but don’t interest me much. However, a request recently came in from Andrew Davis that intrigued me.

Calls to squircle 

Squircles

Squircles are apparently The Hottest Thing In Car Design Right Now and hitting Wikipedia you can find a basic equation to represent the path:

\left( x - a \right)^4 + \left( y - b \right)^4 = r^4

Given WatchMaker uses a Cartesian co-ordinate system with 0,0 at the centre (that is, with a = b = 0) you get Lamé's special quartic. Delving into the recesses of my high school maths, I knew that if I solved this equation together with that of a line through the origin (using tan(θ) for the gradient) I could get the intercept co-ordinates.

Intercept

So basically:

y = \tan(f)\,x \quad \text{and} \quad y^4 + x^4 = a^4, so

\tan^4(f)\,x^4 + x^4 = a^4

And cue much scribbling and crumpled paper and you get…

x = \frac{a}{\left(\tan^4(f) + 1\right)^{1/4}} \quad \text{and} \quad y = \frac{a\,\tan(f)}{\left(\tan^4(f) + 1\right)^{1/4}}

Putting these equations into a spreadsheet and providing a range of angles gives:

Graphing in a spreadsheet makes it easy to adjust the variables and see the effect. In this case there is only one variable, a, the radius of the squircle, and it turns out a is not much to play with: changing it just scales the same shape.
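Dividing the Lamé quartic through by a^4 makes this plain; every curve in the family is a scaled copy of the unit one:

\left(\frac{x}{a}\right)^4 + \left(\frac{y}{a}\right)^4 = 1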

Wondering if I’d missed something I hit Google again and discovered on Wolfram MathWorld that there are ‘two incompatible definitions of the squircle’ (guess Wikipedia needs updating), the second by Fernandez Guasti:

s^2 \frac{x^2}{k^2} \frac{y^2}{k^2} - \left( \frac{x^2}{k^2} + \frac{y^2}{k^2} \right) + 1 = 0,

with squareness parameter s, where s=0 corresponds to a circle with radius k and s=1 to a square of side length 2k. This curve is actually semialgebraic, as it must be restricted to |x|,|y|≤k to exclude other branches.

Having scraped a solution for the first definition of a squircle I didn’t feel confident that I’d be able to do the same with the Guasti definition. Knowing of the existence of software for solving algebraic systems, another googling turned up the open source Maxima. So cue much keyboard bashing and head scratching to get:

x=(k*sqrt(-sqrt(-4*tan(f)^2*s^2+tan(f)^4+2*tan(f)^2+1)+tan(f)^2+1))/(sqrt(2)*tan(f)*s) 

and

y=(k*sqrt(-sqrt(-4*tan(f)^2*s^2+tan(f)^4+2*tan(f)^2+1)+tan(f)^2+1))/(sqrt(2)*s)
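For the record, the step Maxima is grinding through: substituting y = tan(f)x into the Guasti equation turns it into a quadratic in x²/k², which the quadratic formula then solves (taking the root on the main branch):

s^2 \tan^2(f) \, \frac{x^4}{k^4} - \left(1 + \tan^2(f)\right) \frac{x^2}{k^2} + 1 = 0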

The nice thing about Maxima is you can copy and paste solved equations directly into a spreadsheet and get them to work with little modification, i.e. the x output can be pasted as =(k*sqrt(-sqrt(-4*tan(f)^2*s^2 + tan(f)^4 + 2*tan(f)^2 + 1) + tan(f)^2+1))/(sqrt(2)*tan(f)*s). To make your life even easier you can use Named Ranges to define the constants. Putting some data into this Google Sheet I was able to see how the Guasti definition of a squircle renders, and importantly performs.

Something I wanted to do in Google Sheets, but couldn’t get the charting tools to do, was graph different squircles with varying k and s values on the same x-y scatter graph, the idea being I could use this as the background image for my watch face. So I exported to MS Excel (I’ve dropped this file on to OneDrive if you want to have a look):

Squircles in Excel

WatchMaker functions and Lau scripting

WatchMaker makes it easy to add primitive elements to your watch face, and provides a host of predefined variables you can drop in. In previous faces I’ve designed I’ve dropped these into an element property. For example, rotations based on the second, minute and hour hands and other device readings can be used as part of the element properties. Below is an example where I’m using the watch battery charge level, expressed as a rotation {br}, as the degrees in a radar element.

WatchMaker {br} property

So I could use the formulas developed in the spreadsheet for the x, y position of an element using a rotation variable, but given I’ve got three rings and separate x and y values this didn’t seem clever, particularly if I wanted to adjust the k and s values. Looking at the WatchMaker developer reference I could see you could define functions that return values, but despite my best efforts every time I tried to call these functions in a property nothing would happen. Trying to find a solution online I drew a blank; the breakthrough was to look at other people’s watches I’d imported into WatchMaker to see how they did it. Doing this I was able to finally work out what the documentation was trying to say … or at least what I think it should say. The solution is to define a set of global variables which are updated every millisecond or every second by a script that runs on startup; the variables are then used in the element properties. The finished face is shown below, followed by the script; you can download it here.

The finished watch face

-- setup some globals
-- var_ms used to prefix on_millisecond variables
var_s_s = 0.85
var_s_k = 230
var_ms_s_x = 0
var_ms_s_y = -var_s_k

-- var_s used to prefix on_second variables
var_m_s = 0.895
var_m_k = 180
var_s_m_x = 0
var_s_m_y = -var_m_k

var_h_s = 0.94
var_h_k = 130
var_s_h_x = 0
var_s_h_y = -var_h_k

-- adaptive layout for circular faces
var_round_face = 0
if ({around}) then
  var_s_s = 0.7
  var_s_k = 187
  var_ms_s_x = 0
  var_ms_s_y = -var_s_k

  var_m_s = 0.71
  var_m_k = 147
  var_s_m_x = 0
  var_s_m_y = -var_m_k

  var_h_s = 0.72
  var_h_k = 100
  var_s_h_x = 0
  var_s_h_y = -var_h_k
  var_round_face = 100
end

-- things to do each millisecond
function on_millisecond()
  var_ms_s_x = xpos({drss}, var_s_s, var_s_k)
  var_ms_s_y = ypos({drss}, var_s_s, var_s_k)
end

-- things to do every second
function on_second()
  var_s_m_x = xpos({drm}, var_m_s, var_m_k)
  var_s_m_y = ypos({drm}, var_m_s, var_m_k)
  var_s_h_x = xpos({drh}, var_h_s, var_h_k)
  var_s_h_y = ypos({drh}, var_h_s, var_h_k)
end

-- function to calc x position on the squircle for rotation a (degrees),
-- squareness s and radius k; a-90 converts WatchMaker's rotation
-- (0 at 12 o'clock) to an angle measured from the x-axis
function xpos(a, s, k)
  local pos
  if (a == 0 or a == 180) then
    return 0
  elseif (a == 90 or a == 270) then
    pos = k
  else
    pos = (k*math.sqrt(-math.sqrt(math.tan(math.rad(a-90))^4-4*s^2*math.tan(math.rad(a-90))^2+2*math.tan(math.rad(a-90))^2+1)+math.tan(math.rad(a-90))^2+1))/(math.sqrt(2)*s*math.tan(math.rad(a-90)))
  end
  -- flip the sign to match the quadrant the angle falls in
  if (a > 0 and a < 90) then
    return -pos
  elseif (a >= 90 and a < 270) then
    return pos
  else
    return -pos
  end
end

-- function to calc y position (same formula without the tan factor)
function ypos(a, s, k)
  local pos
  if (a == 90 or a == 270) then
    return 0
  elseif (a == 0 or a == 180) then
    pos = k
  else
    pos = (k*math.sqrt(-math.sqrt(math.tan(math.rad(a-90))^4-4*s^2*math.tan(math.rad(a-90))^2+2*math.tan(math.rad(a-90))^2+1)+math.tan(math.rad(a-90))^2+1))/(math.sqrt(2)*s)
  end
  -- flip the sign to match the quadrant the angle falls in
  if (a >= 0 and a < 90) then
    return -pos
  elseif (a >= 90 and a < 270) then
    return pos
  else
    return -pos
  end
end

If you've made it this far, thank you :)

As of yesterday TED have uploaded 1,903 videos totalling 1,622,120 seconds of playtime, which have been viewed 428,117,012 times and received 4,360,903 likes.

If you’d like to play with the data you can find it in this YouTube Channel Summary – TEDtalksDirector Google Sheet … and if you would like similar data for your or someone else's channel make a copy of this YouTube Channel Summary Google Sheet and follow the setup instructions.

Setup

To get this working there are a couple of hoops to jump through. As the YouTube API is an Advanced Service it must be enabled before use. To do this for this project you need to be in the Script Editor, so:

  1. Open Tools > Script editor and then click Resources > Advanced Google Services…
  2. Scroll down to YouTube Data API to turn it on, then click the ‘Google Developers Console’ link:
    Enabling Advanced Services
  3. In the Google Developers Console find and turn on the YouTube Data API. After it’s enabled you can close the Console window.
  4. Finally, assuming you’ve got the desired username set in cell B1, select Run > writeYTChannelSummaryToSheet.

Note: if the channel has a lot of videos the script will automatically start running again after five seconds until it gets everything.

How it was made

I made this template following a request from Brian Bennett:

… so let’s look in more detail at how you access YouTube data in Google Apps Script. I’ve already highlighted the need to activate the YouTube integration as an advanced service. For advanced services there is less documentation on the Google Apps Script site and generally you are better off looking at the API documentation for the service itself, in this case the YouTube Data API. Jumping in to the PlaylistItems list examples there is a JavaScript solution to retrieve a channel’s upload list. As Google Apps Script uses JavaScript syntax this provides a useful starting point to structure our project. Something to bear in mind is that the JavaScript YouTube client and Google Apps Script are different: in JavaScript you prepare a request which is then executed, while in Apps Script you can prepare a variable if you like but can jump straight to the execute bit. Below are two examples of the same code in JavaScript and Apps Script.

JavaScript

function requestUserUploadsPlaylistId() {
  // See https://developers.google.com/youtube/v3/docs/channels/list
  var request = gapi.client.youtube.channels.list({
    mine: true,
    part: 'contentDetails'
  });
  request.execute(function(response) {
    playlistId = response.result.items[0].contentDetails.relatedPlaylists.uploads;
    requestVideoPlaylist(playlistId);
  });
}

Google Apps Script

function requestUserUploadsPlaylistId() {
  // See https://developers.google.com/youtube/v3/docs/channels/list
  var response = YouTube.Channels.list('contentDetails', {mine:true});
  playlistId = response.items[0].contentDetails.relatedPlaylists.uploads;
  requestVideoPlaylist(playlistId);
}
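Both snippets hand off to requestVideoPlaylist, which isn’t shown. As a minimal sketch of what it could look like in Apps Script (my illustration, not the template’s actual code), paging through the uploads playlist with pageToken:

function requestVideoPlaylist(playlistId) {
  var videos = [];
  var nextPageToken = '';
  while (nextPageToken !== undefined) {
    // maxResults caps at 50, so keep paging with nextPageToken until it runs out
    var response = YouTube.PlaylistItems.list('snippet', {
      playlistId: playlistId,
      maxResults: 50,
      pageToken: nextPageToken
    });
    response.items.forEach(function(item) {
      videos.push([item.snippet.publishedAt, item.snippet.title]);
    });
    nextPageToken = response.nextPageToken; // undefined on the last page
  }
  return videos;
}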

The eagle-eyed among you may have spotted that on the Google Apps Script site there is a Retrieve YouTube Uploads example which might be a better starting point. I have to admit I missed this at the time, but I still believe that for advanced services getting to know the service API docs will be better for you in the long run.

So how do we go from var request = gapi.client.youtube.channels.list() to var response = YouTube.Channels.list()? This is where autocomplete makes your task so much easier. In the Apps Script editor, typing ‘YouTube’ followed by a period ‘.’ brings up the next available options in the call (tip: pressing Ctrl+Space will list all the available services). Here’s an example for the Analytics service:

autocomplete

If parameters are required these will be indicated, as well as what is returned. Using this with the list example we can see it’s expecting a string part and an optionalArgs object.

autocomplete YouTube

This is when the YouTube Data API reference comes in handy as it lists all the required and optional parameters and the values it expects. Another nice feature of this documentation is the option to try a call to the API from the page. This is useful to test values and see the shape of the data returned.

Everyone of course has their own way of working, but hopefully you found this useful, so go forth and make your own YouTube data mashups.

Enjoy!

Last year I wrote about how you can use Google Apps Script to integrate with import.io. If you are not familiar with import.io, the service lets you:

transform any website into a table of data or an API in minutes without even writing any code

As part of my work at ALT we recently needed to extract data from our hosted Open Conference Systems (OCS). OCS has some data export options but none that fitted our exact need. As a hosted solution we don’t have access to the backend, so I turned to import.io to liberate our own data <sigh>. OCS uses basic authentication, but the great thing about import.io is you can train it to enter your username and password and extract the data from the pages you need. Getting data behind an authentication layer with the import.io API is a two-step process:

Make sure you check out the docs before integrating authenticated sources!
Every time you pass in credentials you will be logged in; pass in credentials once or via a login call and subsequently pass through cookies.

It took me a while to get my head around the process because the two links in the support message just take you to the generic API docs (this is a better url for the queryLogin methods). It’s clear that import.io have put a lot of work into the developer experience, but unfortunately I struggled testing the queryLogin method: using a valid id and model schema for the input just gave an ‘UnexpectedErrorException’. So I then turned to import.io’s own dataset tools. This was another dead end as I was struggling to get it to recognise my OCS login. Peeking under the hood I discovered:

Looking for another ‘in’, a quick search came up with this post on Using import.io authenticated data sources with PHP and Go. Given I do a lot of coding in PHP, translating to JavaScript/Google Apps Script is relatively straightforward. I was still struggling, however, with the ‘shape’ of the login payload and the $connectorDomain. The breakthrough came remembering that import.io looked like they were dogfooding their own API in their dataset tool.

Luke use the log

With this I could see what the $connectorDomain should have been and could now happily go off and liberate our data. Here’s my translation of the PHP example in Google Apps Script, also available as a gist:

function getResults() {
  var connector = {'username':'YOUR_SITE_USERNAME',
                   'password':'YOUR_SITE_PASSWORD',
                   'connectorDomain':'YOUR_CONNECTOR_DOMAIN',
                   'userGuid':'YOUR_USER_GUID',
                   'connectorGuid':'YOUR_CONNECTOR_GUID',
                   'apiKey':'YOUR_API_KEY'}
  
  var creds = {};
  creds[connector.connectorDomain] = {
    "username": connector.username,
    "password": connector.password
  };
  
  var additionalInput = {};
  additionalInput[connector.connectorGuid] = {'domainCredentials':creds};              
  // login-only call to get the session cookies (the PHP original passes true here)
  var login = query(connector.connectorGuid, false, connector.userGuid, connector.apiKey, additionalInput, true);
  additionalInput[connector.connectorGuid].cookies = login.cookies;  
  var result = query(connector.connectorGuid, {"webpage/url":"http://ocs.sfu.ca/alt/index.php/conferences/altc2015/director/submissionReview/799/1"}, connector.userGuid, connector.apiKey, additionalInput, false);
  // do something with results like write to Google Sheet https://developers.google.com/apps-script/guides/sheets#writing_data
}
 
// http://blog.import.io/post/using-importio-authenticated-data-sources-with-php-and-go
function query(connectorGuid, input, userGuid, apiKey, additionalInput, login) {
 
  var url = "https://api.import.io/store/connector/" + connectorGuid + "/_query?_user=" + userGuid + "&_apikey=" + apiKey;
 
  var data = {};
  if (input) {
    data["input"] = input;
  }
  if (additionalInput) {
    data["additionalInput"] = additionalInput;
  }
  if (login) {
    data["loginOnly"] = true;
  }
 
  var ch = UrlFetchApp.fetch(url, {'method':'POST', 'payload': JSON.stringify(data)});
  var result = ch.getContentText();
  return JSON.parse(result);
}

At the Association for Learning Technology (ALT) one of the core tools we use for membership management is CiviCRM. CiviCRM has a number of ‘out-of-the-box’ reports you can run to summarise and analyse memberships and contributions. Given the flexibility of Civi you can also, with a bit of know-how, create custom extensions and reporting options enabling very sophisticated analytics. At ALT we are keen to make more use of the data we have on memberships but at the same time have limited resources and time to implement these. In this post I’ll highlight how, using Google Sheets, we’ve developed a framework that allows us to rapidly develop custom reporting.

Problem

If you are already a CiviCRM user you are probably aware of CiviReports and the ability to create basic custom reports which allow you to view and download data. As part of this you can also schedule reports to land in your inbox. This is great but has its limitations. In particular, additional setup is required if you don’t want to just report on a daily basis; you end up with tables of data with no graphical summaries; and combining current and historic data isn’t possible.

Solution

Scheduling reports at custom intervals

CiviCRM provides an interface to schedule a mail_report job. The issue many people discover is this will send reports at set intervals, usually hourly or daily. You can schedule individual jobs to run at specific periods but you quickly find yourself in the world of command lines and cron jobs, scheduled tasks run on the server. If you have dedicated admin support this is a fairly standard task and the instructions are easy to follow. At ALT we have the option to open a support ticket with our host but this seemed like a waste of time and money.

Our solution is to use a Google Sheet… well, a Google Sheet with a bit of ‘juice’. The sheet is shared with our team and anyone can add a CiviReport id to Column A and choose how often it runs in Column B using a data validation list option.

deciding what reports run when

But how does this trigger our Civi install to run the job? The juice is Google Apps Script, a cloud-based scripting language native to Google Drive. Apps Script is a free service from Google and fortunately for us has the ability to run scripts at configured time intervals. It also has the ability to call specific urls using the built-in UrlFetchApp service (similar to cURL). I’ll give you a link to this Sheet so you can set up your own later; when you do you’ll see the entire process is managed with a couple of lines of code, included below:

function doTasks() {
  // COL maps the sheet's column names to indexes (defined elsewhere in the project)
  var doc = SpreadsheetApp.getActiveSpreadsheet(); // get spreadsheet
  var sheet = doc.getSheetByName("Tasks"); // get sheet
  var data = sheet.getRange(3, 1, sheet.getLastRow(), COL.total).getValues(); // get values
  var now = new Date(); // time now
  // iterate across each row of the sheet
  for (var i = 0; i < data.length; i++){
    if (data[i][COL.report_id] != ""){ // if there is a report id do something
      // collect row values
      var report_id = data[i][COL.report_id];
      var type = data[i][COL.type];
      var next_run = data[i][COL.next_run] || 0;
      // check if it's time to run the report again
      if (next_run < now && type != "never"){
        // if it is ping the report trigger
        var new_next_run = callUrl(report_id, type, {format: data[i][COL.format], ss_id: data[i][COL.ss_id], ss_sht: data[i][COL.ss_sht]} );
        // ..and record when to run again
        sheet.getRange(parseInt(i)+3, 3, 1, 2).setValues([[now, new_next_run]]);
      }
    }
  }
}

What this does is read the sheet data and then iterate across each row. If a report is overdue to run again it calls another custom function, callUrl, which triggers the CiviReport and returns when next to run.
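The real callUrl ships with the template, but as a rough sketch of its shape (CIVI_JOB_URL and the interval mapping below are my assumptions for illustration, not the template’s actual code) it pings the report trigger and works out the next run date:

function callUrl(report_id, type, options) {
  // CIVI_JOB_URL is an assumed constant pointing at your CiviCRM mail_report endpoint;
  // the real version also passes the options (format, ss_id, ss_sht) along
  UrlFetchApp.fetch(CIVI_JOB_URL + '&instanceId=' + report_id);
  // days to wait before the next run for each schedule type
  var intervals = {daily: 1, weekly: 7, monthly: 30};
  var next_run = new Date();
  next_run.setDate(next_run.getDate() + (intervals[type] || 1));
  return next_run;
}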

Creating graphical summaries and combining data

By this point you may be sensing that I’m partial to solving problems with Google Sheets. With Sheets it’s fairly straightforward to manually export different reports from Civi and analyse them using formulas and charts. The manual export of CiviReports can get tiresome though, so how can we automate this? Again we return to Google Apps Script. One of the options in CiviReports is to attach the data to the emailed report as a .csv file. From the previous example we know it is possible to read and write data to a Google Sheet. So if we can get the .csv file from our emailed report we can write it to the Sheet … right?

This is actually more straightforward than you may think, as another feature of Google Apps Script is interacting with the script owner’s Gmail. As part of this we can search for messages and get associated attachments. Using this we can read the latest report from Civi, write the data to a sheet and, with a bit of clever formula building, automatically get the latest summary or custom chart. As Apps Script runs in a pre-authenticated environment (no OAuth handshakes here) the code is relatively straightforward:

function processInbox(){
  var PS = PropertiesService.getScriptProperties();
  var data = PS.getProperties();
  for (var key in data) {
    if (key.substring(0, 10) == "search_str"){
      var param_raw = data[key];
      var param = JSON.parse(param_raw);
      // get the last 20 message threads matching the search term
      var threads = GmailApp.search(param.search_str, 0, 20); 
      // assume the last thread has our latest data
      var last_thread = threads.length-1;
      if (last_thread > -1){
        // get message in the last thread        
        var msg =  threads[last_thread].getMessages()[0];
        // get the attachments
        var attachments = msg.getAttachments();
        for (var k = 0; k < attachments.length; k++) {
          // get the attachment as a string
          var csv_str = attachments[k].getDataAsString();
          // parse string as csv
          var csv = Utilities.parseCsv(csv_str);
          // create destination object
          var doc = SpreadsheetApp.openById(param.ss_id);
          var sheet = doc.getSheetByName(param.ss_sht);
          // clear any old data
          sheet.clear();
          // write new data
          sheet.getRange(1, 1,  csv.length, csv[0].length).setValues(csv);
          // mark the message as read and archive (you could also label or delete)
          threads[last_thread].moveToArchive().markRead();
          PS.deleteProperty(key);
        }
      }
    }
  }
}

NB

Data protection

There are a couple of things worth noting here. Google Sheets are a fantastic collaborative environment and with this solution we can still share spreadsheets with selected people in our organisation and beyond. Something to remember though is this script runs as the sheet owner, so when configuring the CiviReport it needs to go to the email address of the owner. At ALT we benefit from being a Google for Education user so our email and Drive access comes as part of the Google Apps suite. This solution could also be set up to run on a regular Google account, but there are data protection issues to consider in sending reports to a non-organisation account. As such you might only want to re-use this solution as an easy way to schedule reports rather than to schedule and process the data.

ARRAYFORMULA, FILTER, QUERY functions are your friends

As our solution dumps .csv data into a sheet, clearing any previous data, any formulas you have in that sheet will also be lost. We get around this by doing all data manipulation in a separate sheet which references the imported data. To save copying and pasting lots of formulas we make extensive use of the ARRAYFORMULA, FILTER and QUERY functions available in Google Sheets.

Comparing CiviEvent registrations

One scenario we have is monitoring the number of registrations to an annual conference. It’s easy for us to export registrations for previous years as static data into a spreadsheet. For the analysis we want to group the number of registrations by week number. To get a week number for the live data we create a second sheet that references it. Using ARRAYFORMULA you can enter the reference once and it is applied to all the values in the range. For example, we get all the registration dates in column A using =ArrayFormula('2015LiveData'!B:B) in cell A1 and then extract the week numbers in column C using the formula =ARRAYFORMULA(IF(ISBLANK(A2:A),"",WEEKNUM(A2:A))) in cell C2.

ArrayFormula is your friend

Setting up your own copy of Schedule CiviCRM Reports

If this is a project you’d like to use and/or build upon you can copy a blank version of our template below:

Schedule CiviCRM Reports
[While signed in File > Make a copy]

Once you have a copy, open Tools > Script editor to see all of the project code and instructions for setting up.

I’ve hopefully given you enough to get set up, but feel free to leave a comment if you get stuck or have any questions.

Enjoy!

Example of pdf version from altc 2014

At ALT we use Google Sheets as an easy way to share and collaborate on draft event timetables. Recent examples are the ALT Annual Conference 2014 and the OER15 timetables. One of the reasons for publishing draft timetables using Google Sheets is we can get a static url for people to download them as PDFs while the contents can be dynamically updated (see recent post on doing this). The template we use for conferences is continually evolving, which isn’t an issue as it’s easy to copy the last version. One headache is that our theme colour usually changes. This can be a fiddly change as we use empty cells to create a thicker grid:

Thicker borders

Faced with another cell background switch it made sense to do this with code rather than clicks; thanks to Google Apps Script it’s possible in 19 lines of code and a couple of minutes:

function colorReplace() {
  var doc = SpreadsheetApp.getActiveSheet();
  // get all the existing active sheet background colours
  var cells = doc.getRange(1, 1, doc.getLastRow(), doc.getLastColumn()).getBackgrounds();
  var rows = cells.length;
  var cols = cells[0].length;
  // iterate across all cells
  for (var i = 0; i < rows; i++){
    for (var j = 0; j < cols; j++){
      if (cells[i][j] == '#feeff8'){ // first color to change
        cells[i][j] = '#f3f3f3'; // first color change to
      } else if (cells[i][j] == '#bf0875'){ // second color to change
        cells[i][j] = '#079948'; // second color change to
      } 
    }
  }
  // update background colours
  doc.getRange(1, 1, doc.getLastRow(), doc.getLastColumn()).setBackgrounds(cells);
}

Tip

To get the existing cell background colours I used the debugger, setting a breakpoint before the loop to see the existing HEX codes:

debugger to inspect cell colours

When I fly it generally becomes a session of staring into space and letting my mind wander. It’s not that I haven’t got a long list of Google Apps Script ideas I want to explore, but as a ‘cloud’ based scripting environment a data connection is required (insert pun about being in the clouds/above the clouds). For my latest trip I decided to set myself a challenge: create a watch face for Android Wear using only my mobile. This is a lot easier than it sounds thanks to the WatchMaker Watch Face app for Android. Here’s how I got on (or you can just download my watch face).

My watch face on the right displaying the time 9:42 and on the left TokyoFlash Twelve 5-9 B it was inspired by

WatchMaker is one of a number of apps that allow WYSIWYG (what you see is what you get) editing. Making my own watch face was something I started thinking about at the weekend and I’d already scoped out WatchMaker, downloaded some other watch faces and had a peek under the hood. As part of this I discovered WatchMaker integrates the Lua programming language, making it possible to do some amazing stuff.

Some of the watch faces I've been playing with

Another great feature of WatchMaker is that reuse is baked in. Every watch face you import can be duplicated and customised. This liberal approach makes it easy to see how other people are using WatchMaker features.

As part of my weekend tinkering I’d already taken a picture of my well-loved TokyoFlash Twelve 5-9 B watch, shown above. The 1259 B uses LEDs to light up a binary display of the time (see video here): hours are shown on the left and minutes equal the sum of the top (ones) and bottom (tens) rows.

I wasn’t planning on making a watch when I sat down on the plane, but just before the aircraft doors closed I grabbed an image of a 1259 B with LEDs on. As I wanted to use a 1259 B LED in my watch the first challenge was cropping the image. I’ve got limited image editing apps on my phone but discovered that by successively using the Snapseed cropping tool I could get down to a 16x16px image.


Adding an image to WatchMaker lets you position the image and control its appearance using a number of built-in WatchMaker variables. For example, {dmo} returns the ones digit of the current minute. So to progress a minute LED one step from left to right I set the x-position using –90+{dmo}*18. Rotating the hour LED around the bezel was a little more tricky; during the flight my basic trig maths failed me, but the cos/sin rule quickly came back to me once the aircraft doors opened.
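For anyone else whose trig is rusty, the rule in question: with the origin at the watch centre and y increasing downwards, a point at rotation \theta from 12 o'clock on a circle of radius r sits at

x = r\sin(\theta) \quad \text{and} \quad y = -r\cos(\theta)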

There’s a lot more you can do in WatchMaker with sensor and external data and even animation effects, and whilst my first foray in this area is basic, it’s mine and my maker journey continues ;)

At ALT we are a Google Apps for Education user and make extensive use of Google Drive for creating and publishing documents. Some of our documents, such as member lists or timetables, get regularly updated, and updating links on our websites can be a chore. One of the nice features of documents in Google Drive is you have a couple of ways of publishing them so anyone can view (no log in required). For Google Sheets the main options are getting a shareable link:

share with others  ... because sharing is good ;)

Or using the ‘publish to the web’ option:

publish to web new school

In the ‘old’ version of Google Sheets the ‘Publish to the web’ option included lots of file types, not just a web page:

publish-to-web old skool

You can still get new Google Sheets to generate download links for other formats … it’s just a bit more complicated. The complicated bit is working out which url tweaks you need. For example, with your magic goggles on you might start spotting a pattern in this url, which I’ve line-breaked to make it easier to see:

https://docs.google.com/spreadsheets/d/1muAOv_chNyCqtx26chZz2RLZlP-wySYsJSFt8UuEO0I/export?
format=pdf&
attachment=FALSE&
size=A4&
fzr=FALSE&
landscape=FALSE&
fitw=TRUE&
gridlines=FALSE&
printtitle=TRUE&
sheetnames=FALSE&
pagenum=FALSE
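Rather than hand-assembling the switches each time, a small Apps Script sketch (my own illustration, using only the parameters shown in the url above) can build the link from a sheet ID:

function pdfLink(sheetId) {
  // export switches as they appear in the url above
  var options = {
    format: 'pdf', attachment: 'FALSE', size: 'A4', fzr: 'FALSE',
    landscape: 'FALSE', fitw: 'TRUE', gridlines: 'FALSE',
    printtitle: 'TRUE', sheetnames: 'FALSE', pagenum: 'FALSE'
  };
  var params = [];
  for (var key in options) {
    params.push(key + '=' + options[key]);
  }
  return 'https://docs.google.com/spreadsheets/d/' + sheetId + '/export?' + params.join('&');
}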

Remembering all these switches isn’t easy, so for our team I’ve shared a simple template which lets us drop in a shared Google Sheet link, choose options for layout and get a url. Here’s a copy in case you find it useful:

PDF Link for Google Sheet Template