I was recently invited to London to talk at the GDG DevFest London. The idea for my talk was:

With a pre-authenticated cloud-based ecosystem Google Apps Script makes it possible to integrate with other Google services with a couple of lines of code. This turns Google Drive into a rich playground for custom reporting using Google Sheets as a data interface. In this session there will be a quick Google Apps Script 101 highlighting some of its main features and affordances. This will be followed by a demonstration of how Google Apps Script makes it easy to combine Google Analytics with other data sources such as Twitter and do many more playful things.

As part of this I highlighted 3 features of Apps Script I find particularly powerful. These were:

  1. Custom UI – creating dialogs and interfaces integrated into Google Docs, Sheets and Forms.
  2. Triggers (in particular timed and POST/GET) – there are a number of different triggers you can use in Google Apps Script but the ones I lean on the most are timed triggers and opening scripts up to POST/GET requests. Timed triggers let you run scripts at intervals even when you are not at the keyboard. The particular power of publishing a script to handle POST/GET requests is that the script can run as you or as the person accessing the script. Built-in features also let you restrict access so that the script can be run only by you, by anyone on your domain, by anyone, or by anyone even anonymously.
  3. Integration – there is some overlap with the previous two but given the potential it is worthy of note. On other platforms it’s sometimes necessary for developers to do a lot to set up the platform and prepare it for other people to use. Apps Script comes already integrated with a number of Google products which means in a couple of lines of code you can access other services. As a demonstration you can see below that with Apps Script, in one line of code, you can get a person’s unread email count … plus a bunch of other Gmail data.

    GDEGmailMid
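As a minimal sketch of that one-liner (the wrapper function and the extra logging lines are just for illustration), the built-in GmailApp service gives you:

function logGmailSnapshot() {
  // One line to get the effective user's unread inbox count
  Logger.log(GmailApp.getInboxUnreadCount());
  // A couple more built-in calls for other Gmail data
  Logger.log(GmailApp.getStarredUnreadCount());
  Logger.log(GmailApp.getDraftMessages().length);
}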

For what it’s worth, slides are embedded below:

You’ll find lots of blog posts explaining how to do this the other way, converting a table from Wikipedia into a spreadsheet. I even cover this myself in Feeding Google Spreadsheets: Exercises in using importHTML. This post looks at taking a .csv or a spreadsheet and formatting it as a Wikipedia table.

I’ve been on a bit of a Wikipedia journey expertly guided by a friend, Simon Knight (@sjgknight) … who happens to be Wikimedia UK Vice Chair. The journey has included Wiki ShootMe which has been covered by Alan Levine (@cogdog) and a wonderful Teaching Wikipedia Editing session you can still watch from Connected Courses.

One of the great things about Wikipedia is there is no shortage of information on how to do stuff. Before jumping in I had a quick look for some existing tools and came across this article on Importing (converting) content to Wikipedia (MediaWiki) format. Listed is the Spreadsheet-to-MediaWiki-table-Converter. This is code you would more normally run on a server, possibly used as a library within another project. Wanting something simpler I looked for and found guidance on creating a sortable table. The guidance helpfully includes an example copied below:

{| class="wikitable sortable"
|-
! name
! data
! more data
|-
| cats
| 273
| 53
|-
| dogs
| 65
| 8,492
|-
| mice
| 1,649
| 548
|}

And here is what it looks like:

name data more data
cats 273 53
dogs 65 8,492
mice 1,649 548

Hopefully you can quickly see the pattern: you start with the column headings, one per line of wikicode, and then each of the rows, again one line per cell value. This type of pattern is easy to code … if you can code … if you want to code. As I was composing my first line I remembered an old project, Templated Export for Google Spreadsheets:
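To illustrate how little code is involved, here is a quick Apps Script sketch, separate from Templated Export, that reads the active sheet and logs it as wikitable markup (it assumes the headings are in row 1 and the function name is arbitrary):

function sheetToWikitable() {
  // Assumes the active sheet has headings in row 1 and data rows below
  var values = SpreadsheetApp.getActiveSheet().getDataRange().getValues();
  var out = ['{| class="wikitable sortable"', '|-'];
  // Column headings, one per line prefixed with '!'
  values[0].forEach(function(h) { out.push('! ' + h); });
  // Each data row starts with '|-' followed by one '|' line per cell
  for (var r = 1; r < values.length; r++) {
    out.push('|-');
    values[r].forEach(function(c) { out.push('| ' + c); });
  }
  out.push('|}');
  Logger.log(out.join('\n'));
}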

Templated Export allows users to reshape and selectively publish data from Google Spreadsheets in custom formats. The tool is powered by Google Apps Script which allows integration with Google Spreadsheets using existing authenticated access. Example uses include: turning data from a spreadsheet into an RSS feed and publishing data from a spreadsheet in JSON format to be used by other tools/services. The framework for creating a custom wrapper for your data means there are many more opportunities.

Templated Export lets you use a basic cell reference system to create your own output pattern:

Templated Export

There is a bit of a learning curve to using this and the original post has details on setting it up and using it. The basic concept is that you use cell references as a way to build up your pattern. You can save your templates, which means if you update the data in your Google Sheet you can open the Templated Export interface and download it again or use a saved ‘published url’.

Templated Export hasn’t been touched for two years and is perhaps a candidate for some ‘modern’ Google Apps Script techniques such as publishing as an Add-on. If the thought of playing with Google Apps Script doesn’t take your fancy, Templated Export is based on a feature of OpenRefine called Templating Exporter. OpenRefine can be downloaded and run from your desktop and can handle a variety of data file formats.

So go forth and fill Wikipedia with lovely tables for everyone else to reference in ‘Wikipedia table export’ posts ;)


I previously posted about TwtrService: A Twitter API client library for Google Apps Script which makes it easy to interact with Twitter from Google Drive applications like Google Sheets. One of the nice things about TwtrService is that once you set up a connection to Twitter you can use it many times in different projects, basically allowing you to do stuff in one line of code. In that post I said I’d share some examples of the things I make, so here is the first one, EasyTweetSheet.

What it does

At the Association for Learning Technology we organise lots of events. We only have a small staff team so having someone sending out tweets during the event can be a problem. We could use a Twitter client like Tweetdeck or Hootsuite to schedule tweets during the day. One issue is that if something goes wrong, like the livestream not working or a session starting late, you can look a bit silly. The solution was to draft our tweets in a Google Sheet and have a link we click when we want the message to be sent. Below is a screenshot of the one we used at this year’s ALT Annual Conference:

EasyTweetSheet used at #altc 2014
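The sheet itself does the wiring with the ‘tweet’ links, but the core of sending a tweet with TwtrService boils down to a single call to Twitter’s statuses/update endpoint. A simplified sketch, rather than the actual EasyTweetSheet code, might look like:

function sendTweetFromCell() {
  // Illustration only: read the draft text from the active cell and post it
  var text = SpreadsheetApp.getActiveRange().getValue();
  var resp = TwtrService.post('statuses/update', {status: text});
  Logger.log(resp);
}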

How to get your own copy working

  1. Open this copy of the EasyTweetSheet template and File > Make a copy
  2. In your copy open Tools > Script editor and follow the instructions
  3. Start filling the ‘text’ column with what you want to tweet which should enable the ‘tweet’ link

IMPORTANT: If you’ve used my other tools like TAGS this template will use the Twitter account you used to set it up. To use a different Twitter account to send the tweets from, replace YOUR_CONSUMER_KEY and YOUR_CONSUMER_SECRET in lines 34-35 of the Script editor code with your Twitter application key/secret. When you Run > setup, switch back to the Sheet view and follow the instructions.

Enjoy!


As part of the latest release of TAGS (Twitter Archiving Google Sheet) I moved a lot of the code into a Google Apps Script Library. Libraries are a great way to bundle and release code, allowing you to write your own classes and methods. To see some of the functionality already being developed by the Apps Script community you should have a look at the Google Apps Script Samples site and the excellent work Bruce McPherson has done, which includes an EzyOauth2 library.

One of the things you can do with libraries is wrap one library into another. When rewriting TAGS it made sense to strip out a separate Twitter client library that I and others could use in different projects. Based on the work by Arun Nagarajan at Google, TwtrService provides access to Twitter's REST API. The work I’ve done is to add some UI and configuration methods to try to streamline the authentication flow. As part of this developers can offer authentication routes using their own Twitter application or an application registered by each user. This particular angle is a result of one of the design principles for TAGS, that every copy of the template should use a Twitter application owned by the user. The reason behind this is to distribute the risk. If Twitter were to suspend my data access because a TAGS user abused the API it would suspend access for all TAGS users. By requiring TAGS users to register their own application with Twitter the responsibility to abide by Twitter’s terms of service lies with them. So in TAGS the auth flow looks like this:

The result is hopefully a flexible library that developers can integrate into their own projects, either using their own Twitter application or by getting users to register their own.

Over the next couple of weeks I'll be sharing some example applications we've developed at ALT. In the meantime this post serves as a basic introduction to TwtrService and covers:

Overview of TwtrService

The TwtrService library for Google Apps Script centrally stores your Twitter access details, allowing them to be accessed from multiple script projects without the need for re-authentication. TwtrService is designed to allow you to directly use Twitter’s v1.1 REST API GET and POST methods. For example, to return Twitter search results for the search ‘Google Apps Script’ you would use:

var data = TwtrService.get('https://api.twitter.com/1.1/search/tweets.json', {q: 'Google Apps Script'});

The url string can also be abbreviated to:

var data = TwtrService.get('search/tweets', {q: 'Google Apps Script'});

Additional options can be passed in the parameters object. For example to return 100 results for the same search you would use:

var data = TwtrService.get('search/tweets', {q: 'Google Apps Script', count: 100});
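The object returned is the parsed JSON from Twitter, so for search/tweets the individual tweets sit in a statuses array. A small sketch of working with the results:

var data = TwtrService.get('search/tweets', {q: 'Google Apps Script', count: 100});
// Each item in data.statuses is a tweet object from the Twitter v1.1 API
data.statuses.forEach(function(tweet) {
  Logger.log(tweet.user.screen_name + ': ' + tweet.text);
});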

The project key for this library is MarIlVOhstkJA6QjPgCWAHIq9hSqx7jwh and the TwtrService methods are documented here.

To use the Twitter REST methods TwtrService first needs authenticated access. TwtrService has some built-in methods to do this, detailed below. Once a user has authenticated access the TwtrService library stores the details as User Properties. This means that once a user has authenticated with TwtrService, using the library in another container-bound or standalone Apps Script project immediately gives them access to Twitter API results using the get/post methods. In terms of security, User Properties are limited to the effective user of the current script.

Usage

Quick start: Personal access

If you would just like to use TwtrService for your Apps Script projects the easiest way to get started is to register a Twitter application and enter its details on this page (if you are interested here is the source code for the interface).

Note: If you are already a user of TAGS you’ll already be able to use TwtrService without the step above.

In the Apps Script project where you’d like to use the Twitter API, open Resources > Libraries in the Script Editor window and add the service using the project key MarIlVOhstkJA6QjPgCWAHIq9hSqx7jwh.

In your project you can now use the TwtrService.get() and TwtrService.post() methods. The documentation for get() is detailed below (post is the same but uses HTTP POST):

get(string url, Object parameters)

GET wrapper for request to the Twitter REST API. For full documentation of API method endpoints and parameters see https://dev.twitter.com/rest/public. For example to get last 100 tweets containing 'Google Apps Script': var data = TwtrService.get('search/tweets', {q: 'Google Apps Script', count: 100});

Arguments:
  • url (string) – Twitter REST API resource url. This can be either long form e.g. https://api.twitter.com/1.1/search/tweets.json or abbreviated e.g. search/tweets
  • parameters (Object) – additional API parameters as detailed in the Twitter REST API documentation e.g. for search results a search string and count is specified by {q: 'Google Apps Script', count: 100}.

Return Values:
  • Object – API response in JSON format.

Quick start: Personal access in Sheets and Documents

If you would like to replicate the TAGS authentication flow where users enter their Twitter application key/secret TwtrService comes with a number of UI methods. For TAGS the following code is used:

/**
* Launches key/secret and auth flow
*/
function setup() {
  if (TwtrService.isUserConnectedToTwitter()){
    var result = Browser.msgBox("Twitter Authorisation",
                    "You appear to already be connected to Twitter.\\n\\nWould you like to run the setup again?",
                    Browser.Buttons.YES_NO);
    // Process the user's response.
    if (result == 'yes') {
      // User clicked "Yes".
      TwtrService.showTwitterKeySecret(SpreadsheetApp);
    } 
  } else {
    TwtrService.showTwitterKeySecret(SpreadsheetApp);
  }
}

/**
* Used as part of setup() to process form data
*/
function processForm(formObject) {
  TwtrService.setUserKeySecret(formObject);
  TwtrService.showTwitterLogin(SpreadsheetApp);
}

Quick Start: Shared Twitter Key/Secret

The earlier examples have assumed the user registers their own Twitter application. For scenarios where you would like users to have authenticated access using a dedicated Twitter API key/secret, it is possible to initialize these values. Example application code can be found here, which is also deployed here.

Similar to the earlier examples, once a user authenticates access with your key/secret, as long as these values are also initialized in other script projects the user will have access to the Twitter API via TwtrService.
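In outline, the idea is to set the shared key/secret before calling any of the other TwtrService methods. The initialisation call in the sketch below is a placeholder rather than the documented TwtrService API, so check the linked example application for the exact method:

// Placeholder sketch only: TwtrService.init is a stand-in name, not the
// documented API – see the linked example application for the real call
var SHARED_KEY = 'YOUR_CONSUMER_KEY';
var SHARED_SECRET = 'YOUR_CONSUMER_SECRET';

function authoriseWithSharedApp() {
  TwtrService.init(SHARED_KEY, SHARED_SECRET); // placeholder initialisation
  // After initialising, show the usual login/authentication flow
  TwtrService.showTwitterLogin(SpreadsheetApp);
}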

Instructions for creating a Twitter Application to use with TwtrService

TwtrService requires you to have a registered Twitter application. If you haven’t already done this, here are the steps you need to get started:

  1. Register for an API key with Twitter at https://dev.twitter.com/apps/new (if you've already registered for a TAGS sheet you can reuse your existing API Key and Secret).
    • Name, description and website can be anything you like
    • Important Include the Callback URL https://script.google.com/macros/
  2. Read the 'Developer Rules of the Road' before clicking 'Create your Twitter application'

On the Twitter site your application should include a ‘Keys and Access Tokens’ tab which includes the Consumer Key (API Key) and Consumer Secret (API Secret) you’ll need.


Get TAGS

Amazing to think TAGS has been going for over 4 years now. Version 6 is a major code rewrite and the most important new feature is a better setup process. No more digging into the script editor to run functions, no more entering your Twitter API key and secret each time you create a new archive. Both these things are history thanks to some base code released by Arun Nagarajan at Google. Now you enter/register for a Twitter API key and secret once and each copy of TAGS you make will remember these. This is made possible by incorporating this functionality in a custom Apps Script library I’ve called TwtrService. TwtrService makes it easy to make calls to all of Twitter’s API and I’ll be explaining how it works in another post.

Version 6 comes with some other new features. The one that was most requested was archiving favourited tweets. There was a quick hack version of TAGS that did this but it was limited to the last 200. Now when you set up a favourite archive you can get up to the last 3,000 favourited tweets. Another option with TAGS v6.0 is to use Google’s new version of Sheets, which gives TAGS more capacity and better performance. One issue, however, with new Sheets is that it isn’t very stable with the Google Visualisation API, which is used in TAGSExplorer.

With TAGS v6.0 I’ve also created a dedicated support site. So if you have any questions or need help head over to http://tags.hawksey.info where you can also get the latest version of TAGS.


Thanks to a recommendation by Josef Šlerka (@josefslerka) I got a chance to speak at WebExpo 14 (#webexpo) about some of my work using Google Sheets for Twitter data mining and analysis. I always get uneasy speaking at events for web professionals/developers about spreadsheets as … it is spreadsheets.

Dangers aside, particularly ‘spreadsheet addiction’, hopefully I was able to convey the possibilities of Google Apps Script-turbocharged Google Sheets. To do this I focused on some of the work I’ve been doing interacting with the Twitter search API. I’m sure many people are familiar with the Twitter Archiving Google Sheet (TAGS) but perhaps not aware of some of the experiments I’ve used it for. Some of these include:

(I was surprised myself to discover the whole project started in 2010)

After my talk it was great that some fellow educators came to speak to me, interested in using Google Apps Script for social data mining as part of their curriculum. Slides and abstract for my talk are at the end of this post; the link bundle is here and the #webexpo TAGS archive is here.

Thanks also to my fellow GDE Ivan Kutil for capturing a picture of me dwarfed by one of my own sheets.

IMG_20140913_161030


There is growing interest in the use of data to provide actionable insight. This interest goes beyond professional analysts and, just as fields such as mathematics and astronomy have benefited from the enthusiastic amateur, so does data science. Social networks are a rich playground of data and whilst many provide access to their data via APIs, access via this route can be daunting. You can of course turn to 'analytics as a service' sites which will take your credentials and provide you with some answers, but often this can be what they want to tell you and not what you want to hear. A solution is the spreadsheet. Spreadsheets provide an interface for data exploration for those with basic skills. With Google Sheets the opportunities increase exponentially, not just in terms of collaboration, but also with the power of Google Apps Script. Apps Script provides easy integration with other Google products and services, such as Google Analytics, as well as third party APIs like Twitter. In this presentation we show how Google Sheets can become a rich playground where data from different services can be collected and analysed.

We recently moved the ALT Online Newsletter, which is a self-hosted WordPress site, to https/SSL. We did this before Google announced it would use https as a ranking factor in its search results so hopefully it will also give a positive boost to our traffic. To do this we opened the WordPress dashboard and switched the WordPress and site URLs to https:

address and site url

This setting handles the urls generated by WordPress for menus and post links but doesn’t affect hardcoded links in posts or inbound links. To handle these we added a couple of lines to our .htaccess file:

# HTTPS redirect
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}/$1 [R=301,L]
</IfModule>

This detects any urls with http and redirects to the https equivalent. All seemed to be fine until Matt Lingard noticed that our Feedburner RSS feed wasn’t distributing posts anymore. The issue … Feedburner can’t handle feeds served from https generating a ‘Received HTTP error code 400 while fetching source feed’:

Received HTTP error code 400 while fetching source feed.

The solution is to detect FeedBurner and not give it https like so:

# HTTPS redirect
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_USER_AGENT} !FeedBurner    [NC]
RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC]
RewriteRule (.*) https://%{HTTP_HOST}/$1 [R=301,L]
</IfModule>

All straightforward enough but there is one last kicker. Feedburner caches your feed so when you go to Feedburner and Edit Feed Details and click ‘Save Feed Details’ it may still show a 400 error. You can either wait for the cached version to clear to see if your changes have worked (or, as I did, spend an hour trying to work out why it wasn’t fixed) or trick Feedburner into ignoring its cached version with some trash at the end of your url as a querystring e.g. http://newsletter.alt.ac.uk/feed/?source=feedburner



I was building an analytics dashboard today that collected data from various services including Google Analytics and YouTube. Apps Script makes this very easy as highlighted in my previous post. An issue I encountered when I tried to access our YouTube channel reports is that even though my account is attached as a manager I was getting a ‘Forbidden’ error. Turning to the Channel Reports documentation I discovered:

channel==CHANNEL_ID – Set CHANNEL_ID to the unique channel ID of the channel for which you are retrieving data. The user authorizing the request must be the owner of the channel.

As our YouTube channel is associated with our Google+ page you can’t log in to Google Drive with that account. I did notice however that when I added YouTube Analytics as an Advanced Apps Script service the authentication prompt gave an option of authenticating using our Google+ page.

auth window 

The issue then is that if you authenticate against the Google+ page you can’t get access to other services like Google Analytics. I thought of a couple of ways I might tackle this, such as writing a separate Apps Script project that just got the YouTube Analytics data and wrote it to the spreadsheet I was working on. I’m not entirely sure how the permissions would work out on that. Instead my solution was to expose the YouTubeAnalytics.Reports.query in a separate Apps Script published as a web app. Setting this to run as ‘anyone, even anonymously’ I could then use UrlFetchApp to get the data in my main script.

Here’s how I set it up. Below (or in this gist) the 'main' script is handling all the data read/write to sheet and a separate 'proxy' Apps Script project running the YouTube Analytics data collection.
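To give a flavour of the shape of the two parts, here is a cut-down sketch rather than the gist itself; the channel ID, metrics, date range and web app URL are placeholders:

// --- 'proxy' project: published as a web app, authorised as the channel owner ---
function doGet(e) {
  // Placeholder channel ID, dates and metrics – substitute your own values
  var results = YouTubeAnalytics.Reports.query('channel==CHANNEL_ID',
      '2014-01-01', '2014-06-30', 'views,estimatedMinutesWatched',
      {dimensions: 'day'});
  return ContentService.createTextOutput(JSON.stringify(results))
      .setMimeType(ContentService.MimeType.JSON);
}

// --- 'main' project: fetch the data over HTTP from the deployed proxy ---
function getChannelReport() {
  var url = 'https://script.google.com/macros/s/PROXY_DEPLOYMENT_ID/exec'; // placeholder
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  Logger.log(data);
}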

Note: This technique exposes our YouTube Channel report to the world (barring security by obscurity). The method we are exposing is read-only so we don’t have to worry about injection.

Feels a bit hacky but can you see a better way of doing this?

Update 22/07/2014: Matias Molinas had the great suggestion of writing the results to a sheet which would avoid exposing your data. Jarom McDonald has also suggested that using Google App Engine would give security and scalability à la superProxy.


There are regular reporting features built in to Google Analytics but what if you want to do customised reporting or other regular data collection and integration? One answer is Google Apps Script. Apps Script is a Google Drive feature that makes custom automation so much easier. The big advantage of Apps Script is you are already in a Google authenticated environment. So while anyone can use the existing Google Analytics APIs, with Apps Script you don’t need to worry about the authentication handshake to get your data. This means that you can access Google Analytics data in one line of code. There are a couple of tickboxes to get to that one line of code but in this tutorial I’ll walk you through the process.

Note: You can also get Google Analytics (and other analytics data) in a Google Sheet with zero lines of code using add-ons. The current limitation of these is that they cannot be automatically run on a scheduled basis.

Setting up our project

Google Apps Script lives in a couple of places but for this example we are going to use a Google Sheet so step one is:

  1. From your Google Drive account create a new Spreadsheet
  2. In the new spreadsheet open Tools > Script editor…
  3. If this is your first script you might see a window popup with options to ‘Create projects for’; if so click ‘Close’
    Close Apps Script project window
  4. You should now see the Script editor
    Script editor
  5. The connection to Google Analytics is a Google Apps Script ‘Advanced Service’ and to enable it we need to turn it on. To do this in the Script Editor select Resources > Advanced Google services
    image
  6. At this point you will be prompted to create a project name. Enter a name for your project and click ‘OK’.
  7. You should now see a list of available advanced services. For this project we want to just turn on Google Analytics API by clicking the on/off toggle. So far we have enabled Analytics in our project but we also need to click on the Google Developers Console link highlighted to enable this at the console end.
    Advanced services control
  8. Similarly to the ‘Advanced Google Services’ we need to enable the Analytics API by toggling the on/off switch (at this point you may be prompted to review and accept terms of service)
    image
  9. Once enabled in the Developer Console you can close this tab/window and click OK in the ‘Advanced Google Services’ box in your Script Editor.

To recap, our project is now configured to use the Google Analytics API. You can use multiple advanced services in the same project as needed. Remember ‘Advanced Services’ need to be enabled for each project you create, but this only needs to be done once per project.

Getting Google Analytics data (and some basic Script Editor tips)

For those familiar with coding environments the Script Editor comes with a code autocomplete. This is useful when starting out rather than having to remember a long list of classes and methods. To use this:

In the Script Editor click on line 2 between the moustaches (‘{‘ and ‘}’) and on PC press CTRL + SPACE or on Mac ⌘ + SPACE; from this you can select Analytics from the list. To continue the autocomplete, follow it with a dot ‘.’ which brings up the associated options. The line we want to get is Analytics.Data.Ga.get(ids, start-date, end-date, metrics, optionalArgs);

autocomplete

So you’ll see there are a couple of parameters we need to provide defined in the GA Core Reporting Query Parameters Summary:

  • ids - Unique table ID for retrieving Analytics data. Table ID is of the form ga:XXXX, where XXXX is the Analytics view (profile) ID.
  • startDate - Start date for fetching Analytics data. Requests can specify a start date formatted as YYYY-MM-DD, or as a relative date (e.g., today, yesterday, or 7daysAgo). The default value is 7daysAgo.
  • endDate - End date for fetching Analytics data. Requests should specify an end date formatted as YYYY-MM-DD, or as a relative date (e.g., today, yesterday, or 7daysAgo). The default value is yesterday.
  • metrics - A comma-separated list of Analytics metrics. E.g., 'ga:sessions,ga:pageviews'. At least one metric must be specified.
  • optionalArgs – an object containing the optional query parameters like dimensions, segment, filters, sort etc. An example from this Google Analytics Apps Script tutorial is:
var optionalArgs = {
 'dimensions': 'ga:keyword',              // Comma separated list of dimensions.
 'sort': '-ga:sessions,ga:keyword',       // Sort by sessions descending, then keyword.
 'segment': 'dynamic::ga:isMobile==Yes',  // Process only mobile traffic.
 'filters': 'ga:source==google',          // Display only google traffic.
 'start-index': '1',
 'max-results': '250'                     // Display the first 250 results.
};

Building a Google Analytics Query with Query Explorer

If you are unfamiliar with the Google Analytics Core Reporting API building a query can be quite daunting. Fortunately the Google Analytics team have made the Google Analytics Query Explorer which gives you an interactive interface to build and test queries. Here’s an example to query your top referring sites, which should give you a page like this:

Google Analytics Query Explorer 2

If you haven’t used the Query Explorer before you’ll need to click on the ‘Authorize Access’ button, which will enable a ‘Get Data’ button. You can test and tweak your query as much as you like using the ‘Get Data’ button to see what is returned from your Google Analytics accounts. The Query Explorer is a great starting point but remember it only contains a few of the optional arguments.

If you are using the Query Explorer then below is a helper script for converting your Query URI into an Apps Script snippet. To use this, build your query in the Query Explorer, press the Query URI button to get a URI, paste it in the textfield and then click ‘Submit’.

The little helper script gives us some extra code to store/pass the parameters we need. If you are scheduling this script to run on a regular basis you’ll need to modify the start/end date in the query. There are two main ways you can do this: either building the date by manipulating a Date instance formatted as YYYY-MM-DD; or as a relative date (e.g., today, yesterday, or NdaysAgo where N is a positive integer). Below is an example of a modified query made with the Query Explorer and exported to generate the script we need, which gets data from the last 7 days. In this example I’ve wrapped it in a new function named fetchMyQuery. Note: In your example you need your own ids value to return results.

function fetchMyQuery() {
  var query = {
    "optionalArgs": {
      "dimensions": "ga:source",
      "filters": "ga:medium==referral",
      "sort": "-ga:pageviews",
      "max-results": "50"
    },
    "ids": "ga:82426939",
    "metrics": "ga:pageviews,ga:sessionDuration,ga:exits",
    "start-date": "7daysAgo",
    "end-date": "yesterday"
  };
  var results = Analytics.Data.Ga.get(query.ids, query['start-date'], query['end-date'], query.metrics, query.optionalArgs);
  Logger.log(results);  
}
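If you would rather build the dates yourself than use the relative values, a small helper along these lines (a sketch, the function name is arbitrary) does the formatting:

function daysAgoAsString(days) {
  // Format a date N days ago as YYYY-MM-DD, e.g. daysAgoAsString(7)
  var d = new Date();
  d.setDate(d.getDate() - days);
  return Utilities.formatDate(d, Session.getScriptTimeZone(), 'yyyy-MM-dd');
}

You could then pass daysAgoAsString(7) and daysAgoAsString(1) as the start-date and end-date values in the query.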

Testing/Debugging Apps Script

With your script saved we can test the code. When developing scripts there is a debug feature demonstrated in the video below (the first time you run a new script you need to authorise it; this only needs to be done the first time the script runs or when new permissions are required).

Insert Data Into A Spreadsheet

The final step is to output the results from our query into Google Sheets. For this example let’s reuse a modified version of the outputToSpreadsheet method from the Automated Access to Google Analytics Data in Google Sheets tutorial:

function outputToSpreadsheet(results, sheet) {
  // Print the headers.
  var headerNames = [];
  for (var i = 0, header; header = results.getColumnHeaders()[i]; ++i) {
    headerNames.push(header.getName());
  }
  sheet.getRange(1, 1, 1, headerNames.length)
      .setValues([headerNames]);

  // Print the rows of data.
  sheet.getRange(2, 1, results.getRows().length, headerNames.length)
      .setValues(results.getRows());
}

This function inserts all the header and reporting data into the sheet. For more information on how to insert data into Google Sheets with Apps Script there is a Storing Data in Spreadsheets tutorial.

The outputToSpreadsheet method in our example is expecting two objects to be passed in. To do this your fetchMyQuery method needs to include getting a Sheet object and passing it to outputToSpreadsheet. The entire project should look like this:
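(A sketch of the combined script, reusing the earlier query values and writing to the active sheet; your own ids value and target sheet may differ.)

function fetchMyQuery() {
  var query = {
    "optionalArgs": {
      "dimensions": "ga:source",
      "filters": "ga:medium==referral",
      "sort": "-ga:pageviews",
      "max-results": "50"
    },
    "ids": "ga:82426939",
    "metrics": "ga:pageviews,ga:sessionDuration,ga:exits",
    "start-date": "7daysAgo",
    "end-date": "yesterday"
  };
  var results = Analytics.Data.Ga.get(query.ids, query['start-date'], query['end-date'], query.metrics, query.optionalArgs);
  // Pass the results and a Sheet object to outputToSpreadsheet (defined above)
  var sheet = SpreadsheetApp.getActiveSheet();
  outputToSpreadsheet(results, sheet);
}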

When you now Run > fetchMyQuery you should see the data written in the sheet you specified.

Automate the Script

This project has been about automated data collection so let’s look at how we set this up. Google Apps Script makes automation very easy using the triggers feature. To set this up:

  1. In the Script Editor click Resources > Current project’s triggers
  2. Click ‘No triggers set up. Click here to add one now’
  3. Let’s configure fetchMyQuery to run once a week by setting:
    • The Run dropdown to: fetchMyQuery
    • The Events dropdown to: Time-driven, and selecting Week timer to run Every Monday between 7:00 a.m. to 8:00 a.m.

Triggers

Once saved this script will run as scheduled with no need for you to have the sheet open. If you would like to be told if the script fails whilst unattended, click the notifications link which opens a new dialogue box allowing you to configure which email you want errors sent to and when.
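As an aside, the same trigger can be created in code using the ScriptApp service rather than the dialog; a minimal sketch:

function createWeeklyTrigger() {
  // Run fetchMyQuery every Monday between 7am and 8am (script time zone)
  ScriptApp.newTrigger('fetchMyQuery')
      .timeBased()
      .onWeekDay(ScriptApp.WeekDay.MONDAY)
      .atHour(7)
      .create();
}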

Summary

This post has introduced you to using Google Apps Script to automatically collect and write Google Analytics data to a Google Sheet. Once you get started with Apps Script you’ll start discovering many more opportunities to automate tasks. For example, as part of our triggered event we could notify people of an update using the MailApp service or write an entire document using the Document Service. For some ideas you might want to read about Analytics reporting with Google Apps Script at the UK Cabinet Office. If you are just starting to use Google Apps Script the Google Developers site has extensive documentation and tutorials, and if you get stuck there is a dedicated tag on Stack Overflow and an active community on Google+.


At 4pm today (16th July, 2014) I’ll be giving a talk to the Institutional Web Managers Workshop (IWMW14) in Newcastle. The sessions aren’t being streamed but I’ll see if I can stream to my YouTube channel. The main idea I want to convey is that in a world which is benefiting from being digitally distributed, networked and increasingly crowd-driven, the IWMW audience is in a prime position to support their institutions in creating opportunities for learning aligned to this. In particular, I want to highlight the work George Siemens is proposing around the Personal Knowledge Graph (PKG):

The big shift that needs to be made in education is to shift from knowing content to knowing learners.

What is needed in education is something like a Personal Learner Knowledge Graph (PLKG): a clear profile of what a learner knows – Siemens, G (2014)

There are a number of challenges with this idea, including technical and ethical ones, but having played with ideas that could potentially highlight aspects of a person’s latent knowledge capacity, the concept excites me. As part of my presentation I’ll be highlighting the ways the latest iteration of ALT’s Open Course in Technology Enhanced Learning (ocTEL) aligns to a Personal Knowledge Graph. One of the strongest features of ocTEL, which we completely fluked, was how Open Badges can be used to support the construction of part of a Personal Knowledge Graph.

Open Badges in ocTEL

As part of ocTEL the course was broken into weekly topics. Each week there were five types of badges available for that topic (this was based on the BlendKit Course).

  • Check-in: awarded for reading the week’s course material
  • Webinar: awarded for attending or watching the webinar
  • TEL One: awarded for completing each week’s ‘If you only do one thing’ activity
  • TEL Explorer: awarded for completing each week’s ‘TEL Explorer’ activities (there could be more than one)
  • Topic Badge: awarded for completing at least 3 of the above

All the badges with the exception of TEL One and TEL Explorer were automatically awarded by a click of a button, entering an activity code or the system detecting a configured set of requirements. TEL One and TEL Explorer were awarded when the participant provided a url evidencing their activity, which was manually reviewed. Other badges were available outside the weekly set for community related activity. These included badges automatically awarded for adding details to a profile and for making posts on other sites recorded in ocTEL as part of our data aggregation using FeedWordPress. There were also other badges tutors could individually award people to recognise contributions to the course.

Open Badges = new nodes and edges

Besides badging appearing to be a positive influence on course engagement, there are several features of the badging system used in ocTEL, the BadgeOS Plugin for WordPress, and our setup of it that you could argue supported the construction and use of Personal Knowledge Graphs:

Situational awareness

As part of the badging system there was an option to display who else had earned the badge. With this there is an opportunity to be aware of who else is active (an example badge page). Given this was available on all badges, activity became categorised by level (basic check-in to advanced activity) and topic. In the current configuration we have not made connections between these but it would prove an interesting area for further research. Another area for improvement is that the profile pictures link only to the person’s general profile page. Whilst this has uses, outlined below, in the case of badges with evidence attached to them this link isn’t exposed.

Who's been awarded

A profile of what knowledge has been gained by an individual is arguably a key aspect of a Personal Knowledge Graph. The badging plugin used, combined with the BuddyPress social network plugin, integrates achievements into a person’s profile (an example profile here). This provides another entry point for people to make connections. Whilst BuddyPress has the facility for friend/follower relationships we were keen for participants to identify the personal 3rd party spaces they exist in.

Another aspect of BuddyPress/BadgeOS we only briefly experimented with was the inclusion of badge awards in the person’s activity stream. The main mechanism this could have supported is that, with a friending tie, this activity could have been pushed to followers via email.

ocTEL achievement profile

Self-declared activity

One challenge of open or distributed activity is collecting that activity. ocTEL uses the FeedWordPress model to aggregate activity from participant blogs and other networks. Whilst the hashtag provides some means of collecting information from self-organising spaces there will always be issues with data collection. With the TEL One and TEL Explorer badges requiring the person to submit a url as evidence that they have completed the activity this issue is partially overcome, the individual becoming the curator of information in their knowledge graph.

Interoperability

The final aspect of badging in ocTEL worth noting is our effort to move from the ‘digital badging’ natively used in the BadgeOS plugin to directly issuing Open Badges. This small difference potentially has a big impact. Most notably it means the ocTEL site exposes information about achievements in the interoperable Open Badges specification. The benefit for the user is they can display badges in other systems; on a personal knowledge graph level it means these achievements are machine readable.

Summary

This hastily written post hopefully gives you a flavour of how the use of Open Badges, or digital badging in general, could support the construction of part of a Personal Knowledge Graph. There are still a number of questions to be answered, like how might Open Badges, a skills and competency definition using something like InLOC, and a Personal Knowledge Graph fit together. My slides for my IWMW14 talk, for what they are worth, are embedded below and I look forward to hearing your thoughts on this area.