Update 24/06/2014: A recording of the session is available here

For a while I’ve been interested in applying Google Analytics (and Google’s other analytics reporting APIs, such as YouTube’s) to the field of Learning Analytics. Google Analytics has a number of features, such as segmentation, A/B testing and event tracking, which I believe could give useful insight into teaching and learning; see, for example, last year’s look at tracking and validating MCQs with GA.
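As a concrete illustration of the event tracking idea (the category, action and label names here are hypothetical, not from the MCQ post), an answer to a multiple-choice question can be recorded as a standard analytics.js event:

```javascript
// Minimal stand-in for the analytics.js bootstrap: before the library
// loads, ga() simply queues commands for later processing.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Hypothetical example: record which option a student picked on an MCQ,
// so the answers show up as events in the GA reports.
ga('send', 'event', 'MCQ', 'answer', 'q1-option-b');
```

Once analytics.js itself loads it drains the queue, so the same call works before and after the library is available.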

One of the reasons for my interest in this area is the ubiquity of Google Analytics, with the majority of institutions already using it for their main institutional websites. With that power comes responsibility: whilst Google Analytics usage policies prevent you from tracking personally identifiable information, you are still tracking people, and that should never be forgotten.

The Google Apps for Education European User Group Meeting (GEUG14) at the University of York is another opportunity to roadtest some of these ideas. Preparing for an event not only sees me revisiting prior knowledge but often becomes an opportunity to create something new. This can be a product, like the Google Analytics data bridge made for IWMW13, or new knowledge.

This time personal exploration has taken me into the land of Google Tag Manager. Those familiar with the mechanics of Google Analytics tracking will know that it usually requires adding code to every page you want to track. Often this can be achieved by modifying page templates. But what if those templates are hard or costly to edit, or you want to change what is tracked? This is where Google Tag Manager comes in. Like Google Analytics, you need to install some code in your pages; after that, what and how you track becomes completely cloud based. Through Google Tag Manager you can add additional code/markup to your pages, even setting up rules to decide when it is used. Whilst Tag Manager is built around Google products like Analytics and Ads, you can use it for other purposes. This video gives you an overview of Google Tag Manager.
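For example (the event name below is invented), once the Tag Manager container snippet is installed, pages pass extra tracking information through the dataLayer rather than through further template edits:

```javascript
// The container snippet initialises window.dataLayer as a plain array;
// pages push messages onto it and Tag Manager rules decide which tags
// (Analytics events, ads pixels, arbitrary markup) fire in response.
var dataLayer = dataLayer || [];

// Hypothetical example: flag a video play so a GTM rule can fire an
// Analytics event tag without touching the page template again.
dataLayer.push({ 'event': 'videoPlay', 'videoTitle': 'GTM overview' });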

Below are the slides from my session, which will hopefully be streamed via Google Hangout on 23rd June at 11:30am BST (see programme).


Update 18/06/2014: The Open Badges Issuer Add-on is now also available in the WordPress Plugin Directory. Get the Open Badges Issuer Add-on

ALT’s Open Course in Technology Enhanced Learning (ocTEL) is entering its final week. ocTEL has been, and continues to be, an excellent opportunity to explore ways in which we support ALT’s community of members. Last year the work we did in setting up a blog and community aggregation site fed directly into the development of the ALT conference platform. This year one of the areas we were keen to explore was the digital badging of community contributions. The use of community badging is well-founded and predates the web itself, but the area has gained extra interest from educators in part due to Mozilla Open Badges, which provide a framework for the description and award of badges using a common specification. The main benefit of this approach is interoperability: recipients of Open Badges can collect badges in one place, manage them into collections and control how they are shared across sites, social networks and personal portfolios. One such place is a Mozilla Backpack.

In the case of ocTEL the creation and award of digital badges, particularly within a community context, has been made very easy thanks to the BadgeOS™ plugin for WordPress. BadgeOS has a number of methods which trigger the awarding of badges, including reviewed submissions as well as the completion of a defined set of steps.

One issue for us has been that issuing Open Badges with BadgeOS requires integration with the badge awarding and display site Credly. Sites like Credly are very useful parts of the badge ecosystem, but our feeling was that if we were going to issue Open Badges we should take on the commitment of hosting the badge data ourselves rather than relying on a third party. Regardless of whether you turn on Credly integration, BadgeOS still provides an excellent framework for creating and awarding digital badges. Even better, BadgeOS is open source and actively encourages developers to extend and enhance the plugin’s core functionality. If you look at the BadgeOS Developer Resources there are a number of ways this can be achieved.

With this in mind, and with the support of ALT, I decided to make my own contribution to BadgeOS with the development of the Open Badges Issuer Add-on. This add-on achieves two things:

  • Badges designed and awarded using BadgeOS are now exposed as Open Badges compliant Assertions. Assertions are the DNA of Open Badges: data files which describe the badge and identify who it has been awarded to. Behind the scenes the add-on takes the BadgeOS created data and turns it into the required object recognised as an Open Badge. Critically, this data exists on the host site. For example, one of my ocTEL badges exists here and is shown below in a formatted view.

Open Badges Assertion

  • The creation of an interface for the user to add badges awarded using BadgeOS to the Mozilla Backpack. Technically this was made a lot easier as the add-on uses the existing Issuer API, which provides most of the code needed to get the job done.
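To give a flavour of what the add-on exposes, a hosted assertion is just a small JSON document; the sketch below (all field values invented for illustration, following the Open Badges 1.0 spec) shows the shape, and the commented-out line shows how the Issuer API pushes its URL to the Backpack:

```javascript
// A minimal Open Badges 1.0 hosted assertion (all values hypothetical).
var assertion = {
  uid: 'octel-2014-001',
  recipient: {
    type: 'email',
    hashed: true,                      // the email is stored as a salted hash
    salt: 'deadsea',
    identity: 'sha256$c7ef86405ba71b85acd8e2e95166c4b111448089f2e1599f42fe1bba46e865c5'
  },
  badge: 'http://example.org/octel/badge/webinar-for-all.json',  // BadgeClass URL
  verify: {
    type: 'hosted',                    // verified by fetching the hosted JSON
    url: 'http://example.org/octel/assertions/octel-2014-001.json'
  },
  issuedOn: 1403222400                 // Unix timestamp of the award
};

// With Mozilla's Issuer API script loaded in the page, one call sends the
// hosted assertion URL to the user's Backpack:
// OpenBadges.issue([assertion.verify.url], function (errors, successes) {});
```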

The rollout of this plugin on ocTEL is complete, as detailed in ‘ocTEL digital badges are now Open Badges: How to add them to your Mozilla Backpack’. I’ve also submitted the plugin as a recognised BadgeOS add-on, and it will shortly be appearing in the official WordPress Plugin Directory. Hopefully this add-on will make it easier for others to issue Open Badges through their WordPress powered site.

Like all projects, this development greatly benefits from your feedback, whether coding suggestions or ideas for improved functionality, so please let us know what you think.

Download the Open Badges Issuer Add-on for BadgeOS

As part of some work I'm doing with the Open University around the OER Research Hub project I developed this high fidelity prototype which lets users explore survey responses collected by the project (the short video below highlights the main features):

In the guest post I wrote on the 'OER Survey Exploratoratorium' I outlined the problem:

When presented with over 4,000 survey responses the challenge was how to let people explore the data set. The first thought invariably is: what is the shape of the data? In this case the survey responses were collected in Survey Monkey. After considering options like importing the data into the OER Impact Map via the Survey Monkey API, the overhead in terms of developing user interfaces and squeezing it into a WordPress data structure led to the exploration of other options. The approach that looked like it would squeeze the most functionality out of little development time was to use Google Fusion Tables to host the data and put a visualisation layer over the top. The reason for choosing Fusion Tables is that it allows a Guardian Datastore model of letting the user easily access and reuse the source data, either in Fusion Tables itself or by exporting into another tool. If you would like to peek at the data behind this there are two tables: one with the survey questions and another with the survey data.

I’ve extracted the main part of the code into this gist so you can get a sense of what’s going on. If this is something you are interested in doing yourself, there is some documentation on the Google Visualisation API for querying Google Fusion Tables. That page has one example of how you can fetch data from Fusion Tables. It’s worth noting, however, that as Google Fusion Tables implements the Chart Tools Datasource Protocol you can query the data as a datasource. This allows you to use the Google Visualization API Query Language, which has a SQL-like syntax. The gist below is a reworking of this query example in the Google Code Playground, which you can use to see the differences. The main one is how the query is set, specifying in the query itself which table the data comes from. A couple of notes I have on using Google Fusion Tables as a datasource in this way:

  • Data returned is limited to 500 rows. If you want more you can turn to the full Google Fusion Tables API, which has a separate SQL-like query language but is rate limited and requires OAuth and/or an API key. I got more than 500 rows by using LIMIT and OFFSET in my queries. (The full Google Fusion Tables API is worth bookmarking for cross-referencing.)
  • Using back-quotes (`), as specified in the Visualisation API, to wrap column names with spaces doesn’t appear to work. You do specify columns by their name rather than A, B, C etc. as used in Google Sheets. (The Google Fusion Tables API specifies single quotes, which I don't think work in this scenario - an example of where cross-referencing helps.)
  • Google Fusion Tables doesn’t implement the OR operator (related issue ticket marked Won't Fix; when I mentioned this to Tony Hirst (@psychemedia) he suggested De Morgan's laws as an alternative).
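To make the datasource approach concrete, a gviz query is just a SQL-like string sent to the Fusion Tables datasource endpoint (the table id and column name below are made up); in the browser you would hand the same URL to `google.visualization.Query`, and LIMIT/OFFSET is how I paged past the 500-row cap:

```javascript
// Build a Query Language request against a Fusion Table datasource
// (hypothetical table id and column name).
var tableId = '1abc_DEFghiJKLmnoPQRstuvWXyz';
var sql = "SELECT 'Question', COUNT() FROM " + tableId +
    " GROUP BY 'Question' LIMIT 500 OFFSET 500";  // second page of results
var url = 'https://www.google.com/fusiontables/gvizdata?tq=' +
    encodeURIComponent(sql);

// In the browser this becomes:
// new google.visualization.Query(url).send(handleResponse);
```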

Hope you enjoy, and I look forward to seeing your Google Fusion/Visualization mashups ;)

A repost from the ocTEL course blog outlining the way we set up the BadgeOS plugin for WordPress to issue badges as part of the course. This post follows on from an earlier post, 'ocTEL and the Open Badges Assertion', which highlights some progress towards directly issuing Open Badges using BadgeOS ... more to follow on this development.

Moira Maley recently wrote to us asking for some details on how the ocTEL course is configured to issue badges. As others might benefit from this and with Moira's permission here are her questions and my responses. ...continue reading

1 Comment

This was a post I prepared for another site. It got lost in the pending queue so is out of date (you can still register for ocTEL until the end of June), but I thought worth capturing this post here for future reference.

Last year ALT ran an 11 week long open course in technology enhanced learning (ocTEL). ocTEL is back! And you can still register for this year’s iteration of the course, which starts on 28th April 2014 and runs for 7 weeks. The ‘course’ introduces various aspects of TEL, from pedagogy and resource discovery to evaluation and management. Participating in an ocTEL feedback session at altc2013, it was interesting to reflect on the mindset people bring to these types of 'courses'. The word 'course' itself also reinforces the idea that if you don't finish then you have somehow failed. At altc2013 Stephen Downes was kind enough to drop in to the ALT-C Live studio and talk about MOOCs with Seb Schmoller. As part of this Stephen explained that the conception of ‘a course’ can be misleading, and he has subsequently written up more about what he means in this post. Changing people's perception can be challenging, and you can read more about how ocTEL is ‘the open course you cannot fail’ in a post by ALT’s Chief Executive Maren Deepwell.

ocTEL is not just changing in approach and content; behind the scenes the platform we use is also evolving to include more social features, integration of accreditation options using digital badges, and enhanced course activity aggregation.

The development arc

ocTEL was a successful exploration of the Association hosting this type of event and an opportunity to explore ways of supporting distributed communities. Some of these experiments have already been built upon: for example, the ‘course reader’ which aggregates, displays and redistributes community activity was subsequently used as part of the altc2013 conference platform. This cycle of development continues, with the conference platform now being used to improve the course platform. The main change has been the inclusion of the social network plugin BuddyPress.

BuddyPress has been used within an educational context for a number of years, meaning there is already a rich vein of reported uses and supplementary plugins. One of these is BadgeOS, which integrates with BuddyPress to provide the functionality for various forms of accreditation and recognition using digital badges. As well as accrediting activities set by the tutor, BadgeOS also has an option we are keen to explore where participants can nominate or award badges to each other. Another feature of BuddyPress we think might be useful for the course is the ability for tutors and students to create their own groups. Whilst group forming can be very challenging within open courses, particularly given their distributed, chaotic nature and reduced situational awareness, we are interested to see how these work as it may help us find a solution for supporting ALT’s other communities.

The last area of innovation continues the work funded through the MOOC Research Initiative (MRI), which explored the effectiveness of the course reader in attributing a person’s contributions made in multiple networks. Whilst collecting data from third-party sites is possible across a range of platforms, the identity of who made a post can be less clear cut. Sometimes this is deliberate, the person choosing to write under a nom de plume, but it can also be a result of restrictions on usernames placed by the site. In ocTEL our interest in this area is not to lift the veil on those who prefer to be anonymous, but to correctly attribute contributions to the original author. One of the reasons for doing this is that, if course activity is used to accredit someone’s learning, evidence of that activity may exist across different channels.

As part of the MRI grant we analysed data from the first iteration of ocTEL, which showed, for the data sources we targeted, an authorship reconciliation rate of around 50%. The research also identified areas where we could easily improve the procedure used to match authors to the existing course database, and we’ll be incorporating these improvements in the next version of ocTEL.

All these developments are going to be made available under an open source license, so why not register for the course and experience the new ocTEL? Also, similar to last year, we’ll be taking the opportunity to develop the platform during the course. One of the developments towards the top of the list is creating more data export options, including personal ‘midata’ export as well as general data feeds.

The ALT Scotland SIG has a lovely day lined up to discuss 'openness' in various aspects of education. It's particularly nice to see people from the Scottish Government and education coming together, and hopefully there will be a useful exchange of information and ideas. The event is free to attend but numbers are limited, so don't delay if you want to avoid disappointment. We hope to stream the event to the ALT YouTube channel, so there should also be an opportunity to engage remotely.

Register now!

[on a personal note really looking forward to finally being able to meet Jonathan Worth in person #bigfan].

Open Scotland is a free one day event that provides an opportunity for ALT Scotland SIG members and the wider community to come together and share ideas and experiences of adopting and promoting open educational practices across all sectors of Scottish education. The event will highlight examples of open education innovation across the Scottish education sector, including adoption of open badges and open assessment and accreditation practices; development of open educational resources and courses and open frameworks for technology enhanced learning. In addition to showcasing homegrown initiatives, the event will also look further afield to inspiring and innovative projects and developments across the UK. This event will also explore some of the drivers and barriers to embedding open education policy and practice within Scottish education, and will provide an opportunity to discuss the draft Scottish Open Education Declaration prepared by Open Scotland*.

The event has been made possible with support from ALT, SQA, Jisc RSC Scotland and hosting from the School of Informatics at the University of Edinburgh.

Event hashtags: #altc #openscot

*Open Education, Open Scotland builds on the Open Scotland Summit and is facilitated by ALT, Cetis, Jisc RSC Scotland and the SQA.

Register now!

Draft Programme

09:30-10:30  Registration (Tea/Coffee)
10:30-10:45  Welcome from ALT Scotland SIG – Linda Creanor, Glasgow Caledonian University and Joe Wilson, SQA
10:45-11:00  Update from ALT – Maren Deepwell, ALT
11:00-11:30  Scottish Government perspectives – Colin Cook, Deputy Director of Digital Strategy, Scottish Government
11:30-12:00  SFC Update – David Beards, Scottish Funding Council
             OU Scotland’s Open Education Project – Ronald McIntyre, OU Scotland
12:00-12:30  Open Badges, Open Borders – Suzanne Scott, Borders College
13:30-14:00  Open Courses – Jonathan Worth, Coventry University
14:00-14:30  Open Institutions – Natalie Lafferty, University of Dundee
14:30-15:00  GLOW – Ian Stewart, John Johnstone (tbc)
15:00-15:15  Coffee break
15:15-15:30  Scottish Open Education Declaration – Lorna M. Campbell, Cetis
15:30-16:00  Plenary discussion
3 Jun 2014, 10:30 AM to 4:15 PM
The Forum
School of Informatics
University of Edinburgh
Edinburgh EH8 9LE
United Kingdom

1 Comment

No where in the raging discussion around MOOCs is there anyone talking about sharing the infrastructural/architectural work they’ve done freely with others – Jim Groom in Integrating FeedWordPress with BuddyPress

I wouldn’t say it was a raging discussion, more of a good old fashioned edtech geekout with myself, Alan Levine, Tom Woodward, and latterly Boone (not forgetting a mysterious silhouetted champion of open access ;), talking about the use of WordPress as a connectivist aggregation tool. The chat came about as I’m at the start of the next run of ocTEL, which has given me reams to write about but no time, and Alan and Tom are plotting their next course. We got together via a hangout (the recording is here and embedded below) to swap notes on WordPress as a tool to support open courses. We didn’t give much notice, but it was nice to see people like Greg Mcverry watching along and, underlining the fact that open=opportunity, one of the BuddyPress lead developers, WordPress guru and all-round cool guy Boone Gorges dropped in for some very useful input and advice. We chatted for over an hour, which leaves the 6-7 minute video sweet spot in the dust, but Martin Lugton has kindly watched the video and pulled out some screen notes.

We might try again next Friday for another get together, so watch your scope around 4pm GMT Friday to see if we pull something together. Leave a comment if there is anything you’d like us to talk about or if you would like a ping.


import.io is a nice service I’ve been dipping into for a while. It’s one of a number of services that provide structured web data scraping. One of the nice features of import.io is that it can:

transform any website into a table of data or an API in minutes without even writing any code

You load a webpage with their web browser app and start highlighting the parts of the page you’d like to extract. Int3rhacktives has a nice tutorial, ‘How to scrape data without coding? A step by step tutorial on import.io’, if you want to find out more.

Once you have the data you want extracted, import.io continues to keep the bar low, allowing easy data download in various formats including .csv, and if you want to use live data there are example integrations for Excel, Google Sheets and other programming languages.

Looking more closely at the Google Sheets integration, import.io documents a method that uses their REST API’s HTML table output, which is then wrapped in a Google Sheets ImportHtml formula, e.g.

=ImportHtml("https://query.import.io/store/connector/48fd118b-7572-44a6-816c-8f02d088fb6a/_query?_user=5895d593-9461-4b8b-8452-95bb82458bd2&_apikey=YOUR_API_KEY&format=HTML&input/webpage/url=http%3A%2F%2Fwww.scoop.it%2Ft%2Fgas", "table", 1)

import.io easy as 1, 2

I’m a big fan of the Google Sheets ‘import’ formulas and have some tutorials on these. The ‘import’ formulas are useful for quick results but not appropriate if you need to do additional manipulation or integration into other automated workflows. import.io do have a number of client libraries and code examples you can look at to address this, but the one I thought was missing was one for Google Apps Script. One of the great strengths of Apps Script is that it’s easy to create time-based routines to pull and push data around as and when needed. So, based on import.io’s PHP example, here’s what it would look like in Google Apps Script.
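In case the embedded gist doesn’t display, here’s a sketch of the same idea (the endpoint shape is inferred from the connector URL in the ImportHtml example above, so treat it as an assumption): a pure helper builds the request, and a thin wrapper runs it through Apps Script’s built-in UrlFetchApp:

```javascript
// Pure helper: assemble the URL and fetch options for an import.io
// connector query (endpoint shape assumed from the REST API URL above).
function buildImportioRequest(connectorGuid, input, userGuid, apiKey) {
  var url = 'https://query.import.io/store/connector/' + connectorGuid +
      '/_query?_user=' + encodeURIComponent(userGuid) +
      '&_apikey=' + encodeURIComponent(apiKey);
  return {
    url: url,
    options: {
      method: 'post',
      contentType: 'application/json',
      payload: JSON.stringify({ input: input })
    }
  };
}

// Thin wrapper: run the query with UrlFetchApp (Apps Script's HTTP client)
// and parse the JSON response.
function queryImportio(connectorGuid, input, userGuid, apiKey) {
  var req = buildImportioRequest(connectorGuid, input, userGuid, apiKey);
  var response = UrlFetchApp.fetch(req.url, req.options);
  return JSON.parse(response.getContentText());
}
```

Splitting the request-building out of the fetch keeps the fiddly part easy to test without making any network calls.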

You can read the Google Apps Script Documentation to find out more about what you can do with the result. Something the folks at import.io might want to think about is creating a Google Apps Script library; similar to their other client libraries, it would again lower the bar for developers. As a starter I’ve implemented the query method here, which means anyone creating an Apps Script project and including a library using the Project Key M2ZyMvVZdgKdy3SaLP8gq3X797_hv7HHb could just use:

function getImportioExample() {
  // Query the 'Integrate Page Example' connector; userGuid and apiKey
  // are your import.io credentials
  var result = importio.query("caff10dc-3bf8-402e-b1b8-c799a77c3e8c",
      {"searchterm": "avengers 2"}, userGuid, apiKey, false);
  Logger.log(result);
}

with the benefit of also getting a code autocomplete:


If you've already got Google Apps Script/import.io integrations I'd love to hear about them. Hopefully I'll follow-up this post with an example automation to illustrate what is possible.


Back in the good old days, when I was a member of the Glasgow based supergroup with my then colleagues Lorna Campbell and Sheila MacNeill, we were approached to write a chapter for the soon to be published ‘Reusing Open Resources’. We were tasked with writing something on ‘Analytics for Education’. Prior to print, our chapter along with four others has been published in the Journal of Interactive Media in Education (JiME) under a CC-BY license. You can read the full Analytics in Education chapter here; copied below is the section I had most input on, ‘future developments’.

Given ‘prediction is very hard, especially about the future’, it’s interesting to look back at what we wrote in the summer of 2013. Something we should perhaps have expanded upon was data privacy concerns, particularly in light of the news that non-profit inBloom is shutting down. I often find myself with conflicted interests between data collection as part of my personal quantified self and data collection for quantifying others. TAGS is a prime example: I initially wanted to collect data to understand the shape of the communities I was in, but now it is used by myself and others to extract data from communities we have no investment in.

And right now I'm developing the next iteration of ocTEL, where funding from the MOOC Research Initiative has helped find areas where we can improve data collection, in particular resolving identities across networks. Achieving this personally feels like progress, but I’m sure many others will disagree.

Are we bound by a data dogma? ...continue reading

Sometimes it’s useful to generate a column of data based on a series repeated x number of times, e.g. the series 1,2,3 repeated 3 times would give 1, 1, 1, 2, 2, 2, 3, 3, 3 (see column A here for an example). In my particular scenario I want to repeat week numbers for a series from 0-6. There are a number of ways you can do this, like indexing row numbers, but here’s a little formula I quickly threw together for Google Sheets:



  • D2 is a comma separated series e.g. Week 0,Week 1, Week …
  • D3 is the number of times to repeat

How does it work?

Like a lot of spreadsheet formulas, it starts in the middle with SPLIT(D2,","), which turns our series of values into an array. If you use this in a single cell in a Google Sheet, the values Week 0, Week 1 etc. will be split out across the columns.

Next we want to repeat Week 0 and so on x number of times. This is done with REPT, which repeats a given string x times. If we use this by itself it will only apply to the first column of data from the SPLIT, so we wrap it in an ARRAYFORMULA like so: ARRAYFORMULA(REPT(SPLIT(D2,",")&",",D3)). This repeats each series value the number of times specified in D3. Something to note is the &"," in the REPT, which adds a comma at the end of each repeated value.

ARRAYFORMULA Enables the display of values returned from an array formula into multiple rows and/or columns and the use of non-array functions with arrays.

This now gives us our repeating text, but spread across several columns, e.g. “Week 0,Week 0,Week 0”, “Week 1,Week 1,We..” etc. To get a single value in each cell we use the trick of a JOIN to turn our array of columns into a single cell value separated with commas, then use the SPLIT formula again to turn that single cell back into multiple cells.

The final part is to use TRANSPOSE to convert our row of data into a column. Here is the finished version of the Google Sheet with the stages broken down.
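Since the formula itself was embedded as an image, here is my reconstruction assembled from the steps above (reading from the inside out: SPLIT, REPT wrapped in ARRAYFORMULA, JOIN, SPLIT, TRANSPOSE); SPLIT’s default removal of empty values takes care of any doubled commas:

```
=TRANSPOSE(SPLIT(JOIN(",",ARRAYFORMULA(REPT(SPLIT(D2,",")&",",D3))),","))
```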

Can you think of a better way to do this?