Update 17/07/2014: An issue with this idea is that Google Analytics has a limit of 50 views (profiles) per GA account on consumer (free) accounts

At the Google Apps for Education European User Group (GEUG14) I highlighted how Google Analytics could be utilised for Learning Analytics. The type of analytics I have in mind goes beyond pageviews and includes event tracking, which in Google Analytics can be explored using segmentation and other built-in reporting. This approach is not focused on the individual but on generating actionable course- and programme-level insights. Whilst VLE/LMS vendors and platforms probably already support Google Analytics tracking in their products, access to this data often never gets beyond the account administrator. This, in my opinion, is a missed opportunity, as the reporting in Google Analytics could easily be applied in a Learning Analytics context.

The solution

Integrate your course creation and management processes with the Google Analytics Management API. With this in place, when a course is created or edited a filtered view of the main analytics is also created for the instructor. With a filtered view instructors would be able to access their own course analytics.

The main advantages of using Google Analytics for Learning Analytics are:

  • the overhead of processing event/click tracking is handled in the cloud by Google
  • scalable and manageable access to analytics
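As a sketch of how the course-creation hook might look: the payloads below target the Management API v3 profiles (views) and filters resources. The course URL scheme, ids and all function names here are assumptions for illustration, not part of an existing integration.

```javascript
// Sketch: build the request bodies for creating a per-course view (profile)
// and an include filter with the GA Management API v3.
function courseViewPayload(course) {
  return { name: "Course: " + course.title };
}

function courseFilterPayload(course) {
  return {
    name: "Include course " + course.id,
    type: "INCLUDE",
    includeDetails: {
      field: "PAGE_REQUEST_URI",          // filter on the page path
      matchType: "BEGINS_WITH",
      expressionValue: "/course/" + course.id + "/", // assumed course URL scheme
      caseSensitive: false
    }
  };
}

var course = { id: "hist101", title: "History 101" };
var viewBody = courseViewPayload(course);
var filterBody = courseFilterPayload(course);
// In Apps Script these bodies would then be sent via the Analytics advanced
// service, e.g. Analytics.Management.Profiles.insert(viewBody, accountId, propertyId).
```

The filter keeps each instructor's view scoped to their own course pages, so the main property data stays untouched.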


Posted in Analytics, GDE, Google.


Back in 2011 I showed how you can use Google Apps Script to write POST/GET data to a Google Sheet. Over the years a couple of things have changed in Apps Script, so I thought it was worth a revisit.

The core concept behind the script is the same. You have a Google Sheet with a set of header column names that match the names of the data you are passing in. For example, if I had a form with:

<input name="bar" type="text" value="" />

I'd need a sheet with the column name 'bar'. For this post I'm going to assume we use a container-bound Apps Script in a Google Sheet, but you could easily modify this for a standalone script. So to start, either create a new Sheet or open an existing one, click Tools > Script editor and enter the code below, or copy this template.

Usage

There are a couple of ways you can use this script to collect data. You could use a very traditional HTML form with the web app URL as the action parameter. This would send users to a very unattractive JSON response, which you could alternatively beautify using the HtmlService. A nicer solution is to use AJAX to submit the data without refreshing or leaving the page. Below is a simple form based on this Stack Overflow jQuery Ajax POST example which sends responses to this Google Sheet (if you are reading this via RSS/email you need to visit this post):

The only real change to the Stack Overflow example is to specify the destination web app URL:

// fire off the request to the web app URL (instead of /form.php)
request = $.ajax({
    url: "https://script.google.com/macros/s/AKfycbzV--xTooSkBLufMs4AnrCTdwZxVNtycTE4JNtaCze2UijXAg8/exec",
    type: "post",
    data: serializedData
});

The example uses POST but you can also use GET. There is also more you can do when handling the data at the Apps Script end than writing to a Google Sheet. For example, if you wanted to send an email on each submission you could use the MailApp service and add something like:

MailApp.sendEmail("youremailaddress", "a subject", JSON.stringify(e.parameters));

in the try statement. If you do this there are a couple of things to remember. First, Apps Script web apps use versioning, which means changes to your script are not 'live' until you push a new version. To do this, save your script, then from the Script Editor select File > Manage versions... and 'Save New Version' before going into Publish > Deploy as web app and updating the Project Version. Also, when you add new services to your script the authentication scope changes and you need to approve the additional permissions. For example, if you add the MailApp service to your code you need to give permission to send email. The easiest way to trigger this in this example is from the Script Editor via Run > setup. I'm sure there are other trip-ups but hopefully this gets you most of the way.
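As an aside on the GET option mentioned above, here is a minimal sketch of building the equivalent GET request URL by hand; "XXXX" stands in for your web app deployment id and 'bar' matches a sheet column name:

```javascript
// Sketch: the same submission as a GET request, built by hand.
function buildGetUrl(webAppUrl, data) {
  var pairs = [];
  for (var key in data) {
    pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(data[key]));
  }
  return webAppUrl + "?" + pairs.join("&");
}

var url = buildGetUrl("https://script.google.com/macros/s/XXXX/exec",
                      { bar: "hello world" });
// doGet(e) receives the same e.parameter object that doPost(e) would
```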

Google Sheet/Apps Script Code

//  1. Enter sheet name where data is to be written below
var SHEET_NAME = "Sheet1";
        
//  2. Run > setup
//
//  3. Publish > Deploy as web app 
//    - enter Project Version name and click 'Save New Version' 
//    - set security level and enable service (most likely execute as 'me' and access 'anyone, even anonymously') 
//
//  4. Copy the 'Current web app URL' and post this in your form/script action 
//
//  5. Insert column names on your destination sheet matching the parameter names of the data you are passing in (exactly matching case)

var SCRIPT_PROP = PropertiesService.getScriptProperties(); // new property service

// If you don't want to expose either GET or POST methods you can comment out the appropriate function
function doGet(e){
  return handleResponse(e);
}

function doPost(e){
  return handleResponse(e);
}

function handleResponse(e) {
  // shortly after my original solution Google announced the LockService[1]
  // this prevents concurrent access overwriting data
  // [1] http://googleappsdeveloper.blogspot.co.uk/2011/10/concurrency-and-google-apps-script.html
  // we want a public lock, one that locks for all invocations
  var lock = LockService.getPublicLock();
  lock.waitLock(30000);  // wait 30 seconds before conceding defeat.
  
  try {
    // next set where we write the data - you could write to multiple/alternate destinations
    var doc = SpreadsheetApp.openById(SCRIPT_PROP.getProperty("key"));
    var sheet = doc.getSheetByName(SHEET_NAME);
    
    // we'll assume the header is in row 1 but you can override with header_row in GET/POST data
    var headRow = e.parameter.header_row || 1;
    var headers = sheet.getRange(headRow, 1, 1, sheet.getLastColumn()).getValues()[0];
    var nextRow = sheet.getLastRow()+1; // get next row
    var row = []; 
    // loop through the header columns
    for (var i in headers){
      if (headers[i] == "Timestamp"){ // special case if you include a 'Timestamp' column
        row.push(new Date());
      } else { // else use header name to get data
        row.push(e.parameter[headers[i]]);
      }
    }
    // more efficient to set values as [][] array than individually
    sheet.getRange(nextRow, 1, 1, row.length).setValues([row]);
    // return json success results
    return ContentService
          .createTextOutput(JSON.stringify({"result":"success", "row": nextRow}))
          .setMimeType(ContentService.MimeType.JSON);
  } catch(err){
    // if an error occurs return it ('e' is already the event parameter, so use another name)
    return ContentService
          .createTextOutput(JSON.stringify({"result":"error", "error": err}))
          .setMimeType(ContentService.MimeType.JSON);
  } finally { //release lock
    lock.releaseLock();
  }
}

function setup() {
    var doc = SpreadsheetApp.getActiveSpreadsheet();
    SCRIPT_PROP.setProperty("key", doc.getId());
}

I suppose I should start with why you would want to do this. Every time I join a Blackboard Collaborate session it's like stepping back in time. Besides the appearance, the technology in Collaborate is increasingly out of step with current trends. The software is built on Java, which is claimed to run on over 3 billion devices. Unfortunately, for a number of users those 3 billion devices often don't include the computer on their desk. Here is where the problems start, as without enough permissions you won't be able to install Java. To Blackboard's credit they have spent time developing mobile apps, but not everyone is going to be able to use these either.

Aware of these barriers, for ocTEL we decided to investigate streaming Collaborate sessions to YouTube. The main advantage for us in getting this to work is that as well as giving an alternative means to watch the session, we immediately have a recording to share with those who missed it. You can see the results in this session from week 3 of ocTEL.

In this post I'll outline the technique we used, which can also be applied more generally to any desktop application. It's also worth highlighting that this is just one of many ways of streaming your desktop; you could achieve similar results using a Google Hangout On Air or the free 'Wirecast for YouTube' software (Mac|Windows). The reason we didn't go down that route was that we wanted more control over the part of the screen being shared and we didn't want to have to buy a Wirecast Pro license.


Posted in How-to, Streaming.

Update 24/06/2014: Recording of the session currently here

As well as talking about Google {Learning} Analytics at the Google Apps for Education European User Group Meeting (GEUG14) at the University of York I’m also doing a session on Google Apps Script. The abstract I submitted at the time for the session was:

Recently Google announced Add-ons, which allow anyone to enhance Google Documents and Sheets with customised features. Already there are a number of add-ons to support teaching and learning, such as bibliography and track-changes tools. Add-ons are developed in Google Apps Script. Apps Script is free for anyone with a Google account and not only lets you author your own add-ons but also lets you automate your workflows within Google Apps, integrate with external APIs, and more. This talk will introduce users to add-ons, exploring some educational scenarios. As part of this we will discuss some threats and opportunities. We will then touch upon how add-ons are authored using Apps Script and highlight opportunities for personalised automation of workflows.

Having been to two other GEUG events I know the event attracts a diverse audience, from educators to administrators to developers. This is one of the event's strengths, but for me it makes presenting something on Google Apps Script difficult, particularly thinking back to my own start with very little coding experience yet still being able to make stuff.

Another factor I have to consider is that since submitting the session proposal Google has announced Classroom. Classroom is still in development but looks like a layer on top of Google Drive for administering assignments. This type of functionality has been achievable with Apps Script for a long time and I will definitely be highlighting the awesome work Andrew Stillman & Co. have been doing with scripts like Doctopus.

A further factor in my mind is the feedback I'm getting from my circles that add-ons are causing Google Apps admins headaches, in that there are currently no built-in fine-grained controls to restrict certain add-ons (3rd-party solutions are available). I don't know how many admins are choosing to switch off add-ons on their Google Apps domain; it's one of the questions I'm looking for feedback on at the event.

All this has shaped the focus of my talk, but hopefully there is enough there for everyone. You can tune in to the streamed Google Hangout of the session on 23rd June at 3pm BST (see programme). The current slides for my talk are below:


Update 24/06/2014: Recording of the session currently here

For a while I've been interested in the intersection between Google Analytics (and Google's other analytics reporting APIs, like YouTube's) and the field of Learning Analytics. There are a number of features of Google Analytics, such as segmentation, A/B testing and event tracking, which I believe could potentially give useful insight into teaching and learning; see, for example, last year's look at tracking and validating MCQs with GA.

One of the reasons for my interest in this area is the ubiquity of Google Analytics, with the majority of institutions already using it for their main institutional websites. With this power comes responsibility: whilst Google Analytics usage policies prevent you using it to track personally identifiable information, you are still tracking people, which should never be forgotten.

The Google Apps for Education European User Group Meeting (GEUG14) at the University of York is another opportunity to road-test some of these ideas. Preparing for an event not only sees me revisiting prior knowledge but often becomes an opportunity to create something new. This can be a product, like the Google Analytics data bridge made for IWMW13, or new knowledge.

This time personal exploration has taken me into the land of Google Tag Manager. Those familiar with the mechanics of Google Analytics tracking will know that it usually requires adding code to every page you want to track. Often this can be achieved by modifying page templates. But what if those templates are hard or costly to edit, or you want to change what is tracked? This is where Google Tag Manager comes in. Like Google Analytics, you need to install some code in your pages. After that, what and how you track becomes completely cloud based: through Google Tag Manager you can add additional code/markup to your pages, even setting up rules to decide when it is used. Whilst Tag Manager is built around Google products like Analytics and Ads, you can use it for other purposes. This video gives you an overview of Google Tag Manager.
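As a rough illustration (the event and field names here are invented, and the GTM container snippet is assumed to be installed already): page code only ever talks to the dataLayer, and what happens next, such as forwarding the event to Google Analytics, is configured as rules and tags in the Tag Manager interface rather than in the page.

```javascript
// Sketch: page code pushes events to the shared dataLayer; a GTM rule
// decides whether and how to forward them (e.g. as a GA event).
var dataLayer = dataLayer || [];

function trackLearningEvent(action, label) {
  dataLayer.push({
    event: "learning-interaction", // matched by a rule in the GTM interface
    action: action,
    label: label
  });
}

trackLearningEvent("mcq-answered", "week3-q2");
```

The point is that changing what gets tracked then means editing the container in Tag Manager, not redeploying page templates.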

Below are the slides from my session, which will hopefully be streamed via Google Hangout on 23rd June at 11:30am BST (see programme).

Posted in Analytics, GDE, Google.


Update 18/06/2014: The Open Badges Issuer Add-on is now also available in the WordPress Plugin Directory. Get the Open Badges Issuer Add-on

ALT's Open Course in Technology Enhanced Learning (ocTEL) is entering its final week. ocTEL has been, and continues to be, an excellent opportunity to explore ways in which we support ALT's community of members. Last year the work we did in setting up a blog and community aggregation site fed directly into the development of the ALT conference platform. This year one of the areas we were keen to explore was the digital badging of community contributions. The use of community badging is well-founded and predates the web itself. The area has, however, gained extra interest from educators, in part due to Mozilla Open Badges. Mozilla Open Badges specify a framework for the description and award of badges using a common specification. The main benefit of this approach is interoperability: recipients of Open Badges can collect badges in one place, manage them into collections and control how they are shared across sites, social networks and personal portfolios. One such place is a Mozilla Backpack.

In the case of ocTEL the creation and award of digital badges, particularly within a community context, has been made very easy thanks to the BadgeOS™ plugin for WordPress. BadgeOS has a number of methods which trigger the awarding of badges, including reviewed submissions as well as the completion of a defined set of steps.

One issue for us has been that issuing Open Badges with BadgeOS requires integration with the badge awarding and display site Credly. Sites like Credly are very useful parts of the badge ecosystem, but our feeling was that if we were going to issue Open Badges we would take on the commitment of hosting the badge data ourselves rather than relying on a 3rd party. BadgeOS, regardless of whether you turn on Credly integration, still provides an excellent framework for creating and awarding digital badges. Even better, BadgeOS is open source and actively encourages developers to extend and enhance the plugin's core functionality. If you look at the BadgeOS Developer Resources there are a number of ways this can be achieved.

With this in mind, and with the support of ALT, I decided to make my own contribution to BadgeOS with the development of the Open Badges Issuer Add-on. This add-on achieves two things:

  • Badges designed and awarded using BadgeOS are now exposed as Open Badges compliant Assertions. Assertions are the DNA of Open Badges: they are data files which describe the badge and identify who it has been awarded to. Behind the scenes the add-on takes the BadgeOS-created data and turns it into the required object recognised as an Open Badge. Critically, this data exists on the host site. For example, one of my ocTEL badges exists here and is shown below in a formatted view.

Open Badges Assertion

  • The creation of an interface for the user to add badges awarded using BadgeOS to their Mozilla Backpack. This is made a lot easier technically because the add-on uses the existing Issuer API, which provides most of the code to get the job done.
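For readers unfamiliar with the format, here is a sketch of what a hosted assertion of this kind looks like under the Open Badges 1.0 specification; every id, URL and hash below is a made-up illustration rather than real ocTEL data:

```javascript
// Sketch of a hosted Open Badges 1.0 assertion (all values invented).
var assertion = {
  uid: "octel-2014-0001",          // unique within the issuing site
  recipient: {
    type: "email",
    hashed: true,                  // the email is stored salted + hashed
    salt: "octel",
    identity: "sha256$c7ef86405ba71b85acd8e2e95166c4b111448089f2e1599f42fe1bba46e865c5"
  },
  badge: "https://example.org/badges/active-participant.json", // BadgeClass file
  verify: {
    type: "hosted",                // validators re-fetch this hosted JSON
    url: "https://example.org/assertions/octel-2014-0001.json"
  },
  issuedOn: 1403568000             // Unix timestamp
};
```

Because the verify URL points back at the issuing site, hosting the assertion yourself is exactly the commitment described above.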

The rollout of this plugin in ocTEL is complete, as detailed in 'ocTEL digital badges are now Open Badges: How to add them to your Mozilla Backpack'. I've also submitted the plugin as a recognised BadgeOS add-on, and it will shortly be appearing in the official WordPress Plugin Directory. Hopefully this add-on will make it easier for others to issue Open Badges through their WordPress-powered site.

Like all projects, this development greatly benefits from your feedback, whether coding suggestions or ideas for improved functionality, so please let us know what you think.

Download the Open Badges Issuer Add-on for BadgeOS

As part of some work I'm doing with the Open University around the OER Research Hub project I developed this high fidelity prototype which let users explore survey responses collected by the project (the short video below highlights the main features):

In the guest post I wrote on the 'OER Survey Exploratorium' I outlined the problem:

When presented with over 4,000 survey responses the challenge was how to let people explore the data set. The first thought invariably is: what is the shape of the data? In this case the survey responses were collected in Survey Monkey. After considering options like pulling the data into the OER Impact Map via the Survey Monkey API, the overhead in terms of developing user interfaces and squeezing it into a WordPress data structure led to the exploration of other options. The approach that looked like it would squeeze the most functionality out of little development time was to use Google Fusion Tables to host the data and put a visualisation layer over the top. The reason for choosing Fusion Tables is that it allows a Guardian Datastore model of letting the user easily access and reuse the source data, either in Fusion Tables itself or by exporting it into another tool. If you would like to peek at the data behind this there are two tables: one with the survey questions and another with the survey data.

I’ve extracted the main part of the code into this gist so you can get a sense of what’s going on. If this is something you are interested in doing yourself, there is some documentation on the Google Visualisation API for querying Google Fusion Tables. This page has one example of how you can fetch data from Fusion Tables. It’s worth noting, however, that as Google Fusion Tables implements the Chart Tools Datasource Protocol you can query it as a datasource. This allows you to use the Google Visualization API Query Language with its SQL-like syntax. The gist below is a reworking of this query example in the Google Code Playground, which you can use to see the differences. The main one is how the query is set, by specifying in the query which table the data comes from. A couple of notes I have on using Google Fusion Tables as a datasource in this way:

  • data returned is limited to 500 rows. If you want more you can turn to the full Google Fusion Tables API, which has a separate SQL-like query language. Using this API is rate limited and requires OAuth and/or an API key. I got more than 500 rows by using LIMIT and OFFSET in my queries. (The full Google Fusion Tables API is worth bookmarking for cross-referencing.)
  • using back-quotes ` as specified in the Visualisation API to wrap column names with spaces doesn’t appear to work. You specify columns by their name rather than A, B, C etc. as used in Google Sheets. (The Google Fusion Tables API specifies single quotes, which I don't think work in this scenario; this is an example of where cross-referencing helps.)
  • Google Fusion Tables doesn’t implement the OR operator (related issue ticket marked Won't Fix). When I mentioned this to Tony Hirst (@psychemedia) he suggested De Morgan's laws as an alternative.
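To make the LIMIT/OFFSET workaround above concrete, here is a small sketch of generating a set of paged Query Language strings; the table id and column names are invented for illustration:

```javascript
// Sketch: page through a Fusion Table 500 rows at a time by generating a
// sequence of Query Language strings with LIMIT/OFFSET.
function buildPagedQueries(tableId, columns, totalRows, pageSize) {
  var queries = [];
  for (var offset = 0; offset < totalRows; offset += pageSize) {
    queries.push("SELECT " + columns.join(", ") + " FROM " + tableId +
                 " LIMIT " + pageSize + " OFFSET " + offset);
  }
  return queries;
}

var queries = buildPagedQueries("1AbCdEf", ["'Question'", "'Response'"], 1200, 500);
// three query strings covering offsets 0, 500 and 1000
```

Each string would then be passed to a google.visualization.Query in turn and the results stitched back together client-side.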

Hope you enjoy, and I look forward to seeing your Google Fusion/Visualization mashups ;)

Posted in GDE, Google, Visualisation.

A repost from the ocTEL course blog outlining the way we set up the BadgeOS plugin for WordPress to issue badges as part of the course. This post follows on from an earlier post, 'ocTEL and the Open Badges Assertion', which highlights some progress towards directly issuing Open Badges using BadgeOS ... more to follow on this development.

Moira Maley recently wrote to us asking for some details on how the ocTEL course is configured to issue badges. As others might benefit from this, and with Moira's permission, here are her questions and my responses.


This was a post I prepared for another site. It got lost in the pending queue so is out of date (you can still register for ocTEL until the end of June), but I thought worth capturing this post here for future reference.

Last year ALT ran an 11-week-long open course in technology enhanced learning (ocTEL). ocTEL is back! And you can still register for this year's iteration of the course, which starts on 28th April 2014 and runs for 7 weeks. The 'course' introduces various aspects of TEL, from pedagogy and resource discovery to evaluation and management. Participating in an ocTEL feedback session at altc2013, it was interesting to reflect on the mindset people bring to these types of 'courses'. The word 'course' itself reinforces the idea that if you don't finish then you have somehow failed. At altc2013 Stephen Downes was kind enough to drop in to the ALT-C Live studio and talk about MOOCs with Seb Schmoller. As part of this Stephen explained that the conception of 'a course' can be misleading; he has subsequently written up more about what he means in this post. Changing people's perceptions can be challenging, and you can read more about how ocTEL is 'the open course you cannot fail' in a post by ALT's Chief Executive Maren Deepwell.

ocTEL is not just changing in approach and content: behind the scenes the platform we use is also evolving to include more social features, integration of accreditation options using digital badges, and enhanced course activity aggregation.

The development arc

ocTEL was a successful exploration of the Association hosting this type of event and an opportunity to explore ways of supporting distributed communities. Some of these experiments have already been built upon. For example, the 'course reader', which aggregates, displays and redistributes community activity, was subsequently also used as part of the altc2013 conference platform. This cycle of development continues, with the conference platform now being used to improve the course platform. The main change has been the inclusion of the social network plugin BuddyPress.

BuddyPress has been used within an educational context for a number of years, meaning there is already a rich vein of reported uses and supplementary plugins. One of these is BadgeOS, which integrates with BuddyPress to provide the functionality for various forms of accreditation and recognition using digital badges. As well as accrediting activities set by the tutor, BadgeOS also has an option we are keen to explore where participants can nominate or award badges to each other. Another feature of BuddyPress we think might be useful for the course is the ability for tutors and students to create their own groups. Whilst group forming can be very challenging within open courses, particularly given their distributed, chaotic nature and reduced situational awareness, we are interested to see how these work as it may help us find a solution for supporting ALT's other communities.

The last area of innovation continues the work funded through the MOOC Research Initiative (MRI), which explored the effectiveness of the course reader in attributing a person's contributions made across multiple networks. Whilst collecting data from 3rd-party sites is possible across a range of platforms, the identity of who made a post can be less clear cut. Sometimes this is deliberate, the person choosing to write under a nom de plume, but it can also be a result of restrictions on usernames placed by the site. In ocTEL our interest in this area is not to lift the veil on those who prefer to be anonymous, but instead to correctly attribute contributions to the original author. One of the reasons for doing this is that if course activity is used to accredit someone's learning, evidence of this activity may exist across different channels.

As part of the MRI grant we analysed data from the first iteration of ocTEL, which showed, given the data sources we targeted, an authorship reconciliation rate of around 50%. As part of the research we identified areas where we could easily improve the procedure used to match authors to an existing course database. Consequently we'll be incorporating these in the next version of ocTEL.

All these developments are going to be made available under an open source licence, so why not register for the course and experience the new ocTEL. Also, similar to last year, we'll be taking the opportunity to develop the platform during the course. One of the developments towards the top of the list is creating more data export options. These will include, at the personal level, a 'midata'-style export as well as general data feeds.

The ALT Scotland SIG has a lovely day lined up to discuss 'openness' in various aspects of education. It's particularly nice to see people from the Scottish Government and education coming together, and hopefully there will be a useful exchange of information and ideas. The event is free to attend but numbers are limited, so don't delay if you want to avoid disappointment. We'll hopefully be streaming the event to the ALT YouTube channel, so there should also be an opportunity to engage remotely.

Register now!

[On a personal note, I'm really looking forward to finally being able to meet Jonathan Worth in person #bigfan.]

Open Scotland is a free one day event that provides an opportunity for ALT Scotland SIG members and the wider community to come together and share ideas and experiences of adopting and promoting open educational practices across all sectors of Scottish education.  The event will highlight examples of open education innovation across the Scottish education sector, including adoption of open badges and open assessment and accreditation practices; development of open educational resources and courses and open frameworks for technology enhanced learning.  In addition to showcasing homegrown initiatives, the event will also look further afield to inspiring and innovative projects and developments across the UK. This event will also explore some of the drivers and barriers to embedding open education policy and practice within Scottish education, and will provide an opportunity to discuss the draft Scottish Open Education Declaration prepared by Open Scotland*.

The event has been made possible with support from ALT, SQA, Jisc RSC Scotland and hosting from the School of Informatics at the University of Edinburgh.

Event hashtags: #altc #openscot

*Open Education, Open Scotland builds on the Open Scotland Summit and is facilitated by ALT, Cetis, Jisc RSC Scotland and the SQA.

Register now!

Draft Programme

09:30-10:30 Registration (Tea/Coffee)
10:30-10:45 Welcome from ALT Scotland SIG – Linda Creanor, Glasgow Caledonian University and Joe Wilson, SQA
10:45-11:00 Update from ALT – Maren Deepwell, ALT
11:00-11:30 Scottish Government perspectives – Colin Cook, Deputy Director of Digital Strategy, Scottish Government
11:30-12:00 SFC Update – David Beards, Scottish Funding Council
OU Scotland’s Open Education Project – Ronald McIntyre, OU Scotland
12:00-12:30 Open Badges, Open Borders – Suzanne Scott, Borders College
12:30-13:30 Lunch
13:30-14:00 Open Courses – Jonathan Worth, Coventry University
14:00-14:30 Open Institutions – Natalie Lafferty, University of Dundee
14:30-15:00 GLOW – Ian Stewart, John Johnstone (tbc)
15:00-15:15 Coffee break
15:15-15:30 Scottish Open Education Declaration – Lorna M. Campbell, Cetis
15:30-16:00 Plenary discussion
16:00 Close
3 Jun 2014, 10:30 AM to 4:15 PM
The Forum
School of Informatics
University of Edinburgh
Edinburgh EH8 9LE
United Kingdom
Posted in Event.