Author Archives: Martin Hawksey

About Martin Hawksey

Learning Technology Advisor

We recently moved the ALT Online Newsletter, a self-hosted WordPress site, to https/SSL. We did this before Google announced it would use https as a ranking factor in its search results, so hopefully it will also give our traffic a positive boost. To do this we opened the WordPress dashboard and switched the WordPress and site URLs to https:

address and site url

This setting handles the URLs generated by WordPress for menus and post links but doesn’t affect hardcoded links in posts or inbound links. To handle these we added a couple of lines to our .htaccess file:

# HTTPS redirect
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}/$1 [R=301,L]
</IfModule>

This detects any requests made over plain http and redirects them to the https equivalent. All seemed to be fine until Matt Lingard noticed that our Feedburner RSS feed wasn’t distributing posts anymore. The issue: Feedburner can’t handle feeds served over https, generating a ‘Received HTTP error code 400 while fetching source feed’ error:

Received HTTP error code 400 while fetching source feed.

The solution is to detect FeedBurner and not redirect it to https, like so:

# HTTPS redirect
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_USER_AGENT} !FeedBurner    [NC]
RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC]
RewriteRule (.*) https://%{HTTP_HOST}/$1 [R=301,L]
</IfModule>

All straightforward enough, but there is one last kicker. Feedburner caches your feed, so when you go to Feedburner, open Edit Feed Details and click ‘Save Feed Details’, it may still show a 400 error. You can either wait for the cached version to clear to see if your changes have worked (or, as I did, spend an hour trying to work out why it wasn’t fixed) or trick Feedburner into ignoring its cached version with some throwaway text at the end of your URL as a query string, e.g. http://newsletter.alt.ac.uk/feed/?source=feedburner

Posted in WordPress.


I was building an analytics dashboard today that collected data from various services including Google Analytics and YouTube. Apps Script makes this very easy, as highlighted in my previous post. An issue I encountered when I tried to access our YouTube channel reports is that even though my account is attached as a manager I was getting a ‘Forbidden’ error. Turning to the Channel Reports documentation I discovered:

channel==CHANNEL_ID – Set CHANNEL_ID to the unique channel ID of the channel for which you are retrieving data. The user authorizing the request must be the owner of the channel.

As our YouTube channel is associated with our Google+ page, you can’t log in to Google Drive with that account. I did notice, however, that when I added YouTube Analytics as an advanced Apps Script service the authentication prompt gave an option of authenticating using our Google+ page.

auth window 

The issue then is that if you authenticate as the Google+ page you can’t get access to other services like Google Analytics. I thought of a couple of ways I might tackle this, such as writing a separate Apps Script project that just got the YouTube Analytics data and wrote it to the spreadsheet I was working on, though I wasn’t entirely sure how the permissions would work out. Instead my solution was to expose YouTubeAnalytics.Reports.query in a separate Apps Script published as a web app. Setting this to run as ‘anyone, even anonymously’, I could then use UrlFetchApp to get the data in my main script.

Here’s how I set it up. Below (or in this gist) the ‘main’ script handles all the data read/write to the sheet, while a separate ‘proxy’ Apps Script project runs the YouTube Analytics data collection.
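As the gist isn’t embedded here, below is a minimal sketch of the pattern, assuming the YouTube Analytics advanced service (v1) is enabled in the proxy project and that project is authorised as the Google+ page; the channel ID, default dates and web app URL are placeholders:

// 'Proxy' project: published as a web app (execute as me,
// access 'anyone, even anonymously')
function doGet(e) {
  var results = YouTubeAnalytics.Reports.query(
      'channel==CHANNEL_ID',                       // your channel ID
      e.parameter['start-date'] || '2014-01-01',   // default date range
      e.parameter['end-date'] || '2014-07-01',
      e.parameter.metrics || 'views',
      {dimensions: e.parameter.dimensions || 'day'});
  return ContentService
      .createTextOutput(JSON.stringify(results))
      .setMimeType(ContentService.MimeType.JSON);
}

// 'Main' project: fetch the proxied report with UrlFetchApp
function getYouTubeAnalyticsData() {
  var url = 'WEB_APP_URL?metrics=views';           // the proxy's published /exec URL
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  Logger.log(data);
}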

Note: This technique exposes our YouTube channel report to the world (barring security by obscurity). The method we are exposing is read only so we don’t have to worry about injection.

Feels a bit hacky but can you see a better way of doing this?

Update 22/07/2014: Matias Molinas had the great suggestion of writing the results to a sheet, which would avoid exposing your data. Jarom McDonald has also suggested that using Google App Engine would give security and scalability à la superProxy.


There are regular reporting features built in to Google Analytics, but what if you want to do customised reporting or other regular data collection and integration? One answer is Google Apps Script. Apps Script is a Google Drive feature that makes custom automation much easier. The big advantage of Apps Script is that you are already in a Google-authenticated environment. So while anyone can use the existing Google Analytics APIs, with Apps Script you don’t need to worry about the authentication handshake to get your data. This means you can access Google Analytics data in one line of code. There are a couple of tickboxes to get through before you reach that one line of code, but in this tutorial I’ll walk you through the process.

Note: You can also get Google Analytics (and other analytics data) in a Google Sheet with zero lines of code using add-ons. The limitation of these is that they cannot currently be run automatically on a schedule.

Setting up our project

Google Apps Script lives in a couple of places but for this example we are going to use a Google Sheet so step one is:

  1. From your Google Drive account create a new Spreadsheet
  2. In the new spreadsheet open Tools > Script editor…
  3. If this is your first script you might see a popup window with options to ‘Create projects for’; if so click ‘Close’
    Close Apps Script project window
  4. You should now see the Script editor
    Script editor
  5. The connection to Google Analytics is a Google Apps Script ‘Advanced Service’ and we need to turn it on. To do this, in the Script Editor select Resources > Advanced Google services
  6. At this point you will be prompted to create a project name. Enter a name for your project and click ‘OK’.
  7. You should now see a list of available advanced services. For this project we just want to turn on the Google Analytics API by clicking its on/off toggle. So far we have enabled Analytics in our project, but we also need to click on the Google Developers Console link highlighted to enable it at the console end.
    Advanced services control
  8. Similarly to the ‘Advanced Google services’ dialog, we need to enable the Analytics API by clicking its on/off toggle (at this point you may be prompted to review and accept terms of service)
  9. Once enabled in the Developer Console you can close this tab/window and click OK in the ‘Advanced Google Services’ box in your Script Editor.

To recap, our project is now configured to use the Google Analytics API. You can use multiple advanced services in the same project as needed. Remember, ‘Advanced Services’ need to be enabled for each new project you create, but only once per project.

Getting Google Analytics data (and some basic Script Editor tips)

For those familiar with coding environments, the Script Editor comes with code autocomplete. This is useful when starting out, rather than remembering a long list of classes and methods. To use it:

In the Script Editor click on line 2 between the curly braces (‘{‘ and ‘}’) and press CTRL + SPACE on PC or ⌘ + SPACE on Mac; from this you can select Analytics from the list. To continue the autocomplete, follow it with a dot ‘.’, which brings up the associated options. The line we want to get to is Analytics.Data.Ga.get(ids, start-date, end-date, metrics, optionalArgs);

autocomplete

So you’ll see there are a couple of parameters we need to provide, defined in the GA Core Reporting Query Parameters Summary:

  • ids - Unique table ID for retrieving Analytics data. Table ID is of the form ga:XXXX, where XXXX is the Analytics view (profile) ID.
  • startDate - Start date for fetching Analytics data. Requests can specify a start date formatted as YYYY-MM-DD, or as a relative date (e.g., today, yesterday, or 7daysAgo). The default value is 7daysAgo.
  • endDate - End date for fetching Analytics data. Requests should specify an end date formatted as YYYY-MM-DD, or as a relative date (e.g., today, yesterday, or 7daysAgo). The default value is yesterday.
  • metrics - A comma-separated list of Analytics metrics. E.g., 'ga:sessions,ga:pageviews'. At least one metric must be specified.
  • optionalArgs – an object of the optional query parameters like dimensions, segment, filters, sort etc. An example from this Google Analytics Apps Script tutorial is:
var optionalArgs = {
 'dimensions': 'ga:keyword',              // Comma separated list of dimensions.
 'sort': '-ga:sessions,ga:keyword',       // Sort by sessions descending, then keyword.
 'segment': 'dynamic::ga:isMobile==Yes',  // Process only mobile traffic.
 'filters': 'ga:source==google',          // Display only google traffic.
 'start-index': '1',
 'max-results': '250'                     // Display the first 250 results.
};
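Putting these parameters together, a complete call looks like this (the ids value is a placeholder for your own view ID):

var results = Analytics.Data.Ga.get('ga:XXXX', '7daysAgo', 'yesterday',
    'ga:sessions,ga:pageviews', optionalArgs);
Logger.log(results);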

Building a Google Analytics Query with Query Explorer

If you are unfamiliar with the Google Analytics Core Reporting API, building a query can be quite daunting. Fortunately the Google Analytics team have made the Google Analytics Query Explorer, which gives you an interactive interface to build and test queries. Here’s an example to query your top referring sites, which should give you a page like this:

Google Analytics Query Explorer 2

If you haven’t used the Query Explorer before you’ll need to click on the ‘Authorize Access’ button, which will enable a ‘Get Data’ button. You can test and tweak your query as much as you like using the ‘Get Data’ button to see what is returned from your Google Analytics accounts. The Query Explorer is a great starting point but remember it only contains a few of the optional arguments.

If you are using the Query Explorer, below is a helper script for converting your Query URI into an Apps Script snippet. To use it, build your query in the Query Explorer, press the ‘Query URI’ button to get a URI, paste it into the textfield and then click ‘Submit’:

The little helper script gives us some extra code to store/pass the parameters we need. If you are scheduling this script to run on a regular basis you’ll need to modify the start/end date in the query. There are two main ways you can do this: either building the date by manipulating a Date instance formatted as YYYY-MM-DD, or using a relative date (e.g., today, yesterday, or NdaysAgo where N is a positive integer). Below is an example of a modified query made with the Query Explorer and exported to generate the script we need, which gets data from the last 7 days. In this example I’ve wrapped it in a new function named fetchMyQuery. Note: in your version you need your own ids value to return results.

function fetchMyQuery() {
  var query = {
    "optionalArgs": {
      "dimensions": "ga:source",
      "filters": "ga:medium==referral",
      "sort": "-ga:pageviews",
      "max-results": "50"
    },
    "ids": "ga:82426939",
    "metrics": "ga:pageviews,ga:sessionDuration,ga:exits",
    "start-date": "7daysAgo",
    "end-date": "yesterday"
  };
  var results = Analytics.Data.Ga.get(query.ids, query['start-date'], query['end-date'], query.metrics, query.optionalArgs);
  Logger.log(results);  
}
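If you would rather build fixed YYYY-MM-DD strings than use relative dates, a small helper along these lines does the job (a sketch using Utilities.formatDate; the function name is my own):

function daysAgoISO(n) {
  var d = new Date();
  d.setDate(d.getDate() - n);  // step back n days from today
  return Utilities.formatDate(d, Session.getScriptTimeZone(), 'yyyy-MM-dd');
}
// e.g. "start-date": daysAgoISO(7), "end-date": daysAgoISO(1)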

Testing/Debugging Apps Script

With your script saved we can test the code. When developing scripts there is a debug feature, demonstrated in the video below. (The first time you run a new script you need to authorise it; this only needs to be done the first time the script runs or when new permissions are required.)

Insert Data Into A Spreadsheet

The final step is to output the results from our query into Google Sheets. For this example let’s reuse a modified version of the outputToSpreadsheet method from the Automated Access to Google Analytics Data in Google Sheets tutorial:

function outputToSpreadsheet(results, sheet) {
  // Print the headers.
  var headerNames = [];
  for (var i = 0, header; header = results.getColumnHeaders()[i]; ++i) {
    headerNames.push(header.getName());
  }
  sheet.getRange(1, 1, 1, headerNames.length)
      .setValues([headerNames]);

  // Print the rows of data.
  sheet.getRange(2, 1, results.getRows().length, headerNames.length)
      .setValues(results.getRows());
}

This function inserts all the header and reporting data into the sheet. For more information on how to insert data into Google Sheets with Apps Script there is a Storing Data in Spreadsheets tutorial.

The outputToSpreadsheet method in our example is expecting two objects to be passed in. To do this your fetchMyQuery method needs to get a Sheet object and pass it to outputToSpreadsheet. The entire project should look like this:
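As the embedded version isn’t reproduced here, a sketch of the finished function (the sheet name ‘Sheet1’ is an assumption):

function fetchMyQuery() {
  var query = {
    "optionalArgs": {
      "dimensions": "ga:source",
      "filters": "ga:medium==referral",
      "sort": "-ga:pageviews",
      "max-results": "50"
    },
    "ids": "ga:82426939",  // replace with your own view ID
    "metrics": "ga:pageviews,ga:sessionDuration,ga:exits",
    "start-date": "7daysAgo",
    "end-date": "yesterday"
  };
  var results = Analytics.Data.Ga.get(query.ids, query['start-date'],
      query['end-date'], query.metrics, query.optionalArgs);
  // get the sheet to write to and pass it in with the results
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1');
  outputToSpreadsheet(results, sheet);
}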

When you now select Run > fetchMyQuery you should see the data written to the sheet you specified.

Automate the Script

This project has been about automated data collection, so let’s look at how we set this up. Google Apps Script makes automation very easy using the triggers feature. To set this up:

  1. In the Script Editor click Resources > Current project’s triggers
  2. Click ‘No triggers set up. Click here to add one now’
  3. Let’s configure fetchMyQuery to run once a week by setting:
    • The Run dropdown to: fetchMyQuery
    • The Events dropdown to: Time-driven, selecting Week timer to run every Monday between 7:00 a.m. and 8:00 a.m.

Triggers

Once saved, this script will run as scheduled with no need for you to have the sheet open. If you would like to be told when the script fails whilst unattended, click the notifications link, which opens a dialogue box where you can configure which email errors are sent to and when.
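If you prefer code to dialog boxes, the same weekly trigger can also be created programmatically (a sketch using the ScriptApp service, reusing the function name from the example above):

function createWeeklyTrigger() {
  ScriptApp.newTrigger('fetchMyQuery')
      .timeBased()
      .onWeekDay(ScriptApp.WeekDay.MONDAY)
      .atHour(7)  // fires in the hour starting 7 a.m. (script time zone)
      .create();
}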

Summary

This post has introduced you to using Google Apps Script to automatically collect and write Google Analytics data to a Google Sheet. Once you get started with Apps Script you’ll start discovering many more opportunities to automate tasks. For example, as part of our triggered event we could notify people of an update using the MailApp service or write an entire document using the Document Service. For some ideas you might want to read about Analytics reporting with Google Apps Script at the UK Cabinet Office. If you are just starting to use Google Apps Script, the Google Developers site has extensive documentation and tutorials, and if you get stuck there is a dedicated tag on Stack Overflow and an active community on Google+.


At 4pm today (16th July, 2014) I’ll be giving a talk at the Institutional Web Managers Workshop (IWMW14) in Newcastle. The sessions aren’t being streamed but I’ll see if I can stream to my YouTube channel. The main idea I want to convey is that in a world which is benefiting from being digitally distributed, networked and increasingly crowd-driven, the IWMW audience is in a prime position to support their institutions in creating opportunities for learning aligned to this. In particular, I want to highlight the work George Siemens is proposing around the Personal Knowledge Graph (PKG):

The big shift that needs to be made in education is to shift from knowing content to knowing learners.

What is needed in education is something like a Personal Learner Knowledge Graph (PLKG): a clear profile of what a learner knows – Siemens, G (2014)

There are a number of challenges with this idea, both technical and ethical, but having played with ideas that could potentially highlight aspects of a person’s latent knowledge capacity, the concept excites me. As part of my presentation I’ll be highlighting the ways the latest iteration of ALT’s Open Course in Technology Enhanced Learning (ocTEL) aligns to a Personal Knowledge Graph. One of the strongest features of ocTEL, which we completely fluked, was how Open Badges can be used to support the construction of part of a Personal Knowledge Graph.

Open Badges in ocTEL

As part of ocTEL the course was broken into weekly topics. Each week there were five types of badge available for that topic (this was based on the BlendKit Course):

  • Check-in: awarded for reading the week’s course material
  • Webinar: awarded for attending or watching the webinar
  • TEL One: awarded for completing each week’s ‘If you only do one thing’ activity
  • TEL Explorer: awarded for completing each week’s ‘TEL Explorer’ activities (there could be more than one)
  • Topic Badge: awarded for completing at least 3 of the above

All the badges, with the exception of TEL One and TEL Explorer, were automatically awarded by the click of a button, entering an activity code, or the system detecting a configured set of requirements. TEL One and TEL Explorer were awarded on the participant providing a URL evidencing their activity, which was manually reviewed. Other badges were available outside the weekly set for community-related activity, including badges automatically awarded for adding details to a profile and for making posts on other sites recorded in ocTEL as part of our data aggregation using FeedWordPress. There were also badges tutors could individually award to recognise contributions to the course.

Open Badges = new nodes and edges

Besides badging appearing to be a positive influence on course engagement, there are several features of the badging system used in ocTEL, the BadgeOS plugin for WordPress, that you could argue supported the construction and use of Personal Knowledge Graphs:

Situational awareness

As part of the badging system there was an option to display who else had earned a badge. With this there is an opportunity to be aware of who else is active (an example badge page). Given this was available on all badges, activity became categorised by level (basic check-in to advanced activity) and topic. In the current configuration we have not made connections between these, but it would prove an interesting area for further research. Another area for improvement is that profile pictures link only to the person’s general profile page; whilst this has uses, outlined below, in the case of badges with evidence attached that link isn’t exposed.

Who's been awarded

A profile of what knowledge has been gained by an individual is arguably a key aspect of a Personal Knowledge Graph. The badging plugin, combined with the BuddyPress social network plugin, integrates achievements into a person’s profile (an example profile here). This provides another entry point for people to make connections. Whilst BuddyPress has the facility for friend/follower relationships, we were keen for participants to identify the personal 3rd-party spaces they exist in.

Another aspect of BuddyPress/BadgeOS we only briefly experimented with was the inclusion of badge awards in the person’s activity stream. With a friending tie, this activity could also have been pushed to followers via email.

ocTEL achievement profile

Self-declared activity

One challenge of open or distributed activity is collecting that activity. ocTEL uses the FeedWordPress model to aggregate activity from participant blogs and other networks. Whilst the hashtag provides some means of collecting information from self-organising spaces, there will always be issues with data collection. With the TEL One and TEL Explorer badges requiring the person to submit a URL as evidence that they have completed the activity, this issue is partially overcome, the individual becoming the curator of information in their knowledge graph.

Interoperability

The final aspect of badging in ocTEL worth noting is our effort to move from the ‘digital badging’ natively used in the BadgeOS plugin to directly issuing Open Badges. This small difference potentially has a big impact. Most notably it means the ocTEL site exposes information about achievements in the interoperable Open Badges specification. The benefit for the user is that they can display badges in other systems; on a personal knowledge graph level it means these achievements are machine readable.

Summary

This hastily written post hopefully gives you a flavour of how the use of Open Badges, or digital badging in general, could support the construction of part of a Personal Knowledge Graph. There are still a number of questions to be answered, like how Open Badges, a skills and competency definition using something like InLOC, and a Personal Knowledge Graph might fit together. My slides for my IWMW14 talk, for what they are worth, are embedded below and I look forward to hearing your thoughts on this area.

Update 17/07/2014: An issue with this idea is that Google Analytics has a consumer limit of 50 Views (Profiles) per GA Account

At the Google Apps for Education European User Group (GEUG14) I highlighted how Google Analytics could be utilised for Learning Analytics. The type of analytic I have in mind goes beyond pageviews and includes event tracking, which through Google Analytics can be explored using segmentation and other built-in reporting. This approach is not focused on the individual but on generating course and programme actionable insights. Whilst VLE/LMS vendors and platforms are probably already supporting Google Analytics tracking in their products, access to this data often never gets beyond the account administrator. This, in my opinion, is a missed opportunity, as the reporting in Google Analytics could easily be applied in a Learning Analytics context.

The solution

Integrate your course creation and management processes with the Google Analytics Management API. With this, when a course is created or edited a filtered view of the main analytics is also created for the instructor. With a filtered view, instructors would be able to access their own course analytics.
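A minimal sketch of what that integration might look like using the Analytics advanced service in Apps Script; the account/property IDs and the course URL structure are assumptions, not a tested implementation:

function createCourseView(accountId, webPropertyId, courseCode) {
  // create a new view (profile) on the main web property
  var view = Analytics.Management.Profiles.insert(
      {name: 'Course view: ' + courseCode}, accountId, webPropertyId);
  // create a filter that only includes pages under the course path
  var filter = Analytics.Management.Filters.insert({
    name: 'Include ' + courseCode + ' only',
    type: 'INCLUDE',
    includeDetails: {
      field: 'PAGE_REQUEST_URI',
      expressionValue: '^/courses/' + courseCode + '/',  // hypothetical URL structure
      caseSensitive: false
    }
  }, accountId);
  // link the filter to the new view so instructors only see their course
  Analytics.Management.ProfileFilterLinks.insert(
      {filterRef: {id: filter.id}}, accountId, webPropertyId, view.id);
  return view;
}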

The main advantages of using Google Analytics for Learning Analytics are:

  • the overhead of processing event/click tracking is handled in the cloud by Google
  • scalable and manageable access to analytics


Posted in Analytics, GDE, Google.


Back in 2011 I showed how you can use Google Apps Script to write POST/GET data to a Google Sheet. Over the years a couple of things have changed in Apps Script, such as the introduction of the Lock Service and Properties Service used in the revised code below, so I thought it was worth a revisit.

The core concept behind the script is the same: you have a Google Sheet with a set of header column names that matches the names of the data you are passing in. For example, if I had a form with:

<input name="bar" type="text" value="" />

I'd need a sheet with the column name 'bar'. For this post I’m going to assume we use a container-bound Apps Script in a Google Sheet, but you could easily modify this for a standalone script. To start, either create or open an existing Sheet, click Tools > Script editor and enter the code below, or copy this template.

Usage

There are a couple of ways you can use this script to collect data. You could use a very traditional HTML form with the web app URL as the action parameter; this would send users to a very unattractive JSON response, which you could beautify using the HtmlService. A nicer solution is to use AJAX to submit the data without refreshing or moving page. Below is a simple form based on this Stackoverflow jQuery Ajax POST example which sends responses to this Google Sheet (if you are reading this via RSS/email you need to visit this post):
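If you can’t see the embedded demo, a minimal stand-in based on that example looks like this (the field name matches the ‘bar’ column above; the web app URL is a placeholder):

<form id="foo">
  <input name="bar" type="text" value="" />
  <button type="submit">Send</button>
</form>
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script>
$("#foo").submit(function(event) {
  event.preventDefault();             // stop the normal form submission
  var serializedData = $(this).serialize();
  $.ajax({
    url: "YOUR_WEB_APP_URL",          // the 'Current web app URL' from publishing
    type: "post",
    data: serializedData
  });
});
</script>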

The only real change to the Stackoverflow example is to specify the destination web app URL:

// fire off the request to /form.php
		request = $.ajax({
			url: "https://script.google.com/macros/s/AKfycbzV--xTooSkBLufMs4AnrCTdwZxVNtycTE4JNtaCze2UijXAg8/exec",
			type: "post",
			data: serializedData
		});

The example is using POST but you can also use GET. There is more you can do when handling the data at the Apps Script end than writing to a Google Sheet. For example, if you wanted to send an email on each submission you could use the MailApp service and add something like:

MailApp.sendEmail("youremailaddress", "a subject", JSON.stringify(e.parameters));

in the try statement. If you do this there are a couple of things to remember. First, Apps Script web apps use versioning. This means changes to your script are not 'live' until you push a new version. To do this you need to save your new script, then from the Script Editor select File > Manage versions... and 'Save New Version' before going into Publish > Deploy as web app and updating the Project Version. Also, when you add new services to your script the authentication scope changes and you need to approve the additional services. For example, if you add the MailApp service to your code you need to give permission to send email. The easiest way to trigger this in this example is in the Script Editor via Run > setup. I'm sure there are other trip-ups but hopefully this gets you most of the way.

Google Sheet/Apps Script Code

//  1. Enter sheet name where data is to be written below
        var SHEET_NAME = "Sheet1";
        
//  2. Run > setup
//
//  3. Publish > Deploy as web app 
//    - enter Project Version name and click 'Save New Version' 
//    - set security level and enable service (most likely execute as 'me' and access 'anyone, even anonymously')
//
//  4. Copy the 'Current web app URL' and post this in your form/script action 
//
//  5. Insert column names on your destination sheet matching the parameter names of the data you are passing in (exactly matching case)

var SCRIPT_PROP = PropertiesService.getScriptProperties(); // new property service

// If you don't want to expose either GET or POST methods you can comment out the appropriate function
function doGet(e){
  return handleResponse(e);
}

function doPost(e){
  return handleResponse(e);
}

function handleResponse(e) {
  // shortly after my original solution Google announced the LockService[1]
  // this prevents concurrent access overwriting data
  // [1] http://googleappsdeveloper.blogspot.co.uk/2011/10/concurrency-and-google-apps-script.html
  // we want a public lock, one that locks for all invocations
  var lock = LockService.getPublicLock();
  lock.waitLock(30000);  // wait 30 seconds before conceding defeat.
  
  try {
    // next set where we write the data - you could write to multiple/alternate destinations
    var doc = SpreadsheetApp.openById(SCRIPT_PROP.getProperty("key"));
    var sheet = doc.getSheetByName(SHEET_NAME);
    
    // we'll assume the header is in row 1 but you can override with header_row in GET/POST data
    var headRow = e.parameter.header_row || 1;
    var headers = sheet.getRange(headRow, 1, 1, sheet.getLastColumn()).getValues()[0];
    var nextRow = sheet.getLastRow()+1; // get next row
    var row = []; 
    // loop through the header columns
    for (i in headers){
      if (headers[i] == "Timestamp"){ // special case if you include a 'Timestamp' column
        row.push(new Date());
      } else { // else use header name to get data
        row.push(e.parameter[headers[i]]);
      }
    }
    // more efficient to set values as [][] array than individually
    sheet.getRange(nextRow, 1, 1, row.length).setValues([row]);
    // return json success results
    return ContentService
          .createTextOutput(JSON.stringify({"result":"success", "row": nextRow}))
          .setMimeType(ContentService.MimeType.JSON);
  } catch(e){
    // if error return this
    return ContentService
          .createTextOutput(JSON.stringify({"result":"error", "error": e}))
          .setMimeType(ContentService.MimeType.JSON);
  } finally { //release lock
    lock.releaseLock();
  }
}

function setup() {
    var doc = SpreadsheetApp.getActiveSpreadsheet();
    SCRIPT_PROP.setProperty("key", doc.getId());
}

I suppose I should start with why you would want to do this. Every time I join a Blackboard Collaborate session it’s like stepping back in time. Besides the appearance, the technology in Collaborate is increasingly out of step with current trends. The software is built on Java, which is claimed to be on over 3 billion devices. Unfortunately for a number of users those 3 billion devices often don’t include the computer on their desk. Here is where the problems start, as without enough permissions you won’t be able to install Java. To Blackboard’s credit they have spent time developing mobile apps, but not everyone is going to be able to use these either.

Aware of these barriers, for ocTEL we decided to investigate streaming Collaborate sessions to YouTube. The main advantage for us in getting this to work is that as well as being able to give an alternative means to watch the session, we immediately have a recording to share with those who missed it. You can see the results in this session from week 3 of ocTEL.

In this post I’ll outline the technique we use, which can also be applied more generally to any desktop application. It’s also worth highlighting that this is just one of many ways of streaming your desktop and you could achieve similar results using a Google Hangout On Air or the free ‘Wirecast for YouTube’ software (Mac|Windows). The reason we didn’t go down that route was we wanted more control over the part of the screen being shared and we didn’t want to have to buy a Wirecast Pro license.

...continue reading

Posted in How-to, Streaming.

Update 24/06/2014: Recording of the session currently here

As well as talking about Google {Learning} Analytics at the Google Apps for Education European User Group Meeting (GEUG14) at the University of York, I’m also doing a session on Google Apps Script. The abstract I submitted for the session was:

Recently Google announced add-ons, which allow anyone to enhance Google Documents and Sheets with customised features. Already there are a number of add-ons to support teaching and learning, such as bibliography and track-changes tools. Add-ons are developed in Google Apps Script. Apps Script is free for anyone with a Google account and not only lets you author your own add-ons but also automate your workflows within Google Apps, integrate with external APIs, and more. This talk will introduce users to add-ons, exploring some educational scenarios. As part of this we will discuss some threats and opportunities. We will then touch upon how add-ons are authored using Apps Script and highlight opportunities for personalised automation of workflows.

Having been to two other GEUG events I know the event attracts a diverse audience, from educators to administrators to developers. This is one of the event’s strengths, but for me it makes presenting something on Google Apps Script difficult, particularly thinking back to my own experience of starting with very little coding experience but still being able to make stuff.

A factor I also have to consider is that since submitting the session proposal Google has announced Classroom. Classroom is still in development but looks like a layer on top of Google Drive to administer assignments. This type of functionality has been achievable with Apps Script for a long time and I will definitely be highlighting the awesome work Andrew Stillman & Co. have been doing with scripts like Doctopus.

Another factor in my mind is the feedback I’m getting from my circles that add-ons are causing Google Apps admins headaches, in that there are currently no built-in fine-grained controls to restrict certain add-ons (3rd-party solutions are available). I don’t know how many admins are choosing to switch off add-ons on their Google Apps domain, and it’s one of the questions I’m looking for feedback on at the event.

All this has shaped the focus of my talk, but hopefully there is enough there for everyone. You can tune in to the streamed Google Hangout of the session on 23rd June at 3pm BST (see programme). The current slides for my talk are below:


Update 24/06/2014: Recording of the session currently here

For a while I’ve been interested in the intersection between Google Analytics (and Google’s other analytics reporting APIs, like YouTube’s) and the field of Learning Analytics. There are a number of features of Google Analytics, such as segmentation, A/B testing and event tracking, which I believe could potentially give useful insight into teaching and learning; see, for example, last year’s look at tracking and validating MCQs with GA.
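As an illustration, with the standard analytics.js snippet installed, recording a quiz answer as an event is a one-liner (a generic example, not the actual MCQ tracking code from that post):

// event category, action and label; these appear under Behaviour > Events in GA
ga('send', 'event', 'MCQ', 'answer', 'question-1-correct');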

One of the reasons for my interest in this area is the ubiquity of Google Analytics: the majority of institutions already use it for their main institutional websites. With that power, though, comes responsibility. Whilst Google Analytics usage policies prevent you using it to track personally identifiable information, you are still tracking people, which should never be forgotten.

The Google Apps for Education European User Group Meeting (GEUG14) at the University of York is another opportunity to road-test some of these ideas. Preparing for an event often not only sees me revisiting prior knowledge but becomes an opportunity to create something new. This can be a product, like the Google Analytics data bridge made for IWMW13, or new knowledge.

This time personal exploration has taken me into the land of Google Tag Manager. Those familiar with the mechanics of Google Analytics tracking will know that it usually requires adding code to every page you want to track, often achieved by modifying page templates. But what if those templates are hard or costly to edit, or you want to change what is tracked? This is where Google Tag Manager comes in. Like Google Analytics, you need to install some code in your pages; after that, what and how you track things becomes completely cloud based. Through Google Tag Manager you can add additional code/markup to your pages, even setting up rules to decide when it is used. Whilst Tag Manager is built around Google products like Analytics and Ads, you can use it for other purposes. This video gives you an overview of Google Tag Manager.
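For example, once the Tag Manager container snippet is installed, your pages only ever push events into the data layer; which tags fire in response is configured in the Tag Manager interface. The event name here is illustrative:

// push a custom event into the data layer; a Tag Manager rule can
// then fire, say, an Analytics event tag whenever it sees it
dataLayer.push({
  'event': 'videoPlay',
  'videoTitle': 'Week 1 webinar'
});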

Below are the slides from my session, which will hopefully be streamed via Google Hangout on 23rd June at 11:30am BST (see programme).

Posted in Analytics, GDE, Google.


Update 18/06/2014: The Open Badges Issuer Add-on is now also available in the WordPress Plugin Directory. Get the Open Badges Issuer Add-on

ALT’s Open Course in Technology Enhanced Learning (ocTEL) is entering its final week. ocTEL has been, and continues to be, an excellent opportunity to explore ways in which we support ALT’s community of members. Last year the work we did in setting up a blog and community aggregation site fed directly into the development of the ALT conference platform. This year one of the areas we were keen to explore was the digital badging of community contributions. The use of community badges is well founded and predates the web itself. The area has, however, gained extra interest from educators in part due to Mozilla Open Badges. Mozilla Open Badges specify a framework for the description and award of badges using a common specification. The main benefit of this approach is interoperability: recipients of Open Badges can collect badges in one place, manage them into collections and control how they are shared across sites, social networks and personal portfolios. One such place is a Mozilla Backpack.

In the case of ocTEL, the creation and award of digital badges, particularly within a community context, has been made very easy thanks to the BadgeOS™ plugin for WordPress. BadgeOS has a number of methods which trigger the awarding of badges, including reviewed submissions as well as the completion of a defined set of steps.

One issue for us has been that issuing Open Badges with BadgeOS requires integration with the badge awarding and display site Credly. Sites like Credly are very useful parts of the badge ecosystem, but our feeling was that if we were going to issue Open Badges we should take on the commitment of hosting the badge data ourselves rather than relying on a 3rd party. BadgeOS, regardless of whether you turn on Credly integration, still provides an excellent framework for creating and awarding digital badges. Even better, BadgeOS is open source and is actively encouraging developers to extend and enhance the plugin’s core functionality. If you look at the BadgeOS Developer Resources there are a number of ways this can be achieved.

With this in mind, and with the support of ALT, I decided to make my own contribution to BadgeOS with the development of the Open Badges Issuer Add-on. This add-on achieves two things:

  • Badges designed and awarded using BadgeOS are now exposed as Open Badges compliant Assertions - Assertions are the DNA of Open Badges. They are data files which describe the badge and identify who it has been awarded to. Behind the scenes the add-on takes the BadgeOS-created data and turns it into the required object recognised as an Open Badge (an illustrative assertion is included after this list). Critically, this data exists on the host site. For example, one of my ocTEL badges exists here and is shown below in a formatted view.

Open Badges Assertion

  • The creation of an interface for the user to add badges awarded using BadgeOS to their Mozilla Backpack - this is made technically much easier as the add-on uses the existing Issuer API, which provides most of the code to get the job done.
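For reference, a hosted assertion under the Open Badges 1.0 specification looks roughly like this; every value below is illustrative rather than a real ocTEL assertion:

{
  "uid": "octel-2014-check-in-week-1",
  "recipient": {
    "type": "email",
    "hashed": true,
    "salt": "octel",
    "identity": "sha256$..."
  },
  "badge": "http://octel.alt.ac.uk/badges/check-in-week-1.json",
  "verify": {
    "type": "hosted",
    "url": "http://octel.alt.ac.uk/assertions/12345.json"
  },
  "issuedOn": 1403568000
}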

The rollout of this plugin in ocTEL is complete, as detailed in ‘ocTEL digital badges are now Open Badges: How to add them to your Mozilla Backpack’. I’ve also submitted the plugin as a recognised BadgeOS add-on, and it will shortly be appearing in the official WordPress Plugin Directory. Hopefully this add-on will make it easier for others to issue Open Badges through their WordPress-powered site.

Like all projects, this development greatly benefits from feedback, whether coding suggestions or ideas for improved functionality, so please let us know what you think.

Download the Open Badges Issuer Add-on for BadgeOS