#LAK13: Recipes in capturing and analyzing data – Canvas Network Discussion Activity Data

In my last post I looked at the data available around a course hashtag from Twitter. For this next post I want to start looking at what’s available around the Canvas Network platform, which is being used to host Learning Analytics and Knowledge (LAK13). Sizing up what was available I did come across the Canvas LMS API documentation, which provides a similar method of accessing data as the Twitter API. I wasn’t sure if this extended to Canvas Network, and because the authentication method it uses (OAuth2) isn’t easy to handle with my dev tools of choice (mainly Google Apps Script), I looked for something else.

Whilst browsing on the discussion page for the course I noticed that my browser was auto-detecting a feed:


An excerpt of this feed is below:

<feed xmlns="http://www.w3.org/2005/Atom">
  <link href="https://learn.canvas.net/courses/33/discussion_topics" rel="self"/>
  <title>Learning Analytics and Knowledge Discussion Feed</title>
  <updated>2013-02-20T12:39:02+00:00</updated>
  <entry>
    <title>Discussion: Week 8 Discussion Forum</title>
    <id>tag:canvas.instructure.com,2013-02-01:/discussion_topics/discussion_topic_580</id>
    <updated>2013-02-01T21:15:20+00:00</updated>
    <published>2013-02-01T21:15:02+00:00</published>
    <author><name>George Siemens</name></author>
  </entry>
  <entry>
    <title>Discussion: Week 3 Discussion Forum</title>
    <id>tag:canvas.instructure.com,2013-02-01:/discussion_topics/discussion_topic_575</id>
    <updated>2013-02-01T21:15:56+00:00</updated>
    <published>2013-02-01T21:13:24+00:00</published>
    <author><name>George Siemens</name></author>
  </entry>
</feed>
Looking at the raw feed I could see it wasn’t limited (often feeds only contain the last 10 entries) and contained the entire content of messages.

Update: Below I use Google Apps Script to extract the data. Since then I've created a 'no code' solution that uses only existing spreadsheet formulas. There is an accompanying presentation (the webinar should eventually appear here)

Looking for an easy way to consume this I first turned to the importFeed formula in Google Spreadsheets, but unfortunately it only returned the last 20 results. A trick I’ve used in the past is to put feeds through Yahoo Pipes to get JSON/CSV to work with, but as handling dates this way isn’t straightforward I opted for some Google Apps Script to create a custom formula which fetches the feed from Canvas Network and enters the results into a sheet. The code for the main part of this is below:

function getCanvasDiscussions(url) {
  // Fetch the Atom feed from Canvas Network
  var response = UrlFetchApp.fetch(url);
  if (response.getResponseCode() == 200) {
    // Parse the feed and pull out the array of <entry> elements
    var d = Xml.parse(response.getContentText()).feed.entry;
    // First row is a header so the columns are labelled in the sheet
    var output = [['published','updated','title','author','link','id','content']];
    for (var i in d) {
      output.push([getDateFromIso(d[i].published.Text),
                   getDateFromIso(d[i].updated.Text),
                   d[i].title.Text,
                   d[i].author.name.Text,
                   d[i].link.href,
                   d[i].id.Text,
                   d[i].content.Text]);
    }
    // Returning a 2D array lets the function be used as a custom formula
    return output;
  }
}

The getDateFromIso is a subfunction I use quite often; it’s available in this stackoverflow answer. Adding the above code to a Google Sheet (via Tools > Script editor..) allows me to use a custom formula to fetch the data.
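For reference, here is a minimal sketch of what that helper does (the stackoverflow version is more defensive; this sketch assumes a well-formed ISO 8601 timestamp like those in the feed above):

```javascript
// Minimal sketch of a getDateFromIso helper: converts an ISO 8601
// timestamp such as "2013-02-01T21:15:02+00:00" into a JavaScript Date.
// Older Apps Script engines couldn't parse these with new Date() directly,
// hence the manual approach. Assumes the timestamp is well formed.
function getDateFromIso(string) {
  var d = string.match(
    /(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})(?:\.\d+)?(Z|([+-])(\d{2}):(\d{2}))/);
  // Interpret the date/time fields as UTC first
  var ms = Date.UTC(+d[1], +d[2] - 1, +d[3], +d[4], +d[5], +d[6]);
  // Then remove the timezone offset, unless the timestamp is already UTC ("Z")
  if (d[7] !== 'Z') {
    var offsetMin = (+d[9]) * 60 + (+d[10]);
    ms += (d[8] === '+' ? -1 : 1) * offsetMin * 60000;
  }
  return new Date(ms);
}
```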

Below is a quick look at the data returned (here it is published in a table). In the columns we have the published/updated dates, title, author, link, id and post content. As it goes this isn’t too bad. The big thing missing is threading: whilst we can see which topic a message is in, the reply structure is lost.

Canvas Network Discussion Data

Even with this, like the #lak13 Twitter dashboard from last week, I can quickly add some formulas to process the data and get an overview of what is going on (for the live view visit this spreadsheet – File > Make a copy if you want to edit).
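The processing those formulas do is nothing fancy – essentially tallying posts per person. As an illustration only (this helper is hypothetical; the live spreadsheet uses plain formulas), the same tally could be scripted against the array returned by getCanvasDiscussions:

```javascript
// Hypothetical helper (not in the spreadsheet): tally posts per author
// from the 2D array returned by getCanvasDiscussions above.
// Column 3 of each row holds the author name; row 0 is the header.
function countPostsByAuthor(rows) {
  var counts = {};
  for (var i = 1; i < rows.length; i++) {
    var author = rows[i][3];
    counts[author] = (counts[author] || 0) + 1;
  }
  return counts;
}
```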

Canvas Network Dashboard

This obviously isn’t a complicated analytic and it wouldn’t surprise me if the course tutors have something similar on the backend of Canvas Network. As a student it’s useful for me to see how I’m doing compared to others on the course and get a sense of who else is contributing. [Adam Cooper has a great post on How to do Analytics Right... with some tips he picked up from John Campbell, who is behind Purdue’s Signals Project, which fits in nicely here.]

Summary

So with a bit of ken and a couple of lines of code I can see how the #lak13 discussions are going. Again I’ve avoided any deep analytics, such as analysing what has been said, to whom, and at what time, but hopefully now that I’ve highlighted and freed the data you can do something else with it. Not being able to extract the conversation thread is a little disappointing as it would have been nice to fire up SNAPP or NodeXL, but I’ll have to save those for another day ;)

10 thoughts on “#LAK13: Recipes in capturing and analyzing data – Canvas Network Discussion Activity Data”

  1. Martin, this is wicked cool. I’m really digging the view you’ve put together there. Just so you know, as a student in the course you have API access to the discussion forums as well, which you could use to pull out the threading information you’re looking for.

    Check out the docs here: https://canvas.instructure.com/doc/api/discussion_topics.html

    You can generate an access token at the bottom of your user profile/settings page and use that to skip the OAuth flow.

    1. Post author

      I'm glad you like it Brian. Awesome tip on the access token! In a couple of minutes I've managed to start getting data back. Stay tuned for a follow up post ;)
      Martin

  2. Pingback:

  3. On

    Hi Martin,
    I just found your work. While I don't know how to replicate it on my computer (I don't even see the "Feed" pop up and I don't have any programming background – I would like to learn it if you have detailed instructions for a non-programmer, when you have time), the Canvas Network Discussion Dashboard you created is very informative. It allows me to see who participated in which topics. Based on that info, I can probably do a kind of content analysis on the topics. It's wicked. I assume the doc will be updated automatically, right? Is there anything I have to be cautious about when using it?

    Thanks a million for creating this. Appreciated.

    Cheers,
    On

    1. Post author

      Hi On - I always forget browsers seem to be moving away from telling you when they detect a feed. I use Chrome and the RSS Subscription Extension. The non-programmer way of using this feed is to use Yahoo Pipes. This allows you to put the feed in and get a csv file out. There are a couple of hidden tricks in doing this so I'll try and write it up at some point.

      Yes, the data gets automatically updated. There is a delay due to caching, but at most it's one day old, if not a lot fresher. There are the usual spreadsheet issues, like the ambiguity between value and formula, which means you can't do things like sort the dashboard using the built-in sort tools. The spreadsheet will also reach a point where it takes too long to read all the data. Other than that I find it's a useful playground.

      Thanks,

      Martin

  4. Pingback:

  5. Pingback:

  6. Pingback:

  7. Pingback:

Comments are closed.