Feedback

6 Comments

I’ve mentioned the appear.in service a couple of times. It allows you to convene small meetings (up to 8 people) with video, voice and chat, without the need for logins or additional browser plugins, on both desktop and mobile (my quick video demo here). Today I got an email from appear.in saying:

Get notified when someone enters your room!

We have now made it even easier to start a video conversation. When someone enters your appear.in room, you will receive a desktop notification that you can click to enter the room.

How can you use notifications?

  • Get notified when someone shows up for a meeting
  • People who want to talk to you can just go into your room
  • Make sure everyone on your team is alerted when your team meetings start

Read more on our blog.

Rooms you follow
Notifications work using a Chrome extension, but once you have this installed you can monitor multiple rooms.

So if you wanted to run remote tutor support hours you could claim an appear.in room and enable notifications. Once you advertise your office hours you can monitor the room, get on with other work and wait for a notification.

Because appear.in allows you to ‘lock rooms’, if you are providing one-to-one support you can prevent someone else ‘walking in’.

The awkward bit is handling the locked room. There is no queuing service, and anyone visiting a locked room will be presented with the message below. Unfortunately, if someone visits a locked room and sees the locked message, the message doesn’t go away when the room is unlocked.

Locked room

A way around this might be to have two rooms – corridor and office. The corridor room would always be open. As people arrive in the corridor room you could greet them and invite them to your ‘office’ and lock the office during consultation. Once done you could go back to the ‘corridor’ room if anyone else is waiting. If the ‘corridor’ gets busy (more than 7) you’ll have to sit in it yourself or lose the ability to enter (unless as an owner you get priority).

[Writing this it’s all sounding very faffy. I’d imagine you could do something similar with Google Hangouts but I love the fact appear.in requires no login. What do you think?]

Posted in Feedback, Half baked, Mashup.

1 Comment

Here is some text I prepared for a possible Google Apps Developer blog guest post. It doesn’t look like it’s going to get published so rather than letting it go to waste I thought I’d publish here:


Martin Hawksey is a Learning Technology Advisor for the JISC funded Centre for Educational Technology and Interoperability Standards (JISC CETIS) based in the UK. Prior to joining JISC CETIS, and in his spare time, Martin has been exploring the use of Google Apps and Apps Script for education. In this post Martin highlights some features of a Google Apps Script solution which combines Google Spreadsheet and Google Documents to speed up and standardise personal feedback returned to students at Loughborough College.

One of the things that drew me to Apps Script over two years ago was the ease with which you could interact with other Google services. As a ‘hobbyist’ programmer I also found the combination of Google Spreadsheets and a coding syntax I recognised ideal.

Late last year, when I was approached by Loughborough College to take part in their ‘Fast Tracking Feedback’ project, I saw it as an ideal opportunity to get staff using Apps Script and showcase the possibilities of Apps Script to the Google Apps for Education community.

The goal of the project was to produce a mechanism that allows tutors to input assignment grades using a custom UI that mirrors the final feedback sheet, or to enter details directly into a Google Spreadsheet. These details are then pushed out as individually personalised Google Documents shared with the student. This sounds relatively simple, but the complication is that each assignment needs to map to a predefined set of rubrics which vary between units. For example, in one course alone there are over 40 units, and every unit can be assessed using multiple assignments with any combination of predefined criteria across pass, merit and distinction.

Below is an example student feedback form highlighting the regions that are different for each assignment.

Example student feedback form highlighting the regions that are different for each assignment

The video below shows a demonstration of how the current version of the ‘Fast Tracking Feedback’ system is set up and used:

Solution highlights

A number of Apps Script Services have been used as part of this project. Let’s look at how some of these have been implemented.

DocList Service – The self-filing Google Spreadsheet

The eventual plan is to rollout the Fast Tracking Feedback system to teaching teams across the College. To make the life of support staff easier it was decided to use a common filing structure. Using a standardised structure will help tutors stay organised and aid creation of support documentation.

When a tutor runs the setup function on a new feedback spreadsheet it checks that the correct folder structure exists (making it if not) and moves the current spreadsheet into the pre-defined collection.


Self-generating folder structure and organization

The code that does this is:

// code to generate the folder structure and move the spreadsheet into the right location
// + ROOT_FOLDER
// |- SPREADSHEET_FOLDER
// |- DRAFT_FOLDER
// |- RELEASED_FOLDER
var rootFolder = folderMakeReturn(ROOT_FOLDER); // get the system root folder (if it doesn't exist, make it)
// create/get draft and release folders
var draftFolder = folderMakeReturn(DRAFT_FOLDER, rootFolder, ROOT_FOLDER+"/"+DRAFT_FOLDER);
var releaseFolder = folderMakeReturn(RELEASED_FOLDER, rootFolder, ROOT_FOLDER+"/"+RELEASED_FOLDER);
var spreadsheetFolder = folderMakeReturn(SPREADSHEET_FOLDER, rootFolder, ROOT_FOLDER+"/"+SPREADSHEET_FOLDER);

// move the spreadsheet to the spreadsheet folder
var file = DocsList.getFileById(SpreadsheetApp.getActiveSpreadsheet().getId());
file.addToFolder(spreadsheetFolder);

// function to see if a folder exists in DocsList and return it
// (optionally, if it doesn't exist then make it)
function folderMakeReturn(folderName, optFolder, optFolderPath){
  try {
    if (optFolderPath != undefined){
      var folder = DocsList.getFolder(optFolderPath);
    } else {
      var folder = DocsList.getFolder(folderName);
    }
    return folder;
  } catch(e) {
    if (optFolder == undefined) {
      var folder = DocsList.createFolder(folderName);
    } else {
      var folder = optFolder.createFolder(folderName);
    }
    return folder;
  }
}

UI Service – Hybrid approach

A central design consideration was to make the Fast Tracking Feedback system easy for College staff to support and change. Consequently, wherever possible the Apps Script GUI Builder was used to create as much of the user interface as possible. Because of the dynamic nature of the assessment rubrics, part of the form is added programmatically by selecting an element holder and adding labels, select lists and textareas. Other parts of the form, like the student information at the top, can be added and populated with data by using the GUI Builder to insert textfields which are named using normalized names matching the spreadsheet column headers. The snippet of code that does this is:

app.getElementById(NORMHEADER[i]).setText(row[NORMHEADER[i]]);

Where NORMHEADER is an array of the normalized spreadsheet column names and row is a JavaScript Object of the row data generated based on the Reading Spreadsheet data Apps Script Tutorial.
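The normalization step itself isn’t shown above. A minimal sketch of the convention used in the Reading Spreadsheet data tutorial (lower camelCase, alphanumeric characters only) might look like the following; the function name is my own:

```javascript
// Sketch of header normalization: strip non-alphanumeric characters and
// camelCase the words, e.g. "Student Name" -> "studentName", so each
// spreadsheet column header maps cleanly onto a GUI Builder widget name.
function normalizeHeader(header) {
  var key = "";
  var upperCase = false;
  for (var i = 0; i < header.length; i++) {
    var letter = header[i];
    if (letter === " " && key.length > 0) {
      upperCase = true; // capitalise the start of the next word
      continue;
    }
    if (!/[A-Za-z0-9]/.test(letter)) {
      continue; // drop punctuation and other symbols
    }
    if (key.length === 0 && /[0-9]/.test(letter)) {
      continue; // keys shouldn't start with a digit
    }
    key += upperCase ? letter.toUpperCase() : letter.toLowerCase();
    upperCase = false;
  }
  return key;
}
```

Running each column header through a function like this gives you the NORMHEADER array referred to above.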

Hybrid UI construction using GUI Builder and coding

Document Services – Master and custom templates

The process for filling in personalized feedback forms has three main steps. First, a duplicate of the Master Template is made and given a temporary name (DocList Services). Next, the required assessment criteria are added to the form using the Document Services, mainly the TableCell Class. Parts of the document that are going to be filled with data from the spreadsheet are identified using a similar technique to the Apps Script Simple Mail Merge Tutorial. Finally, for each student the assignment-specific template is duplicated and filled with their personalised feedback.

if (email && (row.feedbackLink =="" || row.feedbackLink == undefined)){
  // Get document template, copy it as a new temp doc, and save the Doc's id
  var copyId   = DocsList.getFileById(newTemplateId)
                         .makeCopy(file_prefix+" - "+email)
                         .getId();
  var copyDoc  = DocumentApp.openById(copyId);
  // move the doc to the tutor's folder
  var file = DocsList.getFileById(copyId);
  var folder = DocsList.getFolder(ROOT_FOLDER+"/"+DRAFT_FOLDER);
  file.addToFolder(folder);

  // select the document body
  var copyBody = copyDoc.getActiveSection();

  // find the editable parts of the document
  var keys = createKeys(copyDoc);

  // loop through elements replacing placeholder text with values from the spreadsheet
  for (var j in keys) {
    var replacementText = ""; // set the default replacement text to blank
    if (row[keys[j].id] != undefined){ // if the column value is defined use it
      replacementText = row[keys[j].id];
    }
    copyBody.replaceText('{%'+keys[j].text+'%}', replacementText); // replace text
  }
  copyDoc.saveAndClose();

  // create a link to the document in the spreadsheet
  FEEDSHEET.getRange("B"+(parseInt(i)+startRow)).setFormula('=HYPERLINK("'+file.getUrl()+'", "'+copyId+'")');
  FEEDSHEET.getRange("C"+(parseInt(i)+startRow)).setValue("Draft");
  // you can do other things here like email a link to the document to the student
}
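The createKeys() helper referenced above isn’t listed. Here is a hypothetical sketch, assuming placeholders in the template take the form {%columnName%}; it operates on a plain text string (in Apps Script you would pass it copyDoc.getText() rather than the document object):

```javascript
// Hypothetical sketch of a createKeys() helper: scan the template text for
// {%...%} placeholders and return one {text, id} entry per unique marker.
// In this sketch id is simply the placeholder name, matching the normalized
// spreadsheet column headers.
function createKeys(docText) {
  var keys = [];
  var seen = {};
  var pattern = /\{%([^%]+)%\}/g;
  var match;
  while ((match = pattern.exec(docText)) !== null) {
    var name = match[1];
    if (!seen[name]) {       // ignore repeated placeholders
      seen[name] = true;
      keys.push({ text: name, id: name });
    }
  }
  return keys;
}
```

Each returned key can then be used both to look up the row value and to build the {%...%} search string passed to replaceText().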

Currently the system is configured to place generated feedback forms into a draft folder. Once the tutor is happy for the feedback to be released, either individual or class feedback forms are distributed to students from a menu option in the feedback spreadsheet for the assignment, with a record kept of the status and location of each document.

Easy record keeping

Next steps/Get the code

The Fast Tracking Feedback System is currently being piloted with a small group of staff at Loughborough College. Comments from staff will be used to refine the system over the next couple of months. The current source code and associated template files are available from here.

The Fast Tracking Feedback project is funded by the UK’s LSIS Leadership in Technology (LIT) grant scheme.

3 Comments

Note: This is a personal post made outwith my current employment at JISC CETIS

Back in May 2011 Tony Hirst looked at the Visual UI Editor for Google Apps Script and commented that he thought before long I would have posted something about it. Well, almost a year later, here’s what I’ve got for you. As part of the Fast-tracking feedback project (funded by the LSIS Leadership in Technology grant) with Loughborough College I ran a training session at the beginning of the month to help staff learn about user interface construction in Google Apps Script. The session follows on from one of my earlier blog posts releasing some code to batch fill in Google Docs from a spreadsheet of feedback comments. As part of the session I produced a step-by-step guide for creating a Google Sites based form/gadget that could read and write data to a spreadsheet. As part of the project this guide is available for re-use using the link below. Before you download/use it, a couple of things are worth bearing in mind:

  • the guide has been tweaked slightly for publication and as a consequence I may have inadvertently broken it. If you find something is wrong, leave a comment in the document
  • in the guide an image is used to help you lay out a form and is not intended to be part of the final navigation

Introduction to Google Apps Script: Custom Interfaces Guide

4 Comments

In October last year the Sport Learning Technologists at Loughborough College successfully won funding from the LSIS Leadership in Technology (LIT) grant scheme for the Fast-tracking feedback using Google Scripts project. Here’s an extract from the project summary:

This project will effectively combine Google Apps for Education and Google Apps Script in order to create a tool which allows tutors to enter grades and feedback in a single spreadsheet which then automatically populates individual feedback proforma, simultaneously sharing these results with students, progress tutors, and administrators as appropriate.

The benefit will be an increase in the efficiency with which assessment feedback can be shared, improving the speed and quality of paper-less student feedback. A successful conclusion to this project will be demonstrated by reduced submission turnaround times and a reduction in the errors brought about by inconsistencies in data entry.

Project funding is not just for deploying technology but also for increasing capacity within the organisation at the operational level. With this in mind I have been working with Loughborough, helping them with the technical aspects of developing the Fast-Tracking Feedback System and also to learn about Google Apps Script via a series of workshops. Friday was the first of these and I thought I’d share the story so far.

The Loughborough group had already got off to a flying start, successfully modifying My #eas11 world premier: Creating personalised Google Documents from Form submissions. 5 months is a long time in Google Apps Script, and since then not only is there some new functionality in Apps Script, but I’ve also picked up some new tips. My own understanding has come on a long way thanks to receiving a preview copy of Google Script: Enterprise Application Essentials by James Ferreira [due out 27th January]. I’ve been a regular visitor to James’ simpleappssolutions.com site and tutorials, so I wasn’t sure if his book would teach me much more, but how wrong I was. Part of the reason for this, I believe, is that the book is geared towards ‘enterprise applications’ and so concentrates on documents and workflows, just as assessment in education (for better or worse) is concentrated on documents and workflows.

So below are two links to the current version of the Google Apps Script Spreadsheet and an example Document template, followed by a quick video to show how it is used. Obviously these are still a work in progress, as there are still 6 months to run on the project, but there’s already enough there for others to benefit from and perhaps feed back on the design.

Stay tuned for more developments

Last week my colleague, Kenji Lamb, and I were up in Inverness providing some support to the University of the Highlands and Islands (UHI) EDU Team. We were exploring the use/approach to assessment and feedback, sharing what is going on in the sector for the EDU Team to disseminate around UHI. Below are a couple of slide decks I used over the two days.

Having worked on the REAP project a couple of years ago there was a bit of material I recycled from that (the ripples from this project are still resonating, finding their way into publications like Effective Assessment in a Digital Age and workshop/design tools like the JISC funded Viewpoints project). Note to self: must write about Viewpoints once the online tool is available.

I also took the opportunity to test drive my idea for a Google Form/Visualization mashup for electronic voting (couldn’t be bothered lugging voting handsets up was my excuse ;). Technically it worked reasonably well. One major improvement would be to monitor how many people had voted in real-time.

1 Comment

Previously I’ve promoted the use of audio and video feedback on student work. Methods I’ve highlighted include creating audio and video files using a wide range of software tools and distribution methods. (At this point I would normally direct you to my Student Audio Feedback: What, why and how post but recently rediscovered ALT-C 2009 II: Audio and screen visual feedback to support student learning (and research methodologies), which is pretty good)

Recently a member of staff from one of our supported institutions, interested in the use of this form of feedback, contacted me with concerns over students reposting personal feedback in the public domain; i.e. just as a tutor respects a student’s privacy by not publishing a student’s work without permission, shouldn’t students do the same? In particular they were wondering if any student declaration was needed to prevent this from happening.

My initial response was along the lines that any feedback produced by the tutor would remain the intellectual property of the institution, and any public reposting would automatically need the consent of the institution; therefore all the tutor needs to do is highlight the existing legal position rather than having students make any extra declarations. But as I wasn’t completely sure of my interpretation of IPR I put a query to JISC Legal, and here is the response I got (Disclaimer: The following text is provided as information only and does not constitute formal legal advice):

The recording of the feedback given by the lecturer will either belong to the lecturer or the institution.  S.11(2) of the Copyright, Designs and Patents Act 1988 provides that the employer will be the first owner of copyright, unless there has been an agreement otherwise.  It could be that there is sufficient ‘dramatic’ content in giving the feedback that there is a performer’s right in the recording too, which would stay with the academic, unless there is agreement to transfer those to the institution.

In any case, the student would need to get permission before doing any of the copyright-restricted acts, which would include copying the work, adapting it, and communicating it to the public by internet dissemination in this particular case.  It may be worth reminding the students of this, and I’d suggest including an explanation that the feedback is personal and given within the teaching relationship, and so dissemination of the work would be disrespectful as well as copyright infringement.  Beyond the legal issue, it might also be worthwhile addressing the underlying reasons why the student or students might want to share the feedback – is there a need for more generic feedback that can be shared more widely?

So generally speaking my guidance was along the right lines, but the information from JISC Legal not only identifies particular nuances of the legal implications but also highlights how the risk of running into problems can be mitigated, addressing some of the fundamental pedagogy. It’s hard to see how advice like this could get any better.

This isn’t the first time JISC Legal have provided some first-rate guidance and if you haven’t checked out their service it’s well worth an explore. Before you think this level of support is only available to other JISC Advance and JISC related staff it’s not. JISC Legal endeavour to support anyone in the UK tertiary education sector “to ensure that legal issues do not become a barrier to the adoption and use of new information and communications technologies”.

As well as individual guidance JISC Legal have a wealth of support material. Recent goodies include:

JISC Legal = pure quality btw

4 Comments

On the 28th May 2009 I wrote a post on Generating Student Video Feedback using ScreenToaster. As ScreenToaster is now ‘toast’ I thought I’d repost, highlighting screenr instead. As the process for using ScreenToaster/screenr is so similar I haven’t re-recorded the demo video, but hopefully you’ll get the idea (I’m glad I downloaded the original and put it on vimeo ;)


As I’ve recently revisited on generating audio feedback it seemed timely, particularly with a request from UHI coming into my inbox, to also have another look at video feedback. Russell Stannard recently won a Times Higher Education Award partly for his work in this particular area. In Russell’s work he uses screen capture software to record feedback on electronic submissions of student work. More information on this technique is available in a case study Russell prepared for the Higher Education Academy English Subject Centre on Using Screen Capture Software in Student Feedback. An example of using this technique is also available - click here for a short example of video feedback.

In my original post I highlighted Using Tokbox for Live and Recorded Video Feedback as a possible solution for distributing video feedback. At the time I felt there were two niggling issues with using Tokbox. First there was the requirement to install the ManyCams software to allow you to display your desktop, and secondly Tokbox was very slow in uploading video you had recorded. For live video feedback Tokbox might still be worth considering, but shortly after publishing the post I discovered ScreenToaster; for recorded feedback you might now do better with screenr.

Screenr allows you to record your desktop without installing any software. It’s very easy to set up, and the videos you create can be immediately uploaded, allowing you to decide how you want to distribute and share them. [You can also publish them directly to YouTube and/or download the video in MP4 format.] The following video shows you how easy it is to set up and highlights some of the useful features. Even if you are not interested in delivering video feedback to students, this is still a great site for recording other material like demonstrations of software.

ScreenToaster Screencast 
Example of using ScreenToaster to deliver video feedback on student submitted work from Martin Hawksey on Vimeo


A long, long, long time ago I wrote a post Using Tokbox for Live and Recorded Video Feedback in which I demonstrated how the free ManyCam software could be used to turn your desktop into a virtual webcam to provide feedback on students work in a Russell Stannard styley. Recently my colleague Kenji Lamb was showing me how you could directly record your webcam using YouTube, so I thought I would revisit this idea.

This time, instead of focusing on the use of the visual element as a tool to direct students’ attention to a specific part of an assessment submission (e.g. highlighting and talking about parts of a Word document), I thought it would be interesting to demonstrate it in a more abstract way, using images to reinforce audio comments (e.g. you did well – happy face; you did badly – sad face).

When previously looking at audio feedback I’ve been very aware that reducing the administrative burden as much as possible is very important. Online form filling, whether it be through the VLE, other systems or in the YouTube example, can be a bit of a chore, so in this demonstration I also touch upon using bookmarklets to remove some of the burden. Here is a link to the bookmarklet I created for student feedback on YouTube (YouTube Feedback Template – you should be able to drag and drop this to your bookmark toolbar, but if you are reading this through an RSS reader it might get stripped out).

Having this link in your toolbar means that when you get to the video settings you can click it to populate the form. Bookmarklets are a nice tool to have in the chest, so I’ve covered them in more detail in Bookmarklets: Auto form filling and more. This post also shows you how you can create your own custom form-filling bookmarklet using Benjamin Keen’s Bookmarklet Generator.
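As a rough illustration of what a generator like that produces, here is a sketch of building a form-filling bookmarklet string in plain JavaScript. The field names and boilerplate text below are made up for illustration, not the actual names used by the YouTube form:

```javascript
// Sketch: build a "javascript:" bookmarklet URL that fills named form fields
// with preset values when clicked. Field names here are hypothetical.
function makeFillBookmarklet(fields) {
  var statements = [];
  for (var name in fields) {
    // JSON.stringify safely quotes/escapes the preset value
    statements.push(
      "document.getElementsByName('" + name + "')[0].value=" +
      JSON.stringify(fields[name]) + ";"
    );
  }
  return "javascript:(function(){" + statements.join("") + "})();";
}

// Example: a bookmarklet pre-filling a title and description field
var bookmarklet = makeFillBookmarklet({
  title: "Feedback on your assignment",
  description: "Video feedback recorded by your tutor"
});
```

The resulting string can be set as the URL of a bookmark; clicking it on a page with matching fields fills them in one go.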

So here it is, a quick overview of using YouTube for recording student feedback:

[utitle mode=2 comment=yes]lQ0KclIhfl8[/utitle]

1 Comment


For the next post in my ALT-C series I’m going to highlight a session I didn’t actually attend, but immediately regretted missing when comments started filtering in on twitter.

The session was based around the paper by Rodway-Dyer, Dunne and Newcombe from the University of Exeter, which summarises a study of audio and visual feedback used in two 1st year undergraduate classes. Click here for the paper and abstract.

Comments I picked up on this paper via twitter appeared to show audio feedback was not well received. Issues highlighted were:

  • the finding that “76% of students wanted face-to-face from a tutor in addition to other forms of feedback” [@adamread, @JackieCarter]
  • students found receiving negative comments by audio harder than in writing [@adamread, @ali818, @narcomarco]. Although this is still open to debate, as @gillysalmon said that the “duckling project at Leicester has found human voice easier to give negative feedback by audio than text”

Obviously there are issues with making assumptions based on a few 140-character tweets, and it should be noted that the authors conclude that “overall, it seems that there is considerable potential in using audio and screen visual feedback to support learning”, although students did express concerns in a number of areas.

Having had a chance to digest the paper the question I’m left with is how much of the negative experiences were a result of the wider assessment design rather than the use of audio feedback in itself. For example, reading the focus group discussions for audio feedback in geography I noted that:

  • students were not notified that they would be receiving audio feedback;
  • despite the tutor’s best attempts, students hadn’t engaged with the assessment criteria; and
  • this was the first essay students submitted at university level and they were unclear about the expected standards.

Similar issues to these were addressed in the Re-Engineering Assessment Practices (REAP) project, which produced an evolving set of assessment principles. Principles which could be successfully applied to the geography example might be:

Help clarify what good performance is – this could be achieved in a number of ways, including creating an opportunity for the tutor to discuss criteria with students, or perhaps providing an exemplar of previous submissions with associated audio feedback.

Providing opportunities to act on feedback – as this was the students’ first submission, providing feedback on a draft version of their essay would allow students to act on the feedback (it’s not surprising that students ignore feedback when they have no opportunity to use it).

Facilitates self-assessment and reflection - One of the redesigns piloted during REAP was the Foundation Pharmacy class, in which students submitted a draft using a pro-forma similar to that used by tutors to grade their final submission. Students were required to reflect on distinct sections of their essay, which again also allowed them to engage with the assessment criteria.

Encourage positive motivational beliefs – using the staged feedback described above would perhaps also address the issue of students becoming disillusioned.

Talking to a friend during the lunch break the research methodology used by the authors was also mentioned, in particular the use of ‘stimulated recall’. For this the authors played back examples of audio feedback to the tutor asking him to explain his thought processes and reflect on how his students would have responded to his comments. This methodology seems particularly appropriate to evaluate the use of audio feedback, and is something I want to take a closer look at.

A moment of serendipity

Whilst searching the twitter feed for comments on the session I noticed a tweet by @newmediac promoting a free webinar in which “Phil Ice shares research on benefits of audio feedback” (here’s the full tweet). The session has already passed, but the recording for this event is here.

Tweets - Moment of serendipity
Moment of serendipity

The presenter, Phil Ice, has been working on audio feedback in the US for a number of years and has a number of interesting findings (and research methodologies) I haven’t seen in the UK.

For example, Ice and his team report:

“students used content for which audio feedback was received approximately 3 times more often than content for which text-based feedback [was] received”

and that

“students were 5 to 6 times more likely to apply content for which audio feedback was received at the higher levels of Bloom’s Taxonomy than content for which text-based feedback was received”.

These results were from a small-scale study of approximately 30 students so aren’t conclusive. Ice has also conducted a larger study with over 2,000 students which used the Community of Inquiry Framework Survey. Positive differences were found across a number of indicators, although excessive use of audio to address feedback at lower levels is perceived by students as a barrier.

Ice has also conducted studies which break audio feedback into four types: global – overall quality; mid-level – clarity of thought/argument; micro – word choice/grammar/punctuation; and other – scholarly advice. These indicate that students prefer a combination of audio and text for global and mid-level comments.

Findings from Ice have been submitted for publication in the Journal of Educational Computing Research (which will soon feature a special issue on ‘Technology-Mediated Feedback for Teaching and Learning’).

Screenshot showing inline audio comments
Screenshot showing inline audio comments

Finally, I would like to mention the method Ice uses for audio feedback. He uses the audio comment tool within Acrobat Pro 8 to record comments ‘inline’. This appears to be particularly useful for helping students relate comments to particular sections of their submitted work. Click here for a sample PDF document with audio feedback (this isn’t compatible with all PDF readers – I’ve tested it in Acrobat Reader and Foxit Reader).

Hopefully this post has not only stimulated some ideas on the use of audio feedback, but also highlighted a range of methodologies to effectively evaluate it.

The National Student Survey results have been published by HEFCE, which has no doubt left school/department managers burning the midnight oil to see how they have fared. Feedback remains a talking point, with only just over half of Scottish students agreeing or strongly agreeing that feedback has been prompt, detailed and helpful.

But what about the students who neither agree nor disagree? If you turn the question around and ask what proportion of students disagree or strongly disagree with the level of feedback they receive, then you are looking at approximately a quarter of students. Obviously this is still a substantial number, and it still makes feedback the worst performing area, but if you are drilling down into course-level performance perhaps it is worth bearing in mind.

Table 1 below shows the results for the percentage of Scottish students who responded disagree or strongly disagree to the NSS questions.

Looking at how this analysis affects the overall satisfaction ranking of Scottish HEIs, the most notable changes are the University of Stirling and Robert Gordon University, who (by my calculations*) jump 2 rankings. Below, the first column is % agree/strongly agree, the second % disagree/strongly disagree, and (#) denotes rank.

University of St Andrews 91% (1) 3% (1)
University of Glasgow 91% (1) 5% (2)
University of Aberdeen 89% (3) 6% (4)
University of Stirling 88% (4) 5% (2)
University of Dundee 88% (4) 7% (5)
University of Strathclyde 87% (6) 7% (5)
Robert Gordon University 84% (7) 7% (5)
Glasgow Caledonian University 84% (7) 8% (8)
University of Edinburgh 82% (9) 9% (9)
Napier University 81% (10) 9% (9)
Heriot-Watt University 81% (10) 10% (11)
Glasgow School of Art 69% (12) 22% (12)

*Data provided by the NSS is susceptible to rounding errors. For example, the University of St. Andrews has an overall percentage agree for Q22 of 92%, yet the percentage breakdown is 35% agree and 56% strongly agree, which equals 91%. To allow comparison with the percentage of disagreement, the sum of the percentages of responses for agree and strongly agree has been used.
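The rounding effect noted in the footnote above can be made concrete with a quick calculation; the response counts below are made up for illustration:

```javascript
// Illustration of the rounding issue: counts that individually round to
// 35% and 56% can sum to a combined figure that rounds to 92%.
var total = 1000;
var agree = 354;          // 35.4% -> reported as 35%
var stronglyAgree = 562;  // 56.2% -> reported as 56%

// summing the two rounded percentages: 35 + 56 = 91
var reportedSeparately =
  Math.round(100 * agree / total) + Math.round(100 * stronglyAgree / total);

// rounding the combined percentage: 91.6% -> 92
var reportedCombined =
  Math.round(100 * (agree + stronglyAgree) / total);

console.log(reportedSeparately, reportedCombined); // 91 92
```

This is why the overall agree figure published by the NSS can differ by a point from the sum of the published breakdown.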

Table 1: Unofficial National Unsatisfied Student Survey (UNUSS)
Provisional sector results for Full-time students - Scotland Registered HEI (% of students who disagree/strongly disagree) extracted from HEFCE NSS 2009 Data

Question 2008 2009
The teaching on my course
1 - Staff are good at explaining things. 4 4
2 - Staff have made the subject interesting. 6 6
3 - Staff are enthusiastic about what they are teaching. 4 4
4 - The course is intellectually stimulating. 5 5
Assessment and feedback
5 - The criteria used in marking have been clear in advance. 17 15
6 - Assessment arrangements and marking have been fair. 11 10
7 - Feedback on my work has been prompt. 28 27
8 - I have received detailed comments on my work. 29 27
9 - Feedback on my work has helped me clarify things I did not understand. 27 25
Academic support
10 - I have received sufficient advice and support with my studies. 12 11
11 - I have been able to contact staff when I needed to. 6 7
12 - Good advice was available when I needed to make study choices. 13 12
Organisation and management
13 - The timetable works efficiently as far as my activities are concerned. 10 11
14 - Any changes in the course or teaching have been communicated effectively. 12 15
15 - The course is well organised and is running smoothly. 11 13
Learning resources
16 - The library resources and services are good enough for my needs. 12 11
17 - I have been able to access general IT resources when I needed to. 6 6
18 - I have been able to access specialised equipment, facilities or room when I needed to. 6 6
Personal development
19 - The course has helped me present myself with confidence. 7 6
20 - My communication skills have improved. 5 4
21 - As a result of the course, I feel confident in tackling unfamiliar problems. 5 5
Overall satisfaction
22 - Overall, I am satisfied with the quality of the course. 7 7