Fortunately TwapperKeeper has been given until the 20th of March to comply, but this is still a blow for 4th party developers like myself, who build applications around the TwapperKeeper archive/API and will no longer have an easy way to grab historic tweets. Part of the problem is that Twitter provides no easy access to collections of tweets over 5 days old, necessitating services like TwapperKeeper, so its discontinuation is going to cause headaches for some.
So what about other services like my other Twitter mashup uTitle and Andy Powell’s Summarizr, which rely on the TwapperKeeper service? There are obviously other Twitter archive services out there, and TwapperKeeper even has the JISC-funded open source version yourTwapperKeeper, but all the online services potentially face the same problem of being prevented from making archived tweets reusable. One solution might be to develop a distributed cloud-based archive. For example, initially I thought of using a Google Spreadsheet hack for capturing tweets, and then sharing these archives with the community as a searchable index. Anyone fancy building an interface for this?
With 24 hours of video uploaded every minute to YouTube, your videos can quickly be lost within a sea of content. Not only this, but because videos have historically been difficult for search engines to catalogue, your drop in the ocean of content can become indistinguishable from everything else.
It’s not surprising therefore that the current kings of search and owners of YouTube, Google, announced in March 2010 that videos on YouTube would be auto-captioned. Whilst this announcement is pitched at improving accessibility for the hearing impaired, it also means there is wider accessibility in terms of how the videos are indexed and ultimately searched. Need proof? The following Google Search returns this video (which conveniently also highlights the value of captioning videos for search engine optimisation).
But what if you have conference videos or other educational resources, like lecture capture, which aren’t on YouTube? There are a number of options for captioning, including: using standalone voice recognition software, various caption/annotation tools, professional captioning, or just sitting down and manually writing captions in a text editor. All of these potentially have a cost associated with them. If only there was a way you could crowdsource captions … hold that thought.
As well as the rise in popularity of video, conference delegates are increasingly using the micro-blogging service Twitter to share ‘What’s happening’ with other participants as well as those further afield. For many this is becoming a valuable medium, allowing the individual to find a voice in a format which is usually dominated by whoever is standing at the front of the room. At the same time conference organisers are benefitting from what is usually thousands of tweets amplifying and raising the profile of the event.
The record of conference tweets is arguably a resource as valuable as any conference proceedings, papers, posters or videos, but the nature of a tweet means that if not consumed in the moment they can potentially lose context. And it is here that two worlds collide. Using what was said by the audience to caption a video of the presentation, contextualising ‘what’s happening’ with what happened.
The hypothesis was that providing a Twitter subtitle track would improve the discoverability of FOTE10 videos. Does it work? Well if anyone is ever searching for an “ed tech jackanory” there should be a happy ending.
In the last edition of RSC NewsFeed we posted Farewell Lectures? Donald Clark Stirs it Up, which highlighted Donald Clark’s “Don’t lecture me” keynote from ALT-C 2010. In this post we directed you to a related post on the Learning Conversations blog which mentioned that there was a lot of backchannel dialogue during Donald’s presentation by delegates in the room and those remotely watching via Elluminate. The use of Twitter at educational conferences has really taken off in the last couple of years, allowing an individual to find a voice in a format which is usually dominated by whoever is standing at the front.
The tweets from Twitter are not only a valuable historic record of the audience reaction but can also potentially improve the navigation and searchability of video resources. This is explained in more detail in Martin Hawksey’s (RSC Scotland North & East) guest post on the FOTE website, Making ripples in a big pond: Optimising FOTE10 videos with an iTitle Twitter track, in which he describes how and why he was able to combine conference videos with subtitles of the audience’s tweets.
The same technology has now been used with some of the video from ALT-C 2010. So you can now see what the audience ‘tweeted’ during Donald and Sugata Mitra’s keynotes.
In March 2009 Tony Hirst posted a solution for Twitter Powered Subtitles for Conference Audio/Videos on Youtube. A year and a half later, numerous evenings tweaking code, lots of support, advice and promotion from Tony, Brian Kelly and others, and we have come full circle. What began for me as a method to playback real-time tweets with the BBC iPlayer has returned to its origins, Twitter powered subtitles for a conference video on YouTube.
To date the examples of using the Twitter subtitling tool (iTitle), including Reliving ALT-C 2009 keynotes with preserved tweets, have focused on replaying externally hosted video content through this site using the JW Player. This method has allowed greater control over certain aspects like interface design and features like the timeline jump navigation. The disadvantage of this extra control is sustainability.
Whilst I’m very happy working for the RSC, there will come the day when I move on, or this website might disappear altogether, subsumed into another RSC system/service. If this were to happen there is no guarantee that iTitled videos would still be able to be replayed.
This issue has been at the back of my mind since the very beginning which is partly why from early on I made the iTitle code available for download (I should really update this version of the code). But there has been another solution which has been available since the very beginning but I’ve never had an example to demonstrate it. Just as Tony’s original post demonstrated how the SubRip (*.srt) subtitle file format could be uploaded as part of one of your YouTube videos, iTitle has had the ability to generate SubRip files almost since the very beginning.
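Generating a SubRip file from timestamped tweets is conceptually simple. As a rough illustration (the real iTitle is PHP; this is a Python sketch, and the tweet tuple layout and five-second display window are my own assumptions):

```python
def to_srt_time(seconds):
    """Format seconds-from-start as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round((seconds - int(seconds)) * 1000))
    total = int(seconds)
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def tweets_to_srt(tweets, display=5.0):
    """Build SubRip (.srt) text from (offset_seconds, author, text) tuples.

    Each tweet is shown for `display` seconds, or until the next tweet
    starts, whichever comes first."""
    tweets = sorted(tweets)
    blocks = []
    for i, (start, author, text) in enumerate(tweets):
        end = start + display
        if i + 1 < len(tweets):
            end = min(end, tweets[i + 1][0])
        blocks.append(
            f"{i + 1}\n{to_srt_time(start)} --> {to_srt_time(end)}\n"
            f"{author}: {text}\n")
    return "\n".join(blocks)
```

A file built this way can then be uploaded against a YouTube video using the site’s own caption upload feature, as described above.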
So in August when I saw ALT had uploaded videos from ALT-C 2009 to their YouTube channel I thought it would be a great opportunity to amplify keynotes from this year’s ALT-C and highlight YouTube’s built-in subtitling tools. So after some idea dropping (via Twitter of course) and some follow-up emails with Matt Lingard and other members of the ALT team, you can now enjoy Donald Clark’s and Sugata Mitra’s keynotes with the ability to see what was said on Twitter in YouTube (links for these at the end of this post).
If you watch these videos via the YouTube site you might need to turn the subtitles on by clicking the ‘cc’ button in the playback toolbar. Annoyingly there doesn’t appear to be a setting for the video which forces captions to play every time, instead YouTube remembers your last choice, but captions can be forced on when a video is embedded. Here is the YouTube help for this feature.
A nice feature of YouTube’s implementation of subtitles/closed captions is their interactive transcript which has a navigable list of the subtitle track, highlighting the active caption. Hopefully YouTube will get around to providing some sort of filtering/search solution like the one used in iTitle’s timeline jump navigation.
Unfortunately I won’t be able to attend ALT-C this year and will have to muddle on as a remote delegate, primarily surfing the conference Twitter stream.
Brian Kelly posted about the Use of Twitter at the ALTC 2009 Conference last year and, by all accounts, if ALT are able to video stream the keynotes again, combining these two channels should mean it will be practically like being there (but without the lunch queue ;-).
In Brian’s original post I noticed he mentioned the Twapper Keeper service (perhaps the first mention of this on his blog) and that he had created a notebook for #altc2009. Having missed Martin Bean and Terry Anderson’s keynotes and wanting to gear myself up for ALT-C 2010 I thought I’d see if I could relive the keynotes with the preserved Twitter stream using iTitle.
Knowing the Twitter archive was available, the next step was to see if I could find the video. On the official ALT-C 2009 keynotes page I saw they had the videos hosted on blip.tv. Unfortunately this wasn’t one of the video hosting sites currently supported by iTitle. This isn’t the first time I’ve had this problem, as I had to manually tweak the pages for the JISC10 Conference Keynotes with Twitter Subtitles. Rather than having to keep tweaking pages I thought a simple solution would be to just let the user define a URL for where a video is hosted (which works well with blip as they give direct links for videos in .flv and .mp4 format). So here are the videos with tweets (NB the jump navigation only works for loaded parts of the video).
Vice-Chancellor of the Open University
In Brian Kelly’s Captioned Videos of IWMW 2010 Talks post last month he had three suggestions for improvement: searchable collections; a ‘RESTful interface’ to link to specific tweets in the video; and better cross browser support.
If you try this link you’ll see the list of tweets is filtered, which is great, but … clicking on it doesn’t always jump to the appropriate part of the video. This is a limitation of the JW Player with Vimeo videos, which isn’t able to jump to parts of the video which haven’t been buffered yet (not a problem for YouTube videos mind you, because the way they are served is different).
So I think this ticks off the ‘RESTful interface’ for iTitle and uTitle.
In a follow-up post by Brian (Twitter Captioned Videos Gets Even Better) there was a useful discussion started by Anthony Leonard about whether the lag between a tweet being sent and what was being said in the video clip was distracting. As there are a number of factors which can affect the amount of time between starting to type a tweet and hitting the send button, synchronising both streams isn’t easy.
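The basic alignment is just an offset calculation: subtract the video’s start time from each tweet’s timestamp. A hedged Python sketch (the `typing_lag` correction is my own illustration of how a client might compensate for the typing gap, not how iTitle actually works):

```python
from datetime import datetime

def tweet_offsets(tweets, video_start, typing_lag=0.0):
    """Convert absolute tweet times to seconds-from-video-start.

    `tweets` is a list of (sent_at_datetime, text) pairs. `typing_lag`
    is a rough correction in seconds for the gap between starting to
    type and hitting send; tweets that would land before the video
    starts are dropped."""
    offsets = []
    for sent_at, text in tweets:
        offset = (sent_at - video_start).total_seconds() - typing_lag
        if offset >= 0:
            offsets.append((offset, text))
    return offsets
```

A per-tweet lag is obviously still a guess, which is exactly why the auto-mark idea below (timestamping when typing starts) is attractive.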
Tony Hirst then commented that the auto-mark feature he suggested for uTitle, which timestamps a tweet as soon as you start typing, could be modified for a ‘live event tweeting’ client. This got me thinking:
Once I include Vimeo support for uTitle you could upload a blank video which is, say, 90 minutes long. Load up the video in uTitle and when the session starts hit play. Then use the uTitle interface for making tweets (the Tony Hirst patent-pending auto-mark feature creating timestamps when the tweeter starts typing).
Once you are done, use Vimeo’s feature of being able to replace existing videos to swap the blank video with the actual video of the event, which will have the same video ID.
At this point you could either parse out the results into iTitle using the csv import feature or, even cooler, distribute a link to the uTitle clip so that other people can add their own comments.
[I know what I'm going to be doing tonight ;-)]
It took more than one night but uTitle now also supports Vimeo videos. I wonder if this method will be used for IWMW11 …
Having other people using iTitle is incredibly useful as it adds some participatory design to the development of the tool. It’s also been great to have Tony Hirst point out some useful tips and tricks. More about those later.
So first up …
Vimeo gets cross browser support with flash-based player playback. In March after I managed to get in-browser playback with YouTube videos Paul Hadley from Just Blogging contacted me and asked if something similar was possible with Vimeo. At the time I briefly tried getting them to run through the flash-based JW Player which was used with the YouTube vids but couldn’t get it to work. I probably gave up too early, instead being lured into playing with Vimeo’s beta HTML5 output. It was a great opportunity to learn about the possibilities of HTML5 video and emerging formats for subtitles, of course not forgetting about the joys of video codecs. Many a blog post has been written on HTML5 video and I won’t get into that here, the bottom line being the h.264 codec used by Vimeo is only available in a handful of browsers.
IWMW10 was a great stimulus to have another look at using Vimeo’s existing flash-based video. Had I done a bit more digging around back in March I probably would have come across Andrew Beveridge’s code posted on the JW Player forum. This code is able to convert a Vimeo video ID into a url for the source .mp4 file which can be played in the same JW Player used for YouTube content. This is a big step forward as it means Vimeo’s excellent hosting service, which even with a free account allows videos over 10 minutes, can now be mashed-up with Twitter content and available for viewing in *97% of desktop web browsers. Now that’s got to strengthen its use to enhance conference archives.
Twitter authentication for user protected archives Before IWMW10 Brian asked if there were any ‘gotchas’ he should be aware of with iTitle. I couldn’t think of any but it wasn’t long before Brian came up with one.
The way I set up iTitle for the in-browser video playback with subtitles was to create a cached copy of the XML caption file using the video ID as an identifier/filename. The problem here is that anyone could come along later and recreate a completely different subtitle file for the same video, overwriting the original. I’ve been able to write-protect Brian’s hard work by changing the file permissions on the server, but this model isn’t sustainable.
What I’ve decided to do is allow a user to login to iTitle via Twitter using the same TwitterOAuth code used in uTitle. When they are logged in their Twitter user ID is then used to create a unique subtitle file which only they can overwrite. For example, below is a version of one of the IWMW10 talks I created with just tweets from the @iwmwlive account (I did this by downloading the Twapper Keeper archive for #iwmw10 in .csv format and filtering the results in Excel, exporting as a .csv again then selecting this as the source in iTitle):
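Incidentally, the Excel filtering step could equally be scripted. A minimal Python sketch, assuming the Twapper Keeper export has a `from_user` column (the column name is an assumption):

```python
import csv
import io

def filter_archive(csv_text, account):
    """Return the rows of a Twapper Keeper-style CSV export that were
    tweeted by `account` (given without the @), case-insensitively."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if row["from_user"].lower() == account.lower()]
```

The surviving rows can then be written back out as a .csv and selected as the source in iTitle, exactly as with the Excel route.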
Pulling video metadata using oEmbed This one came from a tip from my partner in crime, Tony Hirst. After the ‘embed, embed, embed’ post Tony passed me a link to the oEmbed site. oEmbed is an open format implemented by a number of media hosting sites which makes it easy for 3rd parties to embed content with a single link. So for example, if I included a URL to a YouTube video and your site had the right plugins, it could convert this into a playable version of the clip.
I think when Tony was suggesting oEmbed he was thinking that I could expose an API for the iTitle site which would make it easy for people to embed the results on their site with a single URL. As I’m not aware of many 3rd party sites that use oEmbed yet for embedding content, it’s lower down my to-do list.
Instead I thought it would be an opportunity to grab some metadata from either YouTube or Vimeo to include with the iTitle content. I’ve kept it quite basic, so all I’m doing is pulling the video title and author. The main reason I wanted to do this was to add some meaningful tracking data to my Google Analytics (another tip from Tony which I haven’t implemented yet is Tracking YouTube Embedded Player Plays with Google Analytics).
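For the curious, an oEmbed lookup is just a GET request that returns JSON. A Python sketch of building the request and pulling out the two fields (the endpoint shown is YouTube’s public oEmbed endpoint; error handling and the actual HTTP call are omitted):

```python
import json
from urllib.parse import urlencode

def oembed_request_url(video_url, endpoint="https://www.youtube.com/oembed"):
    """Build the oEmbed request URL for a video page URL."""
    return endpoint + "?" + urlencode({"url": video_url, "format": "json"})

def video_metadata(oembed_json):
    """Pull the title/author pair from an oEmbed JSON response."""
    data = json.loads(oembed_json)
    return data.get("title"), data.get("author_name")
```

Vimeo exposes an equivalent endpoint, so the same parsing works for both hosts.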
Things still to do
Brian had some useful suggestions for further development including: searching for tweets across a collection of videos; and a RESTful interface to link to specific parts of a captioned video. The searching across a collection interests me the most and with enabling Vimeo content in JW Player this opens opportunities to use more of the player’s built-in features like playlists and clip control.
The other thing on my mind is the YouTube video timeline commenting tool, uTitle, could easily be extended to include Vimeo. Which leaves the question where will all this end …
Having added embed options for the Twitter subtitling of live events (iTitle), it made sense to include these with the Twitter/YouTube commenting tool (uTitle). As both these tools have some code in common, adding embed options for the player and player+navigation was relatively straightforward. So now you can embed Twitter commented YouTube videos in other sites.
But what about providing an interface to let people comment on videos from your own site? This is now also possible … kind of.
Using the same method for embedding the player with jump navigation in an <iframe>, it is possible to just make a bigger one which includes the whole of the screen. The issue I haven’t been able to resolve is that the user needs to authenticate via Twitter. This process unfortunately breaks the user out of the <iframe> before automatically redirecting them back to the comment interface on http://www.rsc-ne-scotland.org.uk/mashe/utitle/. My preference would be that if a user is making a comment via uTitle embedded on www.yourweb.com they stay on that site.
That just leaves the WordPress plugin. This was suggested by Dries Bultynck in response to the original uTitle post. As I plan on using commentable videos on this blog it seemed like a good idea to put something together on this. So here it is: the WordPress uTitle Plugin. This turns a custom shortcode into an embed/iframe.
Brian was interested in using iTitle to create twitter captioned versions of videos from IWMW10. Their plan was to use Vimeo to host the videos as it allows upload of videos greater than 10 minutes. This led me to update the iTitle code to include the timeline jump navigation which I originally developed for YouTube videos.
Whilst doing this it occurred to me that I should really get around to providing a way for users to direct link to the results page (something I had been meaning to do from the very beginning). What this means is if you are using the iTitle for in-browser playback of subtitled YouTube or Vimeo videos you can share the results with a direct link. So for example you can see Brian’s open address for IWMW10 at http://hawksey.info/ititle/v/id/13314385/ or the Google I/O 2010 Android Demo at http://hawksey.info/ititle/u/id/IY3U2GXhz44/
More importantly I thought it would also be useful to include the ability to embed the results in other websites. With the introduction of the timeline jump navigation, using the typical <embed> code you see with YouTube videos isn’t possible (I’m also using the HTML5 version of Vimeo videos, which doesn’t <embed> either).
I’ve instead opted to automatically generate some <iframe> code which is included in the display/result page. So using Brian’s speech again as an example, the resulting code generated to embed the video in your own website is:
<iframe style="border-bottom: medium none; border-left: medium none; width: 470px; height: 570px; overflow: hidden; border-top: medium none; border-right: medium none" src="http://hawksey.info/ititle/v/id/13314385/" frameborder="0" allowtransparency="allowtransparency" scrolling="no"></iframe>
To display just the video player with Twitter subtitles I was able to use <embed> code for the YouTube videos as they are Flash-based. The JW Player which I use for playback has a ‘viral plugin’ which can generate the embed code (and send email links). A big plus point is that it preserves the link to the Twitter subtitle file. The player-only version of Vimeo uses an <iframe> again. With all these embed options I leave it to the author to decide if they link back to the original.
An update on YouTube/Twitter commenting (uTitle) coming soon …
Previously when looking at Twitter subtitling of videos the focus has been on replaying the backchannel discussion with the archive video from live events. The resulting ‘Twitter Subtitle Generator’ has now been used to generate and replay the Twitter stream for programmes on the BBC iPlayer (some iPlayer examples), the JISC 2010 Conference (see Searching the backchannel with Twitter subtitles) and more recently as a way to enhance lecture capture. The founding premise behind all these examples, and the question originally posed by Tony Hirst, was how to allow a user to experience and replay the synchronous channels of what was said from the stage, and what was said about what was said from the audience (physical and virtual). Having looked at synchronous communication I was interested to extend the question and look at asynchronous communication (i.e. what was said about what was said after it was said).
My first step has been to experiment with the use of Twitter for timeline commenting on YouTube videos. The idea of timeline commenting of media isn’t entirely new and has been a feature of audio services like SoundCloud for a while. Likewise I’m sure the idea of integrating with the Twitter service as a method of capturing comments has also been used (but for the life of me I can’t find an example – another project perhaps).
The result is a prototype tool I’m calling uTitle. How it works is best explained in the video below:
As can be seen in the video, uTitle allows a user to make comments at any point in the video timeline. These comments are also captured and can be replayed and added to at a later point. The link below lets you try out uTitle for yourself (the paint is still wet, so if you come across any problems or have any feedback it is greatly appreciated – use the comments below).
A couple of features of uTitle are worth highlighting. Firstly, as demonstrated by the example link above, it is possible to directly link to a timeline-commented video, making sharing resources easier. Another important point is that because Twitter comments for YouTube videos are aggregated using the video ID, it is possible to use this data with other services (at one point I was considering short-coding the IDs to make less of an impact on the Twitter 140 character limit, but I wanted to make generated tweets as human readable as possible).
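The aggregation depends on being able to spot the video ID in a tweet. A Python sketch of that extraction step (a simplification of whatever parsing uTitle actually does, covering only the common youtube.com/watch and youtu.be URL forms):

```python
import re

# Match an 11-character YouTube video ID in either common URL form
_YT_ID = re.compile(r"(?:youtube\.com/watch\?v=|youtu\.be/)([A-Za-z0-9_-]{11})")

def extract_video_id(tweet_text):
    """Return the YouTube video ID mentioned in a tweet, or None."""
    m = _YT_ID.search(tweet_text)
    return m.group(1) if m else None
```

Because the full URL is kept in the tweet rather than a short-code, the tweet stays human readable and any service that can search Twitter can regroup comments by video.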
How it was done
For those of you interested here are a couple of the key development milestones:
Step 1 Identify a way to integrate with Twitter
I already knew Twitter had an API to allow developers to integrate with the Twitter service, so it was a case of finding a head start which I could build upon. As I do most of my coding in PHP I went straight to this section of the Twitter Libraries. Having tried a couple out I went for TwitterOAuth by Abraham Williams (mainly because it used OAuth and when I looked at the code I could understand what it was doing).
Step 2 Submit a form without page refresh
Something I’ve known is possible for a while but never needed. I knew I wanted to allow users to make comments via uTitle without refreshing the page and losing their place in the video. This post on Ask About PHP was perfect for my needs.
Step 3 Jot down the pseudo code
This is what I wanted uTitle to do:
Get YouTube video id
If video id doesn’t exist as a notebook on Twapper Keeper make one
Else get results from Twapper Keeper for video id
Get results from Twitter Search
Merge data and remove duplicates
Generate XML subtitle file from results
Display interface page
On comment submit to Twitter
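The middle steps of the pseudo code above boil down to a merge-and-sort. A Python sketch with the Twapper Keeper and Twitter Search calls stubbed out as placeholder callables (the real code is PHP, and the field names here are assumptions):

```python
def build_subtitle_feed(video_id, twapper, twitter_search):
    """Assemble a de-duplicated, time-ordered tweet list for a video.

    `twapper` and `twitter_search` are placeholder callables standing in
    for the real API clients; each takes a video ID and returns dicts
    with 'id', 'time' and 'text' keys."""
    archived = twapper(video_id)       # archive notebook for the video ID
    recent = twitter_search(video_id)  # anything not yet archived
    merged = {t["id"]: t for t in archived + recent}  # de-dupe on tweet ID
    return sorted(merged.values(), key=lambda t: t["time"])
```

The resulting list is what gets serialised into the XML subtitle file and fed to the interface page.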
Step 4 Put it all together Some late nights pushing bytes across the screen …
These examples demonstrate how it is relatively straightforward to extract part of the Twitter timeline.
Future development areas
Some quick notes on areas for further research/development:
Comment curation/admin – currently anything on the public timeline with a YouTube video is pulled in. A similar problem exists for the Twitter Subtitle Generator and it is something Tony and I have identified as a feature … but we just haven’t had a chance to implement a solution. Part of the reason for developing the prototype is to start finding use cases (i.e. find out where the ship is leaking).
Merging synchronous with asynchronous – basically how can the Twitter Subtitle Generator and uTitle be merged so comments can be collected post-event (the issue here is that there are two ways the subtitle timestamps would have to be generated, and distinguishing what was said from what was said about what was said).
Other video sources – I’m primarily interested in how uTitle might work with BBC iPlayer (particularly as the latest developments are exploring social networks – as highlighted by Tony).
Spamming your followers with comments – Interested to see if users are willing to use their main Twitter account for generating comments.
Hmm I think I may have bitten off more than I can chew …