Google Spreadsheets as a lean mean social bookmark/share counting machine

It’s been a whole social metric-y couple of weeks. First there was UKOLNEIM, where I got a chance to chat to Martin Weller about digital scholar dashboards, and yesterday I flicked through Nicola Osborne’s Listen, Repeat, Learn: How to use Social Media Conversations to Measure and Demonstrate Impact and Improve Engagement session at IWMW11.

In Nicola’s presentation she mentioned that they use a spreadsheet to aggregate stats for EDINA online activity (page views, social reactions etc.). This fitted in nicely with the ideas I was forming after talking to Martin Weller about centralising this type of information for digital scholars. There are already services like PostRank and Readermeter which can be used to start aggregating different parts of this web of data, but I’ve got bigger and more personal ideas in mind.

Perhaps not surprisingly, my glue for prototyping this is Google Spreadsheets. Yesterday I started looking at pulling social bookmark/share counts for specified URLs. What I’ve come up with (or found from elsewhere) is a bunch of Google Apps Script snippets for getting social bookmark/share counts as custom cell formulas.

This script has already been added to this spreadsheet. If you look in cell D2 you’ll see the formula =getFacebookLike(B2). What this does is query the Facebook API to see how many times the URL in B2 has been liked. Here’s the complete list of formulas this spreadsheet has access to:

  • getFacebookLike(url)
  • getFacebookShare(url)
  • getFacebookComment(url) 
  • getTweetCount(url)
  • getBuzzCount(url)
  • getPlusones(url) – this one was published by Tom Anthony
  • getLinkedInCount(url)
  • getDiggCount(url)
  • getDeliciousCount(url)
  • getStumbleCount(url)

Note: To use these formulas you need a copy of the Apps Script code mentioned earlier, or make a copy of the spreadsheet. The general pattern the functions follow is sketched below.
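
If you’d rather roll your own than copy the spreadsheet, here’s a minimal sketch of the pattern each of these custom functions follows: fetch a JSON count endpoint with UrlFetchApp, parse the response, and return the number so it lands in the cell. The Twitter count URL shown is the one that was around at the time and is only there for illustration (these endpoints come and go), so treat it as an assumption rather than a guarantee.

function getTweetCount(url) {
  // Fetch the count endpoint for the given URL (illustrative endpoint, not guaranteed current)
  var response = UrlFetchApp.fetch("http://urls.api.twitter.com/1/urls/count.json?url=" + encodeURIComponent(url));
  // Parse the JSON response and return the numeric count so it can be used directly in a cell
  var data = JSON.parse(response.getContentText());
  return data.count;
}

Once the function is saved in the spreadsheet’s script editor you can use it like any built-in formula, e.g. =getTweetCount(B2).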

Limitations

It’s worth noting that if you start using these formulas in a lot of cells you’ll start getting ‘-‘ as the returned value. This is because the code uses the UrlFetch service, which appears to be rate limited. So if you have a sheet with over 100 of these formulas in it, when it opens and each cell has finished ‘Thinking…’ and fetching its individual stats from the various service APIs, it runs out of UrlFetch calls and can’t get any more data (the sheet also appears to cache results, but I don’t know for how long).
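
I’m not reproducing the published script here, but one way that ‘-‘ placeholder can come about is by wrapping the fetch in a try/catch so a quota failure degrades gracefully instead of throwing an error into the cell. A rough sketch of that idea (the helper name is hypothetical, not from the original code):

function getCountSafely(apiUrl) {
  try {
    // UrlFetchApp.fetch throws once the fetch quota is exhausted (or the API is down)
    var response = UrlFetchApp.fetch(apiUrl);
    return JSON.parse(response.getContentText()).count;
  } catch (e) {
    // Surface a placeholder in the cell rather than an error message
    return "-";
  }
}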

Where next

For what I have in mind I need well over 100 calls to these formulas, so I’ll be looking for a different method, which might include using the Shared Count API to aggregate a number of these stats into one call.
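
As a rough idea of what that might look like: a single UrlFetch to SharedCount returns counts for several services at once, and a custom function that returns an array will spill those values across adjacent cells in the row. The endpoint and field names below are assumptions based on how the API was documented at the time and may well have changed since.

function getSharedCounts(url) {
  // One fetch to SharedCount instead of one fetch per service (endpoint assumed, may have changed)
  var response = UrlFetchApp.fetch("http://api.sharedcount.com/?url=" + encodeURIComponent(url));
  var data = JSON.parse(response.getContentText());
  // Returning a 2D array writes the values into adjacent cells of the same row
  return [[data.Twitter, data.Facebook.like_count, data.LinkedIn, data.StumbleUpon]];
}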

Some things I learned along the way

I hate handling callbacks, and this is how you create MD5 hashes.
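
On the MD5 front, Apps Script’s built-in Utilities.computeDigest will do the hashing; the only fiddly bit is converting the signed byte array it returns into a hex string. A quick sketch (not necessarily the exact snippet linked above):

function md5(input) {
  // computeDigest returns an array of signed bytes (-128..127)
  var bytes = Utilities.computeDigest(Utilities.DigestAlgorithm.MD5, input);
  var hex = "";
  for (var i = 0; i < bytes.length; i++) {
    // Normalise each byte to 0..255 and format as two hex characters
    var v = (bytes[i] < 0 ? bytes[i] + 256 : bytes[i]).toString(16);
    hex += (v.length === 1 ? "0" : "") + v;
  }
  return hex;
}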
