
Kin Lane has done some great work highlighting the importance of APIs in education. If you are unfamiliar with APIs, they are a way for separate programs to communicate and share information or perform actions. With the growing use of data in education I believe APIs are the only way to use that data effectively and efficiently. Kin's University of API white paper is a great starting point for more context.

Reading the white paper reminded me how important it is to get people to think beyond the webpage and consider the underlying data used to generate it.

Luke, view the source

Back in the day, discovering the work of Tony Hirst made this a real threshold concept for me. Five years ago unpicking the data powering the web felt a lot easier: there was usually only basic authentication required, if any. Now you usually have to do some sort of authentication handshake, an additional step that often lands you straight in codeland. Even if you don't do code there are still opportunities to explore APIs: any decent API service will have interactive documentation for developers or an API console. In a recent talk, which you can see the fragments of here, I highlighted what data is behind a tweet. If you'd like to explore the Twitter API in a non-cody way, here's how:

Interactively exploring the Twitter API

1. Go to the Twitter API console https://dev.twitter.com/rest/tools/console. Make sure you're logged in (you should see your avatar top right)

2. In the Authentication drop down select OAuth 1 - this will prompt you to sign in with Twitter


3. When bumped back to the page select statuses/show/{id}.json


4. After it prefills some details switch to the Template tab. In the id box enter a tweet id number, e.g. for my tweet https://twitter.com/mhawksey/status/591156241969319936 you'd just enter 591156241969319936, then hit the orange Send button


5. In the pane you should get a response. The main data bit starts:

{
  "created_at":

6. To get details of other tweets click the Template tab and enter another id.


7. If you are interested in other API calls you can make, click the Service box and select another.
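Once you're comfortable with the console, working with the response in code is a small step. As a minimal sketch in plain JavaScript (the response string here is a hand-made fragment; field names like created_at, text and user.screen_name are standard parts of a tweet payload):

var response = '{"created_at": "Wed Apr 22 12:00:00 +0000 2015", ' +
    '"text": "an example tweet", "user": {"screen_name": "mhawksey"}}';
// parse the JSON text into an object and pull out a few fields
var tweet = JSON.parse(response);
console.log(tweet.created_at + ' @' + tweet.user.screen_name + ': ' + tweet.text);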

Enjoy!

TAGSPresenter

Later today (2.30pm UTC) I'll be presenting at #oer15 about Twitter in open education (tune in here). As I wanted to highlight the network effect of Twitter, I wanted to engage not just the room but also leave 'footprints' for others to follow. I know people like Alec Couros and Alan Levine have done cool stuff live tweeting from Keynote. I've dabbled with doing similar in Microsoft PowerPoint but was never fully satisfied. Given Twitter now supports a number of embedded media formats I thought that rather than trying to fit Twitter into another presentation tool, I'd turn my live tweets into my slides.

And so TAGSPresenter is born! Using a Google Sheet as an editor, Google Drive to host images and a bit of Google Apps Script to glue it together, I've got my own Twitter-based presentation tool. I don't have time to write about how it was technically achieved, but if you want to peek under the hood of the hack my 'slides' are published here.

Tune in at 2.30 to see how it goes ;)

In a couple of weeks I’ll be talking about TAGS at OER15 (14:30-15:00, 15th April). Whilst parallel sessions aren’t going to be streamed I’ve got a couple of ideas for broadcasting my session. If I pull this off I’ll be co-tagging my presentation #oer15 #740.

My notes for structuring the session so far are:

  • Networks – shape, strength and characteristics
  • Networked education – making the connection
  • Footprints – seeing the connection
  • Twitter in Ed – activities, Adoption Matrix (Mark Sample) and examples
    “anyone to become an active participant in the conversation” Ross, 2012.
  • APIs – me speaky code
  • Twitter Search and the Twitter Search API
  • Anatomy of a tweet
  • TAGS/TAGSExplorer
  • TAGS in the wild
  • Context – SNA, ethics, vulnerability

You’ll see I’ve highlighted two items in that list and this is where you can help. If you have used TAGS to support your class/course I’d like to know:

  1. how does Twitter generally fit into the course? Are you using directed activities or is it more organic;
  2. how is TAGS used to support this? Post/ongoing SNA, situational awareness, …

As the session will mostly be me talking, some video or audio contributions would be great. If you've previously talked about Twitter/TAGS in education and a recording is available I'll happily look at it and see if there is something I can use.

Thanks!


There was a question that came up in the Google Apps Script G+ community about moving a row of data to another sheet. The person was reusing some code posted by Victor Yee back in 2012 which hooks into the onEdit event in Google Sheets. The idea is a Google Form is used to collect data into a Google Sheet. Someone then looks at the data entered and decides if it should be actioned. If yes, the data is moved to an appropriate sheet within the spreadsheet. The root of the problem appeared to be that not only has Google Apps Script changed a lot since then but so have Google Sheets and Forms. In particular it looks like new Sheets "Cannot cut from form data. Use copy instead.":

Cannot cut from form. Use copy instead

To use ‘copy’ instead in Victor’s code you would replace moveTo with:

s.getRange(rowIndex, 1, 1, colNumber).copyTo(target);

and add the line afterwards of

s.deleteRow(rowIndex);

which will delete the row just changed. I'm not sure why moveTo doesn't work. Perhaps there is a conflict between the onSubmit and onEdit events. Looking through Victor's code I was surprised he didn't use the onEdit event object fields available. For example:

  • e.source – a Spreadsheet object, representing the Google Sheets file to which the script is bound
  • e.range – a Range object, representing the cell or range of cells that were edited
  • e.value – the new cell value (e.g. 10); only available if the edited range is a single cell

… so I've reworked it into:

/**
 * Moves a row of data to another sheet when the value in column 6 is set to
 * "ok", using the sheet named by the value in column 4 as the destination.
 */

function onEdit(e) {
  // see Sheet event objects docs
  // https://developers.google.com/apps-script/guides/triggers/events#google_sheets_events
  var ss = e.source;
  var s = ss.getActiveSheet();
  var r = e.range;
  
  // to let you modify where the action and move columns are in the form responses sheet
  var actionCol = 6;
  var nameCol = 4;

  // Get the row and column of the active cell.
  var rowIndex = r.getRowIndex();
  var colIndex = r.getColumnIndex();
  
  // Get the number of columns in the active sheet.
  // -1 to drop our action/status column
  var colNumber = s.getLastColumn()-1;
  
  // if our action/status col is changed to ok do stuff
  if (e.value == "ok" && colIndex == actionCol) {
    // get our target sheet name - in this example the value in our name column
    var targetSheetName = s.getRange(rowIndex, nameCol).getValue();
    // if the sheet exists do more stuff
    if (ss.getSheetByName(targetSheetName)) { 
      // set our target sheet and target range
      var targetSheet = ss.getSheetByName(targetSheetName);
      var targetRange = targetSheet.getRange(targetSheet.getLastRow()+1, 1, 1, colNumber);
      // get our source range/row
      var sourceRange = s.getRange(rowIndex, 1, 1, colNumber);
      // new sheets says: 'Cannot cut from form data. Use copy instead.' 
      sourceRange.copyTo(targetRange);
      // ..but we can still delete the row after
      s.deleteRow(rowIndex);
      // or you might want to keep but note move e.g. r.setValue("moved");
    }
  }
}

which you can also get by making a copy of this sheet (Update: remember to open Tools > Script editor and then click Resources > Current project's triggers to add the onEdit event to the function). See also Michael's comment about using var s = e.range.getSheet();

A question came in on the Google Apps Script Google+ Community which in part was asking about parsing .csv files into a Google Sheet. I saw the question come in over my phone and, knowing it was very achievable, directed the person to the Interacting With Your Docs List tutorial on the Google Developers site as a starting point. In particular this tutorial included a function to parse a .csv file using regular expressions. Shortly after, +Andrew Roberts kindly pointed out that Apps Script includes a parseCsv method … doh. +Marcos Gomes then pointed out that the tutorial uses the Docs List Service that was deprecated on December 11, 2014 … doh. Given the tutorial I referenced was written in May 2010 it's not surprising, with all the updates to Apps Script since, that it's now out of date, and the tutorial itself should probably be carrying a deprecated notice. One of the really nice things about the Google Developers site is most, if not all, of the documentation is released under a Creative Commons Attribution 3.0 License. So for my penance below is a reworking of Tutorial: Interacting With Your Docs List CC-BY Google Inc. (Jan Kleinert).
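The heart of the fix is only a couple of lines. As a minimal sketch (the file name is hypothetical, and this uses DriveApp rather than the deprecated Docs List Service):

function importCsvToSheet() {
  // grab the first file in Drive called data.csv (hypothetical name)
  var file = DriveApp.getFilesByName('data.csv').next();
  // parseCsv turns the csv text into a 2-D array of values
  var csvData = Utilities.parseCsv(file.getBlob().getDataAsString());
  // write the array into the active sheet starting at A1
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}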

...continue reading

Back in 2012 I shared some of the exploratory work I did issuing Mozilla Open Badges using a combination of Google Apps Script and Google Sites. This solution had a bit of a fudge which required the badge Assertion (the JSON data file behind each individual badge) to be proxied via a web host. It was great to see this post followed up by David Mullet, who was able to add some additional functionality as well as avoiding the badge proxying. Since then I've also revisited Open Badges, this time as an add-on to issue badges from self-hosted WordPress sites. As part of this I learned more about issuing badges and in particular the Mozilla Issuer API. The Issuer API is a JavaScript library which makes it easy for you to let people add badges they've achieved to their badge portfolio in the Mozilla Backpack. Late last year I decided to revisit Open Badges and Apps Script, this time integrating the Issuer API into a Script powered web app, removing the need to send people to a Google Site. This wasn't as straightforward as I had anticipated, but in this post I'll cover what is possible.

Why?

Survey participant email

So why issue badges directly from Google Apps Script? This was inspired by feedback from a survey we were running at ALT, distributed as a Google Form. To help improve the response rate we included a feature to receive an email confirming the respondent's participation in the survey. The confirmation, sent using Apps Script, included a suggested tweet and a digital badge for them to display highlighting participation. Because the badge wasn't in the Open Badge format, respondents couldn't add it to their Mozilla Backpack. I could of course have reused the earlier work David and I had done on issuing badges via Google Sites, but wanted to see if it was possible to remove that dependency and issue Open Badges from a single container bound script or by embedding an Apps Script in Google Sites.

What’s possible?

Looking through the source code of the Issuer API you can see it has two methods for launching a badge claim interface. The default, OpenBadges.issue (code here), is a full screen modal interface created by injecting an iframe. The fallback, OpenBadges.issue_no_modal (code here), adds the badge data to a web form which is automatically submitted to the Backpack to handle. At the time I was exploring this problem, the method for creating html based interfaces in Apps Script was restricted to a sanitised version of HTML which didn't support the modal/iframe way of doing things. That all changed in December when Google announced an alternative way of serving HTML in Apps Script that skips most of the sanitisation (see Speeding up HtmlService in Apps Script).
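Both methods take the same arguments: an array of hosted assertion urls plus a callback. A minimal sketch (the assertion url is hypothetical):

// ask the visitor's Backpack to accept one or more hosted assertions
OpenBadges.issue(['https://example.com/assertion.json'], function (errors, successes) {
  // errors/successes report which assertion urls were rejected or accepted
  console.log(errors, successes);
});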

This all sounds promising: directly wrapping the Issuer API JavaScript library in an Apps Script powered web app. Trying this out I hit my first issue:

I'm using Google Apps Script Content Service to generate a badge assertion (1.0.0 spec), example here https://script.google.com/macros/s/AKfycbx5uvVcmmHweZwhIxzO0IAUrUtY_cW88Sz1B-MpquZxopvfyIY/exec?type=assertion&uid=54354354354-2

When trying to issue the badge using the Issuer API the issuer returns a 504 bad gateway (example header response). Using Google Apps Script Content Service there are 302 redirects in place for the badge assertion which end in a 200 OK (this is how Google configure their service). Here are example 302 redirect headers and an example eventual 200 OK.

The issue with using Google Apps Script Content Service to create badge assertions appears to only affect assertions that include verify.url (a 0.5.0 spec assertion is issued without issue).

A 1.0.0 spec assertion served via a 302 redirect elsewhere was issued okay, e.g. http://labs.hawksey.info/badges/1_0_spec_redirect.json, which suggests something else in the headers is causing the 504 bad gateway

If you fill in this form with your Backpack email it sends a link to a Google Site where you should be able to claim a badge associated to your email https://docs.google.com/a/hawksey.info/forms/d/1IKfJQGu1spJyTpkPWYL8TlXcRFQxm8gv8ZMATXcVQWU/viewform

I'm still getting a 504

A lot to take in here. Basically Open Badges uses a JSON object for each badge, also known as the Assertion. The Assertion spec is at 1.0, which requires a verification url for the badge. The verification url is the same as the url where the Assertion lives. Here is an example badge generated using Apps Script:
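In shape it's something like this (a sketch of the 1.0 spec's required fields; the recipient details are hypothetical, and the verify.url is the script's own published url):

{
  "uid": "54354354354-2",
  "recipient": { "type": "email", "hashed": false, "identity": "someone@example.com" },
  "badge": "https://example.com/badge-class.json",
  "verify": {
    "type": "hosted",
    "url": "https://script.google.com/macros/s/AKfycbx5uvVcmmHweZwhIxzO0IAUrUtY_cW88Sz1B-MpquZxopvfyIY/exec?type=assertion&uid=54354354354-2"
  },
  "issuedOn": 1419340800
}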

Now if you hit the verify.url: "https://script.google.com/macros/s/AKfycbx5uvVcmmHweZwhIxzO0IAUrUtY_cW88Sz1B-MpquZxopvfyIY/exec?type=assertion&uid=54354354354-2" you get passed through a 302 redirect:

302 redirect

In terms of Badge Verification “eventual 200 OK” is permitted:

The use of the term "eventual 200 OK" is meant to mean that 3xx redirects are allowed, as long as the request eventually terminates on a resource that returns a 200 OK.

Throwing the verify.url into the Open Badges Validator returns a valid Assertion:

valid Assertion

So the Mozilla Backpack issue ticket remains open. Currently, as far as I can see, a version 1.0 spec Assertion generated in Apps Script 504s in the Issuer API. If you are desperate to issue Open Badges in Apps Script NOW!!! the fix ain't pretty … the Open Badges 0.5 spec assertion (a bloody nightmare to find this reference …). This is basically where I was in 2012: a badge assertion that doesn't use a verify.url.

Let's however look at a couple of example workflows for issuing Open Badges using Google Apps Script.

Example: Using a Google Form to Issue Open Badges via a Google Site

Back to the original scenario of giving someone an open badge for completing a Google Form. Here's an example Form/Sheet you are free to copy/reuse. Things to note are:

  • an extra sheet called IssuedBadges; and
  • some code in Tools > Script editor (source extracted into a gist) which needs you to go to Resources > Current project's triggers and add an onSubmit trigger.

The code is hooked into the form submission, records a new badge in the IssuedBadges sheet and generates an email to the recipient to collect their badge (note the current issue with duplicate emails from Apps Script). You'll also see from the code it includes a BASE_URL; this is the place we'll send people to collect their badge. A Google Site can happily have a container bound Apps Script, but in this example I want to highlight how you can have a standalone script embedded in a Google Site. So our next bit of code is a standalone Google Apps Script which handles the Open Badges Assertion creation and issuing. You can copy and then modify the standalone script here (or browse it as a gist).
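In rough shape the standalone script is a doGet() web app that branches on the type parameter, as in this simplified sketch (getAssertionFor is a hypothetical helper; the real logic is in the gist):

function doGet(e) {
  if (e.parameter.type === 'assertion') {
    // a machine is asking: return the badge assertion as JSON
    var assertion = getAssertionFor(e.parameter.uid); // hypothetical helper
    return ContentService.createTextOutput(JSON.stringify(assertion))
                         .setMimeType(ContentService.MimeType.JSON);
  }
  // a human is asking: serve a page that loads the Mozilla Issuer API
  return HtmlService.createHtmlOutputFromFile('claim');
}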

Hopefully the source has enough in-line documentation to make sense of it. Basically the code looks at the type of data it's being asked for: for another machine it throws back JSON, and for humans it shows the Mozilla Issuer API, which will help the person collect the badge identified in the url. To deploy this code we need to publish it as a web app. If you haven't done this before, from your copy of the script in the Script Editor select Publish > Deploy as web app, which should give a similar dialog to the one below:

Deploy as web app

A couple of neat things here. For this to work you need to allow the app to 'execute as me'. This lets the script access the Google Sheets your account can access, which means the script will selectively be able to read the Google Form responses you get. We also allow the script to run for anyone, even anonymously. This doesn't mean that anyone can start reading all of your data: the script is programmed only to read specific data and only return a specific interface designed by us.

When you hit ‘Deploy’ this gives you a ‘service url’. You can try this url and you’ll get a browser window with something like:


Back at the script attached to your Sheet/Form you could replace the BASE_URL with the 'service url' (this line of code). So when someone fills in your form they will get an email linking to your deployed script, with the extra bits in the link needed to issue them a badge. This isn't great UX, so instead let's see how you can use the same url in a Google Site, the advantage being you can provide additional information about collecting a badge in a Google Site. Fortunately the Google Developers site has some great guidance on Embedding your web app in Google Sites which should allow you to paste in your 'service url'.

You can adjust the gadget settings like size and border, and of course add extra information. To see all of this in action enter your email in the form below and you should get an email with a ‘claim link’ to the ‘sandbox’ site (emails won’t be shared or used for any other purpose than sending the claim url).

Wrapping up

So if you've made it this far !!!WELL DONE!!! (#epic). Some things to note. This example currently only works with 0.5 spec Open Badge Assertions; hopefully the 504 issue with the Backpack/Issuer API will be resolved. Some of you may have noticed that in this example I opted to use the issue_no_modal method of the Issuer API rather than take full advantage of the new Apps Script IFRAME/modal mode. The reason I didn't is there are some open issues with IFRAME, and the Caja'd version using the no_modal Issuer API works perfectly fine (I've left the iframe version in the code if you prefer this way).

Enjoy, and as always comments to clarify any of this are welcome.


Update 10/02/2015: A recording (slides/audio) is available here

Today I've been invited to ALT's White Rose Learning Technologists SIG to talk about Learning Analytics and educational mining of Twitter. The Learning Analytics part of this was something I was somewhat reluctant to do as I have recently realised how much I don't know about this area. Another factor was my fear that learning analytics is being eroded by 'counts' rather than actually caring about the learner. This appears to be a shared concern and I was fortunate to recently see a talk given by Dragan Gasevic which addressed this. A recording of his session is currently here and Dragan's slides are on Slideshare, which I heavily reference in my own talk. I've put my own slides online and a recording may be available soon. Here are some notes/reflections:

Absence of theory

Amazon cares not a whit *why* people who buy german chocolate also buy cake pans as long as they get to the checkout buying both - Mike Caulfield - Short Notes on the Absence of Theory

I was fortunate to be in the bar room when 'absence of theory' was being discussed at MRI13. The thing that hit me hardest was the reflection that throughout my career there has been an absence of theory. Like many other learning technologists I jumped into this area from another discipline, in my case structural engineering. Consequently I started in this area with more knowledge of the plastic analysis of portal frames than educational theory. Being a curious person has taken me down numerous avenues and along the way I've often been lucky to work and learn with some of the best. For example I was fortunate to work with Professors David Nicol and Jim Boyle at the University of Strathclyde, Jim arguably being responsible for importing Peer Instruction to the UK. So while I have some theory I don't have enough, and whilst I have connections to some of the best people in learning analytics my job isn't aligned to it. But enough about me: without theory the danger is you have data but no actual insight into what it means.

Visualizations

Graphs can be a powerful way to represent relationships between data, but they are also a very abstract concept, which means that they run the danger of meaning something only to the creator of the graph … Everything looks like a graph, but almost nothing should ever be drawn as one. - Ben Fry in ‘Visualizing Data’

The consequence is 'every chart is a lie', a representation of data defined by its creator. One option here is to turn the learner into the creator. With modern web browsers it's becoming even easier for someone to become the explorer of their own data. Dashboards, which appear to have the same appeal as Marmite, can also be personalised to give more meaning to the learner. Even with personalisation and customisation there is a danger of misinterpretation, which Dragan highlighted with Corrin, L., & de Barba, P. (2014), 'Exploring students' interpretation of feedback delivered through learning analytics dashboards'.


Ethics and privacy

The worlds of privacy and analytics intersect …not always happily – Stephen Downes

I was browsing some slide decks by Doug Clow as part of the LACE Project and he captured the sentiment nicely, highlighting that there needs to be transparency when using learning analytics. He contextualised this around guidance and support rather than surveillance and control. Given the varying degrees of apathy I see around data and privacy this is a conversation always worth having. There is a clear outline of ethical considerations in the Analytics for Education chapter penned by my co-authors Sheila MacNeill and Lorna Campbell:

Ethical Issues

As institutional managers, administrators and researchers are well aware, any practice involving data collection and reuse has inherent legal and ethical implications. Most institutions have clear guidelines and policies in place governing the collection and use of research data; however it is less common for institutions to have legal and ethical guidelines on the use of data gathered from internal systems (Prinsloo & Slade, 2013). As is often the case, the development of legal frameworks has not kept pace with the development of new technologies.

The Cetis Analytics Series paper on Legal, Risk and Ethical Aspects of Analytics in Higher Education (Kay, Korn, & Oppenheim, 2012) outlines a set of common principles that have universal application:

  • Clarity - open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended.
  • Comfort and care - consideration for both the interests and the feelings of the data subject and vigilance regarding exceptional cases.
  • Choice and consent - informed individual opportunity to opt-out or opt-in.
  • Consequence and complaint - recognition that there may be unforeseen consequences and therefore provision of mechanisms for redress. (p. 6)

In short, it is fundamental that institutions are aware of the legal and ethical implications of any activity requiring data collection before undertaking any form of data analysis activity.

Opportunity

At best analytics can help start a conversation. People have to be willing to take the conversation on - Roberts, G. Analytics are not relationships

My biggest fear is Learning Analytics just becomes 'computer says no'. I'm reassured that there are many people working very hard to make sure this doesn't happen, but in the glitz and glamour of 'big data', prediction algorithms and dashboards there is a danger that we start caring about the wrong thing. For me the biggest opportunity is analytics used as feedback, helping inform the conversation.

Setup for #altc 2014

This post is for my own reference but others may find it useful. The information below is the result of drawing on the expertise of Darren Moon (LSE) for some of our ALT live events.

ALT has a YouTube channel - for any channel that has more than 100 subscribers* and is in 'good standing' YouTube gives the option to stream live events. With this you create an event on YouTube in your channel which people can view there or embedded in another site (as you would embed YouTube videos). A nice feature of Live Events is they are immediately available for playback as well as being streamed live.

*thanks to Steve Boneham for the correction

To stream the event you need some software. Google have a deal with Telestream to provide a stripped down version of their Wirecast software for free. The software allows you to connect cameras and other inputs (you need equipment that supports a live video feed: Firewire was the way this was mostly done but now it's moving to HDMI input - to get HDMI into our laptop we used these magic boxes). The Wirecast software also allows you to do a live production with idents and different camera shots. For belt and braces the software also has the option to record your mix to your hard disk in case the stream is lost (we also record the raw footage on the camera in case Wirecast falls over).

[Wirecast can also output to a Google Hangout OnAir if you don't have a YouTube Channel - we haven't made use of this though] 

The usual headaches doing this (particularly as we do this on the road) are:

  • internet access (wired preferred) - we've never had an issue with (ports) configuration but it's something we always check
  • audio source - if using wireless mics getting the feed is something we ask for. In smaller events placing a boundary mic and getting the cable to the camera/laptop requires very long wires
  • desktop mirror - the best way to include slides is to mirror what is being thrown to the data projector. When this isn't available we either have a second camera on the main screen or another laptop running the presentation
  • space - you need space for at least one laptop and camera. We monitor the feed and back channel so often have 2-3 extra laptops doing different things
  • people - you need at least one person to do all of this, we usually pair up
  • power - less of an issue for short events when you can run from batteries

There's lots of info on how to do this sort of thing online, with recommended kit lists and alternative live streaming services. ALT has the Video in Education SIG (ViTAL SIG) which would be a good place to go for advice.

It’s been a while since I've had a chance to blog about a failed attempt to do something. Recently I was trying to get full screen access in Google Drive using Google Apps Script and I couldn't get it to work resulting in this feature request (star to vote ;). In this post I want to share some of my thought processes and highlight why this would be a useful feature.

In the middle…

One of the nice features of Apps Script is access to custom dialogs and interfaces in Docs, Sheets and Forms. These can be generated using the HtmlService, which enables you to write very rich and interactive interfaces. Until very recently all the Html you created went through Google's Caja Compiler. Caja created a lot of headaches for Apps Script developers, stripping and reformatting html tags and restricting what you could do. For example you could forget using SVG (scalable vector graphics), making it impossible to use well established visualisation libraries like D3.js.

However, in December the Google Apps Script team announced Speeding up HtmlService with the IFRAME sandbox mode. With this there is now the option to skip Caja sanitisation, opening the door to a whole host of modern browser-based application development. For example here's Román Cortés' 1kb Christmas Tree in Apps Script (I've tipped it over the 1k mark as it was throwing an error in Chrome - the source code for this project is here). Huh, turns out with blocks can't be used anonymously; as script owner this works fine :(
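Opting in is a single method call when you build the interface. As a minimal sketch for a Sheets dialog (the file name 'vis' is hypothetical):

function showVisDialog() {
  // serve the html file in the IFRAME sandbox so SVG/D3.js aren't Caja'd
  var html = HtmlService.createHtmlOutputFromFile('vis')
      .setSandboxMode(HtmlService.SandboxMode.IFRAME);
  SpreadsheetApp.getUi().showModalDialog(html, 'My visualisation');
}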

The IFRAME mode should not be confused with the ability to use <iframe> to embed your published scripts. This isn't currently possible for security reasons and there is an issue ticket for it.

In the beginning…

My interest in using custom visualisations in Google Sheets stems back to 2011 and my first attempt at wrapping Protovis SVG graphs, the precursor to D3.js, as a Google Spreadsheet Gadget. Alas, Spreadsheet Gadgets got deprecated in the 'spring clean of 2012'. One of the big advantages of Spreadsheet Gadgets was you didn't have to publish the data to be able to generate a visualisation (a route I use for my TAGSExplorer tool). Unfortunately I don't have a screenshot of how the Protovis Gadget rendered so you'll have to make do with a later version which switched to D3.js:

Google Spreadsheet Gadget Example

With the new IFRAME mode there is an opportunity to re-explore how custom visualisations and analytical tools could be integrated in Sheets as well as Docs and Forms. Bruce McPherson has already shown an example of D3.js in the Sheet sidebar:

CC-BY-SA Bruce McPherson - D3.js in the Sheet sidebar

With this we've come almost full-circle, with the ability to create custom visualisations around data in Google Drive without the need to publish it. One limitation here is the amount of screen space you can play with. Sidebars are restricted to 300px wide. You can use dialog windows but these have to be a fixed size and, as far as I'm aware, there are no browser detection methods to factor in the available screen size. But let's not forget that I'm predominately talking about scalable vector graphics here, and most modern browsers now support the Fullscreen API (Fullscreen API spec). To see this in action embedded below is an iframe of this page which, if supported by your browser, will let you pop it to fullscreen using the button (I'm just using a raster image to keep with the Xmas theme ;).
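The button itself is only a few lines of JavaScript; a minimal sketch with the vendor prefixes browsers need at the moment (the element id is hypothetical):

function goFullscreen(el) {
  // try the standard method first, then the vendor-prefixed variants
  var request = el.requestFullscreen || el.webkitRequestFullscreen ||
      el.mozRequestFullScreen || el.msRequestFullscreen;
  if (request) {
    request.call(el);
  }
}
// e.g. wired to the button's click: goFullscreen(document.getElementById('tree'));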

So using Apps Script we could have fixed size interfaces with the ability to stay in the Docs, Sheets or Forms interface but pop to fullscreen … Well, as you have probably already noticed from the post title, nope. Here's a Sheet where I've used the same full screen code in a custom dialog. Clicking the 'Image Test' link in the spreadsheet opens the dialog, but while the fullscreen script runs nothing happens. Initially I thought it was something wrong with my code, but trying a YouTube embed within a custom dialog resulted in no full screen option being included.

YouTube Embed - no full screen

In the future…

Having access to Fullscreen API would be one way to get us back to 2011 and there may be other feature requests around embedding custom interactive graphs in Google Drive. If you’d like access to the Fullscreen API in HtmlService here’s the feature request.

Oh and happy holidays ;)


Having a domain of your own is a wonderful thing, but like many things in life it comes with responsibilities. Self-hosted WordPress blogs are a regular focus of spammers and brute force attacks, and more than likely your webhost already has measures in place to prevent the bots getting in. There are proactive measures you can also take to protect your domain, and a host of plugins and tutorials to help with this.

To add to the collection I want to share with you my experiences of setting up and using CloudFlare. CloudFlare creates an extra layer between your website and the world allowing them to block threats as well as host and optimise your content. At an entry-level you can also use CloudFlare for free, which I've been doing for the last couple of months.

Recently CloudFlare announced they were Introducing Universal SSL for all users for free. SSL/TLS are encryption protocols used to secure Internet communication. Hopefully you are familiar with looking for https:// and not just http:// when you are doing your online banking or shopping, the 's' indicating the information you see and enter is secure, encrypted, preventing eavesdropping or tampering with the data. But why would I want to use a similar level of security for my blog as for banking? CloudFlare make a great case for this in their post:

Having cutting-edge encryption may not seem important to a small blog, but it is critical to advancing the encrypted-by-default future of the Internet. Every byte, however seemingly mundane, that flows encrypted across the Internet makes it more difficult for those who wish to intercept, throttle, or censor the web. In other words, ensuring your personal blog is available over HTTPS makes it more likely that a human rights organization or social media service or independent journalist will be accessible around the world.

[Another consideration is Google announced it would use https as a ranking factor in its search results]

So how do you go about moving your blog to https? Well first, if you haven't already, you need to sign up and set up CloudFlare. There are two routes to do this provided by CloudFlare. Unfortunately my host, Reclaim Hosting, isn't yet a CloudFlare partner so I had to go through changing nameserver settings (CloudFlare have additional tips on using their service with WordPress). When CloudFlare is set up, to enable SSL you need to go into the CloudFlare Settings > Settings overview where you see the SSL configuration:

SSL settings

CloudFlare actually have a few options: Off, Flexible SSL, Full SSL, and Full SSL (Strict). Flexible is by far the easiest to set up and for most people the best place to start. There is a lot more detail about the differences between these in this CloudFlare post. In this they say:

Flexible SSL encrypts all data between your site’s visitors and CloudFlare using TLS configured with best practices such as forward secrecy and more. This is where most threats to web traffic happen: in your coffee shop, by your ISP, and others in the local network.

With this enabled and traffic directed through CloudFlare you can start using https on your blog. There are a couple of things to bear in mind as well as things you have to do. To use https effectively you need to tell your blog this is what you want to use. There are plugins like WordPress HTTPS (SSL) that can help with some of this, but I decided to do it the manual way.

Dashboard and site over SSL

Initially when I tried switching to SSL I ended up in an endless redirection loop. Fortunately I came across this post on how to Set up SSL on WordPress Behind the CloudFlare Reverse Proxy. This is backed up by the WordPress documentation on Administration Over SSL, and you might want to start with just the first 3 lines of code in your wp-config.php file to test that SSL is working:

// force the WordPress dashboard to be served over https
define('FORCE_SSL_ADMIN', true);
// CloudFlare terminates SSL, so trust its forwarded protocol header
if ($_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https')
       $_SERVER['HTTPS']='on';
// once everything works, set the site urls to https too
define('WP_HOME','https://yoursite.com');
define('WP_SITEURL','https://yoursite.com');

When switching to SSL you may find parts of your posts don't load. Some of this is down to how your theme has been written, and for good themes it shouldn't be a problem. The next problem might be missing videos or images in posts. Basically modern browsers don't like mixing http with https content, so if you use iframes as a way to embed content like YouTube videos and they are loaded over http nothing will appear. Google updated the embed snippet for YouTube videos from <iframe src=http://www.youtube… > to <iframe src=//www.youtube… >, which defaults to serving the content using the same scheme as the main page. You can run MySQL commands to update these or it might be better to use one of the WordPress SSL rewrite plugins.

Forcing to SSL

So far we've enabled the option for your blog to be browsed over SSL, including internal navigation links, but someone might still initially land on an http address. CloudFlare mention Page Rules in their admin interface for forcing to SSL but that setting appears to have disappeared. CloudFlare also mention a way that Apache hosted blogs can use the .htaccess file to redirect a user on to SSL:

 RewriteCond %{HTTP:CF-Visitor} !'"scheme":"https"'
 RewriteRule ^(.*)$ https://www.domain.com/$1 [L]

If like me you still use FeedBurner (I know) for your RSS feed, you should also bear in mind that you need to keep your feeds alive for FeedBurner (avoiding a 400 error) – basically FeedBurner doesn't like to be given feeds over SSL. So my snippet for .htaccess becomes:

# HTTPS redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_USER_AGENT} !FeedBurner    [NC]
RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC]
RewriteCond %{HTTP:CF-Visitor} !'"scheme":"https"'
RewriteRule ^(.*)$ https://yoursite.com/$1 [L]

Remember if you are forcing to SSL that any tools/apps connected to your blog, like Live Writer, might need updating … as I discovered.

Letting Google know you've moved

The last thing I did was let Google Search know my blog was now over SSL. You may have plugins that handle your sitemap and updates to search engines. For me I did this using Google's guide on moving a site with URL changes.

Future

So a bit of work to get myself on SSL and save a couple of bucks on an SSL certificate. The good news is this sort of thing is hopefully going to get easier and more widespread.