
QuickKey, multiple choice and hinge questions.

April 14, 2014

I have been using #RAG123 marking with my Year 9 group (set 3 out of 3) for about 2 months now (for a blog on what #RAG123 marking is, read this by me or this by the founder of #RAG123, @ListerKev; for how effective it is, read this).

I have noticed a huge improvement in the volume of work that pupils produce during lessons. This is especially true of cover lessons where the books are then collected in and returned to me and marked before the following lesson. I have seen the same benefits in the increased dialogue with pupils through the marking and it has certainly helped inform my planning of following lessons by seeing after the lessons exactly what pupils can and can’t do.

However, one area where pupils in my Year 9 group were struggling (particularly compared to the group in Year 11) was in justifying their reasons for giving themselves an Amber or a Red based on the learning outcomes of the lesson. Despite being clear (at least as clear as possible) on the expected learning outcomes, and having pupils RAG rate themselves against those, I would more often than not get "A1 because I found it hard" or "A2 because I didn't quite get it all."

Pupils had got into the habit of giving themselves a RAG rating instinctively: G meaning they felt they had done well in the lesson, and A (or R) if they had found parts of it a struggle. What I really wanted, though, was for pupils to communicate to me exactly what they had struggled with (based on the learning outcomes). This in turn informs my planning for future lessons. I also wanted pupils to rate themselves a G only if they felt they had "satisfied" the learning outcomes.

As a further tweak, towards the end of the lesson I would ask pupils to put their pens down; I'd show the outcomes, and pupils would then pick their pens up and RAG123 against them. Following this I would have a plenary based on the outcomes, where I could probe pupils to check for understanding. I feel it is important that pupils RAG before the actual plenary, otherwise the plenary could "paper over the cracks" in their understanding and they could rate themselves a G while still taking away misconceptions from the lesson.

What I want from RAG123 is constant feedback on what my pupils can and can't do and what they do and don't know. It was clear that pupils needed further training. After a week of trying to further improve their self RAG rating, I thought I would design some multiple choice questions to be given at the end of the lesson (I like using multiple choice questions, as long as they are well designed; read this by @joe__kirby). The questions would further unpick the learning outcomes. I would compare the results from the multiple choice questions against the pupils' RAG ratings for the lesson. I wanted to see whether someone giving themselves a G based on the learning outcomes scored well in the multiple choice questions. I also wanted to see if pupils who had given themselves an A, and a reason for the A, would fall down on the multiple choice questions that tested that area. In short, I wanted to carry out a quality assurance check on how well the pupils were RAG rating themselves.
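The quality assurance idea above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the pupil names, scores and pass threshold are all invented, and nothing here reflects the actual class data.

```python
# Hypothetical sketch: cross-checking pupils' self-assessed RAG ratings
# against their multiple choice scores. All names, scores and the
# threshold are invented for illustration.

def check_ratings(pupils, threshold=4):
    """Flag pupils whose RAG rating disagrees with their quiz score."""
    flags = {}
    for name, record in pupils.items():
        met_outcomes = record["score"] >= threshold  # assumed pass mark
        rated_green = record["rag"] == "G"
        if rated_green and not met_outcomes:
            flags[name] = "rated G but score suggests gaps"
        elif not rated_green and met_outcomes:
            flags[name] = "rated A/R but scored well"
    return flags

pupils = {
    "Pupil A": {"rag": "G", "score": 5},  # scores out of 6 questions
    "Pupil B": {"rag": "A", "score": 2},
    "Pupil C": {"rag": "G", "score": 3},
    "Pupil D": {"rag": "R", "score": 5},
}

for name, flag in check_ratings(pupils).items():
    print(f"{name}: {flag}")
```

Pupils flagged in either direction (over-generous or over-harsh ratings) are exactly the ones whose self-assessment needs further training.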

Lesson 1
These were the learning outcomes for the lesson (with a nod to @fullonlearning and @BelmontTeach)

(the outcomes stretched over 2 lessons so at the end pupils had to RAG rate against the first two only). I wanted to compare the content in their books and their RAG rating against their multiple choice responses.
In the previous lesson we had learned about balanced and unbalanced forces and the effect these have on an object. An early task in the lesson was for pupils to hold up the correct post it note to describe an object’s motion:

(All pupils were able to choose the correct post it note for each of the scenarios- the above of course being A for acceleration- their prior knowledge seemed secure).

Here are the multiple choice questions:





(Question 6 was a numeracy question on rounding the skydiver’s descent time to 1 decimal place)

As I was hurriedly putting the questions together (time being the enemy as always) I realised that question 5 was without doubt the hinge question for the lesson. A hinge question is:

For more on hinge questions read this by @HarryFletcherWood. Hinge questions are usually associated with a "hinge" point partway through the lesson, but they are just as useful and legitimate at the end of one, particularly if the subsequent lesson will rely on key knowledge acquired in the previous one. A correct response to this question would mean the key learning point (and the middle learning outcome) had been understood. Importantly, a wrong answer (in particular, which wrong answer) would tell the teacher what a pupil's misconception might be. I'm a big fan of hinge questions because every wrong answer tells its own story, though I would admit I haven't used them as often as I would like. The outcome of question 5 would certainly influence my planning for the next lesson.

The next lesson would involve pupils carrying out a practical on dropping cake cases and measuring the time it takes for the cases to reach the floor. Would pupils be able to carry out the practical and collect their results as normal or would further intervention be required? Would I need to “press pause” and spend another lesson teaching and reinforcing terminal velocity? (The answer would come from RAG123 and the hinge question).

I considered the best way to collect the pupil responses. Mini whiteboards would be handy, as would just writing the letter in their books. However, I decided to use an app called QuickKey, a quick and efficient way of collecting multiple choice responses (for more info check out here). You scan each pupil's answer sheet and the app does a number of useful things with the data. You set up your class in the app (on a PC you can import an Excel file for a whole class), and each pupil is then given a unique identifier.
You then set up a quiz: give it a name, set the number of questions and input the correct answers.

I had used this app with the group before and so the QuickKey answer sheets were already available for the class (if you photocopy them you have a ready made set for as and when they are needed).

My previous use of QuickKey had been a little mixed with some answer sheets proving more difficult to scan and the iPad/iPhone needing to be held a certain way. However, I had noted there had been an update and the promise was that scanning was easier and the iPad/iPhone could be held landscape in either direction.

I decided to use my “ready made” QuickKey tickets as exit tickets. Scanning of the exit tickets was very straightforward (with no technical hitches).



Two pupils had changed their minds on a question and crossed their answers out, which meant their sheets could not be scanned automatically. The manual entry was quick enough though.


And the outcome of the questions?


I was teaching the group fairly difficult science (terminal velocity in terms of changing forces often comes up at higher tier GCSE) but I was disappointed with the results. In particular, the hinge question showed that less than a third of pupils actually understood the key learning point: that terminal velocity is reached when air resistance becomes equal in size to the weight. Clearly a key focus for the next lesson is still to understand the concept of terminal velocity.

What about their RAG123 ratings?

38% (8 out of 21) of pupils had rated themselves a G for learning against the learning outcomes. All bar one pupil had seemingly satisfied the learning outcomes in their FAIL to SAIL task. 6 out of the 8 pupils that rated themselves a G had got the hinge question correct.
Whilst I was disappointed with the percentage of correct answers to the hinge question, it was at least reassuring that many pupils recognised they had struggled and reflected it in their RAG rating. Their comments on what they were struggling with are still something they need to develop, but there were signs of improvement (although, in fairness, articulating which part of terminal velocity they are struggling with is hugely difficult).

So, at the end of the lesson I am left with the knowledge (from QuickKey and their RAG rating) that nearly all students are struggling with the concept of terminal velocity. The results of Questions 1-4 provided by QuickKey also show that within the class there are misconceptions about what happens to air resistance and what happens to the weight as the skydiver falls.

QuickKey provides the following information:

1. The number of questions each student got correct (and by clicking on a student you can see which questions they got right and, for each wrong answer, which option they chose; very handy with a hinge question).


2. The % of pupils that got each question correct.


3. If you log on to your QuickKey account on a Mac/PC you can export the results to Excel and see at a glance the answers that pupils gave. Again, this info is so important for hinge questions.

For my hinge question all incorrect responses were A or B.

By looking at the wrong responses to my hinge question I could put the pupils into "misconception" groups. For instance, the pupils that answered B for question 5 are getting acceleration and terminal velocity mixed up. However, because most pupils struggled with most of the questions, I decided to proceed with the practical as normal (with pupils in their usual groups). What I would do is briefly "reteach" terminal velocity at the start of the lesson, then circulate and reinforce the key concepts as the pupils undertook the practical.
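The misconception-grouping step is simple enough to sketch in code. This is a hypothetical illustration: the pupil names and responses are invented, the correct answer is assumed to be C (the blog only says the incorrect responses were A or B), and the label for distractor A is a made-up example; only the B label (mixing up acceleration and terminal velocity) comes from the lesson.

```python
# Hypothetical sketch: grouping pupils by their wrong answer to the
# hinge question, so each distractor maps to its likely misconception.
# Names, responses and the correct option are invented for illustration.
responses = {
    "Pupil 1": "C", "Pupil 2": "B", "Pupil 3": "A",
    "Pupil 4": "B", "Pupil 5": "C", "Pupil 6": "A",
}
correct = "C"  # assumed; not stated in the blog

misconceptions = {
    "A": "thinks the forces are still unbalanced at terminal velocity",  # assumed label
    "B": "mixes up acceleration and terminal velocity",  # from the lesson
}

# Collect pupils under the wrong option they chose
groups = {}
for pupil, answer in responses.items():
    if answer != correct:
        groups.setdefault(answer, []).append(pupil)

for option in sorted(groups):
    print(f"Option {option} ({misconceptions[option]}): {', '.join(groups[option])}")
```

Each group could then be given targeted reteaching aimed at its specific misconception, which is exactly why which wrong answer a pupil chose matters more than the raw score.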

Lesson 2
The next lesson had exactly the same “so that” learning outcomes.

The lesson went well (far better than the previous lesson), with more pupils seemingly "getting" the concept. The individual discussions with small groups were very helpful in that respect. Towards the end of the lesson I overheard a conversation between two pupils putting their results into their results tables. One had made the classic calculator error of adding the three times it took the cake case to drop and then dividing by three without pressing equals first (so only the last time was divided by three). Her peer was explaining to her why her answers must be wrong. At this point I asked all pupils to put their pens down so that I could show the learning outcomes on the board and they could then RAG123 rate their work. This pupil (pupil y) had already started to RAG before I had put the outcomes up, so I asked her to wait. She replied, "I don't need to wait, I'm giving myself an A because I didn't work out my averages properly and I didn't check them." Fantastic. This pupil was getting better at articulating exactly why she was an A for the lesson (I do appreciate that the numeracy outcomes are far more "concrete" than the SOLO verb learning outcomes linked to knowledge).
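The calculator error is worth seeing in numbers. Without pressing equals before dividing, the calculator follows order of operations and divides only the last value by three. The drop times below are invented for illustration; the point is the shape of the mistake, not the data.

```python
# The classic averaging error in numbers: typing a + b + c / 3 on a
# calculator without pressing equals divides only the last value by 3.
times = [1.8, 2.1, 2.4]  # invented drop times in seconds

wrong = times[0] + times[1] + times[2] / 3    # what the calculator computed
right = (times[0] + times[1] + times[2]) / 3  # the intended average

print(round(wrong, 1))  # 4.7 - larger than any of the three times!
print(round(right, 1))  # 2.1
```

The wrong "average" is bigger than every individual measurement, which is exactly the kind of sense-check pupil y's peer was using when she explained why the answers must be wrong.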

With the pupil better able to articulate what she was struggling with I was in a far stronger position to give her more specific targeted feedback.

So how effective had the follow up lesson been in helping the class understand the concept of terminal velocity?

Lesson 3
At the start of the next lesson I used the QuickKey tickets as entry tickets. I gave the pupils the same multiple choice questions they had had 3 days earlier. They filled their tickets in and I collected them. As the lesson started and pupils moved on to their graphs (after some had recalculated their averages), I used my iPhone to scan in their tickets (I called the quiz "terminal 2" to avoid getting a duplicate scan message). The whole scanning process took less than 2 minutes. I was then able to feed back instantly to the class on how much progress they had made in their understanding of terminal velocity.
The photo below shows the original results on the left and the follow up results on the right.

Huge progress on all questions. Success on the hinge question had almost doubled, from 33% to 65%.
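The before/after comparison is easy to automate once both quizzes are in QuickKey. In this sketch only the hinge question figures (Q5: 33% to 65%) come from the lesson; the other percentages are invented for illustration.

```python
# Hypothetical sketch: comparing percentage correct per question across
# the two sittings. Only Q5 (33% -> 65%) is from the lesson; the rest
# of the figures are invented.
before = {1: 45, 2: 55, 3: 48, 4: 38, 5: 33, 6: 52}
after = {1: 70, 2: 81, 3: 76, 4: 62, 5: 65, 6: 90}

for q in sorted(before):
    gain = after[q] - before[q]
    print(f"Q{q}: {before[q]}% -> {after[q]}% ({gain:+d} points)")
```

A per-question gain table like this is the instant feedback that could be shown to the class (or, as one commenter suggests below, to an observer).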

Question 6, which was about rounding to 1 decimal place, had been answered much better, as this was a learning outcome of the previous lesson.
At the end of lesson 2, pupil x had RAGged as:


He had got the question right on the entry ticket in lesson 3 (hopefully due to the feedback I’d given).

Question 7, which I did not give originally but added when I repeated the test, was:

(It was totally unrelated to the skydiver; it was about Eli Walker doing sprint training and asked what his average distance was.)

What was very pleasing was that pupil y (who had given herself an A on calculating averages and checking) got this question right. The intervention from her peer and my comments while RAGging had "closed the gap."

Where next for using QuickKey and also hinge questions?

My use of QuickKey and the multiple choice questions was primarily designed to quality assure the pupils’ RAG rating. This it did but it gave me a lot more.
Once you have a set of tickets for a class then they can be used whenever you want.
I really like how @RobGeog has used them for end of topic tests:

The speed at which they can be scanned in makes the whole process hugely efficient. This would be something that many departments might want to use. If the multiple choice questions are well designed (perhaps using @joe__kirby’s 8 rules as a guide) they give both summative and formative information.

I think I am going to continue to use QuickKey every few lessons for formative exit tickets, with one of the questions being a hinge question (and possibly the other questions testing pupils on topics from the last lesson/week/term). The data QuickKey provides has clearly given me an additional layer of AfL to inform my planning. And surely this can only be a good thing.

This process has also reminded me of the importance of hinge questions. I believe that RAG123 marking is in itself the "hinge" point, because the outcomes of the marking feed forward into the planning for the next lesson. However, I know that by using regular hinge questions (midway through the lesson or at the end) I can add to the efficacy of pupils' RAG123 rating, my own RAG123 and the comments that I leave. Regular hinge questions (if they are well designed) will also give me insight into how pupils are thinking when they get the question wrong.

Thanks for reading. As always, feedback is welcome.

*Since completing the blog I've tracked down the Twitter conversation where I was introduced to QuickKey, and I certainly need to credit @QuarkyScienceUK. Here is part of the conversation:

@simonrenshaw has also been blogging about QuickKey and hinge questions (great minds and all that…) and you can read about his work here. *

  1. Damien

    Really like the effort and determination you are trying to build into your students' learning.
    A really interesting journey and I am going to try to incorporate some of this into my classroom.


    • Thanks for reading and commenting Andy. I enjoy the blogging process because it puts (as best it can) my pupils’ learning under the microscope. It makes me reflect on what I’m doing and why (something that time often doesn’t allow). Thanks again Andy, enjoy Las Vegas,

  2. A very interesting blog, thanks for this. I like the use of QK and will certainly give this a try (so long as my Samsung can handle it!). I really like the use of exit and entry tickets and this just gives them a little more weight – certainly in terms of showing students' progress. How cool would it be to just show that data to anyone wanting to observe you? It could certainly lead the way for 'dynamic iterative tests' where we could tailor students' tests depending on their strengths and weaknesses – is that something that would actually be desirable? Not sure. Marvellous work though.

    • Massive thanks Marvin for reading and commenting. I agree that the ease with which the data can be gathered makes QK ideal for entry and exit tickets. In the example in my blog it was great to show students the progress they had made in their understanding. Interesting point about using QK to identify strengths and weaknesses to tailor students' tests; I'm going to give a big 30 question MC test to my year 11 group based on their year of learning. I will use the results to help them identify strengths and weaknesses so their revision can be more tailored. How practical (and accurate) this will actually prove remains to be seen but the scope is certainly there. Thanks again Marvin,

  3. Typical elitist iPhones! LOL. Coming soon to android. Be great if the program could generate individual user feedback for the progress tests.

  4. A really interesting blog post! I’m charged with a mini research project to advise teachers about marking & feedback and I’m looking at trying RAG123. Your experience has helped a lot! Thanks for sharing!

    If you have any other marking/feedback techniques, I’d be really grateful to hear them

    • Thanks Fiona. Good luck with implementing RAG123 and let us know how you get on. Thanks again for reading and commenting.

  5. Lucie permalink

    Thank you for the credit – I was QuarkyScience and now tweet as @Yorks_Bunny and @SheepScience – it's so nice to see when something is picked up and used. Glad you like it and thank you for sharing your experiences.

Trackbacks & Pingbacks

  1. Reflecting on teaching in 2013-14 (the routine) – what went well, what needs to be improved? | mrbenney
  2. Moneyball for schools: can we use data like the Oakland A’s? | Improving Teaching
  3. Marking; planning for feedback- identifying then closing the gap. My #TLT session | mrbenney
  4. 26 Strategies for meaningful manageable assessment by @powley_r – UKEdChat
