Thursday 17 July 2014

Open Mentor

The use of a 'learning support tool for tutors' (Whitelock, 2014), enabling them to 'reflect on the quality of feedback they provide for their students' (ibid.), is to be welcomed. In general, time and budget restrictions prevent constant review or supervision of tutors' commentaries, so a relatively cost-free tool that can monitor tutors' input could be an effective way to ensure continuous improvement.

There are challenges with this type of tool, particularly in its initial construction: how the creators perceive useful and effective feedback, what they consider to be key words or sentences, and the reasoning behind their decisions to assign these to particular categories. The database and algorithms also need to be reviewed and updated regularly to incorporate current terminology and correct errors if the tool is to provide useful feedback, which suggests a need for ongoing investment and financial support if it is to survive.

On submission of an 'essay' to "Open Mentor" (OM), it will provide a report that analyses the assessor's comments, clustering them into groups similar to Bales' categories:
Bales:              a) Positive reactions
                    b) Attempted answers
                    c) Questions
                    d) Negative reactions (Whitelock et al., 2003)

Open Mentor:        a) Positive reactions
                    b) Teaching points
                    c) Questions
                    d) Negative reactions

The report will also suggest an 'ideal' number of comments for each category. Tutors can access representational graphs as well as an overall analysis of submitted work for a group or cohort.

Despite its shortcomings, it provides a useful and speedy overview of the assessor's annotations. It can identify issues such as a lack of constructive comments or an excess of negative ones. OM also helpfully makes suggestions about the ideal number of comments that could be included within each category, something that a new (or experienced) assessor might aim towards.

There are major limitations to this utility. It has the ability to recognise formal and informal comments, but the algorithms for determining which category, or how many categories, a comment is allocated to appear arbitrary. In 'Brown's' sample essay, OM identified a large number of comments that simply referred to 'reference' issues as both positive teaching points and/or questions. These comments were concerned with technical aspects of the essay rather than its qualitative content, and yet OM skewed the overall review of the tutor's approach by considering them as positive inputs.

For example: the comment below appears in both the Teaching points category and Questions category in OM's report:

"Here you should have a citation. Did you get this from a particular report?"
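A hypothetical sketch of how simple keyword matching could place a single comment into two categories at once. The keyword lists and matching rule here are my own invented assumptions for illustration, not Open Mentor's actual algorithm:

```python
# Invented keyword lists, loosely in the spirit of Open Mentor's categories.
KEYWORDS = {
    "positive reactions": ["good", "well done", "excellent"],
    "teaching points": ["should", "you need to", "citation"],
    "questions": ["?"],
    "negative reactions": ["poor", "wrong", "unclear"],
}

def classify(comment: str) -> list[str]:
    """Return every category whose keywords appear in the comment."""
    text = comment.lower()
    return [cat for cat, words in KEYWORDS.items()
            if any(w in text for w in words)]

# A single comment can match several categories at once, which is how a
# remark about referencing ends up counted as both a teaching point and a
# question:
print(classify("Here you should have a citation. "
               "Did you get this from a particular report?"))
# → ['teaching points', 'questions']
```

A purely lexical rule like this has no notion of a comment's pedagogical weight, which is consistent with the skewing described above.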

It would probably be more advantageous, and less time-consuming, to cluster repetitive comments and highlight to the tutor that one acknowledgement would suffice. OM, however, appears unable to recognise repetition. Neither can it distinguish between a technical comment or question and one that will add depth to the student's learning and increase motivation.
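By way of illustration of the clustering suggested above (this is a sketch of the alternative, not anything Open Mentor does), near-identical comments could be normalised and counted so the tutor sees one entry per repeated remark:

```python
import re
from collections import Counter

def normalise(comment: str) -> str:
    # Lower-case and strip punctuation so trivially different copies of the
    # same remark compare equal.
    return re.sub(r"[^a-z0-9 ]", "", comment.lower()).strip()

def cluster(comments):
    """Count comments after normalisation, grouping near-duplicates."""
    return Counter(normalise(c) for c in comments)

comments = [
    "Here you should have a citation.",
    "Here you should have a citation!",
    "Good point.",
]
print(cluster(comments))
# 'here you should have a citation' now appears with a count of 2
```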

Clearly 'Open Mentor' is still a work in progress.

Refs:
Open Mentor (2012) http://openmentor.org.uk (Accessed 17 July 2014)

Whitelock, D. (2014) Open Mentor, H817, Block 4, Week 24, The Open University, Milton Keynes

Whitelock, D., Watt, S., Raw, Y. and Moreale, E. (2003) ‘Analysing tutor feedback to students: first steps towards constructing an electronic monitoring system’, Research in Learning Technology, vol. 11, no. 3, pp. 31–42; also available online at http://www.researchinlearningtechnology.net/index.php/rlt/article/view/11283/12973 (Accessed 15 July 2014)

Tuesday 15 July 2014

Nicol's model for workplace training

An adaptation of Nicol's Assessment and Feedback model to incorporate workplace training.


 
Nicol, D. (2007) ‘Principles of good assessment and feedback: theory and practice’ [online], paper presented at the REAP International Online Conference on Assessment Design for Learner Responsibility, 29–31 May 2007, http://www.reap.ac.uk/reap/public/papers//Principles_of_good_assessment_and_feedback.pdf (Accessed 14 July 2014)



Nicol's ten principles of good feedback practice


Do any of your criteria appear among Nicol’s ten principles, and do you think Nicol’s principles cover the important ideas mentioned in your groups?

Nicol's principles (those in italics replicate to some degree the main feedback principles identified in an earlier blog post or on the OU forum):


1. Help clarify what good performance is (goals, criteria, standards).
To what extent do students in your course have opportunities to engage actively with goals, criteria and standards, before, during and after an assessment task?

2. Encourage ‘time and effort’ on challenging learning tasks.
To what extent do your assessment tasks encourage regular study in and out of class and deep rather than surface learning?

3. Deliver high quality feedback information that helps learners self-correct.
What kind of teacher feedback do you provide – in what ways does it help students self-assess and self-correct?

4. Encourage positive motivational beliefs and self-esteem.
To what extent do your assessments and feedback processes activate your students’ motivation to learn and be successful?

5. Encourage interaction and dialogue around learning (peer and teacher-student).
What opportunities are there for feedback dialogue (peer and/or tutor-student) around assessment tasks in your course?

6. Facilitate the development of self-assessment and reflection in learning.
To what extent are there formal opportunities for reflection, self-assessment or peer assessment in your course?

7. Give learners choice in assessment – content and processes.
To what extent do students have choice in the topics, methods, criteria, weighting and/or timing of learning and assessment tasks in your course?

8. Involve students in decision-making about assessment policy and practice.
To what extent are students in your course kept informed or engaged in consultations regarding assessment decisions?

9. Support the development of learning communities.
To what extent do your assessments and feedback processes help support the development of learning communities?

10. Help teachers adapt teaching to student needs.
To what extent do your assessment and feedback processes help inform and shape your teaching?

Nicol, D. (2007) ‘Principles of good assessment and feedback: theory and practice’ [online], paper presented at the REAP International Online Conference on Assessment Design for Learner Responsibility, 29–31 May 2007, http://www.reap.ac.uk/reap/public/papers//Principles_of_good_assessment_and_feedback.pdf (Accessed 7 February 2012)

Nicol, D. (2006) ‘Assessment for learner self-regulation: enhancing the first year experience using learning technologies’ [online], paper presented at the 10th International Computer Assisted Assessment Conference, 4–5 July 2006, pp. 329–40, https://dspace.lboro.ac.uk/dspace-jspui/handle/2134/4413

Monday 14 July 2014

Benefits and Challenges of Online Assessments

Benefits of instant feedback

Motivational - the 'immediacy' confirms the learner's success or provides encouragement to do something differently in order to achieve success.

Timely - immediate feedback ensures that an issue can be resolved whilst the learner is still focused on the problem. There is no 'lag' whereby interest has waned or even the original answer is forgotten.

Appropriate - being impartial and directed, the feedback fits the question and assists the learner's improvement. The learner is not overwhelmed with too much information.

Aids and deepens understanding - the feedback is designed to promote a conceptual understanding of a problem, which is gained when learners are prompted to consider alternative strategies to achieve a 'correct' answer.

Aids progression - improves overall learning outcomes, giving learners the confidence to progress with their learning.

Learner Led - Flexibility in terms of accessing both the assessment material and receiving feedback, not tied to locations, specific times or days

Encourages responsibility for learning - learners are in control of whether they solicit feedback and the degree to which they act on it.

Speedier interventions - lecturers/tutors can intervene quickly if a learner is experiencing difficulties, rather than having to wait for face-to-face contact.

Additional learning - feedback can include links to previous learning sessions or additional supportive materials that are easy to access.

Inclusiveness - supports all learners regardless of their backgrounds, physical needs, language, etc.

Linked to outcomes - the feedback is relevant to the overall goals as well as the specific error.

Integrated with online learning systems - can be integrated with an institution's learning systems, enabling recording of feedback and, in some cases, tracking of learners' progress vis-à-vis the feedback.

Low cost - once running it is cheap to maintain and edit

Scalability - no limits to the number of people accessing the assessments and receiving feedback.

Challenges of instant feedback

Limitations - the feedback is only as good as the creator of the quiz/questionnaire and their ability to anticipate the errors that students may make.

Students' confidence - students may not be fully confident with the online feedback and choose to ignore it

Costly setup - the initial set-up of online feedback systems can be time-consuming and in turn more expensive than face-to-face sessions.

Bugs and glitches - can demotivate

Generic feedback - may be deemed unhelpful by some learners, who may regard further feedback as a waste of time.

Quality - ensuring that the feedback is of quality and value to the learner




Adaptive Learning and Feedback

The use of interactive on-line formative quizzes in mathematics - Dr J. Ekins

At a simplistic level, the feedback/hints from the quizzes that Ekins refers to are useful, although the quality of learning resulting from the hints is questionable.

When I entered a mistaken calculation on the OU Maths assessment site, the programme unhelpfully fed back that my answer was "too low". Not a ‘little low’, ‘way too low’ or ‘derisorily low’, simply a very generalised "too low" - which might be marginally more helpful than 'incorrect'.
I made the error because I failed to read the question correctly. Feedback that suggested I read the question again might have been more apt. (Attempting the quiz without wearing reading glasses was probably never considered!)
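A minimal sketch of the kind of banded feedback being argued for here. The thresholds are arbitrary illustrative choices of mine, not anything taken from the OU quiz engine:

```python
# Band a numeric answer by its relative error instead of replying with a
# bare "too low"/"too high". Assumes the correct answer is non-zero.
def feedback(answer: float, correct: float) -> str:
    rel = (answer - correct) / correct
    if abs(rel) <= 0.01:          # within 1% counts as correct
        return "correct"
    direction = "low" if rel < 0 else "high"
    if abs(rel) < 0.10:           # within 10%: close
        return f"a little too {direction}"
    return f"way too {direction}"

print(feedback(95, 100))   # → a little too low
print(feedback(40, 100))   # → way too low
```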

The type of feedback referred to in Ekins' example depends on the degree to which the programme's authors can empathise with and predict the difficulties faced by a learner. This means that this type of feedback will always be limited by the authors' previous experience.

University of Glasgow College of Medical, Veterinary and Life Sciences - Level 1 Biology Course
The University of Glasgow uses Moodle as its assessment platform. Students can access four formative self-assessment quizzes, which:
  • are accessible for one week following completion of the related taught module
  • allow an unlimited number of attempts
  • provide immediate feedback
The quiz programmes allow lecturers the following options when creating questions:
  • Cloze questions (embedding answers in text)
  • Simple Calculated Question
  • Calculated multiple choice
  • Description Question
  • Maths Skills (GU) Question
  • Matching
  • Multi Choice
  • Random Short Answer Matching 
  • Short-Answer
  • True/False
  • Tagged MC Question
  • Third party questions (totalling approx. 50)
This is somewhat more sophisticated than Ekins' OU example. Lecturers can also allow students greater choice over which questions they wish to answer or skip.
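As an illustration of the first option in the list, Moodle's embedded-answers ('Cloze') question type lets a lecturer embed graded answers, with per-response feedback, directly in the question text. The biology content and tolerances below are invented for illustration:

```
Mitochondria are the main site of {1:MULTICHOICE:=respiration#Correct~photosynthesis#No - that happens in chloroplasts}
in the cell, and a typical animal cell is roughly {1:NUMERICAL:=20:10} micrometres across.
```

Each `{…}` becomes an inline response field, so a single question can mix multiple-choice and numerical answers with immediate, answer-specific feedback.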
The statistical analysis of both examples suggests that the quizzes lead to an overall improvement in learners' achievement, which supports the belief that this type of formative assessment plays an important role in enhancing learner development.


Refs:
Ekins, J. (2007) ‘The use of interactive on-line formative quizzes in Mathematics’ [online], paper presented at the 11th International Computer Assisted Assessment Conference, 10–11 July 2007, pp. 163–75; http://caaconference.co.uk/pastConferences/2007/proceedings/Elkins%20J%20e2_formatted.pdf (Accessed 9 July 2014)

Griffiths, M., Mcvey, M. and Finlay, C. (2011) Developing e-assessment using the Quiz Activity within Moodle: Empowering student learning, University of Glasgow College of Medical, Veterinary and Life Sciences [Online]. Available from: http://www.gla.ac.uk/media/media_231899_en.pdf (Accessed 12 July 2014)

Saturday 12 July 2014

An Authentic Assessment

This is an example of a formative authentic assessment from a 'Presentation Skills' course.

It is an online assessment of a session on 'PowerPoint'. Following completion of the session, learners will be asked to produce a short autoplay PowerPoint presentation, demonstrating appropriate use of graphics, with an accompanying voice narrative. They will upload it to YouTube (optionally in privacy mode), share the link with their peers and invite feedback.

Activity Brief: 
Using no more than 10 slides, you will create a PowerPoint presentation to identify the following:
What you consider to be best practice in terms of planning PowerPoint presentations, formatting and presentation style
Pitfalls to avoid when creating a PowerPoint presentation
Types of learning and information sharing scenarios that can be enhanced through PowerPoint
How you will implement PowerPoint in the future, identifying what you will do differently as a result of the programme.
Share your link and invite your colleagues to provide feedback.

Assessment scores (each criterion is scored at 0%, 33%, 66% or 100%):

Organisation
    0%:   Badly organised, impossible to follow
    33%:  Some organisation, still jumps about
    66%:  Clear organisation following a linear path
    100%: Clear organisation in a logical sequence, with additional creativity

PowerPoint visuals
    0%:   No graphics, or graphics that are irrelevant or unclear
    33%:  Some clear graphics, but not relevant
    66%:  Clear, supporting graphics
    100%: Clear, supporting graphics, animated or containing hyperlinks

Delivery
    0%:   Incoherent mumbling
    33%:  Clear narrative, but not supporting the slides
    66%:  Clear narrative, supporting the slides
    100%: Clear narrative, supporting the slides and asking rhetorical questions to include the viewer

Demonstration of learning
    0%:   No demonstration of learning
    33%:  Generalised learning points
    66%:  Specific learning points identified, with means of implementing them in future
    100%: Specific learning points identified, with identification of their transferability to additional tasks and job roles