Preventing Too Little Too Late: A Novel Process of Continuous Curriculum Evaluation

Claire L. Colebourn, Oxford Radcliffe Hospitals NHS Trust
Linda Jones, Department of Clinical Education and Leadership, University of Bedfordshire



This study was designed in response to the implementation of the UK's first Critical Care Fellowship in Echocardiography (ultrasound of the heart). The curriculum is a two-year, mentored, self-directed learning process enrolling one senior trainee annually. This expansive, situated apprenticeship was designed and implemented by the first author and members of the Cardiology department.

This paper presents a novel evaluation process as applied to an emergent curriculum designed for a highly specific and mainly self-directed programme. We argue that whilst this case study describes a highly specialised tool applied in unique circumstances, the principles that emerge are applicable to emergent curricula being tested in other specialist post-graduate programmes. Evaluation methods are often post-hoc processes. We report on the power and potential for a cyclical action-reflection evaluation tool to maintain a novel self-directed postgraduate medical curriculum.

We adapted the qualitative process of 'Distance Travelled Analysis' (DTA) from vocational education to construct an evaluation tool, which was used to collect the primary data from the Echo Fellow (the singleton user of this unique curriculum) repeatedly over a five-month period. The quality and trustworthiness of the subjective primary dataset were examined through triangulation using a variety of techniques, including reflective conversations, spontaneous positive storytelling, face-to-face questionnaires and peer assessment. DTA translated successfully onto the study curriculum, offering purposeful and active evaluation. The tool identified a critical incident and alerted stakeholders to negative change, allowing contemporaneous rather than post-hoc action. An unexpected finding was that the DTA evaluation tool significantly impacted on both learner and tutor, creating an active learning exchange. The study findings expose the importance of reflective conversation as a balancing component of DTA.

This qualitative case study lends weight to the potential for continuous evaluation in high stakes curricula. While evaluation may bring to mind a post-hoc assessment of usefulness, we aim to help catalyse a paradigm shift towards a broader concept of evaluation as a tool for ongoing management of a curriculum. The process of DTA could offer curriculum evaluators an agile alternative to traditional retrospective methods, preventing evaluation from being 'too little, too late'.

Precious cargo: creating new curricula

A curriculum is distinct from a syllabus in that it describes not only the intended outcomes of learning but also the processes by which these may be achieved. The complex and active nature of a curriculum can be likened to '...a windmill which is eternally at work to accomplish one end, although it shifts with every variation of the weather-cock, and assumes ten different positions in a day' (Harden 1991). The implication is that a curriculum 'is' rather than 'does'. Therefore, maintenance of a curriculum should engage with tools which capture this fluidity and enable early and timely adaptation of the curriculum design and delivery.

New order perspectives on evaluation

The last fifty years have witnessed the National Health Service give power to the public, the new consumers of healthcare (Carr 1998). This frame-shift in ownership has demanded greater transparency from providers of healthcare and their teachers. Post-war investment in mandatory education and stakeholder interest in education quality launched the applied science of evaluation in the 1980s (Goldie 2006).

When, in 1993, 'Tomorrow's Doctors' probed the effectiveness of undergraduate teaching in medicine, the scene was set for the development of evaluation tools with the capacity to convincingly evidence medical educational process and effect (General Medical Council 1993). A review of the literature generated four key principles of good evaluation practice. Firstly, without consideration of all stakeholders, evaluation becomes purposeless and incomplete, risking partial or biased evaluation where there is evaluator investment in the curriculum (Mason 2009). Secondly, an effective evaluation tool must be purposeful, clear in its intent and encompass all relevant areas. Whether the purpose of evaluation is accountability, knowledge or development, transparent awareness of purpose prevents ineffectiveness and informs design (Chelinsky 1997). Contemporary evaluation practice can be viewed as an extended process of reflection (Goldie 2006). A third practice principle is, therefore, to view evaluation as a cyclical process of action reflection. 'Loop closure' helps prevent complacency and inactivity (Griffin and Cook 2009). Awareness of timeliness in curriculum evaluation provides us with a fourth practice principle. Viewing evaluation as an 'early warning system' can prevent derailment of learning. This is particularly relevant when the programme is prolonged and the stakes for patients are high (Griffin and Cook 2009).

Study context

This qualitative action research study reports on the cyclical evaluation of an innovative post-graduate Fellowship curriculum designed and implemented by the first author in August 2009 (Colebourn 2010). The study was originally conceived for the first author's Master's Thesis in Medical Education, supervised by the second author at the University of Bedfordshire. The work was carried out at a major teaching hospital in the Oxford region. The curriculum addresses a gap previously identified in the provision of echocardiography training for senior trainees in Critical Care Medicine, accommodating one Fellow annually (Colebourn 2010, Fox 2008). The programme runs over a two-year period and culminates in accreditation with the British Society of Echocardiography.

The programme is educationally underpinned using constructive alignment – an active educational process whereby teaching and learning are specifically designed to achieve purposeful aims (Biggs 2003) – and the development of a community of practice – a safe and inclusive apprenticeship learning environment highly suited to tutoring practical skills (Lave and Wenger 1991). The programme aims to teach a complex practical skill through mentored self-directed learning and therefore requires a 'rapid response' evaluation process incorporating the four key practice principles listed above.

In designing the evaluation process, we were mindful of a powerful image of the risks of post-hoc evaluation in this setting. '…evaluation at the end of a curriculum is like trying to turn a super-tanker around in mid-seas with adverse weather conditions...' (Griffin and Cook 2009).

Designing the study tool

Design of the evaluation tool was based on ideas from vocational education. DTA has been used successfully to demonstrate progress and the acquisition of skills by workers in vocational apprenticeship environments, but has rarely been translated to post-graduate education. DTA was chosen because it was thought an appropriate tool for evaluating hard-to-measure outcomes such as confidence, learner perception of knowledge acquisition, and the availability of staff support for self-directed professional development.

DTA takes the form of a series of statements chosen by the evaluator to intentionally evidence either positive or negative changes in the learner based on pre-specified areas of a curriculum (Department for Work and Pensions 2003). The user subjectively self-evaluates at intervals, using the statements to critique his or her own progress towards specified goals. A further intention in this study was to overcome the risk of the Fellow's personal professional development remaining part of a hidden curriculum, and thus under-addressed and under-evaluated within the pedagogic relationship. Hafferty (1998: 770) defines the hidden curriculum as '...interpersonal pedagogical influences and taken-for-granted rules about learning, structure and culture of an organisation, or processes, pressures and constraints which fall outside... the formal curriculum, and which are often unarticulated or unexplored'. It has been argued that hidden aspects of the curriculum are especially important in professional education, which characteristically includes prolonged periods of exposure to the predominant culture.

The evaluation tool

A set of items is designed and administered at the start and at intervals throughout the programme. Each statement is awarded a score which the learner feels reflects his or her current position. Over time, the statements can evidence either positive or negative changes. In the context of this study, DTA has considerable strengths, including the potential to incorporate multiple stakeholder perspectives, the direction of statements toward both explicit and hidden curricula, the opportunity for cyclical re-assessment, and the latent power of the statements as a reflective trigger for action.

Figure one shows the distance travelled statements used as the evaluation tool designed for this study. An Osgood scale quantifies user responses to each statement to maximise visual impact (Osgood 1957). The statements incorporate both hard and soft outcomes and address three pre-identified key areas of the programme:

  1. Explicit curriculum product: does the design of the fellowship curriculum fulfil its explicit objectives to teach echocardiography skills to the learner and facilitate their use with the critically ill? These explicit outcomes are associated with hard end-points such as passing written and practical examinations.
  2. Professional development: does the programme support the Fellow's professional development? In this setting, these endpoints relate to whether the learner is developing the self-awareness required to operate safely within the limits of personal ability, to identify specific learning goals, and to achieve those goals. This area also recognises the process of moving from peripheral involvement in the community of practice to full membership of it.
  3. Curriculum user satisfaction: do learning processes meet the needs of this learner? In this case study, the question relates to statements evidencing the structure and content of the programme and the learner's relationship with his or her mentor.

Figure one: study evaluation tool

Please indicate your response to the following questions. Ten is the strongest response and one the weakest.


A full explanation of the process of DTA was provided and informed consent was obtained from the Fellow who then completed the evaluation tool at four time points over a five-month period. The statements were completed without interference from the study author, thus providing the primary study data.
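For readers who wish to operationalise a similar process, the mechanics of DTA scoring can be sketched in a few lines of Python. This is a hypothetical illustration, not software used in the study: the function names and the alert threshold are our own, and the satisfaction scores are illustrative values chosen to mirror the fall from 10 to 5 at week 10 reported in this study.

```python
# Minimal sketch of a distance-travelled tracker (illustrative, not study software).
# Scores are the learner's 1-10 Osgood-scale self-ratings for one statement,
# recorded at each study time point (the weeks are those used in this study).

ALERT_DROP = 3  # hypothetical threshold: flag a fall of 3+ points between time points


def distance_travelled(scores):
    """Overall change from the first rating to the latest rating."""
    return scores[-1] - scores[0]


def red_flags(weeks, scores, threshold=ALERT_DROP):
    """Return (week, fall) pairs where a rating fell sharply between
    consecutive time points, so stakeholders can act contemporaneously."""
    flags = []
    for i in range(1, len(scores)):
        fall = scores[i - 1] - scores[i]
        if fall >= threshold:
            flags.append((weeks[i], fall))
    return flags


weeks = [5, 10, 17, 21]
satisfaction = [10, 5, 9, 10]  # illustrative: 'the programme is meeting my learning needs'

print(distance_travelled(satisfaction))  # 0: an end-point-only view sees no problem
print(red_flags(weeks, satisfaction))    # [(10, 5)]: the week-10 critical incident
```

The sketch makes the paper's central point concrete: comparing only the first and last ratings shows no net change, whereas interval-by-interval comparison surfaces the mid-programme dip in time to act on it.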

Triangulation processes

In this case study, the data were primarily the subjective responses of a single Fellow, and so additional methods and triangulation were deemed appropriate to validate and enhance the credibility of the data gathered and the interpretations made. 'Triangulation is less a tactic than a mode of inquiry. By self-consciously setting out to collect and double-check findings, using multiple sources and modes of evidence, the researcher will build the triangulation process into ongoing data collection' (Huberman and Miles 1994: 438).

The explicit curriculum product was triangulated through an interim practical assessment undertaken by the Fellow at the end of the five-month study period. Professional development was triangulated through spontaneous positive storytelling in ad hoc conversations with the wider team and a face-to-face survey of four Deanery colleagues of the Fellow. The survey included three requests for inter-professional respondents to give an example of a positive behaviour and the frequency with which they had observed it in the Fellow. It was made clear to each respondent that no examples of negative behaviours should be given, to guard against negative impact on the Fellow whilst the programme was being evaluated. The Fellow's satisfaction scores were triangulated through reflective conversations between learner and mentor. The intention of these conversations was to allow the learner to express both negative and positive views on the progress of his or her learning as he or she experienced it at the time. This allowed the mentor to find ways of improving the learner's experience and learning outcomes, and to adjust the curriculum during, rather than at the end of, the programme.

Primary data

Figure two shows the scores attributed to each statement at the four time points during the five-month study period. The figures show the statements grouped according to the three key evaluation aims:

2a. Explicit curriculum product: addressed by the statements:

My current confidence rating in practical echocardiography is
My current knowledge rating in echocardiography is
I am comfortable teaching others echocardiography skills within my current capabilities
Current confidence that I will achieve accreditation within two years

2b. Professional development: addressed by the statements:

I am setting self-directed learning goals
I am meeting those self-directed learning goals
I am identified by others as part of the ICU echo team
I feel confident enough to offer others my opinion on echocardiography findings (in line with my ability)

2c. Curriculum user satisfaction: addressed by the statements:

The Fellowship programme is meeting my learning needs
Rate mentor availability
Rate mentor effectiveness

Figure 2a

Figure 2b

Figure 2c

In addition, the curriculum product – the Fellow's echocardiography skills and knowledge – was assessed by the author at the end of the five-month study period using a practical examination and questions based around three cases picked at random. The Echo Fellow successfully completed the assessment. Evidence of the Fellow's professional development is highlighted in the spontaneous positive storytelling paraphrased by the study author.

'The Fellow thinks the echo machine is theirs...they get annoyed when anyone leaves it in a mess.' 'I am amazed at how well the Fellow can get good images, I wish I could do that.' 'I have really noticed a huge improvement in the Fellow's confidence.' 'The Fellow saved our bacon yesterday. They came over and scanned a very sick patient for us, it really made such a difference.'

Differing degrees of user satisfaction with the curriculum became evident at each point. The following extracts show key 'learning/action points' agreed and distilled between the Fellow and the author at each of the four study time-points. The phrases and excerpts from the author's reflective diary are chosen to illuminate the quality and nature of the conversation between the Fellow and the author. They also highlight the importance of ongoing rather than end point evaluation as this enabled timely and reflexive responses to learner need.

Key learning and action points

Week 5: Learning/action points:

  1. Self-directed learning goals need to be more specific.
  2. The aim of the first five weeks was practical competence and this is progressing well.
  3. We now need to add a second focus and develop a knowledge base.

Mentor's reflection

'The echo fellow is taking ownership of their process and going off to the department and meeting people and accessing training, but I am unsure that this is currently meeting the expectations of the echo fellow. I am finding it hard to let go and let them drive their own process, but I am also aware of the curriculum intent to generate a self-directed learner. I need to sit on my hands for the time-being.'

Week 10: Learning/action points:

  1. The number of studies performed has dropped off dramatically.
  2. The reduction in the number of studies is causing anxiety.
  3. In order to be ready for interim competency assessment planned for week 20, we need to guarantee more studies per week.

Mentor's reflection

'I have been standing on the side-lines allowing the echo fellow to develop their own community of practice which they have managed to an amazing degree, but I need to be more directive now. I did not realise quite how they were feeling until I saw the statement response fall from 10 to 5.'

Week 17: Learning/action points:

  1. To be specific when setting self-directed learning goals.
  2. Look to the next 6-8 months and start planning the dissertation project.

Mentor's reflection

'The echo fellow has established a firm identity within their peer group and they are starting to understand the need to be focused when planning self-directed learning.'

Week 21: Learning/action points:

  1. There is room for improvement in some practical aspects.
  2. The need for a week-by-week revision guide is now paramount (eight weeks until the written exam).

Mentor's reflection

'I feel secure that the echo fellow will progress to completion within two years. They appear happy in their new community of practice and their confidence is building as they become established in a routine.'


Limitations

The findings of this study are subject to the limitations of insider research, since the author is also programme lead for the Fellowship and engaged in a one-to-one teaching relationship with the Fellow (Mason 2009).

The unilateral nature of the triangulation processes used to evidence professional practice is also acknowledged. Both the structured questionnaire and the relay of spontaneous positive storytelling are inherently skewed towards the positive. Although necessary for ethical reasons, some may consider this as a methodological flaw.

Triangulation of qualitative data does not claim to provide proof or objectively test validity. It does, however, ground the analysis firmly in practice by corroborating the primary data with real-world reflectors, enhancing credibility (Mays and Pope 2006). The design of this research embraces a variety of methodologies to meta-evaluate the evaluation tool, enhancing the emerging practice descriptors.

Main findings

The power of embracing negative indicators

Figure 2c illustrates how the evaluation tool was able to capture a 'critical incident' within the curriculum through the Fellow's response to the statement 'the curriculum is meeting my learning needs' at week 10. This is corroborated by evidence from the author's reflection at this time. The effect of action taken in response to these negative indicators is shown by a sharp rise in the Fellow's score, attributed to the same statement at week 17, and the author's subsequent reflection.

The normalisation of ongoing evaluation enabled the Fellow to express his or her disquiet without implicating individuals: the tool was focused on the curriculum itself. This seems particularly relevant in a one-to-one or small group teaching situation to avoid a sense of 'personal attack' through evaluation. The deliberate inclusion of 'fuzzy' statements may, therefore, enhance our ability to 'diagnose the learner' by providing a deliberate forum for the learner to safely express negative indicators (Grow 1991).

Evaluation agility

The evaluation tool was also able to identify positive and negative aspects of the curriculum simultaneously. Figures 2a and 2c demonstrate strong positive changes in the Fellow's self-assessment of his or her practical skills and knowledge over the first ten weeks of the programme, despite the simultaneous precipitous fall in satisfaction with the programme. This evidences the inherent agility of DTA. Evaluation of different aspects of the curriculum through different statements allows evaluators to address focused issues without 'throwing the baby out with the bathwater' and threatening positive learning process.

Triangulation processes may enhance DTA in practice

Meta-evaluation was used to triangulate the primary information provided by the evaluation tool. However, the process of triangulation itself appeared to strengthen the evaluation. For example, the 'red flag' raised by the Fellow's response to the statement 'the curriculum is meeting my learning needs' at week 10 was converted into activity through the reflective conversation it prompted between the author and the Fellow. Conversely, interplay between the statements and the author's reflective process at week five maintained crucial inactivity by the author at a time when the Fellow's responses to the statements in Figure 2b clearly show the generation of their community of practice and identity (Schon 1987).

Evaluator perspective and subject expertise in mentors

In this study, reflection linking the Fellow's statement scores with the author's opinion of progress required that the author, in the role of mentor, was adequately knowledgeable in echocardiography. The term 'mentor' in this and similar contexts implies adequate skills and knowledge to teach and coach a mentee, often referred to as being 'expert'. The 'expert' status of the author was vital since the Fellow was subjectively diagnosing his or her own progress using DTA; without the relevant expertise, the author would not have been able to triangulate progress adequately. It is therefore unlikely that mentors without expertise in the evaluated field of study will be able to exploit the full potential of DTA as an evaluation process. This informs practice: to counterbalance the need for 'expert' mentors in DTA, the evaluator/mentor must remain aware of his or her own ontological perspective when writing DTA statements (Mason 2009). In this study this is evidenced by the three least sensitive statements: 'rate mentor availability', 'rate mentor effectiveness', and 'confidence I will achieve accreditation within two years'. The Fellow awarded these statements consistently high ratings from the outset of the programme, as shown in Figures 2a and 2c; the statements did not invite the Fellow to express a sense of progress over time. This informs effective use of DTA: such statements could be activated by encouraging an accurate reflection of perceived progress, whether negative or positive, in the context of the whole curriculum.

Distance travelled statements can evidence the hidden curriculum

DTA proved highly effective for both monitoring and encouraging professional development (Fuller 2003). Statements evidencing professional development, shown in Figure 2b, mirror both the author's reflection and the evidence received from the questionnaires and spontaneous storytelling as shown in the results section. This may inform future practice. Statements focusing on desired professional behaviours may have helped to cultivate those behaviours by raising awareness. To maximise this impact, professional behaviours could be more explicitly defined by the statements encouraging awareness and bringing the hidden curriculum forward (Beckman 2009, Ende 1983).

This simple distance travelled evaluation tool captured gaps and problems in curriculum design and delivery in this Fellowship programme. By involving the learner in critical reflection on, and scoring of, learning opportunities and support, adaptations and corrections could be made quickly. The usefulness of this tool and approach to ongoing evaluation of curriculum design and delivery has been demonstrated. No claims for generalisability of the broad descriptors in this single practitioner case study would be appropriate. Nonetheless, it seems in principle that regular measurement of student satisfaction and experience could be up-scaled and applied to larger or bespoke programmes. The tool and its application would need to be tested and adapted for larger cohorts, but arguably offer a model approach for capturing curriculum issues before it is too late for learners to benefit from adaptations.


Conclusions

The findings of this study evidence the effectiveness of DTA as a contemporary cyclical evaluation tool. In accordance with the four pre-identified practice principles, DTA exhibited the breadth to represent various stakeholder perspectives, the ability to provide purposeful and active evaluation through a reflection-action process, and timeliness in the identification and management of critical incidents.

The study findings have provided broad descriptors for the effective use of DTA. When designing statements for a one-to-one teaching process, the evaluator must be mindful of his or her ontological perspective and actively seek negative indicators. DTA is unlikely to be effective where the evaluator is non-expert, and the tool should be viewed as the stimulus for an essential complementary action-reflection process. Statements should accommodate negative change without blame, and specific statements should be used to focus the learner on both explicit and hidden curriculum aims. We suggest similar approaches could be adapted for ongoing evaluation and are worthy of further testing and study. The experimental technique of DTA proved to be an agile and effective tool for the maintenance of a high stakes postgraduate medical curriculum. Post-hoc methods of evaluation face a significant challenge to their continued role in contemporary medical education.


Beckman, TJ & Lee, MC. (2009) 'Proposal for a collaborative approach to clinical teaching',  Mayo Clinic Proceedings, 84(4), pp.339-344.

Biggs, J. (2003) Teaching for quality learning at University.  2nd edition. Buckingham: Open University Press.

Carr, W. (1998) 'The curriculum in and for a democratic society', Curriculum Studies, 6(3), pp.323-340.

Chelinsky, E & Shadish, WR. (1997) Evaluation for the 21st century: a handbook. Thousand Oaks, CA: Sage.

Colebourn, CL, Davies, IKG & Becher, H. (2010) 'Bridging the gap: training critical care clinician-echocardiographers through a collaborative curriculum', JICS, 11(1), pp.13-16.

Department for Work and Pensions (2003) A practical guide to measuring soft outcomes and distance travelled. London: Department for Work and Pensions.

Ende, J. (1983) 'Feedback in clinical medical education', JAMA, 250, pp.777-781.

Fox, K. (2008) 'A position statement; echocardiography in the critically ill. On behalf of a collaborative working group of the British Society of Echocardiography', JICS, 9, pp.197-98.

Fuller, A. & Unwin, L. (2003) 'Towards expansive apprenticeships', Journal of Education and Work,  16(4), pp.407-426.

General Medical Council (1993) Tomorrow's doctors: recommendations on undergraduate medical education. London: General Medical Council.

Goldie, J. (2006) 'AMEE Education Guide no.29: Evaluating educational programmes', Medical Teacher, 28(3), pp.210-224.

Griffin, A. & Cook, V. (2009) 'Acting on evaluation: twelve tips from a national conference on student evaluations', Medical Teacher, 31, pp.101-104.

Grow, GO (1991) 'Teaching learners to be self-directed',  Adult Education Quarterly, 41(3), pp.125-149.

Harden, RM. (1991) 'AMEE Guide No. 14: Outcome-based education: part 1 – an introduction to outcome-based education', Medical Teacher, 21(1), pp.7-14

Hafferty, F.W. (1998) 'Beyond curriculum reform: confronting medicine's hidden curriculum', Academic Medicine, 73(4), pp. 403-407.

Huberman, A.M. & Miles, M.B. (1994) 'Data management and analysis methods', in Denzin, N.K. & Lincoln, Y.S. (eds.) Handbook of qualitative research. Thousand Oaks, CA: Sage.

Lave, J. & Wenger, E. (1991) Situated learning: legitimate peripheral participation. Cambridge: Cambridge University Press.

Mason, J. (2009) Qualitative research. 2nd edn. London: Sage publishing.

Mays, N. & Pope, C. (2006) Qualitative research in health care. 3rd edition. Oxford: Blackwell BMJ Publishing.

Miles, M.B. & Huberman, A.M. (1994) Qualitative data analysis: an expanded source book. London: Sage Publishing

Osgood, CE, Suci, GS, & Tannenbaum, PH. (1957) The measurement of meaning. Urbana, IL: University of Illinois Press.

Schon, D. (1987) 'Preparing professionals for the demands of practice' in Schon, D. Educating the Reflective Practitioner: towards a new design for teaching and learning in the professions. San Francisco: Jossey-Bass Inc.


Centre for Learning Excellence
University of Bedfordshire
University Square
Luton, Bedfordshire

