Student and staff perceptions and experiences of the introduction of Objective Structured Practical Examinations: A pilot study
Department of Physiotherapy, University of the Western Cape, Bellville, Cape Town, South Africa
Background. The Objective Structured Practical Examination (OSPE) is widely recognised as one of the more objective methods of assessing practical skills in healthcare programmes, including undergraduate physiotherapy curricula.
Objectives. To obtain feedback from both students and staff who were involved in the introduction of an OSPE in 2011, in order to refine and standardise the format throughout the curriculum.
Methods. A qualitative research design was used. Data were gathered through a questionnaire with semi-structured open-ended items and a focus group discussion. Participants were all third-year undergraduate physiotherapy students (N=47) and all staff members (N=10) in the 2011 academic year who were exposed to the OSPE format or were involved in the first OSPE.
Results. The main concerns raised by both students and staff were: (i) pressure due to time constraints and how this might affect student performance; and (ii) the question of objectivity during the assessment. However, these initial concerns shifted once participants had experienced the OSPE, which they came to view more positively owing to the structure and objectivity of its implementation.
Conclusion. While both students and staff reported positive experiences, the challenges that emerged provided valuable insight into refining the OSPE format in this undergraduate physiotherapy department.
AJHPE 2013;5(2):72-74.
DOI:10.7196/AJHPE.218
Assessment of clinical competence is an essential component of health professions education, requiring educators to make informed decisions that measure students’ clinical knowledge and skills accurately. Such clinical assessments have often been challenged by a lack of objectivity. The Objective Structured Clinical Examination (OSCE) was originally developed in Dundee in the mid-1970s1 with the aim of assessing clinical competence in an objective, structured way. Bartfay et al.2 and Major3 highlighted the fact that using the OSCE format introduces standardisation that aims to improve objectivity in assessment. When using the OSCE method, clinical competencies are assessed as students move through a number of ‘stations’ where they are individually graded using precise criteria in the form of a checklist.
The term Objective Structured Practical Examination (OSPE) was derived from the OSCE in 1975, when it was modified to include practical examination.4 The OSPE, like the OSCE, tests students’ ability to perform a practical skill rather than what they know. However, while the OSCE focuses on assessing clinical competence, the OSPE is designed to assess competence in performing a practical skill outside the clinical context.
The OSPE has several distinct advantages over other forms of practical assessment, including the fact that it can be used as a summative assessment to evaluate individuals’ performance in the practical skills component of the module, as well as for formative evaluation where the student gets feedback as part of the learning process. In addition to its role in assessment, an OSPE includes a focus on the individual competencies being tested, and the examination covers a broader range of practical skills than a ‘traditional’ examination.5 The traditional examination in this department was an unstructured evaluation of different techniques, and was neither valid nor reliable, since every student was seen by a different examiner and given a different assessment task. In the OSPE, an individual’s ability to perform a technique is tested in a more objective manner because all candidates are exposed to the same predetermined set of techniques and questions, which minimises the subjectivity of the assessment.6
Mastering practical skills is an important aspect of a course such as physiotherapy,7 which means that the assessment component will influence students’ learning strategies.8 However, if an assessment task is to achieve the desired outcome, it has to employ instruments that yield valid, accurate data that are consistent and reliable. In addition, inter-rater variability among examiners can be large, driven by differences of opinion rooted in the subjective perceptions of individual examiners.9 This lack of objectivity among examiners assessing practical skills was a problem area identified in this undergraduate physiotherapy department in the Western Cape, South Africa, and a departmental decision was made to pilot the OSPE. The aim of this study was to determine the perceptions and experiences of students and staff following the introduction of the OSPE format in the department.
Since the OSPE was a new format for assessing practical competence, specifically developed to enhance objectivity, students and staff were approached and asked to describe their experiences and perceptions of the process following its initial implementation. The importance of both student and staff attitudes towards, and perceptions of, the training programme in undergraduate health professions education was acknowledged.
Method
Design
The study utilised qualitative data-gathering methods in the form of a questionnaire with open-ended questions and a focus group. A focus group was chosen because it encourages participants to share ideas and experiences, creating meaning that may not have emerged independently.10 Both staff and students completed the questionnaire immediately after the first OSPE in the department in March 2011; only staff members were asked to participate in the focus group discussion.
Setting and sample
The survey sample included all third-year undergraduate physiotherapy students (N=47) who were registered for the 2011 academic year. They were the first to be exposed to the OSPE format. All staff members (N=10) involved in the OSPE were also included. One year later, a focus group discussion was held among the staff members who were involved in the initial implementation of the OSPE (N=8). This delay allowed the assessment format to be developed and refined based on student and staff experiences and informal feedback.
OSPE implementation
The OSPE was conducted in most of the core physiotherapy modules in the third year of the programme and consisted of four stations, each assessing one practical skill. Two parallel tracks were used to move students through the stations more quickly, so that eight stations were used to test four practical skills. Students proceeded through each station, completing a practical technique and answering a related theoretical question. An additional two people assisted the staff members conducting the OSPE, one handling the logistics of moving students between venues, and the other keeping time (students had to complete each station within a predetermined period). The lecturer responsible for the module prepared assessment rubrics for each station, set up stations with the necessary equipment, ensured that there were enough examiners, and selected a spacious venue for the OSPE. Rubrics were reviewed by all examiners involved in the OSPE before the assessment date.
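The rotation described above can be sketched as a simple schedule. The Python snippet below is purely illustrative and not part of the study: the skill names, the cohort size per wave and the 10-minute station period are assumptions, since the article does not specify them. It models four skills duplicated across two parallel tracks, with all students advancing one station per timed round.

```python
# Illustrative sketch of the OSPE rotation: four practical skills
# duplicated across two parallel tracks (eight stations in total),
# with every student advancing one station per timed round.

SKILLS = ["Skill A", "Skill B", "Skill C", "Skill D"]  # hypothetical skill names
TRACKS = 2                                             # parallel tracks
MINUTES_PER_STATION = 10                               # assumed; not stated in the article


def rotation_schedule(students):
    """Assign each student to a station for every timed round.

    Students alternate between the two tracks; within a track,
    everyone advances one station per round, so after len(SKILLS)
    rounds each student has completed every skill exactly once.
    """
    schedule = []
    for rnd in range(len(SKILLS)):
        assignments = []
        for idx, student in enumerate(students):
            track = idx % TRACKS                           # alternate students between tracks
            station = (idx // TRACKS + rnd) % len(SKILLS)  # advance one station per round
            assignments.append((student, track + 1, SKILLS[station]))
        schedule.append(assignments)
    return schedule


if __name__ == "__main__":
    cohort = [f"Student {i + 1}" for i in range(8)]  # one wave fills all eight stations
    for rnd, assignments in enumerate(rotation_schedule(cohort), start=1):
        print(f"Round {rnd} ({MINUTES_PER_STATION} min):")
        for student, track, skill in assignments:
            print(f"  {student}: track {track}, {skill}")
```

Because all stations run on the same fixed period, a single timekeeper, as described above, can signal every rotation simultaneously.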
Data collection instrument
Two instruments with open-ended questions were developed to collect data for this study. The seven questions put to students focused on concerns, time issues, challenges, improvements, the impact of the change in format for the assessment of practical skills, and positive and negative aspects of the OSPE. The four questions posed to the staff focused on concerns, challenges, improvements, and the number of practical skills covered. The items were based on reviews of the available literature and were circulated among academic staff in the physiotherapy department for face and content validity. The focus group discussion was conducted with the staff members and lasted 45 minutes.
Procedure
Students were given the questionnaire as they left the assessment venue and asked to complete it on the same or the following day. Staff completed the questionnaire directly after the OSPE assessment. After one year of using the OSPE process in the department, staff were invited to participate in a focus group discussion.
Data analysis
Survey data were transcribed into word processing files. Focus group data were transcribed verbatim. Themes were identified from the transcriptions by two reviewers and areas of disagreement were discussed until consensus was reached. The open-ended questions were analysed using Braun and Clarke’s11 six-phase guide to conducting a thematic analysis.
Phase 1 involves familiarisation with the data, and phase 2 the generation of initial codes. Themes are then identified (phase 3), reviewed (phase 4), and defined and named (phase 5), before the results are reported in phase 6.
Ethical considerations
Permission to conduct the survey was obtained from the head of the physiotherapy department and informed consent was obtained from all participants. Both students and staff were informed that they were not required to participate, and non-participation did not negatively affect either staff members or students. Anonymity and confidentiality were ensured by not gathering personally identifiable data.
Results and discussion
The response rate was 20/47 (42.6%) among the students and 7/10 (70.0%) among staff. Four themes were identified in the responses of students and staff regarding the use of the OSPE: time (initial reaction v. post event), increased pressure, role of the examiner, and format of the OSPE.
Time
The length of time allocated per station was highlighted as a concern by all participants; both staff and students worried that it would not be enough.
‘… I was also concerned about logistical issues such as “fixed” time constraints imposed on students to conduct the necessary tasks in a specific time frame, and as they need to stick within the given time frame.’ (Staff member)
‘… Are we going to get enough time to do everything?’ (Student)
‘… would not have time to mentally process the question and perform the treatment. Basically time constraints …’ (Student)
These concerns are similar to those highlighted by Abraham et al.,12 who reported in a quantitative study that more than 50% of the participants felt that time was a concern during an OSPE. In addition, Hasan et al.13 indicated that although time does seem to be a problem with the OSPE, it should not become an exercise in how fast students can perform the technique, but rather focus on how well they can perform it. As a result, one of the changes made to the approach has been that lecturers consult with one another about the time needed to complete each station. Each station has to be completed within the same length of time, and stations run simultaneously: all students start each station at the same time, stop at the same time, and then move on to the next station.
Increased pressure
The OSPE appears to create more pressure than ‘traditional’ practical assessment methods, and is therefore more stressful.12 This concern was raised by both the students and the examiners in the current study. Staff felt pressured to hurry through their instructions to the students, and students felt intimidated, which did not allow them to perform to the best of their ability.
‘… There was tremendous pressure on the examiner to give instructions and to ask the question …’ (Staff member)
‘… was too intimidating, could not perform to my best ability …’ (Student)
This anxiety was reported at the beginning of the process, but the literature indicates that students’ anxiety tends to decrease after the assessment begins14 and that they generally perform well.15 The increased anxiety could also be attributed to staff and students being asked to do something new. Despite the initial stressful experience, students have come to view OSPEs in a favourable light in this department.
‘… It was okay; I only had to concentrate on one task at a time.’ (Student)
‘… Yes, but positively. Everything was more equal.’ (Student)
‘… Yes, in the end I felt it was long enough to get to the station and have a bit of a “breather”.’ (Student)
Role of the examiner
The question of prompting students was raised by both staff and students. The feeling among the staff was that there was a need for consistency from one student to the next. According to Major,3 in an OSPE/OSCE the examiner is assigned to one station and measures students’ performance using a predetermined checklist or rubric. Objectivity is ensured by setting out standards such as no prompting, all students receiving the same instructions, and having a rubric guide for allocating marks. Each staff member is then informed of the requirements at each station before the OSPE/OSCE commences.
‘The lecturers should ask the question in the same way, and if they are going to give a hint, then the next person should get the same treatment.’ (Staff member)
Students, on the other hand, felt that the examiner/lecturer should be allowed to assist more.
‘I understand that no prompting was allowed, but if the student is misinterpreting the question, could the lecturer maybe steer them in the right direction?’ (Student)
These challenges highlighted by both the students and the staff indicate that there is still a need to improve the way in which the OSPE assessment is currently conducted. Efforts should therefore be made to ensure that both staff and students experience the OSPE as an objective assessment for all. When the OSPE process was implemented in this department, staff members decided that, in order to maintain objectivity, no hints would be given to students.
Format of the OSPE
Concerns were initially expressed by staff members that the new format might be problematic for the students. The main concerns included the change to the new format, the length of time allocated to each station, and the understanding of the roles of the examiner and the student. Before the OSPE, the lecturer responsible for the module should ensure that all staff and students know what is expected of them.
‘I was concerned whether students would be able to make the transition from the old format to the OSPE format.’ (Staff member)
‘I found the [theory] question of the task disturbed the students’ thought processes on executing the activity.’ (Staff member)
However, it became evident that students experienced the new format in various ways.
‘Well organised and efficient.’ (Student)
‘It may have impacted on my performance as I was rushing as I did not know what to expect for the first round.’ (Student)
The main advantage of an OSPE format is that it improves the objectivity of the assessment by ensuring that each student performs the same technique in front of the same examiner. In addition, when questions are included in the assessment, they are uniformly presented to students. Finally, the presence of the checklist and rubric means that all students are assessed in a fair and accurate manner, as all examiners base their marks on the same performance criteria.16
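As a simple illustration of how a shared checklist standardises marking, the sketch below scores every student against the same predetermined criteria. The criteria and mark allocations are invented for illustration; the article does not publish its rubrics.

```python
# Hypothetical checklist-based rubric for one OSPE station: every
# student is marked against identical, predetermined criteria, so
# all examiners base their marks on the same performance standards.

RUBRIC = {  # invented criteria and mark allocations
    "Correct patient positioning": 2,
    "Correct hand placement": 3,
    "Safe and effective technique": 3,
    "Answers theory question correctly": 2,
}


def score(observed):
    """Return (marks awarded, total possible) for one student.

    `observed` maps each rubric criterion to True/False, as ticked
    off by the examiner on the checklist.
    """
    awarded = sum(marks for criterion, marks in RUBRIC.items() if observed.get(criterion))
    return awarded, sum(RUBRIC.values())


if __name__ == "__main__":
    checklist = {
        "Correct patient positioning": True,
        "Correct hand placement": True,
        "Safe and effective technique": False,
        "Answers theory question correctly": True,
    }
    awarded, total = score(checklist)
    print(f"Station mark: {awarded}/{total}")  # prints: Station mark: 7/10
```

Fixing the criteria and marks in advance is what removes examiner-to-examiner variation: the only judgement left to the examiner is whether each criterion was met.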
Conclusion
This pilot study provides insight into the challenges experienced when introducing the OSPE assessment format into the undergraduate curriculum. The study determined the experiences and perceptions of students and staff members who were involved in the initial implementation of the OSPE in this physiotherapy department. The main challenges raised by both students and staff were the impact of the time constraint on student performance, and examiner objectivity during the OSPE. The OSPE remains a more objective method of assessment than the traditional method that was previously used in the physiotherapy department. This pilot study provided valuable feedback in the process of refining and standardising the OSPE format in the department. Major outcomes that emerged following evaluation of the process were that lecturers now work collaboratively to plan the assessments, and that standardised assessment methods produce less anxiety in the students as they become more familiar with the OSPE. Finally, evaluation of the teaching and learning process was identified as an essential aspect of improving practice and continues to be used in the department.
References
1. Harden RM, Gleeson FA. Assessment of clinical competence using an Objective Structured Clinical Examination (OSCE). ASME Medical Education booklet No. 8. Med Educ 1979;13(1):39-54. [http://dx.doi.org/10.1111/j.1365-2923.1979.tb00918.x]
2. Bartfay WJ, Rombough R, Howse E, LeBlanc R. The OSCE approach in nursing education: Objective structured clinical examinations can be effective vehicles for nursing education and practice by promoting the mastery of clinical skills and decision-making in controlled and safe learning environments. Can Nurs 2004;100(3):18-25.
3. Major D. OSCEs – seven years on the bandwagon: The progress of an objective structured clinical evaluation programme. Nurse Educ Today 2005;25(6):442-454. [http://dx.doi.org/10.1016/j.nedt.2005.03.010]
4. Harden RM, Cairncross RG. Assessment of practical skills: The objective structured practical examination (OSPE). Studies in Higher Education 1980;5(2):187-196. [http://dx.doi.org/10.1080/03075078012331377216]
5. Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993;39(2):82-84.
6. Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D. The objective structured clinical examination (OSCE): Optimising its value in the undergraduate nursing curriculum. Nurse Educ Today 2009;29(4):398-404. [http://dx.doi.org/10.1016/j.nedt.2008.10.007]
7. World Confederation for Physical Therapy. WCPT Guideline for Physical Therapist Professional Entry Level Education. London: WCPT Secretariat, 2011:1-42.
8. Scouller K. The influence of assessment method on students’ learning approaches: Multiple choice question examination versus assignment essay. Higher Education 1998;35(4):453-472.
9. Boursicot K, Roberts T. How to set up an OSCE. Clin Teach 2005;2(1):16-20. [http://dx.doi.org/10.1111/j.1743-498X.2005.00053.x]
10. Babbie E, Mouton J. The Practice of Social Research. Cape Town: Oxford University Press, 2006.
11. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3(2):77-101. [http://dx.doi.org/10.1191/1478088706qp063oa]
12. Abraham R, Raghavendra R, Surekha K, Asha K. A trial of the objective structured practical examination in physiology at Melaka Manipal Medical College, India. Adv Physiol Educ 2009;33(1):21-23. [http://dx.doi.org/10.1152/advan.90108.2008]
13. Hasan E, Ali L, Pasha A, Arsia J, Farshad S. Association of the pre-internship objective structured clinical examination in final year medical students with comprehensive written examination. Med Educ Online 2012;17:1-7. [http://dx.doi.org/10.3402/meo.v17i0.15958]
14. Brosnan M, Evans W, Brosnan E, Brown G. Implementing objective structured clinical skills evaluation (OSCE) nurse registration programmes in a centre in Ireland: A utilisation focused evaluation. Nurse Educ Today 2006;26(2):115-122. [http://dx.doi.org/10.1016/j.nedt.2005.08.003]
15. Nicol M, Freeth D. Learning clinical skills: An interprofessional approach. Nurse Educ Today 1998;18(6):455-461. [http://dx.doi.org/10.1016/S0260-6917(98)80171-8]
16. Wolf K, Stevens E. The role of rubrics in advancing and assessing student learning. The Journal of Effective Teaching 2007;7(1):3-14.