Integrated practical examination: a novel approach to evaluate undergraduate medical students in physiology practicals

Background: The development of a reliable and valid method to assess laboratory exercises in the preclinical sciences is a challenging task. The use of different assessment methods helps assess various aspects of clinical competence. The Integrated Practical Examination (IPE) was thus incorporated as an assessment tool in physiology at Melaka Manipal Medical College (Manipal Campus), India, aiming to test a wide range of practical skills and to improve the validity of our practical examinations.


Introduction
Student evaluation is useful to assess knowledge and comprehension as well as skills and attitudes. Every assessment method possesses its own merits and demerits, and each has a place depending on context, relevance and resources (Jones et al., 1999). The challenge then is to find the most appropriate tool for a specific purpose and the best set of tools for the spectrum of components of interest. The validity of the information provided by classroom tests depends on the care that goes into the planning and preparation of the tests (Gronlund, 1985).
The conventional practical examination in Physiology at many medical schools in India consists of the actual performance of two experiments, a major and a minor, by the student.
Of late, however, this kind of assessment has been questioned because it is regarded as too narrow, selective and insufficiently comprehensive to test the various aspects of practical skills and attitudes needed by doctors (Harden et al., 1975). Recognising the inadequacy of the conventional practical examination, the objective structured practical examination (OSPE) is being widely used in many medical schools because of its objectivity and reliability (Nayar et al., 1986). OSPE has an edge over the conventional method as it incorporates a variety of test methods and allows all students to be examined uniformly on content and time, which is not feasible in traditional methods. However, the practical skills essential to medical students cannot always be tested by OSPE. OSPE may reduce some of the problems inherent in traditional subjective evaluation, but its validity needs serious attention. Thus an integrated approach to evaluating laboratory experiments is necessary.
Considering the merits and demerits of both the conventional method and OSPE, an integrated practical examination (IPE) in Physiology was developed at our institution, in which the performance type of examination (PE) is used in conjunction with OSPE.

MBBS program:
The undergraduate medical course at Melaka Manipal Medical College (Manipal Campus), Manipal, is a five-year, intensive academic program. Students are taught the basic science subjects in the first year, which include anatomy, physiology and biochemistry. The first-year curriculum is spread over four blocks, each of ten-week duration. There are two hours of physiology practicals every week.

Evaluation methods:
The practical examination in Physiology is conducted towards the end of each block. For the study sample, the examination was administered in the form of an objective structured practical examination (OSPE) in the first and third blocks and a performance exercise (PE) in the second block.
The integrated practical examination, which includes both OSPE and PE, was administered in the fourth block, when all practical exercises had been covered. Student performances in OSPE and PE were studied separately in the fourth block with three batches of students (March 2003, September 2003 and March 2004). In each batch, the class was divided into four smaller groups for each of these examinations, as the laboratory facility could not accommodate the whole class at once. An answer key for each of the OSPE stations was prepared and reviewed by the faculty for the evaluation process, and the answer scripts were then marked by the faculty according to the answer key. OSPE carried 40 marks, which were later scaled down to 20 marks for the purpose of comparison.

2. Performance exercise (PE)
Following the OSPE, students were subjected to the PE. Each session was of two-hour duration. Students picked a card on which a seat number was indicated and occupied the respective seat; at each station there was a different combination of a major and a minor experiment. Each student had to perform a major and a minor exercise in the presence of the examiner. Simulated patients were used for most of the experiments. The major and minor experiments were selected taking into account factors such as the complexity and the length of the experiment. The PE included exercises such as clinical examination of the cardiovascular and respiratory systems, recording of blood pressure and determination of vital capacity.

The correlation coefficients between the marks on OSPE and PE were found to be poor for all three batches. The correlation was highest for the September 2003 batch.
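The correlation reported here is the Pearson product-moment coefficient between each student's two scores. As a minimal sketch of how such a coefficient is computed, assuming entirely hypothetical OSPE and PE marks (out of 20) rather than the study's data:

```python
# Hypothetical OSPE and PE scores (out of 20) for a small group of
# students -- illustrative only, not the study's actual data.
ospe = [12.5, 15.0, 8.75, 17.25, 10.0, 14.5, 9.25, 16.0]
pe = [14.0, 15.5, 13.0, 16.5, 12.5, 15.0, 14.5, 13.5]

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"r = {pearson_r(ospe, pe):.2f}")
```

A value of r near 0, as observed in the study, means a student's rank on one component says little about their rank on the other.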

Comparison of student performance in OSPE and PE during 2003-2004
Student performances in OSPE and PE during 2003-2004 were analysed. The results are shown in Tables 5 and 6. There was a significant difference in the mean scores on OSPE across the three batches (P = 0.014). There was also a significant difference in the mean scores on PE across the three batches (P = 0.013).
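A batch-to-batch comparison of mean scores of this kind is typically tested with a one-way analysis of variance. The sketch below computes the ANOVA F statistic from first principles on invented batch scores (out of 20); the data and group sizes are assumptions, not the study's:

```python
# One-way ANOVA F statistic across three hypothetical batches of
# scores (out of 20) -- illustrative data, not the study's results.
batches = {
    "Mar 2003": [12.0, 13.5, 11.0, 14.0, 12.5],
    "Sep 2003": [15.0, 16.5, 14.0, 17.0, 15.5],
    "Mar 2004": [12.5, 13.0, 11.5, 14.5, 13.0],
}

def one_way_anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of samples."""
    all_vals = [v for g in groups for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-group sum of squares: spread of group means about the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores about their own group mean.
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

print(f"F = {one_way_anova_f(list(batches.values())):.2f}")
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) yields a small P value, as in the differences reported above.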

Discussion and conclusion
It has been established that the mode of assessment influences the learning style of students (Brown & Knight, 1994; Entwistle & Entwistle, 1991). The type of learning activity in which students engage is primarily determined by the type of assessment used (Guilbert, 1997). A change in assessment procedure can result in a change in learning behaviour (Latif, 1992). If students expect to be examined on a variety of practical skills, they will wish to learn these from their teachers before the examination. The objective structured practical examination (OSPE) was used as an objective instrument for the assessment of laboratory exercises, in conjunction with the performance exercise (PE), in which students are expected to perform a given experiment.
The development of a reliable and valid method to assess laboratory exercises in the preclinical sciences is a challenging task.
Although OSPE is a well-accepted method for assessing laboratory exercises because of its high reliability (Nayar et al., 1986), it does not always offer an opportunity to assess practical skills such as physical examination, interpretation of data and time management, which are considered to be key components of clinical competence (Gleeson, 1994). The performance exercise helps overcome these deficiencies. However, a student can score well in PE even without being adept in most of the practical skills, because of the chance factor: a student may by chance get to perform an experiment that was well prepared and so manage to score well. Thus PE is less comprehensive. The score in PE is often awarded on the basis of answers to a few oral questions, which may be aided by clues from the examiners. In our study we observed that student performance in the performance exercises was better than that in OSPE in all three batches (March 2003, September 2003 and March 2004). On the other hand, OSPE offers an objective assessment that is reliable and easily marked. We observed that the range of scores in OSPE was wider than that in PE in at least two batches, i.e. September 2003 (1.5 to 19.25) and March 2004 (3.75 to 18.75), which suggested that OSPE discriminated between different levels of competence better than the PE. Further, we observed that student performance in OSPE was poorly correlated with that in PE in all three batches. The correlation was highest for the September 2003 batch (r = 0.59). This indicates that the two instruments of the IPE tested different types of abilities in the students (Bijlani, 1981) and supports the usefulness of different vehicles for evaluation. Our study also revealed that the performance of the September 2003 batch in both components of the IPE was significantly higher than that of the March 2003 and March 2004 batches.
Each assessment method is marred in some fundamental way. The solution does not lie in perfecting the imperfectible but rather in deploying complementary modes of evaluation that compensate for the serious deficiencies of the other methods of measurement (Shulman, 1989). Students have differing strengths and weaknesses, and each component of the IPE tests different aspects of knowledge, understanding and abilities (McLeod et al., 1996). OSPE tests a much wider sphere of subject matter, while performance exercises are more suitable for assessing physical examination and other practical skills necessary for clinical practice. Thus OSPE is used in conjunction with PE in our evaluation system. The integrated practical examination allows us to evaluate the knowledge, skills and attitudes necessary for clinical practice, and it assures reasonable content and face validity.

References
Medical Education, 33, 8-13.
Latif, A.A. (1992) An examination of the examinations: the reliability of the objective structured clinical examination and clinical examination. Medical Teacher, 14, 179-183.
McLeod, P.J., Cuello, C., Capek, R., & Collier, B. (1996) A multidimensional evaluation system in a basic science course in medicine. Medical Teacher, 18, 19-22.
Nayar, U., Malik, S.L., & Bijlani, R.L. (1986) Objective structured practical examination: a new concept in assessment of laboratory exercises in preclinical sciences. Medical Education, 20, 204-209.
Shulman, L.S. (1989) The paradox of teacher assessment, in: New directions for teacher assessment. Proceedings of the 1988 ETS invitational conference. Princeton, NJ: Educational Testing Service.