The Correlation between Written and Practical Assessments of Communication Skills among First Year Medical Students

Background: Communication is a core clinical skill that is essential for clinical competence. Practical assessments, such as Objective Structured Clinical Examinations (OSCEs), are commonly used to assess communication skills among undergraduates; however, organizing an OSCE is an expensive and complex process. The Faculty of Medicine, Unjani University, uses essay-format tests to assess communication skills in the first year communication block. The evidence on written assessment of communication skills is still very limited. Objective: To study the correlation between written and practical assessments of communication skills among first year medical students, and to study the validity and reliability of written assessments of communication skills. Methods: A cross-sectional study was conducted among first year students in the Faculty of Medicine, Unjani University. At the end of the communication block, students sat a written assessment comprising Modified Essay Questions (MEQs) and a practical assessment at a single station with a simulated patient, where the performance was videotaped. There were two examiners for each assessment. Results: The kappa coefficient for inter-rater reliability was 0.707 for the MEQ and 0.735 for the practical assessment. The correlation coefficients between written and practical assessments from the two examiners ranged from 0.063 to 0.127 (n=120, p>0.01). At the item level, the correlation coefficient was -0.067 for building initial rapport, 0.030 for identifying the reason(s) for consultation, and 0.107 for gathering information. These results show a low concurrent validity of the written test in assessing communication skills. Conclusion: Written assessments cannot predict students' communication skills competence. Written assessments have a high reliability; nevertheless, they have a low validity for assessing communication skills.


Introduction
Effective doctor-patient communication is positively associated with higher rates of patient recovery and therapy compliance, and with lower rates of medical malpractice.
Many conceptual frameworks guide teachers in the teaching and assessment of communication skills: the Arizona Clinical Interview Rating Scale (ACIR), the Calgary-Cambridge Observation Guides (CCOG), the SEGUE Framework (Set the stage; Elicit information; Give information; Understand the patient's perspective; End the encounter), and the MAAS-Global Rating List for Consultation Skills of Doctors (Schirmer et al., 2005; Kurtz et al., 2005; Makoul, 2001).
In compiling a medical curriculum, Hulsman et al. (1999) and Kurtz (2005) have emphasized the importance of teaching and assessing communication skills gradually throughout the curriculum. Windish (2005) has described a communication curriculum with the CCOG as its conceptual framework; it teaches basic communication skills through building rapport, asking open-ended questions, and active listening. Baerheim et al. (2007), Morrows et al. (2009), and Hulsman et al. (1999) argue that successful communication skills assessment is done at three levels: (1) subjective perception of knowledge about the manner of communication, which can be assessed by a written test or a self-evaluation; (2) objective assessment of communication skills, e.g. with an OSCE; and (3) assessment of an output aspect of the communication process, e.g. the simulated patient's perception. Hulsman et al. (2004) have reported the frequent use of OSCEs to assess communication skills at undergraduate level. However, according to Kelly and Murphy (2002), an OSCE incurs a high cost for its preparation, implementation, and organization, including the training of simulated patients and observers.
Several studies describe the use of written tests in assessing communication skills. Humphris and Kaney (2000) developed a video-based written assessment method, the Objective Structured Video Examination (OSVE).
In an OSVE, students watch a videotaped doctor-patient consultation and then answer written questions drawing on their observations and their knowledge of communication skills. This method can be facilitated by Computer Assisted Assessment (CAA) (Hulsman et al., 2004).
The communication block is the second block in the first year curriculum of the Faculty of Medicine, Unjani University. Its aim is to teach the basic communication skills that are applied in the integrated blocks during subsequent stages of education. At the end of this block, communication skills are assessed using written assessments. The concurrent validity of this test can be evaluated by correlating the results of the written and practical assessments.

Methods
A total of 153 first year students of the Faculty of Medicine, Unjani University, who had recently completed the communication block, participated in the study. During the study, three students had incomplete attendance for the practical assessment, and due to technical problems with one camera during the practical assessment sessions, only 120 sessions were examined.
A set of Modified Essay Questions (MEQs) and a scoring rubric for the written assessment, and a checklist for the practical assessment station, were developed. Both instruments assessed three elements of basic communication skills: building rapport, identifying the reason(s) for consultation, and gathering information. Before the study began, a qualitative content validity check was performed on both instruments.

Modified Essay Question (MEQ)
An MEQ is a scenario-based written test in which structured questions are given in a predetermined sequence. MEQs are designed to test decision-making skills and the ability to identify issues and resolve them logically. Students were given an MEQ set with three part-questions on three separate papers; each new part-question was administered only after the previous one had been answered, and students were not allowed to return to a previous answer once they had received the next question. Example questions included: "Describe how you would gather information from the patient, such as asking questions, listening, reflecting, and summarizing the conversation with the patient" and "Write the questions you would ask, together with the patient's possible answers, for the condition above."

Objective Structured Clinical Examination (OSCE)
As the practical examination, a single OSCE station for basic communication skills was developed. Students played the role of a doctor and demonstrated their communication skills with standardized patients (SPs). All performances were videotaped and judged by two examiners. An observation checklist was developed to ensure a common standard among examiners; it consisted of: (1) Building rapport: greet the patient and ask the patient's name/identity; introduce oneself, explain the purpose of the session, and ask for the patient's informed consent if necessary. (2) Identifying the complaints or problems the patient wants to discuss, using appropriate opening questions. (3) Gathering information: use open and closed questions correctly (commencing with open questions and following up with closed questions); listen with full attention to the patient's statements without interrupting; reflect on the content of the conversation; reflect appropriately on the patient's feelings; use appropriate nonverbal elements; and summarize the content of the conversation.
The assessment took place on two consecutive days: on the first day students sat the MEQ, and on the second day students were videotaped at the OSCE station with a simulated patient. Both assessments were scored on a numerical scale (0-100) by two examiners. The reliability of the written and practical assessments was determined by the consistency between the two examiners (inter-rater reliability). The correlation between written and practical assessment results was analyzed using Pearson's correlation.
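The two analyses described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' actual analysis code; in particular, reducing the examiners' 0-100 scores to categories (e.g. pass/fail) before computing kappa is an assumption, since the paper does not state how scores were categorized for the kappa calculation.

```python
# Illustrative sketch of the two analyses (not the authors' actual code).
# Assumes examiners' judgements are already categorical (e.g. pass/fail)
# for the kappa calculation.

def cohen_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if the two raters judged independently.
    expected = sum(
        (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5
```

In practice these statistics are available ready-made, e.g. as `sklearn.metrics.cohen_kappa_score` and `scipy.stats.pearsonr`; the latter also returns the p-value for the significance test.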

Reliability of written and practical assessment
The reliability of the written and practical assessments was determined by inter-rater reliability. The results are presented in Table 1. Both the practical and written assessments showed good inter-rater reliability.

Written and Practical assessment correlation as a Concurrent validity
The correlation between the results of the MEQ and the practical assessment from the two examiners ranged from 0.063 to 0.127, with p>0.01.
At the item level, shown in Table 3, the correlation coefficient was -0.067 for building rapport, 0.030 for identifying the reason(s) for consultation, and 0.107 for gathering information. All p-values were >0.01 (Table 2). The correlation between the MEQ and the practical assessment was not significant. This indicates that the written assessment could not predict the performance of communication skills in the practical assessment as a "criterion". Thus, the written assessment had a low concurrent validity for assessing communication skills.
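The non-significance of these coefficients can be checked with the standard t-test for a Pearson correlation. A minimal sketch, using the largest coefficient reported above (r = 0.127, n = 120); the critical value of roughly 2.62 for t(118) at a two-tailed alpha of 0.01 is taken from standard tables.

```python
import math

def t_statistic(r, n):
    """t statistic for testing H0: rho = 0, given r from n paired scores."""
    df = n - 2
    return r * math.sqrt(df / (1 - r * r))

t = t_statistic(0.127, 120)
# The two-tailed critical value t(df=118, alpha=0.01) is about 2.62,
# so even the largest reported correlation falls well short of significance.
print(round(t, 2), t < 2.62)  # prints: 1.39 True
```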

Discussion
When determining an appropriate assessment method for communication skills, aspects of validity and reliability have to be considered. Shumway and Harden (2005) have defined validity as the degree to which an instrument measures what it is supposed to measure. Before the study began, content validity was checked by reviewing the representativeness of the MEQ items and checklist items. Although the inter-rater reliability of the MEQs in this study was good (0.707), the evidence for the concurrent validity of MEQs in assessing communication skills was low, as indicated by the insignificant correlation between the written and practical assessments.
Basic communication theory holds that the communication process is influenced by many factors, such as knowledge, self-concept, and ethical and cultural factors; these factors play a role in interpreting an idea (Dwyer, 2005; Ali et al., 2006). Spitzberg (1983) supports this theory, stating that one's performance in communication is influenced by knowledge and motivation. In relation to the educational process, Miller's pyramid explains that knowledge ("knows" and "knows how") is a foundation of skills ("shows" and "does"). For communication skills, Hulsman (2004) states that building communication skills requires a detailed "knows how" level, i.e., "knows why and when" and "integration" levels. These theories support the view that there should be a significant relation between knowledge and performance, as represented by the written and practical assessments of communication skills.
However, according to Van Dalen et al. (2002), the correlation between written and practical assessments of communication skills is lower than the corresponding correlation for other clinical skills. Furthermore, van der Vleuten (1989, cited by Van Dalen et al., 2002) explains that this correlation can increase in the final year of education. Norcini and Lipner (2000) have also demonstrated a low correlation between students' abilities in written and practical tests of communication skills. Humphris and Kaney (2001), in their cohort study, showed that an OSVE in the first year is a poor predictor of a communication skills OSCE in the following year. Individual factors, i.e. a student's personality, have a greater influence on communication skills competence (Van Dalen et al., 2002).
The results of this study show similar evidence: there is a low, insignificant correlation between the results of the written and practical assessments of communication skills. McCroskey (1983) states that communication competence is "the ability to apply knowledge in communication practice in a given situation". By this definition, it is possible for someone to have a high level of knowledge about communication yet be unable to communicate well, because they cannot apply that knowledge in their communication performance, a distinction also emphasized by Kurtz (2005). This study also showed that, at the item level, the lowest correlation was in building rapport. For this item, the assessment points were (1) greeting the patient and asking the patient's identity, and (2) introducing oneself and explaining the objective of the session. This might indicate that individual factors play a larger role in building rapport than in identifying the reason(s) for consultation and gathering information.

Conclusion
Written assessments could not predict students' performance in communication skills. Even though written tests are reliable, they have low validity in assessing the performance of communication skills.

Recommendations
Further research could examine the correlation between written and practical assessments in a longitudinal study from the first to the final year of education. The practical assessment instrument could also be supplemented with the simulated patient's perception.

Table 3: Correlation coefficients at the item level
M1: mean written assessment result for building rapport; M2: mean written assessment result for identifying consultation reason(s); M3: mean written assessment result for gathering information; O1: mean practical assessment result for building rapport; O2: mean practical assessment result for identifying consultation reason(s); O3: mean practical assessment result for gathering information