Medical education is increasingly moving from the academic medical center to diverse settings in distant locations1. Increasing class sizes and faculty workloads, together with new experiential approaches to clinical training, mean that learners spend less time on campus and more time in varied learning settings. An important example is training medical students in rural locations, which provides valuable educational experiences while also helping to address rural workforce shortages. However, distance from the home educational institution can affect student engagement with faculty and with other learners, and it is difficult for the home institution to assess clinical learning for distance students. Instead, clinical experiences and preceptor feedback often become proxies for the clinical assessments administered to on-campus learners. As more students move off campus for clinical training, medical institutions need robust methods of assessing the clinical competence of distance learners that are comparable to the assessment methods used for on-campus students.
Although many medical schools incorporate distance learning into their curricula2, assessment of students at a distance occurs much less often because of its inherent challenges3,4. For example, preceptor-proctored knowledge-based exams are relatively simple to administer to remote students, while clinical skills exams are not. Objective structured clinical exams (OSCEs) use standardized patient (SP) actors to simulate real-world clinical encounters in a safe teaching environment. OSCEs are widely used in medical education, and their efficacy in assessing clinical learning is well documented5-8. As medical schools increase the number of off-campus learners, institutions must consider how they will assess clinical skills development without requiring learners to return to campus.
Few options currently exist for centrally administering OSCEs to remote learners. Existing options are often proprietary and fee-based, which may be prohibitive for programs with limited funding9-11. Optimal educational options would include non-proprietary solutions that institutions can share and implement to meet their own distance learning assessment needs.
We conducted a pilot program to study implementation of a non-proprietary distance OSCE solution, the teleOSCE. In spring 2013 and spring 2014, the teleOSCE was administered online to nine rural distance learners using commercially available Adobe Connect video-conferencing software, cell phones and a primary-care-focused diabetes management case. Given the novel nature of the activity, we sought to determine the feasibility and acceptability of the teleOSCE. We defined feasibility as development and implementation costs of US$5000 or less, including faculty and staff time and effort, and defined acceptability in terms of student receptivity and evaluation of the usefulness of the experience.
All study activities were approved by the Institutional Review Board of Oregon Health & Science University (OHSU). Study participants were third-year medical students enrolled in the Rural Scholars Program (RSP) at OHSU. The RSP is a competitive-admission program for students focused on careers in rural medicine. RSP students complete a significant portion of their family medicine coursework via distance learning, spending a minimum of 10 continuous weeks at a remote rural clinical site in Oregon. During the family medicine clerkship, on-campus learners participate in a 'teaching' OSCE, a formative assessment in a simulated clinical encounter completed early in the clerkship. Unlike traditional summative OSCEs, the teaching OSCE is a pass/no-pass assessment, with an SP and a faculty member giving immediate formative feedback at the end of the simulated encounter. The formative nature of the feedback makes this a popular activity among participating students. Because of their remote locations, RSP students historically did not take part in the teaching OSCE. The teleOSCE was developed to bridge this educational gap.
Case development
A telemedicine scenario was chosen for the teleOSCE to mimic a real-world situation in which a physician interacts with a rural patient in a non-face-to-face manner. Telemedicine is 'the remote delivery of healthcare services and clinical information using telecommunications technology'12. Because RSP students are located in rural settings, the teleOSCE simultaneously solved the logistical challenge of centrally assessing clinical skills in real time and exposed learners to a new model of rural patient care.
Three competency domains were assessed in the teleOSCE: (1) clinical knowledge: learners must identify diabetes management issues and recommend appropriate follow-up, (2) patient-centered use of technology: learners must remain patient-focused despite performing the clinical encounter using the telemedicine software, and (3) understanding of the geographic and socioeconomic realities of rural patients: learners must incorporate rural circumstances into the plan of care (travel distance, lack of an in-town pharmacy, and few nutritional options).
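As a minimal sketch of how these domains could be tracked during an encounter, the Python representation below treats each domain as a pass/no-pass checklist. The domain names and example items paraphrase the text above; the checklist structure, the Domain class and the all-items-met passing rule are illustrative assumptions, not the actual OHSU scoring instrument.

```python
from dataclasses import dataclass, field

@dataclass
class Domain:
    """One teleOSCE competency domain, scored pass/no-pass by the observing faculty member."""
    name: str
    items: list                                   # observable behaviours to mark as met or not met
    observed: dict = field(default_factory=dict)  # item -> bool, filled in during the encounter

    def record(self, item: str, met: bool) -> None:
        self.observed[item] = met

    def passed(self) -> bool:
        # Hypothetical rule: a domain passes when every listed item was observed.
        return all(self.observed.get(i, False) for i in self.items)

rubric = [
    Domain("Clinical knowledge",
           ["identifies diabetes management issues",
            "recommends appropriate follow-up"]),
    Domain("Patient-centered use of technology",
           ["remains patient-focused while using the telemedicine software"]),
    Domain("Rural geographic and socioeconomic realities",
           ["incorporates travel distance, pharmacy access and nutrition options into the plan of care"]),
]

# Example: the observer marks the first domain's items, then checks the overall result.
rubric[0].record("identifies diabetes management issues", True)
rubric[0].record("recommends appropriate follow-up", True)
print(all(d.passed() for d in rubric))  # False until every domain's items are marked as met
```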
Implementation logistics
We used Adobe Connect online meeting software as the teleOSCE 'exam room'. Adobe Connect supports multiple live video and audio feeds and allows documents to be accessed from inside the digital meeting room, making it an ideal platform for a telemedicine simulation. RSP students were already familiar with the Adobe Connect technology, having used it for other curricular requirements in the program. Each student was given a specific appointment time to connect with the SP via the internet in the virtual exam room. One non-clinical faculty member served as the meeting operator, providing technical support for the session, and a clinical faculty member served as the observer. Each encounter lasted 20 minutes, with 15 minutes for the clinical encounter and 5 minutes for feedback. Learners, faculty members and the SP all participated in the teleOSCE from four separate locations, each using their own computer and cell phone. Figures 1 and 2 illustrate the teleOSCE setup.
Figure 1: Each objective structured clinical exam participant connects to the meeting room from a separate location via the internet with a laptop computer and a cell phone. All interactions take place online in an Adobe Connect virtual meeting room.
Figure 2: The standardized patient in the objective structured clinical exam is shown on the left and the student is on the right. The observing faculty member and technical operator are also present with their webcams turned off.
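The appointment structure described above lends itself to a simple schedule generator. The sketch below assigns consecutive 20-minute slots (15 minutes of encounter plus 5 minutes of feedback) in a single virtual exam room; the start time, roster and build_schedule helper are hypothetical conveniences for illustration, not part of the original implementation.

```python
from datetime import datetime, timedelta

# Slot structure taken from the text: a 20-minute appointment comprising a
# 15-minute clinical encounter and 5 minutes of feedback.
ENCOUNTER = timedelta(minutes=15)
FEEDBACK = timedelta(minutes=5)
SLOT = ENCOUNTER + FEEDBACK

def build_schedule(learners, first_appointment):
    """Assign consecutive teleOSCE appointment times in the shared virtual exam room."""
    schedule = []
    start = first_appointment
    for learner in learners:
        schedule.append({
            "learner": learner,
            "encounter_start": start,
            "feedback_start": start + ENCOUNTER,
            "end": start + SLOT,
        })
        start += SLOT
    return schedule

# Hypothetical roster and start time; nine learners took part across the two cohorts.
for slot in build_schedule([f"RSP student {i}" for i in range(1, 5)],
                           datetime(2013, 4, 15, 9, 0)):
    print(slot["learner"],
          slot["encounter_start"].strftime("%H:%M"), "->",
          slot["end"].strftime("%H:%M"))
```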
Data collection
Participants digitally signed consent forms before participating in the teleOSCE. A qualitative case study framework was chosen for the study, with two RSP student cohorts participating as a convenience sample. Cohort 1 (n=4) participated in spring 2013 and cohort 2 (n=5) in spring 2014. Cohort 1 data were collected by RP via telephone interview within one month of the conclusion of the teleOSCE; all students participated in the interviews. Interview audio was coded by RP using categorical aggregation13 and clustering14, with Atlas.ti v10 (Atlas.ti; http://www.atlasti.com/index.html) used as the qualitative analysis software. Interview analysis results were shared with participants to verify accuracy. To simplify transcription and coding, the cohort 1 interview protocol was converted to a secure online survey form and emailed to cohort 2 students within a week of their completing the teleOSCE; the response rate for cohort 2 was 100%. Coding of the online survey responses followed the same methodology as the cohort 1 interview coding. BS reviewed both the interview audio and the survey responses for coding accuracy. The following interview protocol was used for both the telephone interviews and the online survey:
A. Was this an acceptable format for you to conduct an OSCE exercise? Explain why or why not.
B. How realistic was it for you to assess a patient in the format of the telemedicine OSCE?
C. What was your experience with the technology used to do this OSCE?
D. Do you feel this educational activity was a good use of your time while on your rural rotation? Briefly explain why or why not.
We defined the teleOSCE as financially feasible if development and implementation costs were at or below US$5000. This threshold was based on years of professional experience in curriculum development and on an expert consensus process involving similarly experienced faculty peers. Faculty costs were calculated as the faculty hourly rate multiplied by the total hours spent on development and implementation. Standardized patient compensation and teleconferencing fees were also factored into the feasibility calculations.
Ethics approval
The study protocol was approved by Oregon Health & Science University's Institutional Review Board.
Financial feasibility
TeleOSCE financial feasibility was determined by calculating the total cost of faculty full-time equivalents needed for case development and implementation, telephone charges for the Adobe Connect meeting room, and SP costs. As described in Table 1, the project entailed a total cost of US$1577.20, meeting our definition of financial feasibility. The 'extrapolated cost' column in Table 1 illustrates how the teleOSCE may be implemented at a feasible cost for an even larger group of students.
Table 1: Cost of development and implementation of the tele-objective structured clinical exam.
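To make the feasibility arithmetic concrete, a brief sketch follows. The US$5000 threshold and the US$1577.20 pilot total come from the text; every other figure, and the split between a fixed development cost and a per-student implementation cost, is a placeholder assumption used only for illustration and does not reproduce the Table 1 line items.

```python
# Feasibility threshold defined a priori in the study; the reported pilot total
# was US$1577.20 for nine learners (Table 1).
FEASIBILITY_THRESHOLD = 5000.00  # US$

def total_cost(faculty_hourly_rate, faculty_hours, sp_compensation, phone_charges):
    # Faculty cost = hourly rate x total development and implementation hours,
    # plus standardized patient compensation and teleconferencing (phone) charges.
    return faculty_hourly_rate * faculty_hours + sp_compensation + phone_charges

def extrapolated_cost(fixed_development_cost, per_student_implementation_cost, n_students):
    # Development is treated here as a one-time cost; implementation scales with learners.
    return fixed_development_cost + per_student_implementation_cost * n_students

# Placeholder inputs for illustration only; not the actual Table 1 values.
pilot = total_cost(faculty_hourly_rate=60.0, faculty_hours=18,
                   sp_compensation=300.0, phone_charges=75.0)
print(pilot, pilot <= FEASIBILITY_THRESHOLD)   # feasible under these assumed inputs
print(round(1577.20 / 9, 2))                   # reported pilot total spread over nine learners
print(extrapolated_cost(fixed_development_cost=700.0,
                        per_student_implementation_cost=100.0,
                        n_students=30))        # hypothetical extrapolation to a larger cohort
```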
Acceptability
Coding of the student interviews and survey responses indicated that students regarded the teleOSCE as an educational activity of acceptable quality and importance. Coding themes are illustrated in Table 2.
Table 2: Coding themes for and excerpts from the tele-objective structured clinical exams.
Discussion
Strengths
The strengths of the teleOSCE are its scalability and its ability to assess learners clinically from a distance. Although Adobe Connect was used for this implementation, other video-conferencing software such as Skype, GoToMeeting or FaceTime could be used as well. The teleOSCE case we developed is also freely available in the Family Medicine Digital Library database (Society of Teachers of Family Medicine) for any institution to use and share15. Additionally, the teleOSCE allows the SP and the faculty participants to 'work from anywhere', easing recruitment of faculty and actors and reducing travel and scheduling time. Finally, the teleOSCE provides a time-efficient, financially feasible and educationally acceptable format for centrally assessing the clinical skills and competence of distance learners in a manner comparable to that used for on-campus learners. By using or modifying the teleOSCE, institutions can now directly assess their distance learners using their own faculty, SPs and educational competency metrics.
Limitations
Because this was a pilot, the sample size was small and drawn from a single institution. The findings are qualitative and may not be generalizable to learners at other institutions. Additional trials of the teleOSCE are needed to validate its comparability with on-campus assessment, to confirm its acceptance by faculty and students in broader settings, and to ensure replicability.
The results of this study indicate that administering the teleOSCE to remote learners is both financially feasible and acceptable to students. In addition to solving logistical issues, the teleOSCE appears well suited to exposing students to telemedicine visits as a new model of rural care while simultaneously increasing awareness of common issues in rural population health.
The next steps are to expand the teleOSCE to other health training programs and settings. Further validation of comparability is being undertaken with a modified version of the teleOSCE in the on-campus family medicine clerkship OSCE at the study institution. Development and validation of additional teleOSCE cases will also be important. Increased use of the teleOSCE cases and format by other programs and institutions will provide a larger evidence base to support future scholarly inquiry and build on this initial exploration.
Acknowledgements
The authors are grateful for editing assistance from Dr Patty Carney, Professor, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.
References
1. Kahn MJ, Maurer R, Wartman SA, Sachs BP. A case for change: disruption in academic medicine. Academic Medicine 2014; 89(9): 1216-1219.
2. Parisky A, Ortiz T, McCann K, Hoffmann E, Boulay R. How top US medical schools are using distance learning resources: an exploratory study of four institutions. In T. Bastiaens, et al, Eds. Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education. Chesapeake, VA: Association for the Advancement of Computing in Education, 2009; 2930-2935.
3. Mattheos N, Schittek M, Attström R, Lyon HC. Distance learning in academic health education: a literature review. European Journal of Dental Education 2000; 5: 67-76.
4. Tangalos EG, McGee R, Bigbee AW. Use of the new media for medical education. Journal of Telemedicine and Telecare 1997; 3: 40-47.
5. Davidson R, Duerson M, Rathe R, Pauly R, Watson RT. Using standardized patients as teachers: a concurrent controlled trial. Academic Medicine 2001; 6: 840-843.
6. Dong T, Saguil A, Artino AR, Gilliland WR, Waechter DM, Lopreaito J, et al. Relationship between OSCE scores and other typical medical school performance indicators: a 5-year cohort study. Military Medicine 2012; 177(9 Suppl): 44-46.
7. May W, Park JH, Lee JP. A ten-year review of the literature on the use of standardized patients in teaching and learning: 1996-2005. Medical Teacher 2009; 31: 487-492.
8. McGraw RC, O'Conner HM. Standardized patients in the early acquisition of clinical skills. Medical Education 1999; 33(8): 572-578.
9. WebOSCE.net. WebPatient encounter. (Internet) 2013. Available: http://webcampus.drexelmed.edu/webosce/ (Accessed 13 November 2013).
10. Daetwyler CJ, Cohen DG, Gracely E, Novack DH. eLearning to enhance physician patient communication: a pilot test of 'doc.com' and 'WebEncounter' in teaching bad news delivery. Medical Teacher 2010; 32(9): e381-e390.
11. Novack DH, Cohen D, Peitzman SJ, Beadenkopf S, Gracely E, Morris J. Pilot test of WebOSCE: a system for assessing trainees' clinical skills via teleconference. Medical Teacher 2000; 24: 483-487.
12. American Telemedicine Association. Telemedicine FAQs. (Internet) 2012. Available: http://www.americantelemed.org/learn/what-is-telemedicine/faqs (Accessed 13 November 2013).
13. Creswell JW. Qualitative inquiry and research design. Thousand Oaks: Sage Publications, 2007.
14. Marshall C, Rossman GB. Designing qualitative research. Los Angeles, CA: Sage Publications, 2011.
15. Society of Teachers of Family Medicine. Resource library. (Internet) 2015. Available: http://www.fmdrl.org/index.cfm?event=c.beginBrowseD&clearSelections=1&criteria=palmer#5045 (Accessed 17 November 2015).