Original Article

A Pilot Study of the Effect of a Change in the Scheduling of Canadian Medical Licensing Examinations on Two Cohorts of Students Studying in Ireland


Kate Niethammer1, Pishoy Gouda2, Edina Moylett3


doi: http://dx.doi.org/10.5195/ijms.2015.112

Volume 3, Number 1: 40-44
Received 16 February 2015; Accepted 24 March 2015

ABSTRACT

Background:

The Medical Council of Canada and most Canadian residency programs require international medical graduates seeking training in Canada to pass the Medical Council of Canada Entrance Examination, in addition to the newly established National Collaborative Assessment. In order to facilitate this additional examination, the Medical Council of Canada has altered the suggested examination timeline and examination eligibility criteria.

Methods:

A cross-sectional survey was distributed via an online survey tool to members of the North American Irish Medical Student Association. The survey aimed to elicit differences in the Medical Council of Canada Entrance Examination experience between two cohorts of Canadians studying in Ireland: those who completed the examination before the new timeline took effect and those who completed it after. Statistical analysis was conducted with independent t-tests and Pearson's Chi-Square tests using SPSS version 21.

Results:

Of 24 respondents, 13 had completed the examination after the timeline change. Participants who sat the examination prior to the change achieved higher results (353.8 ± 56.5) than participants who sat it after the change (342.3 ± 35.1), although the difference was not statistically significant (p=0.56). In the cohort who took the examination after the timeline change, 61.5% of participants expressed discontent with their examination results, and 84.6% ‘strongly agreed’ or ‘agreed’ to feeling disadvantaged due to the change.

Conclusion:

The new Medical Council of Canada examination timeline has had an impact on the examination experience of Canadians studying in Ireland. Simple modifications to the current timeline are warranted to reduce unnecessary disadvantage for this cohort of students applying to postgraduate training in Canada.

Keywords: Students, Medical; Education, Medical; Educational Measurement; Education; Emigration and Immigration.

Introduction

In order to preserve a high standard of health care providers in Canada, the Medical Council of Canada (MCC) requires that international medical graduates (IMGs) seeking postgraduate training in Canada pass several entrance examinations. These checkpoints were developed to ensure a high level of knowledge required to succeed in postgraduate training. These examinations include the Medical Council of Canada Entrance Examination (MCCEE) and the newly established National Collaborative Assessment (NAC) Objective Structured Clinical Exam (OSCE). These examinations must be completed prior to sitting the Medical Council of Canada Qualifying Exam (MCCQE) I and II (Medical Council of Canada. Available from: http://mcc.ca/examinations/nac-overview/application-information/#Timing, updated 2015, cited 2015 Feb 16).

Prior to the establishment of the NAC OSCE, medical students studying abroad, many of whom are Canadian citizens, sat the MCCEE in their final year of study in order to apply for residency positions through the Canadian Residency Match System (CaRMS). Prior to the 2015 CaRMS cycle, only a select number of provinces (British Columbia, Alberta, and Quebec) required the completion of the NAC OSCE in order to apply to their postgraduate training programs. Since then, additional provinces (all except Saskatchewan) have added the NAC OSCE as an eligibility requirement (Canadian Resident Matching Service. Available from: http://www.carms.ca/en/match-process/your-application/match-tips/nac-osce/, updated 2015, cited 2015 Feb 16). Due to this change, the MCC has made significant changes to the examination timeline and requirements (Figure 1).

Figure 1.

Previous (top) and Current (bottom) Canadian Residency Match System R1 Match Timeline.


In order to be eligible to sit the NAC OSCE, applicants must have received a pass result in the MCCEE prior to the application. Therefore, the latest an applicant could sit the NAC OSCE, to be eligible for the CaRMS match, is the September prior to the match. To ensure eligibility for the September sitting, the MCC suggests that candidates write the MCCEE no later than March, one year before the expected match.

For Canadian IMGs, referred to as Canadians studying abroad (CSAs), this change in the timeline has created challenges unique to this cohort of IMGs in the process of applying to postgraduate training in Canada.1 CSAs are now required to write the MCCEE halfway through their penultimate year, which means that many applicants will not yet have had exposure to the core clinical rotations covered in the exam, further complicating an already difficult process.2 While the difficulties of immigrant physicians in Canada are well documented, there is sparse information regarding the return migration of CSAs.3

In 2010, there were an estimated 3,500 Canadian students studying medicine abroad, with more than 700 of them studying in Irish medical schools;4 90% of CSAs have indicated a desire to return to Canada for postgraduate training.5 Foreign medical schools are seeing an increasing trend in Canadian student enrollment,6 without a guarantee of local postgraduate training.5 This has created a highly competitive field for the limited number of residency positions designated for IMGs in Canada, with only 25% of IMG candidates successfully matching to residency positions in Canada in 2010.6 The positions available frequently require a return-of-service contract, which is vital to ensuring that many underserved, rural physician posts are filled.7 Nonetheless, IMGs and their families have still described Canada as underutilizing IMGs.8

This study aims to describe the differences in the MCC certifying examination experience between two cohorts of Canadian medical students in Ireland: those who completed the exam before, and those who completed it after, the implementation of the new examination timeline. Our results may be used to inform the debate on the appropriateness of the newly implemented timeline and to provide student feedback on the effects of these changes. There is currently no reported literature on the subject.

Because students are now required to write the MCCEE earlier than previous cohorts studying abroad, we hypothesized lower average scores and a more negative examination experience in the post-change cohort.

Methods

In April 2014, a cross-sectional survey was sent out to the 746 members of the North American Irish Medical Student Association (NIMSA). Representatives from each medical school were contacted via email and requested to contact all eligible Canadian students in their final or penultimate year studying at their school (n=288). All candidates were provided a URL to the survey hosted on an online survey tool (http://www.surveymonkey.net). Because of the method of distribution and the absence of a central database of Canadians studying in Ireland, it is difficult to confirm whether all eligible students received the survey.

Participation was voluntary, and students were allowed to discontinue at any stage of the study. Students were assured at multiple stages of the study that their responses were anonymous. The survey remained open for six months.

The questionnaire aimed to elicit methods of preparation for the MCCEE, MCCEE results, as well as personal satisfaction with the exam experience and results. The survey contained a combination of nominal, ordinal, and scale items.

Statistical analysis was conducted using the independent t-test to describe score differences between the two cohorts. Pearson's Chi-Square test was used to compare MCCEE exam experience between the two cohorts. The Statistical Package for Social Science (SPSS) software version 21 was used for data analysis.
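
As a companion to the description above, the following sketch shows how an equivalent analysis could be run outside SPSS; it uses Python with pandas and SciPy, and the data frame, column names, and values are hypothetical placeholders rather than the study dataset.

# Illustrative analysis pipeline mirroring the SPSS procedures described
# above. All data below are hypothetical placeholders, not study data.
import pandas as pd
from scipy import stats

# Hypothetical survey extract: one row per respondent.
df = pd.DataFrame({
    "cohort": ["pre"] * 4 + ["post"] * 4,               # exam sat before/after the change
    "score": [360, 310, 395, 350, 330, 345, 300, 370],  # self-reported MCCEE score
    "pleased": ["yes", "yes", "no", "yes", "no", "no", "yes", "no"],
})

# Independent-samples t-test on MCCEE scores between the two cohorts
# (Student's t-test with equal variances assumed, matching SPSS's default row).
pre = df.loc[df["cohort"] == "pre", "score"]
post = df.loc[df["cohort"] == "post", "score"]
t_stat, p_val = stats.ttest_ind(pre, post, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_val:.2f}")

# Pearson's Chi-Square test on a cohort-by-response contingency table.
# Note: chi2_contingency applies Yates' continuity correction to 2x2
# tables by default; pass correction=False for the uncorrected statistic.
table = pd.crosstab(df["cohort"], df["pleased"])
chi2, p_chi, dof, _expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_chi:.2f}")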

Ethical approval was granted by the Research Ethics Committee at the National University of Ireland, Galway on July 4, 2014 (Number 2014-06-12).

Results

Participant Demographics

Out of a potential 288 students in the final and penultimate years of study, 24 participants completed the survey. Fourteen (58.3%) were female, and the average age was 27.4 years. Thirteen participants (54.2%) wrote the MCCEE following the change to scheduling. Sixteen participants (66.7%) were enrolled in an undergraduate medicine course; the remainder were postgraduate entrants. Based on academic record, the majority of participants (20/24) were of 2nd class honors standard (Table 1).

Table 1.

Participant Demographics Information.

Characteristic            Pre-Change Cohort (n=11)  Post-Change Cohort (n=13)
Female, n (%)             8 (72.7%)                 6 (46.2%)
Age, years (mean ± SD)    27.4 ± 2.0                27.3 ± 3.0
Postgraduate, n (%)       4 (36.4%)                 4 (30.8%)

MCCEE Preparation

The majority of students (54.2%) studied 20-39 hours per week in the month leading up to the exam; nine participants (37.5%) studied 0-19 hours per week, one (4.2%) studied 40-59 hours, and one (4.2%) studied more than 60 hours per week. Eight students (33.3%) reported using only online question banks; the remainder (66.7%) used a mixture of question banks and textbooks. There was no reported difference in study method or study hours between the pre-change and post-change cohorts. Nine participants (37.5%) stated that they did not complete rotations in all the examined clinical disciplines of the MCCEE (Medicine, Surgery, Psychiatry, Pediatrics, and Obstetrics and Gynecology); the vast majority of these (88.9%) were in the post-change cohort.

MCCEE Results

Self-reported scores were available for nearly all participants (95.8%). All participants passed the exam (250 is the standardized pass mark). The average mark obtained was 347.8 ± 45.9 (Standard Deviation, SD). Female participants achieved higher average results than male participants (355.5 ± 48.0 vs. 335.0 ± 42.2), although the difference was not statistically significant (p=0.33). Participants in the pre-change cohort likewise achieved higher average results than participants in the post-change cohort (353.8 ± 56.5 vs. 342.3 ± 35.1, p=0.56).
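
As a consistency check, the reported cohort comparison can be approximately reproduced from the published summary statistics alone. The sketch below assumes a Student's t-test with equal variances and complete scores in both cohorts (n=11 and n=13); since one respondent's score was actually unreported, the result (p ≈ 0.55) only approximates the reported p=0.56.

# Recompute the cohort comparison from the published summary statistics.
# Assumes equal variances and full cohorts of n=11 and n=13; one score
# was unreported, so this only approximates the paper's p=0.56.
from scipy import stats

t_stat, p_val = stats.ttest_ind_from_stats(
    mean1=353.8, std1=56.5, nobs1=11,   # pre-change cohort
    mean2=342.3, std2=35.1, nobs2=13,   # post-change cohort
    equal_var=True,
)
print(f"t = {t_stat:.2f}, p = {p_val:.2f}")  # t ≈ 0.61, p ≈ 0.55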

Reflection on MCCEE Experience

In response to the question, “Did you feel like you had enough time to prepare for the exam?”, three (12.5%) participants ‘Strongly Agreed’, 10 (41.7%) ‘Agreed’, two (8.3%) ‘Neither Agreed nor Disagreed’, five (20.8%) ‘Disagreed’, and four (16.7%) ‘Strongly Disagreed’. While not statistically significant, a higher proportion of post-change students (23.1%) ‘Strongly Disagreed’ that there was enough time to prepare compared to pre-change students (9.1%) (p=0.35).

Regarding satisfaction with MCCEE results, 61.5% of the post-change group either ‘Disagreed’ or ‘Strongly Disagreed’ with the statement that they were pleased with their result, compared to only 18.2% of the pre-change group (p=0.10).
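
The full per-cohort breakdown across all five Likert categories is not tabulated here, so the sketch below uses invented placeholder counts purely to show the shape of such a cohort-by-response comparison; it is not a reconstruction of the study data.

import numpy as np
from scipy import stats

# Placeholder 2x5 contingency table: rows are cohorts (pre/post), columns
# run from 'Strongly Agree' to 'Strongly Disagree'. Counts are invented
# for illustration only and do not reproduce the study data.
table = np.array([
    [2, 5, 1, 2, 1],   # pre-change cohort (hypothetical)
    [1, 4, 1, 4, 3],   # post-change cohort (hypothetical)
])
chi2, p_val, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_val:.2f}")

# With samples this small, several expected cell counts fall below 5 and
# the chi-square approximation becomes unreliable; collapsing to a 2x2
# table and using Fisher's exact test would be a more robust alternative.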

Discussion

Of the 24 students who participated in this study, 13 sat the examination after the schedule change. The majority (61.5%) of those in the post-change group had not completed rotations in all of the core disciplines tested in the MCCEE. As hypothesized, average examination results were lower in the post-change group, although the difference was not statistically significant. This may, in part, be explained by the fact that this cohort sat the MCCEE with a large part of their clinical education still ahead of them.

Despite no recorded difference in the nature of designated study time, we found that students were negatively affected by the earlier exam date. As hypothesized, a higher proportion of post-change students chose ‘Strongly Disagree’ when asked if there was enough time to prepare for the MCCEE. Most of the students who stated that they were not pleased with their scores and who felt disadvantaged by the timing of the exam within the medical degree were in the post-change cohort.

Overall, the change in the MCCEE timeline negatively affected CSAs studying in Ireland, who under the new schedule had to sit the examination by March 9th, 2014, in their penultimate year. These students received lower overall scores, felt less prepared, and were not pleased with their results, although these findings were not statistically significant.

Prior to the examination scheduling change, students typically completed the examination in the September sitting, allowing them to prepare over the summer. By contrast, prospective residency applicants are now advised to register for the February/March sitting of the exam. It must be noted that this new suggested date falls in a period when students are full-time clinical clerks and must meet the demands of their clinical education while studying for the MCCEE.

Residency programs receive hundreds of applications for a limited number of positions. It is common practice to use cut-off points to filter applications for further consideration, with the MCCEE score used as one such criterion.9 Applicants who score below a certain threshold will not have the remainder of their application considered. This has several implications. If students proceed with the currently suggested timeline and, as suggested by our study, are not pleased with their scores, there is no opportunity to repeat the examination. If, in fact, the changes in the examination timeline result in widespread lower scores for CSAs, many qualified students will not even be considered for residency positions in Canada. Due to this possibility, the authors feel that it is appropriate that further discussion takes place with residency programs and regulatory medical bodies in Canada to ensure that everyone involved is aware of the changes. The authors would suggest that, due to the new scheduling of the MCCEE, scores of the examination should be interpreted with care, as they are unlikely to reflect the clinical knowledge that the candidate will possess at the end of their clinical training. This is supported by a lack of correlation between MCCEE scores and the NAC OSCE, which examines clinical skills,10 and by studies showing a stronger correlation between structured clinical assessment results and IMG competence.11 Data have also demonstrated that using the CaRMS application process, which includes records of students' marks, clinical experience, extra-curricular activities, reference letters, and personal statements in choosing applicants, is a better predictor of residency performance than using exam scores alone for applicant selection.7

Strengths and Limitations

Our study is the first to evaluate the examination experience of a subset of IMGs, the CSAs, in the process of applying to postgraduate training in Canada. However, there are limitations to our study. There is no centralized process to track medical students in Ireland taking Canadian licensing examinations, making it difficult to ascertain if all students who wrote the examination received the questionnaire.

The Health Education Authority of Ireland has shared with the authors that during the 2013/2014 academic year there were 122 final-year and 166 penultimate-year Canadian medical students studying in Ireland. These figures represent the greatest possible number of individuals who were eligible to sit the MCCEE. Using these figures, our study represents 8.3% of this cohort; this is a conservative figure, since many of the theoretically eligible 288 students will not have registered for the exam for a variety of reasons. While our sample size is small, it greatly surpasses the sample size required for a pilot study.12 This represents only a sample of CSAs in Ireland, but it indicates that the full extent of the issue could be further explored in the future.

Finally, all exam results are self-reported and, therefore, an exaggeration bias must be considered. However, survey participants were reassured that the survey was completely anonymous. As the authors expected a small sample size, the decision was made not to collect participants' institutions, to further protect anonymity; this was relayed to survey participants.

Recommendations

Despite showing no statistically significant differences in exam scores between the two cohorts, our study highlights some of the effects of the new scheduling of Canadian entrance exams. We would recommend that the Medical Council of Canada consider the impact this change has on the selection of future residents. If students were not required to have passed the MCCEE prior to writing the NAC, they would be able to sit both of these exams in September of their final year. This would ensure that more applicants would have completed core rotations in the topics tested. In addition, this suggested timeline would provide candidates with more designated time during the summer to study for both exams while still meeting the requirements for applying for residency positions in Canada.

Residency program directors in Canada should also be made aware of the changes in scheduling. Program directors or members of selection committees who were not trained outside of Canada might be unaware of the change in the application process for CSAs applying to Canadian residency programs. Awareness of the change may have considerable effects on the way program directors interpret MCCEE results and may prompt them to re-evaluate its use as a cut-off factor.

Our study has identified several issues with the change in the MCC's scheduling of the MCCEE for IMGs, particularly the CSA subset. To explore the issues raised in this pilot study, we propose a large-scale study, supported by the MCC, to assess and justify the effects of the examination scheduling changes on a unique and important cohort of IMG applicants, the CSAs.

Acknowledgments

None.

Conflict of Interest Statement & Funding

The Authors have no funding, financial relationships or conflicts of interest to disclose.

Author Contributions

Conception and design of the work/idea: KN, PG, EM. Collection of data/obtaining results, analysis and interpretation of data, and writing of the manuscript: KN, PG. Critical revision of the manuscript and approval of the final version: KN, PG, EM.

References

1. Gouda P, Fanous S, Gouda J. Challenges faced by international medical students due to changes in Canadian entrance exam policy. Int J Med Students. 2014 Nov-2015 Mar;3(1):70–1.

2. Barer ML, Evans RG, Hedden L. False hope for Canadians who study medicine abroad. CMAJ. 2014 Apr 15;186(7):552.

3. Satkauskas R, Pavilanis A. The plight of immigrant physicians in Canada. Can Fam Physician. 1990 Jan;36:119–27.

4. Esmail T, Gouda P. Impact of changes in Canadian postgraduate training on the Irish health service. Ir Med J. 2015 Jan;108(1):28.

5. Walsh A, Banner S, Schabort I, Armson H, Bowmer MI, Granata B. International medical graduates - current issues. Members of the Future of Medical Education in Canada Postgraduate (FMEC PG) Project consortium; 2011.

6. Watts E, Davies JC, Metcalfe D. The Canadian international medical graduate bottleneck: a new problem for new doctors. CMEJ. 2011;2(2):86–90.

7. Schabort I, Mercuri M, Grierson LE. Predicting international medical graduate success on college certification examinations: responding to the Thomson and Cohl judicial report on IMG selection. Can Fam Physician. 2014 Oct;60(10):e478–84.

8. Taghizadegan S. The underutilization of international medical graduates in Ontario and Canada: a selective review of the existing literature on the experiences of international medical graduates in the context of Canadian health care policies [dissertation]. [Toronto]: Ryerson University; 2013.

9. Thomson G, Cohl K. IMG selection: independent review of access to postgraduate programs by international medical graduates in Ontario; 2011.

10. Hofmeister M, Lockyer J, Crutcher R. The multiple mini-interview for selection of international medical graduates into family medicine residency education. Med Educ. 2009 Jun;43(6):573–9.

11. Takahashi SG, Rothman A, Nayer M, Urowitz MB, Crescenzi AM. Validation of a large-scale clinical examination for international medical graduates. Can Fam Physician. 2012 Jul;58(7):e408–17.

12. Hertzog MA. Considerations in determining sample size for pilot studies. Res Nurs Health. 2008 Apr;31(2):180–91.


Kate Niethammer, 1 National University of Ireland Galway, Galway, Ireland.

Pishoy Gouda, 2 Final Year Medical Student, National University of Ireland Galway, Galway, Ireland. Past-Chairperson of the Association of Medical Students in Ireland, Student Editor of IJMS.

Edina Moylett, 3 MB, BCh, BAO, LRCPI, LRCSI, DCH, MRCPI, Senior Lecturer, Dept. of Paediatrics, Clinical Science Institute, National University of Ireland Galway, Galway, Ireland.

About the Author: Kate Niethammer is a fifth-year medical student in a five-year medical program at the National University of Ireland, Galway. She is the recipient of the 2011 Berman Prize in Medical Informatics from the National University of Ireland, Galway.

Correspondence: Kate Niethammer. Address: University Road, Galway, Ireland. Email: k.niethammer1@nuigalway.ie

Cite as: Niethammer K, Gouda P, Moylett E. A Pilot Study of the Effect of a Change in the Scheduling of Canadian Medical Licensing Examinations on Two Cohorts of Students Studying in Ireland. Int J Med Students. 2014 Nov-2015 Mar;3(1):40-4.


Copyright © 2015 Kate Niethammer, Pishoy Gouda, Edina Moylett



International Journal of Medical Students, VOLUME 3, NUMBER 1, March 2015