Student perceptions of the utility of the Pharmacy Curriculum Outcomes Assessment
Abstract
Introduction: This study assessed student perceptions of the Pharmacy Curriculum Outcomes Assessment (PCOA), their preparation for the exam, and their strategies for using its results. Secondarily, it examined the effect of schools/colleges of pharmacy (S/COP) PCOA management practices on student perceptions.
Methods: A 52-item electronic questionnaire assessed final-year students' PCOA preparation, review and use of results, remediation participation, self-reported motivation, and perceptions of the exam's ability to measure the PCOA blueprint areas and North American Pharmacist Licensure Examination (NAPLEX)/advanced pharmacy practice experience (APPE) readiness. Programs were given a separate questionnaire to determine their PCOA practices.

Results: The student survey was completed by 341 students (40% response rate). Students prepared very little for the PCOA, and few reported participating in PCOA-based remediation (6%). Students perceived the PCOA to measure the four domains moderately well, although ratings for the administrative sciences were significantly lower. Students reported less confidence in the exam's ability to measure APPE/NAPLEX readiness. Although few used the PCOA to guide their NAPLEX preparation (18%), they were more likely to do so than for APPEs (4%). Students reported a higher perceived increase in motivation if PCOA results were connected to APPE placement, remediation, and progression as opposed to prizes, rewards, or other recognitions.

Conclusion: This is the first multi-institutional study to examine student perceptions of the PCOA. These data can be used along with other PCOA data to help schools develop incentive, remediation, and examination administration procedures appropriate to a program's desired use of the PCOA.
Introduction
The Pharmacy Curriculum Outcomes Assessment (PCOA) is a validated examination required to be administered by all schools/colleges of pharmacy (S/COP) prior to students' advanced pharmacy practice experiences (APPEs).1 The PCOA was developed by the National Association of Boards of Pharmacy (NABP) in 2008, and its blueprint is based on a national curriculum survey. The PCOA blueprint was updated in 2016.2 The PCOA has four major content areas: basic biomedical sciences, pharmaceutical sciences, social/behavioral/administrative sciences, and clinical sciences. Scores are provided to students and the program as "scaled scores" that can be compared to average student performance both within the S/COP and against the national sample of S/COPs.2

Because all S/COPs are required to administer the PCOA for accreditation, there has been considerable discussion in the academy about the PCOA's place in both curricular and student assessment. Currently published research regarding the PCOA primarily focuses on use, implementation, and interpretation of the exam by S/COP faculty and administrators. Most studies center on (1) using the PCOA to identify students at risk for academic progression issues or failure of the North American Pharmacist Licensure Examination (NAPLEX), (2) describing strategies for incentives or stakes to improve individual student performance, or (3) the relationship between PCOA performance and other indicators.3–10 Most programs appear to administer the PCOA in a low-stakes environment.10,11 The 2017 American Association of Colleges of Pharmacy (AACP) Council of Deans Task Force survey [n = 126/139 (91% response rate)] found that 59% of programs tied neither performance nor participation on the PCOA to any stakes, and even fewer tied performance to remediation (17%), course grade (11%), or progression (8%).11 Seventy percent of programs offered no student incentives for taking the exam, and < 30% of programs required some type of student follow-up or remediation after poor performance. Trends in the stakes programs attach to the exam have remained similar despite the changing number of programs using it for accreditation requirements.8

Studies have been conducted to evaluate the relationship between PCOA performance and board exams and to determine whether the PCOA is a valuable tool for identifying students at risk of NAPLEX failure.
Two individual pharmacy schools evaluated the relationships between the PCOA, pre-APPE grade point average (GPA), and the NAPLEX.5,6 Statistically significant correlations were found between PCOA and NAPLEX (r = 0.51–0.64), as well as between PCOA and pre-APPE GPA (r = 0.47–0.60). One conflict between the studies was that GPA was shown to be a stronger predictor of NAPLEX than the PCOA in one (8% of NAPLEX score variance attributed to PCOA vs. 14% attributed to GPA), whereas the PCOA was shown to be the stronger predictor of NAPLEX performance in the other.5,6 Another single-school analysis showed a significant correlation between PCOA and NAPLEX scores (r = 0.59).4 Six programs in public, research-intensive universities pooled NABP-matched data (n = 1460) and found that total scores from the PCOA and NAPLEX were significantly and moderately correlated (r = 0.54), and that the pharmaceutical sciences and clinical sciences domains were significant predictors of NAPLEX performance.7

Given the high level of variability in aspects of the PCOA, including its use, implementation, preparation and incentivization of students, and interpretation by S/COPs, many questions regarding best practices remain unanswered, including how students perceive these varying approaches. There is limited published literature regarding student perceptions of the use of this examination. Naughton and Friesner11 studied students' perceived vs. actual knowledge on the four main domains of the PCOA and found only a weak, albeit significant, correlation between students' perceived knowledge and actual performance on the basic biomedical sciences domain of the exam, but not on the other three domains. Waskiewicz12 described using a standardized tool, the Student Opinion Scale (SOS), and students' results to determine the level of motivation in a low-stakes environment.
A comparison was made between students who were incentivized pre-exam and those who were not, to determine whether pre-PCOA incentivization affected performance. The authors utilized a data-filtering technique called motivational filtering that "systematically removes the test scores of unmotivated test takers to reduce bias."12 Motivational filtering using the SOS and PCOA results was used to determine the accuracy of the data, and it was found that less filtering of scores was needed when students were incentivized. Neither of these studies addressed student use of exam scores for enhancing their professional development, or asked questions about students' perceptions of the testing process or opinions about incentivizing.

The purpose of this study was to add student perspectives and practices, as well as S/COP practices, to the existing body of PCOA literature. The main objective was to assess student attitudes and behaviors surrounding the PCOA, including their perceptions of the exam itself, preparation for the exam, and strategies for use of their results. A secondary objective was to study the effect that S/COPs' management of the PCOA, including study resources, incentives, and use of results, had on student perceptions.

Methods

To determine PCOA-related practices at the surveyed students' programs, researchers developed an eight-item multiple-choice questionnaire with additional open-ended "other" options, named the PCOA School Questionnaire (PSQ), for the eight participating S/COPs to complete via a Microsoft Word document. One question pertained to doctor of pharmacy (PharmD) program length. Additional questions addressed turnaround time, format of reported scores, resources provided to students, stakes associated with student performance, and student accountability. Researchers from 11 S/COPs affiliated with the AACP Assessment Special Interest Group (SIG) designed the study to assess student perceptions of the PCOA. The team developed a 52-item questionnaire, named the PCOA Student Survey (PSS), that used a mix of yes/no, five-point Likert scale, and open-response items.
Items were grouped into seven sections designed to assess: (1) student demographics; (2) post-PCOA results review and remediation practices; (3) PCOA study habits and level of preparation effort; (4) use of results to prepare for APPEs and the NAPLEX; (5) perceptions of the PCOA's ability to measure the four areas of knowledge (basic biomedical sciences; social/behavioral/administrative sciences; pharmaceutical sciences; and clinical sciences);2 (6) motivation for performing well; and (7) perceived utility of incentives. The set of questions pertaining to student motivation was adapted from a validated motivation scale developed by Thelk et al.13 The items in section seven were newly developed by the authors and asked students to rate the potential influence of various incentives on their motivation to perform well on the exam. Items in both sections six and seven utilized a five-point Likert scale. All questions included within the survey were based on group consensus, and the survey was created in both Qualtrics14 and SurveyMonkey15 to accommodate the schools' respective platforms. The PSS was beta tested by the 11 researchers using both platforms prior to student administration. A blueprint for the PSS tool is included in Table 1. The full instrument is available upon request from the corresponding author.

Participants included final-year (senior-level) pharmacy students from eight of the 11 schools. Students from three of the 11 schools were not invited to participate in the study due to timing and logistical challenges with institutional review boards (IRBs). To protect student confidentiality, the survey was administered anonymously, and each set of student names and emails was accessed only by the individual researcher at the affiliated school. Thus, each researcher was responsible for administering the survey and gathering the data at his/her own school. The PSS was administered between January and May 2018, with the specific timing depending upon the individual school. Scores were returned to all students at each S/COP prior to the survey release.
Each questionnaire was open for one month, with two reminder emails sent to non-respondents after approximately two and three weeks. No incentives were provided to students to participate in the study.

PSS data obtained from the multiple schools were merged into a single dataset using Microsoft Excel. Incomplete responses were retained for all students who answered at least one item other than the demographics. Analysis of quantitative survey items consisted of descriptive statistics, bivariate correlations (Pearson), and parametric (one-way and two-way ANOVA) and non-parametric (Mann-Whitney and Kruskal-Wallis) tests, calculated using SPSS version 24.16 The non-parametric tests were used to examine potential differences in survey responses based upon student demographics. Qualitative analysis of open-ended items was conducted using Microsoft Excel: one independent coder completed the first round of coding and development of themes, and a second coder with qualitative expertise provided validation. The study was approved by the IRBs of each of the eight participating programs.
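For readers wishing to reproduce comparable analyses outside SPSS, the sketch below illustrates the kinds of quantitative tests described above using Python's pandas and scipy libraries. This is not the authors' analysis code; the column names and toy values are assumptions for demonstration only.

```python
# Illustrative sketch (not the authors' SPSS syntax) of the quantitative
# tests described above. All data below are invented for demonstration.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "gender":  ["F", "M", "F", "M", "F", "M", "F", "M"],
    "gpa_bin": ["low", "mid", "high", "mid", "high", "low", "high", "mid"],
    "effort":  [31, 28, 44, 35, 47, 25, 41, 33],  # total effort score (11-55)
    "rating":  [3, 2, 4, 3, 5, 2, 4, 3],          # domain rating (1-5)
})

# Mann-Whitney U test: compare effort between two independent groups (gender).
females = df.loc[df["gender"] == "F", "effort"]
males = df.loc[df["gender"] == "M", "effort"]
u_stat, u_p = stats.mannwhitneyu(females, males, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {u_p:.3f}")

# Kruskal-Wallis test: compare effort across the three GPA bins.
groups = [grp["effort"].to_numpy() for _, grp in df.groupby("gpa_bin")]
h_stat, h_p = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {h_p:.3f}")

# Bivariate (Pearson) correlation between effort and a domain rating.
r, r_p = stats.pearsonr(df["effort"], df["rating"])
print(f"Pearson r = {r:.2f}, p = {r_p:.3f}")
```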
Results
The PSQ was administered to and completed by all eight of the S/COPs that also administered the PSS. Seven of the S/COPs had a traditional four-year PharmD curriculum (three years of didactic and one year of experiential coursework), and one had an accelerated three-year program. Five S/COPs were public and three private. Of the eight S/COPs, five were located in the South, two in the West, and one in the Midwest.

In terms of how the PCOA scores were used, six of the eight (75%) S/COPs provided students with either information directly from NABP or a school-developed explanation of how to interpret their results. Two S/COPs required students to review their PCOA results with their faculty advisor, one required students to meet with an administrator, and another required only students who needed remediation to meet with an administrator. The remaining four S/COPs did not require their students to review their scores with any specific faculty member or administrator, regardless of how each student performed.

The participating S/COPs used a variety of approaches to incentivize their students. Two of the eight S/COPs used the PCOA scores as a required component of, or an opportunity to earn bonus points toward, a course grade. Two of the eight S/COPs used the PCOA scores to determine the need for remediation, and three of the eight used them to prompt a mandatory meeting with a faculty advisor or administrator. At another S/COP, students received an academic misconduct penalty if they did not take the PCOA. None of the S/COPs canceled class meetings or assignments to provide students with dedicated time to prepare for the exam.

The PSS was administered to a total of 850 senior-level pharmacy students at the eight participating S/COPs. In total, 341 responded to the survey, for an overall response rate of 40%.
Individual S/COP response rates ranged from 13% (9 of 69) to 89% (71 of 80). The distribution of responding students based on S/COP location, institution type, and program length is reported in Table 2, along with respondents' self-reported gender, age, and cumulative pharmacy GPA.

With regard to study and preparation strategies, 40% of respondents studied prior to taking the PCOA, and 90% of those who did study prepared less than seven days prior to the exam. When asked to rate their level of effort in studying for the PCOA on a scale ranging from 0 ("I did not study") to 5 ("I studied like I would for a course assessment or final"), 23% rated their effort at or above a 2. Of the students who studied for the PCOA, most used a school-developed practice exam (44%) or a PCOA review session conducted by the school (28%), followed by a NAPLEX prep resource (10%). Four (0.5%) students used a practice exam from a commercial vendor or another type of resource, but none utilized a Foreign Pharmacy Graduate Equivalency Examination (FPGEE) resource or a PCOA review session from a commercial vendor.

The next set of questions on the PSS asked how students used and interpreted their PCOA results. Four percent of students reported including their scores on their resume or curriculum vitae (CV), and 2% reported that a prospective employer or residency program had requested a copy of their scores. Most students (84%) did not review their PCOA scores with faculty members or administrators in their school. Those students who reviewed their scores did so with a faculty advisor (7%), as part of a class (4%), or with an administrator (3%). Respondents were also asked about any remediation activities they engaged in as a result of their PCOA performance. Most students did not participate in any remediation (94%), and when they did, it was required by the school (5%) rather than student-initiated (1%).
Students who completed remediation (n = 15) used a NAPLEX prep course or resource (87%), reviewed class notes or lecture handouts (67%), or reviewed recorded lectures (47%).

Few students (4%) indicated their PCOA results changed the way they prepared for APPEs; however, 18% reported the scores changed their approach to NAPLEX preparation. A thematic analysis of open-ended responses from the 14 students who described how the PCOA results influenced their APPE approach revealed that many increased their preparation time prior to starting their rotations (43%), increased their study effort or time (14%), or used the PCOA results to identify areas of weakness to focus upon (29%). Most of those who elaborated on how the PCOA impacted their NAPLEX preparation (n = 55) reported they would be devoting more time or intensity to their studying (38%) or made a general statement about increasing their focus on topics identified as needing specific attention (27%). A few students (13%) went so far as to cite one or two specific areas for which to increase their efforts, whereas some (11%) noted their results encouraged them to invest in NAPLEX preparation materials.

In the next set of questions, students were asked to rate how well they believed the PCOA measured six different domains: the four content areas it was designed to assess as well as their readiness for APPEs and the NAPLEX (Table 3). The scale ranged from 1 (not well at all) to 5 (extremely well). Across the six domains, students' mean ratings ranged from 2.4 to 3.0, suggesting they believed the exam measured each of the domains somewhat well. Results from a Friedman's two-way ANOVA demonstrated statistically significant differences in students' ratings between the different domains (χ2(5) = 223.43, p ≤ .001).
Pairwise comparisons further revealed that, on average, students believed the exam measured biomedical sciences, pharmaceutical sciences, and clinical knowledge equally well, and better than administrative (managerial) knowledge, APPE-readiness, and NAPLEX-readiness. Additionally, bivariate correlations (Spearman's rho) were calculated between students' ratings for each domain (Table 4). All correlations were positive, statistically significant, and moderate to very strong (0.491 ≤ ρ ≤ 0.825).

Next, students were asked to rate the level of effort they put forward when completing the PCOA, using a five-point Likert-type scale ranging from strongly disagree to strongly agree (Table 5). Results for these items appear mixed. A large portion of students agreed or strongly agreed that they engaged in good effort throughout the test (53%) and were able to persist to completion (64%), meaning they finished the exam in the time allotted. On the other hand, a lower percentage agreed or strongly agreed that the test was important to them (30%), that they were concerned about their score (42%), or that they gave their best effort (39%). The average total effort score calculated from all 11 items of the effort scale was 36 on a 55-point scale, suggesting that overall students put forth moderate effort, with typical item-level responses approximately at the scale midpoint (neutral).

Possible differences in student effort based on demographics were examined using several types of parametric tests. No statistically significant difference was found in students' overall effort score according to gender (t(264) = −0.451, p = .652); however, the results of a one-way ANOVA showed significant differences between GPA groups (F(2, 264) = 7.844, p < .001). A Bonferroni post-hoc test determined that students in the lowest and middle GPA regions (GPA ≤ 3.0 and 3.1 ≤ GPA ≤ 3.5) reported significantly lower levels of effort on the PCOA, on average, compared to students in the highest region (GPA ≥ 3.6). No differences were identified between students in the lowest and middle GPA regions, however. Bivariate correlations (Spearman's rho) were also estimated between students' ratings of the extent to which they believed the PCOA could measure each of the six domains noted earlier (Table 4) and their total effort on the exam.
As reported in Table 4, all correlations between the items for student confidence in the PCOA and their effort on the exam were statistically significant, positive, and weak to moderate in magnitude (0.330 ≤ ρ ≤ 0.399). This suggests that students with a higher level of confidence in the PCOA's ability to measure the four content areas and APPE- and NAPLEX-readiness generally reported higher levels of effort while taking the exam.

The final set of questions asked students to rate the perceived effect of various performance-based incentives on their motivation to perform well on the PCOA (Table 6). A 5-point scale ranging from 1 (would not affect) to 5 (would strongly affect) was used. The results reveal that using the PCOA as a determinant for progression, remediation, or APPE placement would have the strongest perceived effect on students' motivation. To a lesser extent, students were motivated by a whole-class reward for performance, a drawing for a prize for participating in the exam, or a letter of congratulations from the dean's office. Overall, the use of either an individual or a class-wide dress-down day appears to have little to no effect on student motivation. However, the PSQ revealed that only four of the eight participating S/COPs had a student dress code in place. Non-parametric tests (Mann-Whitney U) showed that students attending the four S/COPs with a dress code reported significantly higher changes in motivation in response to an individual (U = 9284, p = .039) or class-wide (U = 9292, p = .048) dress-down day. Nonetheless, even at S/COPs with a dress code, the dress-down incentive had the lowest perceived effect on motivation among all types of incentives.

A Friedman's two-way ANOVA showed statistically significant differences in students' self-reported change in motivation on the PCOA based on the type of incentive (χ2(7) = 709.83, p ≤ .001). Pairwise comparisons confirmed that the ratings were statistically equivalent for the first three types of stakes (progression, remediation, and APPE placement) and that these were statistically different from all other incentives. Additional analyses were conducted to test whether any differences existed based on student demographics regarding the effect of the various incentives. Results from non-parametric tests (Mann-Whitney U and Kruskal-Wallis) showed no statistically significant differences between male and female students or among students across different GPA ranges, given an alpha level of 0.05.
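As a minimal, hedged illustration of the omnibus test reported above (the actual analyses were run in SPSS), the snippet below applies scipy's Friedman test to toy paired ratings; the incentive names and rating values are assumptions for demonstration only.

```python
# Minimal sketch of a Friedman test comparing paired ratings across
# incentive types. Data are invented: each list holds one incentive's
# 1-5 motivation ratings from the same five students.
from scipy import stats

progression = [5, 4, 5, 5, 4]   # stake: progression decision
remediation = [5, 4, 4, 5, 4]   # stake: required remediation
dress_down  = [1, 2, 1, 2, 1]   # incentive: dress-down day

chi2, p = stats.friedmanchisquare(progression, remediation, dress_down)
print(f"Friedman chi-square = {chi2:.2f}, p = {p:.4f}")
```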
Discussion
Our study provides a unique perspective on students' perceptions of exam preparation, personal and program impact, and motivation to study for the PCOA, something that has yet to be reported in the literature. In general, results from this study can be used by programs to inform their curricular assessment, student assessment, and remediation decisions.

Students in this sample prepared very little for the PCOA. Less than half of respondents reported any preparation at all, and the vast majority of those who did prepare reported doing so within a week of the exam and with little effort. Notably, most students who did prepare took advantage of school-facilitated programs rather than self-directed review plans. The variability in student preparation makes it difficult to know whether the results of the exam can be reliably used as intended, especially for the simultaneous assessment of students and curricula. One aspect of student preparation for the PCOA that remains virtually unknown is its potential impact on individual student results. Only the study by Naughton and Friesner11 evaluated students' perceptions of performance versus actual performance (n = 157), finding that only students' perceptions of performance on the biomedical sciences section related to actual performance. This suggests that students do not have accurate perceptions of their own abilities, that many were capable of performing at a higher level than they actually did, or a combination of the two. Limited information exists in other health professions regarding students' perceptions of curricular exams, but in medicine, studies have shown conflicting evidence on the relationship between student preparation and performance.17,18
Giordano et al.17 attempted to develop a predictive model of United States Medical Licensing Examination (USMLE) Step 1 scores and found that performance was not related to the number of days studied or to students' perception of the appropriateness of time spent studying.17 Kumar and colleagues evaluated medical students' self-reported study habits and found that students who studied 8–11 hours per day had higher USMLE Step 1 scores than those who studied less per day, but no additional benefit was seen in groups studying > 11 hours. No significant differences were seen in Step 1 scores between those who prepared for 20 days or fewer and those spending 21–40 days; scores fell for those preparing > 40 days.18 While the USMLE differs from the PCOA with respect to student stakes, medical programs use it as one of the markers for "curricular success." The stakes of the USMLE for students and programs potentially make it useful for simultaneous assessment of students and curricula.
Several programs have reported a variety of methods for selecting a cutoff below which students must remediate the PCOA,6,9,19–22 indicating that certain institutions may be using these data for student assessment in addition to curricular assessment. If an individual program intends to use PCOA results for curricular revision, that program should consider whether the most valuable data for that purpose would come from students in a prepared or an unprepared condition. If the desire is for a pure measure of knowledge retention (unprepared condition), this would likely preclude the use of stakes such as tying performance to remediation decisions. Once stakes are introduced, it is reasonable to assume that students will prepare in order to perform better and avoid additions to their existing workloads, delayed graduation, or other consequences. This assumption is supported by the results of this survey, which indicated that students would be motivated to a greater extent if the exam were tied to stakes such as remediation or progression.

Very few students reported participation in remediation tied to their PCOA performance. This is consistent with an earlier survey of assessment and administrative personnel in S/COPs, in which < 25% of programs reported requiring remediation for poor performers.3 The current study is also similar to the previous survey in that various remediation processes were reported, ranging from utilization of a NAPLEX preparatory course to review of class notes and recorded lectures. Combined with what program administrators have previously reported, the current study supports the suggestion that most S/COPs are not presently utilizing the PCOA to identify at-risk students and provide remediation opportunities for those in need. This may represent a missed opportunity. In addition, the extent to which existing remediation efforts are effective remains to be seen, which may be a key reason why programs infrequently require remediation based on PCOA results. Identification of best practices for selecting students in need of remediation at standardized points within the pharmacy school curriculum represents an area of need within the academy. There is also a need to identify best practices for the provision and content of remedial programs for students who do not perform adequately on the PCOA or other benchmark assessments.

Students in this sample perceived the PCOA to measure the four domains of the exam blueprint moderately well, although the administrative sciences had a significantly lower rating than the other domains. However, it is unknown whether students' potential lack of understanding of the distinctions between the four PCOA content areas may limit the validity of these ratings. Students reported less confidence in the exam's ability to measure APPE- and NAPLEX-readiness. Of interest, students were more likely to use their PCOA performance to inform their approach to NAPLEX preparation than APPE preparation. This may be linked to the knowledge-based content assessed by the PCOA, as opposed to the practice-based skills more closely aligned with APPE performance.
These results are consistent with previous publications reporting that programs usually utilize PCOA results to assess students' knowledge-based abilities and the success of their individual program's curriculum.5,10–12,23 Unlike the NAPLEX, which was developed with the intent to assess practice readiness, the PCOA was created to assess didactic curricula.2 Even so, about one out of every five students reported using the PCOA to influence their NAPLEX preparation by increasing the intensity of their preparation efforts, identifying areas on which to focus their study efforts, and increasing the likelihood of purchasing NAPLEX preparation materials.

Student motivation when taking the PCOA varied and appeared to be correlated with GPA. Students who did well in their courses, as evidenced by higher grades, also had higher self-identified motivation scores on average. Our results point to the need to provide motivation specifically to students with low- and mid-tier GPAs. If S/COPs wish to improve motivation on the PCOA, raising the stakes of the exam by affecting student progression, prioritizing APPE placements, or requiring extra work for low scores could be the most impactful. However, such negative consequences, such as remediation for a low score and/or delayed progression, may promote more intense student preparation, thus skewing the results away from a true measure of student retention.

Strengths of this study included the student demographic make-up, showing diversity in schools (public vs. private), location, age, gender, and overall student performance. Public S/COPs were somewhat overrepresented compared to the national landscape; however, the student demographics resemble national figures for student enrollment.24 In addition, the survey was designed in a scholarly manner, undergoing development from other published resources, consultation with professional working groups (Council of Deans Task Force Assessment Group for the PCOA), and beta testing prior to the larger release.