Title: Use of a mobile application to support learning of evidence-based practice in higher education ABSTRACT FOR PUBLICATION: Background: In this proposal, we address the challenges associated with the use of a mobile application to support learning of evidence-based practice (EBP) among students in health and social care education. Research shows that students typically struggle to apply EBP in clinical settings. In partnership with students, we developed a mobile application (app), the EBPsteps, to better equip students to meet the expectations of evidence-based practice. The app guides students through the five EBP steps (ask, search, appraise, integrate and evaluate), enables documentation of the process, and provides links to internet-based learning resources. Our aim was to explore user experiences of the EBPsteps among bachelor students who had used the app during clinical education. Methods: We conducted four focus group interviews in 2017 with students from social education (n=10), occupational therapy (n=3) and physiotherapy (n=2). Interviewing different participant categories ensured comparative analysis and enabled us to exploit differences in perspectives and interactions. Interpretive description guided the data collection and analysis (Thorne, 2008). Results: We identified three integrative themes associated with use of the mobile application: “Triggers for EBP”, “EBP competence - a prerequisite”, and “Design matters”. Students reported that they used the “EBPsteps” app when exposed to triggers for EBP, such as an information need during clinical placement, a supervisor wanting them to find research, or demands from teachers. Several students felt that EBP competence was a prerequisite for using the app. In particular, a lack of skills in searching for research evidence was identified as a barrier. Links to learning resources in the app were helpful when competence was lacking, and students preferred these links to books about EBP.
When EBP competence was lacking, the design of the app was helpful, as it structured the process through the EBP steps and supported the students in working evidence-based. Students experienced the interface as intuitive: the app gave a good overview of the EBP process, facilitated the EBP steps and enabled them to store information in one place. Not all students realised the full potential of the app, for example the possibility of using it on both phone and computer, or functions such as email or the glossary of research terms. Our findings indicate a need to study user experiences of the app further and to develop an instruction video. Conclusions: The EBPsteps is a promising tool for supporting the learning of EBP within health and social care programs.
Background: Teaching and learning of evidence-based practice (EBP) should be interactive and related to clinical questions. Furthermore, students need tools to help them organize the learning process, and EBP educators need instruments to assess the competence of individual students and to evaluate study programs. Various critical appraisal tools are available; however, we have not identified a generic tool that covers all EBP steps and assesses EBP behaviour. Aim: To develop a generic EBP tool for learning and assessment purposes. Methods: Characteristics of the tool: The tool is designed for health care students and professionals. It requires participants to document all EBP steps, based on real clinical questions. Thus, the tool allows assessment of self-reported EBP behaviour. Testing of the tool: The tool was tested in different settings: in the curriculum for medical students in their 5th year (n=90), in a post-graduate course for clinical physiotherapy instructors (n=14), and in continuing medical education (CME) for physiotherapists (n=8). The usability and layout of the tool were improved on the basis of feedback from teachers and students, as well as the researchers' own experience. Development of a scoring instrument: A scoring system was developed to assess self-reported EBP behaviour. The Adapted Fresno Test and its grading rubric were used as a point of reference. Results: The tool and scoring instrument will be presented at the conference. Feedback from participants and our informal evaluation suggests that the tool is easy to use, structures the learning process and enhances EBP skills and EBP behaviour. Scoring criteria were negotiated between teachers until consensus was reached. Using a standardised scoring instrument may give more accurate scores. Conclusion: The EBP tool shows promise for students' learning and teachers' assessment.
However, both the tool and the scoring instrument need to be further tested and evaluated for feasibility, reliability and validity.