In its third year, DELTA provided funding for six projects, which were selected from a competitive field of 38 proposals. Each of the funded projects will deploy innovative uses of digital technology to enhance the university’s teaching and learning enterprise for a wide variety of populations, including Johns Hopkins graduates, undergraduates, faculty, patients, and the general public.
Principal Investigators: Carrie Wright, Stephanie Hicks, Leah Jager, Margaret Taub and John Muschelli of the Bloomberg School of Public Health
Research on college undergraduates shows that education is most effective when experiential. With the COVID-19 pandemic, there is an unprecedented need for engaging, active online course material. The Johns Hopkins Data Science Lab has pioneered the development of online educational material, using numerous technological platforms to build and deliver teaching materials at massive scale. Having developed the first massive open online course in data science, the program has recorded over 7 million enrollments since 2014. One recent innovation is the Open Case Studies (OCS) project, which provides self-contained, multimodal, peer-reviewed, and open-source guides built on vetted real-world examples that support active educational experiences. These guides can be used 1) in the classroom (either onsite or online), by engaging students in hands-on problem solving, and 2) outside of the classroom, by providing an archive of stand-alone examples of best practices.
We propose to leverage our diverse portfolio of capabilities to develop an OCS Delivery Platform that takes our OCS guides to the next level by 1) incorporating experiential interactive activities with real-time feedback, 2) increasing accessibility by creating a tool to translate OCS into other written languages, 3) enhancing usability by creating tools for finding and modifying OCS, and 4) enabling generalizability by empowering others, including educators with limited technological experience, with tools to create OCS. To assess our pedagogical method, we aim to 1) evaluate student and educator experience using our materials in the classroom and 2) track the usage of our materials and tools. Our proposed tools align with the DELTA award criteria and the 10 by 20 goals by enhancing the teaching and learning experience of the Johns Hopkins community and by further promoting our global presence in education.
Principal Investigators: Justin Jeffers, Therese Canares and Keith Kleinman of the School of Medicine and James Dean, Blake Schreurs, and Scott Simpkins of the Applied Physics Laboratory
Augmented reality (AR), although prominent in other sectors, has been slow to infiltrate healthcare education. There are numerous opportunities to use AR in training current and future healthcare providers, particularly around resuscitation and cardiopulmonary resuscitation (CPR) performance. High-quality CPR has been shown to improve pediatric cardiac arrest survival, yet adherence to current CPR guidelines is poor. Numerous attempts have been made to improve guideline adherence with varying degrees of success, but there is still room for improvement. This project, in collaboration with the Johns Hopkins University Applied Physics Laboratory, aims to develop AR software that improves CPR education and performance by providing real-time feedback in the direct field of vision of someone performing chest compressions. The software will be integrated into existing AR hardware, creating an affordable, portable, and effective means of training and debriefing on chest compression performance. The platform will be applied to a variety of learner groups across multiple disciplines. Evaluation and assessment will focus on learner improvement in CPR performance compared with traditional and current feedback models and education, initially as a pilot study and then on a larger, multi-institutional scale. Eventually, this work will be evaluated in the clinical setting to determine its direct impact on patient survival. The project addresses several of the university's 10×20 goals, including supporting the core academic mission, faculty-led interdisciplinary collaboration, enhancing the impact of Johns Hopkins Medicine, and developing the resource base necessary to support investments in key academic priorities.
Principal Investigators: Jeff Day and Bonnielin Swenor of the School of Medicine, Donna Schnupp of the School of Education, and Valerie Hartman of The Peabody Institute
Short animated explainer videos can be an effective way to communicate important educational messages to the public because of their shareability, quick messaging, and engagement. Moreover, they can be made widely accessible to people with disabilities by following guidelines from the Section 508 Amendment to the Rehabilitation Act of 1973 (Section 508). However, the guidelines for audio description (the narration of important visual elements on screen) are vague. In this project, we will create several explainer animations with varying amounts of narrated description and test them for preference with low-vision and normal-vision groups. We will publish and report the results with the expectation that they will clarify user needs for audio description. The subject matter of the explainer animations will promote the Johns Hopkins University Disability Health Research Center and Universal Design for Learning concepts, both of which expand usability for all learners. We will also create a video on an underrepresented health topic as a test of accessibility in public health communication. We hope these engaging, sharable videos will inspire educators and creators in the Johns Hopkins community and beyond to think more universally about teaching and to clarify needs for accessibility. This project partners the Schools of Medicine and Education with The Peabody Institute.
Principal Investigators: Luo Gu, Orla Wilson, Sakul Ratanalert and Patty McGuiggan of the Whiting School of Engineering and Robert Leheny and Meredith Safford of the Krieger School of Arts and Sciences
The transition to remote teaching has exposed a critical need in the science and engineering curriculum: how can the necessary laboratory skills be learned remotely? These skills involve not only scientific problem solving but also observational acuity and the ability to work collaboratively. To address this need, we propose using mixed reality headsets coupled with electronic notebooks in lab-based courses in WSE and KSAS. The headsets will enable not only real-time remote visualization but an entire interactive laboratory experience for students participating remotely. Assessment measures will be developed to evaluate the outcomes of using the headsets and electronic notebooks in the lab courses. If successful, this system will modernize the laboratory training experience, expand our educational offerings, and better prepare our students for their careers.
Principal Investigators: Kristen Brown, Shawna Mudd, Catherine Horvath and Nancy Sullivan of the School of Nursing and Nicole Shilkofski, Justin Jeffers, Julianne Perretta and Sandy Swoboda of the School of Medicine
Simulation is used in academia to provide experiential learning opportunities in a safe environment. However, providing students with a fully immersive simulation experience carries associated burdens of cost, space restrictions, and faculty time. Virtual Reality (VR, fully immersive) and Virtual Simulation (VS, screen-based) are tools with the capacity to reshape how we deliver experiential learning in healthcare education. The goal of this project is to implement an asynchronous and synchronous training platform that provides interprofessional education (IPE) experiences for both the Johns Hopkins University Schools of Nursing and Medicine. This cutting-edge project will be the first VR/VS platform in nursing and medical education to use multi-player technology, allowing for collaborative training and faculty debriefing to advance IPE. Using platforms for both asynchronous and synchronous training allows repetitive practice to maintain knowledge and competency in a cost-efficient manner. This increases enrollment capacity in online programs and the ability to integrate simulated experiences into multiple programs, producing a better-prepared workforce and improving the delivery of care.
Principal Investigators: T. Peter Li, Alex Johnson and Dawn LaPorte of the School of Medicine and Stewart Slocum and Arpan Sahoo of the Whiting School of Engineering
Orthopaedic surgery residents work up to 80 hours a week during their 5-6 year training program [1, 2]. In their limited off-duty time, they must self-study hundreds of topics to prepare for yearly OITE exams and a final ABOS certification exam, on which a significant portion of test-takers fail each year. We plan to develop a virtual study assistant, named Socratic Artificial Intelligence Learning (SAIL), to augment the education of orthopaedic surgery residents. Drawing on state-of-the-art methods in natural language processing (NLP), SAIL will identify and address knowledge gaps by engaging users with questions and answers in a conversational manner. The tool will evaluate the level of understanding demonstrated by user responses, which can serve as input to adaptive learning algorithms in existing orthopaedic education suites. Because SAIL is a hands-free auditory learning tool, residents can find more study time by using it on the go, e.g., while doing chores or commuting. In addition to making studying more convenient and enjoyable, we believe that using SAIL alongside other study methods can produce more knowledgeable physicians, bolstering patient care. We plan to evaluate SAIL with a short-term cross-over study of residents from the Johns Hopkins Department of Orthopaedic Surgery. We will gather data on the tool's accuracy (at distinguishing correct from incorrect user responses), user-friendliness, and effectiveness for learning (by measuring improvement between pre- and post-test scores).