Ed Luminaries

The Importance of Data Literacy in Higher Education

July 2023 | 37 min 31 sec


Episode Description

In this episode, we are excited to welcome Ashley Hurand, an expert in data literacy within the realm of higher education. With over 8 years of experience in the University Analytics & Institutional Research Department at the University of Arizona, Ashley brings a wealth of expertise to our conversation. Her unwavering passion lies in advocating for data literacy and empowering higher education institutions to make well-informed decisions through the strategic utilization of data. Join us as Ashley sheds light on the significance of data literacy in the context of higher education, offering invaluable insights and practical advice. Don’t miss out on this enlightening discussion that will reshape your perspective on the power of data literacy in the academic sphere.

Speakers

Ashley Hurand

Assistant Director, University Analytics & Institutional Research | The University of Arizona

Ashley Hurand is the Assistant Director for University Analytics & Institutional Research at The University of Arizona. With over 16 years of experience at UArizona, she leads the Student Data and Customer Experience & Support teams, focusing on robust student reporting, data visualization, and user experience. Ashley’s expertise in data literacy has contributed to the development of Interactive Fact Book, Arizona Profiles, and the Data Exploration Series. She holds a Master of Science in Management Information Systems, a Master of Social Work, and a Graduate Certificate in Business Intelligence and Analytics.


Alejandra Zertuche

CEO of Enflux

Alejandra Zertuche is the Chief Executive Officer of Enflux, an intuitive data analytics and decision support platform for higher education. She possesses a diverse educational background, holding a Bachelor of Science in Industrial Engineering from an accredited institution, an MBA from St. Mary’s University, and a Master of Science in Biomedical Informatics from UT Health at Houston. Alejandra is a highly skilled expert in academic assessment and accreditation and was named one of San Antonio Business Journal’s 40 under 40 and one of Texas’s Top 50 Women to Watch by The Society Diaries.

Transcript

Alejandra

0:00 
Hey, everyone. Thank you so much for joining us today. You're listening to the Ed Luminaries podcast, where we talk with educational leaders to find out how they're thinking and working creatively to drive student success. Today's podcast, "Data-Driven Remediation: How to Identify and Address Learning Gaps for Student Success," discusses the importance of utilizing data to identify student learning gaps and how to effectively address them. We're excited to have Erich Grant, Associate Professor, Department Chair, and Program Director at Randolph-Macon College. Erich is a medical educator and an expert in utilizing data and analytics to improve teaching and learning outcomes. He will be sharing his insights and strategies on how educators can incorporate learning-objective-level data into the remediation process to not only enhance student success but also support academic excellence. Erich is a physician assistant and has a passion for engaging and developing learners to help them reach their personal and professional growth and goals. He joined Randolph-Macon College in 2020 after 16 years as a faculty member at Wake Forest School of Medicine. Erich's areas of educational expertise include classroom learning environments, medical curriculum innovation, interprofessional education, and technology in medical education. His research has been presented at state, regional, national, and international conferences, and he has published work on patient safety training, family conference skill development, predicting student success in PA schools, interprofessional education design models, and scalability in medical education, as well as in his areas of medical expertise. We're thrilled to have Erich here with us today to share his knowledge and insights on utilizing data to identify and address learning gaps for student success. Erich, thank you so much for joining us today.

Erich

2:10 
Thank you all. It's a pleasure to be here, and I appreciate the chance to talk about some of these important topics for our learners and our faculty across medical education.

Alejandra

2:18 
Thank you so much. And I apologize for mispronouncing your school's name earlier. I said it wrong, and I'm like, no, we just talked about it; it's Randolph-Macon, like Randolph-Macon College. We're really thrilled to have you today, and I would love to start by asking you to share more with our audience about your background and how you became interested in physician assistant education.

Erich

2:47 
Thanks, I'd be happy to. Before I went to PA school, I was living in North Carolina, in Winston-Salem, and had the opportunity to teach at the local community college. The program was an adult high school program, so these were students who had dropped out of high school, whether due to truancy, pregnancy, disruptive home situations, or other problems, but they wished to come back and get their diplomas, and a typical high school would not be the place for them. So the community college system had a night school that would allow these young folks to come in and complete their high school requirements. I was frankly looking for a second job and recognized that I was qualified to teach this. So I taught consumer math, I taught English lit, I taught environmental science and other high-school-level courses, and I recognized in that moment the diversity of learning approaches and the challenges that face learners. This group of students had probably an elevated level of challenges, such as ADHD, lack of resources, or poor insight; maybe dyslexia was a little more common. It gave me a good set of foundational expectations for what teaching looks like beyond just knowing what you're talking about. Then, when I graduated from PA school, I wanted to apply this, and I had the opportunity to join the faculty at Wake Forest, starting as an instructor there in 2004. I slowly took on additional tasks, courses, and activities until I eventually became vice chair of the department over educational innovations. As part of that role, and as chair of the student progress committee, I got a graduate-level look at some of these same issues, which can be difficult for our learners. So that's really the quick background of how I got into education and into PA education.

Alejandra

4:41 
That's a wonderful journey. In both of your experiences, for instance, at the community college helping the students get their high school diplomas, and also at Wake Forest, is that where you found your passion for assessing students and trying to identify their gaps? Not just learning gaps, but also gaps like limited resources, or whether they have any disabilities or anything like that?

Erich

5:13 
I think that's fair. The primary thing I noticed working with high school students is how vivid their learning challenges are at that level. By the time you get to graduate school, grown-ups are pretty good at hiding what they're not good at. They're pretty good at showing their best side and maybe hiding their weaknesses, and perhaps not even addressing them if they don't know they have them. What we know from background data is that if learners are going to have real difficulty with curriculum, it's often in third grade or so, when they start really doing math for the first time in the US, and then in graduate school, when the bandwidth and the cognitive load get so high that if you don't have all of your systems running, you may find something there that didn't show up in high school or undergrad. Looking at it through that lens allowed me to see not just good or bad, smart or not smart, but that there's a lot of diversity in what can hold students back and what helps them be successful.

Alejandra

6:15 
Do you think that it has something to do with them feeling comfortable not knowing and admitting that they don’t know something?

Erich

6:25 
I do think so. I don't have data to support this necessarily, but if we think about the typical psychology of the student these days, there is reward for knowing and there is reward for high performance. There's not a lot of time in many American classrooms for lots of question asking and lots of uncertainty. So I think if that's not built into the learner's style, and the Socratic method is not used through the developmental years, it may not be in the nature of the learner to first think to ask and to express their doubt or lack of understanding at the graduate school level.

Alejandra

6:57 
Which is so unfortunate, because when you learn to feel comfortable being uncomfortable, meaning asking what we call stupid questions or dumb questions, or just saying, "Hold on a second, sorry for asking this, but I don't know what that means," as soon as you start feeling comfortable asking those questions, you realize that you can learn everything so much faster.

Erich

7:23 
We typically think about this with a competency curve, where at the very beginning most folks are unconsciously incompetent: they don't know what they don't know, and so they're very happy to be learners as long as it's not too unwieldy. Then we move into this place of conscious incompetence, which is where we feel like we know nothing, and finally to a place of conscious competence, where we know what we know and we're okay with the not knowing. That's when we become more comfortable asking questions from a good foundation.

Alejandra

7:55 
So, what are the things that you're seeing with the PA students currently at Randolph-Macon?

Erich

8:03 
That's a great question. The COVID pandemic has really made it interesting to assess learner preparedness for graduate school. Our typical metrics, like GPA, are a little harder to rely on now; there were a lot of complete/incomplete or pass/fail courses during the pandemic, and a lot of folks were learning core science courses online in virtual formats. We don't have a hefty amount of outcome data yet on what that means for graduate school preparedness, but the impression we're getting is that looking at the raw data like we used to won't quite do it. So we have to be prepared for an even more diverse range of readiness for graduate-level medical curriculum. That can be everything from a superficial understanding of anatomy to a lack of clarity on critical thinking skills or the application of those skills to clinical scenarios.

Alejandra

8:58 
What's your current process to identify the students that are struggling? And can you tell us a little bit more about the remediation process that you have in place?

Erich

9:10 
Yes, I can do that. We do weekly short quizzes that are based on the previous week's learning. These count for a fraction of their grade; frankly, a student could skip all of them and still not fail a course. They are very small, but they do count. These are a sampling of major topics from each of the courses from the past week, and that helps us check bandwidth: whether they're addressing all of the courses equally and able to grasp that information on a timely basis. We know that the speed of PA learning especially is what catches people by surprise. Then, when we move into our end-of-term evaluations, we use several different assessment methods. We have standardized patient assessments, which look at some of the psychomotor skills of how you perform a physical exam and how you engage a patient, the attitudinal skills of how you approach the patient's challenges, and then simply the knowledge of how you conduct your history and the questions you ask; we get a look at them that way. We also employ multiple-choice question tests as well as observed skills exams. And finally, what is probably most unique to our program, which we also developed at Wake Forest, is an oral exam similar to what physicians undertake with board certification. This is an oral examination based on a student case in which the student must sit down with a faculty member and go through their reasoning process, explain their decision making, validate the lab choices made and their interpretation thereof, as well as the assessment and plan. By taking the educational journey of the student apart and decoupling things as much as we can, we get a look at individual areas which might be weak. For example, we do find that sometimes students struggle to generate adequate hypotheses at first. Maybe they didn't go deep or wide enough in some of their book studies or their research, and so when a patient presents with fatigue, they have a very narrow differential instead of exploring widely. They may do everything else very well in the reasoning process or in their knowledge tests, but their hypothesis was too narrow. We can zoom in on that and think about: is that bad enough to merit remediation? Does it reach the benchmark or not?
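
A minimal sketch of what this kind of decoupled, component-level check might look like in code, assuming scores have already been collected into a simple table; the component names, student IDs, scores, and benchmark below are invented for illustration and are not the program's actual data or tooling:

```python
import pandas as pd

# Hypothetical per-student, per-component scores from one end-of-term evaluation.
scores = pd.DataFrame([
    {"student": "S01", "component": "hypothesis_generation", "score": 62},
    {"student": "S01", "component": "mcq_knowledge", "score": 84},
    {"student": "S01", "component": "physical_exam_skills", "score": 90},
    {"student": "S02", "component": "hypothesis_generation", "score": 88},
    {"student": "S02", "component": "mcq_knowledge", "score": 71},
])

BENCHMARK = 70  # assumed passing threshold; real programs set their own

# Keep only the component-level gaps that might merit a remediation review.
gaps = scores[scores["score"] < BENCHMARK]
print(gaps)
```

Because each assessment component is a separate row, a narrow hypothesis-generation score surfaces on its own even when the same student's knowledge and skills scores are well above benchmark.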

Alejandra

11:31 
And when you provide remediation, what does remediation look like? And what are some of the common challenges that you, or the faculty members who are helping you with remediation, face when they're trying to help the students?

Erich

11:47 
That's great. I think maybe I'll start with a definition of remediation. ARC-PA, our accrediting body, requires that remediation be a measurable and documented process that programs use to remediate identified deficiencies. ARC-PA also requires us to identify deficiencies in a timely manner, so the assessment mapping process needs to occur frequently enough that you can catch these issues, but not so frequently that you over-focus on things that need time to develop. In this case, remediation looks like a focus on the student's punctuated or specific performance gaps. We first take a look at all of our mapping and at how our assessments are mapped to learning outcomes; this is an ARC-PA standard as well. We've got to have our assessments match the learning outcomes we expect. We further map those down to instructional objectives, so most of the time we can look back to a lecture or a session or an activity where this learning was supposed to take place and target what happened in that session. Was the student absent that day? Is it a shared issue across the whole class, so maybe it's more our issue than the student's? Or, if a learning outcome or program objective is shared across multiple courses, we might be able to see a thematic issue with the student that requires a more elegant approach to remediation. The course directors typically dictate what happens in the remediation process, but we coordinate it through a central committee so that we don't triple the work for a student who may need a singular solution for a problem that's underpinning multiple areas of deficit.
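
One way to picture the "is it the student or the class?" check described here is a simple aggregation over item results that have already been tagged with an instructional objective; this sketch assumes such tagging exists, and the objective labels, students, and results are purely illustrative:

```python
import pandas as pd

# Hypothetical item results, each tagged with an instructional objective.
item_results = pd.DataFrame([
    {"student": "S01", "objective": "IO-3.2 biliary disease", "correct": 0},
    {"student": "S02", "objective": "IO-3.2 biliary disease", "correct": 0},
    {"student": "S03", "objective": "IO-3.2 biliary disease", "correct": 1},
    {"student": "S01", "objective": "IO-1.4 cardiac exam", "correct": 1},
    {"student": "S02", "objective": "IO-1.4 cardiac exam", "correct": 1},
    {"student": "S03", "objective": "IO-1.4 cardiac exam", "correct": 0},
])

# Class-wide miss rate per objective: a high rate suggests a curriculum or
# session issue, while a low rate with one student missing suggests an
# individual remediation target.
miss_rate = 1 - item_results.groupby("objective")["correct"].mean()
print(miss_rate.sort_values(ascending=False))
```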

Alejandra

13:35 
What are some… how do you leverage the data? How do you synchronize all of the assessments so that you have everything easily accessible and can quickly identify or assess students?

Erich

13:50 
This is great. With the testing platform that we use, we are able to tag the questions in our multiple-choice tests all the way down to Bloom's taxonomy, as well as ARC-PA standards, et cetera. That platform provides a strengths-and-opportunities report, which we can use for the student for individual courses. We can then lay those side by side, or put them into data tables, and look at strengths and opportunities across all the courses, looking for alignment of issues that are common across the courses or across the cohort. And I think that's the key you're pointing out, Ale: most programs will have some level of difficulty with this in terms of corralling all that data and putting it in front of the right people so they can see it and then make sense of it. If you're just looking at one multiple-choice test in a clinical medicine course, a big medicine course, it might be difficult for you to say much beyond, "Well, the student didn't know biliary diseases, and they struggled with a particular disease." But maybe there's something more about the student's understanding of anatomy and physiology and its relevance to the topic than just retaking that question a few times.
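
A small sketch of the side-by-side view described here, assuming each tagged item already carries a course, a content tag, and a percent-correct figure; the tags, courses, and numbers are invented for illustration, not output from any particular testing platform:

```python
import pandas as pd

# Hypothetical per-course, per-tag percent-correct figures.
tagged = pd.DataFrame([
    {"course": "Clinical Medicine", "tag": "anatomy & physiology", "pct_correct": 58},
    {"course": "Clinical Medicine", "tag": "pharmacotherapy", "pct_correct": 81},
    {"course": "Anatomy", "tag": "anatomy & physiology", "pct_correct": 63},
    {"course": "Pharmacology", "tag": "pharmacotherapy", "pct_correct": 77},
])

# Tags as rows and courses as columns makes cross-course themes
# (for example, a shared anatomy & physiology weakness) easy to spot.
side_by_side = tagged.pivot_table(index="tag", columns="course", values="pct_correct")
print(side_by_side)
```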

Alejandra

14:58 
Yeah, you don't want to prepare them to answer a question; you want to prepare them so that they're competent, mastering those competencies rather than just testing well. And sometimes it's difficult to help them when you don't see the whole educational journey. If you're only assessing how they did on one assessment, that might not be representative of how they're doing in the program, because I have also seen in the data that sometimes the questions were not written correctly, and that's what was confusing the students; it wasn't that the student didn't know the concept. So there's a lot of noise within the data as well. That's why I was intrigued about how you validate the data, integrate it, and have everything in one place so that you can provide remediation.

Erich

15:50 
Right. I can add one point to this as well, which is that sometimes it's test taking, and that's all it is. So we make sure that we do a mix of different test-taking types. For example, we use open-book multiple-choice testing, much like the PANRE-LA, the longitudinal assessment product that's out there, to give students a chance to hone their quick-reference skills and answer questions that way. We may see a student who answers those questions really well, and then when we give them closed-book exams, we find that they scatter all over the place; their comfort and confidence with test taking is not very good. So we would say, well, that student probably doesn't have a deep content deficit; they probably have test-taking problems. And there's good data to say that taking more multiple-choice tests will make you a better multiple-choice test taker. But does it really address the root cause of what might haunt this student in other exams and high-stakes exams, which is confidence and test-taking strategy?

Alejandra

16:48 
Yeah, definitely. That was me. I wasn't good at taking exams. I was better at closed-book testing than open-book.

Erich

16:57 
Because…

Alejandra

16:57 
Then I would want to validate all my answers when I had the book, and that would take away time from the other questions. But yeah, everyone has a different way of being good at testing as well as learning. And as you mentioned, ARC-PA has a definition for remediation. One, it has to be measurable, and you're doing a fantastic job there. Now, the second part is documentation. How are you documenting the remediation process?

Erich

17:29 
That's a great question too. I know this is done very differently by programs around the country, and there is no one perfect way to do it. What we do is create a longitudinal form that has three main parts. The first part is the notification to the student. That is a formal letter that comes from our progress committee, and it lets the student know what evaluations were performed and where they fell below benchmark. The next part is the remediation plan, which includes a deep dive on their areas below benchmark in terms of what I just discussed: assessment type, the frequency of the problem, and how deep it is, and then a suggested plan for remediation. That usually includes retesting them in the same way they were tested before, because we need to make sure we don't make it more difficult than the original assessment; sometimes faculty do get overzealous with their remediation, and the students do way more than is probably needed. The other key piece is what the re-learning is going to be. Re-learning is a term we use a lot: the plan through which the student will acquire the knowledge that was missing the first time, or acquire the skills or techniques needed to demonstrate competency with the maneuver. This is where the data really helps us, because if we just measure a student and say, "You got a 70 on that test, and that's not high enough; retake the test," they're redoing a lot of work that probably isn't needed. If we trust that our original questions were good measures of student knowledge, we don't need to ask them to do it all again right away; that doesn't make a lot of sense. So for the bandwidth issues, you try to target down to what's really wrong, as best you can tell with your data. The last piece of the form is the remediation outcomes form. That form asks yes-or-no questions: Did the student complete all the tasks on the plan, yes or no? Did they complete all the tasks at benchmark or better, yes or no? And are there any other areas of concern that arose during the remediation process that should be addressed through referral, et cetera? The faculty member must also document the method through which that competency was assessed, along with the raw scores and the benchmark they had to attain.
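
For readers thinking about how to keep such records consistent, here is a minimal sketch of the three-part record as a data model; the field names are illustrative guesses at the structure described above, not the program's actual form:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Notification:
    # Formal letter from the progress committee.
    evaluations_performed: list
    areas_below_benchmark: list


@dataclass
class RemediationPlan:
    deficits: list               # assessment type, frequency, depth of each gap
    relearning_activities: list  # how the missing knowledge or skill is reacquired
    retest_method: str           # matched to the original assessment format


@dataclass
class RemediationOutcome:
    all_tasks_completed: bool
    all_tasks_at_benchmark: bool
    further_concerns: bool       # anything to be addressed through referral
    assessment_method: str
    raw_scores: dict = field(default_factory=dict)


@dataclass
class RemediationRecord:
    student_id: str
    notification: Notification
    plan: RemediationPlan
    outcome: Optional[RemediationOutcome] = None  # completed after re-testing
```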

Alejandra

19:44 
That is a wonderful process. Are you documenting all of this in Word documents and storing them in folders? What advice would you give someone who would like to implement something similar?

Erich

19:57 
This is a good question too. Our institution has an internal document housing system that is secure, and it's very important that wherever you house these is secure; you don't want the wrong people accidentally, or on purpose, accessing sensitive student documentation like this. That's a risk you have to watch out for. So we have an in-house program that allows us to upload PDFs and documents, and that system also tags other people who might need to see them, such as academic excellence offices, advisers, et cetera, who can look at the forms but not alter them once they are uploaded by the faculty.

Alejandra

20:35 
That is wonderful, because you have to make sure that the right people have access to the right data and that it's kept in a safe, secure place. So that's wonderful to hear. Can you share with us some of your success stories? Can you give us some examples of how this process has led to improved student success and outcomes?

Erich

21:00 
Absolutely. Back in my Wake Forest days, I was directing the clinical reasoning and inquiry-based learning course; this was our problem-based learning small-group course. I was also the Chair of the Student Progress Committee at the same time, so I had a comprehensive understanding of the students' performance in relation to clinical reasoning. What we realized over a couple of years, with our students who required remediation multiple times in that course, was that there was a recurring pattern of errors. When we took the reasoning apart, as I mentioned earlier, into its different steps, the error would sometimes be a knowledge or content gap, but often it was bias: some bias in their reasoning process made them make a clinical reasoning error. We audited our curriculum and recognized that we really didn't talk about clinical reasoning bias in a formal way with the group. We may have had it scattered about, but it wasn't early in the curriculum, nor foundational. So we changed this around in the remediation process for clinical reasoning. The students had to read a chapter in Harrison's Principles of Internal Medicine on clinical reasoning. In that chapter, there's some nice information on biases and common types, such as anchoring and recall biases, that cause us to do things we don't realize; we often rely on heuristics and take shortcuts without realizing it at the time, but upon reflection, you can see it. Then the students are prompted to reflect on that process after reading the chapter and to identify what they did wrong: which bias they inappropriately employed. Then they're given another case to try again, with plenty of tricks or trap doors in it, and they have to identify those things and complete the case. What we discovered after implementing this approach with a few students was a decrease in the rate of repeat offenders; the same individuals were no longer making the same mistakes. Additionally, they began demonstrating an awareness of biases in their small-group learning activities and exams. They started utilizing this knowledge to safeguard themselves against errors. While there is conflicting data regarding the role of biases in clinical reasoning errors, for these learners it proved to be a valuable cognitive element. They recognized that biases could affect them as new learners. As a result, we decided to introduce this concept early on in the curriculum for all students. It became a standard component thereafter.

Alejandra

23:22 
What was the response from the students?

Erich

23:25 
It was really fascinating with the remediating students. Most students very much dislike remediation. They don't like being told they didn't do well enough. They don't like being told they're on probation. They howl at remediation, all the while the faculty are saying, "Trust me, this is good for you. It's something that's going to be good. Just wait." Almost universally, at the end they would say, "I'm so glad I did this. I had no idea what I was really doing in my head while I was making these conclusions or solving these problems. Now I can see my thought processes. Now I understand what I was doing." So there was genuine and authentic gratitude on the part of most folks who did it, despite the extra work.

Alejandra

24:06 
That's wonderful. You know, it's interesting how we have a negative view of remediation. If you were to tell me that I have to do remediation, I would almost feel like I had failed. But actually, you haven't; it's an opportunity not to fail. I think associating that negativity with the word might make some students feel hesitant about being part of it. But I'm so glad that they saw the benefit of the process, because if we think about it, sometimes even the students who didn't take part in the remediation would probably want to be part of the remediation process, because students benefit tremendously. At the school where I worked, we did have some high performers who wanted to be part of those remediation classes because they wanted to keep learning, get more practice, and everything else. They didn't see it as a negative thing; they saw it as something good: "Wait, you're providing additional support? I want that support. I want that additional supplemental study or information." So, yeah, that's really interesting. What advice would you give to schools that are trying to implement this? How can they support faculty in utilizing assessment data to improve the remediation process and enhance student success?

Erich

25:40 
Well, if you solve that for everybody, I think you've probably got a new company you need to start up; that would be a big thing. If you go to the national conferences for PA educators, there's always a packed session on remediation. Everyone loves going to these sessions because there are so many questions about these challenging situations, and sometimes they break our hearts with students; we feel so attached to their performance, and we want to help them, but it's so difficult in many cases. As for the things we can do better: I think clarity of linkages, from the program outcomes you're looking for at your educational institution, back to the learning outcomes for a course, down to the instructional objectives for an activity, and especially to the assessment methods you have in your syllabus. So make sure your syllabi have all of that. ARC-PA is now asking for it in the latest updates to the standards; it's been part of the provisional process for a while, but now they're applying it clearly to continually accredited programs. So you do have to have those in your syllabi. That's a lot of work, but it actually helps: when students don't meet an outcome or a benchmark, you know exactly how to trace backwards to what you need to do to help that student. This is a classic bandwidth issue for faculty as well, who are looking for ways to make this more streamlined. So make sure the course directors are talking to each other, and make sure there's a centralized process after a major assessment to look for themes for student support and improvement. You don't have to do siloed remediations across every course if there is a common theme, or if you can achieve the re-learning in a specific way; as long as the re-testing is fair and aligned with the benchmarks you've set for the rest of the class, the re-learning approach can be a little different from what happened the first time, as long as you're attending to what the student needs. I think those are the two biggies: make sure your pathways are clear between learning outcomes and your assessments, and there are a lot of platforms that can help you map and show that; and make sure you're talking to each other, or have a central person who's watching over those themes, so you don't overwork yourselves or the student.
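
As a closing illustration of the linkage clarity described here, a toy traceback from a missed item to the program-level outcome it supports; every identifier below is invented for the example and stands in for whatever mapping a program actually maintains:

```python
# Invented identifiers standing in for a program's real mapping tables.
assessment_to_objective = {"CM-midterm-Q14": "IO-3.2"}
objective_to_outcome = {"IO-3.2": "CLO-3"}
outcome_to_program_goal = {"CLO-3": "PLO-B: clinical reasoning"}


def trace_back(item_id: str) -> list:
    """Trace a missed item back to the program-level goal it supports."""
    io = assessment_to_objective[item_id]
    clo = objective_to_outcome[io]
    plo = outcome_to_program_goal[clo]
    return [item_id, io, clo, plo]


print(" -> ".join(trace_back("CM-midterm-Q14")))
```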

Alejandra

27:55 
And making sure they put everything in a safe place as well. Yeah, absolutely. It's extremely important to keep the students' data confidential and have only the right people with access to it. Erich, thank you so much for joining us today. It has been really insightful… Before we say good-bye, do you have any other final thoughts or advice that you would like to share with our listeners?

Erich

28:22 
If we're talking about remediation, I think my advice would be to talk about this with your students early, as soon as they hit the program's front door, and let them know that remediation is a normal thing; very few people go all the way through medical training without having something to fix. There are plenty of wonderful providers out there, amazingly accomplished folks, who are not the best in the world at one or two particular things, and they're well served by asking for help when that happens. Our programs are beholden to make sure we identify any gaps that are out there, and you're going to want that to happen, and you're going to want your students to buy in. So my suggestion would be to destigmatize it as much as you can and build a process that's collaborative and helpful for the students.

Alejandra

29:03 
That's wonderful advice, because that would take away the negative view of remediation, so the students don't see it as failing or as a process they don't want to be part of. Well, thank you so much for joining us today. And to our listeners, I hope you found our discussion with Erich Grant insightful and informative. Erich shared his experience with utilizing data to identify and address learning gaps in the most effective way. We explored strategies for incorporating learning-objective-level data into the remediation process and using metacognition to address student deficits in a targeted and effective manner. As we wrap up this episode, we want to encourage you to reflect on the insights and strategies shared today, so that you can come up with innovative ways to put your remediation process in place at your institution. Thank you for listening to today's episode. You can subscribe to our events by going to enflux.com. You can also find us on LinkedIn, where we post announcements, our solutions, and resources like today's session. This is Ale, and you have been listening to Ed Luminaries.

The End



Subscribe to our newsletter so you don't miss the next episode.