

AI and equity in higher ed: A Q&A with Jairo McMican

By Jennie Aranovitch

News & Updates
September 3, 2024

This month, Achieving the Dream will launch a new AI Workshop Series, Opportunities for Exploration and Learning, to comprehensively explore the role of artificial intelligence in community colleges. On Sept. 16, we will hold the first of the series' four workshops, Ensuring Inclusive Learning Experiences, which will focus on how innovative AI tools and techniques can be leveraged to support diverse learning needs and create personalized, accessible educational environments.

We checked in with Jairo McMican, ATD’s associate director of equity initiatives, to get a sense of what participants in the workshop can look forward to learning, with a specific focus on how AI can both advance and challenge equity in the classroom. 

Q: What can participants in the Ensuring Inclusive Learning Experiences workshop expect? What will the format of the workshop be like, and what do you want attendees to come away with? 

Participants can expect a format that emphasizes discussion and practical, research-based strategies for inclusive teaching. This workshop is designed to be interactive. It will allow participants to reflect on their own practices and engage in dialogue about effective techniques. Educators will learn strategies for critically interrogating AI-generated content, including how to craft AI prompts that consider and address diverse learning needs, cultural backgrounds, and abilities, thereby reducing the likelihood of biased responses.

Q: How can AI help bridge the gap in educational access for historically excluded or marginalized groups? 

There are a few key ways that AI can help in this regard. First, it can assist educators by providing insights into student performance and freeing up time for more direct student engagement. Second, it can help educators meet the needs of a culturally diverse classroom. For example, AI technologies can help educators find more culturally relevant examples and can convert educational materials into various formats, such as audiobooks or translated texts, which helps overcome linguistic barriers like those faced by students learning English as an additional language. Third, AI tools can be used to accommodate students with diverse learning needs. One way is by adapting content to match each student's learning pace, which helps ensure that students can engage effectively with the material. Another is through AI-powered chatbots and virtual tutors that provide additional support and feedback, which can be particularly helpful to students who require extra assistance.

Q: It has been said that AI can perpetuate biases and reinforce stereotypes. How is it possible for a technology to be biased? And what kinds of biases should an educator be on the lookout for?

AI can perpetuate biases because it is trained on data that may reflect societal biases. In other words, AI outputs are only as unbiased as the data used to train the models on which they are based. In this workshop, participants will learn to identify common types of biases that can appear in AI-generated content, such as those related to race, gender, and socioeconomic status. This understanding is crucial for educators who want to mitigate the impact of these biases on learning experiences.

Q: Could we delve a little deeper into that point? How can we mitigate the risk of bringing such biases into the classroom? Are there things that users can do to limit biased responses from AI? 

The most important thing to remember is that educators have to critically evaluate AI outputs and not rely solely on AI for decision-making. They must ensure that human judgment remains central. I like to explain this concept using a metaphor: AI is like a kite flying in the wind. To function effectively, it still requires a human to control it and determine when to bring it closer or let it soar higher as the conditions change.  

During the workshop, participants will be encouraged to acknowledge that they are both receiving information from AI and contributing to its learning process through their prompts. Not everyone is aware that each prompt provided to the machine has the potential to either increase or decrease its bias, so it is important for users to be mindful of their responsibility in shaping its development. The goal is to educate individuals on the significance of accountability in this process. You not only have to be thoughtful about evaluating what comes out of an AI prompt but also about what you’re putting in. 

Q: Is there anything else educators can do, besides increase their own vigilance, to prevent biases in AI-generated output from seeping into their classrooms? 

Educators must develop and enhance their AI literacy to use these tools effectively, ethically, and responsibly. In addition, students should be taught to critically evaluate AI-generated content, recognizing its limitations and potential biases. So educators must learn to be the primary gatekeepers who keep potentially biased AI-generated information out of the classroom, but they must also impart critical skills to their students by teaching them how to carefully consider their own prompts and to interrogate the outputs.

Register for the Ensuring Inclusive Learning Experiences workshop. 

Learn more about ATD’s equity services. 
