User-friendly artificial intelligence tools like ChatGPT are still new, so professors aren’t sure how they will shape teaching and learning. That uncertainty applies doubly to how the technology will affect students with disabilities.
On the one hand, these tools work like personal assistants. Ask ChatGPT to create a study schedule, simplify a complex idea, or suggest topics for a research paper, and it will do just that. That can be a boon for students who have difficulty managing time, processing information, or organizing their thoughts.
On the other hand, concerns about cheating may lead professors to change how they test and assess, which can disadvantage students who do poorly on oral exams, in-class tests, and the like. And students who lack confidence in their own abilities may let these tools’ output replace their voices and ideas rather than using AI as a learning aid.
Of course, such scenarios apply to a wide range of students. You don’t have to have ADHD to struggle with organized thinking, and oral exams aren’t hard only for students with acute anxiety. But education experts worry that in the rush to understand or curb these tools, teachers may fail to consider how they affect students, especially those with disabilities.
“People take academic honesty and academic integrity seriously for good reason, and we’re trying to redefine what that means in light of these new tools,” said Casey Boyle, director of the Digital Writing Lab at the University of Texas at Austin, who also chairs the Working Group on Accessibility of Digital Content. But people are only now starting to talk about the opportunities and challenges around AI and disability.
Students with disabilities have long faced challenges in the classroom, beginning with the difficulty of securing accommodations that support their learning, such as extra time for note-taking and tests, or permission to type rather than handwrite. Boyle said he has heard of instructors moving from take-home writing assignments to in-class, timed writing exercises to keep students from using ChatGPT. Students who struggle with cognitive load, dyslexia, or concentration tend not to perform well in those conditions.
“Students with disabilities and students with special needs are already learning uphill,” Boyle says. “When we overreact, we make that hill even steeper.”
Professors are understandably concerned that students may use AI tools inappropriately, but some education experts warn against banning them outright, noting the ways they can help students with disabilities:
- Students with mobility impairments may find research easier with generative AI tools like ChatGPT and Elicit, which can reduce the need to travel to the library.
- Students who find conversation difficult, such as some on the autism spectrum, can use these tools for “social scripting.” A student might, for example, ask ChatGPT to suggest three ways to start a conversation with classmates about a group project.
- Students who have trouble organizing their thoughts may benefit from asking a generative AI tool to suggest an opening paragraph for the essay they’re working on. That isn’t plagiarism, but it can help them get past the fear of the blank page, says Karen Costa, a faculty-development facilitator with a particular focus on teaching, learning, and living with ADHD. “AI can help build momentum.”
- ChatGPT is good at productive repetition, a practice most teachers already use to reinforce learning. AI can take it further by letting students with slower information processing repeatedly generate examples, definitions, questions, and scenarios for the concepts they’re learning.
“I want students to develop critical thinking, and I don’t want them handing me AI-generated content,” says Manjeet Rege, professor and chair of the Department of Software Engineering and Data Science at the University of St. Thomas. But suppose a student has just spent three hours in a lecture session, he says: “At the end of the lecture, if you want to take a concept we covered, drop it into a generative AI model, and see similar examples so you can understand it better? Yes, of course, that’s something I encourage.”
Education experts say teachers themselves can also use AI tools to support students with disabilities. One way to do that might be to run a syllabus through ChatGPT to improve its accessibility, said Thomas Allen, an associate professor of computer science and data science at Centre College, in Kentucky.
Allen, who has ADHD, is especially aware that an overly complicated syllabus can hinder students. A 20-page, graphics-heavy document, for example, can trip up students with a range of disabilities, including low vision, dyslexia, autism, and ADHD. “It’s about using AI to solve the problems we created because we didn’t have accessible classrooms in the first place,” he says.
Disability advocates have long encouraged teachers to use an approach called Universal Design for Learning (UDL). In a nutshell, this method allows students to approach the material in different ways. A common example is captioning a video. Another is to provide a textual description of the graphic. Proponents point out that these strategies can benefit all learners and create more inclusive classrooms.
“Professors who design their courses with UDL at the center of their pedagogy will be better prepared and more adaptable, not only to AI but to other strange and challenging things,” Costa says.
Education experts warn that these tools should be used with caution. In simplifying a syllabus or lecture notes, ChatGPT may change the meaning of a passage or add content that was never there, Allen notes. And it will reflect the biases of the human-generated language it was trained on. “You can’t trust the output as-is,” he says.
Risks and challenges
A more nuanced challenge, according to education experts, is that students with disabilities may be less confident learners, and therefore likelier than their peers to let AI output replace their own words and ideas rather than treating it as an assistant.
Consider, for example, students who submit a first draft of a paper to ChatGPT and receive feedback on clarity of language, coherence of argument, and other measures of good writing. The AI tool may change the language significantly, and not necessarily in ways a teacher would consider an improvement; yet students who lack confidence in their own work and perceive the tool as an expert may defer to it anyway. “The deliverables I’ve seen have been too rational, too linear, and too precise in a very unproductive way,” Boyle says.
One way to mitigate that risk is to teach all students about the strengths and limitations of AI. That includes showing them how to craft thoughtful, specific prompts to get the most helpful feedback, and discussing how generative AI tools can produce confident-sounding yet misleading or bland prose. It also means reminding students that ChatGPT is a word-prediction tool without real intelligence and should not be treated as a replacement for a teacher, counselor, or tutor.
“If you continually rely on the technology, you don’t grow and develop, because you’re dependent on it,” says S. Mason Garrison, an assistant professor of quantitative psychology at Wake Forest University. “While this is a problem for everyone, it can disproportionately affect those who are genuinely concerned that their work isn’t good enough.”
Disability advocates point to two other challenges that may affect students with disabilities more than others.
One is that using AI to generate ideas or ease the writing process can make a student’s work likelier to be flagged by AI detectors. That is a problem for many kinds of students, including those whose first language isn’t English. But neurodivergent students can face particular challenges when responding to an accusation, says Allen.
“Sometimes we have a hard time making eye contact, and we get fidgety. That’s part of our social challenges,” he says. Imagine being called in by an instructor or dean who says your essay was flagged and asks you to explain why you cheated. A student may have used the tool to take on and converse with character personas while still communicating their own thoughts, Allen adds, which is a different use case from simply submitting whatever the prompt spits out.
Another challenge is that many students don’t request accommodations until they need them, and how many students, until now, have ever had to take an oral exam or write an essay by hand?
“If it’s the first time a student has faced that format, they may not have realized they needed an accommodation, and so couldn’t arrange it in time,” says Garrison. “There will probably be a lot of surprises like that, and it may not even occur to professors to flag the change on the syllabus.”
One core piece of advice from education experts is to include students, especially those with disabilities, in developing policies for AI use. That will only become more important as generative AI evolves and is embedded in more technologies.
“It’s not your responsibility to figure this out and get all the answers,” says Costa. “Work with your students and explore this together.”