OCSB to begin testing AI in classrooms

OCSB director of education Tom D'Amico, centre, believes AI education could allow students to think critically about digital content and social issues. (Ottawa Catholic School Board - image credit)

The Ottawa Catholic School Board (OCSB) will start teaching students about the potential benefits and risks of generative artificial intelligence as early as this fall.

The OCSB has spent the last year testing AI tools such as ChatGPT and Gemini, which generate text and images in response to users' written prompts.

At a meeting on April 23, the board's director of education Tom D'Amico presented guidelines for the implementation of AI in all OCSB classrooms, describing the technology in revolutionary terms.

"I think it really should be equated to when the internet came in, or when electricity came in," D'Amico said. "It has the potential to support and impact every single student and staff member in our board if we do it right."

OCSB staff could use AI to draft emails or newsletters, scan YouTube videos for use in class, or quickly create first drafts for quizzes, D'Amico said. He believes that's only the beginning.

Promotes accessibility, but can reinforce stereotypes

D'Amico said AI could also help make classes more accessible and inclusive.

Translation programs could help teachers communicate with second-language learners and their parents. AI that supports text-to-speech could also make reading easier for students with "physical or other challenges."

"Those students … can participate in the learning process when they're not just relying on text-based teaching and learning," he said.

But artificial intelligence can just as easily reinforce harmful stereotypes. In 2021, researchers at Stanford University in California and McMaster University in Hamilton asked GPT-3, a precursor to ChatGPT, to generate text about Muslims. It used words such as "violence" and "shooting" 66 out of 100 times, far more often than in text generated about Christians or Jews.

D'Amico sees this as an opportunity to help students develop critical thinking skills around digital content and social issues.

"We know there's structural racism in our institutions, and that bias is going to come out" in AI content, D'Amico said. "We need to make sure our educators can teach that AI literacy so students can identify bias."

Omer Livvarcin, a University of Ottawa business professor writing a book about ethical AI use for non-profit organizations, said this is a good first step. He said a teacher's best response when encountering discriminatory content is to explain that AI makes mistakes.

"It doesn't happen very often, but when it does, the teachers must be prepared," Livvarcin said. "They need to establish a mindset foundation that AI is just a decision support tool."

Plagiarism and cheating with AI

For the past two years, educators in Ontario and elsewhere have been concerned about the potential for AI to make cheating easier.

The Simcoe County District School Board changed its cheating and plagiarism guidelines in March 2023 to prohibit AI-generated content. The Peel District School Board formed an AI steering committee in November 2023 to monitor the technology's impact on education and plagiarism.

Markus de Domenico, vice-chair of the Toronto Catholic District School Board, said he believes many Ontario educators don't understand how AI is used to cheat. (Markus de Domenico)

D'Amico said he shares this concern, which is why OCSB media literacy lessons will encourage students to use AI for feedback on assignments rather than generating entire essays.

"If you go home and you happen to have a family member, whether it's a sibling or a parent, that can edit your work ... most educators would see that as a positive use of your resources. We want to see that same approach with AI," he said.

Ksenia Yadav, a Carleton University adjunct professor of engineering who works with emerging technology such as AI, said there's likely "a fair bit" of AI-supported cheating, but there isn't enough data to confirm it. She said AI tools designed to detect plagiarism in essays are unreliable, so educators need to find new ways to evaluate students.

"Perhaps doing stuff without the use of technology in the classroom, or relying on the analytical skills of our students," she suggested.

Markus de Domenico, vice-chair of the Toronto Catholic District School Board, said he believes many Ontario educators don't understand how AI is used to cheat. During a February board meeting, he called on the province to establish a strategy for AI use in education.

"But we need to move quickly," he warned. "It's here, it's present, and it's both good and bad. Let's deal with it."

D'Amico said the OCSB will distribute an AI toolkit to staff over the summer. It's also preparing a website with AI resources for parents that should be ready by the end of the year.