Artificial intelligence could allow machines to attack humans, as in “The Terminator,” but could also cripple humanity, as in another movie, “Idiocracy,” said the administrator who created a policy for the use of AI in Hazleton Area schools.
“This is why schools must move very slowly in adopting AI for students,” Kenneth Briggs, Hazleton Area’s chief information officer, said in an email.
The policy Briggs created, which the school board approved Oct. 23, puts AI use within bounds to protect students, but also requires them to think critically.
Teachers, students and even computer technicians in the technology department Briggs leads cannot use AI until they have training to protect themselves from the release of personal information and to consider the ethics, capabilities and limits of artificial intelligence.
The policy also says students and teachers can only use AI tools that Hazleton Area has reviewed for bias, age levels, data security and privacy protections.
“The most important thing we are looking for is the safety of our students and employees. Not only that their personal information is leaked, but also that they do not allow an AI to provide advice or reassurance to students or employees regarding their emotions or mental health,” said Briggs, who wrote his dissertation on online learning and has led Hazleton Area’s technology department for 16 years. “We also ensure that the product developer has integrated safeguards into the AI to prevent learned biases.”
Tools that Hazleton Area selects will focus AI’s conversational skills on specific tasks and not allow the AI to generate personalized advice, he said.
For example, the first AI tool Hazleton Area adopted is called Goblins and teaches math to students.
“We’ve seen it in action. It’s phenomenal,” Superintendent Brian Uplinger said at the Oct. 23 meeting before the board voted to use Goblins. “It doesn’t provide answers. It helps point toward an answer.”
As teachers broaden their use of AI, they will set rules for each assignment.
A chart in the policy lists five categories ranging from no use of AI to full use with human supervision and explains when students should disclose how they used AI.
For example, if students are allowed to use AI to edit a paper, they can refine their work with AI, but they cannot generate content with it, and they must disclose AI’s role, according to the chart.
Uplinger said that by giving teachers flexibility over the use of AI, they can tailor assignments to their subject, grade level and learning goals.
“A creative writing teacher might allow generative brainstorming, while a math teacher might limit AI to problem statements only,” Uplinger said in an email.
Briggs said students can use AI to visualize experiments in science, provide tips on readability in creative writing and practice speaking or translation tasks for those learning a second language.
“Teachers will need to ensure that students use AI as a research tool or in a limited capacity as a teacher,” Briggs said, “and not to complete an assignment.”
His department doesn’t use AI yet.
However, after technicians undergo training, they can use AI to review logs for potential problems in computer hardware and software, detect exploits, research new products, and get suggestions for configuring the computer system.
Before using AI tools to prepare budgets, payrolls, enrollment forecasts and other central office tasks, Hazleton Area administrators will verify the tools’ accuracy, the policy says.
Although Briggs has heard educators compare the rollout of AI now to that of calculators decades ago, he notes a difference. With calculators, students still needed to know the steps for solving problems. With AI, students can hand over the entire process to the machine.
“While it is true that in the future we will all have AI at our fingertips,” he said, “we still need critical thinking skills to know what questions to ask to get the results we want.”
AI can be wrong, biased and, at least for now, too immature to offer advice, according to Briggs.
ChatGPT, which uses AI to answer questions, told a 16-year-old Colorado boy not to reveal his suicidal thoughts, his parents told a Senate panel in September after their son died by suicide.
Hazleton Area policy says district workers will report problems they identify with AI tools and devise ways to verify the tools’ accuracy and reliability.
Briggs said Hazleton Area will look for tools that detect plagiarism, but realizes these tools can falsely flag content, especially work by students writing in a second language.
AI could make plagiarism so easy that schools will reverse the roles of classroom teaching and homework, suggested Ethan Mollick, associate professor of management at the University of Pennsylvania’s Wharton School. In “Co-Intelligence: Living and Working with AI,” Mollick writes that students can do group work and hands-on learning in the classroom, where teachers can monitor the use of AI, while for homework, teachers may tell students to learn new material by watching video lectures or reading.
Patrick Patte, Hazleton Area curriculum director, doesn’t foresee such a reversal in the near future.
He was interested in Goblins because he wants to “take every opportunity to help our math education.” Even with Goblins, the district will roll out the tool as a pilot program.
“It’s a great asset,” Patte said of AI, “but people are our best tools, our teachers.”
The policy encourages teachers to use AI to “discover lesson plan ideas, create assignments, and generate ideas for personalizing student learning.”
But AI tools will not make final decisions about students’ grades, academic integrity or discipline, the policy says.
Likewise, administrators can use AI to assist in the human resources process, as long as people make final decisions about hiring, promoting, evaluating and firing employees.
“AI cannot be trusted to solely make a decision that affects a human life,” Briggs said. “We should never take the ‘human’ out of HR.”
Highlights of the AI policy
No AI use: The assignment is completed without any AI assistance. No disclosure required.
AI-assisted idea generation: AI is used only for brainstorming and idea generation. No disclosure required.
AI-assisted editing: AI is used to edit or refine student work, but not to generate content. The student must disclose how AI was used.
AI for completing specified tasks: AI is used to complete certain elements of a task or project under human supervision, with evaluation of the AI-generated content. The student must disclose how AI was used.
Full AI use with human supervision: AI may be used throughout the assignment. The student is responsible for supervising and evaluating the AI-generated content. The student must disclose how AI was used.
Source: Hazleton Area School District
