looking in the mirror

In a local op-ed I recently concluded that curriculum change in education is like fixing a plane in mid-flight, especially when the first principles of public education are not clear yet the curriculum is the same for everyone. Basically, a standardized curriculum is the confinement of the human experience. It is a blunt tool that winds up bullying someone. But nobody can take the government to task on first principles when they do not exist. Much of the blame lies with the professionals managing the system. They have handed this cobbled-together system over to each successive government to serve its political whims. And so it will continue.

The question of the role of education will become even more important as artificial intelligence tools become pervasive. Dave Cormier notes that "the vast majority of the human experience of learning about something is done at the novice level. That experience is about to be autotuned" by tools such as ChatGPT. These generative pre-trained transformers are designed to automate human work. Educators should be careful that they don't automate human learning. The only way to avoid this is to understand these powerful new tools.

Christian Talbot recently presented to ACS Athens on the role of AI in education. He strongly advocated for 'Humans + AI', as opposed to 'Humans vs. AI'. In the workplace, I refer to this as 'stepping in', or using machines to augment human work, which requires deep thinking. Christian concludes that there are some key questions about education that should inform students and educators alike.

  • Who am I?
  • Who are we?
  • What matters to us?
  • What are we going to do about it?

These questions form a compass for education, as opposed to a curriculum map. Even more succinct was Nel Noddings's advice that "the main aim of education should be to produce competent, caring, loving, and lovable people." AI, in the form of a tool like GPT, is a mirror of our society, says Christian, because our words have informed this AI. Any biases of GPT and large language models are our biases. If we, and our children, don't learn how to use these tools and understand how they work, then once again we will be manipulated by those who already own the post-truth machines.

"You don't need to change everyone's mind," argues Brittany Kaiser, former director of business development at Cambridge Analytica. "You just need to change the minds of the 'persuadables.' And the way you identify them is through understanding not just what they buy or say about themselves, but how they think. Through harvesting personal data, Cambridge Analytica could, and did, identify and persuade them."

Standardized courses are artifacts of a time when connections were few and information was scarce. That time has passed.
