Will AI Ruin How We Think & Know?
- Dr. Steve Underwood

- Jun 3, 2025
- 2 min read
As an educator, I’ve lately found myself a little uneasy about the rise of AI. Are you?
Not because of cheating — though yes, students can crank out essays, presentations, and more with a click. What really worries me is something deeper: the underlying social and cultural change we're facing, and how AI could slowly shift the way we *think*.
Let me give a historical parallel.
In the 1800s and 1900s, U.S. policies pushed to erase Native American and Alaska Native cultures — through boarding schools, language bans, and forced assimilation — so they would fit into Western culture. Part of that effort suppressed what's known as Traditional Ecological Knowledge (TEK), a way of understanding the world that had been passed down for thousands of years.
I’m not Native American, and I’m no expert, but my colleagues who are have helped me see just how much was lost — and how much is being fought for today as Native communities work to reclaim and revitalize those traditional ways of knowing that were damaged by larger societal shifts.
This has me wondering.
Who is going to safeguard the classical foundations of Western thought, our traditional Western ways of knowing, the ideas that have shaped much of modern education, so that we don't find ourselves having to reclaim and revitalize them later? Things like:
💡 **Philosophy** – Knowing your own worldview (e.g., materialism, idealism, dualism) and how to evaluate it and others.
💡 **Epistemology** – Understanding how we use reasoning and critical thinking to know what's true and what's not.
💡 **Virtue** – Asking what's good and why it matters, and knowing how to decide what is right.
These aren’t just academic ideas — they’re the foundation for how we think and communicate. The challenge that I see us facing in the years ahead is that people are using AI to do some, if not all, of their pre-thinking, brainstorming, and evaluating.
Granted, AI can be a useful tool. I'm not saying don't use it. However, AI pulls from existing language and data, so its "thinking" is always *borrowed*. Its output will be based on whatever predispositions its creators or its sources have about philosophy, epistemology, and virtue.
If we're not careful, we risk outsourcing our own thinking so much that we lose the core knowledge and ways of thinking our society was founded on, letting machine learning shape our minds and characters instead of the other way around.
So here’s my question to fellow educators and school leaders:
Who is making sure our students not only know how but *desire* to think for themselves — deeply, critically, independently — in a world where machines can let humans stay at a surface level for the sake of convenience?
If we don’t intentionally teach and protect these classical ways of knowing, we may wake up one day having lost something essential — not to policy, but to convenience.
Let’s not let that happen.