
February 25, 2026
In “Penn State at the Crossroads: The Verification Gap That Changes Everything,” Dr. Frank Archibald argues that AI has commoditized the visible markers of academic competence, making polished outputs insufficient evidence of real understanding. He contends that Penn State must shift from grading what students produce to assessing what they can independently verify and defend, or risk eroding employer trust and competitive standing. The article calls for immediate structural changes—especially requiring verification, stress-testing, and oral defense of AI-assisted work—to ensure graduates can justify their analyses under professional scrutiny.
The second in a series of seven articles examining artificial intelligence and higher education.
Two weeks ago, I attended the second AI informational session at Pine Grove Hall. The discussion was thoughtful—focused on policy, academic integrity, detection tools. But no one addressed the question that keeps me up at night:
What happens when Penn State graduates submit their first AI-assisted analysis to a decision-maker who asks, “How do you know this is correct?”
If the answer is “It looked good,” that graduate—and Penn State’s reputation—just failed a test we never prepared them for.
What Changed This Semester
For twenty years, I taught capstone design in Penn State’s Master of Manufacturing Management program. Students produced financial models, operational recommendations, strategic proposals. If the work was coherent, well-cited, and professionally formatted, I could reasonably assume competence had been demonstrated.
That assumption died this academic year.
AI can now generate all of those signals in under three minutes. Polished prose. Proper structure. Plausible analysis. What looked like mastery in 2022 may be sophisticated copying in 2025.
The visible markers of learning have been commoditized.
Penn State is still grading what students produce. We should be grading what students can verify.
The Test Penn State Can No Longer Pass
Here’s a scenario happening right now in internships across Pennsylvania:
A Penn State junior uses AI to create a market expansion analysis. Charts look professional. Recommendations sound confident. The intern submits it to the VP of Strategy.
The VP asks three questions:
- “What assumption drives your 18% growth projection?”
- “How did you validate these competitor data points?”
- “What happens to your recommendation if supply costs increase 12%?”
If the student cannot answer without reopening the AI chat, they just demonstrated they don’t understand their own work.
That VP will not blame the student. They will question Penn State’s standards.
This is not hypothetical. I’m hearing versions of this story from alumni supervisors in manufacturing, consulting, and finance. The phrase I keep hearing: “They can generate impressive work but can’t defend it under questioning.”
The Four-Stage Discipline We’re Not Teaching
Professional work in the AI era requires a new sequence Penn State barely acknowledges:
1. Define approval criteria before generating solutions
Identify the decision-maker, constraints, risks, and success metrics. Get agreement. Then design.
2. Direct AI with precision, not hope
“Design a solution that increases retention 15% within $200K budget while minimizing regulatory risk” beats “create a good strategy” every time.
3. Verify outputs independently
Audit assumptions. Recalculate key numbers. Confirm sources. Stress-test conclusions. Document weaknesses.
4. Defend recommendations under live questioning
Explain trade-offs. Acknowledge risks. Link every claim to validated evidence.
Right now, most Penn State courses assess Stage 1 and skip Stages 2-4 entirely.
That’s the gap. And employers are noticing.
What This Means for Penn State’s Competitive Position
Penn State competes in two markets simultaneously: student enrollment and employer trust.
If companies conclude that Penn State graduates produce polished artifacts but lack verification discipline, three things happen:
First, hiring managers discount the degree signal and add extra screening rounds.
Second, alumni hiring influence weakens. (“We used to prefer Penn State grads, but now…”)
Third, competitors who adapt faster—Georgia Tech’s AI verification modules, Michigan’s accountability simulations—gain reputational advantage.
This is not about whether Penn State “allows” AI. Every university does. This is about whether Penn State graduates can be trusted with AI-assisted decisions.
That’s the inflection point. And we’re behind.
The Immediate Action Required
Penn State doesn’t need a three-year strategic initiative. We need structural changes this semester:
For faculty:
Require verification documentation on every AI-assisted assignment. Assumptions audited. Key calculations independently confirmed. Sources validated with screenshots. Weaknesses identified.
For assessment:
Replace some written submissions with 10-minute oral defenses. Unannounced questions. Scenario modifications. “Walk me through this number.” No AI access during defense.
For accreditation:
Shift program learning outcomes from “produce analysis” to “produce and verify analysis.” Make verification a graduation requirement, not an add-on.
For administration:
Convene a Spring 2026 task force with one mandate: define what “AI-era professional competence” means at Penn State and how we assess it by Fall 2026.
Not study it. Define it. Implement it.
Penn State’s Choice
Two paths forward:
Path One: Continue grading polished outputs. Hope employers don’t notice the verification gap. Watch alumni hiring preference erode. Spend five years in committee discussions while competitors adapt.
Path Two: Acknowledge that AI changed the game. Raise standards immediately. Become the university that produces graduates employers trust with AI-assisted decisions. Lead the transformation instead of explaining why we didn’t.
Penn State has brand strength, research credibility, and alumni loyalty that most universities would kill for.
But brand strength doesn’t survive a reputation for graduates who can’t defend their own work.
The Standard That Matters Now
In 1995, Penn State graduates needed to produce good analysis.
In 2015, they needed to produce good analysis efficiently.
In 2025, they need to produce good analysis, verify it rigorously, and defend it under scrutiny.
Creation is easy. Verification is scarce. Accountability is everything.
Universities that teach all three will define professional standards for the next generation.
Universities that teach only the first will become expensive credentialing services for skills students could learn on YouTube.
Penn State is at the crossroads.
Every semester we delay is a cohort of graduates we send into the workforce unprepared for the question they’ll face in their first month:
“How do you know your AI analysis is correct?”
The answer cannot be “I trust it.”
The answer must be “Here’s how I verified it.”
That’s the standard. That’s the gap. That’s the choice.
About the Author
Dr. Frank Archibald worked in the aerospace industry and later spent more than 20 years at Penn State’s Applied Research Laboratory Water Tunnel facility. During that time, he taught multiple mechanical engineering courses and, for fifteen years, led a two-semester capstone design sequence in the one-year Master of Manufacturing Management (MSc) program. It was through this experience that he recognized the growing misalignment between traditional undergraduate education and professional decision-making environments. He now volunteers with Discovery Space and The Rivet and advises students in the Penn State Wind Energy Club as they compete in the U.S. Department of Energy Collegiate Wind Competition.
This is the second in a multi-part series examining artificial intelligence and academic credibility. Next week: “Embracing Top-Down Instruction in the Era of Agentic AI”

One Response
Thank you for your thoughtful article, Dr. Archibald. You’ve given me a good framework for understanding how to approach work I’m asked to assess and use. The work of getting to the output is no longer what it was, so rather than “build and synthesize,” the task is now “produce and prove.” In the 1980s, when I studied at Wells College (a small women’s liberal arts college, closed 2024), I had to take written and oral comprehensive exams in my major field, produce a thesis, and orally defend that thesis before my thesis committee. For a BA. That rigor prepared me for the future, and over the years I have come to appreciate it even more. I think that’s what you’re talking about here. I will look forward to future articles.