[Header image: a student walking on the quad in spring, a brain above him, the left half digital, the right half made of books]

“Avenues:” AI & Academia

Instead of the usual year-end wrap-up of trends or speculation about where AI may go next, I wanted to close 2025 differently. It’s not hard to find the ongoing and heated debates around AI, ranging from the highly political to the environmental to questions of practicality in the workforce. But putting aside these debates, hot takes, and opinions, the questions I most commonly get asked are still:

How do I stop students from using AI?

How do I help students learn AI?

These are two contradictory paths a faculty member can take, not just for their entire class, but for individual assignments or activities within that class. At WWU, AI is strictly prohibited unless a faculty member explicitly allows it, and it can be permitted or restricted at different times within the same course. This policy flexibility got me thinking about the various levels or approaches faculty might take when deciding whether to employ or prohibit AI.

Literacy or Addiction

I’ve had several conversations this term with faculty who are concerned for their students. One common worry is that some students have become addicted to AI use: whenever a student encounters something difficult or feels the friction of learning, instead of pushing through it, they immediately default to AI to produce a “better” version of their work. Meanwhile, sometimes on the same day, another faculty member expresses the opposite concern: they worry they need to equip students with AI knowledge and skills for the professions they’ll enter after graduation. Both perspectives are valid. And both recognize the same fundamental challenge from different angles: in an era of AI, how do we not only teach students, but also teach them how to learn?

Any technology is a tool, but that doesn’t mean everyone can, or should, use it the same way. When I was a kid, my father was a master builder and a contractor. Just because he was doesn’t mean I am. Sure, I learned under him and built many things over two decades, but I can also say that just because you own a hammer doesn’t make you a builder. I would even go one step further: just because you own a hammer doesn’t mean you should even be building. AI works the same way. In the hands of someone who already has knowledge, skills, and experience, it can unlock new options, opportunities, and optimizations (yes, I’m proud of that alliteration of O’s). But without that foundational literacy, knowledge, practice, and even confidence, it can lead to either more work or no meaningful work at all. If you don’t believe me, ask any homeowner who decided to DIY a project… only to pay double for a professional to fix what they’d done.

Exploring an Avenue Forward

As it stands, academia needs to address two fundamental questions with AI:

  1. Why do we learn what we learn? We need to help students understand why the learning itself matters, not just the credential or grade. And we need to incentivize that understanding in the learning process.
  2. Where can AI or any technology actually help? This depends entirely on what knowledge and abilities the student possesses or gains (see the hammer story above).

This is part of what literacy is, and notice I didn’t say “AI literacy.” Even before the pandemic, educators were already observing troubling trends in student preparedness for higher education. Then the pandemic highlighted two issues for me: education systems were unprepared to teach outside the physical classroom, and students were unprepared for the autonomy and ownership of their learning. AI’s ease of use, in my opinion, has only further highlighted these existing issues. The tool isn’t the problem; the lack of foundational learning skills is.

I know this sounds lofty and idealistic, but it starts from a premise that isn’t up for debate: the purpose of education, at any level, is learning and the transfer of that learning. For learners of any age to become truly autonomous, they need to believe that the academic challenges and rigors (the assignments, the quizzes, the exams, the class sessions) will yield what is promised in the learning outcomes of the course. When students can’t observe, believe, or feel that connection, their objective drifts from learning toward mere achievement. That’s when we lose them, and where AI might be gaining them.

The “Power’s-Out” Test

I like to pose this question to faculty and students alike: what do you do when the power goes out? It’s a simple exercise that highlights our dependence on modern conveniences and the habits they create. During the latest atmospheric river event in the Pacific Northwest, the power was out for hours. I know that sounds strange, but here in Bellingham I rarely experience an outage for more than a few minutes; compare that to my childhood in rural Washington, where a winter storm in my halcyon days could knock out power for a week. But the other weekend, with the lights out, my 9-year-old, who was home from school, had no problem being without power or internet. Some of my fellow parents are probably chuckling, since I don’t yet have a teen with a phone; I know some had a very different experience with their kids. But every time he entered a room, he’d reach for the light switch. He even laughed once and asked, “Why do I keep doing that? I know the lights are out.”

That automatic reach for the light switch is what we need students to recognize in their own AI use. The goal of education isn’t to eliminate tools; it’s to build the foundational knowledge that lets you choose when to use them versus when you’re simply reaching out of habit. My son knew the power was out, but his hand still moved toward the switch. Our students need to develop that same awareness: can I actually do this, or am I just reaching for AI because it’s there?

True learning means you can work without the technology. Not that you always should, but that you genuinely can. In any profession, there will be moments when the tools fail, when the AI gives you fluff, when you need to rely on what you actually know. The transfer of learning, the real measure of education, is whether students walk away with knowledge and skills they’re confident in, not just familiarity with tools that did the work for them.

Working with AI to Find Avenues for AI in Education

This next section came out of a presentation I gave this past fall term and is very much a work in progress. After those conversations with faculty I mentioned above, and while brainstorming with several AI tools on this topic, I realized we had a significant gap in instructional design guidance at WWU. It’s easy to state that “AI is prohibited unless expressly permitted by faculty,” but what does that permission actually look like in practice?

I feel there are no clear avenues or examples for faculty. What are the meaningful distinctions between different types of AI use? Why might a faculty member allow AI in one context but not another? Without this clarity, “permitted” becomes an all-or-nothing proposition for faculty who may already be weary of the topic of AI.

So I started mapping out what I’m calling “avenues”: specific ways faculty might allow AI that align with different learning objectives.

Again, this is a work in progress, but here are the main avenues I’ve identified:

1. Ideation Only

Students can use AI for brainstorming and exploring initial ideas, but not for creating actual content that appears in their work.

2. Research Exploration Only

Just as search engines optimized research discovery, AI can take that a step further. During the exploration phase, AI can help identify meaningful nuances in a research topic. This doesn’t mean students should rely on auto-generated summaries; I’ve lost count of how many times I’ve searched for research only to hit dead ends. But AI’s contextual understanding can reveal new angles or connections worth exploring that would otherwise be shrouded in the frustration of “not being able to find” applicable research.

3. Revision/Editing Only

Setting aside the ongoing debates about AI writing and AI writing detection, AI tools can provide valuable revision and editing guidance. I can imagine a near future where writers train lightweight AI models on their own style to serve as personalized editing assistants. But again, that only works if the student has already developed skills in their own writing.

4. Full Use With Documentation

Yes, full use. Why? Because in some professional fields, this is already expected. But there’s an important caveat: the person using AI must be proficient enough to vet the information, assess the writing quality, and maintain professional standards. This approach also teaches proper AI citation practices, which are becoming essential skills, and it teaches students to recognize when AI isn’t the right tool to use right away, or even at all. But if education doesn’t set those learning opportunities up, where does that leave students?

5. AI Encouraged for Learning, Not Production

This approach identifies specific parts of the learning process where AI could genuinely help students learn, while clearly explaining the reasoning behind each use. For example, a faculty member might encourage students to use AI during ideation to help organize what they’ve already discovered. Tools like NotebookLM are building environments for this, where the AI doesn’t make decisions for students but gives the learner opportunities to interact with the material they’ve gathered in alternative ways. These become customized study aids that enable different levels of engagement with the content.

Interacting with Learning

| AI Use Category | Permitted? | Description of Allowed Use | Disclosure Required? | Notes for Students |
| --- | --- | --- | --- | --- |
| AI Encouraged for Learning, Not Production | ✔️ Yes, With Limits | Concept clarification, study aid, writing feedback. | ✔️ Required if used. | AI cannot replace reading, reasoning, or original argumentation. |

Ideation and Exploration

| AI Use Category | Permitted? | Description of Allowed Use | Disclosure Required? | Notes for Students |
| --- | --- | --- | --- | --- |
| Ideation Only | ✔️ Yes, Limited | Brainstorming, topic exploration, question development. | ✔️ Brief note describing use. | AI-generated text cannot appear in final work. |
| Research Exploration Only | ✔️ Yes, Limited | Understanding concepts, identifying keywords, exploring background knowledge. | ✔️ Must state how it informed research. | AI cannot be cited or included in the final product. |

Production

| AI Use Category | Permitted? | Description of Allowed Use | Disclosure Required? | Notes for Students |
| --- | --- | --- | --- | --- |
| Revision/Editing Only | ✔️ Yes, With Limits | AI may help with clarity, structure, and grammar suggestions. | ✔️ Required. | AI cannot generate new sentences, ideas, or analysis. |
| Full Use With Documentation | ✔️ Yes | AI may assist throughout the process (research → drafting → editing). | ✔️ Required for all uses. | Student must verify accuracy and cite AI contributions (APA/MLA/etc.). |

Sample Language for Syllabi and Assignments

I wanted to create practical sample language for each avenue that faculty could adapt for their syllabi or assignments. Time and opportunity have been constant constraints this term, but here are some initial examples:

Brainstorming Allowed, Final Work Must Be Human-Generated

You may use generative AI tools only during your brainstorming or ideation process (e.g., exploring concepts, outlining possible directions).

For this assignment, the expectations are:

  • AI-generated text, arguments, or sentences may not appear in your submitted work.
  • Any ideas you explore with AI must be completely rewritten in your own words and supported with your own research and analysis.
  • You must disclose your AI use at the end of your assignment (e.g., “I used ChatGPT to brainstorm paper topics”) or cite it according to the required citation style.

Any use of AI beyond ideation will be considered academic dishonesty.


Idea Expansion Allowed, Not Drafting

You may use AI to help formulate initial questions or expand your understanding of unfamiliar topics, but AI may not be used to draft, outline, or phrase any portion of your submitted work.

Evidence of AI-generated drafting or editing will result in an academic integrity review.


Moving Forward: Teaching AI Literacy Through Modeling

By naming these avenues of where and when AI can or can’t be used, we move beyond “ban it all” or “it’s inevitable.” Faculty can build intentional approaches that vary by context, discipline, and learning objective, connecting AI use to transparent, assessable learning outcomes. What matters most is that students recognize when they’re reaching for a tool out of habit or convenience versus relying on their own capability and adaptability. AI, like the technologies that came before it and led to its creation, requires a scaffolded understanding: not just how to use it, but when and why to use it, and, perhaps more importantly, when to set it aside and use the “little grey cells.”

[Image: Poirot pointing to his temple, saying “the little grey cells”]

My view with AI, as with anything new, is that it’s literacy that matters. Not AI literacy. Not digital literacy. Just literacy: being capable with a tool, but even more capable in the ability to think, create, and solve problems when the “power goes out,” and knowing when that knowledge can be leveraged even further with the right tool at the right time. That’s the education that prepares students for whatever comes next… even what comes after AI.
