Types of Artificial Intelligence in Teaching & Learning

Not all AI does the same work — and not all of it raises the same concerns

← ROBOTIC & AUTOMATED … GENERATIVE & CREATIVE →
1. Rule-Based & Automated Systems

What they do: Follow pre-set rules and decision trees without learning from data

How they work: These systems use "if-then" logic programmed by humans. They don't adapt or learn—they simply execute predetermined instructions.

Real-world classroom example: A teacher sets up an auto-grading quiz where Question 1 is multiple choice. If a student selects "B," the system awards 1 point. If they select anything else, it awards 0 points. The system doesn't "learn" which answer is correct—it follows the rule the teacher programmed.
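The quiz-grading logic above can be sketched in a few lines — a hypothetical illustration, where the answer key and point values are teacher-programmed assumptions, not anything the system learns:

```python
# Rule-based "if-then" auto-grading: no learning involved.
# The answer key below is an invented example set by the teacher.
ANSWER_KEY = {"Q1": "B", "Q2": "D"}

def grade(responses: dict) -> int:
    """Award 1 point per exact match with the key; 0 otherwise."""
    score = 0
    for question, correct in ANSWER_KEY.items():
        if responses.get(question) == correct:  # the rule: if answer matches key
            score += 1                          # then award a point
    return score

print(grade({"Q1": "B", "Q2": "A"}))  # 1 — Q1 right, Q2 wrong
```

However many students take the quiz, the behavior never changes: the system only executes the rules it was given.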
Common Programs & Apps:
  • Google Forms auto-grading (for multiple choice/checkboxes)
  • Kahoot, Quizizz (when using preset answer keys)
  • Canvas Quiz auto-grading features
  • Blackboard automated feedback based on keyword matching
  • Email spam filters (basic rule-based versions)
  • LMS automatic deadline extensions based on conditions
Key issue: Often mislabeled as "AI," but these are long-standing automation tools. They can't handle nuance or context beyond their programmed rules.
2. Machine Learning (Pattern Recognition)

What they do: Analyze large datasets to identify patterns and make predictions

How they work: These systems are trained on historical data and learn to recognize patterns. They improve accuracy over time but don't create new content—they classify, predict, or recommend based on what they've seen before.

Real-world classroom example: An LMS analyzes thousands of past student submissions and notices that students who submit late on Assignments 1-3 are 78% more likely to fail the course. It flags at-risk students for early intervention. The system learned this pattern from data, not from programmed rules.
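The contrast with rule-based systems can be made concrete with a toy sketch — all data below is invented for illustration, and real early-alert systems use far richer features and models. The key point is that the flagging threshold is derived from historical records, not programmed by hand:

```python
# Minimal pattern learning: the risk estimate comes from data,
# not from an "if-then" rule a human wrote.
# Each record is (submitted_late, failed) for a past student (invented data).
history = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, False), (False, True), (False, False),
]

def fail_rate(records, late: bool) -> float:
    """Historical fail rate for the group matching `late`."""
    matching = [failed for was_late, failed in records if was_late == late]
    return sum(matching) / len(matching)

late_risk = fail_rate(history, late=True)      # learned: 0.75
ontime_risk = fail_rate(history, late=False)   # learned: 0.25

def flag_at_risk(submitted_late: bool) -> bool:
    """Flag a student if their group's historical fail rate exceeds 50%."""
    risk = late_risk if submitted_late else ontime_risk
    return risk > 0.5

print(flag_at_risk(True))   # True — late submitters failed 3/4 of the time
```

Feed the same code different history and it draws a different conclusion — which is also why biased training data produces biased flags.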
Common Programs & Apps:
  • Netflix/YouTube recommendation algorithms
  • Canvas Predictive Analytics & Student Success tools
  • Turnitin's originality detection (pattern matching)
  • Spotify/Apple Music "Discover Weekly" playlists
  • Amazon/Google product recommendations
  • Duolingo's adaptive lesson difficulty
  • Khan Academy's personalized practice recommendations
  • Smart LMS early alert systems (Starfish, etc.)
Key issues: Can perpetuate bias in training data; often opaque ("black box"); may prioritize engagement over learning; raises surveillance concerns when used to monitor students.
3. Generative AI (Content Creation)

What they do: Generate novel content—text, images, code, audio, video—based on prompts

How they work: Trained on massive datasets, these systems use complex neural networks to produce new content that resembles human-created work. They don't copy—they synthesize patterns to create original outputs.

Real-world classroom example: A student asks ChatGPT to "write a 500-word essay comparing the causes of WWI and WWII." The AI generates a unique essay that has never existed before, drawing on patterns it learned from millions of texts. Unlike Google (which finds existing essays), it creates a new one.
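A toy stand-in can show the principle of synthesizing learned patterns into new output. Real systems use large neural networks, not the tiny bigram (Markov) model sketched here, and the corpus below is an invented example — but the mechanism of recombining learned word-to-word transitions, rather than copying stored text, is the same in spirit:

```python
import random
from collections import defaultdict

# Tiny corpus (invented). A bigram model learns which word follows which.
corpus = ("the war began because alliances failed . "
          "the war ended because treaties held .").split()

transitions = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)   # learn word-to-word patterns

def generate(start: str, length: int = 8, seed: int = 0) -> str:
    """Sample a continuation word-by-word from learned transitions."""
    rng = random.Random(seed)          # seeded so the sketch is repeatable
    words = [start]
    for _ in range(length):
        followers = transitions.get(words[-1])
        if not followers:
            break
        words.append(rng.choice(followers))
    return " ".join(words)

print(generate("the"))
```

Because the model samples among alternatives ("began" vs. "ended", "alliances" vs. "treaties"), it can produce sentences the corpus never contained — novelty by recombination, at microscopic scale.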
Common Programs & Apps:
  • Text: ChatGPT, Claude, Google Gemini, Microsoft Copilot, Perplexity
  • Images: DALL-E, Midjourney, Stable Diffusion, Adobe Firefly
  • Code: GitHub Copilot, Cursor, Replit AI
  • Audio: ElevenLabs, Descript, Soundraw
  • Video: Runway, Synthesia, Descript
  • Presentations: Gamma, Beautiful.ai, Tome
  • Writing aids: Jasper, Copy.ai, Wordtune
Key issues: Questions of authorship and academic integrity; concerns about students outsourcing thinking; assessment validity; potential skill atrophy; equity of access; overreliance on AI for tasks students should learn themselves.
4. Assistive / Augmentative AI

What they do: Support and enhance human capabilities without replacing human judgment

How they work: These tools act as "cognitive scaffolding"—helping users refine, improve, or access their own work. They enhance what humans do rather than doing it for them.

Real-world classroom example: A student writes a draft essay, then uses Grammarly to catch grammatical errors and improve clarity. A blind student uses AI-powered screen readers to access course materials. A deaf student uses AI transcription to follow lectures. The student's thinking remains central—the AI removes barriers or refines expression.
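The design principle — support without substitution — can be sketched as a tool that surfaces suggestions but never rewrites the draft itself. The rules below are invented examples, far simpler than what a tool like Grammarly actually does:

```python
# Suggestion-only assistance: the tool proposes, the human decides.
# These lookup rules are invented for illustration.
SUGGESTIONS = {
    "alot": "a lot",
    "recieve": "receive",
    "could of": "could have",
}

def suggest(text: str) -> list:
    """Return (found, replacement) pairs; never modify the draft."""
    return [(wrong, right) for wrong, right in SUGGESTIONS.items()
            if wrong in text]

draft = "I recieve alot of feedback."
for wrong, right in suggest(draft):
    print(f"Consider replacing '{wrong}' with '{right}'")
# The draft string is untouched; the student accepts or rejects each change.
```

The boundary question in the "Key issues" below is exactly where this pattern breaks down: a tool that applies every suggestion automatically has crossed from scaffolding into substitution.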
Common Programs & Apps:
  • Writing support: Grammarly, ProWritingAid, Hemingway Editor
  • Accessibility: Otter.ai, Microsoft Immersive Reader, Read&Write
  • Translation: Google Translate, DeepL
  • Transcription: Zoom AI transcription, Teams live captions, Descript
  • Organization: Notion AI, Mem, Reflect
  • Reading comprehension: Speechify, NaturalReader
  • Note-taking: NotebookLM, Otter.ai meeting summaries
Key issues: Defining the line between "support" and "substitution"; ensuring equitable access (some tools costly); risk of students becoming dependent rather than developing skills; determining when assistance undermines learning goals.
5. Analytic & Surveillance AI

What they do: Monitor, track, evaluate, or detect student behavior and work

How they work: These systems observe student actions—keystrokes, eye movements, browsing patterns, writing style—and make judgments about authenticity, attention, or compliance. They operate as automated surveillance.

Real-world classroom example: During an online exam, Proctorio uses a webcam to track a student's eye movements, flags them for "suspicious behavior" when they glance away from the screen, and generates a report for the instructor. GPTZero analyzes an essay and claims it's "98% AI-generated" based on writing patterns, potentially triggering an academic integrity investigation.
Common Programs & Apps:
  • Proctoring: Proctorio, Respondus Monitor, Honorlock, ProctorU
  • AI detection: GPTZero, Turnitin AI detector, Copyleaks, Winston AI
  • Plagiarism detection: Turnitin, SafeAssign, Unicheck
  • Learning analytics: BrightBytes, Civitas Learning, Course Signals
  • Behavior monitoring: GoGuardian, Gaggle, Bark
  • Engagement tracking: Class Technologies, Top Hat analytics
  • Attention monitoring: Nestor, Proctortrack
Key issues: High false-positive rates (especially for multilingual writers, neurodivergent students); privacy violations; erosion of trust; algorithmic bias; psychological harm; questionable accuracy; lack of transparency in how systems make determinations.
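The false-positive problem has a base-rate component worth working through. The numbers below are illustrative assumptions, not vendor claims: even a detector that catches 98% of AI-written essays and wrongly flags only 2% of human-written ones produces mostly false accusations when few essays are actually AI-written:

```python
# Base-rate arithmetic behind detector false positives (illustrative numbers).
def false_flag_fraction(prevalence: float, sensitivity: float,
                        false_positive_rate: float) -> float:
    """Of all flagged essays, what fraction are actually human-written?"""
    true_flags = prevalence * sensitivity              # AI-written, caught
    false_flags = (1 - prevalence) * false_positive_rate  # human, wrongly flagged
    return false_flags / (true_flags + false_flags)

# Assume 5% of essays are AI-written, 98% of those are caught,
# and 2% of human-written essays are wrongly flagged.
share = false_flag_fraction(prevalence=0.05, sensitivity=0.98,
                            false_positive_rate=0.02)
print(f"{share:.0%} of flagged essays are human-written")  # ~28%
```

Under these assumptions, roughly one flag in four points at honest work — before accounting for the elevated false-positive rates reported for multilingual and neurodivergent writers.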

Important Clarification


Why This Matters