When it comes to critical thinking frameworks, the one developed by Ennis (2015) remains among the most powerful and practical I’ve come across. I’ve used it in several graduate courses over the years, not just because it’s comprehensive, but because it translates well into classroom practice.
Ennis breaks critical thinking into two parts:
• Dispositions: the mindset or habits of the thinker
• Abilities: the actual thinking skills
I’ve been thinking lately: how can we apply this framework in an AI-rich classroom? So I put together this short guide. For each of Ennis’s 15 skills, I offer a concrete classroom activity or prompt that brings AI into the picture. The goal is simple: to help students build critical AI literacy.
According to Ennis (2015, pp. 32–33), ideal critical thinkers have the ability to
1. have a focus and pursue it,
2. analyze arguments,
3. ask and answer clarification questions,
4. understand and use graphs and maths,
5. judge the credibility of a source,
6. observe, and judge observation reports,
7. use their background knowledge, knowledge of the situation, and previously established conclusions,
8. deduce, and judge deductions,
9. make, and judge, inductive inferences and arguments (both enumerative induction and best-explanation reasoning),
10. make, and judge, value judgments,
11. define terms, and judge definitions,
12. handle equivocation appropriately,
13. attribute and judge unstated assumptions,
14. think suppositionally, and
15. deal with fallacy labels.
Now let’s explore how we can use these skills to enhance students’ critical AI literacy.
1. Have a Focus and Pursue It
Teach students to approach AI with a clear goal. Instead of vague queries, they should frame precise questions and evaluate how well the AI response meets their intended purpose.
2. Analyze Arguments
Ask students to prompt AI to generate arguments on a topic. Then have them break down the structure—identifying claims, reasons, evidence, and logical coherence.
3. Ask and Answer Clarification Questions
Model how to refine AI prompts iteratively. If an AI response is vague, ask: “What’s missing?” or “How could we clarify this further?”
4. Understand and Use Graphs and Maths
Have students use AI to generate data visuals or summaries and then assess their accuracy, logic, and limitations. Does the graph support the claim? Are there misleading patterns?
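If you want a concrete artifact to hand students for this activity, here is a minimal sketch (assuming Python and matplotlib are available; the activity itself doesn’t require code) that plots the same made-up numbers twice. The truncated axis makes a small change look dramatic, which is exactly the kind of misleading pattern students should learn to spot.

```python
# A minimal sketch: the same illustrative data plotted twice, once with a
# truncated y-axis (which exaggerates the change) and once starting at zero.
import matplotlib.pyplot as plt

years = [2021, 2022, 2023, 2024]   # illustrative values, not real data
scores = [72, 73, 74, 76]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

ax1.bar(years, scores)
ax1.set_ylim(70, 77)               # truncated axis: the change looks dramatic
ax1.set_title("Truncated y-axis")

ax2.bar(years, scores)
ax2.set_ylim(0, 100)               # full axis: the change looks modest
ax2.set_title("Axis starting at zero")

for ax in (ax1, ax2):
    ax.set_xlabel("Year")
    ax.set_ylabel("Average score")
    ax.set_xticks(years)

plt.tight_layout()
plt.show()
```

Students can then argue which chart better supports a claim like “scores rose sharply,” and why.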
5. Judge the Credibility of a Source
Train students to verify AI-generated claims. Was a source cited? Is it traceable? Reliable? This promotes healthy skepticism, not blind trust.
6. Observe, and Judge, Observation Reports
Compare AI summaries of texts, experiments, or articles with the originals. What key details were emphasized, distorted, or omitted?
7. Use Background Knowledge
Encourage students to notice when AI contradicts what they’ve already learned. Is it missing important context? Are its conclusions incomplete?
8. Deduce and Judge Deductions
Challenge AI to walk through reasoning step-by-step. Then ask students: Do the conclusions logically follow from the premises? Where does it go wrong?
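For instructors who want to make “logically follows” concrete, here is a minimal sketch in plain Python (an optional aside, not part of Ennis’s framework) that brute-forces truth tables. Modus ponens checks out, while affirming the consequent does not, which mirrors the kind of slip an AI’s step-by-step reasoning can contain.

```python
# A minimal sketch: brute-force truth tables to check whether a conclusion
# is true in every case where all the premises are true.
from itertools import product

def is_valid(premises, conclusion):
    """Return True if no truth assignment makes all premises true and the conclusion false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False
    return True

implies = lambda a, b: (not a) or b

# Modus ponens: "If P then Q; P; therefore Q" -- valid.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
               lambda p, q: q))   # True

# Affirming the consequent: "If P then Q; Q; therefore P" -- invalid.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
               lambda p, q: p))   # False
```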
9. Make and Judge Inductive Inferences
Have AI generalize from examples or patterns. Students then evaluate: Are the inferences valid? What’s the evidence?
10. Make and Judge Value Judgments
Feed AI an ethical dilemma or social issue. Then discuss: What values are implicit in the response? What’s missing? What assumptions does it make?
11. Define Terms and Judge Definitions
Ask AI to define complex terms like “truth,” “intelligence,” or “bias.” Students analyze the definitions for clarity, depth, and hidden bias.
12. Handle Equivocation Appropriately
Spot when AI shifts meanings mid-response (e.g., using “learning” differently in two parts of the answer). Teach students to detect and address this.
13. Attribute and Judge Unstated Assumptions
Have students ask: What is the AI assuming in this answer? Is it leaving out context, background, or cultural framing?
14. Think Suppositionally
Use “what if” prompts to explore alternate outcomes or perspectives. Students learn to engage in hypothetical reasoning and counterfactual thinking.
15. Deal with Fallacy Labels
Ask AI to generate flawed arguments—like strawman or slippery slope—and let students identify and correct them. It’s active, diagnostic thinking in action.
Reference:
Ennis, R. H. (2015). Critical thinking: A streamlined conception. In M. Davies & R. Barnett (Eds.), The Palgrave handbook of critical thinking in higher education (pp. 31–47). Palgrave Macmillan.