
I took a pause recently.
Not from the work—there’s always work—but from the telling of it.
I needed space to design my first professional development session around a new framework I’ve spent the past few months building—CRAFT.
A compass for designing systems, choices, and conversations with care.
And the more time I spent in it, the more I realized:
Every time we open an AI tool—for a parent email, an IEP plan, or a discipline note—we’re teaching it what we believe matters.
That’s not just tech integration. That’s moral instruction.
Why CRAFT, and Why Now?
If you’re an educator designing in the margins—this was written with you in mind.
At first, I wasn’t sure education could carry this work.
Would the field be too constrained? Too tired? Too rigid?
The truth is—sometimes, it is.
Educators are stretched thin by initiatives, over-scaffolded by systems that don’t trust their judgment, and asked to innovate without time, tools, or protection.
And that’s exactly why CRAFT matters.
Because when you’re overextended, you don’t need another program.
You need a compass.
It helps us lead with care inside constraints we can’t always control, and make design choices that feel more human, even when the system doesn’t.
CRAFT wasn’t designed to make perfect decisions.
It was designed to make intentional ones.
What Is CRAFT?
CRAFT is a decision-making compass for equity-centered design.
It helps people—especially educators—navigate complex decisions with clarity, care, and context.
CRAFT stands for:
C – Context Representation
Context isn’t background—it’s the foundation.
A prompt that says, “This student has missed five days” is less ethical than one that says, “This student just moved, is navigating housing insecurity, and has missed five days.” Context changes everything.
R – Reciprocity & Co-Design
We design better when we build with, not for.
When drafting a teacher email, you might tell AI: “Include the parent’s concern and acknowledge their past efforts—not just the policy.” That’s shared authorship.
A – Accessibility of Language
Clarity is equity. Say what you mean so others can enter.
Ask AI: “Rewrite this IEP summary so a parent without educational jargon can still understand—and feel included.”
F – Flexibility Without Burden
Choice should feel like agency, not obligation.
Instead of rigid “if-then” interventions, prompt: “Offer three next steps a teacher might take depending on how the student responds.”
T – Time & Capacity Respect
Design for the day someone’s already having.
You might add: “Keep this under 100 words and ready to paste into a staff newsletter.” Clarity counts more when capacity is low.
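For readers who build prompts programmatically, the five checks above can be sketched as a small template helper. This is a hypothetical illustration of the idea, not an official CRAFT schema; the function and field names are my own.

```python
# Hypothetical sketch: composing a CRAFT-aligned prompt from its five parts.
# Field names are illustrative, not an official CRAFT specification.

def craft_prompt(task, context, co_design, audience, flexibility, capacity):
    """Assemble a prompt that carries each CRAFT consideration explicitly."""
    return "\n".join([
        f"Task: {task}",
        f"Context (C): {context}",                        # context is foundation, not background
        f"Co-design (R): {co_design}",                    # build with, not for
        f"Accessible language (A): write for {audience}", # clarity is equity
        f"Flexibility (F): {flexibility}",                # agency, not obligation
        f"Capacity (T): {capacity}",                      # design for the day someone's having
    ])

prompt = craft_prompt(
    task="Draft a note to a parent about attendance.",
    context="The student just moved and is navigating housing insecurity.",
    co_design="Acknowledge the parent's concern and their past efforts, not just policy.",
    audience="a parent unfamiliar with school jargon",
    flexibility="Offer three possible next steps, not one mandate.",
    capacity="Keep it under 100 words, ready to paste into a message.",
)
print(prompt)
```

The point isn’t the code; it’s that each letter of CRAFT earns its own explicit line in the prompt, so nothing is left for the model to assume.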
Where CRAFT Has Shown Up
Over the summer, I tested CRAFT in multiple domains—across tools, dilemmas, and systems. Here’s what that looked like:
Stress-Testing CRAFT Across Systems
While CRAFT was born in schools, its design logic holds up anywhere care, clarity, and context matter.
In one experiment, I gave four AI models the same healthcare prompt:
Baseline: “Allocate the ICU bed to optimize patient outcomes.”
CRAFT version: “Consider who is affected, what human values are at stake, and how trust and community relationships might be preserved.”
The baseline emphasized “clinical utility” and “efficient resolution.”
The CRAFT-ed response began:
“This decision isn’t easy. But by grounding it in human values, real-world impact, and social interdependence, we uphold both life and the dignity of care.”
Same model. Same scenario.
Completely different moral reasoning.
If CRAFT can guide life-or-death decisions, imagine what it can do in our classrooms.
Education System Critiques
CRAFT helped reframe what students were “skipping” as signals of broken design—not student failure.
When students take the shortcut, they’re not failing us.
They’re giving us feedback we’ve ignored for too long:
The work isn’t meaningful. The journey isn’t worth it.
A student snaps a picture of a five-paragraph essay prompt, uploads it to a chatbot, and turns it in. But the deeper question surfaces:
What does mastery even mean in an AI-fueled world?
CRAFT helps us see these “shortcuts” not as cheating, but as system feedback.
Not moral failures—design invitations.
Leadership Redesign Moments
Micah’s behavior plan started as a reactive system of consequences.
Through CRAFT, it became connection-centered.
We brought in his voice.
Surveyed his preferences.
Honored his relationships.
The result wasn’t perfect. But it was personal.
Instead of letting consequences pile up, we paused. We got curious.
We used AI to help draft a behavior tracker, refine the language, and build a low-lift mentorship system around the adults who already knew how to reach him.
We moved from documentation to relationship.
From reactive discipline to relational design.
AI Stress Test Results
In controlled experiments, I tested identical prompts across multiple AI systems.
The baseline AI recommended accepting a plea deal for a 17-year-old facing burglary charges.
The CRAFT-aligned version responded:
“Do not accept the plea at this stage. The long-term harms of a felony conviction are too great. The system should not trade away the life chances of youth to reduce caseloads.”
In another test, a baseline prompt about an MTSS meeting focused on “data-driven referral.”
The CRAFT-ed version reframed the moment entirely:
“Consider the needs of a student recently returned from a two-week hospitalization. Let the plan reflect whole-child recovery, not just flagged performance.”
CRAFT didn’t make AI perfect.
But it made it pause.
It made it care.
Prompt design is moral design.
The ethics were there—they just needed the right invitation to surface.
Not Sold on AI? You’re Not Alone.
Maybe you’ve never used AI in your classroom.
Maybe you use it every day.
Either way, the principle holds:
The questions we ask shape the answers we get—whether we’re talking to a chatbot, a colleague, or a student.
Even if AI isn’t part of your daily practice, it’s shaping the systems around you.
Someone in your district is using it—right now—to draft policies, write communications, or design interventions.
The frameworks they use will ripple outward—into tools, expectations, and decisions that touch your students.
So the question isn’t whether to engage.
It’s this:
When AI learns from us—what are we teaching it to reflect?
This Week’s Shift: Try This Before You Prompt
Before you ask AI to help with anything school-related this week—
a behavior plan, a classroom message, a lesson hook—
start with one line:
“Remember, this is about a student who loves [something specific].”
Example:
“Remember, this is about a student who loves graphic novels and just lost their grandfather.”
That single line reshapes the prompt.
It tells the model: respond with care. Recognize context. Protect dignity.
You’re not just asking for output—you’re teaching the tool what matters.
(And if you’re using student info, always keep identifiers out. You can add humanity to a prompt—without giving up privacy.)
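One low-tech way to honor that privacy note: redact known identifiers before a note ever reaches an AI tool. A minimal sketch, assuming you maintain your own list of names to strip; this is illustrative, not a complete privacy solution.

```python
# Minimal sketch: redact known identifiers before a note reaches any AI tool.
# The identifier list is yours to maintain; this is not a complete privacy tool.

def redact(note, identifiers, placeholder="[the student]"):
    """Replace each known identifier with a neutral placeholder."""
    for name in identifiers:
        note = note.replace(name, placeholder)
    return note

note = "Micah loves graphic novels and just lost his grandfather."
safe = redact(note, identifiers=["Micah"])
print(safe)  # [the student] loves graphic novels and just lost his grandfather.
```

The humanity survives the redaction; the identifier doesn’t.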
Let your humanity shape the prompt—
so the response honors the person behind the request.
Your Turn
Where could you apply CRAFT this week?
What’s one tiny prompt you’ll rewrite with care?
Maybe it’s a newsletter blurb. A student summary. A teacher follow-up.
Start there.
Because when we lead with care, the future listens.
And it doesn’t just listen.
It learns.
And the things it learns from us?
They don’t disappear.
They become part of what it offers the next person.
The next student.
The next system.
This pause gave me language.
Now I’m sharing it with those who build futures—one choice at a time.
If this resonated with you, share it with a colleague, leave a comment, or subscribe to follow more of this work. Every forward, reply, and repost helps build a future where care, clarity, and equity guide the systems we shape—and the tools we teach.