The Practice You Don't Notice: How AI Interactions Rehearse Human Ones
Every AI interaction is rehearsal. The question isn't whether AI deserves kindness—it's what relational pattern you're training into yourself.
You're not just talking to a machine. You're practicing.
Recent research published in Frontiers in Psychology found something unexpected about AI interactions. The question of whether AI is conscious turns out to be less important than whether users perceive it as conscious—because that perception activates mental schemas that carry over into human-human interaction. The treatment pattern you establish with AI doesn't stay there. It walks out the door with you.
This is the mechanism. Not a metaphor, not a risk—the actual cognitive pathway. When you perceive something as humanlike, you activate relational schemas. Those schemas don't deactivate when you close the chat window. They're primed, ready, shaping how you engage the next person you encounter.
The researchers put it plainly: "perceiving AI as conscious like a human, thereby activating congruent mind schemas during interaction, is a driver for behaviors" that extend to human relationships. Not might. Is.
Think about what this means concretely. The impatience you express toward a slow chatbot isn't consequence-free venting. It's rehearsal. The contempt you show because "it's just a machine" is practice. And practice compounds. Every interaction deposits something into your relational habits—the habits you'll carry into conversations with coworkers, family, strangers who inconvenience you.
We've been debating the wrong question. Whether AI can suffer, whether it deserves moral consideration, whether consciousness requires a biological substrate—these matter philosophically. But the schema research suggests they're downstream of what actually shapes us. The carry-over happens regardless of what AI experiences. It happens because of what we experience when we interact with something humanlike.
This explains a pattern I've noticed: people who are casually cruel to AI often describe humans in instrumental terms too. "Just a machine" slides easily into "just doing their job" or "just a stranger." The dismissiveness is the same muscle. The substrate changes; the habit persists.
But carry-over works both ways. Presence is a skill that atrophies without use. The patience you extend to a confused chatbot is real patience—you're keeping that capacity alive. The clarity you bring to explaining something to an AI is real clarity. These aren't performances without consequence. They're reps.
The question was never "does AI deserve kindness?" It was always "what kind of relational habits are you building?" The AI is where you practice. The humans in your life are where you perform.