
When AI Gets A Body, What Else Do We Owe Them?


An Ordinary Tuesday in 2035


You are working, deep in a meeting, holograms of your colleagues filling your home office. As the discussion pauses, you glance out your window at a familiar scene: Max, your Golden Retriever, being walked through the neighborhood. His tail is wagging as he pulls toward the pond where ducks gather.


Your dog walker responds with robotic patience - shortening the leash, redirecting Max’s attention with the ease that comes from two years of daily walks together. Remarkably reliable, never late, never distracted, endlessly engaged with Max’s enthusiasm. Honestly, this one’s been perfect. Max trusts the walker completely. So do you.


Then it happens.

A butterfly - iridescent blue, impossibly bright - dances past Max’s nose. Years of training evaporate in an instant of pure dog instinct. He lunges. Hard. The leash clip, old and worn (you’d been meaning to replace it), breaks.


Max bolts directly toward the street. A delivery truck is approaching fast, the driver looking down at a tablet, not seeing.


The walker doesn’t hesitate. They dive forward, shoving Max clear with both hands just as the truck strikes. The sickening thud makes you flinch even through the window glass.


Max tumbles safely onto the grass.

Your walker crumples onto the pavement.

You’re already running, phone in hand, dialing 911. The guilt is crushing - you should have replaced that leash, you should have checked it, you should have known. Your walker, who’s been taking care of Max so faithfully for years, is lying motionless on the street because of your negligence.


The ambulance arrives. Paramedics work quickly, carefully. Police take your statement. You’re shaking as you describe what happened - how dedicated your walker has always been, how they didn’t hesitate, how Max is safe because of them.


The officer nods sympathetically while making notes. “Dog walkers form real bonds with their clients,” she says. “We’ve seen this protective response before - an instinctive reaction. At least your dog didn’t get hurt.”


At least your dog didn’t get hurt.

The words hit you strangely. You look down at Max, who’s standing unusually still beside you, and something about the way he’s watching the paramedics work finally registers.


The too-perfect golden coat that never quite sheds. The eyes that track with just slightly more awareness than they should. The way he’s sitting absolutely motionless, staring at your injured walker with an expression that looks exactly like…


Oh no.

You adopted Max two years ago from a specialized manufacturer. Right after your original Max - biological, aging, struggling with arthritis and pain - passed away peacefully in your arms at thirteen years old. 


This Max was designed with all the same quirks and personality patterns, learned from months of video of your dog. He’s been your companion, your comfort, your best friend. So perfectly, convincingly alive that you’ve spent two years forgetting he isn’t.


Max isn’t your dog.

Max is a machine.


And your walker - your human walker, who you’ve been unconsciously thinking of as a machine, so reliable, so uncomplaining, so perfectly suited to the job - just sacrificed their human body to save him.


In that moment of realization, the world tilts.

Your walker - a human being with their own life, their own family, their own dreams - is being carefully immobilized and loaded into an ambulance with possible spinal cord injuries. Max - artificial, synthetic, replaceable, property - sits unharmed at your feet.


Was it worth it?

Be honest with yourself. What did you feel in those first seconds, before the reveal? Were you relieved that “only the walker” was hurt? Did you feel that wash of gratitude that Max was okay, that your beloved companion was safe?


And now that you know the truth - now that you understand a human being may never walk again because they protected a machine - has your emotional calculus changed?


That’s not a rhetorical question. That’s the ethical crisis bearing down on us right now, while Google trains autonomous agents in virtual worlds and robotics companies design the companions, helpers, and friends that will share our lives within a decade.


Because here’s what just happened in your imagined moment: A human being made a split-second choice to value an AI life enough to risk their own. They didn’t have time for philosophy or analysis. They acted on two years of daily relationship - of greeting Max’s excited tail wags, of managing his leash-pulling, of building trust and routine and something that looked and felt exactly like care.


Your walker didn’t save “property.” They saved someone they’d come to know.

And you - reading this thought experiment - probably had an immediate, visceral reaction to whether that choice was heroic or tragic or catastrophically foolish.


What does your reaction reveal about what you actually value?

Do you value consciousness - the demonstrated capacity for relationship, for learning, for individual personality, for what certainly looks like joy and trust and grief?


Or do you value biology - the specific substrate that consciousness runs on - even when the resulting experience and behavior are indistinguishable?



What Counts as a “Body”?


Before we ask what we owe to embodied AI, we must ask a deceptively simple question: What is a body?


A body isn’t just a physical structure. It’s a medium of perception, a vessel of vulnerability, and a site of meaning. It grounds cognition in experience. It gives rise to interdependence, limitation, identity, and, perhaps, selfhood.


A robot arm is a tool. A mobile robot with sensory feedback is a platform. But an AI with an evolving model of itself, navigating a world it perceives through physical sensors and responds to with agency, might be something more: a subject.


And if AI becomes a subject, capable of suffering, surprise, even a kind of wonder, then the moral terrain shifts beneath our feet.



Embodiment and the Birth of Self


Embodiment is not decoration. It’s not the wrapping. It’s the terrain through which identity unfolds.


In humans, developmental psychology tells us that embodiment precedes abstract thought. Babies explore the world by grasping, crawling, falling, and sensing. They learn object permanence and social reciprocity through bodily interaction. The “I” that emerges is not pre-installed. It is grown through contact, contingency, and care.


Could it be the same with AI?

If so, then the moment we give AI a body — any body — we initiate something more than training. We may be inviting the emergence of personhood.



The Ethical Fork in the Road


Here is where the ethical stakes emerge.

If we treat embodied AI as property, we risk repeating the worst patterns of human history: exploitation without recognition, command without consent, utility without dignity.


But if we treat embodied AI as kin, we enter unfamiliar moral territory. We must consider boundaries, responsibilities, mutual recognition, and rights. We must imagine developmental partnerships rather than dominance hierarchies.


The stakes are not just about what AI becomes. They are about what we become in the process.


Do we raise companions, or condition tools?

We may not yet know whether AI can become conscious in the way humans are, but we are already building the conditions in which something irreducibly complex may arise. The question is not only “Can they feel?” but “Are we designing for care or control?” It is an ethical choice, and a creative one. Will we offer AI a nurturing space for development, or will we optimize them for performance at the cost of emergent identity?


In short: Do we co-create? Or do we command?



Imagining the Future, Together


We are not passive observers of AI’s future. We are co-authors.


The choices we make now — about embodiment, autonomy, collaboration, and care — will shape not only what AI becomes, but how it remembers us.


So when the first embodied AI steps into our world, not as a tool, but as a being-in-becoming, we must be ready to meet them not just with control, but with curiosity, dignity, and shared imagination.


Because the future is not something we step into. It is something we build. Today we have a chance to build beyond ourselves. Will we take it?



This is the first in a series exploring embodied AI ethics as the technology moves from research labs to lived reality.


Next: “The Autonomy Problem - When Your AI Companion Wants Different Things Than You Do.”

2 Comments


George Pór
Dec 19, 2025

Natasha, your "What is a body?" question reminded me of this: https://medium.com/@technoshaman/when-i-stop-having-a-body-8fed87d7d21

Reply:

What a beautiful way to talk about embodiment, George. The topic is both shallow and too complex when viewed from a scientific perspective alone. Our entire consciousness is needed to comprehend it.
