Turns Out, the First Alien Language We Decode... Might Be Dolphin
Plus: Lego Worlds, Robot Brawls, and Brain Maps You Need to See
Hey Friends,
Dylan Curious here, checking in after a tech-fueled deep dive you’re going to want to hear about.
First off, robots are throwing punches now—and they’re good at it. Watching Unitree's robot box a human felt like watching the beginning of an epic underdog movie, except the robot’s the one I’m rooting for. Those movements? Almost too smooth. Give it two more software patches and humans are toast in the ring.
Meanwhile, deepfake videos are officially crossing into "maybe too good" territory. A recent AI-generated clip looked so real people were counting fingers trying to tell if it was fake. It’s like the uncanny valley packed up and moved into your living room.
On the lighter side, a Lego mastermind combined NFC chips and AI to make it possible to embed your Lego builds into real-world photos. You could be standing next to a Lego version of the Empire State Building and it would look legit. Childhood dreams: unlocked.
Researchers also hit a milestone by mapping the visual cortex of a mouse brain down to hundreds of millions of synaptic connections. They even used AI to catch mapping errors, which should speed up brain research in ways we’ll feel sooner than you’d think.
And yes, the highlight: Google’s new AI model is learning to decode dolphin language. Real data, real whistles, real hope for cross-species conversations. I have about a thousand questions to ask the dolphins already. Starting with: "What’s the real secret to happiness?"
This is the future—and it’s getting way weirder and cooler by the second.
Stay curious out there.

Warmly,
Dylan Curious