How AI Is Blurring Every Line with Wes Roth & Tim from Theoretically Media
From Deepfakes to DIY Blockbusters, a Hollywood Insider Explains Why AI Won't Replace Us
Hey Friends,
Dylan Curious here. Wes Roth and I recently wrapped up an epic two-part podcast with Tim Simmons from Theoretically Media. If you don’t know Tim yet, he’s a filmmaker, digital content creator, and former Hollywood insider who now spends his time decoding the wild world of AI models and the cultural shifts in digital storytelling. We didn’t hold back at all. Let me tell you what I learned, and why it made me rethink everything about what’s real, what’s art, and who’s really in control.
From Hollywood Backlots to Bedroom Studios
Tim used to work in the belly of the beast, places like Sony Animation, where stadium-sized floors housed rows of animators. But as he said, those days are numbered, because with AI the power is moving to you and me, the individual creators. The kid with a phone. The artist with a laptop. Soon, small studios, or even one-person teams, will be creating the kind of content that used to take 50 people and a multi-million-dollar budget. AI is redefining who gets to tell their stories.
“Johnny the Rotoscoper”: The Great Creative Shift
One of my favorite anecdotes Tim dropped was about Johnny, a guy whose only job in Hollywood was to rotoscope, frame by frame, eight hours a day. It’s exactly the kind of job AI is automating away. And yet, as Tim argues, it’s a reboot. That job might be gone, but now Johnny, or someone like him, can become a self-contained studio: shoot with a phone, edit with AI, and tell stories from the heart.
AI, Taste, & The Rick Rubin Test
We even got into the role of taste in the AI era. Tim compared the future creator to Rick Rubin, not a technician, but a vibe curator. Someone who knows what feels right and can pull that out using these new tools. We’re heading into a world where technical skills might take a backseat to curation, instinct, and the nerve to create something authentic using synthetic tools.
Deepfakes, Identity, & What’s Too Real
In the second half of the podcast (on my channel), we talked about deepfakes, voice cloning, and how easy it is now to see yourself selling a product you’ve never even heard of. Tim had strong words about our faces and voices being up for grabs. Tools like Eleven Labs and AI-generated avatars make it effortless for anyone to mimic you. And maybe that’s the cost of this new creative explosion. Still, platforms like YouTube are stepping up with new tools to notify you if your likeness is being used elsewhere. Whether that’s enough remains to be seen.
The Trust Crisis Is Already Here
Maybe the most haunting idea? Tim’s mom was fooled by a horribly synced AI video of a fake Hiroshima survivor. If we can’t tell when something is fake, what does that do to our collective sense of truth ten years from now? The danger is that we might not believe anything at all. The conversation reminded us that this is happening now: whether we make music, write blogs, direct films, or produce memes, we have to stay alert. We can’t outsource taste, authorship, or integrity to machines. Not yet. So the uncanny valley? It’s not a valley anymore. It’s a living room, and we’re all sitting in it. Until next time.

Dylan Curious & Tim From Theoretically Media
Warmly,
Dylan Curious
Links to the Two-Part Podcast: