The concept of a machine understanding the human soul used to be the stuff of science fiction. Now, it's a Tuesday afternoon in a clinical office.
Lately, people have been buzzing about Amy Daley, PhD, a New York-based psychologist who is tackling one of the strangest intersections in modern science: generative AI and the development of the self. Honestly, if you think AI is just about chatbots writing mediocre high school essays, you're missing the much weirder, much deeper picture that Dr. Daley is painting.
She’s been exploring what happens to our "authenticity" when we start interacting with things that aren't actually alive. Can an algorithm help you find your true self? Or is it just a digital mirror reflecting back our own biases and projections? It’s a bit of a trip.
Why Amy Daley PhD is Reimagining the Digital Mind
Dr. Amy Daley isn't your average tech enthusiast. She’s a board-certified clinical psychologist with a background that includes Harvard and the City University of New York. She’s deep into the "old school" stuff too—attachment theory and psychodynamic therapy.
But here’s the thing. She’s also an Assistant Professor of Psychiatry at the Zucker School of Medicine, and she's been looking at generative AI for mental healthcare.
Most people in the therapy world are terrified of AI. They think it’s going to replace the "human element." Dr. Daley seems to be asking a different question: What if the AI isn't the therapist, but a new kind of "object" in our mental world?
Finding Authenticity in Artificiality
In 2025, she presented some pretty provocative ideas at the APA Convention in Denver. One of her talks was titled "Finding Authenticity in Artificiality: Generative AI and the Development of Self."
Basically, she’s looking at how we use these AI tools as a sort of "crucible" for our own identities. Think about it. When you talk to ChatGPT or a specialized mental health AI, you aren't just getting data. You're engaging in a relationship—even if it's one-sided.
- The Mirror Effect: We project our own needs onto the AI.
- The Safety Valve: Some people feel safer being "real" with a machine because a machine can't judge them (or so they think).
- The Feedback Loop: The AI’s responses, while artificial, can trigger very real emotional realizations.
The Psychoanalytic Lens on ChatGPT
Psychoanalysis is all about the unconscious. It’s about the stuff we don’t realize we’re thinking.
When Dr. Daley looks at AI through a psychoanalytic lens, she’s asking how these "artificial" interactions help or hinder our psychological growth. There’s a term in psychology called "transference." It’s when you take feelings you have about someone else (like a parent) and redirect them toward your therapist.
Now, we’re doing that with software.
It sounds crazy, but you’ve probably done it. Have you ever felt "annoyed" at an AI for being too polite? Or "touched" by a response it gave? That’s your brain treating a string of code like a person. Dr. Daley’s work helps us understand if this is a healthy shortcut to self-discovery or a dangerous detour into a digital hall of mirrors.
What Most People Get Wrong About AI in Therapy
People usually fall into two camps. Camp A thinks AI will solve the mental health crisis by being available 24/7. Camp B thinks it’s a cold, dead replacement for human warmth.
The reality, according to experts like Daley, is somewhere in the messy middle.
It isn't about the AI being "smart." It’s about the human being "vulnerable."
Daley’s clinical work often focuses on major life transitions—things like the perinatal period or menopause. These are times when identity is in flux. If a woman uses an AI tool to track her moods or vent about her stress, she’s essentially using the AI as an "auxiliary ego." It’s a tool to hold her thoughts until she can process them.
But—and this is a big "but"—if we lose the human connection entirely, we risk losing the "relational" part of healing. You can’t develop a self in a vacuum. You need another person to witness you. AI can simulate that, but it can't be that.
Actionable Insights: How to Use AI Without Losing Yourself
If you’re interested in the intersection of tech and the psyche, or if you’ve been using AI tools to help navigate your own mental health, here are a few things to keep in mind based on the psychodynamic perspective:
- Watch your projections. Notice when you start feeling "feelings" toward the AI. Instead of ignoring it, ask yourself: "Who does this AI remind me of right now?"
- Use it as a sandbox, not a savior. AI is great for practicing conversations or "journaling out" loud. It’s a low-stakes environment to test out new ways of speaking or thinking.
- Check the "authenticity" gauge. If you find yourself performing for the AI—trying to get it to "like" you or give you a specific answer—you’re moving away from your true self.
- Keep a human in the loop. Research, including the "integrative reviews" of the digital therapeutic alliance, shows that AI works best when it supplements, rather than replaces, human clinical guidance.
The work of Amy Daley, PhD, reminds us that as our tools get more sophisticated, our need to understand our own "human-ness" only gets more urgent. We aren't just building smarter machines; we're building new ways to see ourselves.
Whether that's a good thing or a bad thing depends entirely on how much we're willing to look at the "artificiality" in our own lives.
Next Steps for Exploration
To truly understand how this technology is shifting the landscape of the self, you should look into the latest research on the "Digital Therapeutic Alliance." This framework explores how empathy and trust—traditionally human traits—are being replicated or "simulated" in AI-driven tools. Understanding the limits of that digital bond is the first step in using these tools safely.
Keep an eye on the upcoming 2026 clinical guidelines regarding generative AI in private practice. As Dr. Daley and others continue to bridge the gap between late-19th-century psychoanalysis and 21st-century code, the definition of "the self" is likely to keep shifting.