
Columbia University
bab2239@tc.columbia.edu
When I began my dissertation research on Black mothers and their relationships with their daughters, I expected to uncover patterns of attachment, communication, and generational healing. What became increasingly clear, not only in my own work but also in the literature, was that the foundation for a strong mother-child bond is laid long before birth. Secure attachment does not begin in the delivery room. It begins when a mother or birthing person feels mentally, physically, financially, and emotionally supported during pregnancy and even before conception.
This idea is not new. Dr. Aurélie Athan’s reproductive identity theory posits that decisions about parenthood—whether, when, and how to become a parent—are fundamental aspects of adult identity development, akin to race, gender, and sexuality. Athan (2020) emphasizes that reproductive identity is multifaceted, fluid, and self-authored, reflecting the diverse ways individuals engage with their procreative potential. My own research expands on this by asserting that matrescence, the transition to motherhood, is an ongoing, evolving experience profoundly shaped by the systemic and social conditions surrounding pregnancy. Yet, in a society where Black women disproportionately face maternal mental health disparities, access to perinatal mental health support remains an afterthought rather than a priority.
Despite growing awareness of maternal mental health, many birthing people, especially those who are Black, low-income, or otherwise marginalized, struggle to access adequate care. Barriers include systemic racism in healthcare, financial constraints, and a severe shortage of perinatal mental health providers trained in culturally responsive care (Glazer & Howell, 2021; Huggins et al., 2020). Technology has increasingly been proposed as a way to bridge these gaps. AI-powered mental health chatbots, digital coaches, and mobile interventions claim to offer accessible, on-demand support during this critical period. But can these tools truly help? And more importantly, do they risk becoming another superficial solution that overlooks the deeper, structural inequities that make perinatal mental health support so difficult to access in the first place?
The Promise and Peril of AI in Perinatal Mental Health
AI-driven mental health interventions, such as chatbots like Woebot and Wysa, have been lauded for their accessibility and affordability. These tools use natural language processing to simulate therapeutic conversations, providing psychoeducation, mood tracking, and cognitive-behavioral strategies for managing anxiety and depression. They are available 24/7, require no insurance or appointment, and can be used at one’s convenience, making them particularly appealing to new mothers who may be balancing exhaustion, childcare, and the emotional weight of postpartum recovery (Firth et al., 2017).
For Black mothers, who often experience medical bias and dismissive treatment in healthcare settings, AI mental health tools present a potential advantage: a space free of judgment, where their experiences are validated and their mental health concerns are taken seriously. Research suggests that digital mental health interventions can be effective in reducing perinatal depression and anxiety (Anyanwu & Jenkins, 2024; Singla et al., 2021), particularly when integrated into broader support systems.
But while AI-driven support has potential, it is not without its flaws. These chatbots lack the nuance, empathy, and human intuition required to address the complexities of perinatal mental health. They operate on pre-programmed algorithms that may not fully grasp the cultural, social, and historical factors that shape a Black mother’s experience. Studies have found that AI-driven mental health tools often struggle with racial and gender biases, as their underlying training data is not always representative of diverse populations (Koenecke et al., 2020). Additionally, AI models tend to oversimplify perinatal distress, failing to differentiate between normal pregnancy-related mood fluctuations, prenatal anxiety, and severe postpartum disorders, which require different levels of care. This raises the question: Can AI ever truly provide the culturally responsive support that is needed for perinatal mental health interventions to be effective?
A More Holistic Approach to Perinatal Mental Health Support
If we take seriously the idea that matrescence begins before birth, then perinatal mental health interventions must be proactive rather than reactive. They must not only address postpartum depression after it has set in but actively work to prevent it by ensuring birthing people have the resources, support, and community they need before, during, and after pregnancy.
AI can play a role in this, but it cannot stand alone. Rather than treating chatbots and digital coaches as replacements for perinatal mental health care, we should integrate them into a more holistic system of support. Imagine a model where AI-powered mental health tools are embedded within a broader ecosystem that includes doulas, midwives, therapists, and community-based programs. These tools could supplement traditional care by helping mothers identify early warning signs of perinatal mood disorders, providing culturally relevant psychoeducation, and connecting them with human providers when necessary.
Additionally, AI-driven interventions must be developed with inclusivity in mind. This means ensuring that the algorithms underlying these tools are trained on diverse datasets that reflect the experiences of Black mothers and other marginalized communities (Mehrabi et al., 2021). It also means involving perinatal mental health experts, reproductive justice advocates, and Black maternal health researchers in the development process to ensure that these tools do not perpetuate the same biases they claim to solve.
Conclusion
The transition to motherhood is not a singular event but an ongoing, identity-shifting process that requires deep and sustained support. If we are to use AI as a tool to enhance perinatal mental health care, it must be intentionally designed to complement, rather than replace, human connection, and to address the racial and socioeconomic inequities that shape maternal health outcomes. AI can be part of the solution, but it is not the solution. Real change requires systemic shifts in how we value and support mothers, particularly those most at risk.
Policymakers, healthcare providers, and technology developers must work together to ensure that AI-driven mental health tools are integrated into comprehensive, human-centered care models. This means prioritizing funding for perinatal mental health research, addressing racial bias in AI development, and ensuring that Black mothers and other vulnerable populations have equitable access to quality care.
Because if we truly believe that strong attachment begins before birth, then we must commit to ensuring that every mother, regardless of race, income, or background, has access to the mental, physical, and emotional support they need to thrive.
References
Anyanwu, I. S., & Jenkins, J. (2024). Effectiveness of digital health interventions for perinatal depression: A systematic review and meta-analysis. Oxford Open Digital Health, 2, oqae026. https://doi.org/10.1093/oodh/oqae026
Athan, A. M. (2020). Reproductive identity: An emerging concept. American Psychologist, 75(4), 445–456. https://doi.org/10.1037/amp0000623
Firth, J., Torous, J., Nicholas, J., Carney, R., Rosenbaum, S., & Sarris, J. (2017). Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. Journal of Affective Disorders, 218, 15–22. https://doi.org/10.1016/j.jad.2017.04.046
Glazer, K. B., & Howell, E. A. (2021). A way forward in the maternal mortality crisis: Addressing maternal health disparities and mental health. Archives of Women’s Mental Health, 24(5), 823–830. https://doi.org/10.1007/s00737-021-01161-0
Huggins, B., Jones, C., Adeyinka, O., Ofomata, A., Drake, C., & Kondas, C. (2020). Racial disparities in perinatal mental health. Psychiatric Annals, 50(11), 489–493. https://doi.org/10.3928/00485713-20201007-02
Koenecke, A., Nam, A., Lake, E., Nudell, J., Quartey, M., Mengesha, Z., Toups, C., Rickford, J. R., Jurafsky, D., & Goel, S. (2020). Racial disparities in automated speech recognition. Proceedings of the National Academy of Sciences, 117(14), 7684–7689. https://doi.org/10.1073/pnas.1915768117
Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A survey on bias and fairness in machine learning. ACM Computing Surveys, 54(6), 1–35. https://doi.org/10.1145/3457607
Singla, D. R., Lawson, A., Kohrt, B. A., Jung, Y., Meng, Z., Ratjen, C., & Fazel, M. (2021). Implementation and effectiveness of nonspecialist-delivered interventions for perinatal mental health in high-income countries: A systematic review and meta-analysis. JAMA Psychiatry, 78(5), 498–509. https://doi.org/10.1001/jamapsychiatry.2020.4556