AI In Psychology Survey


Artificial Intelligence in Psychological Practice: Where Do We Stand?
Artificial intelligence is no longer peripheral to psychological work. For some practitioners, it is a quiet productivity tool helping structure reports, summarise research, or streamline administrative load. For others, it raises immediate questions about confidentiality, bias, regulatory clarity, and the integrity of the therapeutic relationship.

The reality is that AI is no longer a hypothetical future issue. It is a present professional consideration. What remains unclear now is how we are engaging with it, how we evaluate its risks, and what professional infrastructure is required to govern its integration responsibly.

To answer this responsibly, the PsySSA AI Division has launched a national survey of psychologists and practitioners in training. We ask that you please complete it here: https://forms.gle/uwdG2qrGYhcdMbpr8

What Is Actually Happening in Practice?
Across clinical, counselling, educational, organisational and research contexts, we are seeing varied patterns of engagement:

  • Some are actively integrating AI tools into selected tasks.
  • Some are experimenting occasionally.
  • Some are intentionally avoiding use.
  • Many are observing developments while waiting for clearer guidance.

What we currently lack is empirical clarity:

  • How widespread is actual use?
  • In which domains is AI being applied?
  • What risks are most salient in practice?
  • What support would meaningfully assist practitioners?

Without this data, discussions remain speculative.

What the Survey Findings Will Inform

The results will directly shape:

  • A formal PsySSA AID Position Statement
  • Practice guidelines aligned with HPCSA ethical principles
  • CPD offerings tailored to identified learning needs
  • Practical toolkits for safe and bounded AI use
  • Policy submissions and regulatory engagement
  • Research priorities within the South African context

In short, the findings will not sit in a report archive. They will inform governance, education, and professional standards. If the profession does not articulate its realities, external narratives will fill the gap.

Ethics and Regulations

AI intersects with core professional commitments:

  • Beneficence and non-maleficence
  • Autonomy
  • Justice
  • Confidentiality
  • Accountability

The question is no longer “AI: yes or no?”, but “Under what conditions, safeguards, and competencies can AI be responsibly integrated into South African psychological practice?”

An Invitation

If you are a registered psychologist, intern, academic, or practitioner in training, your perspective is essential.

Whether you are actively using AI, cautiously observing it, or deliberately avoiding it, your position contributes to the evidence base that will shape future guidance.

If you are willing, we invite you to complete the survey (https://forms.gle/uwdG2qrGYhcdMbpr8) and add your perspective to the conversation. The direction our profession takes will be influenced by who participates in defining it.

 

Time To Talk Day – 06 February 2026


PsySSA: It’s Time to Talk

Conversation is connection. Time to Talk Day reminds us that conversation can change lives. Today, PsySSA amplifies voices from our PiPS and AID divisions, because listening, speaking, and connecting are acts of care, courage, and social justice.

Read the PiPS and AID contributions below:

 

Time to Talk Day: Social Media, AI, and the Quality of Mental Health Conversations

By Rekha Kangokar Rama Rao and Athena Clayton (AI Division)

 

Time to Talk Day calls for open, stigma-free conversations about mental health. Yet in a digital era shaped by social media and artificial intelligence (AI), many of these conversations now take place in online spaces that are governed less by care and more by platform design, algorithms, and engagement incentives. While this shift has expanded access and visibility, it also introduces significant risks to how mental health distress is expressed, received, and responded to. Questions of depth, psychological safety, and ethical responsibility become particularly urgent when mental health conversations are shaped by systems that reward speed, exposure, and emotional intensity rather than understanding and containment. These concerns are especially pressing in unequal contexts such as South Africa, where overstretched services and structural inequality mean that online conversations may carry more weight, and more risk, than they were ever designed to hold.

Within this landscape, social media can offer connection, validation, and a first step toward acknowledging distress. Platforms enable people to share lived experiences, find peer support, and connect with others facing similar challenges. Social media also plays a growing role in promoting awareness and acceptance of mental health conditions by sharing accessible information, challenging stereotypes, and correcting common misconceptions. Research suggests that online self-disclosure can reduce feelings of isolation and encourage help-seeking, especially among young people and marginalised groups (Naslund et al., 2016). In South Africa, where public mental health services are overstretched and unevenly distributed, these digital spaces can offer connection where formal care is inaccessible. For many, posting or engaging online becomes the first step toward acknowledging distress, an outcome aligned with the aims of ‘Time to Talk’.

For example, a university student might post: “I feel like I’m falling behind in everything and I’m stressing.” A meaningful response is rarely about having the perfect words, but about offering safety and recognition: “I’m really glad you said something. You don’t have to carry this alone. Do you want to talk, or would it help if we looked at support options together?” In moments like these, a comment section can become the first space where someone feels seen, and that can be enough to prompt help-seeking.

AI-driven mental health tools further extend this accessibility. Chatbots and mental health apps offer anonymity, immediacy, and consistency, which can be appealing in contexts where stigma or fear of judgment prevents open discussion. Evidence indicates that some AI-based conversational agents can reduce symptoms of depression and anxiety in the short term by delivering structured psychological strategies such as cognitive-behavioural techniques (Fitzpatrick et al., 2017). From this perspective, AI can help people start talking sooner and access support more easily.

For example, a person who is overwhelmed at 2 a.m. might not be able to call a friend or visit a counselling centre, but they may be willing to open an app. A chatbot might guide them through a grounding exercise (“Take a slow breath in. Name five things you can see.”) or help them challenge spiralling thoughts (“What is the thought you keep returning to? What evidence supports it?”). While this is not the same as human care, it can offer a moment of steadiness and structure when emotions feel unmanageable.

On the other hand, increased conversation does not automatically translate into meaningful or safe engagement. Social media platforms are shaped by algorithms that reward visibility and emotional intensity rather than care or accuracy. Studies link high levels of social media use to increased depressive symptoms, anxiety, and harmful social comparison, particularly among adolescents (Twenge et al., 2018). Public disclosures of distress may attract empathy, but they can also invite dismissive or unkind reactions, moral judgement, unsolicited advice, or misleading mental health content that is not evidence-based. In this sense, social media can blur the line between support and spectacle, where personal distress is shared widely but not always held with care.

For example, a person might share that they are depressed and receive responses like: “You’re just looking for attention,” “Other people have it worse,” or “Stop being dramatic.” Even when replies are not intentionally cruel, they may still be dismissive or simplistic: “Just be positive,” “Just pray,” or “Go for a run.” Instead of feeling supported, the person learns that disclosure comes with risk, and that vulnerability is tolerated only when it is neat, inspiring, or easy to consume.

AI tools introduce further ethical and clinical concerns. While chatbots can simulate empathy, they do not possess true understanding or moral responsibility. Researchers caution that AI systems may fail to respond appropriately to complex mental health crises, including suicidality or trauma, where nuanced human judgment is essential (Bickmore et al., 2018). Issues of data privacy, surveillance, and algorithmic bias are particularly salient in societies marked by inequality. If AI tools are trained on data that do not reflect local languages, cultural expressions of distress, or socio-economic realities, they risk excluding or misinterpreting those most in need.

For example, someone might type: “I can’t do this anymore. I’m tired of everything.” A human listener may recognise the seriousness behind such a message and respond with care, urgency, and appropriate referral. An AI tool, however, may not always interpret context reliably, particularly when language is ambiguous, culturally specific, or emotionally complex. This highlights why AI can be useful for everyday support, but should not be treated as a substitute for professional or relational care in moments of crisis.

The central question, then, is not whether social media and AI are good or bad for mental health conversations, but whether they improve the quality of those conversations. ‘Time to Talk’ reminds us that talking is not simply about expression, but about being heard, understood, and supported responsibly. Digital tools can open doors, normalise discussion, and provide interim support, but they should not become substitutes for human connection or systemic investment in mental health care.

Ultimately, social media and AI should be treated as entry points rather than endpoints. They can open the door to conversation, but they cannot replace deep, responsible support and connection. ‘Time to Talk’ challenges us to think critically: are we simply talking more, or are we creating conditions where talking leads to dignity, connection, and meaningful support?

Time to Talk: Creating the Conditions for Meaningful Mental Health Conversations

By Moshibudi Molepo and Barry Viljoen

 

Talking about mental health matters…but how we talk matters just as much.

In South Africa, many people live with emotional distress in silence. Stigma, limited access to care, cultural expectations, and daily survival pressures can make it hard to speak openly or seek support. For some, silence becomes a way of coping, not because help isn’t needed, but because talking doesn’t always feel safe, welcome, or useful.

Time to Talk invites us to look beyond words and ask a deeper question: What helps a conversation about mental health feel possible, respectful, and meaningful?

Before people can talk, certain conditions need to be in place. Emotional safety matters. Power dynamics need to be acknowledged. Shame, fear, and cultural taboos must be held with care. Timing matters too, conversations work best when they are invited, not forced.

When conversations do happen, listening becomes more important than fixing. This means resisting the urge to reassure too quickly, reflecting feelings rather than correcting facts, and allowing space for silence. Often, being present is more helpful than saying the “right” thing.

Open conversations can reduce isolation and strengthen connection but they should not happen in isolation from support. In South Africa, organisations such as SADAG and LifeLine provide accessible, 24-hour telephonic and WhatsApp support, helping bridge the gap between home, community, and care. Knowing where to turn makes it easier to talk, and easier to listen.

Time to Talk is not about having all the answers. It’s about creating spaces where people feel heard, respected, and supported. Where help is visible and within reach.

Sometimes, the most meaningful thing we can do is listen well — and help someone know they are not alone.

Men’s Health Month 2025


Read our submissions from The DRM, SASCP and the AI Division!

As we mark Men’s Mental Health Month this June, the Division for Research and Methodology (DRM) of PsySSA reflects critically on the evolving discourse around men’s psychological well-being in South Africa. Despite growing awareness, men remain significantly underrepresented in mental health service use, often constrained by dominant ideals of masculinity that equate vulnerability with weakness.

In this short video and companion article, Executive Committee member Omphile Rammopo offers a timely and thought-provoking exploration into how mental health support for men can move beyond awareness toward action. Drawing from clinical insight, personal observation, and grounded local research, Rammopo challenges us to rethink therapeutic approaches that may inadvertently alienate men—and invites us to consider new, culturally relevant, strength-based frameworks.

Produced in collaboration with the DRM, this offering is both a call to reflection and a catalyst for transformation. As psychologists, researchers, and mental health advocates, we are urged not only to ask “Where to from here?”—but to act decisively in shaping support systems that resonate with the lived experiences of men across our diverse society.

#MensMentalHealthMonth #PsySSA #DRM #MentalHealthMatters #MasculinitiesInContext #PsychologyForSocialJustice

Men’s Mental Health Month 2025

By Sibusiso Vilakazi and Barry Viljoen

The month of June is dedicated to raising awareness of men’s mental health. The goal is to shine a light on the unique mental health challenges faced by men. While there has been positive change in this regard, many men still struggle in silence. One reason for this may be societal expectations and the resulting stigma, both internal and external, around expressing vulnerability.

We know that, statistically, men are less likely to seek mental health support, which can, and sadly often does, lead to serious consequences, one example being the higher suicide rate among men. It is hoped that by giving a platform to these topics, friends, families, and communities will be encouraged to create safe spaces free of judgment and scrutiny, reminding us that seeking help is a sign of strength, not weakness, and that mental health is as important as physical health.

This month we hope to break barriers by encouraging open conversations that promote mental wellness and support those seeking assistance. To this end, we have collaborated with Sibusiso Vilakazi, who shares some of the work he and his organisation are doing to achieve these goals.

Brother’s Keeper SA (BKSA) is a men-only, registered non-profit organisation. BKSA serves as a support network and structure for men. It was established on the realisation that men lack platforms through which they can be vulnerable and express their feelings and challenges. Unemployment, underemployment and a myriad of societal issues continue to beset men, resulting in psycho-emotional conditions such as stress and depression and, in extreme cases, suicide, substance abuse and a propensity for criminal behaviour and detainment. In a world that prioritises and advances the rights and developmental needs of other members of society, little focus is devoted to the needs and the plethora of challenges confronting men daily in South Africa. The number of men who are apprehended continues to rise, although it is understood that multiple factors account for this. BKSA condemns the acts of men who harm women.

BKSA came into existence at the height of the Covid-19 pandemic, at a time when greater support was needed as the effects of the pandemic were felt throughout the world. It was during this time that some men lost their jobs and others lost their spouses to Covid-19. BKSA exists as a mechanism for providing a support network and structure where men can freely express their frustrations, challenges and needs in a space that is welcoming and free of prejudice and judgement.

BKSA hosts virtual support sessions monthly on Thursdays. Topics covered include social and emotional support, mental health, family issues, career development, financial health and physical health and fitness. Since its establishment, the organisation has developed an ongoing good relationship with practitioners and professionals across various fields. Ultimately, the organisation seeks to establish a formalised partnership with this network of professionals to enable sustainability of interventions and structured support services such as counselling, mentoring and coaching. Equally important, the organisation will partner with like-minded organisations whose mandates are geared towards development of men. It is through partnership that the organisation will be able to expand and widen its reach throughout the country.

BKSA observes local and international campaigns. The ultimate aim is to create a community of men who will be responsible, caring and able to be receptive to help and support. Men who participate in BKSA learn about how to be responsible in their communities, families and workplaces and responsive to the needs of their communities. There’s a sense of brotherhood and collective responsibility that is engendered through participation in BKSA.

In terms of how we operationalise our services, we identify men-related issues and:

  1. Raise awareness by running and supporting campaigns
  2. Refer men to professional services and support
  3. Provide targeted support, such as one-on-one intervention (mentorship)
  4. Deliver community presentations
  5. Organise recreational activities

While our organisation has experienced a gradual increase in numbers, we intend to continually diversify our approach to topical issues and intervention strategies. Whether participants are dealing with relationship issues, financial difficulties, or mental health concerns, or require any other form of support, our team is readily available to support them. While we maintain that we do not offer clinical, therapeutic or medical diagnosis or intervention, our platform exposes participants to qualified and seasoned experts in the various spheres of counsel and guidance to ensure appropriate approaches to healing.

The organisation’s vision is to create a safe, supportive and holistic community that encourages a culture of expressive, emotionally conscious and self-aware men as they navigate their lived experiences in an ever-changing world, and to produce men who are psycho-emotionally healthy and resilient. We seek to build a transformational space for men that encourages authenticity and accountability, so that they contribute positively to a healthier society.

Men’s Health Month 2025: Listening Beyond the Silence
By Dr Ewald Crause
For the Psychological Society of South Africa

June is Men’s Health Month. But for many men in South Africa, health remains something unspoken. Not due to a lack of problems, but because speaking comes at a cost. In too many homes, clinics, and counselling rooms, silence has become the strategy. And for many men, silence is safer than honesty.

This year’s theme, “Check In, Not Out”, calls for early intervention and preventative care. It is a message that needs to land differently here. Because in the South African context, men are not simply failing to check in with doctors or therapists. They’re also checking out of themselves, their families, and for some, even their futures.

As psychology practitioners and academics, we observe it in statistics and sense it in the absences and silences. The man who doesn’t return for a second session. The father who disappears from the school meeting. The young adult whose first appointment only comes after an attempt. For too many, help arrives too late.

Behind these moments are pressures that psychology professionals know well. High rates of unemployment. Cycles of intergenerational trauma. The burden of being the provider, even when there is nothing left to give. Social scripts that still reward men for being silent, stoic, and self-contained…until they break.

In practice, male clients often arrive not because they chose therapy, but because someone else did. A partner insisted. A boss threatened. A court ordered. When they do arrive, they rarely use clinical language. They don’t say “anxiety” or “depression.” They talk about pressure. Sleeplessness. Losing control. Being “off.” These are not just linguistic differences. They are warnings. If we are not listening closely, we miss the distress altogether.

To work effectively with men, our role is not to convince them to talk. It is to ensure that when they do, they are heard without judgment, interruption, or interpretation. That our language doesn’t assume help-seeking is familiar or safe. That we acknowledge the resistance without reinforcing it.

Intervention needs to be practical, not idealistic. Most men are not looking for long-term therapy. They are looking for something that works. Brief interventions, solution-focused conversations, peer models, role clarity, and support that aligns with their roles as workers, fathers, sons, or leaders. Respect matters. So does structure. So does knowing when to step back and refer.

But this is not only about individual therapy. It is about the systems in which we operate. Access to care is uneven. Services in rural and peri-urban areas remain difficult to reach. Long waiting lists in the public sector often mean that prevention becomes impossible. For many men, especially working-class men, there is no clear path between noticing that something is wrong and receiving the support to address it. That space between is where we lose them.

This month must not be reduced to awareness slogans. The work is not about getting men to speak. It is about creating spaces where they do not have to defend their pain. It is about reducing the threshold for help. About making support a familiar part of life, not a crisis response. About including men in the broader mental health conversation without assuming they already know the terms of engagement.

To the men reading this: this month is not a campaign. It is a reminder that your life matters. That survival is not the only goal. That checking in is a strength, not a liability. That you are not meant to carry everything alone.

To the psychology professionals reading this: we can change the trajectory. We can create systems, practices, and messages that speak to men without demanding that they first speak like us. That is the challenge. That is the opportunity.

Let’s meet men where they are, not where society expects them to be. And not just in June, but in the work we do every day.

Adapting to AI: What Psychologists Need to Know for the Future


As artificial intelligence (AI) continues to reshape industries worldwide, psychology finds itself at a turning point. AI is no longer a distant future—it’s already here, and it’s expanding rapidly. But for psychologists, the question is not if AI will play a role in their work, but how they will respond. Will AI become an ally in enhancing our practice, or will we resist it out of fear, skepticism, or uncertainty?

To understand the different ways psychologists might approach this transformation, let’s explore the experiences of three professionals, each facing AI from a unique perspective. Through their stories, we can examine the potential for growth, the risks of inaction, and the ethical considerations that must guide us forward.

1. The Early Adopter: Fully Embracing AI

Dr. Zanele has fully integrated AI into her practice. She uses AI-powered tools to streamline administrative tasks, analyze behavioral data, and assist with diagnostics. AI allows her to focus more on providing personalized care to clients while automating routine tasks.

While embracing AI, Dr. Zanele ensures that ethical guidelines are followed. She prioritizes transparency, informed consent, and client privacy. By using AI responsibly, she enhances her practice, making it more efficient and better equipped to handle the increasing demand for mental health services.

Questions for Dr. Zanele to Consider:

  • How can I ensure that the AI tools I use are transparent and comply with privacy standards?
  • What steps can I take to ensure the AI tools I use do not perpetuate bias or harm in the therapeutic process?
  • How can I balance AI’s capabilities with the human touch that remains essential to therapy?

Dr. Zanele sees AI as a tool to expand her capacity, improve outcomes, and stay at the forefront of mental health innovation.

2. The Skeptic: Sticking to Tradition

Dr. Sipho is skeptical about AI’s role in psychology. He believes it’s just a passing trend, much like previous technological shifts that didn’t stick. His practice continues as it always has—focusing on in-person consultations, paper records, and traditional assessments.

While Dr. Sipho is content with his established methods, the world around him is changing. AI offers new opportunities to improve diagnostic accuracy, provide personalized treatment, and expand access to services. If Dr. Sipho continues to resist AI, he risks missing out on valuable tools that could enhance his practice and client care.

Questions for Dr. Sipho to Consider:

  • What might I miss by resisting AI, especially in terms of improving diagnostic accuracy and client engagement?
  • How might client expectations change as AI tools become more integrated into mental health services?
  • How can I remain relevant in a changing field that increasingly embraces AI?

By sticking with the status quo, Dr. Sipho may find himself at a disadvantage, missing out on innovations that could improve his practice.

3. The Cautious Adopter: Curious but Hesitant

Dr. Thandi is intrigued by AI but unsure how to begin integrating it into her practice. She is curious about its potential to improve her work but is overwhelmed by the complexity and the fear of making mistakes. Dr. Thandi has explored AI through articles and online courses, but she hesitates to fully dive in.

Dr. Thandi is not alone in feeling uncertain. Many psychologists share her concerns about adopting AI. However, by taking a cautious yet proactive approach, Dr. Thandi can gradually integrate AI into her practice, ensuring that she remains at the forefront of innovation while maintaining the human connection essential to therapy.

Questions for Dr. Thandi to Consider:

  • How can I take the first steps toward integrating AI in a manageable way?
  • What risks do I face if I let my uncertainty prevent me from adapting to this shift?
  • How can I use AI tools without compromising the human-centered approach to therapy?

By embracing AI in a measured way, Dr. Thandi can ensure that she remains adaptable and ready to take advantage of the growth opportunities AI presents.

The Ethical Implications: Safe Use of AI Tools

For all three psychologists, the ethical implications of using AI are paramount. In South Africa, where access to mental health care is limited in many areas, AI offers the potential to extend services to underserved populations. However, AI must be used ethically—ensuring client privacy, informed consent, and transparency.

Ethical guidelines for AI in psychology should focus on:

  • Privacy and confidentiality: Ensuring AI tools protect sensitive data and adhere to privacy regulations.
  • Informed consent: Clients must be fully aware of how AI tools will be used in their treatment.
  • Bias and fairness: AI systems must be designed and tested to avoid reinforcing existing biases.
  • Human oversight: AI should assist, not replace, the human connection central to therapy.

The Road Ahead: Adapting to Change

AI is here to stay, and those in the psychology profession must decide how they will respond. Whether you are an early adopter, a skeptic, or a cautious adopter, the key question is how you will engage with this technological shift.

  • If you embrace AI, you open up new opportunities for growth, better outcomes, and improved access to services.
  • If you resist AI, you risk falling behind in a field that is rapidly evolving.
  • If you hesitate, you may find yourself scrambling to catch up with others who have already adapted.

By considering the ethical use of AI and gradually incorporating it into your practice, you can stay ahead of the curve and ensure your practice remains relevant, efficient, and impactful in a changing world.

What are your thoughts on AI in psychology? How are you approaching this shift? Let’s continue the conversation.

#PsySSA #AIandPsychology #EthicalAI #MentalHealth #ArtificialIntelligence #SouthAfrica #PsychologyInTheFuture