Human vs digital therapy: AI falls short when IT pros need help | Computer Weekly

News Room | Published 11 June 2025

Over half of cyber security professionals lose sleep due to work-related stress, according to research by the Chartered Institute of Information Security (CIISec, 2022/23 State of the Profession survey). This and other symptoms mirror those we treat in combat veterans at PTSD Resolution, the UK ex-Forces mental health charity.

Yet increasingly, these stressed IT professionals are turning to AI chatbots for mental health support, largely because they cannot access proper therapeutic help, or perhaps because it simply seems easier.

To us, this is very concerning. We appear to be facing a mental health crisis in the IT sector, and instead of addressing root causes, we are handing people over to algorithms.

The AI therapy market reality

The numbers are alarming: more than 1.6 million people are on mental health waiting lists in England, and the NHS estimates that up to eight million people with diagnosable conditions receive no treatment. Tech entrepreneurs have stepped in to fill this gap, at least in part, with AI-powered mental health and companion platforms, which promise a sympathetic ear and even a ‘relationship’ with a chatbot.

We can understand the appeal. These systems are available 24/7, seemingly cost-effective, and for IT professionals working irregular hours under constant pressure, they may offer immediate relief.

But accessibility is not the only consideration when dealing with vulnerable people. In fact, PTSD Resolution successfully pioneered the delivery of therapy over the internet during the Covid-19 pandemic, and we continue to offer this service today, in addition to in-person sessions.

For IT workers, some of whom are ex-military personnel who’ve moved into cyber security, the stress patterns can mirror combat trauma: the constant vigilance, high-stakes decisions and responsibility for protecting others. These aren’t simple problems that an automated, algorithmic response can solve.

The human advantage

The risks are evident, although specific cases of harm inflicted by therapy chatbots are harder to pin down. Many of these AI services claim to embed suicide-screening algorithms, automatic helpline sign-posting, and, in at least one case, human escalation.

But unlike human therapists bound by ethical codes and professional oversight, most consumer chatbots lack mandated clinical oversight and have only rudimentary crisis-escalation scripts.

From an evolutionary viewpoint, human distress has always required a human response. Our ancestors needed others who could read facial expressions, interpret vocal nuances, and understand contextual factors. This is how our brains are wired to process and heal from trauma.

AI chatbots lack these capabilities. They cannot observe body language during panic attacks, detect subtle voice changes indicating deception about mental state, or understand the complex interplay between work pressures and personal circumstances. Unlike AI, a human may notice that someone in distress, claiming to be OK, might be masking.

General-purpose chatbots may lack safety parameters and reliable ways of identifying when an issue needs to be escalated to a therapist. For IT professionals dealing with moral injury – such as being forced to implement surveillance systems against their values, or making decisions affecting thousands of users’ data security – this contextual understanding is crucial.

There is also automation bias: IT professionals may be particularly susceptible to trusting algorithmic advice over human judgment, creating a dangerous feedback loop in which those most likely to use these systems are most vulnerable to their limitations.

Privacy and security concerns

IT professionals should be particularly alarmed by privacy implications. Human therapists operate under strict confidentiality rules, protected by laws and regulations. But ChatGPT acknowledges that engineers “may occasionally review conversations to improve the model.”

Consider the implications: your most private thoughts, shared during vulnerability, potentially reviewed by programmers optimising for user engagement rather than therapeutic outcomes, or even a state intelligence organisation or criminal gang hacking that data for their own nefarious purposes.

Human Givens Therapy

The human therapy alternative has been tested and proven effective. PTSD Resolution uses a therapy developed by the Human Givens Institute (HGI), and all 200 therapists in the charity’s network are qualified members. The HGI approach recognises that humans have innate emotional needs – security, autonomy, achievement, meaning, and others. When these needs aren’t met, psychological distress follows.

Tony Gauvain, an HGI therapist and retired army colonel who chairs PTSD Resolution, explains: “Executive burnout and military trauma share similar symptoms – depression, anger, insomnia. It’s about feeling overwhelmed and unable to cope, whether from a military incident or stressful encounters with management.”

HG therapy acknowledges the fundamentals of human psychology: we are pattern-matching creatures. Skilled therapists can identify metaphors in language, recognise processing patterns, and work with imagination to reframe traumatic experiences. Crucially, they adapt in real-time based on the client’s often very subtle responses – something no algorithm can replicate. At least not yet.

There is clear evidence for this approach. PTSD Resolution achieves a 68% reliable improvement rate with 80% treatment completion, typically delivered in around six sessions, according to a King’s College London study, published in Occupational Medicine in March 2025.

At £940 per treatment course – delivered free of charge to UK Forces’ veterans, reservists and their families – it is highly cost-effective compared with the long-term impacts of untreated trauma, and even with other person-to-person therapies. We run a very lean operation, owning no assets and channelling donations into paying for the therapists’ time for each session.

Real-world success

We’ve seen this approach work with IT professionals experiencing constant fight-or-flight mode due to work pressures, but unable to take the natural action their stress response demands. Unlike our ancestors who could fight or flee threats, modern workers must sit at desks pretending everything’s fine while their nervous systems are in overdrive.

Through our Trauma Awareness Training for Employers (Tate) programme, the charity has worked with companies like Anglo American. Following training, 100% of delegates reported significantly increased confidence in identifying and supporting colleagues experiencing trauma.

The King’s College evaluation found that our therapy clients showed sustained improvement, despite often working with people who had complex post-traumatic stress disorder (PTSD) and had been failed by other services.

Most recently, we formed a strategic partnership with CIISec, with services now available to its membership of more than 10,000 cyber security professionals. This collaboration provides both mental health support through trauma awareness training and access to professional therapy.

The bottom line

AI may have supplementary roles – perhaps for basic education or support between therapy sessions. But as a replacement for human therapists? No. No AI chatbot has regulatory approval in the UK, or FDA approval in the US, to treat mental health conditions, and the documented risks are too significant.

For IT professionals struggling with burnout, depression, or work-related trauma, the solution is not better algorithms – it’s better access to qualified human therapists who understand this industry’s unique pressures.

Ultimately, healing happens in a relationship. It occurs when one human truly understands another’s experience and guides them towards meeting fundamental emotional needs. No algorithm can replicate that.

The choice is not between convenience and inconvenience – not when a full HG therapy session is available over Zoom, often within days of a first exploratory contact call. The choice is in fact between genuine help and a digital simulation of care.

Malcolm Hanson is clinical director at PTSD Resolution.
