We are currently living in the fourth industrial revolution, a new era which builds on and extends the impact of digitisation in new and unanticipated ways. At the same time, the field of psychotherapy has not changed dramatically in the past one hundred years. How could technological advances influence suicide prevention?
This was the opening statement from Dr Jason Bantjes during a plenary session entitled ‘The Changing Landscape of Psychiatry, Neuroscience and Technology’, held at the 2019 Biological Psychiatry Congress in Cape Town. Bantjes is a registered psychologist and senior lecturer in the Department of Psychology at Stellenbosch University. What follows is based on aspects of his presentation.
How can technology be employed, not to replace mental healthcare professionals, but to enable deeper, more meaningful, more connected engagements with patients? Several digital methods have been evaluated, and a few will be mentioned here.
Texting and emailing interventions for suicide prevention have been evaluated, based on the principle of the ‘caring contacts’ intervention developed by psychiatrist Jerome Motto in the 1970s. Motto demonstrated that sending non-demanding letters of care at regular intervals could prevent suicide amongst individuals at high risk. The sense of isolation which these patients often experience was reduced; they felt more connected and took comfort in the knowledge that someone was concerned about their wellbeing. It was demonstrated that patients who received these messages had lower odds of experiencing suicidal ideation and fewer suicide attempts.
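As a purely illustrative sketch of how a digital ‘caring contacts’ workflow might be automated, the snippet below builds a schedule of non-demanding check-in messages for a participant. The message wording, the contact intervals and the participant name are hypothetical placeholders, not Motto’s original materials, and a real service would hand the messages to an SMS or e-mail gateway under clinical oversight.

```python
from datetime import date, timedelta

# Hypothetical, non-demanding check-in messages in the spirit of the
# 'caring contacts' intervention; real wording would be clinician-designed.
MESSAGES = [
    "Hi {name}, we have been thinking of you and hope things are going well.",
    "Hi {name}, just a short note to say we are here if you ever want to talk.",
    "Hi {name}, we hope you are doing okay. There is no need to reply.",
]

# Illustrative contact schedule: roughly monthly at first, then less often.
OFFSETS_IN_DAYS = [30, 60, 90, 150, 210, 300, 390]


def build_contact_plan(name: str, enrolled_on: date) -> list[tuple[date, str]]:
    """Return (send_date, message) pairs for one participant."""
    plan = []
    for i, offset in enumerate(OFFSETS_IN_DAYS):
        text = MESSAGES[i % len(MESSAGES)].format(name=name)
        plan.append((enrolled_on + timedelta(days=offset), text))
    return plan


if __name__ == "__main__":
    for send_date, text in build_contact_plan("Thandi", date(2019, 9, 1)):
        # In a real system this line would call an SMS or e-mail gateway.
        print(send_date.isoformat(), "->", text)
```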
Educational websites aimed at increasing patients’ knowledge and raising awareness have been developed. There was concern that these websites might be harmful to vulnerable individuals, but it was shown that patients with increased vulnerability experienced a partially sustained reduction in suicidal ideation. These websites served as an electronic bridge to accessing mental health services. Other features which may be integrated include online screening for risk with personalised feedback, prompts about where to find help, and online counselling.
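As a minimal sketch of how online screening with personalised feedback could be wired up, the example below maps a total questionnaire score to feedback text. The score bands and wording are hypothetical placeholders; a real site would use a validated screening instrument and clinician-approved messaging.

```python
def personalised_feedback(item_scores: list[int]) -> str:
    """Map a total screening score to feedback text (illustrative only)."""
    total = sum(item_scores)
    if total >= 15:
        return ("Your answers suggest you may be struggling at the moment. "
                "Please consider contacting a mental health professional or "
                "a crisis line today.")
    if total >= 8:
        return ("Your answers suggest some distress. Here is information on "
                "where to find help and how online counselling works.")
    return ("Thank you for checking in. You can read more on this site about "
            "staying well and supporting friends or family.")


if __name__ == "__main__":
    # Example: five item scores from a hypothetical questionnaire.
    print(personalised_feedback([2, 3, 1, 4, 2]))
```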
There are many websites available which offer education to those who are suffering, as well as to their friends and family. Smartphone applications are another resource.
Smartphone apps can offer information, education and training, act as resource locators and include an ‘emergency button’ feature which alerts a friend or family member when a patient experiencing suicidal thoughts activates it. In addition, these apps can offer safety planning, other coping tools, and the opportunity for clinical assessment and automated intervention.
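A minimal sketch of how such an emergency-button feature might work is shown below. The contact details, message wording and notification step are all hypothetical; a real app would integrate with a push-notification or SMS service and surface a local crisis line.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class EmergencyContact:
    name: str
    phone: str  # hypothetical contact details chosen by the patient


def notify(contact: EmergencyContact, message: str) -> None:
    # Placeholder for a push-notification or SMS gateway call.
    print(f"[{datetime.now().isoformat(timespec='seconds')}] "
          f"To {contact.name} ({contact.phone}): {message}")


def emergency_button_pressed(patient_name: str,
                             contacts: list[EmergencyContact],
                             crisis_line: str = "<local crisis line>") -> None:
    """Alert the patient's chosen contacts and display a crisis number."""
    for contact in contacts:
        notify(contact, f"{patient_name} has asked for support right now. "
                        f"Please reach out to them as soon as you can.")
    print(f"Showing the patient the crisis line: {crisis_line}")


if __name__ == "__main__":
    contacts = [EmergencyContact("Sipho", "+27 82 000 0000")]
    emergency_button_pressed("A.", contacts)
```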
Some specific apps include:
*Virtual Hope Box – Designed to redirect negative thoughts to positive ones by focusing on reasons for living. It contains simple tools to help patients with coping, relaxation, distraction and positive thinking. The Hope Box can be personalised to include a wide range of multimedia such as photos, videos, messages and positive affirmations. It also provides distraction tools, guided imagery, controlled breathing and relaxation exercises.
*7 Cups – An app which offers emotional support, anxiety relief, depression help, counselling and therapy by connecting individuals to people they don’t know for real-time support. These conversations are anonymous and confidential, and take place with trained active listeners.
*TalkCampus – Offers an online peer-to-peer support network which is grounded in sound theory. TalkCampus is clinically governed and has been shown to have high levels of acceptability, usability and usefulness. Evidence from the United States and the United Kingdom supports its use.
As always, there are ethical issues to consider with the use of technology and digital interventions. For example, who are the beneficiaries of these technological advances? To use these tools, one must have a smartphone, a reliable internet connection and data. Who is responsible when artificial intelligence fails? What about the impact technology has on the environment, whether through excess power consumption, increased radiation exposure or growing amounts of toxic waste? Should mental health apps be regulated as medical devices?
In conclusion, there are many ethical and moral issues which will have to be addressed. However, there are plenty of exciting opportunities to use technology in suicide prevention. Currently the evidence base is thin, but it looks promising.