Technology as Advocate: How to build technologies that help people to help themselves

Natalie Neilson

6th September 2019


Find out what happened at our first 'Lunch and Learn' in August at Co Space North, where we invited Aislinn Bergin, Research Fellow at MindTech (https://www.mindtech.org.uk/), to join us and discuss medical apps and how they need to be designed so that citizens can find and use them effectively.

Developers of the ever-growing number of medical apps may have a genuine desire to help their users, but often lack (sufficient) medical expertise. As a result, many digital tools are not very useful in the real world, and user engagement is a big challenge. Most technology is not intended or adapted to fit into existing care pathways, so there is little or no clinical oversight, placing the burden on the user to know what (not) to do. How people find apps, how they use them, and how they cope when apps don't work is particularly important for vulnerable users, such as those with mental health issues. In her talk, Aislinn offered nine recommendations to better fit user needs and take full advantage of technology as advocate: to support and enable people to help themselves.

1. Build for a small, well-defined population. One generic tool is unlikely to be a good fit for everyone's needs: some people may not like it, while others may be unable to use or benefit from it. Until technology gets better at adapting to our needs, it is better to build your tool with a very specific population in mind.

2. Recognise the importance of purpose. Claiming that a tool helps with 'anxiety' is not the same as claiming it helps with 'anxiety disorder'; the latter is a clinical condition, and such claims are tightly regulated. If you build for existing devices (such as a smartphone app), you also have to keep in mind that the device itself may already have a meaning and purpose to the user (for example, a smartphone may be seen as a wellbeing tool rather than a medical one).

3. Build for changing needs. Our activity, moods and needs can change even within a single day. A digital tool should adapt to us rather than the other way around, or at least fit as naturally into our lives as possible. Another challenge is keeping up with the rapid pace of technology: hardware and software can become obsolete in a relatively short time, while research and funding application processes are comparatively slow.

4. Be transparent about what your tool can do. There is a lot of 'snake oil' out there: digital tools making claims about feasibility and effectiveness without any (appropriate) evidence. It is also not always clear which user data will be collected and how it will be used. A lack of transparency can erode people's trust, not just in one tool but in this new field in general.

5. Start planning early for sustainability. Funding sources are limited and real-world implementation is challenging, causing many promising tools and companies to fail. This undermines trust, as users may, for example, suddenly lose their painstakingly collected data. Plan early for implementation and long-term sustainability.

6. Build with uncertainty in mind. Plan for major technical failures, such as losing access to the tool or its stored data after a server breakdown or a corrupted database. Choosing a big, well-known provider makes you less likely to be affected by such issues than a small company with fewer resources. However, we need to recognise that this puts us at the mercy of where these big players decide to take things, rather than responding to true user needs.

7. Build with care. Digital tools for health and care are often accessible to the public without gatekeeping by a healthcare professional, so we need to consider adverse events and negative effects in their real-world use. What happens if a patient follows the wrong advice and gets worse? If you offer app-based CBT (cognitive behavioural therapy), will it work if people are doing it on a bus? Anyone offering a health or care product has a duty of care, even if they never meet their users.

8. Build with respect. You want your users to have a sense of control and ownership, to maximise their engagement and the benefits they gain. Too often, developers make assumptions and decide on their users' behalf. By giving users a degree of autonomy and involving them in the creation and testing of your tool, you can make the tool more relevant and help your users feel respected.

9. Technology as advocate. Technology can enable and empower people to (better) help themselves. You can give people another tool in their arsenal, offering more choice and more options in how they manage their (mental) health. Keep in mind that your digital tool probably won't be the solution for everyone, though it can be (part of) a solution.

Technology can support people and give them (more) autonomy. Digital health and care needs a change from the ground up, starting from and with users. The digital world needs to adapt itself to the real world, and the people in it, not vice versa.


