Abstract
The last decade has seen an exponential increase in the integration of computers and the internet into all areas of health, affecting both the providers and participants of Primary Care. The cultural and social change is of personal and global significance.
Everyone benefits from a shared, integrated health record that furthers citizen involvement, clear and contemporaneous notes, interoperability between providers and silos of care, telehealth, e-prescriptions, assistive technologies that monitor the person in their home, and more. This technological “connection” is a vital tool with many uses and benefits. However, Primary Care should also be full of human connection, empathy and compassion. Empathy is strongly correlated with positive health outcomes that we cannot afford to lose.
Studies show that the average clinician spends between one third and two thirds of each consultation looking at the screen. In the US, the average uncurated electronic medical record is half as long as the text of Shakespeare’s Hamlet. Little time is left for empathic face-to-face connection, let alone examination. Telehealth presents even more challenges. Computer algorithms may not result in higher-quality care, because the practice of medicine remains a subtle art requiring careful listening and undivided attention.
If care is to be community driven, there are challenges in integrating the complex levels of human sentiment and need, expressed in individual and collective use of social media and apps, with the more formal data platforms of the health system. A technology-dominated care culture may raise issues of equity and hierarchies of access, especially for the vulnerable.
The risks of technology can be mitigated by combining the concepts of “Safety 1”, which focuses on ensuring that things do not go wrong, with “Safety 2”, now recognised to be so important in healthcare, which plans to make sure that things go right.
“Hallucination” refers to the situation in which an AI system confidently presents an incorrect assertion, unsupported by the sufficient and nuanced data that the care of people requires. Inherent in AI is the risk that misinformation in self-care and prescribing will become even more prevalent.
There are profound risks that human-competitive digital technologies (particularly AI) will proceed without ethical oversight. As the “Pause Giant AI Experiments: Open Letter” says, “planning and management is not happening.” Technology has always outpaced regulation.
The balance between technology and humanity is a delicate one. A middle path may be to develop the formal skills of digital empathy in healthcare and Public Health, in ways that the multidisciplinary Oxford Empathy Programme (OxEmCare) is exploring. People working in Primary Care need access to similar programmes. Participants in an integrated health environment may need to pause in their adoption of new technologies, to engineer, adapt and benefit from a human-cooperative rather than human-competitive digital care culture.
