
Rapid diagnostic tests (RDTs) provide point-of-care medical screening without the need for expensive laboratory equipment. RDTs are theoretically straightforward to use, yet their analog colorimetric output leaves room for diagnostic uncertainty and error. Furthermore, RDT results within a community are kept isolated unless they are aggregated by healthcare workers, limiting the potential of RDTs to support public health efforts. In light of these issues, we present a system called RDTScan for detecting and interpreting lateral flow RDTs with a smartphone. RDTScan provides real-time guidance for clear RDT image capture and automatic interpretation for accurate diagnostic decisions. RDTScan is structured to be quickly configurable to new RDT designs, requiring only a template image and metadata about how the RDT should be read, making it easier to extend than a data-driven approach. Through a controlled lab study, we demonstrate that RDTScan's limit-of-detection can match, and even exceed, the performance of expert readers interpreting the physical RDTs themselves. We then present two field evaluations of smartphone apps built on the RDTScan system: (1) at-home influenza testing in Australia and (2) malaria testing by community healthcare workers in Kenya. RDTScan achieved 97.5% and 96.3% accuracy compared to expert RDT interpretation in the Australia Flu Study and the Kenya Malaria Study, respectively.
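The paper does not reproduce RDTScan's implementation here, but the template-plus-metadata idea it describes can be illustrated with a minimal sketch: locate the test strip in a photo via normalized cross-correlation against a template image, then use metadata (the row offsets of the control and test lines) to decide which lines are present. All function names, the intensity threshold, and the metadata format below are hypothetical illustrations, not RDTScan's actual API.

```python
import numpy as np

def normalized_xcorr(image, template):
    """Slide `template` over `image` (2-D grayscale arrays, 0..1) and return
    the (row, col) of the best normalized cross-correlation match plus its
    score. A brute-force stand-in for a real feature-matching pipeline."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            window = image[r:r + th, c:c + tw]
            wz = window - window.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom == 0:
                continue  # flat region, no correlation defined
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

def read_lines(strip, line_offsets, threshold=0.2):
    """Given a located strip (grayscale, light background) and metadata
    mapping line names to row offsets, report which lines are present.
    A line counts as present when its row is markedly darker than the
    strip's background (hypothetical 20% relative-darkness threshold)."""
    background = np.median(strip)
    results = {}
    for name, row in line_offsets.items():
        intensity = strip[row].mean()
        results[name] = (background - intensity) / max(background, 1e-9) > threshold
    return results
```

Configuring this sketch for a new RDT design would mean supplying a new template array and a new `line_offsets` dictionary, mirroring the paper's claim that only a template image and reading metadata are needed, with no retraining.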

Original publication

DOI: 10.1145/3448106
Type: Journal
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Publication Date: 29/03/2021
Volume: 5