Invited Talk (Breiman Lecture)
Making Algorithms Trustworthy: What Can Statistical Science Contribute to Transparency, Explanation and Validation?
David Spiegelhalter

Thu Dec 06 05:30 AM -- 06:20 AM (PST) @ Room 220 CD

The demand for transparency, explainability and empirical validation of automated advice systems is not new. Back in the 1980s there were (occasionally acrimonious) discussions between proponents of rule-based systems and those based on statistical models, partly over which were more transparent. A four-stage process for evaluating medical advice systems was established, based on that used in drug development. More recently, EU legislation has focused attention on the ability of algorithms to, if required, show their workings. Inspired by Onora O'Neill's emphasis on demonstrating trustworthiness, and her idea of 'intelligent transparency', we should ideally be able to check (a) the empirical basis for the algorithm, (b) its past performance, (c) the reasoning behind its current claim, including tipping points and what-ifs, and (d) the uncertainty around its current claim, including whether the latest case comes within its remit. Furthermore, these explanations should be open to different levels of expertise.
These ideas will be illustrated by the Predict 2.1 system for women choosing adjuvant therapy following surgery for breast cancer, which is based on a competing-risks survival regression model and has been developed with professional psychologists in close cooperation with clinicians and patients. Predict 2.1 offers four levels of explanation of the claimed potential benefits and harms of alternative treatments, and is currently used in around 25,000 clinical decisions a month worldwide.
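To give a flavour of the competing-risks idea behind a system like Predict, here is a minimal illustrative sketch (not the actual Predict model, which is a fitted regression on patient data): assuming constant cause-specific hazards, the chance of dying from each cause by a given time can be computed in closed form, with each cause taking its proportional share of overall mortality. The hazard values below are hypothetical.

```python
import math

def cumulative_incidence(h_cancer, h_other, years):
    """Cumulative incidence of death from each cause under competing risks.

    h_cancer, h_other: annual cause-specific hazard rates, assumed constant.
    Returns (P_cancer_death, P_other_death) by `years`.
    """
    total = h_cancer + h_other
    # Probability of dying from any cause by the given time
    p_any = 1.0 - math.exp(-total * years)
    # Under constant hazards, each cause claims its proportional share
    return (h_cancer / total * p_any, h_other / total * p_any)

# Hypothetical example: 2%/yr cancer-specific hazard, 1%/yr other-cause hazard
p_cancer, p_other = cumulative_incidence(0.02, 0.01, years=10)
```

Note that the two probabilities compete: raising the other-cause hazard lowers the chance of dying from cancer, which is why a competing-risks formulation, rather than a single survival curve, is needed when communicating treatment benefits.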

Author Information

David Spiegelhalter (University of Cambridge)

David Spiegelhalter is a statistician in the Centre for Mathematical Sciences at the University of Cambridge, and currently President of the Royal Statistical Society. His background is in Bayesian statistics, and after working in computer-aided diagnosis in the early 1980s, he jointly developed the Lauritzen-Spiegelhalter algorithm for exact evidence propagation in Bayesian networks. He then led the team behind the BUGS software for MCMC analysis of Bayesian models. He is now Chair of the Winton Centre for Risk and Evidence Communication, which aims to improve the way that statistical evidence is used by health professionals, patients, lawyers and judges, media and policy-makers. This work includes the development and evaluation of front-ends for algorithms used in patient care, focusing on explanation and transparency, particularly regarding uncertainty. He has over 200 refereed publications and is co-author of 6 textbooks, as well as The Norm Chronicles (with Michael Blastland) and Sex by Numbers. He works extensively with the media, and presented the BBC4 documentaries "Tails You Win: The Science of Chance" and the award-winning "Climate Change by Numbers". He was elected Fellow of the Royal Society in 2005, and knighted in 2014 for services to medical statistics. Perhaps his greatest achievement came in 2011 when he was 7th in an episode of Winter Wipeout on BBC1.