The statistician Sir Ronald Fisher famously said that asking a statistician to examine data only after an experiment is completed is to invite a post-mortem; the statistician can often say only how the experiment died.
Careful planning and design of experimental studies, including engineering experiments, process improvements, and public health and education interventions, helps to ensure that studies deliver useful information on outcomes and impact. This event will provide an overview of the principal requirements for successful experimental planning and design, and present examples of some advanced experimental design methods for quality improvement.
The event will be of interest to researchers and quality improvement professionals considering quantitative studies, and to statistics professionals involved in planning and design of experimental quality improvement and research studies in a broad variety of fields.
Welcome and Introduction.
Shirley Coleman – Newcastle University
Basic principles of Experimental Design – a refresher.
Steve Ellison – LGC Limited, Middx, UK
This short refresher will discuss the main features of a good experiment. These include clear objectives; choice of the right outcome measure; adequate capability to detect the effects expected from an intervention; plans for coping with nuisance effects that might interfere with interpretation; appropriate handling of practical constraints; gathering information on variability and/or uncertainty; and early consideration of how the data will be assessed. Brief examples from the development of chemical and biological measurement will be used to illustrate some typical planning constraints and corresponding experimental design strategies.
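The "adequate capability to detect the effects expected" point above is, in statistical terms, a power and sample-size question. As a hypothetical illustration (not drawn from the talk itself), the following sketch uses the standard two-sample normal approximation to estimate how many observations per group are needed to detect a given difference in means:

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect, sd, alpha=0.05, power=0.8):
    """Approximate n per group to detect a difference `effect` between two
    group means with common standard deviation `sd`, using the two-sided
    two-sample z-approximation (illustrative; not exact for small n)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for two-sided test
    z_beta = z.inv_cdf(power)            # quantile corresponding to power
    return math.ceil(2 * ((z_alpha + z_beta) * sd / effect) ** 2)

# Detecting a 1-unit shift when the process sd is 2 units:
print(sample_size_per_group(effect=1.0, sd=2.0))  # → 63 per group
```

Note how the required sample size grows with the square of the ratio sd/effect, which is why "adequate capability" must be checked before, not after, running the experiment.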
Quality improvement using Bayesian adaptive design applied to a digital twin.
Liam Fleming - Newcastle University and Johnson Matthey
Chemical manufacturing processes are governed by complex mechanisms such as reaction kinetics and fluid dynamics. Such processes are also subject to considerable uncertainty arising from inherent stochasticity and process noise. Because on-plant experimentation is costly and time-consuming, approximate "digital twins" are constructed using simulation. These simulations, based on proprietary solvers, still require experimental data and are cumbersome to work with, so it is valuable to emulate the simulation as a representation of the real-life process. We will discuss the design of an orthogonal factorial experiment and show how Bayesian adaptive design is used to select further design points. We will then show how the non-linear response is explored further by fitting a predictive model with Gaussian processes (GPs).
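As a rough illustration of the emulation idea (the test function, kernel, hyperparameters and acquisition rule below are all illustrative assumptions, not the speaker's actual method), the sketch fits a toy Gaussian-process emulator to an "expensive simulator" and adaptively adds design points where predictive uncertainty is largest:

```python
import numpy as np

def simulator(x):
    # Hypothetical stand-in for an expensive digital-twin run
    return np.sin(3 * x) + 0.5 * x

def sq_exp_kernel(a, b, ls=0.3, var=1.0):
    # Squared-exponential covariance between 1-D input arrays
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_predict(X, y, Xs, jitter=1e-6):
    # Standard GP posterior mean and variance at test points Xs
    K = sq_exp_kernel(X, X) + jitter * np.eye(len(X))
    Ks = sq_exp_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(sq_exp_kernel(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mu, np.maximum(var, 0.0)

# Initial design: a small evenly spaced grid (stand-in for a factorial design)
X = np.linspace(0.0, 2.0, 4)
y = simulator(X)

grid = np.linspace(0.0, 2.0, 201)
for _ in range(5):
    mu, var = gp_predict(X, y, grid)
    x_new = grid[np.argmax(var)]          # adaptive rule: most uncertain point
    X = np.append(X, x_new)
    y = np.append(y, simulator(x_new))

mu, var = gp_predict(X, y, grid)
print("final max predictive sd:", float(np.sqrt(var.max())))
```

The maximum-variance rule here is the simplest possible acquisition criterion; a full Bayesian adaptive design would choose points by a decision-theoretic utility, but the loop structure (fit, choose, evaluate, refit) is the same.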
Quasi-experimental design in quality improvement studies - statistical considerations.
Abdel Douiri - King's College London
Evaluating the effectiveness of a process of service delivery, and ensuring the sustainability of improvements over time, is one of the main objectives of quality improvement. It is therefore important to develop robust designs and methodologies that help to evaluate continuously the services being delivered and to detect whether they are getting better, staying the same or getting worse over time. It is important to distinguish between the two main study designs in quality improvement: 1) quality improvement projects; and 2) effectiveness studies for improvement. In this talk, we will discuss quasi-experimental designs for quality improvement studies and provide strategies and tools to identify performance gaps, inform and evaluate quality improvement initiatives, and ensure continuous improvement.
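One widely used quasi-experimental design in quality improvement is the interrupted time series, analysed by segmented regression. The sketch below (the simulated data and effect sizes are illustrative assumptions, not results from the talk) estimates the level change and slope change in a performance measure at the point an intervention is introduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated monthly performance measure: 24 months pre- and 24 post-intervention
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
post = (t >= n_pre).astype(float)        # indicator: after intervention
t_post = np.maximum(t - n_pre, 0)        # months elapsed since intervention

# Hypothetical truth: baseline trend, a +4 level jump, a +0.3 slope change
y = 50 + 0.1 * t + 4.0 * post + 0.3 * t_post + rng.normal(0.0, 1.0, t.size)

# Segmented regression: y ~ intercept + t + post + t_post
X = np.column_stack([np.ones(t.size), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level change: {beta[2]:.2f}, slope change: {beta[3]:.2f}")
```

The pre-intervention segment acts as the counterfactual trend, which is what makes the design quasi-experimental: there is no randomised control, so the comparison is against the service's own projected history.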
Session Chair: Shirley Coleman – Newcastle University; Chair, RSS Quality Improvement Section
Abdel Douiri - King's College London
Dr Douiri is a Reader in Medical Statistics in the School of Population Health & Environmental Sciences at King's College London. He has an MSc in numerical analysis and a PhD in signal processing (applied mathematics). He has been involved in a variety of research projects with world-leading research groups, in both academia and industry. He has been a lecturer at King's since 2009, teaching medical statistics and epidemiology on the undergraduate MBBS and the postgraduate MPH. He is a statistical editor for Thorax (a BMJ journal) and a statistical consultant with the Biomedical Research Centre, the Research Design Service London, and King's College Hospital.
Steve Ellison – LGC Limited
Dr Ellison is a Science Fellow at LGC, Teddington, the UK National Measurement Laboratory for chemical and biological measurement. An RSS Fellow and member of the RSS Quality Improvement Section, his principal interests are in applications of statistics to analytical chemistry and measurement science, including experimental design for analytical method development, interlaboratory studies of method performance, proficiency testing, and reference material certification. He serves on a range of IUPAC, ISO, CEN, BSI and other committees concerned with applications of statistics, has contributed to several International Standards and guidelines, has co-authored a textbook of statistical methods for analytical chemists, and provides training in statistical methods for measurement science.
Liam Fleming – Newcastle University
Liam Fleming is a third-year PhD student studying statistics with a focus on stochastic computation for engineering problems. He is based at Newcastle University and his research is in collaboration with the Alan Turing Institute. He is currently undertaking an industrial placement applying statistical and machine learning techniques to data from simulations of complex chemical processes. He has a first-class BEng honours degree in chemical engineering. His technical interests focus on using data and statistics to improve the quality and efficiency of numerical simulations.