INTRODUCTION
Laryngeal squamous cell carcinoma (LSCC) is diagnosed in 1800 patients
in England and Wales annually, half of whom have locally advanced
disease (American Joint Committee on Cancer [AJCC] 8th edition, stages III and IV)
at presentation [1]. Prior to 1991, laryngectomy was considered the
treatment of choice, offering the highest chance of cure. However, the
treatment paradigm shifted towards non-surgical laryngeal preservation
strategies following the results of the Veterans Affairs (VA) Laryngeal
Cancer Study, which demonstrated equivalent survival and favourable
laryngeal preservation rates (64%) in patients undergoing induction
chemotherapy and definitive radiotherapy versus total laryngectomy and
postoperative radiotherapy [2]. The role of concurrent
chemoradiotherapy (CRT) was established by the Radiation Therapy
Oncology Group 91-11 trial, which showed improved locoregional control
and an even higher laryngeal preservation rate of 81% with concomitant
CRT, compared with 67% in the induction chemotherapy/radiotherapy arm
and 63% in the definitive radiotherapy arm [3].
Whilst primary CRT has become the standard of care for T3 and low-volume
T4 disease in most United States and UK centres since the publication of
these two seminal trials, a number of questions regarding the role of
larynx-preserving strategies remain [4]. In particular, there is the
vexed issue of what constitutes meaningful laryngeal preservation once
function and quality of life are taken into account. (C)RT is associated
with significant side-effects, with a third of patients experiencing
grade 3-5 toxicities, most notably affecting the airway and
swallowing [3]. It is therefore of significant clinical relevance to
inform patient selection for these treatment strategies.