
The flaw in the Public Health evidence base

By Scott Longley | Reading Time: 5 minutes
The gambling sector has consistently asked for policy which is evidence-based. But what happens if that evidence is itself suspect?

We are now so deep into extra time on the publication of the British government’s white paper on gambling that onlookers could be forgiven for forgetting many of the details of how we got here.

But as will likely become clear when the details are finally published (or may already be clear by the time you read this), the evidence relied upon for the cost of problem gambling in Britain is highly questionable.


Central to this debate is a report issued by Public Health England (PHE) back in September last year.

That report estimated that the costs associated with gambling harms in England were £1.27bn a year.

These costs consist of the direct expense to the public purse caused by a range of harms, including homelessness, depression, alcohol dependence, illegal drug use, unemployment and imprisonment; as well as the indirect intangible costs of suicide.

But explanations for how these calculations were reached are curiously absent from the PHE report. As was indicated recently by a set of written questions from Scott Benton MP to Sajid Javid, the secretary of state for health and social care, the “full numerical mathematical” calculations used to estimate the £335.5m cost of depression and the £619.2m annual cost of suicides are missing.

The basis for stating that there were 409 deaths by suicide “associated with problem gambling only” was similarly absent.

Veil of secrecy

Unfortunately, Benton is likely to get short shrift for his efforts, going by the reply to a direct question on the same subject that he put to the then-minister responsible for the Gambling Act review, Chris Philp.

Philp said “the lack of quantitative causal evidence for some of the harms described did not allow PHE to make a direct assessment of the cost of gambling harm specifically”.

“While the review acknowledged that further research is needed to determine costs attributable directly to gambling-related harm rather than those associated with people who are problem or at-risk gamblers, it is the most comprehensive review of the evidence on gambling-related harm and its associated costs, and has been carefully considered as an important input to our review of the Gambling Act 2005.”

To which the reply must be: comprehensive in what sense? Without the evidence to back up the numbers, the government will be basing the white paper proposals on a report that is opaque about its calculations and that cannot be used to causally assess the involvement of gambling in those harms.

For this article, iGB approached the Department of Health and Social Care (DHSC) and got the response that the report “includes the methodology used to develop the estimate of the number of deaths by suicide associated with problem gambling, the estimate of the annual cost of suicides associated with problem gambling and the estimate of the cost of depression associated with at-risk and problem gambling.”

Which is to say, it includes its methodology but not the actual evidence.

A recent Freedom of Information Act request to elicit the evidence met with a blank refusal from the DHSC. The department wrote that providing the actual numerical calculations upon which the PHE cost estimates are based would be too time-consuming. It claimed that “to adapt and format our analyses for external use… would imply a considerable time above the threshold” and that “the cost of this work would exceed the appropriate limit”.

It is worth reflecting on this. The DHSC claims that it would take more than 24 hours’ work to “adapt and format” the PHE calculation to render it suitable for publication.

This indicates a number of things. 

First, no one outside the DHSC (and possibly no one within the DHSC) understands how the cost estimates have actually been arrived at. Second, very few – if any – of the public figures (including the gambling minister) who have cited the cost figures have any idea whether they have any basis in reality. 

Finally, we must assume that the cost calculations are far more complex than the description provided in the PHE report suggests.

But as we shall see, concerns about the reliability of the report are not confined to DHSC secrecy.

Questionable methodology

The government’s inability even to explain how the evidence it is relying on was arrived at is a worrying sign.

Recent work undertaken by Dan Waugh, a partner at the research and advisory firm Regulus Partners, has pointed out a number of substantial methodological flaws in PHE’s work.

These include the absence of a consistent conceptual framework for defining costs, the use of small sample populations and non-representative studies from overseas jurisdictions such as Canada and Sweden, the suppression of inconvenient research findings, and the attribution of all estimated excess costs to gambling despite the complexity of the harms noted.

Waugh highlights a fundamental problem with PHE’s much-cited estimate of 409 excess deaths from suicide associated with problem gambling only. “In the absence of any hard data, the researchers extrapolated from a 2018 study of hospital patients in Sweden with a diagnosis of gambling disorder,” he says. “They then assumed that the suicide mortality ratio for gambling disorder – a recognised psychiatric disorder – would be the same for PGSI ‘problem gambling’, which is a sub-clinical classification and not a disorder.” 

As Waugh points out, gambling disorder (as defined by the World Health Organisation’s International Classification of Diseases) is not the same as problem gambling – it generally involves greater dysregulation, more severe harms and more complex comorbidities. 

He adds that PHE’s conflation of problem gambling with gambling disorder is “massively distortive” and suggests either that the researchers “did not understand the basics of the subject they were analysing or that they deliberately misled”. 
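
To see why that substitution matters, here is a minimal illustrative sketch (in Python) of the extrapolation logic Waugh describes. Every figure in it is a hypothetical placeholder chosen purely for illustration, not a number used by PHE or by the Swedish study; the point is simply that the resulting estimate scales directly with whichever mortality ratio is assumed.

# Illustrative sketch only: every number below is a hypothetical placeholder,
# not a figure used by PHE or by the Swedish study.

def excess_suicides(population, baseline_rate, smr):
    """Excess deaths implied by applying a suicide mortality ratio (SMR)
    observed in one cohort to a different population.
    expected deaths at baseline = population * baseline_rate
    excess deaths = expected * (smr - 1)
    """
    expected = population * baseline_rate
    return expected * (smr - 1)

problem_gamblers = 300_000   # placeholder PGSI 'problem gambling' population
baseline_rate = 0.0001       # placeholder general-population suicide rate per year
smr_clinical = 15.0          # placeholder SMR from a clinical gambling-disorder cohort
smr_subclinical = 3.0        # placeholder SMR for a sub-clinical group

print(excess_suicides(problem_gamblers, baseline_rate, smr_clinical))     # 420.0
print(excess_suicides(problem_gamblers, baseline_rate, smr_subclinical))  # 60.0

With these hypothetical inputs, the same population yields a headline figure several times larger simply because a clinical cohort’s mortality ratio has been transplanted onto a sub-clinical one, which is precisely the substitution Waugh objects to.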

“There are a number of other problems with PHE’s estimate but this error alone is sufficient to invalidate their findings,” Waugh adds.

Similar problems run through the entire PHE cost estimates report, raising significant concerns about the reliability of evidence from state agencies (the Gambling Commission has also been accused of evidence manipulation during the review). 

Those concerned about a public health stitch-up point to an article written in 2020 by the lead researcher on the PHE gambling harms project in which it was claimed (in relation to gambling) that “more research and evidence are needed to support advocacy and action”.

It is debatable whether it is the role of any research to support advocacy (as distinct from enlightenment), but the use of taxpayer funds by a state agency for this end appears particularly questionable.

The worry here for the gambling sector is that whatever is included in the white paper will be seen only as a staging post for the next run at greater prohibitions. In which case, we are likely to see more misinformation flowing from heavily biased research reports.

The sector needs to start countering this onslaught lest it be hit by even more draconian attacks down the line. The data is there to be investigated and the claims of the anti-gambling lobby are there to be shot at.

It might be too late for that to happen with the forthcoming white paper, but the industry should be pursuing this argument in relation to future government action. Its fate will depend on that effort.

Scott Longley has been a journalist since the early noughties covering personal finance, sport and gambling. He has worked for a number of publications including Investment Week, Bloomberg Money, Football First, eGaming Review and Gambling Compliance.
