Loot boxes in games are gambling and should be banned for kids, say UK MPs

UK MPs have called for the government to regulate the games industry’s use of loot boxes under current gambling legislation — urging a blanket ban on the sale of loot boxes to players who are children.

Kids should instead be able to earn in-game credits to unlock loot boxes, MPs have suggested in a recommendation that won’t be music to the games industry’s ears.

Loot boxes are virtual items in games that can be bought with real-world money and do not reveal their contents in advance. The MPs argue the mechanic constitutes a game of chance played for money’s worth, and should therefore be regulated under the UK Gambling Act.
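In mechanical terms, a loot box is a weighted random draw whose outcome is hidden until after the purchase, which is the property the committee likens to a game of chance. A minimal sketch in Python, with a hypothetical drop table (the item names and drop rates are illustrative assumptions, not taken from any real game):

```python
import random

# Hypothetical drop table: item -> drop probability.
# The rarest items carry tiny probabilities, and the buyer only
# learns the outcome after paying, which is what makes the
# mechanic resemble a game of chance played for money's worth.
DROP_TABLE = {
    "common skin": 0.70,
    "rare skin": 0.25,
    "legendary skin": 0.05,
}

def open_loot_box() -> str:
    """Resolve one paid loot box as a single weighted random draw."""
    items = list(DROP_TABLE)
    weights = list(DROP_TABLE.values())
    return random.choices(items, weights=weights, k=1)[0]

print(open_loot_box())  # e.g. "common skin"
```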

The Department for Digital, Culture, Media and Sport’s (DCMS) parliamentary committee makes the recommendations in a report published today following an enquiry into immersive and addictive technologies, which saw it take evidence from a number of tech companies including Fortnite maker Epic Games; Facebook-owned Instagram; and Snapchat.

The committee said it found representatives from the games industry to be “wilfully obtuse” in answering questions about typical patterns of play — data the report emphasizes is necessary for proper understanding of how players are engaging with games — as well as calling out some games and social media company representatives for demonstrating “a lack of honesty and transparency”, leading it to question what the companies have to hide.

“The potential harms outlined in this report can be considered the direct result of the way in which the ‘attention economy’ is driven by the objective of maximising user engagement,” the committee writes in a summary of the report which it says explores “how data-rich immersive technologies are driven by business models that combine people’s data with design practices to have powerful psychological effects”.

As well as trying to pry information out of games companies, MPs also took evidence from gamers during the course of the enquiry.

In one instance the committee heard that a gamer spent up to £1,000 per year on loot box mechanics in Electronic Arts’ FIFA series.

A member of the public also reported that their adult son had built up debts of more than £50,000 through spending on microtransactions in online game RuneScape. The maker of that game, Jagex, told the committee that players “can potentially spend up to £1,000 a week or £5,000 a month”.

In addition to calling for gambling law to be applied to the industry’s lucrative loot box mechanic, the report calls on games makers to face up to responsibilities to protect players from potential harms, saying research into possible negative psychosocial harms has been hampered by the industry’s unwillingness to share play data.

“Data on how long people play games for is essential to understand what normal and healthy — and, conversely, abnormal and potentially unhealthy — engagement with gaming looks like. Games companies collect this information for their own marketing and design purposes; however, in evidence to us, representatives from the games industry were wilfully obtuse in answering our questions about typical patterns of play,” it writes.

“Although the vast majority of people who play games find it a positive experience, the minority who struggle to maintain control over how much they are playing experience serious consequences for them and their loved ones. At present, the games industry has not sufficiently accepted responsibility for either understanding or preventing this harm. Moreover, both policy-making and potential industry interventions are being hindered by a lack of robust evidence, which in part stems from companies’ unwillingness to share data about patterns of play.”

The report recommends the government require games makers to share aggregated player data with researchers, with the committee calling for a new regulator to oversee a levy on the industry to fund independent academic research, including into ‘gaming disorder’, an addictive condition formally designated by the World Health Organization, and to ensure that “the relevant data is made available from the industry to enable it to be effective”.

“Social media platforms and online games makers are locked in a relentless battle to capture ever more of people’s attention, time and money. Their business models are built on this, but it’s time for them to be more responsible in dealing with the harms these technologies can cause for some users,” said DCMS committee chair, Damian Collins, in a statement.

“Loot boxes are particularly lucrative for games companies but come at a high cost, particularly for problem gamblers, while exposing children to potential harm. Buying a loot box is playing a game of chance and it is high time the gambling laws caught up. We challenge the Government to explain why loot boxes should be exempt from the Gambling Act.

“Gaming contributes to a global industry that generates billions in revenue. It is unacceptable that some companies with millions of users and children among them should be so ill-equipped to talk to us about the potential harm of their products. Gaming disorder based on excessive and addictive game play has been recognised by the World Health Organisation. It’s time for games companies to use the huge quantities of data they gather about their players, to do more to proactively identify vulnerable gamers.”

The committee wants independent research to inform the development of a behavioural design code of practice for online services. “This should be developed within an adequate timeframe to inform the future online harms regulator’s work around ‘designed addiction’ and ‘excessive screen time’,” it writes, citing the government’s plan for a new Internet regulator for online harms.

MPs are also concerned about the lack of robust age verification to keep children off age-restricted platforms and games.

The report identifies inconsistencies in the games industry’s ‘age-ratings’ stemming from self-regulation around the distribution of games (such as online games not being subject to a legally enforceable age-rating system, meaning voluntary ratings are used instead).

“Games companies should not assume that the responsibility to enforce age-ratings applies exclusively to the main delivery platforms: All companies and platforms that are making games available online should uphold the highest standards of enforcing age-ratings,” the committee writes.

“Both games companies and the social media platforms need to establish effective age verification tools. They currently do not exist on any of the major platforms which rely on self-certification from children and adults,” Collins adds.

During the enquiry it emerged that the UK government is working with tech companies including Snap to try to devise a centralized system for age verification for online platforms.

A section of the report on Effective Age Verification cites testimony from deputy information commissioner Steve Wood raising concerns about any move towards “wide-spread age verification [by] collecting hard identifiers from people, like scans of passports”.

Wood instead pointed the committee towards technological alternatives, such as age estimation, which he said uses “algorithms running behind the scenes using different types of data linked to the self-declaration of the age to work out whether this person is the age they say they are when they are on the platform”.

Snapchat’s Will Scougal also told the committee that its platform is able to monitor user signals to check that users are the appropriate age, tracking behavior and activity, location, and connections between users to flag an account as potentially underage.
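Neither Wood nor Scougal described the underlying systems in any detail, but the general shape of such age estimation is a classifier that weighs behavioural signals against a user’s self-declared age. A toy sketch of that idea in Python; the signal names, weights and threshold are illustrative assumptions, not any platform’s actual method:

```python
from dataclasses import dataclass

@dataclass
class UserSignals:
    declared_age: int
    # Illustrative behavioural signals; a real platform would use far
    # richer data (activity patterns, location, the social graph).
    share_of_underage_connections: float  # fraction in [0.0, 1.0]
    heavy_school_hours_activity: bool

def flag_as_potentially_underage(s: UserSignals, threshold: float = 0.5) -> bool:
    """Toy heuristic standing in for the 'algorithms running behind the
    scenes' Wood described: score signals that contradict a declared
    adult age and flag the account if the score clears a threshold."""
    if s.declared_age < 18:
        return False  # already declared as a minor; nothing to contradict
    score = 0.7 * s.share_of_underage_connections
    if s.heavy_school_hours_activity:
        score += 0.3
    return score >= threshold

# An account declaring age 21 whose connections are mostly minors and
# whose activity clusters in school hours gets flagged for review.
print(flag_as_potentially_underage(UserSignals(21, 0.8, True)))  # True
```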

The report also makes a recommendation on deepfake content, with the committee saying that malicious creation and distribution of deepfake videos should be regarded as harmful content.

“The release of content like this could try to influence the outcome of elections and undermine people’s public reputation,” it warns. “Social media platforms should have clear policies in place for the removal of deepfakes. In the UK, the Government should include action against deepfakes as part of the duty of care social media companies should exercise in the interests of their users, as set out in the Online Harms White Paper.”

“Social media firms need to take action against known deepfake films, particularly when they have been designed to distort the appearance of people in an attempt to maliciously damage their public reputation, as was seen with the recent film of the Speaker of the US House of Representatives, Nancy Pelosi,” adds Collins.

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.