The game is called Breaking Harmony Square and is set in a fictional small town. Players are recruited as the Chief Disinformation Officer.

The aim is to be as evil as you can and disrupt everyday life and local elections. It works like an inoculation, exposing users to a controlled dose of fake news and showing how it is spread, in the hope of helping them spot misinformation in future.

A game built by psychologists has been designed to teach players to sniff out ‘fake news’ by encouraging them to sabotage elections in a fictional town.

Cambridge University researchers created ‘Breaking Harmony Square’ in partnership with the US Department of State and the Department of Homeland Security.

The idea is to teach players to sniff out ‘fake news’ by letting them be as evil as possible and play the part of Chief Disinformation Officer in a fictional small town.

Using tactics such as bots, trolling and conspiracy-spreading, the game shows how disinformation works in real life, with the small town as an example.

In a controlled trial involving 681 people – half playing this and half playing Tetris – those who played Breaking Harmony Square rated misinformation as 16 per cent less reliable on average and were 11 per cent less willing to share it in future.

The goal of the free game, played in a web browser, is to sow as much discord as possible in the fictional town by polarising audiences and spreading fear and anger.

Over the game's four chapters, the neighbourhood descends into chaos as players spread falsehoods, including setting up a disreputable news site to attack a TV anchor.

Users learn five manipulation techniques as part of the gameplay, with the aim that they will be able to spot those tricks in the real world.

They are: trolling to provoke outrage; exploiting emotional language to create anger and fear; artificially amplifying reach through bots and fake followers; creating and spreading conspiracy theories; and polarising audiences.

Study author Sander van der Linden said the game worked like a form of inoculation intervention against the worst excesses of misinformation and fake news.

‘But at the same time, elections are often decided on small margins, so inoculating even a million people could potentially make a practical difference.’

‘Games are interactive and require active cognitive involvement on the part of the player,’ co-author Jon Roozenbeek told MailOnline.

‘This kind of “active” inoculation against fake news, in theory at least, is a good way to retain the lessons learned during the game.’

Other methods of ‘retraining’ people include more passive interventions, such as reading or watching a video, whose lessons can be harder to remember in the longer term.

‘Aside from this, you can be quite creative when creating games, and engage in world-building and use humour to make the game more attractive and entertaining,’ Roozenbeek explained.

In a randomised controlled trial, the researchers found that players were better at spotting fake news after playing the 10-minute online game.

For the trial, 681 people were asked to rate the reliability of a series of news and social media posts – some of which were real and some of which contained misinformation.

Half of the participants were given Tetris, while the other half were given the fake news game, and had to rate the posts before and after playing.

‘We wanted to include a gamified control condition, to make the amount of cognitive load comparable between conditions,’ Roozenbeek told MailOnline.

‘Tetris is a good fit for this, because it’s been used in previous studies, it’s in the public domain, and the learning curve is quite flat so that most participants already know how to play it.’

Among those who played Breaking Harmony Square, the perceived reliability of misinformation dropped by an average of 16 per cent and willingness to share fake news with others dropped by 11 per cent, the researchers found.

The finding also held across the political spectrum – political party or leaning made no difference to the change in behaviour from playing the game.

Of the Breaking Harmony Square group, 63 per cent said they would go on to be more discerning about fake news, compared to just 37 per cent of the Tetris group.

The gameplay is based on inoculation theory, the idea that exposing people to a weak ‘dose’ of common techniques used to spread fake news allows them to better identify and disregard misinformation when they encounter it in future.

Dr Sander van der Linden said trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted.

‘By pre-bunking, we aim to stop the spread of fake news in the first place,’ said the researcher from the Cambridge Social Decision-Making Lab.

‘The aftermath of this week’s election day is likely to see an explosion of dangerous online falsehoods as tensions reach fever pitch.

‘Fake news and online conspiracies will continue to chip away at the democratic process until we take seriously the need to improve digital media literacy across populations,’ he explained.

‘The effectiveness of interventions such as Breaking Harmony Square is a promising start.’

The same team that built Breaking Harmony Square also developed a game in partnership with the UK Cabinet Office called Go Viral! to tackle Covid-19 misinformation.

‘For the Go Viral! game, we focused on three techniques: fearmongering, using fake experts, and spreading conspiracies,’ Roozenbeek told MailOnline.

‘We chose these three not necessarily because these are the only three techniques used to spread misinformation about COVID-19, but because they’re very common and take relatively little time to explain.

‘For the Breaking Harmony Square game, we focused on manipulation techniques that are more directly relevant to political disinformation campaigns.

‘These techniques overlap to some degree, of course, and we encourage players to look up more information about other manipulation techniques, if they’re interested.’

Van der Linden said that during the recent US election, a wider array of prebunking and inoculation efforts early in the campaign would have limited people’s susceptibility to disinformation further than Twitter’s measures already did.

‘Twitter recently implemented a prebunking strategy during the last week of the election and disseminated it to all of its US users I believe to prebunk election misinformation. This was innovative,’ he said.

‘What we show in our research is that inoculation can also have “therapeutic” benefits, i.e., even when people have already been exposed to falsehoods to some degree, it can still boost immune response.’

This mirrors the development of therapeutic vaccines in medicine and could perhaps reduce people’s willingness to spread the misinformation further.

Originally published at Daily Mail.