Tuesday 7 August 2018

Gambling with Our Data: Black Box Bellagio



I spent last weekend at Nesta’s Future Fest. Every year it continues to blow my mind, bringing together thousands of people interested in an alternative reality, as Nesta says, “looking for a different story, one that is desirable, plausible and able to connect past, present and future”. I strongly believe that people’s beliefs about their ability to shape or control their destiny go a long way to determining whether they thrive or merely survive. This year we looked at the fragility of nature, the brokenness of politics, sex, race and gender in robotics and Artificial Intelligence (AI), and, what I’d like to touch on in this blog, the challenges of data sovereignty.



I found an empty space at the blackjack table within the darkened ‘Black Box Bellagio’, an installation that transforms what we know as a casino into something unusual. “Take a seat at one of our playing tables and place your bets at Social Strip Poker (not sure the stripping actually happened, thank the good Lord), Best-Bet Blackjack, or maybe Risky Roulette”. Instead of money, your personal data becomes the currency. “Play with the (un)fairness of expected values and chances, predicted risks, and giving up your identity”, states the house. This experience, and the recent articles I have been reading on big data, have led me to reflect on my concerns about how our data is handled.


The data industry has exploded in recent years. You will hear people talking about ‘big data’ being used to improve systems, enhance customer experience and offer better-tailored services and products to clients; overall, it is near-essential for any company in this day and age. The collection and use of all this data is what will drive AI and machine learning. Indeed, even 'The Economist' agrees, arguing in a recent article about AI that “the world’s most valuable resource is no longer oil, but data.”

I was watching a TED talk yesterday in which Dina Katabi explained how vital signs can now be monitored using wireless signals. Data on breathing, heartbeat, sleep and even impending cardiac arrest and other chronic conditions can be collected, analysed and provided to an individual or healthcare professional without the need for probes, sensors or intrusive data collection methods.

In my line of work, what would data look like for the safety of young people? How does it fit into the notion of a smart city, where data from multiple sources and agencies is layered and mapped to give a more detailed reflection of “the whole picture”? Gathering and sharing the latest and most accurate information on cities, and on where trouble occurs, exposes recognisable ‘trends’ in incidents that can be better prepared for and responded to more quickly. SafeStats’ recent work on ‘hotblocking’ is a case in point: a crime mapping method showing where the risk of violent street crime is highest. By making optimal use of shared data from ambulance services and A&E departments, the Met Police were given clarity about where and when particular police activities should be targeted so as to maximise efforts to cut violent crime. The Children’s Society, working with a number of partners, is exploring how this might work to link up victims, perpetrators and common ‘hotspots’ of exploitation, thereby enabling the disruption of exploitation and trafficking networks.
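To make the idea of layering data a little more concrete, here is a minimal sketch in Python of how incident records from two sources might be combined and counted to surface hotspots. Everything in it, the source names, field names and example records, is invented for illustration; it is not SafeStats’ or the Met Police’s actual method.

```python
from collections import Counter

# Hypothetical incident records from two different agencies.
# In practice these would come from shared datasets (for example
# ambulance call-outs and A&E attendances), not hard-coded examples.
ambulance_callouts = [
    {"area": "Camden High St", "hour": 23},
    {"area": "Camden High St", "hour": 23},
    {"area": "Brixton Rd", "hour": 1},
]
ae_attendances = [
    {"area": "Camden High St", "hour": 23},
    {"area": "Brixton Rd", "hour": 1},
    {"area": "Brixton Rd", "hour": 2},
]

def hotspots(*sources, top_n=3):
    """Layer incident records from several sources and count them
    by (area, hour) to show where and when incidents cluster."""
    counts = Counter()
    for source in sources:
        for record in source:
            counts[(record["area"], record["hour"])] += 1
    return counts.most_common(top_n)

for (area, hour), n in hotspots(ambulance_callouts, ae_attendances):
    print(f"{area} around {hour:02d}:00: {n} incidents")
```

The value comes from the combination: neither source on its own shows the full picture, but counted together the same places and times rise to the top.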

Data quality will also become increasingly important. There will always be, and increasingly so as we walk into the future, a need for roles that ensure data quality: people who possess the skills and tools to collect, clean, consolidate, store and analyse data. This is the only way we can truly ensure that we as humans, and our AI counterparts, are making informed decisions.
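As a toy illustration of what “cleaning and consolidating” can mean in practice (the records and rules below are entirely made up), a sketch might look like this:

```python
# Basic data cleaning: drop duplicate records and records with missing
# fields before any analysis is done. Field names are invented.
raw_records = [
    {"id": 1, "name": "A. Smith", "postcode": "N1 9GU"},
    {"id": 1, "name": "A. Smith", "postcode": "N1 9GU"},   # duplicate
    {"id": 2, "name": "B. Jones", "postcode": None},        # missing value
    {"id": 3, "name": "C. Patel", "postcode": "SW9 8DJ"},
]

def clean(records):
    seen = set()
    cleaned = []
    for r in records:
        key = (r["id"], r["name"], r["postcode"])
        if None in key or key in seen:
            continue  # skip incomplete or duplicate records
        seen.add(key)
        cleaned.append(r)
    return cleaned

print(clean(raw_records))  # only records 1 and 3 survive
```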

When it comes to data, privacy is understandably a major concern. How do we as members of the public understand how our data is being used and handled, and how will it be regulated? I guess the first test is GDPR (if you don’t know what this is after the thousands of e-mails you received in the run-up to 25th May 2018, then you need to do something drastic, like go to Specsavers). It will be an early indicator of how well companies are able to comply with the legislation, and will no doubt demonstrate how effective legislation is, or can be, at tackling non-compliance. If anything, it has given people like me a huge opportunity to clean up existing personal data.

It is common knowledge that Facebook uses algorithms to track our behaviour, but what these algorithms do, or when they are at work, is largely unknown. Still, we are held responsible for our own knowledge. In this imposed state of insecurity, all our options seem untrustworthy. What is good and what is evil?


The Economist, in its report, makes the point that the current outlook is “likely to be between utopia and disaster”. That doesn’t fill me with much hope, or much doom either. I believe that as companies like DigiMe and ArchiveSocial, which help to unlock the power of your data, grow, we will be able to build new rules centred on the individual as the central decision point for sharing and protection. True data sovereignty. Many of the issues I have highlighted can be tackled if we decide to address them immediately, with significant legislation and frameworks to provide boundaries within which to work and operate safely.

As for my stakes at the Black Box Bellagio? Let’s just say I lost my data sovereignty when I lost to the house, the result being that I had to ‘like’ a Facebook page randomly selected by the croupier. I wasn’t as unlucky as some, who ended up with Britain First or some kind of neo-Nazi group page, but I did have to like Gary John, flat earth conspirator. Perhaps someone Nesta could engage with for next year’s Future Fest? *pops note onto suggestion wall*