Transparency: A foundation of trust

When I first started as a Data Policy Lead, it was rare for a conversation that began with data ethics to stay there. We would start earnestly talking about the ethics of a data-related project, but soon realise that the foundations weren’t yet in place. The difficult question was actually a legal one: are we allowed to process the data in this way? Or a data quality one: can we draw out insights for decision-making from incomplete source data? We hadn’t got past the “can we do something?” to reach the “should we do it?” question that data ethics is so good at helping us answer.

In more recent years, however, I’ve seen organisations becoming increasingly confident in the fundamentals of good data protection and information management. The sweeping changes brought in via the Data Protection Act 2018 (including GDPR, and the recent update to the UK GDPR following our exit from the EU) are increasingly bedded in. Ethical discussions can start moving away from checking through data protection impact assessments (DPIAs, a type of risk assessment for using personal data) towards helping organisations safely use new techniques to analyse data and answer complicated questions to improve public services.

The Essex Centre for Data Analytics, ecda, is a great example of an organisation doing just that. I’ve been involved with their Data Ethics Committee (DEC) since spring 2020, and we’re increasingly helping ecda challenge themselves as they work on more and more complex problems. It’s great to see them being innovative and committed to safely bringing together data across a really complex web of areas, and using it to deliver real change for the citizens of Essex.

One of the ways in which ethics can particularly help with this activity is through increasing transparency. When we’re using personal data to answer complicated questions, it is essential that we are open with the public about what we’re doing with their data, and why. I’m a firm believer that no matter how complex an algorithm is, the principles and purpose for using it can be explained. When we’re using the public’s data, we have to ask: “Can we explain what we’re doing to the public? And what would they think of it if we did?”

The Data Ethics Committee has “laypeople”: citizens of Essex with no expertise in data analysis. Projects therefore must be clearly explained to make sure everybody can provide comment and approval; there’s no hiding behind acronyms or needlessly incomprehensible technical language. ecda is currently drafting its Transparency Strategy and is scoping out a Making Data Make Sense programme that seeks to do exactly what it says on the tin! Citizens, data peers and interested parties will come together to develop clear messages, materials and opportunities that will enable everyone to be part of the conversation about how we can make the best use of data.

A subgroup of the Data Ethics Committee is also currently working on a Data Charter for ecda and the DEC. This will clearly set out the principles that everybody in ecda works to, and will continue to work to: things like commitment to privacy, security, public benefit, and transparency. We’re working with a group of citizens to draft the charter; it will be published, and we’ll invite further comment on it. Alongside the minutes from committee meetings and a growing number of blog posts and reports, this is part of ecda’s effort to build a two-way dialogue with the public. The charter should be available online in early autumn, and I’m really looking forward to seeing what ideas ecda come up with next.
