This was originally posted on Government Technology
Back in November Emma Pearce blogged about using big data following the first in a series of data seminars we’re running. We’ve now held our second session, hosted by Facebook, which focussed on trust and privacy.
Stephen Deadman, Deputy Global Chief Privacy Officer at Facebook, welcomed us for a talk and Q&A. Facebook is often held up as an example, both positive and negative, due to their profile and size.
Having come across the work Stephen's team are doing on attitudes and opportunities around privacy, I was keen to find out what government can learn from it and where the areas of overlap are likely to be, both as practitioners and as regulators.
How Facebook works
In 2010 Facebook set up a dedicated team to oversee their privacy programme. The team was tasked with creating strategic guidance for how Facebook operates.
The team links together engineering, operations and policy teams to help the company navigate the outside world, ensuring they meet compliance standards in Europe and the rest of the world. They provide tools and training so that teams can start from a solid foundation, and can also embed a small number of specialists where products merit it.
Over the years Facebook has expanded and now owns several other popular brands such as Instagram and WhatsApp.
Facebook faces some common misconceptions about how it works. One example was the new WhatsApp policy update: users assumed it had been rolled out without much consideration. Stephen told us that it had in fact been in development for over 18 months, with the team carefully considering a whole range of factors, including timing, legal aspects and user experience.
The assumption that such decisions are made overnight can make it difficult for an organisation to start a constructive conversation with users about what the changes mean. One of the ways to move past those misconceptions is for organisations to make their work more transparent.
Transparency
Facebook has 1.8 billion users worldwide, and transparency about data is a global challenge. For 18 months Facebook has been working with CtrlShift to improve this, and the [final report](https://www.dropbox.com/s/2mpczioqti3h47m/Report%203%20A%20new%20paradigm%20for%20personal%20data.pdf?dl=0) on their work was published in May. That work was what first drew our attention to Stephen and his team.
It can be very hard to get a clear understanding of user attitudes to data sharing and privacy, with different accounts varying widely. We discussed how we commonly see headlines saying "only X% of people trust organisation Y with their data", but there aren't good benchmarks for what level of trust we should expect. The research and global roundtables behind the CtrlShift report are a really useful effort to draw out themes in this area.
One of the main lessons drawn from the work is the need for deeper design thinking that can move the conversation about transparency and regulation past false binaries. Stephen used the analogy of car design - there is a great deal of safety regulation which all car manufacturers must abide by, but designers are still free to design and iterate around those safety features to make an individual, desirable end product.
This design thinking can be applied to getting businesses around the world to adopt transparency solutions. In the past, regulation has been good at stopping bad things, but not so good at setting a baseline for best practice.
Governments need to make sure that laws and regulations are abided by, while also encouraging and enabling a cycle of continuous improvement: research, iteration and feedback.
Gaining trust
Other companies are also striving to improve their transparency. For example, when a new update comes through for an Apple product, users are presented with a lengthy policy document. But how many people actually read it?
Could it be that terms and conditions just aren’t designed for consumers and aren’t communicated in a way that they would understand?
But how do you get this right? Laws change and new features are introduced, and companies should share this information - yet when they do, it can lead to more questions and distrust from users. It's an ongoing issue with no simple answer.
Facebook conducts assessments to check the privacy impact of every product, and uses a tool to track projects at every stage of their creation and implementation (a bit like our service assessments), but it's an ongoing battle.
Stephen talked about the idea of the passive consumer: it can be hard to get people involved and educated about the use of their data, but it is their fundamental right to see it. For the private sector, putting data in the hands of consumers allows them to get more involved.
Lessons from around the world
Drawing on experience from developing this research, Stephen pointed to a willingness to think differently in Asia, and in particular in Singapore. The country has committed to digitising the state, which means a fundamental change of thinking and making data a key part of society. There is always a degree of tension between those building new technology and those who want to protect rights, but putting that tension at the forefront of thinking makes the debate far more productive.
We discussed the importance of integrated, multi-disciplinary approaches in this area. As the possibilities of technology and service design accelerate, we need to consider new, more flexible approaches to regulation - but unless those approaches are shaped by teams with the right mix of skills, we won't be able to strike the right balance. As government is the custodian of a lot of data and the provider of a lot of services, we have the opportunity to bring together our work as practitioner and regulator to help inform that balance.
Stephen finished by encouraging us to continue to recognise that data is a good thing and should be nurtured. We could learn a lot from the infrastructure being put in place in Singapore, and from working closely with industry experts to see how they are using data in interesting ways. We need to ease the nerves around this issue and explain why it is helpful. We can use this to create critical design patterns which can be shared and reused.
The main lesson: great services with greater control lead to happier citizens.