Feb 17

Clarifying our Cloud First commitment

This was my final post on the Government technology blog

The government technology landscape has shifted significantly since we made our commitment to Cloud First nearly 4 years ago. Departments have become more mature in their uptake of cloud services and with this maturity comes a need for further guidance. To support this need, we’ve added further clarification to our cloud guidance and policy and we’ll continue to expand this content in the coming months.

From Cloud First to Cloud Native

While working on the new guidance, we’ve internally begun to move away from the phrase “Cloud First” and instead to think in terms of “Cloud Native”. Cloud First is the policy we’ve agreed, but it’s not our aspiration.

Cloud Native is one of those terms that has a lot of different definitions, with the more narrow definition encompassing patterns for application design, deployment and operation. We use the term more broadly to include the flexible adoption of Software as a Service (SaaS) applications, which are often loosely coupled and quite task specific.

Cloud Native is not just about considering cloud before other options, it’s about adapting how we organise our work to really take advantage of what’s on offer and what’s emerging.

The need to take advantage of new cloud developments

As the world of cloud technologies continues to accelerate, we should absorb new developments into how we work. Leading organisations are rapidly embracing new tools like “serverless” computing. Some are also investing more in retraining staff so they can get to grips with these new opportunities. We need to make sure the government keeps pace.
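To illustrate the kind of shift “serverless” computing involves (this is a hypothetical sketch, not a government system), a function-as-a-service deployment reduces an application to a small handler that the platform runs on demand; the cloud provider, not the team, provisions servers, scales instances and applies patches:

```python
import json

def handler(event, context=None):
    # A minimal function-as-a-service style handler. The function only
    # expresses the business logic; the platform handles everything else.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Invoked locally here; in production the cloud platform would invoke it
# in response to an HTTP request, a queue message or a timer.
response = handler({"name": "GDS"})
```

The retraining point follows directly: operating systems built from handlers like this is a different discipline from managing servers, which is why leading organisations are investing in their staff alongside the technology.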

At the infrastructure and application level we should expect our applications to be resilient, flexible and API-driven. We should have the tools and practices in place to manage and secure a distributed range of tools accessed over the internet.

We should empower everyone in an organisation to help us become more effective in technology by letting any staff member trial new SaaS applications. Our management and security practices should support this approach. We should look for an API-centric approach that will let us easily integrate new SaaS applications into the rest of our architectures.

A decade of industry growth in the public cloud, and nearly 4 years of our Cloud First approach, have given us examples of teams or organisations doing these things, often within government. We need to make them our default. For example, DWP Digital has moved telephony to a cloud-based service, with nothing in the offices except a phone and a wide area network connection.

Unless we adapt how we adopt technologies and focus on core outcomes and principles, we won’t be able to meet the growing expectations of our users (including our staff), and we won’t be preparing for the even deeper changes that are likely to come as we deal with ever-growing volumes of data and a proliferation of devices and sensors.

What Cloud Native means for government architecture

To become Cloud Native, we need to focus on the digital outcomes we need and how to achieve them. It’s with this focus, for instance, that the National Cyber Security Centre produced the Cloud Security Principles and some guidance on protecting bulk personal data. But in the remit of security, we still have work to do in getting a better understanding of what different providers offer and where we place our trust.

To truly become cloud native, we need to transform how we monitor and manage distributed systems to include ever more diverse applications. We need to deepen our conversations with vendors about the standards that will help us manage these types of technology shifts. We need to continue to ensure we always choose cloud providers that fit our needs, rather than basing our choices on recommendations.

Over the coming weeks we’ll be blogging more about what this means in practice, and GDS will be hiring a new Chief Technical Architect to take forward this work. If you work for government, we’d like to hear your thoughts on what cloud native means to you.

Jan 17

Trust and privacy: sharing lessons

This was originally posted on Government Technology

Back in November Emma Pearce blogged about using big data following the first in a series of data seminars we’re running. We’ve now held our second session, hosted by Facebook, which focussed on trust and privacy.

Stephen Deadman, Deputy Global Chief Privacy Officer at Facebook, welcomed us for a talk and Q&A. Facebook is often held up as an example, both positive and negative, because of its profile and size.

Having come across the work Stephen’s team are doing to explore attitudes and opportunities around privacy, I was keen to explore what government can learn from their work and what the areas of overlap are likely to be, both as practitioners and regulators.

How Facebook works

In 2010 Facebook set up a dedicated team to oversee their privacy programme. The team was tasked with creating strategic guidance for how Facebook operates.

The team links together engineering, operations and policy teams to help the company navigate the outside world, ensuring they meet European standards of compliance and those of the rest of the world. They provide tools and training to make sure that teams are able to start from a solid foundation, and can also embed a small number of specialists where products merit it.

Over the years Facebook has expanded and now owns several other popular brands such as Instagram and WhatsApp.

Facebook faces some common misconceptions about how they work. One example was the new WhatsApp policy update: users commented that it was rolled out without much consideration. Stephen told us that it had been in development for over 18 months, with the team carefully considering a whole range of factors including timing, legal aspects, user experience and more.

The assumption that such decisions are made overnight can make it difficult for an organisation to get off on the right foot for a conversation with users about what the changes mean. One of the ways to move away from those misconceptions is to make their work more transparent.


Facebook has 1.8 billion users worldwide, and transparency of data is a global problem. For 18 months Facebook has been working with CtrlShift to improve this, and the final report on their work was published in May. That work was what had first drawn our attention to Stephen and his team.

It can be very hard to get a clear understanding of user attitudes to data sharing and privacy, with different accounts varying widely. We discussed the fact that we commonly see headlines about “only X% of people trust organisation Y with their data” but there aren’t good benchmarks of what level of trust we should expect. The research and global roundtables backing the CtrlShift report are a really useful effort to look at themes in this area.

One of the main links drawn from the work is the need for deeper design thinking that can move the conversation about transparency and regulation past false binaries. Stephen used the analogy of car design – there are many safety regulations which all car manufacturers must abide by, but designers are still free to design and iterate around those safety features to make an individual, desirable end product.

This design thinking can be applied to getting businesses around the world to adopt transparency solutions. In the past regulations have been good at stopping bad things, but not so good at setting a baseline for best practice.

Governments need to make sure that laws and regulations are abided by, but also to encourage and enable a cycle of continuous improvement: research, iterate, feedback.

Gaining trust

Other companies are also striving to improve their transparency. For example, when a new update comes through for an Apple product, users are presented with a lengthy policy document. But how many people actually read it?

Could it be that terms and conditions just aren’t designed for consumers and aren’t communicated in a way that they would understand?

But how do you get this right? Laws change and new features are introduced, and companies should be sharing this information, but when they do it often leads to more questions and distrust from users. It’s an ongoing issue with no simple answer.

Facebook conducts assessments to check the privacy impact of every product and uses a tool to track projects at every stage of their creation and implementation (a bit like our service assessments), but it’s an ongoing battle.

Stephen talked about the idea of the passive consumer: it can be hard to get consumers involved and educated about the use of their data, but it is their fundamental right to see it. For the private sector, putting data in the hands of consumers allows them to get more involved.

Lessons from around the world

Drawing from experiences developing this research, Stephen pointed to a willingness to think differently in Asia, and in particular Singapore. A commitment has been made to digitise the state, which means a fundamental change of thinking and making data a key part of society. There is always a degree of tension between those who are creating the good tech stuff and those who want to protect rights, but by putting it at the forefront of thinking that debate is going to be far more productive.

We discussed the importance of integrated multi-disciplinary approaches there. As the possibilities of technology and service design accelerate we need to consider new, more flexible approaches to regulation but unless that’s shaped by teams with the right mix of skills we won’t be able to strike the right balance. As government is the custodian of a lot of data and the provider of a lot of services, we have the opportunity to bring together our work as practitioner and regulator to help inform that balance.

Stephen finished by encouraging us to continue to recognise that data is a good thing and should be nurtured. We could learn a lot from the infrastructure being put in place in Singapore and by working closely with industry experts to see how they are using data in interesting ways. We need to ease the nerves around this issue and explain why it is helpful. We can use this to create critical design patterns which can be shared and reused.

The main lesson: great services with greater control lead to happier citizens.

Jan 17

The internet is ‘ok’

Originally posted on Government technology

When different parts of the public sector share services and exchange data it’s important that we can rely on the basic security of each other’s technology, and that the data will maintain its integrity as it moves around. It is an important part of ensuring that there’s a clear layer of trust between everyone involved in the interaction.

For the past few years a lot of government (and wider public sector) services have relied on the Public Services Network (PSN) to provide assurance of that IT security. As a high-performance network operated by multiple vendors, the PSN provides assured connections for a wide range of public sector organisations.

As we move more and more of our systems to public cloud services, the expectation that we’ll communicate over the PSN causes confusion and adds complexity for public sector organisations and our suppliers.

We also have new ways of providing assurance, with technical controls such as standards-based approaches to email security, Transport Layer Security (TLS) for encrypting web transactions and, where an extra layer of isolation or authentication is needed, Virtual Private Networks (VPNs).
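As a small illustration of those standards-based controls (using Python’s standard `ssl` module purely as an example, not as a statement of any government configuration), a properly configured TLS client verifies certificates against trusted roots, checks hostnames, and refuses legacy protocol versions:

```python
import ssl

# A default context verifies the server's certificate chain against the
# system's trusted roots and checks that the certificate matches the
# hostname -- the baseline for encrypting web transactions with TLS.
context = ssl.create_default_context()

# Refuse legacy protocol versions; TLS 1.2 is a reasonable modern floor.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Wrapping a socket with this context (`context.wrap_socket(sock, server_hostname=...)`) then fails closed if the certificate can’t be verified, rather than silently downgrading.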

What is the future of the PSN?

At a recent meeting of the Technology Leaders Network, we reviewed our position and it was clear that everyone agreed we could just use the internet.

For the vast majority of the work that the public sector does, the internet is ok. We’ve got some advice in our network principles.

We’ll often need to deploy the sort of security measures described above, along with a host of other measures to ensure basic application-level security, but as my colleague Shan Rahulan said during the meeting we increasingly need to do that even when services are on the PSN. This then opens up the question of whether the extra layer of complexity is really helpful.

So that means we’re on a journey away from the PSN.

Of course, it’s not going to happen immediately. Organisations that need to access services that are only available on the PSN will still need to connect to it for the time being. They’ll need to continue to meet its assurance requirements, and in fact they should make use of the practices it covers when reviewing all their core IT.

But from today, new services should be made available on the internet and secured appropriately using the best available standards-based approaches. When we’re updating or changing services, we should take the opportunity to move them to the internet.

What happens next?

There’s quite a bit of work to do across the public sector to prepare for these changes and we’re not quite ready to provide a full timeline. We’ll be staying in touch with users of the network and commercial providers to make sure that those who need to make decisions get clear information.

My colleague Mark Smith, Head of PSN, has been working with data scientists in GDS and the National Cyber Security Centre (NCSC) to prototype other ways of providing assurance data that will help organisations establish trust. He’ll introduce that soon in a blog post and is doing some deeper discovery work to ensure we have great options for organisations to verify that their networks meet a set of basic standards.

GDS, NCSC and Crown Commercial Service (CCS) will be working together to ensure that as we update the ways in which we buy network services we have the widest possible range of suppliers and the right options to make sure we get the highest quality connections.

We’ll be working with the Tech Leaders Network and the wider PSN community to ensure that common issues are clearly identified and that wherever possible we work together to provide common solutions.

We’ll also be working with colleagues in the Cyber and Government Security Directorate and others across the public sector to make sure that we are able to collaborate on upgrading older systems that need new protections and share good practices. That’s a clear part of the National Cyber Security Strategy and this move just adds some more focus to plans already underway.

Dec 16

Our commitment to better Open Source practices

This was originally posted on Government technology


At the Open Government Partnership (OGP) summit last week in Paris, the UK government joined a new international collective action that recognises the role that Open Source Software has to play in increasing transparency and harnessing new technologies to improve governance.

Our commitment

We committed to sharing what we’ve learnt over the past few years about bringing open source and related working practices into government, to working collaboratively with other governments to develop common practices and policies, and particularly to making sure that open source plays a big role in our growing international collaboration around Digital Marketplace and procurement reform.

This fits with commitments we’ve already made through our Digital Service Standard and the Technology Code of Practice, and commitments we’ve made through international bodies like the Digital 5 (D5).

The OGP roundtable session discussing the commitment and the draft policy was a good start to involving a lot more governments in that work, and to seeing the support we have from a number of influential open source foundations and companies.

The summit

I attended the summit alongside Paul Maltby, and Sir Eric Pickles in his position as the Prime Minister’s Anti-Corruption Champion. It was great to experience the many themes around openness and transparency coming together with an impressively diverse agenda and group of attendees. With so many significant global political moments having occurred over the past year the mood was naturally very reflective.

At GDS, our day-to-day focus on open source and open standards in government ensures that the UK government can control its own technology and provide efficient foundations for great services. The summit was a helpful reminder that it’s also important that we keep thinking about the ways that open source culture and code work to make sure that the systems and algorithms we develop are as transparent as possible and enable new forms of accountability, participation and collaboration.

On a more immediate level it was good to learn a bit more about what other countries are doing, such as our French hosts’ OpenFisca which is a set of open source tools to simulate tax and benefits systems and a big step toward a significantly more transparent social security system.

What we’re going to do

The first deliverable of the group is an open source contribution policy template which will continue to be developed over the coming months. We’ll be helping to refine the template so we can all contribute to other work in a similar way.

We’re aiming to model open source collaboration across governments in a new phase of work on Digital Marketplace that will build on our work with Australia earlier in the year. The Digital Marketplace team will share more about those plans soon and we committed to sharing what we learn with the OGP community.

There are many other ways in which we can collaborate with other governments and with the wider open source community. We’re looking forward to continued conversations with the other governments, foundations, companies and NGOs that took part in the sessions.

Nov 16

How can services work for government and citizens?

This was originally posted on the Government technology blog. It’s been slightly edited here.


Recently a small group of technology leaders from across government gathered for an informal event to discuss the role that we play in developing user-centric services and their impact on wider society. It was one of a series of events we’re hosting as we refresh the cross-government Technology Leaders network, and experiment with different types of events to strengthen the community.

‘We’re all customers’

We kicked off the discussion with some short talks. Kevin Gallagher, Digital Change Director at HM Courts and Tribunals Service (HMCTS), talked about the work HMCTS is doing to transform the justice system to improve citizens’ experience. As Kevin pointed out, ‘we’re all customers of the justice system as we all rely on it working’, and he showed a video that they’re using internally to communicate the huge importance of the work they’re doing and keep staff focused on the core principles.

Representatives from the event organisers Gartner then spoke about best practice from around the globe. They emphasised the importance of trust if government is to use data effectively. Creating a wider ecosystem will allow for better opportunities for citizens, offering one central place they can approach for help with jobs, accommodation, health care and more.

I wrapped up the talks with a brief overview of the work I’d seen on a recent visit to the Indian government. The roll out of India’s biometric identity programme is impressive and we can learn a lot about how they delivered effectively at such a huge scale, but the idea of building trust between citizens and government with successful delivery of high quality services had been a recurring theme.

‘We will do most of these things just once in our lives’

With this session we wanted to give colleagues a chance to think outside the day-to-day pressures of delivering specific programmes, and so, following these talks we discussed some of the broader responsibilities that come with influential positions and what we can do.

Citizens don’t choose to use government services, it’s a necessity. Often those services are only needed in times of hardship. And many services are things you will only do once in a lifetime, such as reporting a relative’s death. We have a responsibility to make sure that we make using those services as efficient and pain free as possible.

A common theme across all our work has been the need to be able to provide services across (and sometimes in spite of) existing organisational boundaries. Citizens don’t care whether they are talking to HM Revenue and Customs (HMRC), Department for Work and Pensions (DWP) or Ministry of Justice (MoJ). They just want or need to use a service. The same information shouldn’t need to be given multiple times to different places: you should be able to provide it once and have all relevant parties informed.

We’ve got a lot of work to do to be able to do that smoothly but an interesting theme was the idea of the ‘invisible department’. If we’re successful in this how do we ensure that we still make our internal working sufficiently transparent that we can easily be held to account?

That only gets more important as we think about the opportunities to automate large parts of existing processes. We’ve barely begun to scratch the surface of connecting up the experience of services with democratic roles in influencing them and there’s space for a lot more thinking and experimentation there.

‘The most rewarding work is when I’ve seen a positive impact on citizens’

Another big theme we discussed was the idea of openness being key to trust. Without visibility of data, citizens can’t actively contribute to conversations or have an informed opinion. The more data that we publish and make available, the more interesting and useful debates can occur.

The group did note that data can be interpreted differently which means there is a risk that information can be cherry picked and used to create a false narrative. The agreement in the room was that it’s best to share everything and allow those conversations to take place.

We also touched on the ongoing public debate about how we supply, protect and use personal information. We’ve begun to hear from commercial companies who are looking to government to provide refreshed guidelines on the use of data in order to focus the public conversation.

It’s important that government is careful to reach a balance between being a practitioner, policy maker and regulator. We need to think carefully about this balance and as we change the skills profile within government there are big new opportunities to bring those roles closer together. As the perceived pace of change and innovation increases, that just becomes more important.

Currently the trust issue between the citizen and the state is one of the biggest challenges around the use of certain future technologies, such as machine learning, and we need to make use of the growing expertise and experiments happening within government to ground those conversations.

Nov 16

Talking challenges and change at GitHub Universe

This was originally posted on the Government technology blog


A short while back I was fortunate to speak at GitHub’s annual Universe conference in San Francisco. GitHub is a tool we use extensively at GDS and it was good to be able to talk with other customers, share how we’re using the product, and discuss the future product direction with their core team.

I was asked to tell the story of GOV.UK and that’s what most of my talk covered, starting with Martha Lane Fox’s report, the alpha version of GOV.UK and the story from there. I was keen to emphasise a number of points.

Starting small

When we start out trying to do something big, we often think that means we need to start with a big team. You may eventually need a lot of people to make a big change but successful delivery starts with teams that know and trust each other.

The reason that we were able to move fast was that we started small and built trust in the core team before we scaled up. We were able to prioritise trust and the team’s sense of responsibility (to each other and for the work) before working out the co-ordination and governance challenges that come with a bigger team.

Trust, users, delivery

The GitHub audience was naturally interested in our decision to code in the open. Preparing for the talk I’d tried to find some record of the moment that we first started developing GOV.UK code in the open but couldn’t find any, which shows what a simple decision it was.

This was partly because we had a mandate to do things differently, but fundamentally it was because we were working in a high trust environment where the team understood our responsibility and those sponsoring the project trusted us to act accordingly.

When decisions of that sort need to go through multiple layers of sign-off radical change rarely happens, but when teams are trusted significant changes become almost unremarkable.

Getting close to users

Our former colleague Tom Loosemore’s tweet defining digital has been cited continually since he posted it, and this talk was no different.

While some of the biggest challenges in making government work for its users lie far deeper into the system, it’s no accident that we started with a website.

To really reform the deeply embedded culture, practices, processes and technologies of a large organisation you need to get close to your users. You need to go to them with high quality user research, but a website is the primary way that the majority of people now come to you and is an incredibly valuable source of data about what they need and value. As a website, GOV.UK laid out a new vision for what interaction with government could be, and for how we could meet that across our internal silos. Change that’s worth making works from the outside in and GOV.UK was the first step on the way.

Professional responsibility

Having touched on trust and responsibility when introducing opening our code I want to dwell on that point. A sense of broader responsibility is something that naturally comes with the civil service values and public service ethos but there are some areas in which it’s worked out that might be of particular interest to the sort of tech audience that was at the conference. Reflection on responsibility is vitally important as technology professions dominate so much of how the modern world is evolving.

I started with the responsibility to put accessibility over exciting technologies. In an industry so dominated by new JavaScript frameworks and fascinating new techniques it can sometimes feel like we’re constraining ourselves with the very simple interaction patterns on GOV.UK, but our work is more successful when we focus on what’s right for our users before satiating our own curiosity. Robin Whittleton has recently written about why we are committed to progressive enhancement (unfortunately we published just after the conference so I couldn’t reference that).

As you get deeper into transformative change you have to work on the mechanics of the system and I used work to change the security classification system as a striking example of that (and hope I was clear enough that that work was led by some of our Cabinet Office colleagues, not GDS). Those changes were vital to strip away a lot of the complexity and legend that had built up around the old system, to let us adopt new ways of working and to put more responsibility in the hands of specific civil servants and their teams.

What’s been challenging is finding the right way to bring people with us, to make sure that a change that enables the confident trailblazers doesn’t paralyse people who need more guidance or have to make a lot more decisions than they used to. This is still a work in progress but it’s vital to be conscious of that challenge if you want to make changes stick.

More broadly I wanted to acknowledge that very few of the big problems the world faces are specifically about technology. They’re about systems and people. Thinking about the technology and grasping new technologies can help us understand those problems in new ways and find other ways to resolve them, but we need to keep remembering that context.

Everything changes, and often quickly. User needs shift, our understanding of them increases, the way technology can meet those needs evolves. Most of the time our job as technologists is to minimise the cost of changing the systems we inherit or develop so that they can rapidly react to those changing expectations.


GitHub did an incredible job of creating a diverse, inclusive and welcoming event, and I’m very grateful to them for inviting me. It was particularly good to see a growing presence for public sector work at the event, with Alvand Salehi representing the US government’s new federal source code policy (which GDS provided some support with), Clarence Wardell talking about the White House’s police data initiative, and Ian Lee from the Lawrence Livermore National Laboratory talking about “developing open source in service to national security”.

Videos of all the talks, including my own, are available on YouTube.

Nov 16

Lead the programme for the new Technology Leaders Network

This was originally posted on the Government technology blog


We’ve written a few times recently about the work that we’re doing to refresh the government Technology Leaders Network. There’s lots going on around the network and we’re working hard to keep up the momentum and make sure we’re addressing the right challenges in the right way.

We are now looking for a passionate technologist and organiser to help us with that. We want someone who understands modern technology delivery and the opportunities that brings for government, and who will help build and focus the community.

The role

Within GDS the Technology Leadership Manager will work closely with the leadership of the Technology Group, the tech leadership coordinator, and our leads on open standards and open source. They’ll also spend a lot of time working across government making sure technologists at all levels are able to fully participate in the network and building the community.

Skills you may need for the role include:

  • a deep understanding of modern technology practices and tools and the challenges organisations have in moving to them
  • experience building an active, supportive community that helps its members learn
  • excellent negotiation and communication skills, able to find pragmatic solutions to complex problems
  • an appreciation of the role of governance and methods for supporting collaborative work across teams
  • good commercial awareness and an ability to build effective partnerships with suppliers

Why join the team?

The role allows a unique opportunity to work with some of the brightest technologists across government. You’ll find yourself doing a wide variety of tasks, from liaising with Chief Technology Officers to shaping the forward plan for network discussions and attending events.

Why now?

This is an exciting time for GDS and for government, as we work to deliver the GDS mission: to support, enable and assure government departments to deliver the technology they need to enable citizens to engage with government.

This role will be essential to sharing best practice and enabling open discussion between tech leaders from across government. It will also open up opportunities for technologists from all levels to get involved with community activities and networking.

What next?

If this all sounds like your kind of challenge, you can find the full job specification on Civil Service Jobs. Applications close on Sunday 6 November.

If you have any questions about the recruitment process contact the GDS recruitment team.

Oct 16

Sharing and Learning at the Technology Leaders Network: Enterprise Data Hub

This was originally posted on the Government technology blog


The Technology Leaders Network was established in 2013 to help departments work collaboratively, share good practice and drive the technology agenda across government. Over the summer we have been working on plans to breathe new life into the Network, looking at how we can make the most of bringing technology leaders together.

Traditionally Tech Leaders have met for monthly board meetings with a restricted attendee list and a formal set of minutes circulated afterwards. We will continue to have those meetings where specific decisions are needed, but want to make the Network more transparent and open it up to the wider pool of technologists we have across government. We’re going out to departments more, learning from external speakers and seeing demonstrations of tools in action.

In September we held the first session since our refresh: a live demo of the HMRC Enterprise Data Hub (EDH). Nigel Green, Director IT Delivery at HMRC, describes the hub below.

The hub of all knowledge

Can you imagine storing and managing over 10 million filing cabinets worth of customer data? Then working out how to keep that data relevant, correct and useable? It’s a bit of a conundrum but since 2014 we’ve been working on a solution called the Enterprise Data Hub (EDH) and we’ve just reached a major milestone in its development.

A bit of background

At the moment our customer data is spread across 11 separate data warehouses with some information only available to parts of HMRC, so cross referencing can be tricky.

EDH will bring all our customer data into one usable place. It’ll let us use technological advances to store and analyse customer data using freely available open source tools and commodity hardware. It will save money and give us new ways of interrogating data. We’ll be able to work smarter, quicker and ultimately increase our tax revenues.

Finding a solution

Our technical solution is Apache Hadoop, an open-source software framework for the storage and large-scale processing of data sets on clusters of commodity hardware. Instead of relying on expensive, proprietary hardware and disparate systems to store and process data, Hadoop distributes processing in parallel across relatively inexpensive, industry-standard servers that both store and process the data, and it can scale to accommodate very large data volumes.

It can handle all types of data, both structured and unstructured, including log files, pictures, audio files, communications records and email.
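Hadoop’s programming model can be illustrated with a tiny MapReduce-style word count. This is a minimal, self-contained Python sketch (not HMRC code, and not the Hadoop API itself): map tasks each process one chunk of input independently, and a reduce step combines their output. In a real cluster, each chunk would live on a different commodity server.

```python
from collections import Counter
from itertools import chain

def map_task(chunk: str) -> list[tuple[str, int]]:
    # Map: emit a (word, 1) pair for every word in this chunk of input.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_task(pairs) -> dict[str, int]:
    # Reduce: sum the counts emitted for each word across all map tasks.
    totals = Counter()
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

# In a real cluster these chunks would be processed in parallel on
# separate servers; here we just process two chunks and merge the results.
chunks = ["tax return tax", "return filed"]
mapped = chain.from_iterable(map_task(c) for c in chunks)
counts = reduce_task(mapped)
print(counts)  # {'tax': 2, 'return': 2, 'filed': 1}
```

The key property is that the map tasks share no state, which is what lets the framework spread them across as many machines as the data requires.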

Maintaining security

To protect our customers’ data we’ve incorporated a world-leading approach to a technique called ‘tokenisation’.

Tokenisation is a reversible method for replacing sensitive data with non-sensitive ‘tokens’. It provides similar security benefits to encryption, but retains the vital usability of data for our business processes. This gives us unprecedented capability to securely manage customer information from a single point of control, and to tightly restrict access to de-tokenised data, giving greater protection to vulnerable or at-risk customer groups.

This tokenisation technique is a world first, and we know that a number of banks are very interested in what HMRC are doing, and want to use a similar solution themselves.
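As an illustration only (this is a toy sketch, not HMRC’s actual implementation), the core idea of a token vault fits in a few lines of Python: sensitive values are swapped for random tokens that carry no information themselves, and the mapping needed to reverse the substitution is held separately under tight access control.

```python
import secrets

class TokenVault:
    """Illustrative token vault: swaps sensitive values for random tokens."""

    def __init__(self):
        self._token_to_value = {}   # held separately, tightly access-controlled
        self._value_to_token = {}   # reuse the same token for a repeated value

    def tokenise(self, value: str) -> str:
        # Reversible substitution: the token itself reveals nothing.
        if value not in self._value_to_token:
            token = secrets.token_hex(8)
            self._value_to_token[value] = token
            self._token_to_value[token] = value
        return self._value_to_token[value]

    def detokenise(self, token: str) -> str:
        # Only callers with access to the vault can recover the original.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenise("AB123456C")      # e.g. a National Insurance number
assert token != "AB123456C"              # downstream systems see only the token
assert vault.detokenise(token) == "AB123456C"
```

Because the same value always maps to the same token, analysts can still join and count records without ever seeing the underlying data, which is the usability advantage over plain encryption.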

What is the major milestone?

Well, we’ve just reached the point where we can securely upload customer data from anywhere within the department. This may not sound like much, but because of the complexity of how our data was stored before and our determination to develop a secure approach it’s a massive step forward.

This means we can move on to the important tasks of migrating data and services and using EDH for our key transformation projects. We can start bringing data tools together across HMRC and identify future opportunities to analyse and use our customer data more intelligently. We’ll be able to combine data and customer analysis to improve services and customer experience.

And on a practical level we’ll start making significant savings as we decommission each of the data warehouses.

Transformation of our analytical capability will be a game changer for HMRC’s digital ambition and the achievement of our revenue generation objective. There’s also potential for cross-government transparency, with UK economic benefits and shared use of cloud storage. The possibilities are endless.

Stay up to date

You can find out more about the work being done by HMRC by following their blog HMRC digital.

We’ll make sure we blog about the work of the Tech Leaders Network in the future and showcase opportunities to get involved.

Nigel Green is Director of IT Delivery at HMRC

Sep 16

We are renewing our Cloud First commitment

This was originally posted on the Government Technology blog


A few weeks ago I spoke at an event organised by Oracle focussed on how the public sector uses the cloud. I talked about why we are committed to the cloud, how this commitment has changed how we think about technology, and what we plan to do next. I spoke alongside my Government Digital Service (GDS) colleague Iain Patterson, who updated attendees on the work of Common Technology Services (CTS).

The Cloud First policy

In 2013, the Cloud First policy set out government plans to focus technology commitments where they can add the most value. This put an end to the government running its own data centres, and in some cases to running its own software.

The policy (part of the overall Technology Code of Practice) reflected a change of thinking. For a range of crucial components, we can now think clearly about what we need from technology and consume those components as products and commodities, rather than building or buying the entire stack as custom pieces.

We can now work flexibly, separating services so they can be defined in ways even computers understand (APIs). We use resources elastically, summoning them when we need them, and paying for them on a utility basis (based on the number of users we have or the amount of processing we’re doing). Cloud First means less of our time is focussed on procurement negotiation and complex licensing, and more on making our tools robust and responsive.

The cloud reduces the time and cost to get something up and running, making change easier and empowering our people to experiment with new tools and approaches.

Change is constant for us, whether it’s changing user needs, a growing understanding of how to meet them, or the wider landscape of changing challenges and opportunities. It’s our responsibility as technologists to minimise the cost of change so that our teams can continually iterate services.

Clarity around system components

Too often we use the wrong labels and concepts when discussing the technology that powers our services. In thinking about what we want from cloud we need to do better.

Software engineers are trained to talk about very low-level building blocks at the library or microservice level. Meanwhile, it’s still too common to find discussions taking place around large, monolithic concepts like enterprise resource planning (ERP) and case management. This dichotomy in terminology reflects the problems with the old-fashioned business process design approach.

Even though there’s inherent confusion surrounding software labels in this traditional approach to procurement, the products have to be bought anyway. Complicated compromises have to be made between the functionality of large pieces of software, the cost of customising them, and how to effectively meet user needs. We end up locked into a vendor and have to spend a lot of money on configuration.

New practices let us think more flexibly. Being clear about how our systems break down into components lets us understand where we can use utility and commodity elements and where we should be innovating ourselves. API-enabled software as a service lets us blend off-the-shelf components into broader systems using common standards, with a clear understanding of where canonical data lives.
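As a hypothetical sketch (all field names below are invented for illustration, not any real SaaS API), an API-centric architecture might wrap each off-the-shelf component in a thin adapter that maps the vendor’s payload onto a canonical record, so the rest of our systems depend on the agreed standard rather than on any one vendor’s format:

```python
from dataclasses import dataclass

@dataclass
class CanonicalCase:
    """The canonical record the rest of our systems agree on."""
    case_id: str
    owner_email: str
    status: str

def from_vendor_a(payload: dict) -> CanonicalCase:
    # Adapter for one hypothetical SaaS product's JSON payload.
    # Swapping vendors means rewriting this one function, nothing else.
    return CanonicalCase(
        case_id=payload["id"],
        owner_email=payload["assignee"]["email"],
        status=payload["state"].lower(),
    )

record = from_vendor_a(
    {"id": "C-101", "assignee": {"email": "jo@example.gov.uk"}, "state": "OPEN"}
)
print(record.status)  # open
```

The adapter also makes explicit where the canonical data lives: the vendor holds its own representation, but `CanonicalCase` defines the shape everything downstream relies on.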

We can make smarter use of what the market offers, and develop the talent, understanding and experience in our own teams. We also have a far richer toolbox than we’ve ever had before to respond to rapidly changing user expectations.

Our future plans

Over the next few months we’ll refresh the Cloud First policy and do more to share existing good practice. For example, we’ll work with our colleagues in security and across government to improve our security advice and do more to make sure that the changes are clearly communicated to the right people.

The Cloud Security Principles were a big step, but it’s clear that worked examples will make them more valuable. We’ve still got lots of myths to bust and best practice to share, to prove that most of the time public cloud is at least as secure as your own hosting.

We’ll also be reviewing and updating the range of tools that we can promote to help departments make the transition. Crown Hosting has helped many departments reduce their data centre commitments and get a better understanding of what they have, and it will remain crucial. But not every journey to the cloud is the same, and there is a huge range of tools and techniques out there to help.

We’re refreshing our approach to Open Standards, making sure we can identify, select and contribute to standards even more flexibly than before. We need to be clearer with the market about what open standards we expect to see, and share our needs with the groups developing new standards.

We’ll be providing more clarity for civil servants and the market about what we think good looks like for different types of cloud use. There’s huge variation in the maturity of Software as a Service applications and often that’s good. It’s evidence of a fast moving, innovative marketplace. But it’s important we’re clear about what we need from the market so we can have the right interfaces, the right security, and simplicity for users.

We’re bringing colleagues from across government together to look at some of our core business processes and the tools that support them. Over the past few years there’s been a lot of smart thinking, but it’s time for much sharper language and thinking about how we label things, what common components can make things more efficient and so on.

At GDS this involves working together across service design, Digital Marketplace, Common Technology Services and Government as a Platform (GaaP).

The changes in the technology market over the past few years are incredibly exciting. We’re aiming to ensure that the work we do in this area is helpful not just to accelerate our work in government but also to contribute back to the growing field of knowledge about how best to use the tools we now have at our disposal and what they can help us achieve.

Fundamentally, we’re talking about Cloud because we believe that the change it represents is a huge opportunity to put more focus on meeting users’ needs, more rapidly, more flexibly and more effectively.

We would like to hear about any obstacles you have encountered in making the transition to the cloud. Your comments will help us develop our work. Contact the Technology Group or comment below.

Aug 16

Developing our technology network and approach

This was originally posted on the Government technology blog


I’ve been with GDS since its very first days, leading its technology communities, shaping our approach to architecture and development, providing assurance and direction across a number of programmes, and building connections across government. I’m really excited to now be taking on a clearer role developing our approach to technology leadership, open standards and architecture and I want to tell you a bit more about this work.

Technology leadership

Over the last few years new connections have formed across government as new people have joined, more people have got involved in digital transformation, long-time civil servants have connected with new ways of working, and our relationships with suppliers have become more open.  

My focus is to help these connections bring even deeper change, making sure the government has the right technology to meet new challenges.

Our technology community will work more closely with our Digital Leaders and Data Leaders groups, recognising that we focus on different dimensions of the same problems and that the solutions we come up with will invariably require expertise from all three groups.

We’ve still got a lot of older technology that’s expensive to change and we’ll work together to move away from that, but we’re also now in a position to start focusing more on where we want to be and what new technologies allow us to do.

Within that context we’ll work with the Technology Leaders Network to refine and update our strategy, policy and guidance on our core building blocks like hosting, networking and effective use of cloud tools. The goal is to accelerate the move to technologies that respond to users’ increased expectations.

Technology Leadership isn’t solely about what Chief Technology Officers (CTOs) decide and do. CTOs will need to make some big decisions about our future direction, where we’ll invest and what risks to take. To be effective, though, we need to empower technology leaders at all levels of government and that will be a growing emphasis in our work.

We’ve already run social events and unconferences to connect up architects and developers working in government and we’ll build on those to look at a variety of ways to connect up teams and practitioners to help respond to challenging topics.

Open Standards

After bringing in the right skills, careful use of Open Standards (standards on how to connect software together or how to structure data to meet our Open Standards Principles) is one of our most powerful tools for taking control of our technology destiny.

Too often we see organisations locked into specific products from specific vendors. The organisations accept the lock-in because the products offer the only way to integrate with other products’ bespoke or proprietary interfaces.

We don’t just get locked into one product or vendor, we get locked into an ecosystem and the cost of change becomes exponentially greater.

Open Standards break those locks. Truly open standards emerge from processes where domain experts build their understanding of how to break a problem into the right pieces, and then decide the right parts for the solution.

They open up markets by allowing new entrants to develop their products without worrying about complex licensing arrangements, and by making it easier for organisations to adopt different pieces at different times. They allow us to really understand the architectural dependencies in our systems, manage the cost of change, and move faster.

Over the past few years we’ve used the Open Standards Principles to identify and adopt a number of standards in government. We’re now engaging a broader community in that work to help teams across government understand the role of standards, identify appropriate standards, engage in the development of standards in a well-coordinated way, and where necessary get agreement for standards across government.

Dan Appelquist has been helping lead on this work over the past few months building communities around technical and data standards within government.

Taking a fresh look at problems

We often take incremental approaches to changing technologies. This means it’s tough to dedicate the time to considering problems in genuinely different ways.

Our work with the Technology Leaders Network will sometimes identify areas where a loose network of government teams won’t be able to make the progress we need, and instead a small, focused team is needed to explore, prototype and find new approaches. This is not so much an innovation team approach, but more some space to take a fresh look at a problem.

Sometimes the focus will be giving a little extra support to unblock projects people are already doing, providing a hub for teams assembled across government, or helping with some commercial arrangements. It’ll depend on the particular problem. We’ve started small, supporting work on improving the security of service.gov.uk, but we will be looking more broadly soon.

Open source

Alongside Open Standards, the past few years have seen us transform our relationship with open source software. The Service Standard requires new code to be released under open licenses, and many contributions have been made to open source projects.

We now want to build on that work with a more concerted approach to open source. This approach will involve building collaboration and reuse internally and making higher impact contributions to the wider open source community. There’s an enthusiastic and committed group of developers across government ready to work on this and we’re currently recruiting someone to lead and facilitate that.

Government as a network

To accelerate our progress we need to find ways to work across organisational boundaries, since these boundaries almost always involve trade-offs. Thinking of ourselves more as a network and less as a hierarchy is vital to ensure we connect the right people and expertise, understand where the real problems lie and move ahead together.

We’ll be writing to technology leaders and other members of our leadership networks over the next few weeks with more information on how to get involved with this approach. We’ll also blog regularly about the work as it develops in order to make sure that it’s as open and accessible as possible.