
Quicker delivery on ministerial commitments using cloud architectures and fast flow principles


The Data Services and Analytics (DSA) portfolio sits within Digital, Data and Technology (DDaT) in the Home Office. It forms part of the wider Enterprise Services portfolio because it serves the whole Home Office rather than specific directorates.

DSA is responsible for taking multiple data sets and ingesting, transforming, enriching and analysing them before providing capabilities for our users to consume the information. This includes more traditional data analytics functions, such as Management Information (MI) reporting that goes to senior leaders within the Home Office, to ministers and to the public.

DSA also combines and matches data sets to support intelligence work at the border and target people and freight of interest. Without many of these systems we wouldn’t be able to publish the data that enables ministers to make data-driven decisions. The public wouldn’t be able to see that data and take part in open conversations on key issues. And we wouldn’t be able to keep our country safe.

Our ambition is aligned with the principles set out in the Home Office Digital, Data and Technology Strategy 2024. We’re continually challenging ourselves to become the leading centre for data analytics across government, working with colleagues in other departments to share lessons learned and push ourselves to improve our data analytical capabilities.

We’ve been cloud-first for a number of years now and are constantly exploring the ever-growing range of managed services and serverless technologies provided by the major cloud providers. We are a pioneer in building for the cloud within central government, and in the last year we have made substantial savings through rigorous FinOps initiatives.

Dimitris Perdikou, Portfolio Chief Technology Officer, Data Services and Analytics.

Setting out the existing problem  

The services we provide are in higher demand than ever before, with the Illegal Migration Act 2023 (IMA) being a case in point. Our operational teams are working to very tight timescales to put the IMA into practice. DSA is innovating at speed to enable our caseworkers to achieve that objective.

We continuously ingest immigration data, currently around 70 million events per month. This volume is growing continuously, and as new immigration routes such as the IMA are created, we are also consuming new types of data.

Our typical release process for surfacing new immigration data took several months. For the IMA we had a ministerial ambition to surface the data and build analytics within 6 weeks, so we could provide analytics as soon as the first IMA cases were processed.

It became abundantly clear that our existing approach to the data ingestion problem would not enable our users to deliver on key ministerial priorities, which prompted us to radically rethink our solution to this problem. 

We knew delivering within this 6-week timescale would prove difficult because of our mix of technologies and how we were using our data. We faced numerous problems:

  • we were still using data centres and a legacy immigration caseworking system that relied on a relational data structure rather than data streaming and non-relational data, meaning that when moving the data for analytical purposes we either had to keep the fixed database structure or transform the data before it could be used on the analytics side
  • the data was originally used predominantly for MI reporting, so it did not need to be made available as frequently
  • we relied on a central cloud platform team, including centrally managed Kubernetes clusters, rather than empowering teams to deploy their own infrastructure, which created further dependencies between teams and slowed us down
  • we had tight coupling between teams when releasing new data, which led to long testing times: tightly coupled teams each have to develop their release, wait for the final team to be ready, and then test together before releasing into production

Our multi-pronged solution 

We took a multi-pronged approach to solving the problem. 

Streamlining the delivery teams 

To streamline delivery we shifted to a single team empowered to deploy its own infrastructure, drawing on industry expertise on ‘fast flow’ from Team Topologies and Amazon’s ‘two-pizza team’ model.

We combined our development and operations functions into a DevOps model. This empowered a single team to deliver services from development all the way to production, improving efficiency and building a deeper knowledge of how to overcome challenges in production.

One of the key differentiators we implemented was to create clear communication channels between the people:  

  • building the systems that generate the data   
  • building the systems that analyse the data  
  • consuming the information at the end   

This user-centred design approach led our technical decisions, resulting in rapid iterative development that met user needs.  

Leveraging a cloud native architecture 

We designed a solution using Amazon Web Services’ (AWS) cloud native services, such as AWS-managed Kafka, Glue and S3.  Previously we had spent a lot of time building the services and code ourselves, struggling to achieve the reliability and performance we required. By relying on AWS expertise we could focus on the challenge of meeting our IMA obligations.  

By using AWS cloud native services we were able to set up the infrastructure in a matter of hours and greatly reduce the future cost of maintenance.  We were able to continually stream the data to DSA through Kafka, transform the data into the Person, Object, Location, Event (POLE) model in Glue and then store the output in a flat structure in S3.   
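The pipeline above streams events in via Kafka, reshapes them into the POLE model in Glue, and lands flat records in S3. The transformation step at its heart can be sketched as follows. This is an illustrative sketch only: the event fields and POLE mapping are hypothetical, not the actual Home Office schema.

```python
# Illustrative sketch of the Glue transformation step: reshape a raw
# caseworking event into Person, Object, Location, Event (POLE) parts.
# All field names here are hypothetical.

def to_pole(event: dict) -> dict:
    """Map a raw immigration event into the POLE model."""
    return {
        "person": {"id": event.get("applicant_id"), "name": event.get("applicant_name")},
        "object": {"case_ref": event.get("case_ref")},
        "location": {"port": event.get("port_of_entry")},
        "event": {"type": event.get("event_type"), "timestamp": event.get("occurred_at")},
    }

raw = {
    "applicant_id": "A123",
    "applicant_name": "Jane Doe",
    "case_ref": "IMA-0001",
    "port_of_entry": "Dover",
    "event_type": "case_created",
    "occurred_at": "2023-09-01T10:00:00Z",
}
pole = to_pole(raw)  # ready to be written to S3 as a flat record
```

In production this reshaping would run inside a Glue job over the Kafka stream rather than on single dictionaries, but the mapping idea is the same.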

Ingesting the data through Kafka and storing it in S3 in a flat, backwards-compatible structure meant we followed an event-driven architecture. This gave us loose coupling between our data pipeline and the analytical products, and shifted us to a model where each analytical product can be built and released independently at a cadence best suited to user needs.
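The backwards compatibility that makes this loose coupling possible can be illustrated with a small sketch (an assumed approach with hypothetical field names, not the production code): each consumer reads only the fields it knows about and defaults anything missing, so producers can add new fields without breaking existing analytical products.

```python
# Sketch of a backwards-compatible read. Field names are illustrative only.
# Consumers default missing fields and ignore fields they don't recognise,
# so the stored structure can grow without coordinated releases.

KNOWN_FIELDS = {"case_ref": None, "event_type": "unknown", "occurred_at": None}

def read_record(record: dict) -> dict:
    """Take known fields with defaults; silently ignore any new fields."""
    return {field: record.get(field, default) for field, default in KNOWN_FIELDS.items()}

old = {"case_ref": "IMA-0001", "occurred_at": "2023-09-01T10:00:00Z"}
new = {**old, "event_type": "case_created", "route": "IMA"}  # newer producer adds fields

assert read_record(old)["event_type"] == "unknown"  # missing field defaulted
assert "route" not in read_record(new)              # unknown field ignored
```

This is the property that lets each analytical product release on its own cadence: a change on the producer side never forces a simultaneous change on every consumer.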

For the first time, the solution we implemented allowed us to perform data quality checks at scale, which will allow us to build much more robust analytical systems.
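As a rough illustration of a data quality check at scale (a hypothetical sketch, not the actual checks used), each record in a batch can be validated against a set of rules and the failures summarised per field:

```python
# Hypothetical data quality rules; field names and rules are illustrative only.
RULES = {
    "case_ref": lambda v: isinstance(v, str) and v.startswith("IMA-"),
    "occurred_at": lambda v: v is not None,
}

def quality_report(records: list) -> dict:
    """Count rule failures per field across a batch of records."""
    failures = {field: 0 for field in RULES}
    for record in records:
        for field, rule in RULES.items():
            if not rule(record.get(field)):
                failures[field] += 1
    return failures

batch = [
    {"case_ref": "IMA-0001", "occurred_at": "2023-09-01T10:00:00Z"},
    {"case_ref": "BAD-0002"},  # wrong reference format and missing timestamp
]
report = quality_report(batch)  # {"case_ref": 1, "occurred_at": 1}
```

At scale the same per-field rules would run inside the pipeline over each incoming batch, with the failure counts feeding monitoring rather than a single report.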

Developing cross-portfolio collaboration 

To meet the tight timescales we had to work very closely with our Migration and Borders Technology Portfolio (MBTP), which builds the operational systems. We did this in 2 ways.

Firstly, we embedded our Architects within the MBTP teams that were designing the front-end functionality for IMA. This meant any upcoming changes were immediately cascaded down, enabling us to adapt at very short notice. 

Secondly, the Data Analytical Service (DAS), a capability for understanding data quality which was built in MBTP, was open-sourced especially for us to build our IMA solution. We have not only built our solution on top of DAS but have since been contributing to the open-source project.  

This level of technical collaboration has allowed us both to meet customer demand and to forge a long-lasting partnership with another portfolio.

Utilising suppliers for short term work  

We had to bring in a new team very quickly to deliver the capability required. To do this we leveraged our strong existing relationship with our supplier, 6point6, which provided solution architects to design the solution and the software and data engineers to implement it. Our long-term goal is to equip civil servants with these skills.

Our solution enabled us to deliver key ministerial commitments in weeks rather than months. We have set ourselves up for a future where we can continuously adopt new data, analyse and provide it as key information faster than ever before.  

Read more about what we're doing across Data Services and Analytics at the Home Office in our blogs below. 
