
Arch Capital Services LLC

Lead Data Engineer

Posted 10 Days Ago
In-Office
London, Greater London, England, GBR
Senior level

With a company culture rooted in collaboration, expertise and innovation, we aim to promote progress and inspire our clients, employees, investors and communities to achieve their greatest potential. Our work is the catalyst that helps others achieve their goals. In short, We Enable Possibility℠.

Role Summary and purpose:

We are seeking a Lead Data Engineer, reporting to the Head of Enterprise Data, with deep expertise in Data Vault design and a proven track record of delivering high‑quality data solutions. You will partner closely with our technology and development leads to define, shape, and implement Arch’s data architecture and data management practices.

This is a dynamic, hands‑on role ideal for someone who is passionate about technology and motivated by the opportunity to drive meaningful, long‑term impact across the organisation’s data landscape.

Key responsibilities include:

  • Landscape understanding & platform support: Lead the effort to establish a clear view of our current database and application estate, supporting operations in maintaining, rationalising, and optimising existing data platforms.

  • Strategic platform development: Help design and build the strategic data platforms Arch is moving toward, ensuring they are robust, scalable, and aligned with business needs.

  • Standards, tooling & best practices: Define and implement the tools, patterns, and practices required to deliver efficient, high‑quality, and data‑driven engineering solutions.

  • Technology evangelism: Act as an advocate for modern data engineering approaches, championing innovation and continuous improvement across teams.

  • Innovation & awareness: Stay current with emerging technologies, techniques, and capabilities—applying relevant advancements to improve delivery and engineering effectiveness.

  • Leadership & problem solving: Demonstrate strong leadership behaviours, paired with practical operational experience and an ability to tackle complex technical challenges.

  • Collaboration & roadmap shaping: Work closely with architecture and technology leaders to define the roadmap for evolving Arch’s data engineering practices, grounded in industry trends and current organisational capabilities.

To excel in this role, you’ll combine deep technical expertise with a hands‑on, delivery‑focused mindset, driving data initiatives that improve the efficiency, agility, and value of Arch’s data ecosystem.

Key tasks and responsibilities:

Arch Operations

  • Collaborate with business stakeholders to translate requirements into actionable technical tasks and ensure their successful delivery.

  • Work with ancillary teams to support the data warehouse, driving improvements in data quality, reporting, and coordination with source system teams.

  • Partner closely with the data architecture team to enhance the data warehouse, contributing to design discussions, reviewing architectural plans, and ensuring alignment with best‑practice standards.


AEIS Data Warehouse

  • Operate effectively within agile sprint teams, contributing to sprint planning, daily stand‑ups, and reviews while supporting continuous improvement of team processes.

  • Design, build, and manage ELT processes to integrate data from multiple sources into the data warehouse, ensuring consistency and quality across systems.

  • Monitor and optimise data pipelines to ensure reliable, efficient operation.

  • Maintain a strong focus on data quality, demonstrating attention to detail and rigorous validation practices.

  • Deliver high‑quality code end‑to‑end, including design, implementation, unit testing, refactoring, and documentation.

  • Automate deployment processes to ensure consistent, repeatable, and reliable releases.

  • Monitor automated systems, proactively identifying and resolving issues.

  • Write, maintain, and improve unit tests to ensure code quality and early issue detection.

  • Collaborate with developers to improve test coverage, reliability, and overall engineering standards.

  • Ensure all new code meets established standards for readability, performance, security, and documentation, including performing and participating in code reviews.

  • Apply DevSecOps principles to integrate security into all stages of the development lifecycle.

  • Integrate and manage tools across the data stack, including ETL platforms, orchestration tools, and data management components, ensuring seamless interoperability and optimal performance.

  • Continuously learn and experiment with modern technologies, applying new knowledge to improve systems, processes, and overall engineering maturity.

  • Stay informed on industry trends, using this insight to drive innovation and optimise data engineering practices.
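To illustrate the kind of ELT and data‑quality work described above, here is a minimal, hypothetical Python sketch of a transform step with validation. All record and field names (e.g. `policy_id`, `premium`) are invented for illustration, not taken from Arch's actual systems:

```python
from dataclasses import dataclass


@dataclass
class PolicyRecord:
    policy_id: str
    premium: float


def transform(raw_rows):
    """Normalise raw source rows and separate out records that fail
    basic data-quality checks (missing business key, negative premium).

    Returns (clean, rejected) so rejected rows can be logged or routed
    to an error table rather than silently dropped.
    """
    clean, rejected = [], []
    for row in raw_rows:
        policy_id = (row.get("policy_id") or "").strip()
        premium = row.get("premium")
        if not policy_id or premium is None or premium < 0:
            rejected.append(row)
        else:
            clean.append(PolicyRecord(policy_id, float(premium)))
    return clean, rejected


clean, rejected = transform([
    {"policy_id": "P1", "premium": 100.0},
    {"policy_id": "", "premium": 5.0},     # missing key -> rejected
    {"policy_id": "P2", "premium": -1},    # negative premium -> rejected
])
```

In a real pipeline this step would sit inside an orchestrated DAG (e.g. an Airflow task) between extract and load, with the rejected rows feeding data‑quality reporting.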


Experience requirements and skills:

  • Extensive hands‑on experience designing, developing, and maintaining data pipelines and ETL/ELT processes. Data Vault 2.0 certification highly desirable.

  • Expert‑level experience with Snowflake or other cloud‑based data warehouse technologies.

  • Strong hands‑on experience with orchestration tools such as Airflow (or equivalent).

  • Deep knowledge of relational and non‑relational databases, including RDBMS proficiency and modern data warehouse design.

  • Familiarity with DevOps practices, CI/CD pipelines and automation tools (e.g., Harness), and containerisation technologies (e.g., Docker, Kubernetes).

  • Knowledge of cloud‑native architectures and modern application frameworks, including REST APIs, microservices, Spring Boot/.NET Core, GitHub, Jenkins, OpenShift, BPM, SQL, Oracle, NoSQL, AMQP/Kafka.

  • Strong understanding of private cloud, IaaS, PaaS, and SaaS models, with extensive experience across Azure and AWS.

  • Broad understanding of modern software engineering methods, tools, and best practices.

  • Proficiency in SQL (including ANSI SQL) and experience with Python or other programming languages used in data engineering.

  • Experience with application and data testing automation tools and best practices.

  • Strong grounding in agile methodologies, with proven experience applying them to large‑scale technology delivery.

  • Strong strategic thinking and long‑term planning capabilities, with the ability to balance ideal architectural solutions against pragmatic business needs.

  • Excellent communication and interpersonal skills for collaborating with diverse technical and non‑technical stakeholders.

  • Strong analytical, problem‑solving, and decision‑making skills, with a focus on delivering reliable, scalable, and high‑quality solutions.
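As a small illustration of the Data Vault 2.0 expertise called for above: Data Vault models commonly identify hub records by a deterministic hash of the business key. The sketch below assumes an MD5‑based hash key with normalised, delimiter‑joined key parts (a common Data Vault 2.0 convention); the function name and example keys are illustrative, not prescribed by the standard:

```python
import hashlib


def hash_key(*business_key_parts, delimiter="||"):
    """Build a deterministic Data Vault-style hash key: trim and
    upper-case each business-key part, join with a delimiter, then
    MD5-hash the result. Normalising first means trivially different
    source representations of the same key hash to the same hub key.
    """
    normalised = delimiter.join(
        str(part).strip().upper() for part in business_key_parts
    )
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()


# The same business key yields the same hub key regardless of
# casing or stray whitespace in the source feed.
hk1 = hash_key("POL-001", "UK")
hk2 = hash_key(" pol-001 ", "uk")
```

The resulting 32‑character hex digest would serve as the hub's primary key, with satellites and links referencing it.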


Experience & Education

Required knowledge and skills are typically obtained through a Bachelor’s degree (or equivalent experience) and 10+ years of relevant experience in software development, systems infrastructure, and architecture design — including project management, business analysis, and hands‑on data engineering.

Data Vault 2.0 certification is highly preferred.

Do you like solving complex business problems and working with talented colleagues, and do you have an innovative mindset? Arch may be a great fit for you. If this job isn’t the right fit but you’re interested in working for Arch, create a job alert! Simply create an account and opt in to receive emails when we have job openings that meet your criteria. Join our talent community to share your preferences directly with Arch’s Talent Acquisition team.

14101 Arch Europe Insurance Services Ltd

Top Skills

Airflow
AWS
Azure
Data Vault 2.0
Docker
Kubernetes
Python
Snowflake
SQL


