
Markel

Senior Data Engineer

Posted Yesterday
In-Office
London, Greater London, England
Senior level
Are you an experienced Data Engineer looking for your next career move?
Help us assess the needs of customers and external business partners, and contribute to the solution design and development that enables Markel to drive our desired business outcomes.
What part will you play? If you’re looking for a place where you can make a meaningful difference, you’ve found it. The work we do at Markel gives people the confidence to move forward and seize opportunities, and you’ll find your fit amongst our global community of optimists and problem-solvers. We’re always pushing each other to go further because we believe that when we realise our potential, we can help others reach theirs.
Join us and play your part in something special!

The opportunity:

Working as part of a small but friendly team, you will be responsible for the design and development of critical initiatives and the implementation of our business solutions.
As a Senior Data Engineer, you will apply your design and development skills to help solve company challenges: understanding the current state of system functionality and domain-standard processes, assessing the needs of collaborators, including our external business partners, contributing to solution designs, and developing the solutions needed to achieve the desired business outcomes. You will operate in an agile, collaborative environment that values your insight, encourages you to take on new responsibilities, promotes continuous learning, and rewards innovation.

What you’ll be doing:

  • Assist with the design and development of Azure data solutions using Databricks, PySpark, Spark SQL, and ADLS
  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders
  • Apply test-driven development and continuous integration practices throughout delivery
  • Analysis and design – convert high-level designs into low-level designs and implement them
  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalise work plans
  • Create and run unit and integration tests throughout the development lifecycle
  • Benchmark application code proactively to prevent performance and scalability concerns
  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management
  • Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, and Production environments
  • Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components
  • Create and maintain appropriate technical documentation and other project artefacts
  • Develop, test, and iterate on MVP solutions to operationalize new features and products

Our must-haves:

Databricks & Lakehouse Expertise

  • 3+ years of experience delivering cloud solutions using Azure Databricks, Delta tables, Azure Data Factory, ADLS Gen2, and Azure VMs.
  • Deep understanding of Delta Lake, including ACID transactions, schema enforcement, and time travel.
  • Ability to design Medallion (Bronze–Silver–Gold) architectures for scalable analytics.
  • Experience with Unity Catalog for data governance, RBAC, lineage, and secure data access.
  • Strong knowledge of Databricks Workflows, Jobs, Repos, Asset Bundles, and CI/CD integrations.
  • Advanced Apache Spark (PySpark/Spark SQL): expert in PySpark and Spark SQL performance tuning (partitioning, caching, AQE, skew mitigation, broadcast joins), and skilled in building high‑throughput ETL/ELT pipelines and optimising clusters.

Cloud Platform Skills (Azure-focused)

  • Azure Data Factory
  • Azure Databricks
  • ADLS Gen2
  • Azure Functions
  • Storage Accounts, Key Vault

Data Engineering Foundations

  • Data modelling: dimensional modelling, relational design, semantic models.
  • Strong SQL: complex transformations, windowing, analytical queries.
  • Experience with structured, semi‑structured, and streaming data ingestion (Auto Loader, Structured Streaming).
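The windowing and analytical SQL mentioned above can be sketched as follows. This is a minimal, invented illustration using SQLite for portability; the policy and premium table is hypothetical, and on the job the same query shape would run as Spark SQL over Delta tables.

```python
import sqlite3

# Hypothetical policy-premium data for illustrating windowed SQL.
# (SQLite here only for portability; the query itself is standard SQL.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE premiums (policy_id TEXT, year INTEGER, premium REAL);
INSERT INTO premiums VALUES
  ('P1', 2022, 1000.0), ('P1', 2023, 1100.0),
  ('P2', 2022, 500.0),  ('P2', 2023, 450.0);
""")

# Year-over-year premium change per policy, using LAG() over a window.
rows = conn.execute("""
SELECT policy_id, year, premium,
       premium - LAG(premium) OVER (
           PARTITION BY policy_id ORDER BY year
       ) AS yoy_change
FROM premiums
ORDER BY policy_id, year
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY policy_id ORDER BY year` clause restarts the window per policy, so the first year of each policy has no prior row and `yoy_change` is NULL.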

CI/CD & DevOps

  • Using Azure DevOps, GitHub Actions, or Databricks CLI/Repos for automated deployments.
  • Managing multiple environments (dev/test/prod) with parameterisation and environment config.
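As a rough sketch of the multi-environment parameterisation described above, a minimal `databricks.yml` for Databricks Asset Bundles might look like this. The bundle name and workspace hosts are placeholders, not Markel's actual configuration.

```yaml
# Minimal Databricks Asset Bundle sketch (names and hosts are placeholders).
bundle:
  name: claims-lakehouse        # hypothetical project name

targets:
  dev:
    mode: development           # prefixes resources, pauses schedules
    default: true
    workspace:
      host: https://adb-dev-example.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-prod-example.azuredatabricks.net
```

Deploying to a given environment then selects the target, e.g. `databricks bundle deploy -t prod`, which is how a single definition is promoted through dev/test/prod from CI.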

Data Quality, Observability & Governance

  • Building automated data quality frameworks (expectations, anomaly detection).
  • Monitoring pipelines, implementing alerting.
  • Experience ensuring regulatory compliance.
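An automated data quality framework of the kind described can be sketched in miniature as follows. This is plain Python with invented expectation names; a production pipeline would more likely use Delta Live Tables expectations or a dedicated library such as Great Expectations.

```python
# Minimal "expectations"-style data quality check over a batch of records.
# The batch is a list of dicts purely for illustration; real pipelines
# would apply equivalent predicates to DataFrame columns.
def run_expectations(rows, expectations):
    """Return a report mapping expectation name -> number of failing rows."""
    report = {}
    for name, predicate in expectations.items():
        report[name] = sum(1 for row in rows if not predicate(row))
    return report

batch = [
    {"policy_id": "P1", "premium": 1000.0},
    {"policy_id": None, "premium": 450.0},   # fails the not-null check
    {"policy_id": "P3", "premium": -20.0},   # fails the non-negative check
]

report = run_expectations(batch, {
    "policy_id_not_null": lambda r: r["policy_id"] is not None,
    "premium_non_negative": lambda r: r["premium"] >= 0,
})
print(report)  # {'policy_id_not_null': 1, 'premium_non_negative': 1}
```

A report like this feeds alerting naturally: any expectation with a non-zero failure count can quarantine the batch or page the on-call engineer.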

Desirable:

  • Experience gained in either an insurance/insurance-related business or a fast-paced financial services environment.
  • Experience working with tools such as Jira, Azure DevOps, Ataccama, or Confluence

Who we are:

Markel Group (NYSE: MKL), a Fortune 500 company with over 60 offices in 20+ countries, is a holding company for insurance, reinsurance, specialist advisory, and investment operations around the world.

We’re all about people | We win together | We strive for better | We enjoy the everyday | We think further

What’s in it for you?

  • A great starting salary plus annual bonus & strong benefits package…
  • 25 days paid holiday plus Bank Holidays, with the opportunity to buy/sell extra leave
  • Fantastic company pension scheme, private medical and dental cover, life assurance, travel insurance cover, income protection, season ticket loan as well as other great benefits on offer
  • There are countless opportunities to learn new skills and develop in your career and we can provide the support needed to do just that!

Are you ready to play your part?

Choose ‘Apply Now’ to fill out our short application, so that we can find out more about you.

Markel celebrates the value of a diverse workforce that brings experience and expertise from a wide variety of backgrounds and life circumstances. Whatever your background, if you feel you meet the requirements of this role then we want to hear from you. We are also happy to consider candidates who are looking for flexible working patterns.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, colour, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

We will ensure that individuals with disabilities are provided with all reasonable accommodations to be able to participate in the job application or interview process and to perform essential job functions if successful. Please contact us via email at [email protected] or call us at 0161 507 5827 to request any accommodations that may be needed. This includes any alternative formats of any documents or information on how to apply offline.

#LI-DJ1

#LI-Hybrid #PlayYourPartUK

Top Skills

ADLS
Azure Data Factory
Azure Data Lake
Azure Databricks
Azure Functions
Azure VM
Data Quality Frameworks
Databricks CLI
Databricks Workflows
Delta Tables
GitHub Actions
Jobs
PySpark
Repos
Spark SQL
