Job Type:
Permanent
Build a brilliant future with Hiscox
About us:
Hiscox is a diversified international insurance group with a powerful brand, strong balance sheet, and plenty of room to grow. Listed on the London Stock Exchange and headquartered in Bermuda, Hiscox has over 3,000 staff across 14 countries and 34 offices. Structured by geography and product, Hiscox’s long-held business strategy has helped it grow from a niche Lloyd’s underwriter to an international insurance group with a powerful consumer brand.
In our Lisbon office, we have the privilege of employing approximately 500 exceptional professionals representing 29 diverse nationalities. Despite our central location within the city, we recognize the importance of maintaining a healthy work-life balance. As part of our commitment to our employees’ well-being, we provide a 35-hour workweek along with the option for a hybrid working schedule.
Our team
London Market’s Data Engineering Chapter needs a new, experienced data engineer. We are looking for strong and dynamic team players to help us build modern solutions and deliver value to our business. You will be a leading member of the chapter, mentoring more junior engineers and assisting the chapter lead.
The London Market tech team is a modern, agile technology department seeking to work as closely as possible with our colleagues across the company to deliver true business value in an agile fashion. We are a group of engineers (data, software, DevOps, test), data scientists and agile practitioners. We organise ourselves into chapters according to these professional disciplines. You would be joining our data engineering chapter.
From these chapters we build cross-functional squads aimed at delivering business value in particular areas. These are tight-knit teams working closely with product owners and business stakeholders to determine what insights are required and working to deliver them. This means our engineers must be passionate in applying technology to drive results and advocate for appropriate solutions.
We build teams that can get stuff done and deliver value incrementally to the business; teams that understand what agility really is, always look to keep things simple and avoid excessive amounts of code. We adopt best practices in the cloud and understand the importance of DevOps to ensure we can easily build, deploy and monitor solutions when they go into production.
Required skills:
- Hands on experience of functional and object-oriented Python.
- Experience working with a cloud data warehouse e.g. Snowflake, Databricks, BigQuery. Snowflake experience is beneficial but not essential.
- Ability to create robust data pipelines that include data quality checks, monitoring and alerting, using tools such as Azure Data Factory.
- Understanding and experience in developing or maintaining CI/CD processes e.g. Azure DevOps pipelines.
- Experience of designing data models and an appreciation of which model to select for different use cases.
- Understanding and experience implementing measures to ensure data security and privacy.
- A robust understanding of core data engineering topics – ETL vs ELT, structured and unstructured data, data quality and data governance.
- Ability to contribute to all aspects of a solution – design, infrastructure, development, testing and maintenance.
- The ability to design and advocate for technical solutions to business problems.
- Effective collaboration with technical and non-technical team members through agile ceremonies – roadmap planning, feature workshops, backlog elaboration, code review.
- An understanding of cloud technology and a realisation that cloud-native solutions must be built differently from traditional ones.
- Track record of taking initiative and delivering projects end-to-end; clear evidence of being self-driven and motivated.
- Immense curiosity, high energy and desire to go the extra mile to make a difference.
- As an experienced member of the chapter you would not have line management responsibility, but you will exhibit leadership qualities, setting technical direction and being a prominent voice in your squad and the chapter as a whole. We like humble leaders who empower others and ensure all voices are heard, and we’d expect you to help coach and train more junior chapter members.
Desirable Skills
- Experience using Infrastructure as code tools e.g. Terraform
- Experience with cloud workflow orchestration tools e.g. Airflow, Prefect, Dagster
- Experience with additional technologies, data science knowledge or a business background are all valued. We want to know how your unique abilities can contribute to our team.
Key areas of work:
- Working with business stakeholders to understand high value business problems that can be solved through the application of data processing and analytical systems
- Helping to design, build and support a new cloud-based analytics platform for the business
- Being a leading and professional member of the Data Engineering Team
- Supporting other teams including data scientists and data operations analysts.
- Understanding business requirements, helping refine them into development tasks and estimating their complexity
- Research, evaluate and adopt new technologies with a right tool for the job mentality
- Focus on both speed of delivery and quality, with suitable pragmatism – ensuring your solutions are always appropriate and not too complex or over-engineered
- Quick progression of projects from proof-of-concept to post-production stage
- Communication and presentation of ideas to colleagues in all parts of the wider tech team
- Participating in code reviews for the Data Engineering Team
- Forming part of the chapter leadership and assisting the chapter lead in furthering chapter objectives.
- Mentoring and developing other engineers.
Our technology
We operate in a diverse technical landscape and are looking for flexible engineers who can adapt to and use many different tools. We would not expect any engineer to be familiar with the entire tech stack – even the chapter lead has only limited familiarity with some components owned by other team members. Instead, we seek people with a good understanding of data structures and algorithms and the ability to apply this knowledge in learning new tools.
We are working to build a cloud data platform to realise value from our data. That platform is based on Snowflake (running on Azure and GCP), Azure Data Factory for ELT and Prefect for orchestration. Most platform code is written in SQL or Python. We model and transform everything using dbt.
We prepare data for use by analysts working with a variety of tools – Tableau, PowerBI, Dataiku, Python, R and even Excel.
We take a DevOps approach and strive for continuous integration / continuous deployment. We use Azure Pipelines to deploy our code. We deploy infrastructure the same way using Terraform and Docker. We handle schema migration using Flyway.
Our technology landscape is not fixed. Our engineering teams drive the technology we adopt.
Work with amazing people and be part of a unique culture
What We Do
Hiscox is a leader in specialist insurance. We seek to provide the best protection and peace of mind for our clients through high quality insurance products, backed with excellent service. We are experts in covering a wide range of personal and commercial risks.