We're putting together a talented team to build the #1 training platform for runners.
We help everyday runners become outstanding by building an incredible app that provides world-class training, coaching and community for everyone, whether you're improving your 5k time or training for your first marathon.
We’re growing extremely fast! In November 2023 we closed a $6.5M funding round led by JamJar, with participation from Eka Ventures, Venrex and Creator Ventures. In 2024 Apple selected us as one of three global finalists for iPhone App of the Year, reflecting the innovation and impact of what we’ve built, and in 2025 we were acquired by Strava!
Our ambition is huge: to become the go-to training platform for millions of runners everywhere. We’re growing with purpose and looking for people who want to build something meaningful with lasting impact. With the recent acquisition by Strava accelerating our journey, now is a really magical time to join. 🚀
The team you’ll join
We are looking for a talented, creative, and positive team player to join our highly skilled cross-functional engineering team as part of data engineering.
You’ll work closely with other data team members, as well as a variety of stakeholders around the business, to build out our data platform capabilities and improve how we ingest, process, store, and query the data we receive each day, using it to drive all of Runna’s data and analytics needs (including machine learning and AI models).
As part of the data engineering team, you’ll help build the data foundations supporting the #1 running app in the world, revolutionising the way people train and use fitness apps!
What you’ll be doing
Build, test, and deliver state-of-the-art data platform functionality to support the data needs of our rapidly growing company.
Implement scalable and efficient data pipelines using both ETL/ELT processes to ingest and store large volumes of data within AWS and Snowflake.
Implement data transformation logic to cleanse, validate, and enrich raw data for analysis and consumption by downstream applications.
Embrace data platform best practices by designing and developing secure, scalable, and reliable data pipelines.
Maintain and expand our comprehensive data catalogue and dictionary by documenting new data models and data sources.
Collaborate with cross-functional teams, including product, growth, engineering, and business stakeholders, to ensure the data platform aligns with company goals and drives value.
If you don’t quite meet all of the skills below, we’d still love to hear from you, as we might be able to tweak the role slightly or offer you a position better suited to you. You can apply directly below, or contact us if you’re still unsure.
Relevant Experience: Currently pursuing a degree in a relevant field (e.g. Computer Science, Engineering, Maths) or equivalent experience.
Computing Fundamentals: A solid understanding of computing fundamentals.
Technologies:
Python: You will have experience programming in Python and be comfortable solving programming problems with it.
SQL: You will have a solid understanding of SQL and how it is used to query data in a database, including joins, filtering, aggregations, and window functions.
Databases: You will understand the fundamentals of database structure, including the differences between tables and views, and between functions and procedures.
Adaptable: You will be a fast learner and able to quickly adapt to new technologies outside of your comfort zone.
Detail Oriented: You will have an eye for detail and take pride in the quality of your work.
Ways of Working: Enthusiasm for our ways of working, which include:
Iterative development, continuous deployment and test automation
Knowledge sharing, pair programming, collaborative design & development
Shared code ownership & well written documentation
Infrastructure as Code: Experience with Terraform or other IaC tools.
Continuous Integration/Continuous Deployment: Knowledge of CI/CD best practices and experience with related methodologies and tooling, such as GitHub Actions.
Fitness: An interest in, or experience with, health/fitness technologies.
Cloud: Experience developing solutions in a cloud environment (e.g. AWS, GCP).
AWS Cloud: DynamoDB, S3, SQS, SNS, EventBridge, Lambda
Languages: Python, SQL, shell
Data Platform: Snowflake, Omni Analytics
Automation: GitHub, GitHub Actions, Terraform
Ways of Working: Linear, Notion, Kanban
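To give a concrete sense of the SQL concepts mentioned above (joins, filtering, aggregations, and window functions), here is a minimal, self-contained sketch using Python's built-in sqlite3 module. The schema and data are hypothetical, purely for illustration, and are not Runna's actual data model:

```python
import sqlite3

# Hypothetical example data: weekly distances per runner.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE runners (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE runs (runner_id INTEGER, week INTEGER, km REAL);
INSERT INTO runners VALUES (1, 'Ana'), (2, 'Ben');
INSERT INTO runs VALUES (1, 1, 20.0), (1, 2, 25.0),
                        (2, 1, 15.0), (2, 2, 18.0);
""")

# One query combining a join, a filter, an aggregate window
# (total km per runner) and a ranking window function
# (each week ranked by distance within that runner's history).
rows = conn.execute("""
    SELECT r.name,
           x.week,
           x.km,
           SUM(x.km)  OVER (PARTITION BY x.runner_id)                  AS total_km,
           RANK()     OVER (PARTITION BY x.runner_id ORDER BY x.km DESC) AS week_rank
    FROM runs AS x
    JOIN runners AS r ON r.id = x.runner_id
    WHERE x.km > 10
    ORDER BY r.name, x.week
""").fetchall()

for row in rows:
    print(row)
```

If you're comfortable reading and writing queries like this one, you're at the level we have in mind.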
We'll be growing our package of benefits over time. We currently offer:
£42.5k salary (prorated over your internship)
Flexible working (we typically work 3 days from our office in Vauxhall)
10 days flexible holiday, with the choice to use your bank holiday allowance on days that better suit you
Bi-weekly team run and lunch
Socials throughout the internship (Tech and company wide)
Our goal is to make the interview process as simple and enjoyable as possible. This process consists of the following stages:
Kick off! Apply below (the application deadline is 3rd of April, so the sooner you apply, the better!)
Once you have applied, you will receive an update on your application before the 7th of April.
First interview: a 25-minute live coding technical interview with our Engineers (this will consist of two exercises from https://leetcode.com/), followed by a 10-minute interview with Josh, our Talent Partner.
Second interview: a 25-minute chat with one of our Senior Engineers and our CTO (this will consist of general technical and motivational questions).
Once the process is finished, we promise to let you know our decision as soon as possible.
Please let us know if there’s anything we can do to better accommodate you throughout the interview process, whether that’s scheduling interviews around childcare commitments or meeting accessibility requirements. We want you to show your best self in the process, so please speak to your Talent Partner.
Runna London Office: London, United Kingdom