
Bastion

Data Infrastructure Engineer

Sorry, this job was removed at 08:17 p.m. (GMT) on Thursday, Jul 10, 2025
In-Office or Remote
8 Locations


About Bastion

Bastion enables financial institutions and enterprises to issue regulated stablecoins, generate revenue on reserves, and expand their ecosystems. Bastion’s platform combines stablecoin issuance, secure custody, and seamless orchestration for cross-border transfers, on/off-ramps, and stablecoin conversions. With Bastion’s platform and APIs, businesses can create and scale their stablecoin network, while optimizing revenue, compliance, and control.

Overview

As a Data Infrastructure Engineer, you will be responsible for building and maintaining critical data infrastructure. You will build ingestion, analysis, and reporting pipelines that are at the core of Bastion’s leading compliance and product offerings. You’ll also work with teams across the entire Bastion organization, including compliance, legal, and finance. 

Given the foundational nature of this role, you will also be responsible for selecting appropriate technologies, managing external vendor relationships, and fostering a data-driven culture across the entire company. You will be expected to architect and build both real-time and batch data pipelines that ingest data from a variety of sources and deliver it to our data warehouse. In addition, you will be responsible for establishing strong security and privacy controls around sensitive data.

Responsibilities

  • Architect, build, and maintain modern and robust real-time and batch data analytics pipelines.

  • Develop and maintain declarative data models and transformations.

  • Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB.

  • Deploy and configure BI tooling for data analysis.

  • Work closely with product, finance, legal, and compliance teams to build dashboards and reports to support business operations, regulatory obligations, and customer needs.

  • Establish, communicate, and enforce data governance policies.

  • Document and share best practices with regard to schema management, data integrity, availability, and security.

  • Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes.

  • Identify and communicate data platform needs, including additional tooling and staffing.

  • Work with cross-functional teams to define requirements, plan projects, and execute on the plan.
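To give a concrete sense of the data-protection work described above, here is a minimal sketch of masking and tokenization for sensitive fields. All names and the key-handling approach are illustrative, not Bastion's actual implementation; in practice the secret key would come from a secrets manager rather than being hardcoded.

```python
import hashlib
import hmac

# Illustrative secret; a real deployment would load this from a secrets manager.
SECRET_KEY = b"example-key"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

def mask_account(account_number: str, visible: int = 4) -> str:
    """Mask all but the last `visible` digits of an account number."""
    return "*" * (len(account_number) - visible) + account_number[-visible:]

record = {"name": "Alice", "account": "1234567890"}
safe_record = {
    "name_token": tokenize(record["name"]),       # deterministic, so joins still work
    "account_masked": mask_account(record["account"]),
}
```

Because the token is deterministic for a given key, downstream analytics can still join on tokenized columns without ever seeing the raw value.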

Qualifications

  • 5+ years of professional engineering and data analytics experience, startup experience a plus.

  • Strong proficiency and comfort using SQL and Python to perform complex data analysis.

  • Recent experience building automation tooling and pipelines using a general-purpose language such as Python, Go, and/or TypeScript.

  • Experience with modern data pipeline and warehouse technologies (e.g. Snowflake, Databricks, Apache Spark, AWS Glue).

  • Strong proficiency writing declarative data models and transformations using modern technologies (e.g. dbt).

  • Experience building and maintaining cloud-based data lakes.

  • Prior experience integrating real-time data streaming technologies (e.g. Kafka, Spark).

  • Prior experience configuring and maintaining modern data orchestration platforms (e.g. Airflow).

  • Comfort with infrastructure-as-code tooling (e.g. Terraform) and container orchestration platforms (e.g. Kubernetes).

  • Strong preference to keep things simple, ship fast, and avoid overengineering.

  • Self-driven, with the ability to work autonomously.

  • Professional Web3 / Crypto experience is a plus.

Bastion provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, and placement. Bastion participates in E-Verify to verify eligibility for employment in the United States.

