Yotpo is leading the next era of trust and loyalty in eCommerce. With AI-powered Reviews and Loyalty solutions, we help brands turn browsers into customers and customers into advocates. Through deep integrations across the eCommerce ecosystem and the trust of over 30,000 global brands, Yotpo delivers seamless omnichannel experiences that increase conversion, strengthen customer relationships, and drive profitable, long-term growth.
Your Mission at Yotpo
As a Senior Data Engineer, your mission is to build the scalable, reliable data foundation that empowers Yotpo to make data-driven decisions. You will serve as a bridge between complex business needs and technical implementation, translating raw data into high-value assets. You will own the entire data lifecycle—from ingestion to insight—ensuring that our analytics infrastructure scales as fast as our business.
Key Responsibilities
- Strategic Data Modeling: Translate complex business requirements into efficient, scalable data models and schemas. You will design the logic that turns raw events into actionable business intelligence.
- Pipeline Architecture: Design, implement, and maintain resilient data pipelines that serve multiple business domains. You will ensure data flows reliably, securely, and with low latency across our ecosystem.
- End-to-End Ownership: Own the data development lifecycle completely—from architectural design and testing to deployment, maintenance, and observability.
- Cross-Functional Partnership: Partner closely with Data Analysts, Data Scientists, and Software Engineers to deliver end-to-end data solutions.
What You Bring
- Your Mindset:
  - Data as a Product: You treat data pipelines and tables with the same rigor as production APIs—reliability, versioning, and uptime matter to you.
  - Business Acumen: You don't just move data; you understand the business questions behind the query and design solutions that provide answers.
  - Builder's Spirit: You work independently to balance functional needs with non-functional requirements (scale, cost, performance).
- Your Experience & Qualifications:
  - Must Haves:
    - 4+ years of experience as a Data Engineer, BI Developer, or in a similar role.
    - Modern Data Stack: Strong hands-on experience with DBT, Snowflake, Databricks, and orchestration tools like Airflow.
    - SQL & Modeling: Strong proficiency in SQL and a deep understanding of data warehousing concepts (star schema, snowflake schema).
    - Data Modeling: Proven experience in data modeling and business logic design for complex domains—building models that are efficient and maintainable.
    - Modern Workflow: Proven experience leveraging AI assistants to accelerate data engineering tasks.
    - Bachelor’s degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
  - Preferred / Bonus:
    - Cloud Data Warehouses: Experience with BigQuery or Redshift.
    - Coding Skills: Proficiency in Python for data processing and automation.
    - Big Data Tech: Familiarity with Spark, Kubernetes, Docker.
    - BI Integration: Experience serving data to BI tools such as Looker, Tableau, or Superset.
#LI-Hybrid



