
BeiGene

Senior Manager, Data Platform and Solution Engineering

In-Office or Remote
Hiring Remotely in Warsaw, Warszawa, Mazowieckie
Senior level

BeOne continues to grow at a rapid pace with challenging and exciting opportunities for experienced professionals. When considering candidates, we look for scientific and business professionals who are highly motivated, collaborative, and most importantly, share our passionate interest in fighting cancer.

General Description:

Join BeOne's Global Data Strategy and Solutions team to build and scale a cutting-edge, fully integrated Enterprise Data and Analytics Platform that accelerates our journey from data to insights and the deployment of AI applications. The Senior Manager, Data Platform and Solution Engineering must be an expert in Databricks technologies, designing scalable, high-performance data solutions that empower our organization to ingest and curate data and build data products at scale. The ideal candidate will possess strong technical knowledge and experience in cloud data architectures, big data processing, and real-time analytics, coupled with the ability to collaborate cross-functionally to drive data-driven decision-making across the organization.

Essential Functions of the job:

The individual in this position should expect significant day-to-day variability in tasks and challenges.

Primary duties include but are not limited to the following:

  • Design and implement robust data architectures using Databricks, ensuring integration with existing systems and scalability for future growth.
  • Establish data management frameworks, optimizing ETL/ELT processes and data models for performance and accuracy.
  • Evaluate and recommend modern architectural patterns, including Lakehouse, Delta Live Tables, Data Mesh, and real-time streaming.
  • Drive rapid Proof-of-Concepts (POCs) to validate new architectural approaches, tools, and design patterns before enterprise rollout.
  • Partner with data engineers, scientists, and business stakeholders to develop seamless data pipelines prioritizing data integrity and usability.
  • Implement and uphold data governance practices that enhance data accessibility while ensuring compliance with regulations.
  • Integrate external systems, APIs, and cloud-native services to support new data products and analytics use cases.
  • Prototype and test new connectors, ingestion frameworks, and integration patterns to accelerate innovation.
  • Monitor data pipelines and infrastructure performance, troubleshooting issues as they arise and ensuring high availability.
  • Optimize and enhance existing data systems for performance, reliability, and cost-efficiency.
  • Collaborate with data analysts and data scientists to understand data requirements and implement solutions that support data-driven insights and models.
  • Monitor and enhance system performance, employing tools and methodologies to optimize data processing and storage solutions.
  • Optimize compute costs, job orchestration, workflow efficiency, and data storage strategies.
  • Troubleshoot and resolve data-related issues to maintain optimal system functionality.
  • Experiment with new Databricks features (Unity Catalog updates, AI/ML runtimes, Photon, DBRX, Delta Sharing, serverless SQL/compute, etc.) through quick hands-on evaluations.
  • Develop and enforce data governance standards, including data quality, security, and compliance through automation.
  • Innovation & Rapid Prototyping
    • Conduct fast-turnaround POCs to explore new technical capabilities, libraries, and features across Databricks, Azure, Informatica, Reltio, and other ecosystem tools.
    • Build lightweight demo pipelines, dashboards, and micro-solutions to demonstrate feasibility, guide architectural choices, and influence roadmap decisions.
    • Stay current with emerging technologies, industry trends, and platform advancements; translate insights into actionable recommendations.
    • Collaborate with vendors and internal teams to evaluate beta features, pilot new capabilities, and provide technical feedback for adoption decisions.

Education Required:  Bachelor’s Degree in Information Technology or a related field, or equivalent experience

Qualifications:

  • Proven experience (7+ years) in data architecture or in a similar role, with extensive experience in Databricks and cloud-based data solutions.
  • 7+ years of experience in solution engineering, platform architecture, or related roles, working in a cross-functional environment.
  • Strong proficiency in Apache Spark, Unity Catalog, Python, SQL, and data processing frameworks.
  • Experience with APIs and with integrating diverse technology systems.
  • Familiarity with modern development frameworks, DevOps methodologies, and CI/CD processes.
  • Experience with data warehousing solutions, delta lakes, and ETL/ELT processes.
  • Familiarity with cloud environments (AWS, Azure) and their respective data services.
  • Solid understanding of data governance, security, and compliance best practices.
  • Excellent communication and interpersonal skills, with an ability to articulate complex technical concepts to diverse audiences.
  • Databricks certifications or hands-on experience with Delta Lake and its cloud architecture are strongly preferred.
  • Familiarity with machine learning, AI frameworks, and data visualization tools (e.g., Tableau, Power BI, Spotfire).
  • A proactive approach to learning and implementing new technologies and frameworks.
  • Experience working with Life Sciences data, including exposure to R&D, Clinical Operations, TechOps, or Manufacturing domains. Understanding of key systems (CTMS, EDC, eTMF, LIMS, MES, PV systems), data models (CDISC, SDTM, ADaM), and typical data challenges (quality, lineage, integration, governance) is highly desirable.

Supervisory Responsibilities:  No

Global Competencies

When we exhibit our values of Patients First, Driving Excellence, Bold Ingenuity, and Collaborative Spirit, through our twelve global competencies below, we help get more affordable medicines to more patients around the world.

  • Fosters Teamwork
  • Provides and Solicits Honest and Actionable Feedback
  • Self-Awareness
  • Acts Inclusively
  • Demonstrates Initiative
  • Entrepreneurial Mindset
  • Continuous Learning
  • Embraces Change
  • Results-Oriented
  • Analytical Thinking/Data Analysis
  • Financial Excellence
  • Communicates with Clarity

We are proud to be an equal opportunity employer. BeOne does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Top Skills

Spark
AWS
Azure
Databricks
Informatica
Python
Reltio
SQL
