
Capco

Mid+/Senior Data Engineer (Python, GCP)

Remote / Hybrid · Hiring Remotely in Poland · Senior level

CAPCO POLAND 

*We are looking for Poland-based candidates. The job is remote but may require occasional business trips.

Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognise that diversity and inclusion, in all forms, are critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to accommodate any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.

Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.

We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.

We're seeking a skilled Senior Big Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data pipelines and solutions on on-prem, migration, and cloud projects for large-scale data processing and analytics.


THINGS YOU WILL DO

  • Work alongside clients to interpret requirements and define industry-leading solutions
  • Design and develop robust, well tested data pipelines
  • Demonstrate and help clients adhere to best practices in engineering and SDLC
  • Apply excellent knowledge of building event-driven, loosely coupled distributed applications
  • Develop both on-premise and cloud-based solutions
  • Build and strengthen relationships with peers, senior stakeholders and clients
  • Lead and mentor a team of junior and mid-level engineers
  • Contribute to security designs and have advanced knowledge of key security technologies e.g. TLS, OAuth, Encryption
  • Support internal Capco capabilities by sharing insight, experience and credentials


TECH STACK: Python, OOP, Spark, SQL, Hadoop

Nice to have: Scala, GCP, Pub/Sub, BigQuery, Kafka, Juniper, Apache NiFi, Hive, Impala, Cloudera, CI/CD
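
To give a flavour of the core stack above, here is a minimal, illustrative PySpark sketch of the kind of batch pipeline this role involves. It is not part of the job specification; the bucket paths, column names and the aggregation itself are hypothetical.

    # Illustrative sketch only: a small PySpark batch job (paths and columns are hypothetical).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-transactions-aggregate").getOrCreate()

    # Read raw, semi-structured input (e.g. landed JSON) into a DataFrame
    raw = spark.read.json("gs://example-bucket/landing/transactions/2024-01-01/")

    # Basic cleansing and a simple daily aggregation
    daily_totals = (
        raw.filter(F.col("amount").isNotNull())
           .withColumn("trade_date", F.to_date("event_timestamp"))
           .groupBy("trade_date", "account_id")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("txn_count"))
    )

    # Write curated output as partitioned Parquet for downstream analytics
    (daily_totals.write.mode("overwrite")
         .partitionBy("trade_date")
         .parquet("gs://example-bucket/curated/daily_totals/"))

    spark.stop()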


SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE

You will have experience working with some of the following methodologies and technologies:
  • Strong cloud provider experience, ideally on GCP
  • Hands-on experience using Python; Scala and Java are nice to have but not required
  • Experience with most or all of the following data and cloud technologies: Hadoop, Hive, Spark, Flume, PySpark, Dataproc, Cloudera, Airflow, Oozie, S3, Terraform, etc. (orchestration with Airflow is illustrated in the sketch after this list)
  • Hands-on experience with schema design for structured and semi-structured data
  • Experience using messaging technologies – Kafka, Kafka Connect, Spark Streaming, Amazon Kinesis
  • Strong experience in SQL
  • Good understanding of the differences and tradeoffs between SQL and NoSQL, and between ETL and ELT
  • Understanding of containerisation (Docker, Kubernetes) and orchestration techniques
  • Experience with data lake formation and data warehousing principles and technologies – BigQuery, Redshift, Snowflake
  • Experience using version control tools such as Git
  • Familiarity with good development practices and optimisation techniques
  • Experience designing, building and maintaining CI/CD pipelines on Jenkins or CircleCI
  • Enthusiasm and ability to pick up new technologies as needed to solve your problems
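
As a rough illustration of the orchestration tooling listed above, the sketch below shows a minimal Airflow DAG that submits a Spark job and loads its output into BigQuery. It is illustrative only and assumes Airflow 2.x; the DAG id, schedule, paths and dataset names are hypothetical.

    # Illustrative sketch only (Airflow 2.x assumed); DAG id, paths and dataset are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_transactions_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit the PySpark aggregation job sketched earlier (path is hypothetical)
        run_spark_job = BashOperator(
            task_id="run_spark_aggregation",
            bash_command="spark-submit gs://example-bucket/jobs/daily_totals.py",
        )

        # Load the curated Parquet output into BigQuery via the bq CLI
        load_to_bq = BashOperator(
            task_id="load_to_bigquery",
            bash_command=(
                "bq load --source_format=PARQUET "
                "analytics.daily_totals "
                "'gs://example-bucket/curated/daily_totals/*.parquet'"
            ),
        )

        run_spark_job >> load_to_bq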


WHY JOIN CAPCO?

  • Employment contract and/or Business to Business - whichever you prefer
  • Possibility to work remotely
  • Speaking English on a daily basis, mainly with international stakeholders and peers
  • Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life insurance)
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Paid Referral Program
  • Participation in charity events e.g. Szlachetna Paczka
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • Being part of the core squad focused on the growth of the Polish business unit
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A work culture focused on innovation and creating lasting value for our clients and employees


ONLINE RECRUITMENT PROCESS STEPS

  • Screening call with the Recruiter
  • Technical interview: first stage
  • Client Interview
  • Feedback/Offer
