Mid Big Data Engineer

Global technology and management consultancy specializing in driving digital transformation across the financial services industry.
Data · Mid-Level Software Engineer · Hybrid · 3+ years of experience · Finance

Description For Mid Big Data Engineer

Capco Poland, a leading global technology and management consultancy, is seeking a Mid Big Data Engineer to join its dynamic team in Krakow. The company specializes in driving digital transformation across the financial services industry and is committed to helping clients succeed in an ever-changing landscape.

The role offers a unique opportunity to work on large-scale data processing and analytics projects, both on-premises and in the cloud. As a Mid Big Data Engineer, you'll be responsible for designing and implementing scalable data pipelines using cutting-edge technologies like Scala, Spark, and Hadoop. The position requires a strong foundation in data engineering with 3-4 years of experience and excellent SQL skills.
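
For a concrete sense of that day-to-day work, the sketch below shows a minimal Scala/Spark batch pipeline of the kind the role describes. The paths, column names, and aggregation logic are illustrative assumptions, not details of Capco's or its clients' actual systems.

```scala
// Minimal sketch of a Scala/Spark batch pipeline (illustrative only):
// read raw data, aggregate it, and write partitioned output for analytics.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyTradeAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-trade-aggregation")
      .getOrCreate()

    // Read raw batch data from the data lake (path is a placeholder)
    val trades = spark.read.parquet("hdfs:///data/raw/trades")

    // Aggregate with the DataFrame API; the same logic could be written in Spark SQL
    val dailyVolumes = trades
      .filter(col("status") === "SETTLED")
      .groupBy(col("trade_date"), col("instrument_id"))
      .agg(sum("quantity").as("total_quantity"), count("*").as("trade_count"))

    // Write partitioned output for downstream consumers
    dailyVolumes.write
      .mode("overwrite")
      .partitionBy("trade_date")
      .parquet("hdfs:///data/curated/daily_volumes")

    spark.stop()
  }
}
```

Keeping the transformation in the DataFrame API (or equivalent Spark SQL) lets Spark's optimizer handle partition pruning and predicate pushdown, which matters for the performance and scalability work the role involves.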

What makes this role particularly attractive is Capco's commitment to an inclusive working environment where individuality is celebrated. The company offers a hybrid work model with 2 days in the Krakow office, providing flexibility while maintaining team collaboration. The core technical stack centers on Scala, Spark, and Hadoop, with the opportunity to work with additional tools such as GCP, Kafka, and other Apache projects.

The benefits package is comprehensive, including flexible employment options, private medical care, life insurance, and access to extensive learning resources through Udemy. The company's flat, non-hierarchical structure ensures direct interaction with senior partners and clients, promoting rapid professional growth.

This role is ideal for someone who wants to combine technical expertise in data engineering with exposure to the financial services sector. The position offers the perfect balance of technical challenges, professional development, and work-life flexibility, all while working for a company that values diversity and individual growth.

Responsibilities For Mid Big Data Engineer

  • Design, develop, and maintain robust data pipelines using Scala, Spark, Hadoop, and SQL for batch and streaming data processing (a streaming sketch follows this list)
  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions
  • Optimize Spark jobs and data processing workflows for performance, scalability and reliability
  • Ensure data quality, integrity and security throughout the data lifecycle
  • Troubleshoot and resolve data pipeline issues
  • Document design specifications, deployment procedures and operational guidelines
  • Provide technical guidance and mentorship to new joiners
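
To illustrate the streaming side of the pipeline work above, here is a hedged sketch of a Spark Structured Streaming job consuming events from Kafka, one of the additional tools mentioned in the description. The broker address, topic name, and event schema are assumptions made purely for the example.

```scala
// Illustrative Spark Structured Streaming job reading JSON events from Kafka
// and producing windowed aggregates; names and addresses are placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TradeEventStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trade-event-stream")
      .getOrCreate()

    // Assumed event schema for the example
    val schema = new StructType()
      .add("instrument_id", StringType)
      .add("quantity", LongType)
      .add("event_time", TimestampType)

    // Consume JSON events from a hypothetical Kafka topic
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "trade-events")
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    // Windowed aggregation with a watermark to bound late data
    val counts = events
      .withWatermark("event_time", "10 minutes")
      .groupBy(window(col("event_time"), "5 minutes"), col("instrument_id"))
      .agg(sum("quantity").as("total_quantity"))

    // Console sink for illustration; a real job would write to HDFS, a warehouse table, or similar
    counts.writeStream
      .outputMode("update")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```

The watermark bounds how long Spark waits for late events, which is the usual lever for balancing completeness against latency in streaming aggregations.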

Requirements For Mid Big Data Engineer

Scala · Kafka
  • 3-4 years of experience as a Data Engineer/Big Data Engineer
  • University degree in computer science, mathematics, natural sciences, or a similar field
  • Excellent SQL skills, including advanced concepts (an illustrative window-function query follows this list)
  • Very good programming skills in Scala
  • Experience in Spark and Hadoop
  • Experience in OOP
  • Experience using agile frameworks like Scrum
  • Interest in financial services and markets
  • Fluent English communication and presentation skills
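
As a hedged illustration of what "advanced SQL concepts" can look like in this stack, the sketch below runs a window-function query through Spark SQL in Scala, reusing the hypothetical curated dataset from the earlier batch example; the table and column names are assumptions.

```scala
// Illustrative window-function query in Spark SQL: rank instruments by
// traded volume within each day and keep the top three (placeholder names).
import org.apache.spark.sql.SparkSession

object TopInstrumentsByDay {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("top-instruments-by-day")
      .getOrCreate()

    // Register the hypothetical curated dataset as a temporary view
    spark.read.parquet("hdfs:///data/curated/daily_volumes")
      .createOrReplaceTempView("daily_volumes")

    val top3 = spark.sql(
      """
        |SELECT trade_date, instrument_id, total_quantity
        |FROM (
        |  SELECT trade_date, instrument_id, total_quantity,
        |         RANK() OVER (PARTITION BY trade_date ORDER BY total_quantity DESC) AS rnk
        |  FROM daily_volumes
        |) ranked
        |WHERE rnk <= 3
        |""".stripMargin)

    top3.show(truncate = false)
    spark.stop()
  }
}
```

The same query could be expressed with the DataFrame and Window APIs instead; which form a team uses is largely a matter of convention.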

Benefits For Mid Big Data Engineer

Medical Insurance · Education Budget
  • Employment contract and/or business-to-business (B2B) options
  • Remote work possibility
  • MyBenefit Cafeteria
  • Private medical care
  • Life insurance
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Paid Referral Program
  • Ongoing learning opportunities
  • Flat, non-hierarchical structure

Jobs Related To Capco Poland Mid Big Data Engineer

Mid Big Data Engineer

Mid Big Data Engineer position at Capco Poland, focusing on designing and implementing scalable data solutions for financial services using Scala, Python, and Spark.

Tableau Developer

Tableau Developer position at unybrands, developing data visualizations and dashboards to drive e-commerce business insights.

Data Engineer - Credit Modeling

Data Engineer position at PayPay focusing on Credit Modeling, ETL processes, and data pipeline development.

Data Engineer

Data Engineer position at Careem, building scalable data infrastructure and ETL pipelines for the Middle East's leading Everything App.