Excella is seeking a Data Engineer to help design and build modern data products. The role involves working with raw data stores (data lakes) and cleansed data repositories populated by batch or streaming data pipelines. The Data Engineer will collaborate with a team to create robust, sustainable, and flexible designs and will lead technical delivery using Agile frameworks such as Scrum or Kanban.
Key responsibilities include:
- Collaborating with stakeholders to architect, build, and deploy data acquisition solutions
- Participating in all stages of data pipeline development
- Designing, developing, and maintaining data services and pipelines in AWS (a minimal illustrative sketch follows this list)
- Establishing best practices for automating data ingestion and pipeline workflows
- Managing multiple tasks under changing requirements and deadlines
- Presenting proofs of concept and recommendations to stakeholders
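To give a concrete sense of the pipeline work described above, the following is a minimal, hypothetical sketch of a batch ingestion step that lands raw records in an S3-based data lake. The bucket name, source name, and key layout are illustrative placeholders, not details of Excella's or any client's actual stack.

```python
"""Minimal batch ingestion sketch: land raw JSON records in an S3 data lake.

The bucket name and key prefix below are hypothetical placeholders.
Requires boto3 and valid AWS credentials.
"""
import json
from datetime import datetime, timezone

import boto3

RAW_BUCKET = "example-raw-data-lake"  # hypothetical bucket name
SOURCE_NAME = "orders"                # hypothetical source system


def land_raw_records(records: list[dict]) -> str:
    """Write a batch of raw records to the lake, partitioned by ingest date."""
    s3 = boto3.client("s3")
    ingest_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"raw/{SOURCE_NAME}/ingest_date={ingest_date}/batch.json"
    # Store as newline-delimited JSON so downstream jobs can stream the file.
    body = "\n".join(json.dumps(record) for record in records)
    s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=body.encode("utf-8"))
    return key


if __name__ == "__main__":
    sample = [{"order_id": 1, "amount": 42.50}, {"order_id": 2, "amount": 13.99}]
    print("wrote", land_raw_records(sample))
```

In practice a step like this would be one task in a larger pipeline, with the cleansed-layer transformation and loading handled downstream.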
The ideal candidate will have:
- 3+ years of relevant professional experience
- Proficiency in SQL and Python for data pipeline development
- Experience with big data pipelines handling structured and unstructured data
- Familiarity with modern data orchestration tools (dbt, AWS Glue, Apache NiFi, Airflow); a brief Airflow-style sketch follows this list
- Experience with test-driven development and GitLab practices
- Ability to develop infrastructure as code (CloudFormation, Terraform)
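Airflow is one of the orchestration tools listed above. As an illustration only, a minimal Airflow 2.x-style DAG wiring an extract task to a load task might look like the sketch below; the DAG name, schedule, and task logic are hypothetical.

```python
"""Minimal Airflow DAG sketch: a daily extract -> load pipeline (illustrative only)."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull records from a hypothetical upstream source.
    return [{"order_id": 1, "amount": 42.50}]


def load(**context):
    # Read the extracted batch from XCom and push it to a hypothetical cleansed store.
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"loading {len(records)} records")


with DAG(
    dag_id="example_orders_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```

The same extract/load pattern maps onto the other listed tools (Glue jobs, NiFi flows, dbt models); the sketch is only meant to indicate the level of hands-on orchestration experience expected.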
Excella offers a flexible work environment, comprehensive benefits, and opportunities for professional growth. The company values diversity and inclusion and supports a range of employee-led initiatives that foster innovation and inclusivity.