Data Engineer

Modern data consultancy empowering organizations to realize the full value of the Snowflake Data Cloud through consulting and managed services.
Data
Mid-Level Software Engineer
Remote
2+ years of experience
Enterprise SaaS

Description For Data Engineer

Hakkoda, a modern data consultancy specializing in Snowflake Data Cloud solutions, is seeking a Data Engineer to join its distributed team across North America, Latin America, and Europe. The role focuses on building and optimizing data pipelines and architectures in cloud environments (AWS, Azure, GCP) for international clients. The ideal candidate has 2+ years of experience, strong Python and SQL skills, and experience with data warehousing and ETL/ELT processes.
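
As a rough illustration of the kind of pipeline work this describes, the sketch below uses Python to pull records from a placeholder REST endpoint, flatten them with pandas, and land them as a CSV object in S3 for a warehouse to ingest. The endpoint, bucket, and key names are illustrative assumptions rather than details from the posting.

```python
"""Hypothetical extract-and-land step for a cloud data pipeline.

The API URL, S3 bucket, and object key are illustrative placeholders only.
Requires: pip install requests pandas boto3
"""
import datetime

import boto3
import pandas as pd
import requests


def land_orders_to_s3(bucket: str = "example-data-lake") -> str:
    # Extract: pull raw records from a placeholder REST endpoint.
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()

    # Transform: flatten nested JSON into a tabular frame.
    df = pd.json_normalize(resp.json()["orders"])

    # Load: write a dated CSV object into the data-lake landing zone.
    key = f"landing/orders/{datetime.date.today():%Y-%m-%d}.csv"
    body = df.to_csv(index=False).encode("utf-8")
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)
    return key


if __name__ == "__main__":
    print(land_orders_to_s3())
```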

The position offers a unique opportunity to work with a fast-growing company that values learning, growth, and collaboration. As a Data Engineer, you'll be responsible for designing and maintaining data pipeline architectures, implementing scalable data models, and ensuring robust data governance. The role requires collaboration with cross-functional teams including data experts, analysts, architects, and data scientists.

The company offers a flexible remote-first work environment with access to a co-working space in Lisbon. Benefits include comprehensive health insurance, meal allowance, annual bonuses, generous PTO, and professional development opportunities. Hakkoda emphasizes creating an inclusive environment where creativity and innovation are valued, making it an ideal place for professionals looking to make an impact in the data engineering field.

The role combines technical expertise with consulting skills, offering the opportunity to work on cutting-edge solutions across various industries. Even if candidates don't meet all qualifications, Hakkoda encourages applications from individuals passionate about their mission and eager to grow with the company.


Responsibilities For Data Engineer

  • Build and optimize data pipelines and architectures within the cloud (AWS, Azure, GCP)
  • Collaborate with data experts, analysts, architects, and data scientists
  • Design and maintain data pipeline architectures
  • Assemble complex data sets
  • Build and optimize ETL/ELT infrastructure (see the sketch after this list)
  • Implement scalable data models
  • Ensure robust data governance and security practices
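
As referenced in the ETL/ELT bullet above, here is a minimal, hedged sketch of one common way such a load might be structured: COPY new files from an external stage into a Snowflake staging table, then MERGE them into a reporting table. All object names and credentials are hypothetical placeholders, not details from the posting.

```python
"""Minimal sketch of a re-runnable ELT load into Snowflake.

Stage, table, and credential names are hypothetical placeholders; the
pattern (COPY INTO a staging table, then MERGE into the target) is one
common way to structure an incremental warehouse load.
Requires: pip install snowflake-connector-python
"""
import os

import snowflake.connector

COPY_SQL = """
    COPY INTO raw.orders_stage
    FROM @raw.s3_landing/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
"""

MERGE_SQL = """
    MERGE INTO analytics.orders AS t
    USING raw.orders_stage AS s
      ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET
      status = s.status, updated_at = s.updated_at
    WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
      VALUES (s.order_id, s.status, s.updated_at)
"""


def run_load() -> None:
    # Credentials are read from the environment rather than hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS_DB",
    )
    try:
        cur = conn.cursor()
        cur.execute(COPY_SQL)   # land new files from the external stage
        cur.execute(MERGE_SQL)  # upsert staged rows into the reporting table
    finally:
        conn.close()


if __name__ == "__main__":
    run_load()
```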

Requirements For Data Engineer

  • 2+ years of experience as a Data Engineer or in a related technical role
  • Bachelor's Degree in Computer Science, Information Systems, Mathematics, MIS, or a related field
  • Experience developing data warehouses and building ETL/ELT ingestion pipelines
  • Strong experience in optimizing 'big data' pipelines, architectures, and data sets
  • Proficiency in SQL scripting and working with relational databases
  • Experience with business intelligence and analytics, particularly working with unstructured data
  • Cloud experience with AWS (experience with Azure and GCP is a plus)
  • Proficiency in Python scripting
  • Strong consulting skills
  • Fluency in English, both written and spoken

Benefits For Data Engineer

  • Comprehensive health insurance
  • Competitive meal allowance
  • Annual bonus opportunities
  • 22 days of paid time off, plus 2 additional days
  • Initial home office budget
  • Work-from-home allowance
  • Technical training and certification programs
  • Access to co-working space in Lisbon


Jobs Related To Hakkoda Data Engineer

Business Intelligence Engineer, ORC (Operations Risk Compliance) Program Analytics

Business Intelligence Engineer role at Amazon focusing on ORC Analytics, combining statistical analysis, data engineering, and business intelligence expertise in London.

Data Engineer

Data Engineer position at WorldQuant focusing on developing data pipelines and engineering solutions for financial strategies.

Data Engineer

Data Engineer position at G-P, developing solutions for their Global Employment Platform, working with Python, SQL, and modern data technologies in a remote environment.

Data Engineer

Data Engineer position at Capco working on transformative banking projects, requiring SQL and Python expertise, offering comprehensive benefits and growth opportunities.

Mid Level/Senior Data Developer

Mid Level/Senior Data Developer position at CI&T, focusing on Python, PySpark, and AWS for financial sector projects, with remote work options.