
Data Engineer

All-in-one payroll and HR platform for global teams, supporting workers in 100+ countries with HRIS, compliance, benefits, and equipment management.
Data · Mid-Level Software Engineer · Remote · 1,000 - 5,000 Employees · 3+ years of experience · Enterprise SaaS · Finance

Description For Data Engineer

Deel, the fastest-growing SaaS company in history, is seeking a Data Engineer to join its Data Platform team. As part of a company that processed $11.2B in payments across 100 countries in 2024, you'll enhance data quality, optimize pipeline performance, and build robust platform tools. The role involves collaborating with 30+ Analytics Engineers and 100+ data professionals, designing scalable data pipelines, and implementing data governance policies. You'll work with modern tools like Snowflake, Airflow, and dbt, while having the opportunity to shape the future of global work infrastructure.

The position offers competitive compensation, equity opportunities, and full remote flexibility. Deel, valued at $12B with $800M ARR, provides an excellent environment for career growth in the rapidly evolving field of global workforce management. The role requires strong technical skills in Python and SQL, experience with data warehousing, and excellent collaborative abilities. You'll be part of a diverse team spanning 100+ countries, contributing to a platform that's revolutionizing how global talent connects with world-class companies.

Last updated 3 months ago

Responsibilities For Data Engineer

  • Design, implement, and manage scalable data pipelines using Snowflake, Airflow, dbt, and Fivetran
  • Collaborate with cross-functional teams to understand data requirements
  • Implement data governance policies
  • Develop and maintain ELT processes
  • Optimize SQL queries
  • Partner with data analysts and stakeholders
  • Diagnose and resolve data-related issues
  • Build and maintain CI/CD systems
  • Ingest data from external systems via APIs
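To give a concrete picture of the ELT work described above, here is a minimal sketch of the load-then-transform pattern. This is illustrative only: sqlite3 stands in for a cloud warehouse like Snowflake, and the payroll records, table names, and fields are hypothetical, not taken from the posting.

```python
# Minimal ELT sketch: land raw API payloads first (Load), then clean
# them with SQL inside the warehouse (Transform), dbt-style.
import json
import sqlite3

# Stand-in for records ingested from an external system's API.
RAW_RECORDS = [
    {"employee_id": 1, "country": "DE", "gross_pay": "5200.00"},
    {"employee_id": 2, "country": "BR", "gross_pay": "3100.50"},
    {"employee_id": 1, "country": "DE", "gross_pay": "5200.00"},  # duplicate
]

def run_elt(conn: sqlite3.Connection) -> list:
    # Load: store the payload untouched in a staging table, so the
    # transform can be re-run or fixed without re-ingesting.
    conn.execute("CREATE TABLE raw_payroll (payload TEXT)")
    conn.executemany(
        "INSERT INTO raw_payroll VALUES (?)",
        [(json.dumps(r),) for r in RAW_RECORDS],
    )
    # Transform: dedupe and cast in SQL, as a dbt model would.
    conn.execute("""
        CREATE TABLE stg_payroll AS
        SELECT DISTINCT
            CAST(json_extract(payload, '$.employee_id') AS INTEGER) AS employee_id,
            json_extract(payload, '$.country')                      AS country,
            CAST(json_extract(payload, '$.gross_pay') AS REAL)      AS gross_pay
        FROM raw_payroll
    """)
    return conn.execute(
        "SELECT employee_id, country, gross_pay "
        "FROM stg_payroll ORDER BY employee_id"
    ).fetchall()

rows = run_elt(sqlite3.connect(":memory:"))
print(rows)
```

In production the same shape scales up: Fivetran or API ingestion scripts do the Load into Snowflake, and dbt owns the SQL transforms with testing and lineage on top.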

Requirements For Data Engineer

Key skills: Python · Kubernetes
  • Minimum of 3 years' experience as a Data Engineer or in a similar role
  • Strong proficiency in Python and SQL
  • Experience with cloud-based data warehouses
  • Proficiency in designing efficient database schemas
  • Familiarity with Apache Airflow
  • Experience with data streaming and Change Data Capture
  • Proficiency in Terraform and GitHub Actions
  • Experience in setting up PII anonymization and RBAC
  • Strong ability to work with cross-functional teams
  • Excellent analytical skills with attention to detail

Benefits For Data Engineer

Equity
  • Stock grant opportunities
  • Remote work flexibility
  • WeWork access (optional)
  • Additional perks based on employment status and country
