Data Engineer

Digital banking platform offering free accounts, investments, shopping services, and financial solutions

Description For Data Engineer

Inter&Co is a comprehensive digital banking platform that's revolutionizing financial services in Brazil. We offer a free digital account, investment platform, shopping services, and various financial solutions. As a Data Engineer at Inter&Co, you'll be at the forefront of building and maintaining our data infrastructure that powers our financial services.

You'll work with cutting-edge technologies including AWS services, big data tools like Apache Spark and Kafka, and both relational and non-relational databases. Your role will be crucial in developing and implementing data pipelines that handle large-scale data processing and storage, essential for our banking operations.

The ideal candidate should have strong technical skills in Python and SQL, experience with AWS services, and a deep understanding of data engineering concepts. You'll be part of a collaborative team environment where you'll interact with data scientists, software engineers, and business analysts.

We offer a comprehensive benefits package and foster an inclusive work environment that values diversity and individual growth. At Inter&Co, we believe in making financial services simpler and more accessible, and we're looking for passionate individuals who share our vision of transforming the banking industry.

Join us if you're excited about working with large-scale data systems, enjoy solving complex problems, and want to be part of a company that's reshaping the future of digital banking.


Responsibilities For Data Engineer

  • Develop and implement data collection, storage, processing, and consumption services
  • Create, maintain, and monitor data pipelines for large-scale data collection, storage, processing, and consumption
  • Provision data-related infrastructure using Terraform
  • Integrate, operate, and manage batch and real-time data communication services
  • Provide data application and service integrations between systems

Requirements For Data Engineer

  • Proficiency in Python and SQL
  • Experience with big data tools such as Apache Spark (PySpark), Apache Kafka, and Apache Airflow
  • Proven knowledge of the main AWS services used in data engineering
  • Experience with relational (MySQL, PostgreSQL) and non-relational (MongoDB) databases
  • Understanding of data lake and data warehouse concepts
  • Knowledge of distributed architecture and data streaming concepts
  • Basic knowledge of AWS security practices
  • Problem-solving skills for infrastructure and data processing
  • Experience with batch and streaming architectures
  • Familiarity with version control systems and clean code practices
  • Experience with web services and API integrations
  • Good communication skills for team collaboration

Benefits For Data Engineer

  • Food Allowance/Meal Allowance
  • Health and dental insurance
  • Life insurance
  • Transportation allowance
  • Duo Gourmet
  • Wellhub
