Mid-Level GCP Data Developer

Tech transformation specialists with 30 years of experience, supporting large companies through technology and business change by combining AI with human expertise.
Campinas, State of São Paulo, Brazil
Data
Mid-Level Software Engineer
Remote
1,000 - 5,000 Employees
3+ years of experience
Finance · Enterprise SaaS

Description For Mid-Level GCP Data Developer

CI&T is seeking a Mid-Level GCP Data Developer to join their Finance Digital project. As a tech transformation specialist with 30 years of experience and 6,000+ employees across 10 countries, CI&T combines AI's disruptive power with human expertise to help major companies navigate technology and business changes.

The role involves working on an exciting partnership with Google Cloud to build a modern data pipeline for efficient ingestion, processing, and storage of large data volumes. You'll be responsible for understanding data sources, implementing automated pipelines in BigQuery, orchestrating data flows, and ensuring data integrity.
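
For illustration only, the snippet below is a minimal sketch of what one ingestion-and-transformation step in such a pipeline could look like using the google-cloud-bigquery Python client. The project ID, bucket, and table names are placeholders invented for the example, not details from the posting.

```python
# Minimal sketch of a BigQuery ingestion and transformation step.
# All project, bucket, and table names below are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-finance-project")  # placeholder project ID

# Load newline-delimited JSON files from Cloud Storage into a raw table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                    # infer the schema from the files
    write_disposition="WRITE_APPEND",   # append to the existing raw table
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders/*.json",        # placeholder GCS path
    "my-finance-project.raw.orders",           # placeholder destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

# Transform the raw data into a curated table with standard SQL.
transform_sql = """
CREATE OR REPLACE TABLE `my-finance-project.curated.daily_orders` AS
SELECT order_id, customer_id, DATE(created_at) AS order_date, SUM(amount) AS total_amount
FROM `my-finance-project.raw.orders`
GROUP BY order_id, customer_id, order_date
"""
client.query(transform_sql).result()
```

In a production pipeline, steps like these would typically run inside an orchestrated workflow rather than as a standalone script.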

The ideal candidate should have strong experience with GCP, proficiency in SQL and Python, and knowledge of tools like BigQuery, Dataflow, Cloud Composer, and Cloud Spanner. Additional experience with Salesforce, machine learning, AI, and data visualization tools like Looker Studio would be advantageous.

CI&T offers a comprehensive benefits package including health and dental insurance, meal allowance, childcare assistance, extended paternity leave, profit sharing, and various learning and development opportunities through CI&T University and language learning platforms.

The position is remote-based in Brazil, with occasional presence required at the Campinas office. CI&T values diversity and inclusion, strongly encouraging applications from underrepresented communities to create an innovative and transformative work environment.


Responsibilities For Mid-Level GCP Data Developer

  • Understand data sources and how they connect to business metrics, ensuring data quality and relevance
  • Implement automated data pipelines in BigQuery for data ingestion and transformation
  • Create and manage automated processes that orchestrate data ingestion and transformation flows (see the orchestration sketch after this list)
  • Configure and manage monitoring systems to ensure data integrity, quality, and real-time availability
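
As a hedged illustration of the orchestration responsibility above, the sketch below shows how ingestion and transformation steps might be wired together as a Cloud Composer (Apache Airflow) DAG. The DAG id, schedule, bucket, and table names are assumptions made for the example, not part of the role description.

```python
# Illustrative Cloud Composer (Airflow 2.x) DAG orchestrating ingestion and
# transformation. All identifiers below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="finance_orders_pipeline",   # placeholder DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Step 1: land raw files from Cloud Storage into a BigQuery staging table.
    ingest = GCSToBigQueryOperator(
        task_id="ingest_raw_orders",
        bucket="my-bucket",                                   # placeholder bucket
        source_objects=["raw/orders/*.json"],
        destination_project_dataset_table="my-finance-project.raw.orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    # Step 2: build the curated table once the load has finished.
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `my-finance-project.curated.daily_orders` AS "
                    "SELECT order_id, DATE(created_at) AS order_date, SUM(amount) AS total_amount "
                    "FROM `my-finance-project.raw.orders` GROUP BY order_id, order_date"
                ),
                "useLegacySql": False,
            }
        },
    )

    ingest >> transform
```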

Requirements For Mid-Level GCP Data Developer

  • Experience in data development on cloud platforms, especially Google Cloud Platform (GCP)
  • Proficiency in SQL and Python
  • Knowledge of GCP tools such as BigQuery, Dataflow, Cloud Composer, and Cloud Spanner (see the sketch after this list)
  • Team player with clear and efficient communication skills
  • Passion for technology and continuous learning
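
As an example of the SQL and Python proficiency listed above, the sketch below runs a simple data-quality check against a BigQuery table, the kind of integrity monitoring the responsibilities describe. The table, column, and project names are placeholders.

```python
# Minimal data-quality check combining SQL and Python via the BigQuery client.
# Table and column names are illustrative placeholders.
from google.cloud import bigquery

def check_no_duplicate_orders(client: bigquery.Client, table: str) -> None:
    """Fail loudly if the curated table contains duplicate order_id rows."""
    sql = f"""
    SELECT order_id, COUNT(*) AS n
    FROM `{table}`
    GROUP BY order_id
    HAVING COUNT(*) > 1
    """
    duplicates = list(client.query(sql).result())
    if duplicates:
        raise ValueError(f"{len(duplicates)} duplicate order_id values found in {table}")

if __name__ == "__main__":
    client = bigquery.Client(project="my-finance-project")  # placeholder project ID
    check_no_duplicate_orders(client, "my-finance-project.curated.daily_orders")
```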

Benefits For Mid-Level GCP Data Developer

  • Health and dental insurance
  • Meal allowance
  • Childcare assistance
  • Extended paternity leave
  • Gympass
  • Profit sharing
  • Life insurance
  • Mental health platform partnership
  • CI&T University
  • Discount club
  • Support program: psychological guidance, nutrition counseling, and more
  • Pregnancy and parenting course
  • Online course platform partnerships
  • Language learning platform


Jobs Related To CI&T Mid-Level GCP Data Developer

Data Developer

Remote Data Engineer position at CI&T focusing on PostgreSQL to AWS DynamoDB migration, data infrastructure optimization, and cross-functional collaboration.

Mid-Level Data Developer

Mid-Level Data Developer position at CI&T, working remotely with financial sector clients, focusing on data pipeline management and ETL processes.

Mid-level Data Developer

CI&T is hiring a Mid-level Data Developer for global client projects, focusing on data lifecycle management and analysis.

Business Intelligence Engineer II, Security Assurance Engineering

AWS Security seeks Business Intelligence Engineer II to develop data-driven solutions and analytics for cloud security initiatives.

Data Engineer II, ROW Central Data Engineering Team

Data Engineer II position at Amazon's ARTS team, focusing on analytics and data engineering for international operations, requiring 4+ years of experience.