Come join the SBSEG Data Engineering team at IDC as a Software Engineer 2. We are leveraging big data technologies to gain new insights into our QuickBooks customer experience. Our team's stack includes Hadoop, Spark, and AWS Data Services. We foster an open team environment where we value direct interactions and working code over working in a cave.
Responsibilities:
• Apply a strong understanding of data engineering and dimensional design fundamentals; write effective SQL; build integrations (ETL) and front-end analysis/data visualization; learn new technologies quickly.
• Apply a good understanding of data warehouse schema design and data granularity.
• Apply data federation techniques and aggregate data at scale from multiple source systems.
• Design and develop ETL pipelines across multiple platforms and tools, including Spark, Hadoop, and AWS Data Services.
• Gather functional requirements, develop technical specifications, and handle project and test planning.
• Work with business users to develop and refine analytical requirements for quantitative data (view-through, clickstream, acquisition, product usage, transactions), qualitative data (surveys, market research), and unstructured data (blogs, social networks).
• Design and develop schema definitions and support data warehouses/marts to integrate disparate data sources from within and outside Intuit, aggregate them, and make them available for analysis.
• Support large data volumes and accommodate flexible provisioning of new sources.
• As a key member of the team, drive adoption of new technologies, tools, and process improvements to build world-class analytical capabilities for web analytics, optimization, experimentation, and personalization.
• Resolve defects/bugs during QA testing, pre-production, production, and post-release patches.
• Work cross-functionally with various Intuit teams: Product Management, Project Management, Data Architects, Data Scientists, Data Analysts, Software Engineers, and other Data Engineers.
• Contribute to the design and architecture of projects across the data landscape.
• Work within Agile development methodologies such as Scrum or Extreme Programming.
• Help align work to overall strategies and reconcile competing priorities across the organization.
Qualifications:
• BS/MS in Computer Science or equivalent work experience.
• Experience developing data pipelines on cloud service providers such as AWS, Azure, or GCP.
• 2+ years of experience developing data pipelines using Spark and Hive.
• Experience developing complex data models/dimensional models on a data lake.
• The ideal candidate has 3+ years of experience in end-to-end data warehouse implementations, including at least 2 projects with 4TB+ data volume.
• Good knowledge of operating systems (Unix or Linux).
• Hands-on experience in at least one programming or scripting language (shell scripting, Python, Java, etc.).
• Must have been through several full-lifecycle data warehousing implementations, with involvement in scalability- and performance-related design for data lakes and ETL.
• Solid communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences.
• Demonstrated understanding of the software design and architecture process.
• Experience with unit testing and automated data quality checks.
• Results-oriented, self-motivated, accountable, and able to work under minimal supervision.
• Excellent written and oral communication and presentation skills.
Good to have:
• Understanding of data mesh and microservices architecture.
• Knowledge of the big data ecosystem (Hadoop MapReduce, Pig, Hive) is a strong plus.
• Conceptual understanding of machine learning is a plus.
• Good understanding of a reporting tool such as Tableau, Pentaho, or Jasper is a big plus.
• Experience in the design, development, and deployment of one or more tools: ETL (Informatica, OWB, ODI) or reporting (Business Objects, QlikView, Tableau).
• Understanding of how to build highly resilient, fault-tolerant data platforms that support 1000+ data applications.
• Programming languages such as Java.
Intuit offers a flexible work environment, allowing for hybrid work options that blend the best of in-person collaboration and the flexibility of virtual work.