Expert Support Engineer / Product Specialist : Hadoop SME

Acceldata is revolutionizing how enterprises manage and observe data by offering comprehensive data observability solutions tailored to each organization's unique needs.
Staff Software Engineer · In-Person · 10+ years of experience · AI · Enterprise SaaS
Description For Expert Support Engineer / Product Specialist : Hadoop SME

Acceldata is revolutionizing data observability for enterprises, offering tailored solutions for managing and observing data. As a Product Specialist, you'll play a crucial role in resolving complex customer challenges across various data integrations, including Data Catalogs, Databases, Data Lakes/Lakehouses, Orchestration, SQL Query and Analytics Engines, Security, Streaming, Data Warehouses, and Business Intelligence platforms. You'll focus on delivering faster resolutions and ensuring seamless data operations, enabling customers to monitor, predict, and optimize their data systems for impactful business outcomes.

The ideal candidate has 10+ years of experience in customer-facing support, post-sales, technical architecture, or consulting roles, with a focus on cloud-native environments and Data Lakes. Proficiency in cloud platforms (AWS, GCP, Azure), Kubernetes, Docker, and the core Hadoop components is required. You'll need excellent communication skills and the ability to work under pressure while demonstrating empathy for customer situations.

Working at Acceldata, you'll collaborate with innovative minds in data observability, solve complex technical problems, and gain invaluable experience in managing distributed, scalable environments. You'll be responsible for ensuring seamless support operations for Gold and Enterprise customers, providing proactive solutions that empower organizations to maximize their data infrastructure.

This role offers the opportunity to work with cutting-edge data solutions in a fast-paced, dynamic environment, making a lasting impact on customer success and Acceldata's evolution in data observability.

Last updated 6 months ago

Responsibilities For Expert Support Engineer / Product Specialist : Hadoop SME

  • Resolve complex customer challenges across various data integrations
  • Deliver faster resolutions and ensure seamless data operations
  • Work with customers to understand use cases and identify pain points
  • Provide operations expertise leveraging knowledge of Data Quality, Cost Optimization, and Data Reliability
  • Manage support cases for Gold and Enterprise customers
  • Troubleshoot intricate technical issues
  • Provide proactive solutions to empower organizations

Requirements For Expert Support Engineer / Product Specialist : Hadoop SME

  • 10+ years of experience in customer-facing support, post-sales, technical architecture, or consulting roles
  • Bachelor's degree
  • Proficiency in cloud platforms (AWS, GCP, Azure)
  • Expertise in Kubernetes, Docker, and Hadoop components (HDFS, YARN, Spark, Hive, HBase, Ranger, Kafka)
  • Knowledge of Linux distributions (RHEL, Ubuntu, SUSE)
  • Excellent communication skills (written and verbal)
  • Ability to handle pressure and demonstrate empathy
