Engineering Manager, Inference Scalability and Capability

Anthropic creates reliable, interpretable, and steerable AI systems, focusing on safe and beneficial AI development for users and society.
$320,000 - $405,000
Distributed Systems
Staff Software Engineer
Hybrid
5+ years of experience
AI

Description For Engineering Manager, Inference Scalability and Capability

Anthropic is seeking an Engineering Manager to lead its Inference Scalability and Capability team, which builds and maintains the critical systems that serve its large language models. The role combines technical leadership with team management and requires expertise in distributed systems and ML infrastructure. It offers competitive compensation ($320,000-$405,000) and benefits, including visa sponsorship. Based in San Francisco with a hybrid work model, Anthropic operates as a public benefit corporation dedicated to creating safe, reliable AI systems.

The role requires 5+ years of experience leading distributed systems teams and involves partnering with research, infrastructure, and product teams to optimize LLM inference systems and develop new capabilities. The company emphasizes collaborative research and values diverse perspectives in advancing its mission of developing steerable, trustworthy AI. Anthropic operates as a cohesive team focused on large-scale research efforts, viewing AI research as an empirical science comparable to physics and biology. The position offers the opportunity to work on cutting-edge AI technology while contributing to the responsible advancement of AI capabilities.


Responsibilities For Engineering Manager, Inference Scalability and Capability

  • Build and lead a high-performing team of engineers through technical mentorship, strategic hiring, and creating an environment that fosters innovation
  • Drive operational excellence of inference systems across cloud providers
  • Facilitate development of advanced inference features
  • Partner with research teams to productionize new models
  • Create clear technical roadmaps and execution strategies

Requirements For Engineering Manager, Inference Scalability and Capability

  • Experience with Kubernetes
  • 5+ years of experience leading large-scale distributed systems teams
  • Excellence in building high-trust environments
  • Demonstrated ability to recruit, scale, and retain engineering talent
  • Outstanding communication and leadership skills
  • Deep commitment to advancing AI capabilities responsibly
  • Strong technical background
  • Bachelor's degree in a related field or equivalent experience

Benefits For Engineering Manager, Inference Scalability and Capability

  • Medical insurance
  • Visa sponsorship
  • Competitive compensation and benefits
  • Optional equity donation matching
  • Generous vacation and parental leave
  • Flexible working hours
  • Office space for collaboration

Jobs Related To Anthropic Engineering Manager, Inference Scalability and Capability

Staff Software Engineer, Transactional Storage Services

Staff Software Engineer position at Airbnb focusing on building and maintaining distributed database systems and storage services.

vSphere High Availability: Software Engineer 5

Staff Software Engineer position at Broadcom focusing on vSphere High Availability development, requiring expertise in distributed systems and C++ programming.

Distributed Systems Engineer (L5) - Compute Abstractions

Staff-level Distributed Systems Engineer position at Netflix, focusing on cloud infrastructure and compute abstractions, offering remote work and competitive compensation.

Staff Software Engineer - Systems Infrastructure

Staff Software Engineer position at LinkedIn focusing on building next-generation distributed systems infrastructure with competitive compensation and comprehensive benefits.

Software Engineering Manager, Network Load Balancing

Lead software engineering manager position at Google, focusing on Network Load Balancing systems, requiring 8+ years of development experience and strong leadership skills.