LinkedIn, the world's largest professional network, is seeking a Staff AI Scientist to drive cutting-edge AI research and development. The role focuses on advancing LinkedIn's capabilities in large-scale foundation models, particularly Large Language Models (LLMs).
The position offers an exciting opportunity to work at the intersection of AI research and engineering, where you'll be responsible for developing and optimizing foundation models that serve LinkedIn's global user base. You'll work with state-of-the-art deep learning architectures, including multi-billion-parameter transformers, and scale them efficiently.
As a Staff AI Scientist, you'll lead research initiatives, publish findings in top AI venues, and collaborate with world-class researchers and engineers. The role involves complex problem-solving in areas such as distributed training, model parallelism, and system co-design. You'll be instrumental in pushing the boundaries of AI at scale while balancing research innovation with practical implementation.
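To make one of those areas concrete: distributed training in this setting generally means synchronizing gradient updates across many accelerators. The sketch below, assuming a PyTorch environment launched with torchrun and using its DistributedDataParallel wrapper, shows the basic data-parallel pattern; the model, dimensions, and hyperparameters are placeholders for illustration, not a description of LinkedIn's internal stack.

# Minimal, generic sketch of distributed data-parallel training (launch with torchrun).
# All names and sizes here are illustrative assumptions, not LinkedIn's tooling.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each worker process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; a real run would load a multi-billion-parameter transformer,
    # typically combined with model/tensor parallelism rather than data parallelism alone.
    model = torch.nn.Linear(4096, 4096).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(8, 4096, device=local_rank)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()   # gradients are all-reduced across workers during backward
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

In practice, roles like this combine such data parallelism with model parallelism and careful system co-design so that training very large models remains efficient.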
The ideal candidate brings a strong background in machine learning, demonstrated through 4+ years of experience in AI engineering and algorithmic solutions. You'll need expertise in programming languages such as Python and Java, along with a deep understanding of large-scale model training and optimization techniques.
LinkedIn offers a hybrid work environment, competitive compensation ($164,000 - $268,000), and comprehensive benefits including health coverage and equity. You'll be part of a company culture that values innovation, collaboration, and professional growth, with opportunities to mentor others and contribute to groundbreaking AI advancements.
This role presents a unique opportunity to impact how millions of professionals connect and advance their careers through AI technology. You'll work on challenging problems at scale while contributing to LinkedIn's mission of creating economic opportunity for every member of the global workforce.