Google's Content Safety Platform (CSP) is seeking a Data Scientist to join its User Protection organization. The role is central to protecting Google's users from abuse, account compromise, and other online harms, and involves working with AI-powered tools to evaluate and improve content safety classifiers across Google's products.
The role sits within the Content Safety Platform pillar, which develops scalable tools for user protection. As a Data Scientist, you'll collaborate with product and engineering teams to enhance the quality of protection systems. You'll be part of a larger data science team in Core, offering extensive opportunities for knowledge sharing and professional growth.
The position requires strong analytical skills and expertise in machine learning. You'll evaluate content safety classifiers and develop measurement strategies for understanding how these systems perform. Product safety is critical across Google's services, especially for newer AI tools. The role offers competitive compensation, including base salary, bonus, equity, and comprehensive benefits.
Key responsibilities include collaborating with stakeholders to translate business questions into actionable analyses, designing and evaluating mathematical models, and managing data infrastructure. You'll need proficiency in Python and SQL, along with experience in AI/ML and statistical analysis. The ideal candidate holds a Master's degree in a quantitative field and has hands-on experience with analytics and machine learning classifier evaluation.
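To make the classifier-evaluation aspect of the role concrete, here is a minimal sketch of the kind of analysis the posting alludes to: scoring a content safety classifier against labeled examples using precision, recall, and PR-AUC. The data, threshold, and metric choices below are illustrative assumptions, not details from the posting.

```python
# Illustrative sketch only: the posting mentions classifier evaluation in
# Python, but the data, threshold, and metrics here are assumptions.
import numpy as np
from sklearn.metrics import precision_score, recall_score, average_precision_score

# Hypothetical labeled sample: 1 = policy-violating content, 0 = benign.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
# Hypothetical classifier scores in [0, 1].
y_scores = np.array([0.92, 0.10, 0.75, 0.40, 0.30, 0.05, 0.88, 0.60])

# Apply a decision threshold to turn scores into binary predictions.
threshold = 0.5
y_pred = (y_scores >= threshold).astype(int)

print(f"Precision @ {threshold}: {precision_score(y_true, y_pred):.2f}")
print(f"Recall    @ {threshold}: {recall_score(y_true, y_pred):.2f}")
# Threshold-free summary of the precision-recall trade-off.
print(f"PR-AUC: {average_precision_score(y_true, y_scores):.2f}")
```

The precision-recall trade-off is especially important in abuse detection: missed violations (low recall) leave users exposed, while false positives (low precision) can suppress legitimate content.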
This is an excellent opportunity for someone passionate about using data science to improve user safety at scale. You'll be at the forefront of protecting users across Google's products while working with cutting-edge AI technology, in a role that offers both deep technical challenges and the satisfaction of contributing to user protection worldwide.