Google is seeking a Software Engineer for its Egregious Abuse Protection team. This role is part of the Core team, which builds the technical foundation behind Google's flagship products. The ideal candidate will help create a safer world by preventing abuse and harm across Google products.
As a Software Engineer in this role, you will develop centralized abuse-fighting systems, build critical infrastructure components, and create distributed systems that handle Google-scale data. You will collaborate with a range of stakeholders, including Product Management and critical Product Areas such as Trust and Safety, Photos, Workspace, and YouTube.
Key responsibilities include developing systems to detect egregious material and bad actors, building repositories of centralized signals for identifying non-consensual content and other harm types, and creating tools to help reviewers assess and deliver enforcement decisions in a proportionate and privacy-sensitive manner.
The ideal candidate has a strong background in software development, data structures, and algorithms. Experience with Go, Python, C++, abuse detection, and machine learning is preferred. You should be able to navigate ambiguous, open-ended problems and drive them to resolution, communicate clearly, and work with a high degree of self-direction.
This role offers the opportunity to make a significant impact on Google's products and to help build a safer online environment for users worldwide. Join Google's Core team and be at the forefront of developing solutions that combat online abuse and protect users.