
How do you convey impact for non-product, technical efforts like adding tests and fixing crashes?

Anonymous User at Taro Community · 2 years ago

For purely technical projects and platform work, how do you convey the impact to leadership or a performance review committee? During performance reviews, do managers care about the impact of technical projects?

One thing we try to do is treat platform teams as enablers of delivery teams, i.e., any platform work should impact product work directly. Sometimes we can quantify direct impact, but most of the time the impact is indirect, which is hard to back with data and quantify.

Another angle is that delivery team engineers are the customers of the platform team: if we enable them to deliver valuable features faster and cheaper, we succeed. Again, this is hard to quantify with concrete data.

So my overall question is: how do you link technical projects like app optimization and performance improvements to direct product and business impact, and quantify that impact?


Discussion (3 comments)
    Executive Coach, VP Eng at Mixpanel
    2 years ago

    Here are a few ideas for how to think about the impact of technical projects:

    1. Consider how well they align to the overall goals of your team and the business. Are customers complaining or churning because of reliability issues? Is performance a big differentiator for your product? This is likely to influence how much your management team cares about this type of work.
    2. As you mentioned, sometimes the customer of your work is other engineers at the company. It never hurts to get feedback from your customers. If your work moved the needle in a big way for a few people, it should be easy to get feedback to back that up. If your work made a small difference for lots of people, it might be easier to measure in aggregate.
    3. Lastly, it's just true that the impact of tech debt projects is often hard to quantify. That's why it helps to have an eng leadership team that understands software development and has good judgement when it comes to technical impact. You can't run a team on numbers alone!
    Senior Manager at Zoox; Meta, Snap, Google
    2 years ago
    • I definitely agree that we should focus on business impact when evaluating any type of work. Another important factor is complexity. Ideally, both of these factors should be covered in a performance review packet to justify an engineer's work.

    • Coming back to your question, the impact of Better Engineering (BE) work can be measured in terms of its influence on the business. There are usually two kinds of BE project impact, internal-facing and external-facing:

      • Internal: Saving engineering time by simplifying commonly executed tasks (like adding new functionality faster, as you mentioned).
        • This impact is usually measured as X engineering hours saved per day/week/month. It doesn't need to be a precise number, just a rough estimate of the average time it took to complete such a task before and after the corresponding BE project (see the sketch after this comment).
      • External: Increasing the product's reliability, decreasing latency, etc., via more robust integration tests, bug fixes, or refactoring.
        • This impact is usually measured by its influence on end users, e.g., eliminating Y% of crashes or decreasing cold start time by Z%.
    • I also want to mention that some companies even put a special emphasis on BE projects. E.g., Facebook has a dedicated axis in their calibrations called Engineering Excellence, so they expect each engineer to regularly dedicate effort to projects in this area.
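
To make the "X engineering hours saved" estimate concrete, here is a minimal sketch of the kind of back-of-the-envelope math involved. All task names and numbers are hypothetical placeholders, not real data; substitute your own before/after estimates.

```python
# Rough, hypothetical estimate of internal BE impact as saved engineering hours.
# Every number here is made up for illustration.
hours_before_per_task = 6.0   # avg. time to add a new integration before the platform work
hours_after_per_task = 2.0    # avg. time for the same task after the platform work shipped
tasks_per_month = 10          # how often delivery teams perform this task
num_teams = 4                 # teams that benefit from the improvement

saved_hours_per_month = (hours_before_per_task - hours_after_per_task) * tasks_per_month * num_teams
print(f"~{saved_hours_per_month:.0f} engineering hours saved per month")
# -> ~160 engineering hours saved per month
```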

    Robinhood, Meta, Course Hero, PayPal
    2 years ago
    • For technical issues that actually affect the user experience (crashes, ANRs, etc.), you can wrap them in an experiment to measure the impact. You can bundle similar fixes together to make the deltas clearer and/or use a long-term holdout group to observe the user impact over time (a rough sketch of this comparison follows this list).
    • Unfortunately, for a lot of the more purely technical efforts (e.g. refactoring code), you need subjective feedback, especially initially. Ask relevant peers whether the effort improved their engineering quality of life, and you can inline that feedback in your performance review packet.
    • You can generate "metrics" for qualitative feedback by running surveys, which is what we did back at Meta. Every half, we would run a survey asking engineers to rate various engineering pillars from good to bad (pillar examples: oncall system, code velocity, debuggability), and we used this to track engineering system improvements over time. If you were running one of these Better Engineering efforts, your goal was to generate an increased survey score over time for the pillar connected to your project (a sketch of this kind of aggregation is also below).
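
To make the experiment idea above concrete, here is a minimal sketch of comparing crash rates between a long-term holdout (control) and the population that received the bundled fixes (treatment). The session and crash counts are hypothetical placeholders; in practice these numbers would come from your experimentation platform, along with proper significance testing.

```python
# Hypothetical crash-rate comparison between a holdout group and the fixed group.
control = {"sessions": 500_000, "crashed_sessions": 4_100}     # long-term holdout
treatment = {"sessions": 500_000, "crashed_sessions": 2_900}   # received the bundled fixes

control_rate = control["crashed_sessions"] / control["sessions"]
treatment_rate = treatment["crashed_sessions"] / treatment["sessions"]
relative_reduction = (control_rate - treatment_rate) / control_rate

print(f"Crash rate: {control_rate:.2%} -> {treatment_rate:.2%} "
      f"({relative_reduction:.0%} relative reduction)")
# -> Crash rate: 0.82% -> 0.58% (29% relative reduction)
```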
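And here is a small sketch of how qualitative survey responses can be turned into a per-pillar "metric" you can track half over half. The pillar names and 1-5 ratings are invented for illustration; the exact survey format at any given company will differ.

```python
# Hypothetical half-over-half survey scores per engineering pillar (1 = bad, 5 = good).
from statistics import mean

surveys = {
    "H1": {"oncall": [2, 3, 2, 3], "code velocity": [3, 4, 3], "debuggability": [2, 2, 3]},
    "H2": {"oncall": [3, 4, 3, 4], "code velocity": [3, 4, 4], "debuggability": [3, 3, 4]},
}

for half, pillars in surveys.items():
    summary = ", ".join(f"{pillar}: {mean(ratings):.1f}" for pillar, ratings in pillars.items())
    print(f"{half} -> {summary}")
# A rising score on the pillar your project targets becomes the number you cite.
```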
