It is increasingly evident that if researchers and policymakers want to meaningfully develop an understanding of responsible innovation, we must first ask whether some sociotechnical systems should be developed at all. Here I argue that systems like facial recognition, predictive policing, and biometrics are predicated on myriad human prejudicial biases and assumptions, which must be named and interrogated prior to any innovation. Further, the notions of individual responsibility inherent in discussions of technological ethics and fairness overburden marginalized peoples with a demand to prove the reality of their marginalization. Instead, we should focus on equity and justice, valuing the experiential knowledge of marginalized peoples and optimally positioning them to enact deep, lasting change. My position aligns with work in Science, Technology, and Society (STS) that centers diverse and situated knowledges, and it joins calls for science and engineering to consider wider sociocultural concerns such as justice and equality.