Public-Interest Audits

Testing AI to protect people, the environment and democracy

Public-Interest Audits (PIAs) are projects in which we bring AI impact data and issues to the spotlight not in response to a request from an impacted community, as we do with Community-Led Audits (CLAs), but because the Eticas team or our partners believe a specific issue deserves prompt attention, or as part of our broader research into AI impacts.

While all our audits serve the public interest, this program allows us to undertake more experimental projects, leveraging different methods and approaches; to bring the Eticas team and Board together around research that is relevant to us and our partners; and to respond to new events or AI applications in record time.

List of PIAs already published:

Lawmaker or Lawbreaker? How FaceNet Got It Wrong

FaceNet's errors reveal AI's potential for misidentification, highlighting cases where even prominent figures were incorrectly flagged. This article discusses the implications for privacy and security in facial recognition technology.

BadData: The High Cost of Poor Data Quality

Uncovering the hidden risks of flawed data, this article shows how errors, biases, and outdated information can distort decision-making, fueling predictions that may shape lives in unexpected and sometimes irreversible ways.

FemTech: My body, my data, their rules

Exploring the privacy risks in femtech, this article reveals how personal health data in menstrual and fertility tracking apps is often exploited, raising concerns over data ownership, consent, and regulatory gaps.

Name Your Bias: AI's Fairness Challenge in Hiring

Exploring AI's role in hiring, this article delves into bias challenges within automated recruitment tools and their impact on fair hiring practices.

Community-Led AI Audits: Methodology for Placing Communities at the Center of AI Accountability


Our Public-Interest Audits go beyond traditional assessments, providing actionable insights and benchmarks that drive AI transparency and accountability across industries. By holding AI systems to higher standards, Eticas Foundation empowers organizations to make responsible improvements that serve and protect the public interest, fostering a digital future that benefits society as a whole.

Do you know of a system we should audit?

At Eticas, we are committed to digital transparency and justice. If you have information about applications or systems you believe should be investigated, reach out to us.