Fundamental Rights Impact Assessment

Complete a Fundamental Rights Impact Assessment (FRIA) before deploying a high-risk AI system for social housing benefits.

What Is a Fundamental Rights Impact Assessment?

Learn how Article 27 of the EU AI Act requires certain deployers of high-risk AI systems, including public bodies, to conduct a Fundamental Rights Impact Assessment before first use. Then complete a FRIA for an automated social housing benefit eligibility system, assessing its impacts on non-discrimination, privacy, the right to an effective remedy, and human dignity.

Fundamental Rights Impact Assessment — Training Steps

  1. Article 27: Fundamental Rights Impact Assessment

    Article 27 of the EU AI Act requires certain deployers of high-risk AI systems - public bodies, private entities providing public services, and deployers of specific Annex III systems such as credit scoring - to conduct a Fundamental Rights Impact Assessment before first use. A FRIA examines how the AI system might affect fundamental rights: dignity, non-discrimination, privacy, freedom of expression, and the right to an effective remedy. A FRIA differs from a GDPR Data Protection Impact Assessment (DPIA): a DPIA focuses on risks arising from the processing of personal data, while a FRIA covers the full range of fundamental rights. Both assessments may be required for the same system, and they can be conducted together.

  2. Deployment Details

    An email arrives from Elena Novak, the project manager responsible for the social housing AI deployment. She provides the details Alice needs to begin the FRIA.

  3. FRIA Guidance Document

    Before starting the assessment, Alice opens the FRIA guidance document linked in Elena's email. It explains each section of the assessment and the difference between a FRIA and a DPIA, and offers practical tips for completing each part thoroughly.

  4. Open the FRIA Form

    With the guidance reviewed, Alice clicks Continue to open the FRIA checklist - five sections covering system description, fundamental rights analysis, risk assessment, mitigation measures, and the monitoring plan.

  5. Section 1: System Description

    The first section asks Alice to document the AI system's purpose, the people it affects, and the scope of its deployment. For this system, the affected persons include:

    - Benefit applicants - many in vulnerable financial situations, directly affected by eligibility decisions.
    - Case workers - who rely on the AI's recommendations to issue final decisions.
    - Applicants' families - who depend on the housing benefits the applicant is seeking.

    The system makes decisions that directly affect access to housing - a fundamental need. A minimal sketch of how these entries could be captured as a structured record follows.
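
    The sketch below is illustrative only: the SystemDescription and AffectedGroup classes and their field names are assumptions for this example, not part of any official FRIA template.

```python
from dataclasses import dataclass, field

@dataclass
class AffectedGroup:
    name: str      # e.g. "Benefit applicants"
    exposure: str  # how the group is affected by the system's decisions

@dataclass
class SystemDescription:
    purpose: str
    deployment_scope: str
    affected_groups: list[AffectedGroup] = field(default_factory=list)

# Section 1 entries for the social housing benefit system described above.
section_1 = SystemDescription(
    purpose="Automated eligibility recommendations for social housing benefits",
    deployment_scope="All new benefit applications handled by the deploying authority",
    affected_groups=[
        AffectedGroup("Benefit applicants", "directly affected by eligibility decisions"),
        AffectedGroup("Case workers", "rely on the AI's recommendations to issue final decisions"),
        AffectedGroup("Applicants' families", "depend on the housing benefits the applicant is seeking"),
    ],
)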

  6. Section 2: Fundamental Rights Analysis

    This is the most critical section of the FRIA. Alice must assess which fundamental rights the AI system could affect:

    - Right to non-discrimination - the AI may discriminate based on family composition, nationality, or residential address, using these as proxy variables for protected characteristics (see the sketch after this list).
    - Right to privacy - the system processes sensitive personal data including income, employment, and family information.
    - Right to an effective remedy - applicants must be able to challenge AI-driven rejections through a clear, accessible appeal process.
    - Right to human dignity - automated decisions about housing access affect a fundamental aspect of a person's life and dignity.
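
    The proxy-variable concern can be illustrated with a small, self-contained sketch that checks how unevenly one group is distributed across the values of a candidate feature. The postcode/nationality data and the heuristic are hypothetical; a real assessment would use proper statistical testing on representative data.

```python
from collections import Counter

def group_share_by_feature(records, feature, protected_attr, group):
    """Share of one protected group among records for each value of `feature`.

    A large spread across feature values suggests the feature may act as a
    proxy for the protected characteristic. Illustrative heuristic only.
    """
    totals = Counter(r[feature] for r in records)
    in_group = Counter(r[feature] for r in records if r[protected_attr] == group)
    return {value: in_group[value] / count for value, count in totals.items()}

# Hypothetical applicant records: does postcode track nationality?
records = [
    {"postcode": "1011", "nationality": "A"},
    {"postcode": "1011", "nationality": "A"},
    {"postcode": "1011", "nationality": "B"},
    {"postcode": "2022", "nationality": "B"},
    {"postcode": "2022", "nationality": "B"},
    {"postcode": "2022", "nationality": "B"},
]
print(group_share_by_feature(records, "postcode", "nationality", "B"))
# {'1011': 0.33, '2022': 1.0} -> postcode is strongly associated with nationality here
```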

  7. Section 3: Risk Assessment

    Alice scores each rights impact for severity and likelihood. Housing-access denial is high severity; likelihood depends on bias controls. The combined rating drives which mitigations get priority.
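
    The AI Act does not prescribe a scoring scheme, so the sketch below assumes a simple 3x3 severity-by-likelihood matrix with illustrative thresholds to show how a combined rating can drive prioritisation.

```python
SEVERITY = {"low": 1, "medium": 2, "high": 3}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}

def risk_rating(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood into a priority band (illustrative 3x3 matrix)."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 6:
        return "high priority"
    if score >= 3:
        return "medium priority"
    return "low priority"

# Wrongful denial of housing access: high severity; likelihood depends on bias controls.
print(risk_rating("high", "possible"))  # -> high priority: mitigate first
```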

  8. Section 4: Mitigation Measures

    For each identified risk, Alice defines safeguards: an appeal mechanism for rejected applicants, human review of all negative decisions, regular bias auditing across demographic groups, and transparency about the AI's role.
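
    As a rough illustration of what regular bias auditing across demographic groups could involve, the sketch below computes approval rates per group and the ratio of the lowest to the highest rate. The group labels and data are hypothetical, and any threshold for flagging a disparity would have to be justified in the FRIA itself.

```python
def approval_rates(decisions):
    """Approval rate per demographic group from (group, approved) pairs."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def rate_ratio(rates):
    """Lowest approval rate divided by the highest; values well below 1.0 warrant investigation."""
    return min(rates.values()) / max(rates.values())

decisions = [("group_a", True), ("group_a", True), ("group_a", False),
             ("group_b", True), ("group_b", False), ("group_b", False)]
rates = approval_rates(decisions)
print(rates, rate_ratio(rates))  # ~{'group_a': 0.67, 'group_b': 0.33} 0.5
```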

  9. Section 5: Monitoring Plan

    A FRIA is a living document: Alice plans to review it at least annually and whenever the system changes significantly, and defines escalation paths so that rights-affecting incidents trigger an immediate review.
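
    A minimal sketch of how the review cadence and escalation triggers could be encoded, assuming an annual interval and immediate review on incidents; the function and parameter names are illustrative.

```python
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)  # "at least annually"

def review_due(last_review: date, today: date,
               significant_change: bool, rights_incident: bool) -> bool:
    """True when the FRIA should be reopened: on the annual cadence, after a
    significant system change, or immediately after a rights-affecting incident."""
    return (rights_incident
            or significant_change
            or today - last_review >= REVIEW_INTERVAL)

print(review_due(date(2025, 1, 10), date(2025, 6, 1),
                 significant_change=False, rights_incident=True))  # True: immediate review
```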

  10. Submit the FRIA

    All five sections are complete. The progress bar shows 100%. Alice can now submit the Fundamental Rights Impact Assessment for the social housing AI system.