Automated Decision-Making

What rights do employees have against AI decisions?

Employees have the right to be informed, to obtain human intervention, to express their point of view, and to contest automated decisions affecting them.

Is full automation of hiring legal?

Fully automated rejection of candidates based on keyword screening is legally risky. The law generally requires a human review option when the decision has legal effects (e.g., denying a job opportunity).

Can I refuse to be evaluated by an algorithm?

Yes, under Article 24 of the Data Protection Law, you generally have the right not to be subject to a decision based solely on automated processing.

What if the algorithm is biased?

The employer is liable for discrimination. They must ensure the algorithm is regularly audited and free from bias against protected groups.

Automated decision-making in the workplace means the use of algorithms in human resources management, from recruitment to performance evaluation and dismissal. When an AI system makes decisions without meaningful human intervention (e.g., automatically rejecting a resume or fining a courier for lateness), significant legal consequences follow. Georgia's new Law on Personal Data Protection strictly regulates such processes and grants employees the right not to be subject to a decision based solely on automated processing if it produces legal effects for them or otherwise significantly affects them.

Our service includes a full legal audit and adaptation of the company's HR processes. Our specialists provide:

  • Algorithmic Transparency: Informing employees that a program is making decisions about them (e.g., calculating bonuses) and explaining the logic behind such decisions.
  • Human Intervention Mechanisms: Developing procedures where an employee can contest the AI's decision and request a human review.
  • DPIA (Data Protection Impact Assessment): Conducting an impact assessment on data protection before implementing automated systems.
  • Discrimination Prevention: Checking algorithms for hidden bias (e.g., based on gender or age), which is prohibited by the Labor Code.
  • Consent Forms: Creating legal grounds for automated data processing.

Let's consider real scenarios:

  • Scenario 1: A taxi company uses an app that automatically blocks a driver if their rating drops below 4.5. This constitutes "dismissal" without human intervention, which is illegal if the driver has no right to appeal.
  • Scenario 2: A retail chain uses AI to schedule shifts, and the system consistently assigns "bad" shifts to older employees. This is indirect discrimination.
  • Scenario 3: During recruitment, AI analyzes a video interview and scores candidates based on their emotions. This is high-risk processing that requires justified necessity and consent.
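To make Scenario 1 concrete, here is a minimal, purely illustrative Python sketch of a human-in-the-loop design (the names evaluate_driver, RATING_THRESHOLD, and the review queue are hypothetical, not taken from any real system): instead of blocking a low-rated driver automatically, the case is escalated to a human reviewer who must hear the driver out before any decision takes effect.

```python
from dataclasses import dataclass

RATING_THRESHOLD = 4.5  # hypothetical threshold, as in Scenario 1


@dataclass
class Driver:
    driver_id: str
    rating: float


def evaluate_driver(driver: Driver, review_queue: list) -> str:
    """Flag low-rated drivers for human review instead of blocking them.

    An automatic block would be a decision based solely on automated
    processing; escalating the case keeps a person in the loop who can
    hear the driver's point of view before anything takes effect.
    """
    if driver.rating < RATING_THRESHOLD:
        review_queue.append(driver.driver_id)  # escalate, do not block
        return "pending_human_review"
    return "active"


if __name__ == "__main__":
    queue: list = []
    print(evaluate_driver(Driver("D-102", 4.2), queue))  # pending_human_review
    print(evaluate_driver(Driver("D-215", 4.8), queue))  # active
    print(queue)  # ['D-102']
```

The design point is the escalation step: the algorithm may flag a case, but the final adverse decision is reserved for a human reviewer, which is exactly the kind of intervention mechanism described above.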

In Georgia, this field is regulated by the Law on Personal Data Protection (Article 24, automated decision-making) and by the Labor Code of Georgia, an organic law that prohibits discrimination. The Data Protection Law explicitly states that the data subject (the employee) has the right to demand human involvement, to express their point of view, and to contest the decision. The employer is obliged to explain the logic and significance of the decision-making process.

Our specialists will develop an "Automated Decision-Making Policy" for your company. This document details where and how AI is used, what data it relies on, and how employees can protect their rights. Such a policy not only protects the company from fines (which can reach up to 1% of turnover) but also increases employee trust.

Legal.ge is your partner in the era of digital transformation. Our experts will help you implement innovative HR technologies without violating human rights and the law. Be technological, but fair.
