An AI-driven serious game for GDPR compliance


Table of Contents

  1. Introduction
  2. Related Works
  3. Methodology
  4. Team Members
  5. Supervisors
  6. Links

Introduction

In today’s digital world, data breaches are more common than ever, putting users’ privacy at risk and causing major financial losses for organizations. In 2023 alone, over 353 million individuals were affected by data compromises in the U.S., and the average global cost of a data breach has reached $4.88 million. These incidents highlight the urgent need for strong privacy-preserving practices in software development.

To address this, privacy principles and regulations such as Privacy by Design (PbD) and the General Data Protection Regulation (GDPR) have been introduced. However, many software developers still lack the training and awareness needed to apply them effectively in practice. Traditional training methods often fail to engage learners, leading to poor knowledge retention.

Our research project focuses on creating a more effective and engaging way to teach privacy concepts to developers. Inspired by the success of game-based learning, we build upon an existing serious game framework that teaches GDPR principles through interactive gameplay. While the original game helped developers understand privacy better, it lacked features to keep players continuously motivated and engaged.

To improve this, our enhanced game framework introduces two key elements:

  1. AI-driven adaptive feedback that responds to each player's decisions and performance during gameplay
  2. Personalization mechanisms that tailor challenges to the player's skill level

By integrating these features, our project aims to boost player engagement, enhance learning outcomes, and help developers confidently build privacy-aware software systems.
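
A minimal sketch of how such an adaptive feedback loop might work is shown below; the `PlayerModel` class, `next_feedback` function, and topic names are illustrative assumptions, not the game's actual implementation.

```python
# Minimal sketch (hypothetical names): an adaptive feedback loop that picks
# a hint based on the player's recent answers to GDPR-related challenges.
from dataclasses import dataclass, field
from collections import deque


@dataclass
class PlayerModel:
    """Tracks recent performance per GDPR topic (e.g. 'data_minimization')."""
    history: dict = field(default_factory=dict)  # topic -> deque of 0/1 results

    def record(self, topic: str, correct: bool) -> None:
        self.history.setdefault(topic, deque(maxlen=5)).append(1 if correct else 0)

    def mastery(self, topic: str) -> float:
        results = self.history.get(topic)
        return sum(results) / len(results) if results else 0.0


def next_feedback(player: PlayerModel, topic: str) -> str:
    """Choose progressively more detailed feedback for weaker topics."""
    score = player.mastery(topic)
    if score >= 0.8:
        return f"Great work on {topic} - try the bonus challenge."
    if score >= 0.5:
        return f"Close! Re-read the {topic} rule card before retrying."
    return f"Here is a worked example of {topic} before you try again."


if __name__ == "__main__":
    player = PlayerModel()
    player.record("data_minimization", False)
    player.record("data_minimization", True)
    print(next_feedback(player, "data_minimization"))
```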

Related Works

As privacy concerns grow in the digital age, regulatory frameworks like the General Data Protection Regulation (GDPR) have been introduced to protect user data. However, implementing these principles effectively depends largely on software developers, who often face challenges in translating abstract regulations into concrete coding practices.
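
To make this concrete, consider data minimization, one GDPR principle that translates directly into code: a sign-up handler should collect and keep only the fields it actually needs. The snippet below is an illustrative sketch with hypothetical field names, not code from any particular system.

```python
# Hypothetical sketch: applying GDPR data minimization when handling a sign-up form.
REQUIRED_FIELDS = {"email", "password"}   # needed to create an account
OPTIONAL_FIELDS = {"display_name"}        # collected only with a clear purpose
# Fields like date_of_birth or phone_number are simply never requested or stored.


def minimize(form_data: dict) -> dict:
    """Keep only the data the sign-up feature actually needs (data minimization)."""
    allowed = REQUIRED_FIELDS | OPTIONAL_FIELDS
    return {key: value for key, value in form_data.items() if key in allowed}


submitted = {"email": "a@example.com", "password": "secret", "phone_number": "0771234567"}
print(minimize(submitted))   # phone_number is dropped before anything is persisted
```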

Several studies have highlighted that developers struggle with:

  1. Interpreting abstract GDPR requirements and translating them into concrete coding practices
  2. Limited training and awareness of privacy-preserving techniques
  3. Low motivation to prioritize privacy alongside functional requirements

Despite these challenges, there is a notable lack of educational interventions focused on equipping developers with the necessary skills and motivation to embed privacy from the ground up.

Game-Based Learning in Privacy Education

To address this gap, researchers such as Arachchilage et al. have proposed serious game frameworks that teach data minimization and GDPR principles using game-based learning approaches. These games integrate learning models like Bloom’s Taxonomy to promote better understanding. While promising, these early frameworks lacked adaptive feedback and personalization mechanisms, which are crucial for maintaining learner engagement.
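
One common way to operationalize Bloom's Taxonomy in such a game is to map each cognitive level to a type of in-game task. The mapping below is a hypothetical illustration, not the task structure of Arachchilage et al.'s framework.

```python
# Hypothetical illustration: mapping Bloom's Taxonomy levels to in-game task types.
BLOOM_TASKS = {
    "remember":   "Match each GDPR term to its definition.",
    "understand": "Explain why a given consent banner violates the regulation.",
    "apply":      "Fix a code snippet so it stores only the data it needs.",
    "analyze":    "Trace personal data flows through a mock architecture diagram.",
    "evaluate":   "Judge whether a given retention policy is GDPR-compliant.",
    "create":     "Design a privacy-by-design feature for the game's mock app.",
}

for level, task in BLOOM_TASKS.items():
    print(f"{level:>10}: {task}")
```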

Adaptive and Intelligent Learning Enhancements

Recent studies in related domains like cybersecurity training and secure coding have shown the effectiveness of:

  1. Adaptive feedback that responds to learner performance
  2. Personalized content and difficulty selection
  3. Game-based learning techniques that sustain engagement and improve knowledge retention

These advancements open new possibilities for intelligent, personalized learning systems that respond to user skill level and performance—ensuring a more effective and engaging learning experience.
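
As a simple illustration of such an adaptive mechanism, the sketch below estimates a player's skill from recent scores and picks the next challenge's difficulty; the smoothing factor, thresholds, and function names are assumptions rather than details of any cited system.

```python
# Hypothetical sketch: choosing the next challenge difficulty from recent performance.
def estimate_skill(recent_scores: list[float], smoothing: float = 0.3) -> float:
    """Exponentially weighted average of scores in [0, 1]; recent attempts count more."""
    skill = 0.5                      # neutral prior before any data
    for score in recent_scores:
        skill = (1 - smoothing) * skill + smoothing * score
    return skill


def pick_difficulty(skill: float) -> str:
    if skill < 0.4:
        return "easy"
    if skill < 0.75:
        return "medium"
    return "hard"


print(pick_difficulty(estimate_skill([0.2, 0.6, 0.9, 1.0])))  # -> "medium"
```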

Methodology


This study adopts a mixed-methods approach, combining quantitative and qualitative methods to evaluate the effectiveness of the proposed AI-powered serious game framework for GDPR education.
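
For the quantitative strand, a common way to measure learning gains is to compare participants' pre-test and post-test scores. The sketch below illustrates this with a paired t-test on made-up numbers; it is not the study's actual data or analysis pipeline.

```python
# Illustrative only: comparing pre- and post-test scores with a paired t-test.
# The numbers are made up; they are not data from this study.
from scipy import stats

pre_test  = [55, 60, 48, 70, 62, 58, 65, 50]   # scores before playing the game
post_test = [68, 72, 60, 78, 70, 66, 80, 63]   # scores after playing the game

t_statistic, p_value = stats.ttest_rel(post_test, pre_test)
print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")
```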

Design and Development

Sample and Participants

Data Collection Procedure

Team Members

Supervisors

Links

🔗 Project Repository
🔗 Project Page
🏛️ Department of Computer Engineering
🎓 University of Peradeniya