
Training on encrypted data: The future of privacy-preserving artificial intelligence

Discover how AI techniques like Federated Learning and Homomorphic Encryption protect data privacy while advancing technology.

In the era of artificial intelligence (AI) and big data, predictive models have become vital across various sectors, including healthcare, finance, and genomics. These models heavily rely on processing sensitive information, making data privacy a crucial concern. The challenge lies in maximising data utility while ensuring the confidentiality and integrity of the information. Striking this balance is essential for the advancement and acceptance of AI technologies.

Collaboration and open-source

Creating a robust dataset for training machine learning models is not simple. For instance, while AI technologies like ChatGPT have thrived by gathering vast amounts of data from the Internet, healthcare data cannot be compiled freely due to privacy concerns. Constructing a healthcare dataset involves integrating data from multiple sources, including doctors, hospitals, and institutions across borders. This complexity underscores the need for robust privacy solutions.

Although the healthcare sector is emphasised for its societal importance, these principles apply broadly. For example, even a smartphone’s autocorrect feature, which personalises predictions based on user data, must navigate similar privacy issues. The finance sector likewise struggles to share data because of its competitive nature.

Thus, collaboration is not just beneficial but crucial for safely harnessing AI’s potential within our societies. An often overlooked aspect is the actual execution environment of AI and the underlying hardware that powers it. Today’s advanced AI models require robust hardware, including extensive CPU/GPU resources, substantial amounts of RAM, and more specialised technologies such as TPUs, ASICs, and FPGAs. At the same time, users increasingly expect friendly interfaces with straightforward APIs. This tension highlights the importance of solutions that let AI run on third-party platforms without sacrificing privacy, and the need for open-source tools that make these privacy-preserving technologies practical. Contributions to this collaborative, open-source effort are therefore invaluable.

Privacy solutions to train machine learning models

Several sophisticated solutions have been developed to address the privacy challenges in AI, each focusing on specific needs and scenarios.

Federated Learning (FL) allows for training machine learning models across multiple decentralised devices or servers, each holding local data samples, without actually exchanging the data. Similarly, Secure Multi-party Computation (MPC) enables various parties to jointly compute a function over their inputs while keeping those inputs private, ensuring that sensitive data does not leave its original environment.
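To make the federated idea concrete, the short sketch below simulates the setup in plain Python: three hypothetical clients each fit a simple linear model on their own local data, and only the resulting model weights, never the raw records, are averaged by a coordinator. The data, model, and number of rounds are invented for illustration and do not come from any particular FL framework.

```python
import numpy as np

# Minimal federated averaging sketch: three clients each hold private data
# and share only model updates (weights), never the raw records.
rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: plain gradient descent on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

# Each client's data stays on its own device (simulated here as local arrays).
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(3)]

global_w = np.zeros(3)
for _round in range(10):
    # Clients train locally and send back only their updated weights.
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    # The coordinator averages the updates -- it never sees the data itself.
    global_w = np.mean(local_ws, axis=0)

print("global model after 10 rounds:", global_w)
```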

Another set of solutions focuses on manipulating data to maintain privacy while allowing for helpful analysis. Differential privacy (DP) introduces noise to data in a way that protects individual identities while still providing accurate aggregate information. Data anonymization (DA) removes personally identifiable information from datasets, ensuring anonymity and mitigating the risk of data breaches.
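As a minimal illustration of the DP idea, the sketch below answers a counting query with the Laplace mechanism: because adding or removing one person changes a count by at most one, noise with scale 1/ε is enough to mask any individual’s contribution. The readings and the ε value are made-up example figures.

```python
import numpy as np

rng = np.random.default_rng(42)

def dp_count(values, threshold, epsilon):
    """Release a differentially private count of values above a threshold.

    A counting query has sensitivity 1 (one person changes the count by at
    most 1), so Laplace noise with scale 1/epsilon is sufficient.
    """
    true_count = int(np.sum(np.asarray(values) > threshold))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy example: how many patients have a blood-glucose reading above 120?
readings = [95, 130, 110, 145, 122, 99, 160, 105]
print("true count:", sum(r > 120 for r in readings))
print("DP count (epsilon=0.5):", round(dp_count(readings, 120, epsilon=0.5), 2))
```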

Finally, homomorphic encryption (HE) allows operations to be performed directly on encrypted data, generating an encrypted result that, when decrypted, matches the result of operations performed on the plaintext.
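To show what computing on ciphertexts looks like, here is a toy example using the Paillier cryptosystem, an additively homomorphic scheme that is far simpler than full FHE, with deliberately tiny and insecure parameters: multiplying two ciphertexts produces a ciphertext that decrypts to the sum of the two plaintexts.

```python
import random
from math import gcd, lcm

# Toy Paillier cryptosystem with tiny, insecure parameters, used only to
# illustrate the homomorphic property: multiplying ciphertexts adds plaintexts.
p, q = 17, 19            # far too small for real security
n = p * q
n_sq = n * n
g = n + 1
lam = lcm(p - 1, q - 1)  # Carmichael function of n

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)   # modular inverse used in decryption

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n_sq                 # "addition" happens on ciphertexts
print(decrypt(c_sum))                    # 42 -- equals 20 + 22 on plaintext
```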

The perfect fit

Each of these privacy solutions has its own set of advantages and trade-offs. FL still relies on communication with a third-party server, and the exchanged model updates can leak information about the underlying data. MPC rests on cryptographic principles that are robust in theory but can create significant bandwidth demands in practice.
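Those bandwidth demands stem from how MPC works: every private input is split into shares that must be exchanged among the parties. The sketch below shows the simplest building block, additive secret sharing, in which three hypothetical hospitals learn their combined case count without anyone seeing an individual figure; all names and numbers are illustrative.

```python
import random

P = 2_147_483_647  # a large prime; all arithmetic is modulo P

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it modulo P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

# Each hospital holds a private case count it does not want to reveal.
counts = {"hospital_a": 120, "hospital_b": 87, "hospital_c": 203}

# Every hospital splits its count into one share per participant and sends
# each share to a different party -- this exchange is where the bandwidth goes.
all_shares = [share(c, len(counts)) for c in counts.values()]

# Each party sums the shares it received; individually these sums look random.
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# Combining the partial sums reveals only the total, never the inputs.
total = sum(partial_sums) % P
print("joint total:", total)            # 410
assert total == sum(counts.values())
```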

DP involves a manual setup in which noise is strategically added to the data. This limits the types of operations that can be performed, as the noise must be carefully balanced to protect privacy while retaining data utility. DA, while widely used, often provides the least privacy protection: since anonymization typically occurs on a third-party server, cross-referencing with other datasets can re-identify the individuals behind the anonymised records.

HE, specifically Fully Homomorphic Encryption (FHE), stands out by allowing computations on encrypted data that closely mimic those performed on plaintext. This makes FHE highly compatible with existing systems and straightforward to adopt, thanks to accessible open-source libraries and compilers such as Concrete ML, which give developers easy-to-use tools for building such applications. The major drawback is computation speed: FHE operations run considerably slower than their plaintext equivalents.
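As a rough indication of what that workflow looks like, the sketch below follows Concrete ML’s scikit-learn-style API: train a model on clear data, compile it into an FHE circuit, and run inference on encrypted inputs. The class name, the n_bits quantisation parameter, and the fhe="execute" flag are taken from the library’s documented interface as best recalled here, so check the current Concrete ML documentation before relying on them.

```python
# Hedged sketch of an FHE inference workflow with Concrete ML.  The import
# path, n_bits parameter and fhe="execute" flag follow the library's
# scikit-learn-style API as documented; verify against the current release.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from concrete.ml.sklearn import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train on clear data, then quantise and compile the model into an FHE circuit.
model = LogisticRegression(n_bits=8)
model.fit(X_train, y_train)
model.compile(X_train)

# Inference runs on encrypted inputs; only the key holder can decrypt results.
y_pred_fhe = model.predict(X_test, fhe="execute")
print("FHE predictions:", y_pred_fhe[:5])
```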

While all the solutions and technologies discussed encourage collaboration and joint efforts, FHE’s potential to revolutionise data privacy stands out. It can drive innovation and enable a scenario in which no trade-off is needed: people can enjoy services and products without compromising their personal data.
