Training on encrypted data: The future of privacy-preserving artificial intelligence

Discover how AI techniques like Federated Learning and Homomorphic Encryption protect data privacy while advancing technology.

In the era of artificial intelligence (AI) and big data, predictive models have become vital across sectors such as healthcare, finance, and genomics. These models rely heavily on processing sensitive information, making data privacy a crucial concern. The challenge lies in maximising data utility while preserving the confidentiality and integrity of the information. Striking this balance is essential for the advancement and acceptance of AI technologies.

Collaboration and open-source

Creating a robust dataset for training machine learning models is not simple. For instance, while AI technologies like ChatGPT have thrived by gathering vast amounts of data from the Internet, healthcare data cannot be compiled as freely because of privacy concerns. Constructing a healthcare dataset means integrating data from multiple sources, including doctors and hospitals, often across borders. This complexity underscores the need for robust privacy solutions.

Although the healthcare sector is emphasised for its societal importance, these principles apply broadly. For example, even a smartphone’s autocorrect feature, which personalises predictions based on user data, must navigate similar privacy issues. The finance sector likewise struggles to share data because of its competitive nature.

Thus, collaboration is not just beneficial but crucial for safely harnessing AI’s potential within our societies. An often overlooked aspect is the execution environment of AI and the hardware that powers it. Today’s advanced AI models demand substantial resources: extensive CPU/GPU capacity, large amounts of RAM, and more specialised accelerators such as TPUs, ASICs, and FPGAs. At the same time, users increasingly expect friendly interfaces and straightforward APIs. Together, these trends highlight the need for solutions that let AI run on third-party platforms without sacrificing privacy, and for open-source tools that make these privacy-preserving technologies accessible. Contributions to this collaborative effort are therefore invaluable.

Privacy solutions to train machine learning models

Several sophisticated solutions have been developed to address the privacy challenges in AI, each focusing on specific needs and scenarios.

Federated Learning (FL) allows for training machine learning models across multiple decentralised devices or servers, each holding local data samples, without actually exchanging the data. Similarly, Secure Multi-party Computation (MPC) enables various parties to jointly compute a function over their inputs while keeping those inputs private, ensuring that sensitive data does not leave its original environment.
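
To make the federated side of this concrete, below is a minimal FedAvg-style sketch in Python. The three clients, their synthetic linear-regression data, and the learning-rate and round counts are all illustrative assumptions; real systems add secure aggregation, client sampling, and far larger models. The key property is visible even in the toy: only model weights travel to the server, never the raw records.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on a linear model.
    The raw data (X, y) never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server-side FedAvg: combine client models, weighted by dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Illustrative setup: three clients holding private samples from the same underlying model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(20):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print(global_w)  # approaches [2.0, -1.0] without any client sharing its raw data
```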

Another set of solutions focuses on transforming the data itself so that useful analysis remains possible. Differential privacy (DP) introduces noise in a way that protects individual identities while still providing accurate aggregate information. Data anonymisation (DA) strips personally identifiable information from datasets, aiming to preserve anonymity and reduce the impact of data breaches.
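
As a rough illustration of how DP works in practice, the sketch below applies the classic Laplace mechanism to a mean query. The dataset, the clipping bounds, and the epsilon values are hypothetical; the point is that the noise is calibrated to how much a single individual’s record can shift the answer.

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon):
    """Differentially private mean via the Laplace mechanism.

    Clipping each value to [lower, upper] bounds how much one record can
    change the sum; that bound (the sensitivity) calibrates the noise."""
    values = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(values)   # sensitivity of the mean query
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise

# Hypothetical data: ages of 1,000 patients.
ages = np.random.default_rng(1).integers(18, 90, size=1000)

print("true mean :", ages.mean())
print("eps = 1.0 :", dp_mean(ages, 18, 90, epsilon=1.0))   # mild noise
print("eps = 0.1 :", dp_mean(ages, 18, 90, epsilon=0.1))   # stronger privacy, noisier answer
```

A smaller epsilon means stronger privacy and a noisier answer, which is exactly the utility balance discussed below.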

Finally, homomorphic encryption (HE) allows operations to be performed directly on encrypted data, generating an encrypted result that, when decrypted, matches the result of operations performed on the plaintext.
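
A toy example helps pin down what “operations on encrypted data” means. The sketch below is a deliberately tiny Paillier cryptosystem, an additively homomorphic scheme (simpler than the fully homomorphic schemes discussed next), written in plain Python; the primes and messages are illustrative and far too small to be secure.

```python
import math
import random

# Toy Paillier keypair (tiny parameters, for illustration only; NOT secure).
p, q = 1009, 1013                      # small well-known primes
n, n_sq = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)           # Carmichael function of n
mu = pow(lam, -1, n)                   # valid because the generator is g = n + 1

def encrypt(m):
    """Paillier encryption: c = (1 + n)^m * r^n mod n^2."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(1 + n, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    """Paillier decryption: m = L(c^lam mod n^2) * mu mod n, with L(x) = (x - 1) // n."""
    return (((pow(c, lam, n_sq) - 1) // n) * mu) % n

a, b = 1234, 5678
# Multiplying ciphertexts adds the hidden plaintexts -- no decryption needed in between.
print(decrypt((encrypt(a) * encrypt(b)) % n_sq))   # 6912
# Raising a ciphertext to a plaintext power scales the hidden plaintext.
print(decrypt(pow(encrypt(a), 3, n_sq)))           # 3702
```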

The perfect fit

Each of these privacy solutions has its own advantages and trade-offs. FL still requires communication with a third-party server, and the model updates shared during training can leak information about the underlying data. MPC rests on cryptographic principles that are robust in theory but can impose significant bandwidth demands in practice.

DP requires a manual setup in which noise is strategically added to the data. This limits the types of operations that can be performed, since the noise must be carefully balanced to protect privacy while retaining data utility. DA, while widely used, often provides the weakest protection: because anonymisation typically happens on a third-party server, cross-referencing with other datasets can re-identify the individuals hidden within it.

HE, and specifically Fully Homomorphic Encryption (FHE), stands out by allowing computations on encrypted data that closely mirror those performed on plaintext. This makes FHE highly compatible with existing systems and relatively straightforward to adopt, thanks to open-source libraries and compilers such as Concrete ML, which give developers accessible tools for building applications. The major drawback is slower computation, which can affect performance.
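
For orientation, this is roughly what that workflow looks like through Concrete ML’s scikit-learn-style interface. The class name, the n_bits quantisation setting, the compile step, and the fhe="execute" flag are recalled from the library’s documentation and may differ between releases, so treat this as a hedged sketch rather than a definitive recipe.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Concrete ML mirrors scikit-learn estimators but can compile them to FHE circuits.
from concrete.ml.sklearn import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(n_bits=8)   # quantisation width; an assumed example value
model.fit(X_train, y_train)            # training happens in the clear, on the data owner's side
model.compile(X_train)                 # compile the quantised model into an FHE circuit

# Inference runs on encrypted inputs; only a few samples here, since FHE is slow.
print(model.predict(X_test[:5], fhe="execute"))
```

The slowdown mentioned above shows up at the final step: encrypted predictions take far longer than their plaintext equivalents.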

While all the solutions and technologies discussed encourage collaboration and joint effort, FHE’s potential to revolutionise data privacy stands out. It can drive innovation towards a future in which people enjoy services and products without trading away their personal data.
