Tuesday, 29 April 2025

Training on encrypted data: The future of privacy-preserving artificial intelligence

Discover how AI techniques like Federated Learning and Homomorphic Encryption protect data privacy while advancing technology.

In the era of artificial intelligence (AI) and big data, predictive models have become vital across various sectors, including healthcare, finance, and genomics. These models heavily rely on processing sensitive information, making data privacy a crucial concern. The challenge lies in maximising data utility while ensuring the confidentiality and integrity of the information. Striking this balance is essential for the advancement and acceptance of AI technologies.

Collaboration and open-source

Creating a robust dataset for training machine learning models is far from simple. For instance, while AI systems like ChatGPT have thrived by gathering vast amounts of data from the Internet, healthcare data cannot be compiled so freely due to privacy concerns. Constructing a healthcare dataset involves integrating data from multiple sources, including doctors, hospitals, and institutions across borders. This complexity underscores the need for robust privacy solutions.

Although the healthcare sector is emphasised for its societal importance, these principles apply broadly. For example, even a smartphone’s autocorrect feature, which personalises predictions based on user data, must navigate similar privacy issues. The finance sector likewise struggles to share data because of its competitive nature.

Thus, collaboration is not just beneficial but crucial for safely harnessing AI’s potential within our societies. An often overlooked aspect is the actual execution environment of AI and the underlying hardware that powers it. Today’s advanced AI models require robust hardware, including extensive CPU/GPU resources, substantial amounts of RAM, and more specialised technologies such as TPUs, ASICs, and FPGAs. At the same time, user-friendly interfaces with straightforward APIs are growing in popularity. Together, these trends highlight the importance of solutions that let AI operate on third-party platforms without sacrificing privacy, and the need for open-source tools that make these privacy-preserving technologies accessible. Community contributions to this collaborative effort are invaluable.

Privacy solutions to train machine learning models

Several sophisticated solutions have been developed to address the privacy challenges in AI, each focusing on specific needs and scenarios.

Federated Learning (FL) allows for training machine learning models across multiple decentralised devices or servers, each holding local data samples, without actually exchanging the data. Similarly, Secure Multi-party Computation (MPC) enables various parties to jointly compute a function over their inputs while keeping those inputs private, ensuring that sensitive data does not leave its original environment.
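As an illustration, the core loop of federated averaging (FedAvg) can be sketched in a few lines: each client trains on its private data and only the resulting model weights travel to the server, which averages them. The toy one-parameter linear model, learning rate, and synthetic client datasets below are illustrative assumptions, not a production recipe.

```python
import random

def local_train(w, data, lr=0.1, epochs=5):
    """One client's update: gradient descent on its private (x, y) pairs."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(client_datasets, rounds=20):
    """Server loop: broadcast the weight, collect locally trained weights, average."""
    w = 0.0
    for _ in range(rounds):
        local_ws = [local_train(w, d) for d in client_datasets]
        w = sum(local_ws) / len(local_ws)  # only weights are aggregated, never data
    return w

random.seed(0)
# Three clients, each holding private samples of the relation y = 3x
clients = [[(x, 3.0 * x) for x in (random.uniform(0, 1) for _ in range(20))]
           for _ in range(3)]
w = fed_avg(clients)
print(w)  # converges towards the true slope 3.0
```

The raw `(x, y)` pairs never leave their client; the server sees only aggregated weights, which is the essential FL guarantee (though, as discussed below, weights themselves can still leak information).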

Another set of solutions focuses on manipulating the data itself to preserve privacy while still permitting useful analysis. Differential privacy (DP) introduces carefully calibrated noise into data or query results so that individual records cannot be singled out while aggregate statistics remain accurate. Data anonymization (DA) removes personally identifiable information from datasets, reducing the risk that a breach exposes individuals.
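A minimal sketch of DP in the central model, using the Laplace mechanism: a counting query is answered with noise whose scale is the query's sensitivity (1 for a count) divided by the privacy budget epsilon. The epsilon value and the small age dataset are assumptions chosen purely for illustration.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """True count plus Laplace noise of scale sensitivity/epsilon (= 1/epsilon)."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(42)
ages = [34, 29, 41, 55, 23, 38, 47, 31, 62, 27]
noisy = private_count(ages, lambda a: a >= 40, epsilon=1.0)
print(noisy)  # randomised answer near the true count of 4
```

Smaller epsilon means more noise and stronger privacy; larger epsilon means more accurate answers. Balancing this trade-off is exactly the manual calibration work mentioned below.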

Finally, homomorphic encryption (HE) allows operations to be performed directly on encrypted data, generating an encrypted result that, when decrypted, matches the result of operations performed on the plaintext.
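The idea can be demonstrated with the Paillier cryptosystem, an additively homomorphic scheme (a precursor to FHE, not fully homomorphic): multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so a server can add encrypted values without ever seeing them. The sketch below uses toy 64-bit primes and a simple Fermat primality check purely for illustration; real deployments require 2048-bit moduli and a hardened library.

```python
import random

def keygen(bits=64):
    """Toy Paillier key generation (illustrative key size only)."""
    def prime(b):
        while True:
            p = random.getrandbits(b) | (1 << (b - 1)) | 1
            if all(pow(a, p - 1, p) == 1 for a in (2, 3, 5, 7, 11)):  # Fermat test
                return p
    p, q = prime(bits), prime(bits)
    n = p * q
    lam = (p - 1) * (q - 1)      # a multiple of lcm(p-1, q-1), which suffices here
    g = n + 1                    # standard simple choice of generator
    mu = pow(lam, -1, n)         # modular inverse (Python 3.8+)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)   # fresh randomness makes encryption probabilistic
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n

random.seed(7)
pub, priv = keygen()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = (c1 * c2) % (pub[0] ** 2)   # homomorphic addition: done on ciphertexts only
print(decrypt(priv, c_sum))          # 42, recoverable only with the private key
```

The party performing the addition never learns 17, 25, or 42; only the key holder can decrypt. FHE extends this idea to arbitrary additions and multiplications, which is what makes general computation on encrypted data possible.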

The perfect fit

Each of these privacy solutions has its own set of advantages and trade-offs. FL still requires communication with a coordinating third-party server, and the model updates exchanged can leak information about the underlying data. MPC rests on cryptographic principles that are robust in theory but can impose significant bandwidth demands in practice.

DP involves a manual setup in which noise is strategically added to the data. This limits the types of operations that can be performed, as the noise must be carefully balanced to protect privacy while retaining data utility. DA, while widely used, often provides the least protection: because anonymization typically occurs on a third-party server, cross-referencing with other datasets can re-identify the individuals hidden in the data.

HE, specifically Fully Homomorphic Encryption (FHE), stands out by allowing computations on encrypted data that closely mimic those performed on plaintext. This makes FHE highly compatible with existing systems and straightforward to adopt, thanks to accessible open-source libraries and compilers such as Concrete ML, which give developers easy-to-use tools for building FHE applications. The major drawback is computational overhead: operations on ciphertexts run significantly slower than their plaintext equivalents, which can impact performance.

While all the solutions and technologies discussed encourage collaboration and joint efforts, FHE’s potential to revolutionise data privacy is truly inspiring. It can drive innovation and facilitate a scenario where no trade-offs are needed to enjoy services and products without compromising personal data.

