New AI method could cut energy use by 95% but requires fresh hardware

A new AI method could cut energy use by 95% but needs special hardware, highlighting the potential for greener technology in the future.

The demand for artificial intelligence (AI) continues to grow, driving the need for computing power and electricity. As AI becomes more integrated into daily life, concerns about its environmental impact have intensified. However, a breakthrough offers hope for significantly reducing the energy AI systems require.

Researchers at BitEnergy AI have developed a computational method called Linear-Complexity Multiplication, which could lower AI energy consumption by as much as 95%. While this innovation could transform AI’s energy usage, it may also require overhauling existing hardware systems.

Moving beyond floating-point multiplication

Most current AI systems rely on floating-point multiplication (FPM), an essential technique for handling calculations involving very large or very small numbers. This precision is especially critical for deep learning models, which perform intricate computations to generate accurate results. However, FPM is highly energy-intensive, contributing significantly to the overall power usage of AI applications.
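
To make that cost concrete, the sketch below splits two IEEE-754 single-precision numbers into their sign, exponent, and mantissa fields and multiplies them the textbook way; the expensive step is the full-width mantissa-by-mantissa integer multiplication. This is a generic illustration of the standard format, not code from the BitEnergy AI work, and it ignores special cases such as zeros, infinities, and NaNs.

```python
import struct

def decompose_float32(x: float):
    """Split a float32 into its IEEE-754 sign, exponent and mantissa fields."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    sign = bits >> 31
    exponent = (bits >> 23) & 0xFF      # biased by 127
    mantissa = bits & 0x7FFFFF          # 23 fraction bits (implicit leading 1)
    return sign, exponent, mantissa

def textbook_fp_multiply(a: float, b: float) -> float:
    """Multiply two normal float32 values the way hardware does it:
    a cheap exponent addition plus a costly 24-bit x 24-bit mantissa
    multiplication (the step that addition-based methods aim to replace)."""
    sa, ea, ma = decompose_float32(a)
    sb, eb, mb = decompose_float32(b)
    sign = sa ^ sb
    exponent = ea + eb - 127                        # add exponents, drop one bias
    significand = (ma | 1 << 23) * (mb | 1 << 23)   # the expensive multiply
    if significand >= 1 << 47:                      # renormalise to 1.xxx form
        significand >>= 1
        exponent += 1
    value = (significand / (1 << 46)) * 2.0 ** (exponent - 127)
    return -value if sign else value

print(textbook_fp_multiply(3.7, 2.5))   # ≈ 9.25, matching 3.7 * 2.5
```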

Linear-Complexity Multiplication shifts away from FPM, using integer addition to perform the calculations instead. According to the researchers, this change drastically reduces the energy demands of AI systems while maintaining performance: early tests indicate no drop in computational accuracy or efficiency despite the lower energy requirements.
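
BitEnergy AI’s exact algorithm is not reproduced in the article, but the general idea of trading a floating-point multiply for an integer addition has a long history: because a float’s bit pattern is roughly a scaled, biased logarithm of its value, adding two bit patterns approximates multiplying the two numbers (Mitchell’s classic approximation). The sketch below illustrates only that family of tricks, under that assumption; it is not the researchers’ published method.

```python
import struct

def f32_bits(x: float) -> int:
    """Raw 32-bit pattern of a positive float32."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def bits_f32(b: int) -> float:
    """Float32 value encoded by a 32-bit pattern."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

ONE_F32 = 127 << 23  # bit pattern of 1.0f; cancels the duplicated exponent bias

def add_as_multiply(a: float, b: float) -> float:
    """Approximate a * b for positive floats with a single integer addition.

    A float32's bit pattern is roughly a biased, scaled log2 of its value,
    so adding two patterns (and subtracting one bias) approximates adding
    logarithms, i.e. multiplying the values. The basic trick is only accurate
    to within several percent, which is why practical schemes refine it.
    """
    return bits_f32(f32_bits(a) + f32_bits(b) - ONE_F32)

for a, b in [(3.7, 2.5), (0.12, 41.0), (1.5, 1.5)]:
    approx, exact = add_as_multiply(a, b), a * b
    print(f"{a} x {b}: exact {exact:.4f}, approx {approx:.4f}, "
          f"error {(approx - exact) / exact:+.1%}")
```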

The hardware hurdle

Despite its promising benefits, adopting Linear-Complexity Multiplication is challenging. Most AI systems today are built on hardware optimised for FPM, such as GPUs from leading manufacturers like Nvidia. To implement the new method effectively, entirely new hardware will be needed.

The research team claims they have already designed, built, and tested the required hardware. However, making this technology widely available involves licensing and manufacturing processes that could take time to finalise. Until then, the AI industry may face difficulties integrating the energy-saving technique into its systems.

AI’s growing energy problem

The potential of Linear-Complexity Multiplication comes at a critical moment for the AI industry: AI systems’ energy demands are soaring. OpenAI’s ChatGPT, for instance, reportedly consumes approximately 564 megawatt-hours (MWh) of electricity daily, enough to power 18,000 American homes. Experts predict AI’s annual energy consumption could soon reach 100 terawatt-hours (TWh), rivalling Bitcoin mining, one of the most energy-intensive digital processes.
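
As a rough plausibility check on those reported figures, the arithmetic below compares them against an assumed ballpark of about 30 kWh of electricity per day for an average US household; that household figure is an assumption, not a number from the article.

```python
# Sanity-check the reported figures. The ~30 kWh/day average-US-household
# comparison point is an assumption, not taken from the article.
DAILY_MWH = 564        # reported daily ChatGPT electricity use
HOMES = 18_000         # reported equivalent number of American homes

kwh_per_home = DAILY_MWH * 1_000 / HOMES
print(f"Implied usage per home: {kwh_per_home:.1f} kWh/day")   # ≈ 31.3 kWh/day
```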

Rising energy consumption has sparked calls for more sustainable AI technologies. Linear-Complexity Multiplication represents a significant step forward, but its success hinges on widespread adoption and the availability of compatible hardware.

The researchers at BitEnergy AI are optimistic about their method’s potential to reshape AI’s energy footprint. Still, the road ahead involves navigating technological, economic, and manufacturing challenges.
