
A research team in Beijing has unveiled what it claims is the world's first 'brain-like' large language model, designed to run on less energy and without relying on Nvidia hardware.
As reported by the South China Morning Post, the system, called SpikingBrain 1.0, was created by the Institute of Automation at the Chinese Academy of Sciences. Unlike mainstream AI tools such as ChatGPT, which activate their entire networks for every input, SpikingBrain is said to emulate the human brain by activating only the neurons needed for a given task.
This selective method enables the model to save energy while responding more swiftly to inputs. The researchers noted that SpikingBrain uses less than 2% of the training data typically required by standard AI systems. Despite the smaller dataset, the model maintained its performance with lengthy inputs and, in some tests, operated up to 100 times faster than traditional systems.
These findings were detailed in a technical paper available on the open-access research repository arXiv, although the study has not yet been peer-reviewed, according to SCMP. The model is powered by the MetaX chip platform, developed in Shanghai, making it independent of Nvidia's commonly used graphics processors.
This is considered significant as the U.S. enforces stricter export controls on advanced AI chips. On its demo site, the model introduces itself: 'Hello! I'm SpikingBrain 1.0, or "Shunxi", a brain-inspired AI model. I integrate the way the human brain processes information with a spiking computation method, aiming to provide powerful, reliable, and energy-efficient AI services entirely built on Chinese technology.'
The system's core relies on spiking computation, an event-driven technique that mimics the brain's method of sending signals only when necessary. By avoiding continuous full-scale processing, the model reduces memory and energy strain while still handling complex tasks.
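The event-driven idea described above is often illustrated with a leaky integrate-and-fire neuron: it accumulates input over time, fires a spike only when a threshold is crossed, and stays silent otherwise, so downstream work happens only on spike events. The sketch below is a toy illustration of that general principle, not SpikingBrain's actual implementation; the function name and parameter values are assumptions chosen for clarity.

```python
# Toy leaky integrate-and-fire neuron illustrating event-driven spiking
# computation. This is NOT SpikingBrain's code; names and parameters
# here are hypothetical, chosen only to show the principle.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Return a 0/1 spike train: the neuron fires only when its
    accumulated potential crosses the threshold, and stays silent
    (triggering no downstream computation) otherwise."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x  # leaky integration of input
        if potential >= threshold:
            spikes.append(1)              # event: emit a spike
            potential = 0.0               # reset after firing
        else:
            spikes.append(0)              # no event, nothing to process
    return spikes

# Weak input never reaches threshold; strong input fires intermittently,
# so overall activity (and hence compute) stays sparse.
print(lif_neuron([0.2, 0.2, 0.2, 0.2, 0.2]))  # → [0, 0, 0, 0, 0]
print(lif_neuron([0.8, 0.8, 0.8, 0.8, 0.8]))  # → [0, 1, 0, 1, 0]
```

The key property is that zeros are free: a silent neuron sends nothing, so memory traffic and energy scale with the number of spike events rather than with the full network size.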