chrisdh79 OP t1_jac9z4h wrote
From the article: Scientists across multiple disciplines are working to create revolutionary biocomputers where three-dimensional cultures of brain cells, called brain organoids, serve as biological hardware. They describe their roadmap for realizing this vision in the journal Frontiers in Science.
“We call this new interdisciplinary field ‘organoid intelligence’ (OI),” said Prof Thomas Hartung of Johns Hopkins University. “A community of top scientists has gathered to develop this technology, which we believe will launch a new era of fast, powerful, and efficient biocomputing.”
Brain organoids are a type of lab-grown cell culture. Even though brain organoids aren't 'mini brains', they share key aspects of brain structure and function, including neurons and other brain cells essential for cognitive functions like learning and memory. And whereas most cell cultures are flat, organoids have a three-dimensional structure. This increases the culture's cell density 1,000-fold, meaning neurons can form many more connections.
But even if brain organoids are a good imitation of brains, why would they make good computers? After all, aren't computers smarter and faster than brains?
"While silicon-based computers are certainly better with numbers, brains are better at learning,” Hartung explained. “For example, AlphaGo [the AI that beat the world’s number one Go player in 2017] was trained on data from 160,000 games. A person would have to play five hours a day for more than 175 years to experience these many games.”
Brains are not only superior learners, they are also more energy efficient. For instance, the amount of energy spent training AlphaGo is more than is needed to sustain an active adult for a decade.
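Those two figures are easy to sanity-check. Here's a quick back-of-the-envelope in Python, assuming a Go game averages about two hours and an active adult burns roughly 2,500 kcal a day (both assumptions are mine, not from the article):

```python
# Back-of-the-envelope check of the article's figures.
# Assumptions (not from the article): a serious Go game takes ~2 hours,
# and an active adult burns ~2,500 kcal/day.

HOURS_PER_GAME = 2.0      # assumed average length of one Go game
KCAL_PER_DAY = 2_500      # assumed daily energy budget of an active adult
KCAL_TO_KWH = 1.163e-3    # 1 kcal ≈ 0.001163 kWh

games = 160_000
hours_needed = games * HOURS_PER_GAME
years_at_5h_per_day = hours_needed / (5 * 365)
print(f"{years_at_5h_per_day:.0f} years at 5 h/day")  # ≈ 175 years

decade_kwh = KCAL_PER_DAY * 365 * 10 * KCAL_TO_KWH
print(f"≈ {decade_kwh:,.0f} kWh to sustain an adult for a decade")
```

At two hours per game, the 175-year figure checks out almost exactly, and the adult-for-a-decade energy budget works out to roughly 10,000 kWh, which is the baseline the AlphaGo training claim is being measured against.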
“Brains also have an amazing capacity to store information, estimated at 2,500TB,” Hartung added. “We’re reaching the physical limits of silicon computers because we cannot pack more transistors into a tiny chip. But the brain is wired completely differently. It has about 100bn neurons linked through over 10^15 connection points. It’s an enormous power difference compared to our current technology.”
Dr_seven t1_jadny3h wrote
>“Brains also have an amazing capacity to store information, estimated at 2,500TB,” Hartung added. “We’re reaching the physical limits of silicon computers because we cannot pack more transistors into a tiny chip. But the brain is wired completely differently. It has about 100bn neurons linked through over 10^15 connection points. It’s an enormous power difference compared to our current technology.”
This part in particular made me squint a little bit.
For starters, we don't fully grasp how memory works in the brain, but we know it isn't like mechanical/electrical memory, with physical bits that flip. It seems to be tied to the combinations of neurons that fire, of which there are essentially infinite permutations, leading to the sky-high estimates of how much "data" the brain can hold... but the brain doesn't hold data like that, at least not for most humans.
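To make that concrete, here's a rough sketch of why the headline numbers are simultaneously oddly small and oddly large, using the figures from the quote (the per-synapse framing is my assumption for illustration, not how memory actually works):

```python
from math import log10

# Figures taken from the quote, not a model of real memory.
STORAGE_BYTES = 2_500e12   # the quoted 2,500 TB estimate
SYNAPSES = 1e15            # the quoted ~10^15 connection points

# Treated naively, the headline figure implies only a couple of
# bytes per synapse...
print(f"{STORAGE_BYTES / SYNAPSES:.1f} bytes per synapse")  # ≈ 2.5

# ...while pattern-based coding explodes combinatorially: even a
# tiny group of 300 binary (on/off) neurons has more possible
# states than there are atoms in the observable universe (~10^80).
print(f"10^{300 * log10(2):.0f} states for 300 neurons")    # ≈ 10^90
```

That mismatch is exactly why "how much data the brain holds" is such a slippery number: the answer depends entirely on which (unknown) coding scheme you assume.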
The complexity of this coding scheme makes it impractical to model on anything short of the largest supercomputers, and even then we aren't actually modeling brain activity in the sense that we know why Pattern X leads to "recalling what that stroganoff tasted like on April 7, 2004".
This matters because, while we may be able to stimulate neurons in a lab in ways that make them useful for data storage, that isn't necessarily how human brains store information. Indeed, human memory would be a terrible baseline for a computer, given the brain's tendency to confabulate details at recall time that don't match reality. Most people's memories of most things are inaccurate, but close enough to work out alright. That's the exact sort of thing you don't want from a computer's memory.
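A toy contrast, purely illustrative and entirely my own construction (not a model of any real neural mechanism), of why reconstructive recall makes a poor storage substrate:

```python
import random

# Purely illustrative: computer memory returns stored bits exactly;
# reconstructive (human-like) recall re-synthesizes a plausible
# version and can silently drop or alter details.
stored = "beef stroganoff; April 7, 2004; a little too salty"

def computer_recall(memory: str) -> str:
    return memory  # bit-exact, every single time

def human_like_recall(memory: str, fidelity: float = 0.8) -> str:
    # Keep each detail only with probability `fidelity` -- a crude
    # stand-in for omission/confabulation at recall time.
    details = memory.split("; ")
    return "; ".join(d for d in details if random.random() < fidelity)

print(computer_recall(stored))    # always identical to what was stored
print(human_like_recall(stored))  # plausible, but may differ each call
```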
This is compelling stuff, but we have a long way to go before we even understand what we are dealing with in practical terms.