“We did the back-of-napkin math on what ramping up this experiment to the entire brain would cost, and the scale is impossibly large — 1.6 zettabytes of storage costing $50 billion and spanning 140 acres, making it the largest data center on the planet.”
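For what it's worth, the napkin math roughly checks out. A quick sketch, assuming the widely reported ~1.4 petabytes per cubic millimeter from the original scan and a brain volume of roughly 1.2 million mm³ (both figures are my assumptions, not from the quote):

```python
# Back-of-napkin check of the 1.6 ZB figure.
# Assumed inputs (not stated in the quote itself):
#   ~1.4 PB of imagery per cubic millimeter of tissue
#   ~1,200 cm^3 (1.2 million mm^3) of brain volume
PB = 10**15                      # bytes in a petabyte
bytes_per_mm3 = 1.4 * PB         # ~1.4 PB per cubic millimeter
brain_mm3 = 1.2e6                # ~1,200 cm^3 of brain tissue

total_bytes = bytes_per_mm3 * brain_mm3
print(f"{total_bytes / 10**21:.2f} ZB")   # ~1.68 ZB, in line with the quoted 1.6 ZB

cost_per_tb = 50e9 / (total_bytes / 10**12)
print(f"${cost_per_tb:.0f}/TB")           # ~$30/TB implied by the quoted $50 billion
```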
Look at what they need to mimic just a fraction of our power.
In fairness, the scan required such astronomical resources because of how they were scanning it.
They took the cubic-millimeter chunk, cut it into 5,000 ultra-thin slices, and then did extremely high-detail scans of each slice.
That’s why they needed AI: to piece those flat layers back together into a 3D structure.
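The real pipeline leaned on ML for that reconstruction, but the basic idea of registering neighboring slices and stacking them into a volume is easy to sketch. A toy version using classical phase correlation (my stand-in for the actual alignment models, which the thread doesn't describe):

```python
# Toy sketch of slice-to-volume reconstruction: register each slice to
# its neighbor, then stack the aligned slices into a 3D array. The real
# pipeline uses learned models for alignment and segmentation; this only
# shows the registration step those models perform at scale.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def stack_slices(slices):
    """slices: list of 2D grayscale arrays, in cutting order."""
    aligned = [slices[0].astype(float)]
    for raw in slices[1:]:
        # Estimate the (dy, dx) offset between this slice and the last.
        offset, _, _ = phase_cross_correlation(aligned[-1], raw.astype(float))
        aligned.append(nd_shift(raw.astype(float), offset))
    return np.stack(aligned)  # shape: (n_slices, height, width)

# Demo with synthetic slices: a bright square drifting one pixel per cut.
demo = []
for i in range(5):
    s = np.zeros((64, 64))
    s[20 + i : 30 + i, 20 + i : 30 + i] = 1.0  # simulated cutting drift
    demo.append(s)
volume = stack_slices(demo)
print(volume.shape)  # (5, 64, 64)
```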
Once they have the 3D structure, the scans are useless and can be deleted.
In time it should be possible to scan the tissue and get the 3D structure without such extreme data use.
And the whole human body, brain and all, can run on ~100 watts. Truly astounding.
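That ~100 W figure lines up with a typical ~2,000 kcal/day diet; a quick conversion (my arithmetic, not from the thread):

```python
# Sanity check: a ~2,000 kcal/day diet expressed as average power.
kcal_per_day = 2000
joules = kcal_per_day * 4184     # 1 kcal = 4184 J
watts = joules / 86_400          # seconds in a day
print(f"{watts:.0f} W")          # ~97 W, matching the ~100 W claim
```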
Meh. No different from how you can make a programming language and then have it compile itself. It’s weird.