His AI Can See Inside Living Cells

Our high school biology textbooks were wrong about cells. A cell isn't a neat translucent sphere, and its internal parts don't sit conveniently far apart like pineapple chunks suspended in gelatin. So what does a cell really look like?

In reality, a cell looks more like a pound of half-melted jelly beans stuffed into a too-small sandwich bag. And its contents are all constantly moving, following a choreography more precise and complex than that of a computer chip.

In short, understanding what cells look like on the inside — much less the myriad interactions among their parts — is hard even in the 21st century. “Think of a cell as a sophisticated machine like a car — except every 24 hours, you’ll have two cars in your driveway, and then four cars in your driveway,” said Greg Johnson, a computer vision and machine learning researcher at the Allen Institute for Cell Science. “If you found the smartest engineers in the world and said, ‘Make me a machine that does that,’ they would be totally stumped. That’s what I think of when I think of how little we know about how cells work.”

To view the inner workings of living cells, biologists currently use a combination of genetic engineering and advanced optical microscopy. (Electron microscopes can image cell interiors in great detail, but not with live samples.) Typically, a cell is genetically modified to produce a fluorescent protein that attaches itself to specific subcellular structures, like mitochondria or microtubules. The fluorescent protein glows when the cell is illuminated by a certain wavelength of light, which visually labels the associated structure. However, this technique is expensive and time-consuming, and it allows only a few structural features of the cell to be observed at a time.

But with his background in software engineering, Johnson wondered: What if researchers could teach artificial intelligence to recognize the interior features of cells and label them automatically? In 2018, he and his collaborators at the Allen Institute did just that. Using fluorescence imaging samples, they trained a deep learning system to recognize over a dozen kinds of subcellular structures, until it could spot them in cells that the software hadn’t seen before. Even better, once trained, Johnson’s system also worked with “brightfield images” of cells — images easily obtained with ordinary light microscopes through a process “like shining a flashlight through the cells,” he said.
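Conceptually, what the article describes is image-to-image learning: the model takes a brightfield image as input and is trained to predict the matching fluorescence image, with real fluorescence micrographs serving as the targets. Below is a minimal sketch of that idea in PyTorch. The tiny architecture, the random stand-in data, and every name here are illustrative assumptions for this post only, not the Allen Institute's actual code, which (per the published work) uses a far deeper U-Net-style network on full 3-D microscope stacks.

    # Minimal sketch of label-free fluorescence prediction: train a small
    # convolutional encoder-decoder to map brightfield images (input) to
    # fluorescence images (target) with a pixelwise regression loss.
    # Architecture, names, and data are illustrative stand-ins.

    import torch
    import torch.nn as nn

    class BrightfieldToFluorescence(nn.Module):
        """Tiny encoder-decoder: 1-channel brightfield in, 1-channel fluorescence out."""
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),  # downsample
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2), nn.ReLU(),    # upsample
                nn.Conv2d(16, 1, kernel_size=3, padding=1),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = BrightfieldToFluorescence()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()  # pixelwise regression against the real fluorescence image

    # Stand-in for a real dataset of paired (brightfield, fluorescence) patches.
    brightfield = torch.rand(8, 1, 64, 64)   # batch of 8 grayscale 64x64 patches
    fluorescence = torch.rand(8, 1, 64, 64)  # matching fluorescence targets

    for step in range(100):
        prediction = model(brightfield)
        loss = loss_fn(prediction, fluorescence)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # Once trained, the model estimates a fluorescence-style label map from a
    # plain brightfield image alone: no dye or genetic engineering per sample.
    new_image = torch.rand(1, 1, 64, 64)
    with torch.no_grad():
        predicted_labels = model(new_image)

Mean-squared error against the real fluorescence image is the simplest possible training signal here; it treats labeling as pixelwise regression rather than classification. The payoff is the final step: once the weights are learned, an ordinary transmitted-light image is enough to estimate where the tagged structures are.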

Read more about Greg Johnson’s work over at Quanta Magazine.

(Image Credit: Chona Kasinger/Quanta Magazine)

