Do AI models in biology have to get bigger to get better? And what can you do with more training, more computing power, and more outputs?
Joining me this week to talk about MaxToki, a new AI-powered model that can predict how cells age, is Christina Theodoris, a physician scientist at the Gladstone Institutes who is using her models to study cardiovascular disease. She's a pioneer in this field, having developed GeneFormer, a model trained on millions of single-cell transcriptomes that can predict what will happen when gene networks are disturbed.
But size isn’t everything. “The biggest impact we see though is with diversity of the data,” she said. “As we increase the diversity, that actually has even more impact than just the pure numbers, let's say if you were to use cells that are similar to the ones seen before.” This has implications for training new, even more powerful models.
Theodoris is also an accomplished visual artist whose paintings draw on surrealism, cultural heritage, and memory.
In addition to her models, we talked about the trend of AI slop, how she would paint her models, and whether she likes the Ion Genomics logos I came up with.
To learn more about her art, please check out Christinatheodoris.com
Links to Theodoris’ work discussed in this episode:
MaxToki Preprint https://www.biorxiv.org/content/10.64898/2026.03.30.715396v1
Scaling GeneFormer
“Scaling and quantization of large-scale foundation model enables resource-efficient predictions in network biology,” Nature Computational Science
https://www.nature.com/articles/s43588-026-00972-4
"Discovery of candidate therapeutic targets with Geneformer," Nature Protocols
https://www.nature.com/articles/s41596-026-01364-8
By Andrew P. Han