Season 2, Episode 5 -- Improving Computing Performance and Workforce Diversity

Science in Parallel

Release Date: 10/05/2022


Valerie Taylor doesn’t shy away from challenging problems with multiple layers. At Argonne National Laboratory, she manages teams that develop algorithms, data management strategies, software and hardware to support scientific simulations, including those on the Department of Energy’s leadership-class supercomputers. Her research focuses on performance analysis—the factors involved in making computations efficient. On top of that, she maintains a parallel line of work supporting computer scientists from historically marginalized communities and building a more diverse computing workforce.

You’ll hear Valerie talk about her career path, what excites her about computing, and the sustained commitment needed to boost diversity, equity and inclusion in this field. You’ll meet:

Valerie Taylor is the director of the mathematics and computer science division at Argonne National Laboratory. She moved to Argonne in 2017 after more than 25 years in academia at Northwestern University and Texas A&M University. She is also the president and chief executive officer of the Center for Minorities and People with Disabilities in IT (CMD-IT), a nonprofit dedicated to supporting historically marginalized communities in computing.