Lab Partnering Service Discovery
Title: Deputy Director
- Optimal Design of Experiments under Uncertainty
- Machine Learning
- Computational Physics
- Statistical Physics
Prior to joining Brookhaven National Laboratory in 2017, Francis “Frank” Alexander spent nearly 20 years at Los Alamos National Laboratory, finishing his tenure as the acting division leader of the lab’s Computer, Computational, and Statistical Sciences (CCS) Division. At Los Alamos, he held several leadership roles, including serving as deputy leader of CCS Division’s Information Sciences Group and leader of the Information Science and Technology Institute. Alexander was introduced to the DOE national laboratory complex during his postdoctoral work with Los Alamos’ Center for Nonlinear Studies and the Institute for Scientific Computing Research at Lawrence Livermore National Laboratory. He also was a research assistant professor at Boston University’s Center for Computational Science. Alexander has led many research projects and has published more than 50 papers in peer-reviewed journals. In addition to leading Brookhaven’s artificial intelligence and machine learning strategy effort, Alexander currently serves as project director of the multi-laboratory ExaLearn Co-design Center for Exascale Machine Learning Technologies, part of the Exascale Computing Project. He also leads various projects involving optimal experimental design, including for biological systems.
Dr. Warren L. Davis IV is a Principal Member of Technical Staff in the Scalable Analysis and Visualization department in the Center for Computing Research at Sandia National Laboratories. He was the principal investigator for the Hybrid Methods for Cybersecurity Analysis LDRD and the Machine Learning in Adversarial Environments LDRD research projects, which had a significant impact on cyber operations at the lab. In addition, he is the principal investigator of the Machine Learning for Intelligent Data Capture on Exascale Platforms research project for the DOE ASCR program.
Warren joined the technical staff at Sandia in 2009. He received his Ph.D. in computer science from Florida State University in 2006, having gained research experience as a graduate intern at both the National Astronomical Observatory of Japan in Tokyo and the IBM Almaden Research Center, where he was hired as a Research Staff Member after graduation. Warren has published over 20 journal articles, conference publications, and peer-reviewed presentations, and has worked in the fields of cybersecurity, healthcare informatics, climate modeling, materials science, text analytics, combustion, and fluid dynamics, to name a few. In addition, he was awarded the 2019 Black Engineer of the Year Award in Research Leadership.
Nhan Tran is a Wilson Fellow at Fermilab working on the Compact Muon Solenoid experiment at the Large Hadron Collider and is also developing new dark sector experimental initiatives. He is generally interested in deploying machine learning as a powerful tool across fundamental physics. His recent research focus is on the intersection of machine learning with real-time systems and embedded electronics, as well as heterogeneous computing, to improve experimental efficiency and sensitivity. He received his PhD from Johns Hopkins University in 2011 and was a postdoctoral researcher at Fermilab prior to assuming his current position.
Areas of expertise: ML Algorithms for Data Reconstruction and Pattern Recognition; Real-Time Low-Latency ML in Resource-Constrained Environments; Heterogeneous Computing
She leads the Data Science Engagement Group at the National Energy Research Scientific Computing Center (NERSC) at Berkeley National Lab. A native of the U.K., she has a career spanning research in particle physics, cosmology, and computing on both sides of the Atlantic. She obtained her Ph.D. at the University of Edinburgh and worked at Imperial College London and SLAC National Accelerator Laboratory before joining NERSC. Her group leads the support of supercomputing for experimental science, and her work focuses on data-intensive computing and research. This includes using high-performance computing (HPC) to scale up machine-learning algorithms that can tackle new, larger scientific problems, and leveraging artificial intelligence to gain insight from cosmological data.
Bert de Jong leads the Computational Chemistry, Materials, and Climate Group, which advances scientific computing by developing and enhancing applications in key disciplines, as well as developing tools and libraries for addressing general problems in computational science.
de Jong is the director of the LBNL Quantum Algorithms Team QAT4Chem and the team director of the Accelerated Research for Quantum Computing (ARQC) Team AIDE-QC, both funded by DOE ASCR and focused on developing software stacks, algorithms, and computer science and applied mathematics solutions for chemical sciences and other fields on near-term quantum computing devices. He is also a co-PI on the ARQC team FAR-QC (led out of Sandia) and is part of LBNL’s quantum testbed, developing superconducting qubits. He is the LBNL lead for the Basic Energy Sciences Quantum Information Sciences project (led out of PNNL), where he is focusing on new approaches for encoding wave functions and embedding quantum systems. In addition, he is a co-PI on an LBNL-led, HEP-funded quantum information science project.
de Jong is a co-PI within the DOE ASCR Exascale Computing Project (ECP) as the LBNL lead for the NWChemEx effort, contributing to the development of an exascale computational chemistry code. He is the LBNL lead for the Basic Energy Sciences SPEC Computational Chemistry Center (led out of PNNL), where he is working on reduced-scaling MCSCF and beyond-GW approaches for molecules.
He leads an LBNL-funded effort on machine learning for chemical sciences, focused on developing deep learning networks (GANs and autoencoders) for predicting structure-function relationships and their inverse, with a demonstration in mass spectrometry. As part of this effort, his team developed the ML4Chem Python package.
Areas of expertise: software and algorithms for near-term quantum computing devices, machine learning, supercomputing, computational chemistry.
During his career with NETL, U.S. Army veteran Jimmy Thornton has worked tirelessly to advance new technology development for Fossil Energy (FE), and that remains true today in his current efforts to investigate uses for artificial intelligence (AI) and machine learning (ML) in FE technology development.
Born in Kentucky and raised in Campbells Creek, Thornton joined the U.S. Army at the encouragement of his high school baseball coach, who was an Army Reserve drill instructor. Trained as an infantryman and entering service in early 1983, Thornton was stationed in Germany, where he completed French Commando School in Givet, France.
Leaving active service in 1987, Thornton joined the Kentucky National Guard while studying at Eastern Kentucky University, and he later transferred to the West Virginia National Guard after accepting a professional internship with the U.S. Department of Energy (DOE) in Morgantown in 1988. Commissioned as an officer in 1992, he served with the 201st Field Artillery and was deployed to Iraq in 2004 during Operation Iraqi Freedom.
Achieving the rank of major, Thornton retired in 2010 with more than 27 years of service and continues to serve the 201st as an active member of the 201st Association. He said many of the skills and life lessons the army taught him continue to guide him at NETL, where he started working in 1991 when it was known as the Morgantown Energy Technology Center.
Thornton’s work at the Lab as associate director for the Computational Science and Engineering directorate includes advancing applied artificial intelligence and machine learning, which he said have great potential to benefit the country’s energy industries, especially the existing fleet of coal-fired power plants and the subsurface. He noted that increased data availability and the use of supercomputers can speed up the development cycle of new tools for decision making, because machine learning techniques can provide insights beyond our current understanding.
Dani Ushizima, PhD, is a Staff Scientist at Lawrence Berkeley National Laboratory, a Data Scientist at UC Berkeley, and an Affiliate Faculty at UC San Francisco. Over more than a decade at LBNL, her research in image analysis and pattern recognition has impacted a broad array of scientific investigations that depend on experimental data, particularly images. In 2015, Ushizima received the U.S. Department of Energy Early Career award to focus on pattern recognition applied to diverse scientific domains, such as structural analysis of materials science samples. She is also a recipient of the Science without Borders Researcher award (CNPq/Brazil) for her work on machine learning applied to cytology, as part of an initiative focused on public healthcare. She has also led the Image Processing team for the Center for Advanced Mathematics for Energy Related Applications (CAMERA). Recently, she has been investigating lung scans for COVID-19 screening as part of initiatives related to the National Virtual Biotechnology Laboratory (NVBL).
COVID-19-related research: "Can CT Scans Be Used to Quickly and Accurately Diagnose COVID-19?"
Emily Donahue is a member of the technical staff at Sandia. She applies state-of-the-art machine learning innovations to novel applications for national security. She performs research in unsupervised learning, anomaly detection, and data-driven code acceleration. Emily earned her Master of Engineering at Cornell University with a focus in computer vision. While away from her computer, she enjoys landscape painting and rock climbing.