eScholarship
Open Access Publications from the University of California

LBL Publications

Lawrence Berkeley National Laboratory (Berkeley Lab) has been a leader in science and engineering research for more than 70 years. Located on a 200-acre site in the hills above the Berkeley campus of the University of California, overlooking San Francisco Bay, Berkeley Lab is a U.S. Department of Energy (DOE) National Laboratory managed by the University of California. It has an annual budget of nearly $480 million (FY2002) and employs a staff of about 4,300, including more than a thousand students.

Berkeley Lab conducts unclassified research across a wide range of scientific disciplines, with key efforts in fundamental studies of the universe; quantitative biology; nanoscience; new energy systems and environmental solutions; and the use of integrated computing as a tool for discovery. It is organized into 17 scientific divisions and hosts four DOE national user facilities.

Deep Generative Models for Fast Photon Shower Simulation in ATLAS

(2024)

The need for large-scale production of highly accurate simulated event samples for the extensive physics programme of the ATLAS experiment at the Large Hadron Collider motivates the development of new simulation techniques. Building on the recent success of deep learning algorithms, variational autoencoders and generative adversarial networks are investigated for modelling the response of the central region of the ATLAS electromagnetic calorimeter to photons of various energies. The properties of synthesised showers are compared with showers from a full detector simulation using Geant4. Both variational autoencoders and generative adversarial networks are capable of quickly simulating electromagnetic showers with correct total energies and stochasticity, though the modelling of some shower shape distributions requires more refinement. This feasibility study demonstrates the potential of using such algorithms for ATLAS fast calorimeter simulation in the future and shows a possible way to complement current simulation techniques.
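To make the approach concrete, below is a minimal sketch of a conditional variational autoencoder for shower generation, assuming showers are flattened into a fixed vector of calorimeter cell energies conditioned on the photon energy. The cell count, layer widths, and conditioning scheme are illustrative assumptions, not the architecture used by ATLAS.

```python
# Minimal sketch of a conditional VAE for calorimeter shower cells.
# The cell count, layer sizes, and conditioning scheme are illustrative
# assumptions, not the ATLAS architecture.
import torch
import torch.nn as nn

N_CELLS = 266   # assumed number of flattened calorimeter cells
LATENT = 10

class ShowerVAE(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: cells + true photon energy -> latent mean / log-variance
        self.enc = nn.Sequential(nn.Linear(N_CELLS + 1, 128), nn.ReLU())
        self.mu = nn.Linear(128, LATENT)
        self.logvar = nn.Linear(128, LATENT)
        # Decoder: latent + photon energy -> positive cell energies
        self.dec = nn.Sequential(
            nn.Linear(LATENT + 1, 128), nn.ReLU(),
            nn.Linear(128, N_CELLS), nn.Softplus())

    def forward(self, cells, e_photon):
        h = self.enc(torch.cat([cells, e_photon], dim=1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        out = self.dec(torch.cat([z, e_photon], dim=1))
        return out, mu, logvar

def vae_loss(recon, cells, mu, logvar):
    # Reconstruction term plus KL divergence to the unit Gaussian prior
    rec = nn.functional.mse_loss(recon, cells, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

# Fast simulation after training: sample latent noise, condition on energy
model = ShowerVAE().eval()
with torch.no_grad():
    z = torch.randn(1000, LATENT)
    e = torch.full((1000, 1), 65.0)  # e.g. 65 GeV photons
    showers = model.dec(torch.cat([z, e], dim=1))
```

The speed advantage over full simulation comes from the single forward pass per shower; the trade-off noted in the abstract is the fidelity of shower shape distributions.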

Artificial Intelligence for the Electron Ion Collider (AI4EIC)

(2024)

The Electron-Ion Collider (EIC), a state-of-the-art facility for studying the strong force, is expected to begin commissioning its first experiments in 2028. This makes it an opportune time to incorporate artificial intelligence (AI) at the facility from the start, in all phases leading up to the experiments. The second annual workshop organized by the AI4EIC working group centered on exploring all current and prospective application areas of AI for the EIC. The workshop not only benefits the EIC but also provides valuable insights for the newly established ePIC collaboration at the EIC. This paper summarizes the activities and R&D projects covered across the workshop sessions and gives an overview of the goals, approaches, and strategies regarding AI/ML in the EIC community, as well as cutting-edge techniques currently studied in other experiments.

Software Performance of the ATLAS Track Reconstruction for LHC Run 3

(2024)

Charged particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions at the LHC is a challenging task for the ATLAS experiment’s reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) promptly using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.
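As a rough illustration of the combinatorial scaling the paper addresses, the toy sketch below forms triplet seeds from hits on three detector layers and counts the candidates that survive a simple compatibility window. The geometry, cuts, and multiplicities are invented for illustration; the actual ATLAS seeding applies far more sophisticated selections.

```python
# Toy model of combinatorial track seeding: hits on three layers are
# combined into triplets and kept if their azimuthal angles are roughly
# consistent. All numbers are invented for illustration only.
import itertools
import random

random.seed(0)

def make_hits(n_tracks):
    # Each track leaves one hit (an azimuthal angle) on each of 3 layers.
    layers = {0: [], 1: [], 2: []}
    for _ in range(n_tracks):
        phi = random.uniform(0.0, 6.28)
        for hits in layers.values():
            hits.append(phi + random.gauss(0.0, 0.002))
    return layers

def count_seeds(layers, window=0.01):
    # Brute-force triplet formation with a simple compatibility cut;
    # this triple loop is what makes seeding combinatorially expensive.
    seeds = 0
    for a, b, c in itertools.product(*layers.values()):
        if abs(a - b) < window and abs(b - c) < window:
            seeds += 1
    return seeds

for mu in (10, 30, 60):  # hit multiplicity grows with pile-up
    print(f"pile-up ~{mu}: {count_seeds(make_hits(mu))} candidate seeds")
```

The candidate count grows much faster than linearly with the hit multiplicity, which is why reducing combinatorics is central to keeping prompt reconstruction affordable at high pile-up.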


Estimating geographic variation of infection fatality ratios during epidemics.

(2024)

OBJECTIVES: We aim to estimate geographic variability in the total number of infections and in infection fatality ratios (IFR; the number of deaths caused by an infection per 1,000 infected people) when the availability and quality of data on disease burden are limited during an epidemic. METHODS: We develop a noncentral hypergeometric framework that accounts for differential probabilities of positive tests and reflects the fact that symptomatic people are more likely to seek testing. We demonstrate the robustness, accuracy, and precision of this framework, and apply it to the United States (U.S.) COVID-19 pandemic to estimate county-level SARS-CoV-2 IFRs. RESULTS: The estimators for the numbers of infections and IFRs showed high accuracy and precision; for instance, when applied to simulated validation data sets, Pearson correlation coefficients between estimator means and true values across counties were 0.996 and 0.928, respectively, and the estimators showed strong robustness to model misspecification. Applying the county-level estimators to the real, unsimulated U.S. COVID-19 data spanning April 1, 2020 to September 30, 2020, we found that IFRs varied from 0 to 44.69 deaths per 1,000 infections, with a standard deviation of 3.55 and a median of 2.14. CONCLUSIONS: The proposed estimation framework can be used to identify geographic variation in IFRs across settings.
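The paper's estimators are more involved, but the core sampling model can be sketched with SciPy's Fisher noncentral hypergeometric distribution, where an odds ratio captures the fact that infected, symptomatic people are more likely to seek testing. All counts below are hypothetical, and the crude moment-matching inversion stands in for the paper's actual estimation procedure.

```python
# Sketch of the core sampling model: among M residents, n are infected,
# N get tested, and infected people are tested with higher odds.
# All numbers are hypothetical.
from scipy.stats import nchypergeom_fisher

M = 20_000   # county population
n = 1_000    # true number of infections (unknown in practice)
N = 2_000    # tests performed
odds = 5.0   # odds an infected person is tested vs. an uninfected one

rv = nchypergeom_fisher(M, n, N, odds)
print("expected positive tests:", rv.mean())

# Crude inversion: scan candidate infection counts and keep those whose
# expected number of positives matches the observed count.
observed = 330
plausible = [k for k in range(200, 4000, 200)
             if abs(nchypergeom_fisher(M, k, N, odds).mean() - observed) < 20]
print("plausible infection counts:", plausible)

# IFR in the paper's units: deaths per 1,000 infected people
deaths = 12
for k in plausible:
    print(f"infections {k}: IFR ~ {1000 * deaths / k:.2f} per 1,000")
```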


Helping Faculty Teach Software Performance Engineering

(2024)

Over the academic year 2022–23, we discussed the teaching of software performance engineering with more than a dozen faculty across North America and beyond. Our outreach was centered on research-focused faculty with an existing interest in this course material. These discussions revealed an enthusiasm for making software performance engineering a more prominent part of a curriculum for computer scientists and engineers. Here, we discuss how MIT’s longstanding efforts in this area may serve as a launching point for community development of a software performance engineering curriculum, challenges in and solutions for providing the necessary infrastructure to universities, and future directions.

AutoCT: Automated CT registration, segmentation, and quantification

(2024)

The processing and analysis of computed tomography (CT) imaging is important for both basic scientific development and clinical applications. In AutoCT, we provide a comprehensive pipeline that integrates end-to-end automatic preprocessing, registration, segmentation, and quantitative analysis of 3D CT scans. The engineered pipeline enables atlas-based CT segmentation and quantification by leveraging diffeomorphic transformations with efficient forward and inverse mappings. The localized features extracted from the deformation field allow for downstream statistical learning that may facilitate medical diagnostics. On a lightweight and portable software platform, AutoCT provides a new toolkit for the CT imaging community to underpin the deployment of artificial intelligence-driven applications.
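Since the abstract does not show AutoCT's programming interface, the skeleton below merely mirrors the four stages it describes (preprocessing, registration, segmentation, quantification) with placeholder implementations; every function name and array shape here is hypothetical.

```python
# Hypothetical skeleton mirroring the stages AutoCT describes; the real
# package's module and function names may differ. NumPy arrays stand in
# for loaded CT volumes.
import numpy as np

def preprocess(volume: np.ndarray) -> np.ndarray:
    # e.g. clip to a Hounsfield-unit window and normalize to [0, 1]
    v = np.clip(volume, -1000, 1000)
    return (v + 1000) / 2000

def register_to_atlas(volume, atlas):
    # Placeholder for a diffeomorphic registration; returns the warped
    # volume plus forward/inverse deformation fields (identity here).
    identity = np.zeros(volume.shape + (3,))
    return volume, identity, identity

def segment_with_atlas(volume, atlas_labels, inverse_field):
    # Atlas-based segmentation: map atlas labels back through the
    # inverse deformation (identity placeholder leaves them unchanged).
    return atlas_labels

def quantify(volume, labels):
    # Per-region summary statistics for downstream statistical learning.
    return {int(r): float(volume[labels == r].mean())
            for r in np.unique(labels)}

scan = np.random.uniform(-1000, 1000, (64, 64, 64))
atlas = np.zeros_like(scan)
atlas_labels = np.random.randint(0, 3, scan.shape)

vol = preprocess(scan)
warped, fwd, inv = register_to_atlas(vol, atlas)
labels = segment_with_atlas(warped, atlas_labels, inv)
print(quantify(warped, labels))
```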


Old-aged groundwater contributes to mountain hillslope hydrologic dynamics

(2024)

Understanding connectivity between the soil and deeper bedrock groundwater is needed to accurately predict a watershed's response to perturbations such as drought. Yet the bedrock groundwater dynamics in mountainous environments are typically under-constrained and excluded from watershed hydrologic models. Here, we investigate the role of groundwater characterized by decadal and longer water ages in the hydrologic and mass-transport processes within a steep snow-dominated mountain hillslope in the Central Rocky Mountains (USA). We quantify subsurface and surface water mass balance, groundwater flowpaths, and age distributions using the ParFlow-CLM integrated hydrologic and EcoSLIM particle tracking models, which are compared to hydrometric and environmental tracer observations. An ensemble of models with varied soil and hydrogeologic parameters reproduces observed groundwater levels and the century-scale mean ages inferred from environmental tracers. The numerical models suggest soil water near the toe of the hillslope contains considerable contributions (>60 % of the mass flux) from bedrock flowpaths characterized by water ages >10 years. Flowpath connectivity between the deeper bedrock and soil systems is present throughout the year, highlighting the potentially critical role of old groundwater in processes such as evapotranspiration and streamflow generation. The coupled numerical model and groundwater age observations show that the bedrock groundwater system influences the hillslope hydrodynamics and should be considered in mountain watershed conceptual and numerical models.
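As a small illustration of the particle-based bookkeeping behind such numbers, the sketch below computes the fraction of mass flux carried by flowpaths older than 10 years and a mass-weighted mean water age from a synthetic particle ensemble. The (mass, age) arrays are stand-ins; EcoSLIM's actual output format is not assumed here.

```python
# Sketch: fraction of soil-water mass flux carried by old flowpaths,
# from an EcoSLIM-style particle ensemble. The (mass, age) layout is a
# hypothetical stand-in for the model's actual output files.
import numpy as np

rng = np.random.default_rng(42)
mass = rng.uniform(0.5, 2.0, 10_000)                   # particle mass (kg)
age = rng.lognormal(mean=2.5, sigma=1.2, size=10_000)  # water age (years)

old = age > 10.0
frac_old = mass[old].sum() / mass.sum()
mean_age = np.average(age, weights=mass)               # mass-weighted mean age

print(f"mass flux from flowpaths older than 10 yr: {frac_old:.1%}")
print(f"mass-weighted mean water age: {mean_age:.0f} yr")
```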


Survey on Efficient and Productive Use of Electricity in Women-Run Small Businesses in Uganda

(2024)

In Uganda, women make up a large portion of the service sector, notably in informal small and micro businesses such as retail shops, food preparation, tailoring, sewing, beer brewing, basket weaving, healthcare, and hairdressing. Lawrence Berkeley National Laboratory (LBNL) and USAID, in partnership with the Clean Energy Enthusiasts (CEE), a local implementer in Uganda, designed and conducted a participatory survey to collect baseline data and information on the current state of energy access and its benefits. The study included data on the uptake of efficient and productive use of electricity (EPUE) equipment, business skills, and financial needs among small and mid-size women-led businesses in Uganda. The objective of the survey was to build evidence and identify economic activities where access to electricity has a positive impact on women’s empowerment by helping them develop new and existing businesses, improve productivity, and increase their income. The survey included questions about the types of electric equipment used and needed, electricity consumption patterns, alternatives to electricity, and business and capital barriers related to increased access. The survey results provided the basis for designing a training program tailored to the needs of women entrepreneurs in the region.


Reinvented: An Attosecond Chemist

(2024)

Attosecond science requires a substantial rethinking of how to make measurements on very short timescales; how to acquire the necessary equipment, technology, and personnel; and how to build a set of laboratories for such experiments. This entails a rejuvenation of the author in many respects: in the laboratory itself, with regard to students and postdocs, and in generating funding for research. It also raises questions of what it means to do attosecond science, and the discovery of the power of X-ray spectroscopy itself, which complements the short timescales addressed. The lessons learned, expressed in the meanderings of this autobiographical article, may be of benefit to others who try to reinvent themselves.


How the AI-assisted discovery and synthesis of a ternary oxide highlights capability gaps in materials science.

(2024)

Exploratory synthesis has been the main generator of new inorganic materials for decades. However, our Edisonian and bias-prone processes of synthetic exploration alone are no longer sufficient in an age that demands rapid advances in materials development. In this work, we demonstrate an end-to-end attempt at systematic, computer-aided discovery and laboratory synthesis of inorganic crystalline compounds as a modern alternative to purely exploratory synthesis. Our approach initializes materials discovery campaigns by autonomously mapping the synthetic feasibility of a chemical system using density functional theory (DFT) with AI feedback. Following expert-driven down-selection of newly generated phases, we use solid-state synthesis and in situ characterization via hot-stage X-ray diffraction to realize new ternary oxide phases experimentally. We applied this strategy in six ternary transition-metal oxide chemistries previously considered well explored, one of which culminated in the discovery of two novel calcium ruthenate phases. Detailed characterization using room-temperature X-ray powder diffraction, 4D-STEM, and SQUID measurements identifies the structure and composition of one of the new phases formed in our experimental campaigns and confirms its distinct properties, including differing defect concentrations. While the discovery of a new material guided by AI and DFT represents a milestone, our procedure and results also highlight a number of critical gaps in the process that can inform future efforts to improve AI-coupled methodologies.
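For the thermodynamic down-selection step, a standard approach is to rank candidate phases by their energy above the convex hull; the sketch below does this with pymatgen. The compositions and energies are invented for illustration, and a real campaign would use DFT-computed values plus the expert-driven criteria the paper describes.

```python
# Sketch of thermodynamic down-selection with pymatgen: rank candidate
# phases by energy above the convex hull. Compositions and energies are
# hypothetical; a real campaign would use DFT-computed values.
from pymatgen.core import Composition
from pymatgen.entries.computed_entries import ComputedEntry
from pymatgen.analysis.phase_diagram import PhaseDiagram

# (formula, total energy in eV) -- invented numbers for illustration
raw = [("Ca", -1.9), ("Ru", -9.2), ("O2", -9.9),
       ("CaO", -12.8), ("RuO2", -22.9),
       ("CaRuO3", -38.1), ("Ca2RuO4", -52.0), ("Ca3Ru2O7", -90.0)]

entries = [ComputedEntry(Composition(f), e) for f, e in raw]
pdiag = PhaseDiagram(entries)

for entry in entries:
    e_hull = pdiag.get_e_above_hull(entry)
    # 0.05 eV/atom is a common, heuristic synthesizability cutoff
    flag = "synthesize?" if e_hull < 0.05 else "skip"
    print(f"{entry.composition.reduced_formula:10s} "
          f"E_hull = {e_hull:6.3f} eV/atom  -> {flag}")
```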