|Commercial Aircraft Propulsion and Energy Systems Research: Reducing Global Carbon Emissions
Released 2016-05-24 Forthcoming/Prepublication
The primary human activities that release carbon dioxide (CO2) into the atmosphere are the combustion of fossil fuels (coal, natural gas, and oil) to generate electricity, the provision of energy for transportation, and certain industrial processes. Although aviation CO2 emissions make up only approximately 2.0 to 2.5 percent of total global annual CO2 emissions, research to reduce them is urgent because (1) such reductions may be legislated even as commercial air travel grows, (2) new technology takes a long time to propagate into and through the aviation fleet, and (3) global CO2 emissions have an ongoing impact.
Commercial Aircraft Propulsion and Energy Systems Research develops a national research agenda for reducing CO2 emissions from commercial aviation. This report focuses on propulsion and energy technologies for reducing carbon emissions from large commercial aircraft (single-aisle and twin-aisle aircraft that carry 100 or more passengers), because such aircraft account for more than 90 percent of global emissions from commercial aircraft. Moreover, while smaller aircraft also emit CO2, they make only a minor contribution to global emissions, and many technologies that reduce CO2 emissions for large aircraft also apply to smaller aircraft.
As commercial aviation continues to grow in terms of revenue-passenger miles and cargo ton miles, CO2 emissions are expected to increase. To reduce the contribution of aviation to climate change, it is essential to improve the effectiveness of ongoing efforts to reduce emissions and to initiate research into new approaches.
|Achieving Science with CubeSats: Thinking Inside the Box
Released 2016-05-23 Forthcoming/Prepublication
Space-based observations have transformed our understanding of Earth, its environment, the solar system, and the universe at large. During past decades, driven by increasingly advanced science questions, space observatories have become more sophisticated and more complex, with costs often growing to billions of dollars. Although these kinds of ever-more-sophisticated missions will continue into the future, small satellites, ranging in mass from 0.1 kg to 500 kg, are gaining momentum as an additional means to address targeted science questions in a rapid, and possibly more affordable, manner. Within the category of small satellites, CubeSats have emerged as a space platform defined in terms of 10 cm x 10 cm x 10 cm cubic units of approximately 1.3 kg each, called “U’s.” Historically, CubeSats were developed as training projects to expose students to the challenges of real-world engineering practices and system design. Yet their use has rapidly spread within academia, industry, and government agencies both nationally and internationally.
In particular, CubeSats have caught the attention of parts of the U.S. space science community, which sees this platform, despite its inherent constraints, as a way to affordably access space and perform unique measurements of scientific value. The first science results from such CubeSats have only recently become available; however, questions remain regarding the scientific potential and technological promise of CubeSats in the future.
Achieving Science with CubeSats reviews the current state of the scientific potential and technological promise of CubeSats. This report focuses on the platform’s promise to obtain high-priority science data, as defined in recent decadal surveys in astronomy and astrophysics, Earth science and applications from space, planetary science, and solar and space physics (heliophysics); the science priorities identified in the 2014 NASA Science Plan; and the potential for CubeSats to advance biology and microgravity research. It provides a list of sample science goals for CubeSats, many of which address targeted science, often in coordination with other spacecraft, or use “sacrificial,” or high-risk, orbits that lead to the demise of the satellite after critical data have been collected. Other goals relate to the use of CubeSats in constellations or swarms of tens to hundreds of CubeSats that function as one distributed measurement array.
|2015-2016 Assessment of the Army Research Laboratory: Interim Report
The National Academies of Sciences, Engineering, and Medicine's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory (ARL), focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences.
This interim report summarizes the findings of the Board for the first year of this biennial assessment; the current report addresses approximately half the portfolio for each campaign; the remainder will be assessed in 2016. During the first year the Board examined the following elements within the ARL's science and technology campaigns: biological and bioinspired materials, energy and power materials, and engineered photonics materials; battlefield injury mechanisms, directed energy, and armor and adaptive protection; sensing and effecting, and system intelligence and intelligent systems; advanced computing architectures, computing sciences, data-intensive sciences, and predictive simulation sciences; human-machine interaction, intelligence and control, and perception; humans in multiagent systems, real-world behavior, and toward human variability; and mission capability of systems. A second, final report will subsume the findings of this interim report and add the findings from the second year of the review.
|Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020
Released 2016-04-28 Forthcoming/Prepublication
Advanced computing capabilities are used to tackle a rapidly growing range of challenging science and engineering problems, many of which are both compute- and data-intensive. Demand for advanced computing has been growing for all types and capabilities of systems, from large numbers of single commodity nodes to jobs requiring thousands of cores; for systems with fast interconnects; for systems with excellent data handling and management; and for an increasingly diverse set of applications that includes data analytics as well as modeling and simulation. Since the advent of its supercomputing centers, the National Science Foundation (NSF) has provided its researchers with state-of-the-art computing systems. The growth of new models of computing, including cloud computing and publicly available but privately held data repositories, opens up new possibilities for NSF.
In order to better understand the expanding and diverse requirements of the science and engineering community and the importance of a new, broader range of advanced computing infrastructure, the NSF requested that the National Research Council carry out a study examining anticipated priorities and associated tradeoffs for advanced computing. Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020 provides a framework for future decision-making about NSF’s advanced computing strategy and programs. It offers recommendations aimed at achieving four broad goals: (1) position the U.S. for continued leadership in science and engineering, (2) ensure that resources meet community needs, (3) aid the scientific community in keeping up with the revolution in computing, and (4) sustain the infrastructure for advanced computing.
|Applying Materials State Awareness to Condition-Based Maintenance and System Life Cycle Management: Summary of a Workshop
In August 2014, the Committee on Defense Materials Manufacturing and Infrastructure convened a workshop to discuss issues related to applying materials state awareness to condition-based maintenance and system life cycle management. The workshop was structured around three focal topics: (1) advances in metrology and experimental methods, (2) advances in physics-based models for assessment, and (3) advances in databases and diagnostic technologies. This report summarizes the discussions and presentations from this workshop.
|Mainstreaming Unmanned Undersea Vehicles into Future U.S. Naval Operations: Abbreviated Version of a Restricted Report
At the request of the former Chief of Naval Operations, the National Academies of Sciences, Engineering, and Medicine appointed an expert committee to assess the potential of unmanned undersea vehicles (UUVs) in enhancing future U.S. naval operations. The Department of the Navy has determined that the final report prepared by the committee is restricted in its entirety under exemption 3 of the Freedom of Information Act (5 U.S.C. § 552(b)(3)), via 10 U.S.C. § 130, and therefore cannot be made available to the public. This abbreviated report provides background information on the full report and the committee that prepared it.
|Analytic Research Foundations for the Next-Generation Electric Grid
Electricity is the lifeblood of modern society, and for the vast majority of people that electricity is obtained from large, interconnected power grids. However, the grid that was developed in the 20th century, along with the incremental improvements made since then, including its underlying analytic foundations, is no longer adequate to completely meet the needs of the 21st century. The next-generation electric grid must be more flexible and resilient. While fossil fuels will have their place for decades to come, the grid of the future will need to accommodate a wider mix of more intermittent generating sources, such as wind and distributed solar photovoltaics.
Achieving this grid of the future will require effort on several fronts. There is a need for continued shorter-term engineering research and development, building on the existing analytic foundations for the grid. But there is also a need for more fundamental research to expand these analytic foundations. Analytic Research Foundations for the Next-Generation Electric Grid provides guidance on the longer-term critical areas for research in the mathematical and computational sciences that is needed for the next-generation grid. It offers recommendations that are designed to help direct future research as the grid evolves and to give the nation’s research and development infrastructure the tools it needs to effectively develop, test, and use this research.
|An Assessment of the National Institute of Standards and Technology Physical Measurement Laboratory: Fiscal Year 2015
The Physical Measurement Laboratory (PML) at the National Institute of Standards and Technology (NIST) is dedicated to three fundamental and complementary tasks: (1) increase the accuracy of our knowledge of the physical parameters that are the foundation of our technology-driven society; (2) disseminate technologies by which these physical parameters can be accessed in a standardized way by the stakeholders; and (3) conduct research at both fundamental and applied levels to provide knowledge that may eventually lead to advances in measurement approaches and standards. This report assesses the scientific and technical work performed by the PML and identifies salient examples of accomplishments, challenges, and opportunities for improvement for each of its nine divisions.
|Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results: Summary of a Workshop
Questions about the reproducibility of scientific research have been raised in numerous settings and have gained visibility through several high-profile journal and popular press articles. Quantitative issues contributing to reproducibility challenges have been considered (including improper data measurement and analysis, inadequate statistical expertise, and incomplete data, among others), but there is no clear consensus on how best to approach or to minimize these problems.
A lack of reproducibility of scientific results has created some distrust in scientific findings among the general public, scientists, funding agencies, and industries. While studies fail for a variety of reasons, many factors contribute to the lack of perfect reproducibility, including insufficient training in experimental design, misaligned incentives for publication and the implications for university tenure, intentional manipulation, poor data management and analysis, and inadequate statistical inference.
The workshop summarized in this report was designed not to address the social and experimental challenges but instead to focus on the quantitative issues of improper data management and analysis, inadequate statistical expertise, incomplete data, and difficulties applying sound statistical inference to the available data. Many efforts have emerged over recent years to draw attention to and improve the reproducibility of scientific work. This report uniquely focuses on the statistical perspective of three issues: the extent of reproducibility, the causes of reproducibility failures, and the potential remedies for these failures.
|Privacy Research and Best Practices: Summary of a Workshop for the Intelligence Community
Recent disclosures about the bulk collection of domestic phone call records and other signals intelligence programs have stimulated widespread debate about the implications of such practices for the civil liberties and privacy of Americans. In the wake of these disclosures, many have identified a need for the intelligence community to engage more deeply with outside privacy experts and stakeholders.
At the request of the Office of the Director of National Intelligence, the National Academies of Sciences, Engineering, and Medicine convened a workshop to address the privacy implications of emerging technologies, public and individual preferences and attitudes toward privacy, and ethical approaches to data collection and use. This report summarizes discussions between experts from academia and the private sector and from the intelligence community on private sector best practices and privacy research results.
|Strategies to Enhance Air Force Communication with Internal and External Audiences: A Workshop Report
The U.S. Air Force (USAF) helps defend the United States and its interests by organizing, training, and equipping forces for operations in and through three distinct domains: air, space, and cyberspace. The Air Force concisely expresses its vision as "Global Vigilance, Global Reach, and Global Power for America." Operations within each of these domains are dynamic, take place over large distances, occur over different operational timelines, and cannot be routinely seen or recorded, making it difficult for Airmen, national decision makers, and the American people to visualize and comprehend the full scope of Air Force operations. As a result, the Air Force faces increasing difficulty in succinctly and effectively communicating the complexity, dynamic range, and strategic importance of its mission to Airmen and to the American people.
To address this concern, the Chief of Staff of the USAF requested that the National Academies of Sciences, Engineering, and Medicine convene a workshop to explore options for how the Air Force can effectively communicate the strategic importance of the Service, its mission, and the role it plays in the defense of the United States. Participants worked to address the issue that a workforce encompassing a myriad of backgrounds and education levels, combined with increasingly diverse current mission sets, drives the requirement for a new communication strategy. The demographics of today's Air Force create both a unique opportunity and a distinct challenge for Air Force leadership as it struggles to communicate its vision and strategy effectively across the several micro-cultures within the organization and to the general public. This report summarizes the presentations and discussions from the workshop.
|Affordability of National Flood Insurance Program Premiums: Report 2
When Congress authorized the National Flood Insurance Program (NFIP) in 1968, it intended for the program to encourage community initiatives in flood risk management, charge insurance premiums consistent with actuarial pricing principles, and encourage the purchase of flood insurance by owners of flood-prone properties, in part, by offering affordable premiums. The NFIP has been reauthorized many times since 1968, most recently with the Biggert-Waters Flood Insurance Reform Act of 2012 (BW 2012). In this most recent reauthorization, Congress placed a particular emphasis on setting flood insurance premiums following actuarial pricing principles, motivated by a desire to ensure that future revenues were adequate to pay claims and administrative expenses. BW 2012 was designed to move the NFIP toward risk-based premiums for all flood insurance policies. The result was to be increased premiums for some policyholders who had been paying less than NFIP risk-based premiums, and possibly increased premiums for all policyholders.
Recognition of this possibility and concern for the affordability of flood insurance are reflected in sections of the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). These sections called on FEMA to propose a draft affordability framework for the NFIP after completing an analysis of the effects of possible programs for offering “means-tested assistance” to policyholders for whom higher rates may not be affordable.
BW 2012 and HFIAA 2014 mandated that FEMA conduct a study, in cooperation with the National Academies of Sciences, Engineering, and Medicine, which would compare the costs of a program of risk-based rates and means-tested assistance to the current system of subsidized flood insurance rates and federally funded disaster relief for people without coverage. Production of two reports was agreed upon to fulfill this mandate. This second report proposes alternative approaches for a national evaluation of affordability program policy options and includes lessons for the design of a national study from a proof-of-concept pilot study.