|National Security Space Defense and Protection: Public Report
Released 2016-08-16 Forthcoming/Prepublication
It is not yet 60 years since the first artificial satellite was placed into Earth orbit. In just over a half century, mankind has gone from no presence in outer space to a condition of high dependence on orbiting satellites. These sensors, receivers, transmitters, and other such devices, as well as the satellites that carry them, are components of complex space systems that include terrestrial elements, electronic links between and among components, organizations to provide the management, care and feeding, and launch systems that put satellites into orbit. In many instances, these space systems connect with and otherwise interact with terrestrial systems; for example, a very long list of Earth-based systems cannot function properly without information from the Global Positioning System (GPS).
Space systems are fundamental to the information business, and the modern world is an information-driven one. In addition to navigation (and associated timing), space systems provide communications and imagery and other Earth-sensing functions. Among these systems are many that support military, intelligence, and other national security functions of the United States and many other nations. Some of these are unique government, national security systems; however, functions to support national security are also provided by commercial and civil-government space systems.
The importance of space systems to the United States and its allies and potential adversaries raises major policy issues. National Security Space Defense and Protection reviews the range of options available to address threats to space systems, in terms of deterring, defeating, and surviving hostile actions, and assesses potential strategies and plans to counter such threats. This report recommends architectures, capabilities, and courses of action to address such threats and actions to address affordability, technology risk, and other potential barriers or limiting factors in implementing such courses of action.
|New Worlds, New Horizons: A Midterm Assessment
Released 2016-08-15 Forthcoming/Prepublication
New Worlds, New Horizons in Astronomy and Astrophysics (NWNH), the report of the 2010 decadal survey of astronomy and astrophysics, put forward a vision for a decade of transformative exploration at the frontiers of astrophysics. This vision included mapping the first stars and galaxies as they emerge from the collapse of dark matter and cold clumps of hydrogen, finding new worlds in a startlingly diverse population of extrasolar planets, and exploiting the vastness and extreme conditions of the universe to reveal new information about the fundamental laws of nature. NWNH outlined a compelling program for understanding the cosmic order and for opening new fields of inquiry through the discovery areas of gravitational waves, time-domain astronomy, and habitable planets. Many of these discoveries are likely to be enabled by cyber-discovery and the power of mathematics, physics, and imagination. To help realize this vision, NWNH recommended a suite of innovative and powerful facilities, along with balanced, strong support for the scientific community engaged in theory, data analysis, technology development, and measurements with existing and new instrumentation. Already in the first half of the decade, scientists and teams of scientists working with these cutting-edge instruments and with new capabilities in data collection and analysis have made spectacular discoveries that advance the NWNH vision.
New Worlds, New Horizons: A Midterm Assessment reviews the responses of NASA’s Astrophysics program, NSF’s Astronomy program, and DOE’s Cosmic Frontiers program to NWNH. This report describes the most significant scientific discoveries, technical advances, and relevant programmatic changes in astronomy and astrophysics over the years since the publication of the decadal survey, and assesses how well these agencies’ programs address the strategies, goals, and priorities outlined in the 2010 decadal survey.
|NASA Space Technology Roadmaps and Priorities Revisited
Released 2016-08-12 Forthcoming/Prepublication
Historically, the United States has been a world leader in aerospace endeavors in both the government and commercial sectors. A key factor in aerospace leadership is continuous development of advanced technology, which is critical to U.S. ambitions in space, including a human mission to Mars. To continue to achieve progress, NASA is currently executing a series of aeronautics and space technology programs using a roadmapping process to identify technology needs and improve the management of its technology development portfolio.
NASA created a set of 14 draft technology roadmaps in 2010 to guide the development of space technologies. In 2015, NASA issued a revised set of roadmaps. A significant new aspect of the update has been the effort to assess the relevance of the technologies by listing the enabling and enhancing technologies for specific design reference missions (DRMs) from the Human Exploration and Operations Mission Directorate and the Science Mission Directorate. NASA Space Technology Roadmaps and Priorities Revisited prioritizes new technologies in the 2015 roadmaps and recommends a methodology for conducting independent reviews of future updates to NASA’s space technology roadmaps, which are expected to occur every 4 years.
|Optimizing the Air Force Acquisition Strategy of Secure and Reliable Electronic Components: Proceedings of a Workshop
In 2012, the National Defense Authorization Act (NDAA), section 818, outlined new requirements for industry to serve as the lead in averting counterfeits in the defense supply chain. Subsequently, the House Armed Services Committee, in its report on the Fiscal Year 2016 NDAA, noted that the pending sale of IBM’s microprocessor fabrication facilities to Global Foundries created uncertainty about future access of the United States to trusted state-of-the-art microelectronic components and directed the Comptroller General to assess the Department of Defense’s (DoD’s) actions and measures to address this threat.
In this context, the National Academies of Sciences, Engineering, and Medicine convened a workshop to facilitate an open dialogue with leading industry, academic, and government experts to (1) define the current technological and policy challenges with maintaining a reliable and secure source of microelectronic components; (2) review the current state of acquisition processes within the Air Force for acquiring reliable and secure microelectronic components; and (3) explore options for possible business models within the national security complex that would be relevant for the Air Force acquisition community. This publication summarizes the results of the workshop.
|Exploring Encryption and Potential Mechanisms for Authorized Government Access to Plaintext: Proceedings of a Workshop
Released 2016-07-29 Forthcoming/Prepublication
In June 2016 the National Academies of Sciences, Engineering, and Medicine convened the Workshop on Encryption and Mechanisms for Authorized Government Access to Plaintext. Participants at this workshop discussed potential encryption strategies that would enable access to plaintext information by law enforcement or national security agencies with appropriate authority. Although the focus of the workshop was on technical issues, there was some consideration of the broader policy context, and discussion about the topics of encryption and authorized exceptional analysis frequently addressed open policy questions as well as technical issues. This publication summarizes the presentations and discussions from the workshop.
|Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020
Advanced computing capabilities are used to tackle a rapidly growing range of challenging science and engineering problems, many of which are compute- and data-intensive as well. Demand for advanced computing has been growing for all types and capabilities of systems, from large numbers of single commodity nodes to jobs requiring thousands of cores; for systems with fast interconnects; for systems with excellent data handling and management; and for an increasingly diverse set of applications that includes data analytics as well as modeling and simulation. Since the advent of its supercomputing centers, the National Science Foundation (NSF) has provided its researchers with state-of-the-art computing systems. The growth of new models of computing, including cloud computing and publicly available but privately held data repositories, opens up new possibilities for NSF.
In order to better understand the expanding and diverse requirements of the science and engineering community and the importance of a new broader range of advanced computing infrastructure, the NSF requested that the National Research Council carry out a study examining anticipated priorities and associated tradeoffs for advanced computing. Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020 provides a framework for future decision-making about NSF’s advanced computing strategy and programs. It offers recommendations aimed at achieving four broad goals: (1) position the U.S. for continued leadership in science and engineering, (2) ensure that resources meet community needs, (3) aid the scientific community in keeping up with the revolution in computing, and (4) sustain the infrastructure for advanced computing.
|Continuing Innovation in Information Technology: Workshop Report
The 2012 National Research Council report Continuing Innovation in Information Technology illustrates how fundamental research in information technology (IT), conducted at industry and universities, has led to the introduction of entirely new product categories that ultimately became billion-dollar industries. The central graphic from that report portrays and connects areas of major investment in basic research, university-based research, and industry research and development; the introduction of important commercial products resulting from this research; billion-dollar-plus industries stemming from it; and present-day IT market segments and representative U.S. firms whose creation was stimulated by the decades-long research.
At a workshop hosted by the Computer Science and Telecommunications Board on March 5, 2015, leading academic and industry researchers and industrial technologists described key research and development results and their contributions and connections to new IT products and industries, and illustrated these developments as overlays to the 2012 "tire tracks" graphic. The principal goal of the workshop was to collect and make available to policy makers and members of the IT community first-person narratives that illustrate the link between government investments in academic and industry research to the ultimate creation of new IT industries. This report provides summaries of the workshop presentations organized into five broad themes - (1) fueling the innovation pipeline, (2) building a connected world, (3) advancing the hardware foundation, (4) developing smart machines, and (5) people and computers - and ends with a summary of remarks from the concluding panel discussion.
|Space Studies Board Annual Report 2015
The original charter of the Space Science Board was established in June 1958, 3 months before the National Aeronautics and Space Administration (NASA) opened its doors. The Space Science Board and its successor, the Space Studies Board (SSB), have provided expert external and independent scientific and programmatic advice to NASA on a continuous basis from NASA's inception until the present. The SSB has also provided such advice to other executive branch agencies, including the National Oceanic and Atmospheric Administration (NOAA), the National Science Foundation (NSF), the U.S. Geological Survey (USGS), and the Department of Defense, as well as to Congress.
Space Studies Board Annual Report 2015 opens with a message from the chair of the SSB, David N. Spergel. The report also describes the origins of the Space Science Board, how the Space Studies Board functions today, and the SSB's collaboration with other National Research Council units; explains how the quality of SSB reports is assured; acknowledges the board's audience and sponsors; and underscores the need to enhance outreach and improve dissemination of SSB reports.
This report will be relevant to a full range of government audiences in civilian space research - including NASA, NSF, NOAA, USGS, and the Department of Energy - as well as to members of the SSB, policy makers, and researchers.
|Electricity Use in Rural and Islanded Communities: Summary of a Workshop
On behalf of the Quadrennial Energy Review (QER) Task Force, the National Academies of Sciences, Engineering, and Medicine hosted a workshop on February 8-9, 2016, titled "Electricity Use in Rural and Islanded Communities." The objective of the workshop was to support the QER Task Force's public outreach efforts by focusing on communities with unique electricity challenges. The workshop explored challenges and opportunities for reducing electricity use and associated greenhouse gas emissions while improving electricity system reliability and resilience in rural and islanded communities. This report summarizes the presentations and discussions of the workshop.
|Effects of the Deletion of Chemical Agent Washout on Operations at the Blue Grass Chemical Agent Destruction Pilot Plant
The United States manufactured significant quantities of chemical weapons during the Cold War and the years prior. Because the chemical weapons are aging, storage constitutes an ongoing risk to the facility workforces and to the communities nearby. In addition, the Chemical Weapons Convention treaty stipulates that the chemical weapons be destroyed. The United States has destroyed approximately 90 percent of the chemical weapons stockpile located at seven sites.
As part of the effort to destroy its remaining stockpile, the Department of Defense is building the Blue Grass Chemical Agent Destruction Pilot Plant (BGCAPP) on the Blue Grass Army Depot (BGAD), near Richmond, Kentucky. The stockpile stored at BGAD consists of rockets and projectiles containing the nerve agents GB and VX and the blister agent mustard. Continued storage poses a risk to the BGAD workforce and the surrounding community because these munitions are several decades old and are developing leaks.
Due to public opposition to the use of incineration to destroy the BGAD stockpile, Congress mandated that non-incineration technologies be identified for use at BGCAPP. As a result, the original BGCAPP design called for munitions to be drained of agent and then for the munition bodies to be washed out using high-pressure hot water. However, as part of a larger package of modifications called Engineering Change Proposal 87 (ECP-87), the munition washout step was eliminated. Effects of the Deletion of Chemical Agent Washout on Operations at the Blue Grass Chemical Agent Destruction Pilot Plant examines the impacts of this design change on operations at BGCAPP and makes recommendations to guide future decision making.
|Commercial Aircraft Propulsion and Energy Systems Research: Reducing Global Carbon Emissions
The primary human activities that release carbon dioxide (CO2) into the atmosphere are the combustion of fossil fuels (coal, natural gas, and oil) to generate electricity, the provision of energy for transportation, and some industrial processes. Although aviation CO2 emissions make up only approximately 2.0 to 2.5 percent of total global annual CO2 emissions, research to reduce them is urgent because (1) such reductions may be legislated even as commercial air travel grows, (2) new technology takes a long time to propagate into and through the aviation fleet, and (3) global CO2 emissions have an ongoing cumulative impact.
Commercial Aircraft Propulsion and Energy Systems Research develops a national research agenda for reducing CO2 emissions from commercial aviation. This report focuses on propulsion and energy technologies for reducing carbon emissions from large commercial aircraft - single-aisle and twin-aisle aircraft that carry 100 or more passengers - because such aircraft account for more than 90 percent of global emissions from commercial aircraft. Moreover, while smaller aircraft also emit CO2, they make only a minor contribution to global emissions, and many technologies that reduce CO2 emissions for large aircraft also apply to smaller aircraft.
As commercial aviation continues to grow in terms of revenue-passenger miles and cargo ton miles, CO2 emissions are expected to increase. To reduce the contribution of aviation to climate change, it is essential to improve the effectiveness of ongoing efforts to reduce emissions and initiate research into new approaches.
|Achieving Science with CubeSats: Thinking Inside the Box
Released 2016-05-23 Forthcoming/Prepublication
Space-based observations have transformed our understanding of Earth, its environment, the solar system, and the universe at large. During past decades, driven by increasingly advanced science questions, space observatories have become more sophisticated and more complex, with costs often growing to billions of dollars. Although these kinds of ever-more-sophisticated missions will continue into the future, small satellites, ranging in mass from 0.1 kg to 500 kg, are gaining momentum as an additional means to address targeted science questions in a rapid, and possibly more affordable, manner. Within the category of small satellites, CubeSats have emerged as a space platform defined in terms of 10 cm x 10 cm x 10 cm cubic units of approximately 1.3 kg each, called “U’s.” Historically, CubeSats were developed as training projects to expose students to the challenges of real-world engineering practices and system design. Yet their use has rapidly spread within academia, industry, and government agencies both nationally and internationally.
In particular, CubeSats have caught the attention of parts of the U.S. space science community, which sees this platform, despite its inherent constraints, as a way to affordably access space and perform unique measurements of scientific value. The first science results from such CubeSats have only recently become available; however, questions remain regarding the scientific potential and technological promise of CubeSats in the future.
Achieving Science with CubeSats reviews the current state of the scientific potential and technological promise of CubeSats. This report focuses on the platform’s promise to obtain high-priority science data, as defined in recent decadal surveys in astronomy and astrophysics, Earth science and applications from space, planetary science, and solar and space physics (heliophysics); the science priorities identified in the 2014 NASA Science Plan; and the potential for CubeSats to advance biology and microgravity research. It provides a list of sample science goals for CubeSats, many of which address targeted science, often in coordination with other spacecraft, or use “sacrificial,” or high-risk, orbits that lead to the demise of the satellite after critical data have been collected. Other goals relate to the use of CubeSats as constellations or swarms deploying tens to hundreds of CubeSats that function as one distributed array of measurements.
|2015-2016 Assessment of the Army Research Laboratory: Interim Report
The National Academies of Sciences, Engineering, and Medicine's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory (ARL), focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences.
This interim report summarizes the findings of the Board for the first year of this biennial assessment; the current report addresses approximately half the portfolio for each campaign; the remainder will be assessed in 2016. During the first year the Board examined the following elements within the ARL's science and technology campaigns: biological and bioinspired materials, energy and power materials, and engineered photonics materials; battlefield injury mechanisms, directed energy, and armor and adaptive protection; sensing and effecting, and system intelligence and intelligent systems; advanced computing architectures, computing sciences, data-intensive sciences, and predictive simulation sciences; human-machine interaction, intelligence and control, and perception; humans in multiagent systems, real-world behavior, and toward human variability; and mission capability of systems. A second, final report will subsume the findings of this interim report and add the findings from the second year of the review.
|Applying Materials State Awareness to Condition-Based Maintenance and System Life Cycle Management: Summary of a Workshop
In August 2014, the committee on Defense Materials Manufacturing and Infrastructure convened a workshop to discuss issues related to applying materials state awareness to condition-based maintenance and system life cycle management. The workshop was structured around three focal topics: (1) advances in metrology and experimental methods, (2) advances in physics-based models for assessment, and (3) advances in databases and diagnostic technologies. This report summarizes the discussions and presentations from this workshop.
|New Frontiers in Solar System Exploration
Over the last four decades, robotic spacecraft have visited nearly every planet, from torrid Mercury to frigid Neptune. The data returned by these Pioneers, Mariners, Vikings, and Voyagers have revolutionized our understanding of the solar system. These achievements rank among the greatest accomplishments of the 20th century. Now, at the opening of the 21st, it is appropriate to ask, where do we go from here?
In 2001, NASA asked the National Academies to study the current state of solar system exploration in the United States and devise a set of scientific priorities for missions in the upcoming decade (2003-2013). After soliciting input from hundreds of scientists around the nation and abroad, the Solar System Exploration Survey produced the discipline's first long-range, community-generated strategy and set of mission priorities: New Frontiers in the Solar System: An Integrated Exploration Strategy. The key mission recommendations made in the report, and the scientific goals from which the recommendations flow, are summarized in this booklet.
|Mainstreaming Unmanned Undersea Vehicles into Future U.S. Naval Operations: Abbreviated Version of a Restricted Report
At the request of the former Chief of Naval Operations, the National Academies of Sciences, Engineering, and Medicine appointed an expert committee to assess the potential of unmanned undersea vehicles (UUVs) in enhancing future U.S. naval operations. The Department of the Navy has determined that the final report prepared by the committee is restricted in its entirety under exemption 3 of the Freedom of Information Act (5 USC § 552 (b) (3)), via 10 USC § 130 and therefore cannot be made available to the public. This abbreviated report provides background information on the full report and the committee that prepared it.
|Analytic Research Foundations for the Next-Generation Electric Grid
Electricity is the lifeblood of modern society, and for the vast majority of people that electricity is obtained from large, interconnected power grids. However, the grid that was developed in the 20th century, and the incremental improvements made since then, including its underlying analytic foundations, is no longer adequate to completely meet the needs of the 21st century. The next-generation electric grid must be more flexible and resilient. While fossil fuels will have their place for decades to come, the grid of the future will need to accommodate a wider mix of more intermittent generating sources such as wind and distributed solar photovoltaics.
Achieving this grid of the future will require effort on several fronts. There is a need for continued shorter-term engineering research and development, building on the existing analytic foundations for the grid. But there is also a need for more fundamental research to expand these analytic foundations. Analytic Research Foundations for the Next-Generation Electric Grid provides guidance on the critical longer-term areas of research in the mathematical and computational sciences needed for the next-generation grid. It offers recommendations that are designed to help direct future research as the grid evolves and to give the nation’s research and development infrastructure the tools it needs to effectively develop, test, and use this research.
|An Assessment of the National Institute of Standards and Technology Physical Measurement Laboratory: Fiscal Year 2015
The Physical Measurement Laboratory (PML) at the National Institute of Standards and Technology (NIST) is dedicated to three fundamental and complementary tasks: (1) increase the accuracy of our knowledge of the physical parameters that are the foundation of our technology-driven society; (2) disseminate technologies by which these physical parameters can be accessed in a standardized way by the stakeholders; and (3) conduct research at both fundamental and applied levels to provide knowledge that may eventually lead to advances in measurement approaches and standards. This report assesses the scientific and technical work performed by the PML and identifies salient examples of accomplishments, challenges, and opportunities for improvement for each of its nine divisions.
|Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results: Summary of a Workshop
Questions about the reproducibility of scientific research have been raised in numerous settings and have gained visibility through several high-profile journal and popular press articles. Quantitative issues contributing to reproducibility challenges have been considered (including improper data measurement and analysis, inadequate statistical expertise, and incomplete data, among others), but there is no clear consensus on how best to approach or to minimize these problems.
A lack of reproducibility of scientific results has created some distrust in scientific findings among the general public, scientists, funding agencies, and industries. While studies fail for a variety of reasons, many factors contribute to the lack of perfect reproducibility, including insufficient training in experimental design, misaligned incentives for publication and the implications for university tenure, intentional manipulation, poor data management and analysis, and inadequate statistical inference.
The workshop summarized in this report was designed not to address the social and experimental challenges but instead to focus on the quantitative issues of improper data management and analysis, inadequate statistical expertise, incomplete data, and difficulties applying sound statistical inference to the available data. Many efforts have emerged over recent years to draw attention to and improve reproducibility of scientific work. This report uniquely focuses on the statistical perspective of three issues: the extent of reproducibility, the causes of reproducibility failures, and the potential remedies for these failures.
|Privacy Research and Best Practices: Summary of a Workshop for the Intelligence Community
Recent disclosures about the bulk collection of domestic phone call records and other signals intelligence programs have stimulated widespread debate about the implications of such practices for the civil liberties and privacy of Americans. In the wake of these disclosures, many have identified a need for the intelligence community to engage more deeply with outside privacy experts and stakeholders.
At the request of the Office of the Director of National Intelligence, the National Academies of Sciences, Engineering, and Medicine convened a workshop to address the privacy implications of emerging technologies, public and individual preferences and attitudes toward privacy, and ethical approaches to data collection and use. This report summarizes discussions between experts from academia and the private sector and from the intelligence community on private sector best practices and privacy research results.
|Strategies to Enhance Air Force Communication with Internal and External Audiences: A Workshop Report
The U.S. Air Force (USAF) helps defend the United States and its interests by organizing, training, and equipping forces for operations in and through three distinct domains -- air, space, and cyberspace. The Air Force concisely expresses its vision as "Global Vigilance, Global Reach, and Global Power for America." Operations within each of these domains are dynamic, take place over large distances, occur over different operational timelines, and cannot be routinely seen or recorded, making it difficult for Airmen, national decision makers, and the American people to visualize and comprehend the full scope of Air Force operations. As a result, the Air Force faces increasing difficulty in succinctly and effectively communicating the complexity, dynamic range, and strategic importance of its mission to Airmen and to the American people.
To address this concern, the Chief of Staff of the USAF requested that the National Academies of Sciences, Engineering, and Medicine convene a workshop to explore options for how the Air Force can effectively communicate the strategic importance of the Service, its mission, and the role it plays in the defense of the United States. Participants examined how a diverse workforce, encompassing a myriad of backgrounds and education levels, together with increasingly diverse mission sets, drives the requirement for a new communication strategy. The demographics of today's Air Force create both a unique opportunity and a distinct challenge for Air Force leadership as it strives to communicate its vision and strategy effectively across several micro-cultures within the organization and to the general public. This report summarizes the presentations and discussions from the workshop.
|Affordability of National Flood Insurance Program Premiums: Report 2
When Congress authorized the National Flood Insurance Program (NFIP) in 1968, it intended for the program to encourage community initiatives in flood risk management, charge insurance premiums consistent with actuarial pricing principles, and encourage the purchase of flood insurance by owners of flood prone properties, in part, by offering affordable premiums. The NFIP has been reauthorized many times since 1968, most recently with the Biggert-Waters Flood Insurance Reform Act of 2012 (BW 2012). In this most recent reauthorization, Congress placed a particular emphasis on setting flood insurance premiums following actuarial pricing principles, which was motivated by a desire to ensure future revenues were adequate to pay claims and administrative expenses. BW 2012 was designed to move the NFIP towards risk-based premiums for all flood insurance policies. The result was to be increased premiums for some policyholders who had been paying less than NFIP risk-based premiums and possibly increased premiums for all policyholders.
Recognition of this possibility and concern for the affordability of flood insurance are reflected in sections of the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). These sections called on FEMA to propose a draft affordability framework for the NFIP after completing an analysis of the effects of possible programs for offering “means-tested assistance” to policyholders for whom higher rates may not be affordable.
BW 2012 and HFIAA 2014 mandated that FEMA conduct a study, in cooperation with the National Academies of Sciences, Engineering, and Medicine, which would compare the costs of a program of risk-based rates and means-tested assistance to the current system of subsidized flood insurance rates and federally funded disaster relief for people without coverage. Production of two reports was agreed upon to fulfill this mandate. This second report proposes alternative approaches for a national evaluation of affordability program policy options and includes lessons for the design of a national study from a proof-of-concept pilot study.