|An Assessment of the National Institute of Standards and Technology Center for Nanoscale Science and Technology: Fiscal Year 2016
At the request of the National Institute of Standards and Technology (NIST), the National Academies of Sciences, Engineering, and Medicine has, since 1959, annually assembled panels of experts from academia, industry, medicine, and other scientific and engineering communities to assess the quality and effectiveness of the NIST measurements and standards laboratories, of which there are now seven, as well as the adequacy of the laboratories’ resources. An Assessment of the National Institute of Standards and Technology Center for Nanoscale Science and Technology: Fiscal Year 2016 assesses the scientific and technical work performed by the NIST Center for Nanoscale Science and Technology, as well as its accomplishments, challenges, and opportunities for improvement.
|Extending Science: NASA's Space Science Mission Extensions and the Senior Review Process
NASA operates a large number of space science missions, approximately three-quarters of which are currently in their extended operations phase. These missions represent not only a majority of operational space science missions but also a substantial national investment and vital national assets. They are tremendously productive scientifically, making many of the major discoveries that are reported in the media and that rewrite textbooks.
Extending Science: NASA’s Space Science Mission Extensions and the Senior Review Process evaluates the scientific benefits of mission extensions, the current process for extending missions, the current biennial requirement for senior reviews of mission extensions, the balance between starting new missions and extending operating missions, and potential innovative cost-reduction proposals for extended missions, and makes recommendations based on this review.
|A 21st Century Cyber-Physical Systems Education
Cyber-physical systems (CPS) are “engineered systems that are built from, and depend upon, the seamless integration of computational algorithms and physical components.” CPS can be small and closed, such as an artificial pancreas, or very large, complex, and interconnected, such as a regional energy grid. CPS engineering focuses on managing interdependencies and the impact of physical aspects on cyber aspects, and vice versa. With the development of low-cost sensing, powerful embedded system hardware, and widely deployed communication networks, the reliance on CPS for system functionality has dramatically increased. These technical developments in combination with the creation of a workforce skilled in engineering CPS will allow the deployment of increasingly capable, adaptable, and trustworthy systems.
Engineers responsible for developing CPS but lacking the appropriate education or training may not fully understand, on the one hand, the technical issues associated with CPS software and hardware or, on the other, techniques for physical system modeling, energy and power, actuation, signal processing, and control. In addition, these engineers may be designing and implementing life-critical systems without appropriate formal training in the CPS methods needed to verify and assure safety, reliability, and security.
A workforce with the appropriate education, training, and skills will be better positioned to create and manage the next generation of CPS solutions. A 21st Century Cyber-Physical Systems Education examines the intellectual content of the emerging field of CPS and its implications for engineering and computer science education. This report is intended to inform those who might support efforts to develop curricula and materials; faculty and university administrators; industries with needs for CPS workers; and current and potential students about intellectual foundations, workforce requirements, employment opportunities, and curricular needs.
|Triennial Review of the National Nanotechnology Initiative
Nanoscale science, engineering, and technology, often referred to simply as “nanotechnology,” is the understanding, characterization, and control of matter at the scale of nanometers, the dimension of atoms and molecules. Advances in nanotechnology promise new materials and structures that can serve as the basis of solutions for improving human health, optimizing available energy and water resources, supporting a vibrant economy, raising the standard of living, and increasing national security.
Established in 2001, the National Nanotechnology Initiative (NNI) is a coordinated, multiagency effort with the mission to expedite the discovery, development, and deployment of nanoscale science and technology to serve the public good. This report is the latest triennial review of the NNI called for by the 21st Century Nanotechnology Research and Development Act of 2003. It examines and comments on the mechanisms in use by the NNI to advance focused areas of nanotechnology towards advanced development and commercialization and on the physical and human infrastructure needs for successful realization in the United States of the benefits of nanotechnology development.
|A Threat to America's Global Vigilance, Reach, and Power–High-Speed, Maneuvering Weapons: Unclassified Summary
The National Academies of Sciences, Engineering, and Medicine was asked by the Assistant Secretary of the Air Force for Science, Technology and Engineering to assess the threat posed by high-speed weapons and to recommend ways to counter it. This report reviews the current and evolving threats and the current and planned U.S. efforts and capabilities to counter them; identifies current gaps and future opportunities where the United States Air Force (USAF) could contribute significantly to the U.S. effort to counter high-speed threats; and recommends materiel, non-materiel, and technology development actions the USAF could take to address the identified opportunities and gaps.
|The Role of Experimentation Campaigns in the Air Force Innovation Life Cycle
The U.S. Air Force (USAF) has continuously sought to improve the speed with which it develops new capabilities to accomplish its various missions in air, space, and cyberspace. Historically, innovation has been a key part of USAF strategy, and operating within an adversary’s OODA loop (observe, orient, decide, act) is part of Air Force DNA. This includes the ability to deploy technological innovations faster than do our adversaries. The Air Force faces adversaries with the potential to operate within the USAF’s OODA loop, and some of these adversaries are already deploying innovations faster than the USAF.
The Role of Experimentation Campaigns in the Air Force Innovation Life Cycle examines the current state of innovation and experimentation in the Air Force and best practices in innovation and experimentation in industry and other government agencies. This report also explores organizational changes needed to eliminate the barriers that deter innovation and experimentation and makes recommendations for the successful implementation of robust innovation and experimentation by the Air Force.
|The Role of Experimentation Campaigns in the Air Force Innovation Life Cycle: Proceedings of a Workshop
The Workshop on the Role of Experimentation Campaigns in the Innovation Cycle was held in January 2016 to define and assess the current use of experimentation campaigns within the Air Force, evaluate barriers to their use, and make recommendations to increase their use. Participants at the workshop presented a broad range of issues, experiences, and insights related to experimentation, experimentation campaigns, and innovation. This publication summarizes the presentations and discussions from the workshop.
|From Maps to Models: Augmenting the Nation's Geospatial Intelligence Capabilities
The United States faces numerous, varied, and evolving threats to national security, including terrorism, scarcity and disruption of food and water supplies, extreme weather events, and regional conflicts around the world. Effectively managing these threats requires intelligence that not only assesses what is happening now, but that also anticipates potential future threats. The National Geospatial-Intelligence Agency (NGA) is responsible for providing geospatial intelligence on other countries—assessing where exactly something is, what it is, and why it is important—in support of national security, disaster response, and humanitarian assistance. NGA’s approach today relies heavily on imagery analysis and mapping, which provide an assessment of current and past conditions. However, augmenting that approach with a strong modeling capability would enable NGA to also anticipate and explore future outcomes.
A model is a simplified representation of a real-world system that is used to extract explainable insights about the system, predict future outcomes, or explore what might happen under plausible what-if scenarios. Such models use data and/or theory to specify inputs (e.g., initial conditions, boundary conditions, and model parameters) to produce an output.
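The input-to-output pattern described above can be sketched in a few lines. The following toy model is purely illustrative and not drawn from the report: it projects a quantity forward under a simple logistic-growth rule and compares two what-if scenarios that differ only in one parameter.

```python
# A minimal illustration of the model pattern described above: specified
# inputs (an initial condition and parameters) produce an output, and
# varying a parameter supports "what-if" scenario exploration.
# The function and its parameter values are hypothetical examples.

def logistic_growth(p0, rate, capacity, steps):
    """Project a quantity forward under a discrete logistic-growth rule."""
    p = p0  # initial condition
    for _ in range(steps):
        p += rate * p * (1 - p / capacity)  # model dynamics
    return p  # model output

# What-if exploration: identical initial condition, two plausible rates.
baseline = logistic_growth(p0=1.0, rate=0.10, capacity=10.0, steps=20)
high_growth = logistic_growth(p0=1.0, rate=0.20, capacity=10.0, steps=20)
```

Varying `rate` while holding the other inputs fixed is the simplest form of the what-if analysis the paragraph describes; real geospatial models differ mainly in the richness of their inputs and dynamics, not in this basic structure.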
From Maps to Models: Augmenting the Nation's Geospatial Intelligence Capabilities describes the types of models and analytical methods used to understand real-world systems, discusses what would be required to make these models and methods useful for geospatial intelligence, and identifies supporting research and development for NGA. This report provides examples of models that have been used to help answer the sorts of questions NGA might ask, describes how to go about a model-based investigation, and discusses models and methods that are relevant to NGA’s mission.
|Owning the Technical Baseline for Acquisition Programs in the U.S. Air Force
While there are examples of successful weapon systems acquisition programs within the U.S. Air Force (USAF), many of the programs are still incurring cost growth, schedule delays, and performance problems. The USAF now faces serious challenges in acquiring and maintaining its weapons systems as it strives to maintain its current programs; add new capabilities to counter evolving threats; and reduce its overall program expenditures. Owning the technical baseline is a critical component of the Air Force’s ability to regain and maintain acquisition excellence.
Owning the technical baseline allows the government acquisition team to manage and respond knowledgeably and effectively to systems development, operations, and execution, thereby avoiding technical and other programmatic barriers to mission success. Additionally, owning the technical baseline ensures that government personnel understand the user requirements, why a particular design and its various features have been selected over competing designs, and what the options are to pursue alternative paths to the final product given unanticipated cost, schedule, and performance challenges.
Owning the Technical Baseline for Acquisition Programs in the U.S. Air Force discusses the strategic value to the Air Force of owning the technical baseline and the risk of not owning it and highlights key aspects of how agencies other than the Air Force own the technical baseline for their acquisition programs. This report identifies specific barriers to owning the technical baseline for the Air Force and makes recommendations to help guide the Air Force in overcoming those barriers.
|New Worlds, New Horizons: A Midterm Assessment
New Worlds, New Horizons in Astronomy and Astrophysics (NWNH), the report of the 2010 decadal survey of astronomy and astrophysics, put forward a vision for a decade of transformative exploration at the frontiers of astrophysics. This vision included mapping the first stars and galaxies as they emerge from the collapse of dark matter and cold clumps of hydrogen, finding new worlds in a startlingly diverse population of extrasolar planets, and exploiting the vastness and extreme conditions of the universe to reveal new information about the fundamental laws of nature. NWNH outlined a compelling program for understanding the cosmic order and for opening new fields of inquiry through the discovery areas of gravitational waves, time-domain astronomy, and habitable planets. Many of these discoveries are likely to be enabled by cyber-discovery and the power of mathematics, physics, and imagination. To help realize this vision, NWNH recommended a suite of innovative and powerful facilities, along with balanced, strong support for the scientific community engaged in theory, data analysis, technology development, and measurements with existing and new instrumentation. Already in the first half of the decade, scientists and teams of scientists working with these cutting-edge instruments and with new capabilities in data collection and analysis have made spectacular discoveries that advance the NWNH vision.
New Worlds, New Horizons: A Midterm Assessment reviews the responses of NASA’s Astrophysics program, NSF’s Astronomy program, and DOE’s Cosmic Frontiers program to NWNH. This report describes the most significant scientific discoveries, technical advances, and relevant programmatic changes in astronomy and astrophysics over the years since the publication of the decadal survey, and assesses how well the agencies’ programs address the strategies, goals, and priorities outlined in the 2010 decadal survey.
|NASA Space Technology Roadmaps and Priorities Revisited
Historically, the United States has been a world leader in aerospace endeavors in both the government and commercial sectors. A key factor in aerospace leadership is continuous development of advanced technology, which is critical to U.S. ambitions in space, including a human mission to Mars. To continue to achieve progress, NASA is currently executing a series of aeronautics and space technology programs using a roadmapping process to identify technology needs and improve the management of its technology development portfolio.
NASA created a set of 14 draft technology roadmaps in 2010 to guide the development of space technologies. In 2015, NASA issued a revised set of roadmaps. A significant new aspect of the update has been the effort to assess the relevance of the technologies by listing the enabling and enhancing technologies for specific design reference missions (DRMs) from the Human Exploration and Operations Mission Directorate and the Science Mission Directorate. NASA Space Technology Roadmaps and Priorities Revisited prioritizes new technologies in the 2015 roadmaps and recommends a methodology for conducting independent reviews of future updates to NASA’s space technology roadmaps, which are expected to occur every 4 years.
|Achieving Science with CubeSats: Thinking Inside the Box
Space-based observations have transformed our understanding of Earth, its environment, the solar system, and the universe at large. During past decades, driven by increasingly advanced science questions, space observatories have become more sophisticated and more complex, with costs often growing to billions of dollars. Although these kinds of ever-more-sophisticated missions will continue into the future, small satellites, ranging in mass from 0.1 kg to 500 kg, are gaining momentum as an additional means to address targeted science questions in a rapid, and possibly more affordable, manner. Within the category of small satellites, CubeSats have emerged as a space platform defined in terms of (10 cm x 10 cm x 10 cm)-sized cubic units of approximately 1.3 kg each called “U’s.” Historically, CubeSats were developed as training projects to expose students to the challenges of real-world engineering practices and system design. Yet their use has rapidly spread within academia, industry, and government agencies, both nationally and internationally.
In particular, CubeSats have caught the attention of parts of the U.S. space science community, which sees this platform, despite its inherent constraints, as a way to affordably access space and perform unique measurements of scientific value. The first science results from such CubeSats have only recently become available; however, questions remain regarding the scientific potential and technological promise of CubeSats in the future.
Achieving Science with CubeSats reviews the current state of the scientific potential and technological promise of CubeSats. This report focuses on the platform’s promise to obtain high-priority science data, as defined in recent decadal surveys in astronomy and astrophysics, Earth science and applications from space, planetary science, and solar and space physics (heliophysics); the science priorities identified in the 2014 NASA Science Plan; and the potential for CubeSats to advance biology and microgravity research. It provides a list of sample science goals for CubeSats, many of which address targeted science, often in coordination with other spacecraft, or use “sacrificial,” or high-risk, orbits that lead to the demise of the satellite after critical data have been collected. Other goals relate to the use of CubeSats in constellations or swarms that deploy tens to hundreds of CubeSats functioning as one distributed array of measurements.
|Exploring Encryption and Potential Mechanisms for Authorized Government Access to Plaintext: Proceedings of a Workshop
In June 2016 the National Academies of Sciences, Engineering, and Medicine convened the Workshop on Encryption and Mechanisms for Authorized Government Access to Plaintext. Participants at this workshop discussed potential encryption strategies that would enable access to plaintext information by law enforcement or national security agencies with appropriate authority. Although the focus of the workshop was on technical issues, there was some consideration of the broader policy context, and discussion about the topics of encryption and authorized exceptional access frequently addressed open policy questions as well as technical issues. This publication summarizes the presentations and discussions from the workshop.
|Review of Proposals for Research on Statistical Methodologies for Assessing Variables in Eyewitness Performance
Recognizing the importance of eyewitness identifications in courts of law and motivated by data showing that at least one erroneous eyewitness identification was associated with almost 75% of cases where defendants were later exonerated by DNA evidence, in 2013 the Laura and John Arnold Foundation asked the National Academy of Sciences to undertake an assessment of the scientific research on eyewitness identification and offer recommendations to improve eyewitness performance. The appointed committee issued its report, Identifying the Culprit: Assessing Eyewitness Identification, in 2014.
In order to stimulate new and innovative research on statistical tools and the interrelationships between system and estimator variables, the Arnold Foundation in 2015 again called upon the National Academies, this time to develop a request for proposals for such research and to evaluate the proposals received. This report describes the development of the request for proposals, the processes followed by the committee as it evaluated the proposals, and the committee’s assessment of the scientific merit and research design of the proposals.
|A Vision for the Future of Center-Based Multidisciplinary Engineering Research: Proceedings of a Symposium
Out of concern for the state of engineering in the United States, the National Science Foundation (NSF) created the Engineering Research Centers (ERCs) with the goal of improving engineering research and education and helping to keep the United States competitive in global markets. Since the ERC program’s inception in 1985, NSF has funded 67 ERCs across the United States. NSF funds each ERC for up to 10 years, during which time the centers build robust partnerships with industry, universities, and other government entities that can ideally sustain them upon graduation from NSF support.
To ensure that the ERCs continue to be a source of innovation, economic development, and educational excellence, NSF commissioned the National Academies of Sciences, Engineering, and Medicine to convene a 1-day symposium in April 2016. This event featured four plenary panel presentations, on the evolving global context for center-based engineering research; trends in undergraduate and graduate engineering education; new directions in university-industry interaction; and emerging best practices in translating university research into innovation. This publication summarizes the presentations and discussions from the symposium.
|The Power of Change: Innovation for Development and Deployment of Increasingly Clean Electric Power Technologies
Electricity, supplied reliably and affordably, is foundational to the U.S. economy and is utterly indispensable to modern society. However, emissions resulting from many forms of electricity generation create environmental risks that could have significant negative economic, security, and human health consequences. Large-scale installation of cleaner power generation has been generally hampered because greener technologies are more expensive than the technologies that currently produce most of our power. Rather than trade affordability and reliability for low emissions, is there a way to balance all three?
The Power of Change: Innovation for Development and Deployment of Increasingly Clean Electric Power Technologies considers how to speed up innovations that would dramatically improve the performance and lower the cost of currently available technologies while also developing new advanced cleaner energy technologies. According to this report, there is an opportunity for the United States to continue to lead in the pursuit of increasingly clean, more efficient electricity through innovation in advanced technologies. The report makes the case that America’s advantages—world-class universities and national laboratories, a vibrant private sector, and innovative states, cities, and regions that are free to experiment with a variety of public policy approaches—position the United States to create and lead a new clean energy revolution. This study focuses on five paths to accelerate the market adoption of increasingly clean energy and efficiency technologies: (1) expanding the portfolio of cleaner energy technology options; (2) leveraging the advantages of energy efficiency; (3) facilitating the development of increasingly clean technologies, including renewables, nuclear, and cleaner fossil; (4) improving the existing technologies, systems, and infrastructure; and (5) leveling the playing field for cleaner energy technologies.
The Power of Change is a call for leadership to transform the United States energy sector in order both to mitigate the risks of greenhouse gases and other pollutants and to spur future economic growth. This study’s focus on science, technology, and economic policy makes it a valuable resource to guide support that produces innovation to meet energy challenges now and for the future.
|National Security Space Defense and Protection: Public Report
It is not yet 60 years since the first artificial satellite was placed into Earth orbit. In just over a half century, mankind has gone from no presence in outer space to a condition of high dependence on orbiting satellites. These sensors, receivers, transmitters, and other such devices, as well as the satellites that carry them, are components of complex space systems that include terrestrial elements; electronic links between and among components; organizations that provide management, care, and feeding; and launch systems that put satellites into orbit. In many instances, these space systems connect with and otherwise interact with terrestrial systems; for example, a very long list of Earth-based systems cannot function properly without information from the Global Positioning System (GPS).
Space systems are fundamental to the information business, and the modern world is an information-driven one. In addition to navigation (and associated timing), space systems provide communications and imagery and other Earth-sensing functions. Among these systems are many that support military, intelligence, and other national security functions of the United States and many other nations. Some of these are unique government, national security systems; however, functions to support national security are also provided by commercial and civil-government space systems.
The importance of space systems to the United States and its allies and potential adversaries raises major policy issues. National Security Space Defense and Protection reviews the range of options available to address threats to space systems, in terms of deterring hostile actions, defeating hostile actions, and surviving hostile actions, and assesses potential strategies and plans to counter such threats. This report recommends architectures, capabilities, and courses of action to address such threats and actions to address affordability, technology risk, and other potential barriers or limiting factors in implementing such courses of action.
|Refining the Concept of Scientific Inference When Working with Big Data: Proceedings of a Workshop—in Brief
Big Data – broadly considered as datasets whose size, complexity, and heterogeneity preclude conventional approaches to storage and analysis – continues to generate interest across many scientific domains in both the public and private sectors. However, analyses of large heterogeneous datasets can suffer from unidentified bias, misleading correlations, and increased risk of false positives. In order for the proliferation of data to produce new scientific discoveries, it is essential that the statistical models used for analysis support reliable, reproducible inference. The National Academies of Sciences, Engineering, and Medicine convened a workshop to discuss how scientific inference should be applied when working with large, complex datasets.
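The false-positive risk mentioned here is easy to demonstrate. In the hypothetical sketch below, which is not part of the workshop material, every variable is independent random noise, yet screening many pairwise correlations still surfaces an apparently strong relationship.

```python
# Illustration of the false-positive risk noted above: among many purely
# random variables, some pairs will correlate strongly by chance alone.
# The dataset sizes and the seed are arbitrary choices for this sketch.
import random

random.seed(0)
n_obs, n_vars = 20, 200
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

# Screen the first variable against all the others and keep the best match.
strongest = max(abs(pearson(data[0], data[i])) for i in range(1, n_vars))
# Despite the data being pure noise, `strongest` is far from zero -- the
# kind of spurious finding that inflates false positives when multiple
# comparisons are not accounted for.
```

Correcting for the number of comparisons made, or validating candidate relationships on held-out data, is what separates the reliable, reproducible inference the workshop called for from this kind of noise mining.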
|Optimizing the Air Force Acquisition Strategy of Secure and Reliable Electronic Components: Proceedings of a Workshop
In 2012, section 818 of the National Defense Authorization Act (NDAA) outlined new requirements for industry to serve as the lead in averting counterfeits in the defense supply chain. Subsequently, the House Armed Services Committee, in its report on the Fiscal Year 2016 NDAA, noted that the pending sale of IBM’s microprocessor fabrication facilities to GlobalFoundries created uncertainty about future access of the United States to trusted state-of-the-art microelectronic components and directed the Comptroller General to assess the Department of Defense’s (DoD’s) actions and measures to address this threat.
In this context, the National Academies of Sciences, Engineering, and Medicine convened a workshop to facilitate an open dialogue with leading industry, academic, and government experts to (1) define the current technological and policy challenges with maintaining a reliable and secure source of microelectronic components; (2) review the current state of acquisition processes within the Air Force for acquiring reliable and secure microelectronic components; and (3) explore options for possible business models within the national security complex that would be relevant for the Air Force acquisition community. This publication summarizes the results of the workshop.
|Commercial Aircraft Propulsion and Energy Systems Research: Reducing Global Carbon Emissions
The primary human activities that release carbon dioxide (CO2) into the atmosphere are the combustion of fossil fuels (coal, natural gas, and oil) to generate electricity, the provision of energy for transportation, and certain industrial processes. Although aviation CO2 emissions make up only approximately 2.0 to 2.5 percent of total global annual CO2 emissions, research to reduce CO2 emissions is urgent because (1) such reductions may be legislated even as commercial air travel grows, (2) new technology takes a long time to propagate into and through the aviation fleet, and (3) global CO2 emissions have an ongoing impact.
Commercial Aircraft Propulsion and Energy Systems Research develops a national research agenda for reducing CO2 emissions from commercial aviation. This report focuses on propulsion and energy technologies for reducing carbon emissions from large, commercial aircraft—single-aisle and twin-aisle aircraft that carry 100 or more passengers—because such aircraft account for more than 90 percent of global emissions from commercial aircraft. Moreover, while smaller aircraft also emit CO2, they make only a minor contribution to global emissions, and many technologies that reduce CO2 emissions for large aircraft also apply to smaller aircraft.
As commercial aviation continues to grow in terms of revenue-passenger miles and cargo ton miles, CO2 emissions are expected to increase. To reduce the contribution of aviation to climate change, it is essential to improve the effectiveness of ongoing efforts to reduce emissions and initiate research into new approaches.
|Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020
Advanced computing capabilities are used to tackle a rapidly growing range of challenging science and engineering problems, many of which are compute- and data-intensive as well. Demand for advanced computing has been growing for all types and capabilities of systems, from large numbers of single commodity nodes to jobs requiring thousands of cores; for systems with fast interconnects; for systems with excellent data handling and management; and for an increasingly diverse set of applications that includes data analytics as well as modeling and simulation. Since the advent of its supercomputing centers, the National Science Foundation (NSF) has provided its researchers with state-of-the-art computing systems. The growth of new models of computing, including cloud computing and publicly available but privately held data repositories, opens up new possibilities for NSF.
In order to better understand the expanding and diverse requirements of the science and engineering community and the importance of a new broader range of advanced computing infrastructure, the NSF requested that the National Research Council carry out a study examining anticipated priorities and associated tradeoffs for advanced computing. Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020 provides a framework for future decision-making about NSF’s advanced computing strategy and programs. It offers recommendations aimed at achieving four broad goals: (1) position the U.S. for continued leadership in science and engineering, (2) ensure that resources meet community needs, (3) aid the scientific community in keeping up with the revolution in computing, and (4) sustain the infrastructure for advanced computing.
|Continuing Innovation in Information Technology: Workshop Report
The 2012 National Research Council report Continuing Innovation in Information Technology illustrates how fundamental research in information technology (IT), conducted at industry and universities, has led to the introduction of entirely new product categories that ultimately became billion-dollar industries. The central graphic from that report portrays and connects areas of major investment in basic research, university-based research, and industry research and development; the introduction of important commercial products resulting from this research; billion-dollar-plus industries stemming from it; and present-day IT market segments and representative U.S. firms whose creation was stimulated by the decades-long research.
At a workshop hosted by the Computer Science and Telecommunications Board on March 5, 2015, leading academic and industry researchers and industrial technologists described key research and development results and their contributions and connections to new IT products and industries, and illustrated these developments as overlays to the 2012 "tire tracks" graphic. The principal goal of the workshop was to collect and make available to policy makers and members of the IT community first-person narratives that illustrate the link between government investments in academic and industry research to the ultimate creation of new IT industries. This report provides summaries of the workshop presentations organized into five broad themes - (1) fueling the innovation pipeline, (2) building a connected world, (3) advancing the hardware foundation, (4) developing smart machines, and (5) people and computers - and ends with a summary of remarks from the concluding panel discussion.
|Space Studies Board Annual Report 2015
The original charter of the Space Science Board was established in June 1958, 3 months before the National Aeronautics and Space Administration (NASA) opened its doors. The Space Science Board and its successor, the Space Studies Board (SSB), have provided expert external and independent scientific and programmatic advice to NASA on a continuous basis from NASA's inception until the present. The SSB has also provided such advice to other executive branch agencies, including the National Oceanic and Atmospheric Administration (NOAA), the National Science Foundation (NSF), the U.S. Geological Survey (USGS), the Department of Defense, as well as to Congress.
Space Studies Board Annual Report 2015 opens with a message from the chair of the SSB, David N. Spergel. The report also explains the origins of the Space Science Board, how the Space Studies Board functions today, how it collaborates with other National Research Council units, how it assures the quality of SSB reports, who its audience and sponsors are, and the need to enhance outreach and improve dissemination of SSB reports.
This report will be relevant to a full range of government audiences in civilian space research - including NASA, NSF, NOAA, USGS, and the Department of Energy - as well as members of the SSB, policy makers, and researchers.
|Electricity Use in Rural and Islanded CommunitiesSummary of a Workshop
On behalf of the Quadrennial Energy Review (QER) Task Force, the National Academies of Sciences, Engineering, and Medicine hosted a workshop on February 8-9, 2016, titled "Electricity Use in Rural and Islanded Communities." The objective of the workshop was to support the QER Task Force's public outreach efforts by focusing on communities with unique electricity challenges. The workshop explored challenges and opportunities for reducing electricity use and associated greenhouse gas emissions while improving electricity system reliability and resilience in rural and islanded communities. This report summarizes the presentations and discussions of the workshop.
|Effects of the Deletion of Chemical Agent Washout on Operations at the Blue Grass Chemical Agent Destruction Pilot Plant
The United States manufactured significant quantities of chemical weapons during the Cold War and the years prior. Because the chemical weapons are aging, storage constitutes an ongoing risk to the facility workforces and to the communities nearby. In addition, the Chemical Weapons Convention treaty stipulates that the chemical weapons be destroyed. The United States has destroyed approximately 90 percent of the chemical weapons stockpile located at seven sites.
As part of the effort to destroy its remaining stockpile, the Department of Defense is building the Blue Grass Chemical Agent Destruction Pilot Plant (BGCAPP) on the Blue Grass Army Depot (BGAD), near Richmond, Kentucky. The stockpile stored at BGAD consists of rockets and projectiles containing the nerve agents GB and VX and the blister agent mustard. Continued storage poses a risk to the BGAD workforce and the surrounding community because these munitions are several decades old and are developing leaks.
Due to public opposition to the use of incineration to destroy the BGAD stockpile, Congress mandated that non-incineration technologies be identified for use at BGCAPP. As a result, the original BGCAPP design called for munitions to be drained of agent and then for the munition bodies to be washed out using high-pressure hot water. However, as part of a larger package of modifications called Engineering Change Proposal 87 (ECP-87), the munition washout step was eliminated. Effects of the Deletion of Chemical Agent Washout on Operations at the Blue Grass Chemical Agent Destruction Pilot Plant examines the impacts of this design change on operations at BGCAPP and makes recommendations to guide future decision making.
|2015-2016 Assessment of the Army Research LaboratoryInterim Report
The National Academies of Sciences, Engineering, and Medicine's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory (ARL), focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences.
This interim report summarizes the findings of the Board for the first year of this biennial assessment; the current report addresses approximately half the portfolio for each campaign; the remainder will be assessed in 2016. During the first year the Board examined the following elements within the ARL's science and technology campaigns: biological and bioinspired materials, energy and power materials, and engineered photonics materials; battlefield injury mechanisms, directed energy, and armor and adaptive protection; sensing and effecting, and system intelligence and intelligent systems; advanced computing architectures, computing sciences, data-intensive sciences, and predictive simulation sciences; human-machine interaction, intelligence and control, and perception; humans in multiagent systems, real-world behavior, and toward human variability; and mission capability of systems. A second, final report will subsume the findings of this interim report and add the findings from the second year of the review.
|Applying Materials State Awareness to Condition-Based Maintenance and System Life Cycle ManagementSummary of a Workshop
In August 2014, the committee on Defense Materials Manufacturing and Infrastructure convened a workshop to discuss issues related to applying materials state awareness to condition-based maintenance and system life cycle management. The workshop was structured around three focal topics: (1) advances in metrology and experimental methods, (2) advances in physics-based models for assessment, and (3) advances in databases and diagnostic technologies. This report summarizes the discussions and presentations from this workshop.
|New Frontiers in Solar System Exploration
Over the last four decades, robotic spacecraft have visited nearly every planet, from torrid Mercury to frigid Neptune. The data returned by these Pioneers, Mariners, Vikings, and Voyagers have revolutionized our understanding of the solar system. These achievements rank among the greatest accomplishments of the 20th century. Now, at the opening of the 21st, it is appropriate to ask, where do we go from here?
In 2001, NASA asked the National Academies to study the current state of solar system exploration in the United States and devise a set of scientific priorities for missions in the upcoming decade (2003-2013). After soliciting input from hundreds of scientists around the nation and abroad, the Solar System Exploration Survey produced the discipline's first long-range, community-generated strategy and set of mission priorities: New Frontiers in the Solar System: An Integrated Exploration Strategy. The key mission recommendations made in the report, and the scientific goals from which the recommendations flow, are summarized in this booklet.
|Mainstreaming Unmanned Undersea Vehicles into Future U.S. Naval OperationsAbbreviated Version of a Restricted Report
At the request of the former Chief of Naval Operations, the National Academies of Sciences, Engineering, and Medicine appointed an expert committee to assess the potential of unmanned undersea vehicles (UUVs) in enhancing future U.S. naval operations. The Department of the Navy has determined that the final report prepared by the committee is restricted in its entirety under exemption 3 of the Freedom of Information Act (5 USC § 552 (b) (3)), via 10 USC § 130 and therefore cannot be made available to the public. This abbreviated report provides background information on the full report and the committee that prepared it.
|Analytic Research Foundations for the Next-Generation Electric Grid
Electricity is the lifeblood of modern society, and for the vast majority of people that electricity is obtained from large, interconnected power grids. However, the grid that was developed in the 20th century, along with the incremental improvements made since then, including its underlying analytic foundations, is no longer adequate to completely meet the needs of the 21st century. The next-generation electric grid must be more flexible and resilient. While fossil fuels will have their place for decades to come, the grid of the future will need to accommodate a wider mix of more intermittent generating sources such as wind and distributed solar photovoltaics.
Achieving this grid of the future will require effort on several fronts. There is a need for continued shorter-term engineering research and development, building on the existing analytic foundations for the grid. But there is also a need for more fundamental research to expand these analytic foundations. Analytic Research Foundations for the Next-Generation Electric Grid provides guidance on the longer-term critical areas for research in mathematical and computational sciences that is needed for the next-generation grid. It offers recommendations that are designed to help direct future research as the grid evolves and to give the nation’s research and development infrastructure the tools it needs to effectively develop, test, and use this research.
|An Assessment of the National Institute of Standards and Technology Physical Measurement LaboratoryFiscal Year 2015
The Physical Measurement Laboratory (PML) at the National Institute of Standards and Technology (NIST) is dedicated to three fundamental and complementary tasks: (1) increase the accuracy of our knowledge of the physical parameters that are the foundation of our technology-driven society; (2) disseminate technologies by which these physical parameters can be accessed in a standardized way by the stakeholders; and (3) conduct research at both fundamental and applied levels to provide knowledge that may eventually lead to advances in measurement approaches and standards. This report assesses the scientific and technical work performed by the PML and identifies salient examples of accomplishments, challenges, and opportunities for improvement for each of its nine divisions.
|Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific ResultsSummary of a Workshop
Questions about the reproducibility of scientific research have been raised in numerous settings and have gained visibility through several high-profile journal and popular press articles. Quantitative issues contributing to reproducibility challenges have been considered (including improper data measurement and analysis, inadequate statistical expertise, and incomplete data, among others), but there is no clear consensus on how best to approach or to minimize these problems.
A lack of reproducibility of scientific results has created some distrust in scientific findings among the general public, scientists, funding agencies, and industries. While studies fail for a variety of reasons, many factors contribute to the lack of perfect reproducibility, including insufficient training in experimental design, misaligned incentives for publication and the implications for university tenure, intentional manipulation, poor data management and analysis, and inadequate application of statistical inference.
The workshop summarized in this report was designed not to address the social and experimental challenges but instead to focus on quantitative issues: improper data management and analysis, inadequate statistical expertise, incomplete data, and difficulties applying sound statistical inference to the available data. Many efforts have emerged over recent years to draw attention to and improve reproducibility of scientific work. This report uniquely focuses on the statistical perspective of three issues: the extent of reproducibility, the causes of reproducibility failures, and the potential remedies for these failures.
|Privacy Research and Best PracticesSummary of a Workshop for the Intelligence Community
Recent disclosures about the bulk collection of domestic phone call records and other signals intelligence programs have stimulated widespread debate about the implications of such practices for the civil liberties and privacy of Americans. In the wake of these disclosures, many have identified a need for the intelligence community to engage more deeply with outside privacy experts and stakeholders.
At the request of the Office of the Director of National Intelligence, the National Academies of Sciences, Engineering, and Medicine convened a workshop to address the privacy implications of emerging technologies, public and individual preferences and attitudes toward privacy, and ethical approaches to data collection and use. This report summarizes discussions between experts from academia and the private sector and from the intelligence community on private sector best practices and privacy research results.
|Strategies to Enhance Air Force Communication with Internal and External AudiencesA Workshop Report
The U.S. Air Force (USAF) helps defend the United States and its interests by organizing, training, and equipping forces for operations in and through three distinct domains -- air, space, and cyberspace. The Air Force concisely expresses its vision as "Global Vigilance, Global Reach, and Global Power for America." Operations within each of these domains are dynamic, take place over large distances, occur over different operational timelines, and cannot be routinely seen or recorded, making it difficult for Airmen, national decision makers, and the American people to visualize and comprehend the full scope of Air Force operations. As a result, the Air Force faces increasing difficulty in succinctly and effectively communicating the complexity, dynamic range, and strategic importance of its mission to Airmen and to the American people.
To address this concern, the Chief of Staff of the USAF requested that the National Academies of Sciences, Engineering, and Medicine convene a workshop to explore options on how the Air Force can effectively communicate the strategic importance of the Service, its mission, and the role it plays in the defense of the United States. Participants examined how a diverse workforce, encompassing a myriad of backgrounds and education levels, together with increasingly diverse mission sets, drives the requirement for a new communication strategy. The demographics of today's Air Force create both a unique opportunity and a distinct challenge for Air Force leadership as it works to communicate its vision and strategy effectively across several micro-cultures within the organization and to the general public. This report summarizes the presentations and discussions from the workshop.
|Affordability of National Flood Insurance Program PremiumsReport 2
When Congress authorized the National Flood Insurance Program (NFIP) in 1968, it intended for the program to encourage community initiatives in flood risk management, charge insurance premiums consistent with actuarial pricing principles, and encourage the purchase of flood insurance by owners of flood-prone properties, in part, by offering affordable premiums. The NFIP has been reauthorized many times since 1968, most recently with the Biggert-Waters Flood Insurance Reform Act of 2012 (BW 2012). In this most recent reauthorization, Congress placed a particular emphasis on setting flood insurance premiums following actuarial pricing principles, which was motivated by a desire to ensure future revenues were adequate to pay claims and administrative expenses. BW 2012 was designed to move the NFIP towards risk-based premiums for all flood insurance policies. The result was to be increased premiums for some policyholders who had been paying less than NFIP risk-based premiums and possibly increased premiums for all policyholders.
Recognition of this possibility and concern for the affordability of flood insurance is reflected in sections of the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). These sections called on FEMA to propose a draft affordability framework for the NFIP after completing an analysis of the effects of possible programs for offering “means-tested assistance” to policyholders for whom higher rates may not be affordable.
BW 2012 and HFIAA 2014 mandated that FEMA conduct a study, in cooperation with the National Academies of Sciences, Engineering, and Medicine, which would compare the costs of a program of risk-based rates and means-tested assistance to the current system of subsidized flood insurance rates and federally funded disaster relief for people without coverage. Production of two reports was agreed upon to fulfill this mandate. This second report proposes alternative approaches for a national evaluation of affordability program policy options and includes lessons for the design of a national study from a proof-of-concept pilot study.