|A Review of the Next Generation Air Transportation System: Implications and Importance of System Architecture
Released 2015-05-01 Forthcoming/Prepublication
The Next Generation Air Transportation System's (NextGen) goal is the transformation of the U.S. national airspace system through programs and initiatives that could make it possible to shorten routes, navigate better around weather, save time and fuel, reduce delays, and improve capabilities for the monitoring and management of aircraft. A Review of the Next Generation Air Transportation System provides an overview of NextGen and examines the technical activities, including human-system design and testing, organizational design, and other safety and human factors aspects of the system, that will be necessary to successfully transition current and planned modernization programs to the future system. This report assesses the technical, cost, and schedule risks of the software development that will be necessary to achieve the expected benefits from a highly automated air traffic management system and the implications for ongoing modernization projects. The recommendations of this report will help the Federal Aviation Administration anticipate and respond to the challenges of implementing NextGen.
|Overcoming Barriers to Deployment of Plug-in Electric Vehicles
Released 2015-04-22 Forthcoming/Prepublication
In the past few years, interest in plug-in electric vehicles (PEVs) has grown. Advances in battery and other technologies, new federal standards for carbon-dioxide emissions and fuel economy, state zero-emission-vehicle requirements, and the current administration's goal of putting millions of alternative-fuel vehicles on the road have all highlighted PEVs as a transportation alternative. Consumers are also beginning to recognize the advantages of PEVs over conventional vehicles, such as lower operating costs, smoother operation, and better acceleration; the ability to fuel up at home; and zero tailpipe emissions when the vehicle operates solely on its battery. There are, however, barriers to PEV deployment, including the vehicle cost, the short all-electric driving range, the long battery charging time, uncertainties about battery life, the few choices of vehicle models, and the need for a charging infrastructure to support PEVs. What should industry do to improve the performance of PEVs and make them more attractive to consumers? At the request of Congress, Overcoming Barriers to Deployment of Plug-in Electric Vehicles identifies barriers to the introduction of electric vehicles and recommends ways to mitigate these barriers. This report examines the characteristics and capabilities of electric vehicle technologies, such as cost, performance, range, safety, and durability, and assesses how these factors might create barriers to widespread deployment. Overcoming Barriers to Deployment of Plug-in Electric Vehicles provides an overview of the current status of PEVs and makes recommendations to spur the industry and increase the attractiveness of this promising technology for consumers. Through consideration of consumer behaviors, tax incentives, business models, incentive programs, and infrastructure needs, this book studies the state of the industry and makes recommendations to further its development and acceptance.
|Optimizing the U.S. Ground-Based Optical and Infrared Astronomy System
Released 2015-04-17 Forthcoming/Prepublication
New astronomical facilities, such as the under-construction Large Synoptic Survey Telescope (LSST) and planned 30-meter-class telescopes, and new instrumentation on existing optical and infrared (OIR) telescopes, hold the promise of groundbreaking research and discovery. How can we extract the best science from these and other astronomical facilities in an era of potentially flat federal budgets for both the facilities and the research grants? Optimizing the U.S. Ground-Based Optical and Infrared Astronomy System provides guidance for these new programs, aligned with the scientific priorities and the conclusions and recommendations of two National Research Council (NRC) decadal surveys, New Worlds, New Horizons in Astronomy and Astrophysics and Vision and Voyages for Planetary Science in the Decade 2013-2022, as well as other NRC reports. This report describes a vision for a U.S. OIR System that includes a telescope time exchange designed to enhance science return by broadening access to capabilities for a diverse community, an ongoing planning process to identify and construct next-generation capabilities to realize decadal science priorities, and the near-term critical coordination, planning, and instrumentation needed to usher in the era of LSST and giant telescopes.
|Space Studies Board Annual Report 2014
The original charter of the Space Science Board was established in June 1958, 3 months before the National Aeronautics and Space Administration (NASA) opened its doors. The Space Science Board and its successor, the Space Studies Board (SSB), have provided expert external and independent scientific and programmatic advice to NASA on a continuous basis from NASA's inception until the present. The SSB has also provided such advice to other executive branch agencies, including the National Oceanic and Atmospheric Administration (NOAA), the National Science Foundation (NSF), the U.S. Geological Survey (USGS), and the Department of Defense, as well as to Congress. Space Studies Board Annual Report 2014 opens with a message from the chair of the SSB, David N. Spergel. This report also explains the origins of the Space Science Board, describes how the Space Studies Board functions today and how it collaborates with other National Research Council units and assures the quality of its reports, acknowledges its audience and sponsors, and notes the need to enhance outreach and improve dissemination of SSB reports. This report will be relevant to a full range of government audiences in civilian space research - including NASA, NSF, NOAA, USGS, and the Department of Energy - as well as members of the SSB, policy makers, and researchers.
|Affordability of National Flood Insurance Program Premiums: Report 1
Released 2015-03-26 Forthcoming/Prepublication
The National Flood Insurance Program (NFIP) is housed within the Federal Emergency Management Agency (FEMA) and offers insurance policies that are marketed and sold through private insurers, but with the risks borne by the U.S. federal government. NFIP's primary goals are to ensure affordable insurance premiums, secure widespread community participation in the program, and earn premium and fee income that covers claims paid and program expenses over time. In July 2012, the U.S. Congress passed the Biggert-Waters Flood Insurance Reform and Modernization Act (Biggert-Waters 2012), designed to move toward an insurance program with NFIP risk-based premiums that better reflected expected losses from floods at insured properties. The act eliminated policies priced at what the NFIP called "pre-FIRM subsidized" and "grandfathered" rates. As Biggert-Waters 2012 went into effect, constituents from multiple communities expressed concerns about the elimination of lower rate classes, arguing that it created a financial burden on policy holders. In response to these concerns, Congress passed the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). The 2014 legislation changed the process by which pre-FIRM subsidized premiums for primary residences would be removed and reinstated grandfathering. As part of that legislation, FEMA must report back to Congress with a draft affordability framework. Affordability of National Flood Insurance Program Premiums: Report 1 is the first part of a two-part study to provide input as FEMA prepares its draft affordability framework. This report discusses the underlying definitions and methods for an affordability framework and the affordability concept and its applications. Affordability of National Flood Insurance Program Premiums gives an overview of the demand for insurance and the history of NFIP premium setting.
The report then describes alternatives for determining when the premium increases resulting from Biggert-Waters 2012 would make flood insurance unaffordable.
|Review Criteria for Successful Treatment of Hydrolysate at the Pueblo Chemical Agent Destruction Pilot Plant
One of the last two sites with chemical munitions and chemical materiel is the Pueblo Chemical Depot in Pueblo, Colorado. The stockpile at this location consists of about 800,000 projectiles and mortars, all of which are filled with the chemical agent mustard. Under the direction of the Assembled Chemical Weapons Alternative Program (ACWA), the Army has constructed the Pueblo Chemical Agent Destruction Pilot Plant (PCAPP) to destroy these munitions. The primary technology to be used to destroy the mustard agent at PCAPP is hydrolysis, resulting in a secondary waste stream referred to as hydrolysate. PCAPP features a process that will be used to treat the hydrolysate and the thiodiglycol - a breakdown product of mustard - contained within. The process is a biotreatment technology that uses what are known as immobilized cell bioreactors. After biodegradation, the effluent flows to a brine reduction system, producing a solidified filter cake that is intended to be sent offsite to a permitted hazardous waste disposal facility. Water recovered from the brine reduction system is intended to be recycled back through the plant, thereby reducing the amount of water that is withdrawn from groundwater. Although biotreatment of toxic chemicals, brine reduction, and water recovery are established technologies, never before have these technologies been combined to treat mustard hydrolysate. At the request of the U.S. Army, Review Criteria for Successful Treatment of Hydrolysate at the Pueblo Chemical Agent Destruction Pilot Plant reviews the criteria for successfully treating the hydrolysate. 
This report provides information on the composition of the hydrolysate and describes the PCAPP processes for treating it; discusses stakeholder concerns; reviews regulatory considerations at the federal, state, and local levels; discusses Department of Transportation regulations and identifies risks associated with the offsite shipment of hydrolysate; establishes criteria for successfully treating the hydrolysate and identifies systemization data that should factor into the criteria and decision process for offsite transport and disposal of the hydrolysate; and discusses failure risks and contingency options as well as the downstream impacts of a decision to ship hydrolysate offsite.
|Bulk Collection of Signals Intelligence: Technical Options
The Bulk Collection of Signals Intelligence: Technical Options study is a result of an activity called for in Presidential Policy Directive 28 (PPD-28), issued by President Obama in January 2014, to evaluate U.S. signals intelligence practices. The directive instructed the Office of the Director of National Intelligence (ODNI) to produce a report within one year "assessing the feasibility of creating software that would allow the intelligence community more easily to conduct targeted information acquisition rather than bulk collection." ODNI asked the National Research Council (NRC) - the operating arm of the National Academy of Sciences and the National Academy of Engineering - to conduct a study, which began in June 2014, to assist in preparing a response to the President. Over the ensuing months, a committee of experts appointed by the NRC produced the report.
|Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges
Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges is an independent assessment regarding the transition of the National Nuclear Security Administration (NNSA) laboratories - Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories - to multiagency, federally funded research and development centers with direct sustainment and sponsorship by multiple national security agencies. This report makes recommendations for the governance of NNSA laboratories to better align with the evolving national security landscape and the laboratories' increasing engagement with the other national security agencies, while simultaneously encouraging the best technical solutions to national problems from the entire range of national security establishments. According to this report, the Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers. The NNSA laboratories will remain a critically important resource to meet U.S. national security needs for many decades to come. The recommendations of Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges will improve the governance of the laboratories and strengthen their strategic relationship with the non-DOE national security agencies.
|Opportunities for the Employment of Simulation in U.S. Air Force Training Environments: A Workshop Report
Simulators currently provide an alternative to aircraft for meeting training requirements, both for the military and for commercial airlines. For the U.S. Air Force in particular, simulation for training offers a cost-effective way, and in many instances a safer way than live flying, to replicate real-world missions. Current technical issues related to simulation for training include simulation fidelity and multi-level security, among others, which will need to be addressed in order for the Air Force to take full advantage of this technology. The workshop, held in November 2014, examined the current status of simulation training, alternative uses, current and future technologies, and how the combination of simulation and live training can improve aircrew training. The workshop focused on technologies and practices that could be applicable to high-end aircraft simulations.
|Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment: Summary of a Workshop
The National Marine Fisheries Service (NMFS) is responsible for the stewardship of the nation's living marine resources and their habitat. As part of this charge, NMFS conducts stock assessments of the abundance and composition of fish stocks in several bodies of water. At present, stock assessments rely heavily on human data-gathering and analysis. Automatic means of fish stock assessment are appealing because they offer the potential to improve efficiency, reduce human workload, and perhaps provide higher-fidelity measurements. The use of images and video, when accompanied by appropriate statistical analyses of the inferred data, is of increasing importance for estimating the abundance of species and their age distributions. Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment is the summary of a workshop convened by the National Research Council Committee on Applied and Theoretical Statistics to discuss analysis techniques for images and videos for fisheries stock assessment. Experts from diverse communities shared perspectives about the most efficient path toward improved automation of visual information and discussed both near-term and long-term goals that can be achieved through research and development efforts. This report is a record of the presentations and discussions of this event.
|2013-2014 Assessment of the Army Research Laboratory
The National Research Council's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory, focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences. 2013-2014 Assessment of the Army Research Laboratory summarizes the findings of the Board for the first biennial assessment. This report discusses the biennial assessment process used by ARLTAB and its five panels; provides detailed assessments of each of the ARL core technical competency areas reviewed during the 2013-2014 period; and presents findings and recommendations common across multiple competency areas.
|A Review of the U.S. Navy Cyber Defense Capabilities: Abbreviated Version of a Classified Report
To conduct operations successfully and defend its capabilities in all warfighting domains, the Department of Defense (DoD) must be able to counter potential cyber attacks. For several years, many within and outside DoD have warned of the severity of the cyber threat and called for even greater attention to addressing threats to cyberspace. At the request of the Chief of Naval Operations, the National Research Council appointed an expert committee to review the U.S. Navy's cyber defense capabilities. The Department of the Navy has determined that the final report prepared by the committee is classified in its entirety under Executive Order 13526 and therefore cannot be made available to the public. A Review of U.S. Navy Cyber Defense Capabilities is the abbreviated report and provides background information on the full report and the committee that prepared it.
|Training Students to Extract Value from Big Data: Summary of a Workshop
As the availability of high-throughput data-collection technologies such as information-sensing mobile devices, remote sensing, internet log records, and wireless sensor networks has grown, science, engineering, and business have rapidly transitioned from striving to develop information from scant data to a situation in which the amount of information now exceeds a human's ability to examine, let alone absorb, it. Data sets are increasingly complex, which potentially magnifies problems such as missing information and other data-quality issues, data heterogeneity, and differing data formats. The nation's ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable of exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program. Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council's Committee on Applied and Theoretical Statistics to explore how best to train students to use big data.
The workshop explored the need for such training and the curricula and coursework that should be included. One impetus for the workshop was the currently fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and each has its own notion of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula.
|An Assessment of the National Institute of Standards and Technology Engineering Laboratory: Fiscal Year 2014
The mission of the Engineering Laboratory of the National Institute of Standards and Technology (NIST) is to promote U.S. innovation and industrial competitiveness through measurement science and standards for technology-intensive manufacturing, construction, and cyber-physical systems in ways that enhance economic prosperity and improve the quality of life. To support this mission, the Engineering Laboratory has developed thrusts in smart manufacturing, construction, and cyber-physical systems; in sustainable and energy-efficient manufacturing materials and infrastructure; and in disaster-resilient buildings, infrastructure, and communities. The technical work of the Engineering Laboratory is performed in five divisions - Intelligent Systems; Materials and Structural Systems; Energy and Environment; Systems Integration; and Fire Research - and two offices, the Applied Economics Office and the Smart Grid Program Office. An Assessment of the National Institute of Standards and Technology Engineering Laboratory: Fiscal Year 2014 assesses the scientific and technical work performed by the NIST Engineering Laboratory. This report evaluates the organization's technical programs, its portfolio of scientific expertise, the adequacy of its facilities, equipment, and human resources, and the effectiveness with which it disseminates its program outputs.
|An Assessment of the National Institute of Standards and Technology Material Measurement Laboratory: Fiscal Year 2014
The National Institute of Standards and Technology's (NIST's) Material Measurement Laboratory (MML) is our nation's reference laboratory for measurements in the chemical, biological, and materials sciences and engineering. Staff of the MML develop state-of-the-art measurement techniques and conduct fundamental research related to measuring the composition, structure, and properties of substances. Tools that include reference materials, data, and measurement services are developed to support industries that range from transportation to biotechnology and to address problems such as climate change, environmental sciences, renewable energy, health care, infrastructure, food safety and nutrition, and forensics. This report assesses the scientific and technical work performed by NIST's Material Measurement Laboratory. In particular, the report assesses the organization's technical programs, its portfolio of scientific expertise, the adequacy of its facilities, equipment, and human resources, and the effectiveness with which it disseminates its program outputs.
|U.S. Air Force Strategic Deterrence Analytic Capabilities: An Assessment of Tools, Methods, and Approaches for the 21st Century Security Environment
Since the early 1960s, the U.S. strategic nuclear posture has been composed of a triad of nuclear-certified long-range bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles. Since the early 1970s, U.S. nuclear forces have been subject to strategic arms control agreements. The large numbers and diversified nature of the U.S. nonstrategic (tactical) nuclear forces, which cannot be ignored as part of the overall nuclear deterrent, have decreased substantially since the Cold War. While there is domestic consensus today on the need to maintain an effective deterrent, there is no consensus on precisely what that requires, especially in a changing geopolitical environment and with continued reductions in nuclear arms. This places a premium on having the best possible analytic tools, methods, and approaches for understanding how nuclear deterrence and assurance work, how they might fail, and how failure can be averted by U.S. nuclear forces. U.S. Air Force Strategic Deterrence Analytic Capabilities identifies the broad analytic issues and factors that must be considered in seeking nuclear deterrence of adversaries and assurance of allies in the 21st century. This report describes and assesses tools, methods - including behavioral science-based methods - and approaches for improving the understanding of how nuclear deterrence and assurance work or may fail in the 21st century and the extent to which such failures might be averted or mitigated by the proper choice of nuclear systems, technological capabilities, postures, and concepts of operation of American nuclear forces. The report recommends criteria and a framework for validating the tools, methods, and approaches and for identifying those most promising for Air Force usage.