|Review Criteria for Successful Treatment of Hydrolysate at the Blue Grass Chemical Agent Destruction Pilot Plant
In 1993, the United States signed the Chemical Weapons Convention (CWC), an international treaty outlawing the production, stockpiling, and use of chemical weapons. The chemical weapons stockpiles at five of the U.S. chemical weapons storage sites have now been destroyed. At those sites, the munitions were robotically opened and the chemical agent was removed, collected, and incinerated.
One of the remaining sites with chemical weapons stockpiles is the Blue Grass Army Depot near Richmond, Kentucky. In this case, caustic hydrolysis will be used to destroy the agents and energetics, resulting in a secondary waste stream known as hydrolysate. Review Criteria for Successful Treatment of Hydrolysate at the Blue Grass Chemical Agent Destruction Pilot Plant develops criteria for successfully treating the hydrolysate, identifies systemization data that should factor into the criteria and decision process, suggests potential modifications to the planned treatment process that would allow continued onsite processing, and assesses waste disposal procedures. This study also examines the possibility of delay or failure of the selected technology and possible alternatives to onsite treatment.
|Airport Passenger Screening Using Backscatter X-Ray Machines: Compliance with Standards
Released 2015-09-29 Forthcoming/Prepublication
Passenger screening at commercial airports in the United States has gone through significant changes since the events of September 11, 2001. In response to increased concern over terrorist attacks on aircraft, the Transportation Security Administration (TSA) has deployed advanced imaging technology (AIT) security systems to screen passengers at airports. To date (December 2014), TSA has deployed AITs of two different technologies in U.S. airports, each using a different type of radiation to detect threats: millimeter wave and X-ray backscatter AIT systems. X-ray backscatter AITs were deployed in U.S. airports in 2008 and subsequently removed from all airports by June 2013 due to privacy concerns. TSA is looking to deploy a second-generation X-ray backscatter AIT equipped with privacy software that eliminates production of an image of the person being screened in order to alleviate these concerns.
This report reviews previous studies, as well as current processes used by the Department of Homeland Security and equipment manufacturers, to estimate radiation exposures resulting from backscatter X-ray advanced imaging technology systems used to screen air travelers. Airport Passenger Screening Using Backscatter X-Ray Machines examines whether exposures comply with applicable health and safety standards for public and occupational exposure to ionizing radiation and whether system design, operating procedures, and maintenance procedures are appropriate to prevent overexposure of travelers and operators. This study aims to address concerns about exposure to radiation from X-ray backscatter AITs raised by Congress, individuals within the scientific community, and others.
|Cost, Effectiveness, and Deployment of Fuel Economy Technologies for Light-Duty Vehicles
The light-duty vehicle fleet is expected to undergo substantial technological changes over the next several decades. New powertrain designs, alternative fuels, advanced materials, and significant changes to the vehicle body are being driven by increasingly stringent fuel economy and greenhouse gas emission standards. By the end of the next decade, cars and light-duty trucks will be more fuel efficient, weigh less, emit fewer air pollutants, have more safety features, and be more expensive to purchase than current vehicles. Though the gasoline-powered spark ignition engine will continue to be the dominant powertrain configuration even through 2030, such vehicles will be equipped with advanced technologies, materials, electronics and controls, and aerodynamics. And by 2030, the deployment of alternative methods to propel and fuel vehicles and alternative modes of transportation, including autonomous vehicles, will be well underway. What are these new technologies - how will they work, and will some technologies be more effective than others?
Written to inform the United States Department of Transportation's National Highway Traffic Safety Administration (NHTSA) and Environmental Protection Agency (EPA) Corporate Average Fuel Economy (CAFE) and greenhouse gas (GHG) emission standards, this new report from the National Research Council is a technical evaluation of the costs, benefits, and implementation issues of fuel reduction technologies for next-generation light-duty vehicles. Cost, Effectiveness, and Deployment of Fuel Economy Technologies for Light-Duty Vehicles estimates the cost, potential efficiency improvements, and barriers to commercial deployment of technologies that might be employed from 2020 to 2030. This report describes these promising technologies and makes recommendations for their inclusion on the list of technologies applicable for the 2017-2025 CAFE standards.
|Applying Materials State Awareness to Condition-Based Maintenance and System Life Cycle Management: Summary of a Workshop
Released 2015-09-25 Forthcoming/Prepublication
In August 2014, the Committee on Defense Materials Manufacturing and Infrastructure convened a workshop to discuss issues related to applying materials state awareness to condition-based maintenance and system life cycle management. The workshop was structured around three focal topics: (1) advances in metrology and experimental methods, (2) advances in physics-based models for assessment, and (3) advances in databases and diagnostic technologies. This report summarizes the discussions and presentations from this workshop.
|Review of the MEPAG Report on Mars Special Regions
Released 2015-09-21 Forthcoming/Prepublication
Planetary protection is a guiding principle in the design of an interplanetary mission, aiming to prevent biological contamination of both the target celestial body and the Earth. High-priority science goals, such as the search for life and the understanding of the Martian organic environment, may be compromised if Earth microbes carried by spacecraft grow and spread on Mars. This has led to the definition of Special Regions on Mars, areas where strict planetary protection measures must be applied before a spacecraft can enter.
At NASA's request, the community-based Mars Exploration Program Analysis Group (MEPAG) established the Special Regions Science Analysis Group (SR-SAG2) in October 2013 to examine the quantitative definition of a Special Region and to propose modifications to it, as necessary, based upon the latest scientific results. Review of the MEPAG Report on Mars Special Regions reviews the conclusions and recommendations contained in MEPAG's SR-SAG2 report and assesses their consistency with current understanding of both the Martian environment and the physical and chemical limits for the survival and propagation of microbial and other life on Earth. This report provides recommendations for an update of the planetary protection requirements for Mars Special Regions.
|A Strategy for Active Remote Sensing Amid Increased Demand for Radio Spectrum
Active remote sensing is the principal tool used to study and to predict short- and long-term changes in the environment of Earth - the atmosphere, the oceans, and the land surfaces - as well as the near space environment of Earth. All of these measurements are essential to understanding terrestrial weather, climate change, space weather hazards, and threats from asteroids. Active remote sensing measurements are of inestimable benefit to society as we pursue the development of an economically viable technological civilization and seek to maintain our quality of life.
A Strategy for Active Remote Sensing Amid Increased Demand for Radio Spectrum describes the threats, both current and future, to the effective use of the electromagnetic spectrum required for active remote sensing. This report offers specific recommendations for protecting and making effective use of the spectrum required for active remote sensing.
|The Growing Gap in Life Expectancy by Income: Implications for Federal Programs and Policy Responses
The U.S. population is aging. Social Security projections suggest that between 2013 and 2050, the population aged 65 and over will almost double, from 45 million to 86 million. One key driver of population aging is ongoing increases in life expectancy. Average U.S. life expectancy was 67 years for males and 73 years for females five decades ago; the averages are now 76 and 81, respectively. It has long been the case that better-educated, higher-income people enjoy longer life expectancies than less-educated, lower-income people. The causes include early life conditions, behavioral factors (such as nutrition, exercise, and smoking behaviors), stress, and access to health care services, all of which can vary across education and income.
Our major entitlement programs – Medicare, Medicaid, Social Security, and Supplemental Security Income – have come to deliver disproportionately larger lifetime benefits to higher-income people because, on average, they are increasingly collecting those benefits over more years than others. This report studies the impact the growing gap in life expectancy has on the present value of lifetime benefits that people with higher or lower earnings will receive from major entitlement programs. The analysis presented in The Growing Gap in Life Expectancy by Income goes beyond an examination of the existing literature by providing the first comprehensive estimates of how lifetime benefits are affected by the changing distribution of life expectancy. The report also explores, from a lifetime benefit perspective, how the growing gap in longevity affects traditional policy analyses of reforms to the nation’s leading entitlement programs. This in-depth analysis of the economic impacts of the longevity gap will inform debate and assist decision makers, economists, and researchers.
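The core mechanism the report quantifies, that a fixed annual benefit is worth more in present value to someone who survives longer to collect it, can be sketched with a toy calculation. The survival curves, benefit level, and discount rate below are purely illustrative assumptions, not the report's data or methodology:

```python
def pv_lifetime_benefits(annual_benefit, survival_probs, discount_rate):
    """Discounted expected lifetime benefits from the claiming age.

    survival_probs[t] is the probability of being alive t years after
    claiming; discounting uses a constant real rate.
    """
    return sum(
        annual_benefit * s / (1 + discount_rate) ** t
        for t, s in enumerate(survival_probs)
    )

# Illustrative only: two retirees with identical $20,000 annual benefits
# but different survival curves (a gentler mortality profile standing in
# for higher income, a steeper one for lower income).
long_lived = [0.98 ** t for t in range(30)]
short_lived = [0.94 ** t for t in range(30)]

pv_high = pv_lifetime_benefits(20_000, long_lived, 0.03)
pv_low = pv_lifetime_benefits(20_000, short_lived, 0.03)
```

In this toy example the longer-lived profile collects substantially more in present value from the identical annual benefit; the report's contribution is estimating that gap with real mortality and earnings distributions.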
|Review of the 21st Century Truck Partnership: Third Report
Released 2015-09-11 Forthcoming/Prepublication
The 21st Century Truck Partnership (21CTP) works to reduce fuel consumption and emissions, increase heavy-duty vehicle safety, and support research, development, and demonstration to initiate commercially viable products and systems. This report is the third in a series of three by the National Academies of Sciences, Engineering, and Medicine that have reviewed the research and development initiatives carried out by the 21CTP. Review of the 21st Century Truck Partnership, Third Report builds on the Phase 1 and 2 reviews and reports, and also comments on changes and progress since the Phase 2 report was issued in 2012.
|Improving the Air Force Scientific Discovery Mission: Leveraging Best Practices in Basic Research Management: A Workshop Report
In 2014, the Air Force Studies Board conducted two workshops to review current research practices employed by the Air Force Office of Scientific Research (AFOSR). Improving the Air Force Scientific Discovery Mission summarizes the presentation and discussion of these two workshops. This report explores the unique drivers associated with management of a 6.1 basic research portfolio in the Department of Defense and investigates current and future practices that may further the effective and efficient management of basic research on behalf of the Air Force.
|Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplain
Floods take a heavy toll on society, costing lives, damaging buildings and property, disrupting livelihoods, and sometimes necessitating federal disaster relief, which has risen to record levels in recent years. The National Flood Insurance Program (NFIP) was created in 1968 to reduce the flood risk to individuals and their reliance on federal disaster relief by making federal flood insurance available to residents and businesses if their community adopted floodplain management ordinances and minimum standards for new construction in flood-prone areas. Insurance rates for structures built after a floodplain map was adopted by the community were intended to reflect the actual risk of flooding, taking into account the likelihood of inundation, the elevation of the structure, and the relationship of inundation to damage to the structure. Today, rates are subsidized for one-fifth of the NFIP's 5.5 million policies. Most of these structures are negatively elevated, that is, the elevation of the lowest floor is lower than the NFIP construction standard. Compared to structures built above the base flood elevation, negatively elevated structures are more likely to incur a loss because they are inundated more frequently, and the depths and durations of inundation are greater.
Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplain studies the pricing of negatively elevated structures in the NFIP. This report reviews current NFIP methods for calculating risk-based premiums for these structures, including risk analysis, flood maps, and engineering data. The report then evaluates alternative approaches for calculating risk-based premiums and discusses the engineering, hydrologic, and property assessment data needed to implement full risk-based premiums. The findings and conclusions of this report will help to improve the accuracy and precision of loss estimates for negatively elevated structures, which in turn will increase the credibility, fairness, and transparency of premiums for policyholders.
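The idea of a risk-based premium can be sketched as an expected annual loss: sum, over possible flood depths at the structure, the annual probability of that depth times the damage it causes. A toy version follows, with hypothetical depth probabilities and a hypothetical depth-damage curve (not NFIP actuarial tables), showing why a negatively elevated structure, whose lowest floor is reached by frequent shallow floods, carries a larger expected loss:

```python
def expected_annual_loss(depth_probs, damage_fraction, structure_value):
    """Expected annual flood loss for one structure.

    depth_probs maps flood depth in feet, measured relative to the
    lowest floor, to its annual exceedance-style probability;
    damage_fraction maps depth to the fraction of value damaged.
    """
    return sum(
        p * damage_fraction(depth) * structure_value
        for depth, p in depth_probs.items()
    )

def simple_damage_curve(depth_ft):
    # Hypothetical relationship: no damage while water stays below the
    # lowest floor, then damage rising with depth toward a cap.
    if depth_ft <= 0:
        return 0.0
    return min(0.1 * depth_ft, 0.8)

# Same flood regime, different construction: the negatively elevated
# structure sees each flood 2 ft deeper relative to its lowest floor.
negatively_elevated = {1: 0.04, 3: 0.01, 5: 0.002}
elevated = {-1: 0.04, 1: 0.01, 3: 0.002}

eal_neg = expected_annual_loss(negatively_elevated, simple_damage_curve, 200_000)
eal_above = expected_annual_loss(elevated, simple_damage_curve, 200_000)
```

Under these made-up numbers the negatively elevated structure's expected annual loss is several times larger, which is the gap a full risk-based premium would have to price.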
|The Space Science Decadal Surveys: Lessons Learned and Best Practices
Released 2015-07-29 Forthcoming/Prepublication
The National Research Council has conducted 11 decadal surveys in the Earth and space sciences since 1964 and released the latest four surveys in the past 8 years. The decadal surveys are notable in their ability to sample thoroughly the research interests, aspirations, and needs of a scientific community. Through a rigorous process, a primary survey committee and thematic panels of community members construct a prioritized program of science goals and objectives and define an executable strategy for achieving them. These reports play a critical role in defining the nation's agenda in that science area for the following 10 years, and often beyond.
The Space Science Decadal Surveys considers the lessons learned from previous surveys and presents options for possible changes and improvements to the process, including the statement of task, advanced preparation, organization, and execution. This report discusses valuable aspects of decadal surveys that could be taken further, as well as some challenges future surveys are likely to face in searching for the richest areas of scientific endeavor, seeking community consensus on where to go next, and planning how to get there. The Space Science Decadal Surveys describes aspects of the decadal survey prioritization process, including balance in the science program and across the discipline; balance between the needs of current researchers and the development of the future workforce; and balance in mission scale - smaller, competed programs versus large strategic missions.
|Transformation in the Air: A Review of the FAA's Certification Research Plan
The Federal Aviation Administration (FAA) is currently undertaking a broad program known as Next Generation Air Transportation System (NextGen) to develop, introduce, and certify new technologies into the National Airspace System. NextGen is a fundamentally transformative change that is being implemented incrementally over a period of many years. Currently, the FAA is putting into place the foundation that provides support for the future building blocks of a fully operational NextGen. NextGen is a challenging undertaking that includes ground systems, avionics installed in a wide range of aircraft, and procedures to take advantage of the new technology. Transformation in the Air assesses the FAA's plan for research on methods and procedures to improve both confidence in and the timeliness of certification of new technologies for their introduction into the National Airspace System. This report makes recommendations to include both ground and air elements and document the plan's relationship to the other activities and procedures required for certification and implementation into the National Airspace System.
|A Community-Based Flood Insurance Option
River and coastal floods are among the nation's most costly natural disasters. One component in the nation's approach to managing flood risk is availability of flood insurance policies, which are offered on an individual basis primarily through the National Flood Insurance Program (NFIP). Established in 1968, the NFIP is overseen by the Federal Emergency Management Agency (FEMA) and there are about 5.4 million individual policies in the NFIP. The program has experienced a mixture of successes and persistent challenges. Successes include a large number of policy holders, the insurance of approximately $1.3 trillion of property, and the fact that the large majority of policy holders - 80% - pay rates that are risk based. NFIP challenges include large program debt, relatively low rates of purchase in many flood-prone areas, a host of issues regarding affordability of premiums, ensuring that premiums collected cover payouts and administrative fees, and a large number of properties that experience severe repetitive flood losses.
At the request of FEMA, A Community-Based Flood Insurance Option identifies a range of key issues and questions that would merit consideration and further analysis as part of a community-based flood insurance program. As the report describes, the community-based option certainly offers potential benefits, such as the prospect of providing coverage for all (or nearly all) at-risk residents and properties in flood-prone communities. At the same time, many current challenges facing the NFIP may not necessarily be resolved by a community-based approach. This report discusses these and other prominent issues to be considered and further assessed.
|Owning the Technical Baseline for Acquisition Programs in the U.S. Air Force: A Workshop Report
The U.S. Air Force has experienced many acquisition program failures - cost overruns, schedule delays, system performance problems, and sustainability concerns - over program lifetimes. A key contributing factor is the lack of sufficient technical knowledge within the Air Force concerning the systems being acquired to ensure success. To examine this issue, the Assistant Secretary of the Air Force for Acquisition requested that the Air Force Studies Board of the National Research Council undertake a workshop to identify the essential elements of the technical baseline - the data and information needed to establish, trade off, verify, change, accept, and sustain functional capabilities, design characteristics, affordability, schedule, and quantified performance parameters at the chosen level of the system hierarchy - that would benefit from realignment under Air Force or government ownership, and the value to the Air Force of regaining ownership under its design capture process of the future. Over the course of three workshops from November 2014 through January 2015, presenters and participants identified the barriers that must be addressed for the Air Force to regain technical baseline control, including workforce, policy and process, funding, culture, contracts, and other factors, and provided terms of reference for a possible follow-on study to explore the issues and make the recommendations required to implement and institutionalize the technical baseline concept. Owning the Technical Baseline for Acquisition Programs in the U.S. Air Force summarizes the presentations and discussion of the three workshops.
|Optimizing the U.S. Ground-Based Optical and Infrared Astronomy System
New astronomical facilities, such as the under-construction Large Synoptic Survey Telescope and planned 30-meter-class telescopes, and new instrumentation on existing optical and infrared (OIR) telescopes, hold the promise of groundbreaking research and discovery. How can we extract the best science from these and other astronomical facilities in an era of potentially flat federal budgets for both the facilities and the research grants? Optimizing the U.S. Ground-Based Optical and Infrared Astronomy System provides guidance for these new programs that aligns with the scientific priorities and the conclusions and recommendations of two National Research Council (NRC) decadal surveys, New Worlds, New Horizons for Astronomy and Astrophysics and Vision and Voyages for Planetary Science in the Decade 2013-2022, as well as other NRC reports. This report describes a vision for a U.S. OIR System that includes a telescope time exchange designed to enhance science return by broadening access to capabilities for a diverse community, an ongoing planning process to identify and construct next-generation capabilities to realize decadal science priorities, and the near-term critical coordination, planning, and instrumentation needed to usher in the era of LSST and giant telescopes.
|Overcoming Barriers to Deployment of Plug-in Electric Vehicles
In the past few years, interest in plug-in electric vehicles (PEVs) has grown. Advances in battery and other technologies, new federal standards for carbon-dioxide emissions and fuel economy, state zero-emission-vehicle requirements, and the current administration's goal of putting millions of alternative-fuel vehicles on the road have all highlighted PEVs as a transportation alternative. Consumers are also beginning to recognize the advantages of PEVs over conventional vehicles, such as lower operating costs, smoother operation, and better acceleration; the ability to fuel up at home; and zero tailpipe emissions when the vehicle operates solely on its battery. There are, however, barriers to PEV deployment, including the vehicle cost, the short all-electric driving range, the long battery charging time, uncertainties about battery life, the few choices of vehicle models, and the need for a charging infrastructure to support PEVs. What should industry do to improve the performance of PEVs and make them more attractive to consumers? At the request of Congress, Overcoming Barriers to Deployment of Plug-in Electric Vehicles identifies barriers to the introduction of electric vehicles and recommends ways to mitigate these barriers. This report examines the characteristics and capabilities of electric vehicle technologies, such as cost, performance, range, safety, and durability, and assesses how these factors might create barriers to widespread deployment. Overcoming Barriers to Deployment of Plug-in Electric Vehicles provides an overview of the current status of PEVs and makes recommendations to spur the industry and increase the attractiveness of this promising technology for consumers. Through consideration of consumer behaviors, tax incentives, business models, incentive programs, and infrastructure needs, this book studies the state of the industry and makes recommendations to further its development and acceptance.
|Interim Report on 21st Century Cyber-Physical Systems Education
Cyber-physical systems (CPS) are increasingly relied on to provide the functionality and value to products, systems, and infrastructure in sectors including transportation, health care, manufacturing, and electrical power generation and distribution. CPS are smart, networked systems with embedded sensors, computer processors, and actuators that sense and interact with the physical world; support real-time, guaranteed performance; and are often found in critical applications. Cyber-physical systems have the potential to provide much richer functionality, including efficiency, flexibility, autonomy, and reliability, than systems that are loosely coupled, discrete, or manually operated, but also can create vulnerability related to security and reliability. Advances in CPS could yield systems that can communicate and respond faster than humans; enable better control and coordination of large-scale systems, such as the electrical grid or traffic controls; improve the efficiency of systems; and enable advances in many areas of science. As CPS become more pervasive, so too will demand for a workforce with the capacity and capability to design, develop, and maintain them. Building on its research program in CPS, the National Science Foundation (NSF) has begun to explore requirements for education and training. As part of that exploration, NSF asked the National Research Council of the National Academies to study the topic. Two workshops were convened in 2014, on April 30 and October 2-3 in Washington, D.C., to explore the knowledge and skills required for CPS work, education, and training requirements and possible approaches to retooling engineering and computer science programs and curricula to meet these needs. Interim Report on 21st Century Cyber-Physical Systems Education highlights emerging themes and summarizes related discussions from the workshops.
|A Review of the Next Generation Air Transportation System: Implications and Importance of System Architecture
The Next Generation Air Transportation System's (NextGen) goal is the transformation of the U.S. national airspace system through programs and initiatives that could make it possible to shorten routes, navigate better around weather, save time and fuel, reduce delays, and improve capabilities for monitoring and managing of aircraft. A Review of the Next Generation Air Transportation System provides an overview of NextGen and examines the technical activities, including human-system design and testing, organizational design, and other safety and human factor aspects of the system, that will be necessary to successfully transition current and planned modernization programs to the future system. This report assesses technical, cost, and schedule risk for the software development that will be necessary to achieve the expected benefits from a highly automated air traffic management system and the implications for ongoing modernization projects. The recommendations of this report will help the Federal Aviation Administration anticipate and respond to the challenges of implementing NextGen.
|Sharing the Adventure with the Student: Exploring the Intersections of NASA Space Science and Education: A Workshop Summary
On December 2-3, 2014, the Space Studies Board and the Board on Science Education of the National Research Council held a workshop on the NASA Science Mission Directorate (SMD) education program - "Sharing the Adventure with the Student." The workshop brought together representatives of the space science and science education communities to discuss maximizing the effectiveness of the transfer of knowledge from the scientists supported by NASA's SMD to K-12 students directly and to teachers and informal educators. The workshop focused not only on the effectiveness of recent models for transferring science content and scientific practices to students, but also served as a venue for dialogue between education specialists, education staff from NASA and other agencies, space scientists and engineers, and science content generators. Workshop participants reviewed case studies of scientists or engineers who were able to successfully translate their research results and research experiences into formal and informal student science learning. Education specialists shared how science can be translated to education materials and directly to students, and teachers shared their experiences of space science in their classrooms. Sharing the Adventure with the Student is the summary of the presentation and discussions of the workshop.
|Space Studies Board Annual Report 2014
The original charter of the Space Science Board was established in June 1958, 3 months before the National Aeronautics and Space Administration (NASA) opened its doors. The Space Science Board and its successor, the Space Studies Board (SSB), have provided expert external and independent scientific and programmatic advice to NASA on a continuous basis from NASA's inception until the present. The SSB has also provided such advice to other executive branch agencies, including the National Oceanic and Atmospheric Administration (NOAA), the National Science Foundation (NSF), the U.S. Geological Survey (USGS), and the Department of Defense, as well as to Congress. Space Studies Board Annual Report 2014 opens with a message from the chair of the SSB, David N. Spergel. This report also explains the origins of the Space Science Board, describes how the Space Studies Board functions today and how it collaborates with other National Research Council units, explains how the quality of SSB reports is assured, acknowledges the board's audiences and sponsors, and expresses the need to enhance outreach and improve dissemination of SSB reports. This report will be relevant to a full range of government audiences in civilian space research - including NASA, NSF, NOAA, USGS, and the Department of Energy - as well as members of the SSB, policy makers, and researchers.
|Review Criteria for Successful Treatment of Hydrolysate at the Pueblo Chemical Agent Destruction Pilot Plant
One of the last two sites with chemical munitions and chemical materiel is the Pueblo Chemical Depot in Pueblo, Colorado. The stockpile at this location consists of about 800,000 projectiles and mortars, all of which are filled with the chemical agent mustard. Under the direction of the Assembled Chemical Weapons Alternative Program (ACWA), the Army has constructed the Pueblo Chemical Agent Destruction Pilot Plant (PCAPP) to destroy these munitions. The primary technology to be used to destroy the mustard agent at PCAPP is hydrolysis, resulting in a secondary waste stream referred to as hydrolysate. PCAPP features a process that will be used to treat the hydrolysate and the thiodiglycol - a breakdown product of mustard - contained within. The process is a biotreatment technology that uses what are known as immobilized cell bioreactors. After biodegradation, the effluent flows to a brine reduction system, producing a solidified filter cake that is intended to be sent offsite to a permitted hazardous waste disposal facility. Water recovered from the brine reduction system is intended to be recycled back through the plant, thereby reducing the amount of water that is withdrawn from groundwater. Although biotreatment of toxic chemicals, brine reduction, and water recovery are established technologies, never before have these technologies been combined to treat mustard hydrolysate. At the request of the U.S. Army, Review Criteria for Successful Treatment of Hydrolysate at the Pueblo Chemical Agent Destruction Pilot Plant reviews the criteria for successfully treating the hydrolysate. 
This report provides information on the composition of the hydrolysate and describes the PCAPP processes for treating it; discusses stakeholder concerns; reviews regulatory considerations at the federal, state, and local levels; discusses Department of Transportation regulations and identifies risks associated with the offsite shipment of hydrolysate; establishes criteria for successfully treating the hydrolysate and identifies systemization data that should factor into the criteria and decision process for offsite transport and disposal of the hydrolysate; and discusses failure risks and contingency options as well as the downstream impacts of a decision to ship hydrolysate offsite.
|Affordability of National Flood Insurance Program Premiums: Report 1
The National Flood Insurance Program (NFIP) is housed within the Federal Emergency Management Agency (FEMA) and offers insurance policies that are marketed and sold through private insurers, but with the risks borne by the U.S. federal government. NFIP's primary goals are to ensure affordable insurance premiums, secure widespread community participation in the program, and earn premium and fee income that covers claims paid and program expenses over time. In July 2012, the U.S. Congress passed the Biggert-Waters Flood Insurance Reform and Modernization Act (Biggert-Waters 2012), designed to move toward an insurance program with NFIP risk-based premiums that better reflected expected losses from floods at insured properties. The legislation eliminated the rate classes the NFIP called "pre-FIRM subsidized" and "grandfathered." As Biggert-Waters 2012 went into effect, constituents from multiple communities expressed concerns about the elimination of lower rate classes, arguing that it created a financial burden on policy holders. In response to these concerns, Congress passed the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). The 2014 legislation changed the process by which pre-FIRM subsidized premiums for primary residences would be removed and reinstated grandfathering. As part of that legislation, FEMA must report back to Congress with a draft affordability framework. Affordability of National Flood Insurance Program Premiums: Report 1 is the first part of a two-part study to provide input as FEMA prepares its draft affordability framework. This report discusses the underlying definitions and methods for an affordability framework and the affordability concept and applications. Affordability of National Flood Insurance Program Premiums gives an overview of the demand for insurance and the history of NFIP premium setting.
The report then describes alternatives for determining when the premium increases resulting from Biggert-Waters 2012 would make flood insurance unaffordable.
|Bulk Collection of Signals Intelligence: Technical Options
The Bulk Collection of Signals Intelligence: Technical Options study is a result of an activity called for in Presidential Policy Directive 28 (PPD-28), issued by President Obama in January 2014, to evaluate U.S. signals intelligence practices. The directive instructed the Office of the Director of National Intelligence (ODNI) to produce a report within one year "assessing the feasibility of creating software that would allow the intelligence community more easily to conduct targeted information acquisition rather than bulk collection." ODNI asked the National Research Council (NRC) - the operating arm of the National Academy of Sciences and National Academy of Engineering - to conduct a study, which began in June 2014, to assist in preparing a response to the President. Over the ensuing months, a committee of experts appointed by the Research Council produced the report.
|Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges
Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges is an independent assessment regarding the transition of the National Nuclear Security Administration (NNSA) laboratories - Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories - to multiagency, federally funded research and development centers with direct sustainment and sponsorship by multiple national security agencies. This report makes recommendations for the governance of NNSA laboratories to better align with the evolving national security landscape and the laboratories' increasing engagement with the other national security agencies, while simultaneously encouraging the best technical solutions to national problems from the entire range of national security establishments. According to this report, the Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers. The NNSA laboratories will remain a critically important resource to meet U.S. national security needs for many decades to come. The recommendations of Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges will improve the governance of the laboratories and strengthen their strategic relationship with the non-DOE national security agencies.
|Opportunities for the Employment of Simulation in U.S. Air Force Training Environments: A Workshop Report
Simulators currently provide an alternative to aircraft for meeting training requirements, both for the military and for commercial airlines. For the U.S. Air Force in particular, simulation for training offers a cost-effective, and in many instances safer, alternative to live flying for replicating real-world missions. Current technical issues related to simulation for training include simulation fidelity and multi-level security, among others, which will need to be addressed before the Air Force can take full advantage of this technology.
The workshop, held in November 2014, examined the current status of simulation training, alternative uses, current and future technologies, and how the combination of simulation and live training can improve aircrew training. The workshop focused on technologies and practices that could be applicable to high-end aircraft simulations.
|Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment: Summary of a Workshop
The National Marine Fisheries Service (NMFS) is responsible for the stewardship of the nation's living marine resources and their habitat. As part of this charge, NMFS conducts stock assessments of the abundance and composition of fish stocks in several bodies of water. At present, stock assessments rely heavily on human data-gathering and analysis. Automated methods of fish stock assessment are appealing because they offer the potential to improve efficiency, reduce human workload, and perhaps provide higher-fidelity measurements. The use of images and video, when accompanied by appropriate statistical analyses of the inferred data, is of increasing importance for estimating the abundance of species and their age distributions. Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment is the summary of a workshop convened by the National Research Council Committee on Applied and Theoretical Statistics to discuss analysis techniques for images and videos for fisheries stock assessment. Experts from diverse communities shared perspectives about the most efficient path toward improved automation of visual information and discussed both near-term and long-term goals that can be achieved through research and development efforts. This report is a record of the presentations and discussions of this event.
|2013-2014 Assessment of the Army Research Laboratory
The National Research Council's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory, focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences. 2013-2014 Assessment of the Army Research Laboratory summarizes the findings of the Board for the first biennial assessment. This report discusses the biennial assessment process used by ARLTAB and its five panels; provides detailed assessments of each of the ARL core technical competency areas reviewed during the 2013-2014 period; and presents findings and recommendations common across multiple competency areas.
|A Review of the U.S. Navy Cyber Defense Capabilities: Abbreviated Version of a Classified Report
In order to conduct operations successfully and defend its capabilities across all warfighting domains, the Department of Defense (DoD) must counter a serious cyber threat. For several years, many within and outside DoD have warned of the severity of this threat and called for ever greater attention to defending against potential cyber attacks. At the request of the Chief of Naval Operations, the National Research Council appointed an expert committee to review the U.S. Navy's cyber defense capabilities. The Department of the Navy has determined that the final report prepared by the committee is classified in its entirety under Executive Order 13526 and therefore cannot be made available to the public. A Review of U.S. Navy Cyber Defense Capabilities is the abbreviated report and provides background information on the full report and the committee that prepared it.
|Training Students to Extract Value from Big Data: Summary of a Workshop
As the availability of high-throughput data-collection technologies, such as information-sensing mobile devices, remote sensing, internet log records, and wireless sensor networks has grown, science, engineering, and business have rapidly transitioned from striving to develop information from scant data to a situation in which the challenge is now that the amount of information exceeds a human's ability to examine, let alone absorb, it. Data sets are increasingly complex, and this potentially increases the problems associated with such concerns as missing information and other quality concerns, data heterogeneity, and differing data formats. The nation's ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable of exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program. Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council's Committee on Applied and Theoretical Statistics to explore how best to train students to use big data.
The workshop explored the need for training and curricula and coursework that should be included. One impetus for the workshop was the current fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and they have their own notions of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula.
|An Assessment of the National Institute of Standards and Technology Engineering Laboratory: Fiscal Year 2014
The mission of the Engineering Laboratory of the National Institute of Standards and Technology (NIST) is to promote U.S. innovation and industrial competitiveness through measurement science and standards for technology-intensive manufacturing, construction, and cyberphysical systems in ways that enhance economic prosperity and improve the quality of life. To support this mission, the Engineering Laboratory has developed thrusts in smart manufacturing, construction, and cyberphysical systems; in sustainable and energy-efficient manufacturing materials and infrastructure; and in disaster-resilient buildings, infrastructure, and communities. The technical work of the Engineering Laboratory is performed in five divisions: Intelligent Systems; Materials and Structural Systems; Energy and Environment; Systems Integration; and Fire Research; and two offices: Applied Economics Office and Smart Grid Program Office. An Assessment of the National Institute of Standards and Technology Engineering Laboratory: Fiscal Year 2014 assesses the scientific and technical work performed by the NIST Engineering Laboratory. This report evaluates the organization's technical programs, portfolio of scientific expertise within the organization, adequacy of the organization's facilities, equipment, and human resources, and the effectiveness by which the organization disseminates its program outputs.
|An Assessment of the National Institute of Standards and Technology Material Measurement Laboratory: Fiscal Year 2014
The National Institute of Standards and Technology's (NIST's) Material Measurement Laboratory (MML) is our nation's reference laboratory for measurements in the chemical, biological, and materials sciences and engineering. Staff of the MML develop state-of-the-art measurement techniques and conduct fundamental research related to measuring the composition, structure, and properties of substances. Tools that include reference materials, data, and measurement services are developed to support industries that range from transportation to biotechnology and to address problems such as climate change, environmental sciences, renewable energy, health care, infrastructure, food safety and nutrition, and forensics. This report assesses the scientific and technical work performed by NIST's Material Measurement Laboratory. In particular, the report assesses the organization's technical programs, the portfolio of scientific expertise within the organization, the adequacy of the organization's facilities, equipment, and human resources, and the effectiveness by which the organization disseminates its program outputs.
|U.S. Air Force Strategic Deterrence Analytic Capabilities: An Assessment of Tools, Methods, and Approaches for the 21st Century Security Environment
Since the early 1960s, the U.S. strategic nuclear posture has been composed of a triad of nuclear-certified long-range bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles. Since the early 1970s, U.S. nuclear forces have been subject to strategic arms control agreements. The large numbers and diversified nature of the U.S. nonstrategic (tactical) nuclear forces, which cannot be ignored as part of the overall nuclear deterrent, have decreased substantially since the Cold War. While there is domestic consensus today on the need to maintain an effective deterrent, there is no consensus on precisely what that requires, especially in a changing geopolitical environment and with continued reductions in nuclear arms. This places a premium on having the best possible analytic tools, methods, and approaches for understanding how nuclear deterrence and assurance work, how they might fail, and how failure can be averted by U.S. nuclear forces. U.S. Air Force Strategic Deterrence Analytic Capabilities identifies the broad analytic issues and factors that must be considered in seeking nuclear deterrence of adversaries and assurance of allies in the 21st century. This report describes and assesses tools, methods - including behavioral science-based methods - and approaches for improving the understanding of how nuclear deterrence and assurance work or may fail in the 21st century and the extent to which such failures might be averted or mitigated by the proper choice of nuclear systems, technological capabilities, postures, and concepts of operation of American nuclear forces. The report recommends criteria and a framework for validating the tools, methods, and approaches and for identifying those most promising for Air Force usage.