The National Academies of Sciences, Engineering, and Medicine
Board on Mathematical Sciences and Analytics

BMSA & CATS Impacts

 

Refining the Concept of Scientific Inference When Working with Big Data
The use of big data to enable scientific development has generated excitement and investment from both the public and private sectors over the past decade, but careful consideration must be given to available data and statistical modeling techniques to ensure that reliable inferences can be made. Refining the Concept of Scientific Inference When Working with Big Data: Proceedings of a Workshop summarizes a 2017 workshop, convened for the National Institutes of Health, that explored new methodological developments that hold significant promise, potential research program areas for the future, and critical challenges and opportunities in performing reliable scientific inference with big data. The proceedings is in the top 2 percent of all National Academies’ downloads and has reached readers in 140 countries. It has informed a diverse array of research, government, and industry specialists, from being cited in articles published in Circulation: Cardiovascular Quality and Outcomes to serving as a training reference for program staff at various health agencies. The topics discussed at the workshop have inspired research in genetic epidemiology, limnology, modeling in high-dimensional semantic spaces, proteomics, remote sensing from satellite data, renewable energy systems, and water utilities. The proceedings has also been used as a teaching reference for courses in the analysis and interpretation of healthcare data sets, big data analytics, and mechanics and materials.

Download the workshop proceedings
Watch videos of the workshop presentations

 

Strengthening Data Science Methods for Department of Defense Personnel and Readiness Missions
Techniques such as complex network analysis, machine learning, and anomaly detection are among the many mathematical and computational tools used in intelligence, surveillance, and reconnaissance that could also be applied to the personnel and readiness enterprise. Strengthening Data Science Methods for Department of Defense Personnel and Readiness Missions outlines strategies for the implementation and integration of data analysis in support of personnel and readiness decisions in the U.S. Department of Defense. Since its release in 2017, the report has been downloaded in 124 countries. It has been used by diverse government agencies to enhance workforce development and improve protocols, and within academia to develop new research centers, to inform research on analytic methods for personnel behavior modeling and human resources agency development, and to teach courses on smart cities, machine learning in geoinformatics, and data science for military personnel.

Download the full report

 

Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results

A lack of reproducibility in scientific research may be due to failures in experimental design, data management, data analysis, and statistical expertise. Researchers can also be swayed by publication incentives and pressures to manipulate results toward a better outcome. This lack of reproducibility creates distrust of the scientific community, yet there is no consensus on how best to alleviate the problem. Supported by the National Science Foundation, Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results discusses the extent of irreproducibility, the causes of reproducibility failures, and potential remedies for these failures. Since the workshop summary’s release in March 2016, it has been downloaded nearly 4,000 times in 122 countries. Described as the “Analytical Bible” for reproducibility by the Department of Biomedical Informatics at Harvard Medical School, the workshop summary shows how statistics affects a broad community. It has been consulted by various state and federal government agencies and private institutions as they develop reproducible research frameworks, evaluate research findings, and help researchers improve the ways they communicate their results. The workshop summary has been useful for educators in the physical sciences, statistics, and elsewhere who wish to introduce their students to important scientific issues, as well as for administrators who wish to offer professional development to their faculty. Journal editors have also found the text useful in shaping guidelines for the reproducibility and repeatability of published research. The workshop summary has been used as a resource for other workshops and activities examining reproducibility challenges.

Watch videos of workshop presentations
Download the workshop proceedings


Analytic Research Foundations for the Next-Generation Electric Grid
Electricity is essential for modern society, and the next-generation electric grid must be flexible, resilient, and capable of building on its pre-existing analytic foundations. Sponsored by the U.S. Department of Energy (DOE), Analytic Research Foundations for the Next-Generation Electric Grid provides guidance on the critical mathematical and computational science research needed to enhance the 21st century grid. Since its release in 2016, the report has been viewed by individuals in all 50 states, the District of Columbia, and 126 countries, putting it in the top 3 percent of all downloads for National Academies’ products. The report has been cited by authors in numerous top journals, including Proceedings of the IEEE; IEEE Transactions on Power Systems; Energies; and Complexity. Additionally, the report helped to facilitate the development of the Algorithms for Modern Power Systems research program, jointly run by the DOE and the National Science Foundation, which provided more than $1 million in grant funding to universities across the country in 2017. The report has also been used as educational course material and as background literature for international energy policies. The study included a 2015 workshop on mathematical sciences research challenges for the next-generation electric grid, which identified mathematical and computational challenges in electric transmission and distribution systems, as well as future needs and ways that current research efforts could be augmented.

Download the full report
Download the workshop summary
Watch videos of the workshop presentations
 

From Maps to Models: Augmenting the Nation’s Geospatial Intelligence Capabilities
Augmenting imagery analysis and mapping for the management of national security risks could enable anticipation and exploration of future outcomes for threats such as terrorism, scarcity and disruption of food and water supplies, extreme weather events, and regional conflicts around the world. The 2016 consensus study report From Maps to Models: Augmenting the Nation’s Geospatial Intelligence Capabilities describes the types of models and analytical methods used to understand real-world systems, discusses what would be required to make these models and methods useful for geospatial intelligence, and identifies supporting research and development for the National Geospatial-Intelligence Agency (NGA). The report was prepared in collaboration with the National Academies’ Board on Earth Sciences and Resources and the Board on Atmospheric Sciences and Climate. The report is in the top 7 percent of all National Academies Press products, and it has been downloaded in 115 countries. The report has enhanced government understanding of real-time predictive models that incorporate geospatial information as natural events occur, aided local governments in improving energy and food sustainability with spatial analysis, and informed international reports on the future of geographic information system (GIS) mapping technologies for defense applications. The report has aided research efforts in diverse areas and has been used as a resource in the development of several undergraduate and graduate courses on intelligence in emergency management within homeland security, the use of spatial technologies in archaeology, cartography and GIS, and physical hydrology.

Download the full report

 

Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplain
Floods take a heavy toll on society, costing lives, damaging buildings and property, disrupting livelihoods, and sometimes necessitating federal disaster relief, which has risen to record levels in recent years. Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplain is a 2015 consensus study report that reviews current National Flood Insurance Program (NFIP) methods for calculating risk-based premiums for negatively elevated structures, evaluates alternative approaches for calculating risk-based premiums, and discusses the engineering, hydrologic, and property assessment data needed to implement full risk-based premiums. The report was completed in collaboration with the Water Science and Technology Board and sponsored by the Federal Emergency Management Agency (FEMA). The report has been downloaded in 114 countries, and it has helped local, state, federal, and international governments understand how communities are affected by flood insurance rates in order to improve hazard mitigation strategies. The report has been cited in two Government Accountability Office (GAO) reports: National Flood Insurance Program: Continued Progress Needed to Fully Address Prior GAO Recommendations on Rate-Setting Methods (March 2016) and Flood Insurance: Review of FEMA Study and Report on Community-Based Options (August 2016). The report has also been cited in journals such as Water Resources Research, Journal of Risk and Insurance, and Global Environmental Change, and it has informed research that led to publications in the fields of accounting, earth and planetary sciences, economics and econometrics, finance, renewable energy and sustainability, environmental engineering, environmental science, and water science and technology.

Download the full report

 

Developing a 21st Century Global Library for Mathematics Research
Today’s information technologies and machine learning tools provide an opportunity to further organize and enhance the discoverability of the mathematics literature, with the potential to significantly facilitate mathematics research and learning. Developing a 21st Century Global Library for Mathematics Research discusses how information about what the mathematical literature contains can be formalized and made easier to express, encode, and explore. The 2014 consensus study report was sponsored by the Alfred P. Sloan Foundation and has since been downloaded by individuals in 116 countries, all 50 states, and the District of Columbia. The report has been cited in numerous conference proceedings and presentations, such as the 2017 IEEE International Conference on Cyber Conflicts, the 18th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (2017), the 18th and 19th International Conferences on Data Analytics and Management in Data Intensive Domains (2017), the 10th Annual Conference on Intelligent Computer Mathematics (2017), and the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval (2016). The report has informed collection development, library instruction, and literacy requirements; assisted in the development of reference services for postsecondary school faculty and students; and aided in the planning of governmental technical libraries.

Download the full report

 

Training Students to Extract Value from Big Data
The increasing availability and complexity of high-throughput data collection across disciplines requires a workforce that is properly trained and ready to tackle high-needs areas. Training Students to Extract Value from Big Data: Summary of a Workshop provides a synopsis of a 2014 workshop, jointly funded by the National Academy of Sciences and the National Science Foundation, that explored how best to train students to use big data sets. The publication is the second-most downloaded workshop summary from the National Academies, reaching readers in more than 160 countries as well as all 50 states and the District of Columbia. The summary has been used by numerous academic institutions to facilitate the curricular integration of big data into various domains, including agriculture and agronomy, biology, biostatistics, business administration, business intelligence analytics, civil engineering, computer science, data mining, mathematics, public administration, physics, psychology, statistics, and synthetic biology. It has also been applied to research in application areas such as astronomy, education, environmental science, epidemiology, nursing, plant biology, and reservoir geomechanics, in addition to informing the development of big-data-focused institutes and centers within both academia and industry. Furthermore, the summary has informed the development of new undergraduate and master’s programs in data science, computational modeling, and computational statistics, as well as improved understanding of how to engage underrepresented populations in data science. Beyond the classroom, Training Students to Extract Value from Big Data has served as a reference for both internal government evaluation analytics and international policy and has sparked training opportunities across industry to improve employee knowledge and understanding of big data.

Download the workshop summary
Watch videos of the workshop presentations


Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment
Current approaches to assessing fish stock populations rely heavily on human data gathering and analysis, but increased automation offers the potential to improve efficiency, reduce human workload, and develop higher-fidelity measurements. The 2014 workshop on robust methods for the analysis of images and videos for fisheries stock assessment, sponsored by the National Oceanic and Atmospheric Administration, convened experts to discuss image and video analysis techniques for assessing fish stocks. The workshop summary has had an international reach in 112 countries, impacting individuals from diverse academic institutions, nonprofits, and industry, as well as members of local, state, and federal government. The summary has been used to inform teaching on environmental science, fisheries biology, fisheries stock assessment, marine biology, and oceanography, and it has served as reference material for government agencies. The summary has also been used to foster research in mathematical and computational fields, such as computer vision and planar shapes, and to inform acoustic analysis for different fish populations, aquatic animal surveillance, data quality improvement of regional reef fish videos, fisheries management, marine science and ecology, and the well-being of Gulf Coast fishing communities.

Download the workshop summary
Watch videos of the workshop presentations
 

Frontiers in Massive Data Analysis

With the advent of large, complex data sets, new tools, skills, and analytical techniques are needed. Frontiers in Massive Data Analysis, published in July 2013 and supported by the National Security Agency, describes these innovations and explores cross-disciplinary research opportunities for making inferences from massive data. Since its release, the report has been downloaded over 25,000 times in 159 countries, making it the 26th most downloaded report of the National Academies Press as of June 2017. The report continues to be consulted by various government agencies, health care policy analysts, and university educators, students, and researchers across the United States. It is also included in the reading lists for various big data analytics courses and is often cited at conferences and in publications in a variety of fields, including statistics, computer science, data science, information technology, artificial intelligence, biology, healthcare, neuroscience, forecasting, child psychology and psychiatry, computational social science, veterinary medicine, materials science, geoscience, innovation, public policy, water management, political science, chemical engineering, and electrical engineering.

Download the full report

 

Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century

The mathematical sciences play a vital role in science, engineering, medicine, industry, and national security. Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century is written for a general audience and provides an overview of how our societies, industries, and technologies are advancing as a result of continued innovation in the mathematical sciences. Since its release in July 2012, the report has been downloaded over 9,000 times in 150 countries. Fueling Innovation and Discovery has helped both 2- and 4-year academic institutions across the United States set new standards, revise curricular offerings, enhance assessment, and introduce new degree programs in both mathematics and the physical sciences; introduced advanced placement high school students to new mathematical applications; inspired elementary, middle, and high school faculty, as well as local administrators, school boards, and state education departments, to strengthen STEM courses; enhanced grant proposals; suggested new content for both pre-service teacher training and staff professional development programs; and encouraged cross-disciplinary ventures. The report has been used to provide evidence of needed policy and/or curricular reforms in documents such as A Common Vision for the Undergraduate Mathematics Program in 2025 (2015), Indiana’s Math Pathways Recommendations (2015), Mathematical Sciences Driving the UK Economy (2015), Rethinking Postsecondary Mathematics: Final Report of the Ohio Mathematics Steering Committee (2014), and Proceedings of the International Institute of Industrial Engineering Conference (2013). The report has also been consulted by diverse educators, policy makers, academic institutions, and organizations, including the Transforming Post-Secondary Education in Mathematics (TPSE Math) Advisory Group, the Association of Mathematics Teacher Educators, the Detroit Area Council of Teachers of Mathematics, the European Service Network of Mathematics for Industry and Innovation, and participants of the Conference Board of the Mathematical Sciences. Fueling Innovation and Discovery was part of a 2-year National Academies’ study supported by the National Science Foundation.

Download the full report

 

Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification

Advances in computing hardware and algorithms have dramatically improved the ability to simulate complex processes computationally, making it possible to address questions that in the past could be answered only by resource-intensive experimentation. The 2012 consensus study report Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification discusses changes in the education of professionals and in the dissemination of information that could enhance the ability of future verification, validation, and uncertainty quantification (VVUQ) practitioners to improve and properly apply VVUQ methodologies to difficult problems; enhance the ability of VVUQ customers to understand VVUQ results and use them to make informed decisions; and enhance the ability of all VVUQ stakeholders to communicate with each other. The report, sponsored by the U.S. Department of Energy and the Air Force Office of Scientific Research, has been downloaded in 129 countries and all 50 states (as well as the District of Columbia), and it is in the top 2 percent of downloads of all National Academies Press publications. The report has been cited in 11 textbooks, most recently the Handbook of Uncertainty Quantification (2017), Systems Pharmacology and Pharmacodynamics (2016), Fundamentals of Pediatric Drug Dosing (2016), and Data Simplification: Taming Information with Open Source Tools (2016). The report has aided government officials in the examination of international nuclear reactor safety; policy makers in the development of tools for complex modeling in regulatory decision making; and researchers in the fields of agricultural and biological sciences, business and international management, chemistry, computer graphics and computer-aided design, software development, computational earth science, nuclear energy and engineering, biomedical engineering, mechanical engineering, biochemistry and genetics, electronic and magnetic materials science, applied mathematics, medical physiology, and condensed matter physics. The report has been cited in more than 50 journals, including Science; IEEE Transactions on Industrial Informatics; Reliability Engineering & System Safety; Computational Mechanics; and Annual Review of Statistics and Its Application.

Download the full report

 

Strengthening Forensic Science in the United States: A Path Forward

Improved forensic science methods could assist law enforcement officials, enhance homeland security, and reduce the risk of wrongful conviction and exoneration. However, systematic and scientific advancements are needed in a number of forensic science disciplines to ensure the reliability of work, establish enforceable standards, and promote best practices with consistent application. Strengthening Forensic Science in the United States: A Path Forward provides a detailed plan for addressing these needs and gives a full account of how to advance the forensic science discipline, including the upgrading of systems and organizational structures, better training, widespread adoption of uniform and enforceable best practices, and mandatory certification and accreditation programs. Sponsored by the National Institute of Justice and completed in collaboration with the National Academies’ Committee on Science, Technology, and Law, the report has been downloaded more than 22,000 times in more than 150 countries since its release in 2009. The report has fostered intergovernmental collaborations such as the development of the National Commission on Forensic Science (a joint effort of the U.S. Department of Justice and the National Institute of Standards and Technology), as well as the Microscopic Hair Analysis project among the Innocence Project, the National Association of Criminal Defense Lawyers, the U.S. Federal Bureau of Investigation, and the U.S. Department of Justice. The report has also led to improvements in state-level forensic science programs, such as the development of the Texas Forensic and Crime Scene Investigation Consortium, the Connecticut Forensic Science Laboratory, and the District of Columbia Department of Forensic Sciences Consolidated Forensic Laboratory (which combines the Metropolitan Police Department Forensic Investigation Units, the Office of the Chief Medical Examiner, and the Department of Health Public Health Laboratory). The report has been cited in numerous official judicial decisions as well as in a variety of Congressional testimonies. Strengthening Forensic Science in the United States: A Path Forward has also inspired the creation of research centers such as the Forensic Science Center of Excellence, the creation of international government-academia partnerships on computer vision, and the development of multiple graduate degree programs. The report has also been cited in more than 650 papers in over 200 journals and in nearly 50 books.

Download the full report