The goal of the Next Generation Air Transportation System (NextGen) is to transform the U.S. national airspace system through programs and initiatives that could make it possible to shorten routes, navigate better around weather, save time and fuel, reduce delays, and improve capabilities for monitoring and managing aircraft. A Review of the Next Generation Air Transportation System provides an overview of NextGen and examines the technical activities, including human-system design and testing, organizational design, and other safety and human factors aspects of the system, that will be necessary to successfully transition current and planned modernization programs to the future system. This report assesses the technical, cost, and schedule risks of the software development that will be necessary to achieve the expected benefits from a highly automated air traffic management system, as well as the implications for ongoing modernization projects. The recommendations of this report will help the Federal Aviation Administration anticipate and respond to the challenges of implementing NextGen.
BioWatch is an air monitoring system deployed in jurisdictions around the country with the goal of detecting the presence of certain high risk pathogenic microorganisms. It relies on a network of federal and nonfederal collaborative relationships to be successful, and is one part of a larger array of disease surveillance, intelligence-gathering, and biomonitoring activities in support of public safety and health. The assays used in the BioWatch system to detect the presence of pathogens in collected samples rely on the technique of polymerase chain reaction (PCR) to sensitively and specifically amplify target nucleic acid sequences. BioWatch PCR Assays evaluates and provides guidance on appropriate standards for the validation and verification of PCR tests and assays in order to ensure that adequate performance data are available to public health and other key decision makers with a sufficient confidence level to facilitate the public health response to a BioWatch Actionable Result. This report discusses principles of performance standards, reviews information from several existing guidance documents and standards that might be applicable to BioWatch, and discusses assay testing efforts that have occurred or are ongoing. BioWatch PCR Assays provides recommendations on general principles and approaches for a performance standard and validation framework to meet BioWatch's mission. The report also considers how developments in technology, particularly in multiplex PCR and next-generation sequencing, can contribute to the ability of the BioWatch program to meet current and future challenges. This report has been determined to contain information exempt from disclosure under 5 U.S.C. 552(b).
Section 15 of the Federal Advisory Committee Act provides that the National Academies shall make its final report available to the public unless the National Academies determines that the report would disclose matters described in one or more of the exemption provisions under the Freedom of Information Act (FOIA). In such case, the National Academies "shall make public an abbreviated version of the report that does not disclose those matters." This unrestricted, abbreviated version of the report represents, in so far as possible, the committee's findings, recommendations, and other substantive material without disclosing materials described in 5 U.S.C. 552(b).
The Florida Everglades is a large and diverse aquatic ecosystem that has been greatly altered over the past century by an extensive water control infrastructure designed to increase agricultural and urban economic productivity. The Comprehensive Everglades Restoration Plan (CERP), launched in 2000, is a joint effort led by the state and federal government to reverse the decline of the ecosystem. Increasing water storage is a critical component of the restoration, and the CERP included projects that would drill over 330 aquifer storage and recovery (ASR) wells to store up to 1.65 billion gallons per day in porous and permeable units in the aquifer system during wet periods for recovery during seasonal or longer-term dry periods. To address uncertainties regarding regional effects of large-scale ASR implementation in the Everglades, the U.S. Army Corps of Engineers (USACE) and the South Florida Water Management District conducted an 11-year ASR Regional Study, with a focus on the hydrogeology of the Floridan aquifer system, water quality changes during aquifer storage, possible ecological risks posed by recovered water, and the regional capacity for ASR implementation. At the request of the USACE, Review of the Everglades Aquifer Storage and Recovery Regional Study reviews the ASR Regional Study Technical Data Report and assesses progress in reducing uncertainties related to full-scale CERP ASR implementation. This report considers the validity of the data collection and interpretation methods; integration of studies; evaluation of scaling from pilot- to regional-scale application of ASR; and the adequacy and reliability of the study as a basis for future applications of ASR.
New astronomical facilities, such as the under-construction Large Synoptic Survey Telescope and planned 30-meter-class telescopes, and new instrumentation on existing optical and infrared (OIR) telescopes, hold the promise of groundbreaking research and discovery. How can we extract the best science from these and other astronomical facilities in an era of potentially flat federal budgets for both the facilities and the research grants? Optimizing the U.S. Ground-Based Optical and Infrared Astronomy System provides guidance for these new programs that aligns with the scientific priorities and the conclusions and recommendations of two National Research Council (NRC) decadal surveys, New Worlds, New Horizons for Astronomy and Astrophysics and Vision and Voyages for Planetary Science in the Decade 2013-2022, as well as other NRC reports. This report describes a vision for a U.S. OIR System that includes a telescope time exchange designed to enhance science return by broadening access to capabilities for a diverse community, an ongoing planning process to identify and construct next generation capabilities to realize decadal science priorities, and near-term critical coordination, planning, and instrumentation needed to usher in the era of LSST and giant telescopes.
The National Flood Insurance Program (NFIP) is housed within the Federal Emergency Management Agency (FEMA) and offers insurance policies that are marketed and sold through private insurers, but with the risks borne by the U.S. federal government. NFIP's primary goals are to ensure affordable insurance premiums, secure widespread community participation in the program, and earn premium and fee income that covers claims paid and program expenses over time. In July 2012, the U.S. Congress passed the Biggert-Waters Flood Insurance Reform and Modernization Act (Biggert-Waters 2012), designed to move toward an insurance program with NFIP risk-based premiums that better reflected expected losses from floods at insured properties. The act eliminated the discounted rate classes that the NFIP called "pre-FIRM subsidized" and "grandfathered." As Biggert-Waters 2012 went into effect, constituents from multiple communities expressed concerns about the elimination of lower rate classes, arguing that it created a financial burden on policy holders. In response to these concerns, Congress passed the Homeowner Flood Insurance Affordability Act of 2014 (HFIAA 2014). The 2014 legislation changed the process by which pre-FIRM subsidized premiums for primary residences would be removed and reinstated grandfathering. As part of that legislation, FEMA must report back to Congress with a draft affordability framework. Affordability of National Flood Insurance Program Premiums: Report 1 is the first part of a two-part study to provide input as FEMA prepares its draft affordability framework. This report discusses the underlying definitions and methods for an affordability framework and the affordability concept and applications. Affordability of National Flood Insurance Program Premiums gives an overview of the demand for insurance and the history of NFIP premium setting.
The report then describes alternatives for determining when the premium increases resulting from Biggert-Waters 2012 would make flood insurance unaffordable.
The Edwards Aquifer in south-central Texas is the primary source of water for one of the fastest growing cities in the United States, San Antonio, and it also supplies irrigation water to thousands of farmers and livestock operators. It is also the source water for several springs and rivers, including the two largest freshwater springs in Texas, which form the San Marcos and Comal Rivers. The unique habitat afforded by these spring-fed rivers has led to the development of species that are found in no other locations on Earth. Because of the potential for variations in spring flow from both human and natural causes, these species are continuously at risk and have been recognized as endangered under the federal Endangered Species Act (ESA). In an effort to manage the river systems and the aquifer that controls them, the Edwards Aquifer Authority and stakeholders have developed a Habitat Conservation Plan (HCP). The HCP seeks to effectively manage the river-aquifer system to ensure the viability of the ESA-listed species in the face of drought, population growth, and other threats to the aquifer. The National Research Council was asked to assist in this process by reviewing the activities around implementing the HCP. Review of the Edwards Aquifer Habitat Conservation Plan: Report 1 is the first stage of a three-stage study. This report reviews the scientific efforts that are being conducted to help build a better understanding of the river-aquifer system and its relationship to the ESA-listed species. These efforts, which include monitoring and modeling as well as research on key uncertainties in the system, are designed to build a better understanding of how best to manage and protect the system and the endangered species. Thus, the current report is focused specifically on a review of the hydrologic modeling, the ecological modeling, the water quality and biological monitoring, and the Applied Research Program.
The fundamental question that Review of the Edwards Aquifer Habitat Conservation Plan: Report 1 addresses is whether the scientific initiatives appropriately address uncertainties and fill knowledge gaps in the river-aquifer system and the species of concern. It is hoped that the successful completion of these scientific initiatives will ultimately lead the Edwards Aquifer Authority to an improved understanding of how to manage the system and protect these species.
Extremely hazardous substances (EHSs) can be released accidentally as a result of chemical spills, industrial explosions, fires, or accidents involving railroad cars and trucks transporting EHSs. Workers and residents in communities surrounding industrial facilities where these substances are manufactured, used, or stored and in communities along the nation's railways and highways are potentially at risk of being exposed to airborne EHSs during accidental releases or intentional releases by terrorists. Pursuant to the Superfund Amendments and Reauthorization Act of 1986, the U.S. Environmental Protection Agency (EPA) has identified approximately 400 EHSs on the basis of acute lethality data in rodents. Acute Exposure Guideline Levels for Selected Airborne Chemicals, Volume 19 identifies, reviews, and interprets relevant toxicologic and other scientific data for selected AEGL documents for cyanide salts, diketene, methacrylaldehyde, pentaborane, tellurium hexafluoride, and tetrafluoroethylene in order to develop acute exposure guideline levels (AEGLs) for these high-priority, acutely toxic chemicals. AEGLs represent threshold exposure limits (exposure levels below which adverse health effects are not likely to occur) for the general public and are applicable to emergency exposures ranging from 10 minutes (min) to 8 hours (h). Three levels - AEGL-1, AEGL-2, and AEGL-3 - are developed for each of five exposure periods (10 min, 30 min, 1 h, 4 h, and 8 h) and are distinguished by varying degrees of severity of toxic effects. This report will inform planning, response, and prevention in the community, the workplace, transportation, the military, and the remediation of Superfund sites.
Individuals with disabilities, chronic conditions, and functional impairments need a range of services and supports to keep living independently. However, there often is not a strong link between medical care provided in the home and the necessary social services and supports for independent living. Home health agencies and others are rising to the challenges of meeting the needs and demands of these populations to stay at home by exploring alternative models of care and payment approaches, the best use of their workforces, and technologies that can enhance independent living. All of these challenges and opportunities lead to the consideration of how home health care fits into the future health care system overall. On September 30 and October 1, 2014, the Institute of Medicine and the National Research Council convened a public workshop on the future of home health care. The workshop brought together a spectrum of public and private stakeholders and thought leaders to improve understanding of the current role of Medicare home health care in supporting aging in place and in helping high-risk, chronically ill, and disabled Americans receive health care in their communities. Through presentations and discussion, participants explored the evolving role of Medicare home health care in caring for Americans in the future, including how to integrate Medicare home health care into new models for the delivery of care and the future health care marketplace. The workshop also considered the key policy reforms and investments in workforces, technologies, and research needed to leverage the value of home health care to support older Americans, and research priorities that can help clarify the value of home health care. This summary captures important points raised by the individual speakers and workshop participants.
Many measurement systems to monitor the well-being of children and guide services are implemented across the community, state, and national levels in the United States. While great progress has been made in recent years in developing interventions that have been shown to improve the cognitive, affective, and behavioral health of children, many of these tested and effective interventions have yet to be widely implemented. One potential reason for this lag in implementation is a need to further develop and better utilize measures that gauge the success of evidence-based programs as part of a broad effort to prevent negative outcomes and foster children's health and well-being. To address this issue, the Institute of Medicine Forum on Promoting Children's Cognitive, Affective, and Behavioral Health held a workshop in Washington, DC, on November 5-6, 2014. The workshop featured presentations on the use of data linkage and integration to inform research and practice related to children's cognitive, affective, and behavioral health; the use of quality measures to facilitate system change in health care, classroom, and juvenile justice settings; and tools developed to measure implementation of evidence-based prevention programs at scale to support sustainable program delivery, among other topics. Workshop presenters and participants discussed examples of innovative design and utilization of measurement systems, new approaches to build on existing data systems, and new data systems that could support the cognitive, affective, and behavioral health and well-being of children. This report summarizes the presentations and discussions of the event.
A Framework for K-12 Science Education and the Next Generation Science Standards (NGSS) describe a new vision for science learning and teaching that is catalyzing improvements in science classrooms across the United States. Achieving this new vision will require time, resources, and ongoing commitment from state, district, and school leaders, as well as classroom teachers. Successful implementation of the NGSS will ensure that all K-12 students have high-quality opportunities to learn science. Guide to Implementing the Next Generation Science Standards provides guidance to district and school leaders and teachers charged with developing a plan and implementing the NGSS as they change their curriculum, instruction, professional learning, policies, and assessment to align with the new standards. For each of these elements, this report lays out recommendations for action around key issues and cautions about potential pitfalls. Coordinating changes in these aspects of the education system is challenging. As a foundation for that process, Guide to Implementing the Next Generation Science Standards identifies some overarching principles that should guide the planning and implementation process. The new standards present a vision of science and engineering learning designed to bring these subjects alive for all students, emphasizing the satisfaction of pursuing compelling questions and the joy of discovery and invention. Achieving this vision in all science classrooms will be a major undertaking and will require changes to many aspects of science education. Guide to Implementing the Next Generation Science Standards will be a valuable resource for states, districts, and schools charged with planning and implementing changes, to help them achieve the goal of teaching science for the 21st century.
On October 17, 2014, spurred by incidents at U.S. government laboratories that raised serious biosafety concerns, the United States government launched a one-year deliberative process to address the continuing controversy surrounding so-called "gain-of-function" (GOF) research on respiratory pathogens with pandemic potential. The gain-of-function controversy began in late 2011 with the question of whether to publish the results of two experiments involving H5N1 avian influenza and continued to focus on certain research with highly pathogenic avian influenza over the next three years. The heart of the U.S. process is an evaluation of the potential risks and benefits of certain types of GOF experiments with influenza, SARS, and MERS viruses that would inform the development and adoption of a new U.S. Government policy governing the funding and conduct of GOF research. Potential Risks and Benefits of Gain-of-Function Research is the summary of a two-day public symposium on GOF research. Convened in December 2014 by the Institute of Medicine and the National Research Council, the main focus of this event was to discuss principles important for, and key considerations in, the design of risk and benefit assessments of GOF research. Participants examined the underlying scientific and technical questions that are the source of current discussion and debate over GOF research involving pathogens with pandemic potential. This report is a record of the presentations and discussion of the meeting.
The California Department of Pesticide Regulation (DPR) conducts human health risk assessments as part of its mission to ensure the protection of workers and public health in the state. The risk assessments identify potential health hazards posed by pesticides, characterize dose-response relationships, and estimate exposure to characterize potential risks to humans. Over the last decade, advances in methods of scientific and technical analysis have led to improvements in the risk-assessment process that have made it more rigorous, transparent, and useful to risk managers. In light of the advances, the California legislature asked DPR to arrange an independent peer review of the agency's risk-assessment practices to ensure that they are scientifically and technically credible. Review of California's Risk-Assessment Process for Pesticides examines DPR's processes of hazard identification, exposure assessment, dose-response analysis, and risk characterization to determine whether they are consistent with best practices. This report also evaluates the methods used for setting priorities among pesticides for risk assessment and identifies possible options for improving efficiency and productivity. Recommendations of this report will help to make the process more transparent and defensible.
One of the last two sites with chemical munitions and chemical materiel is the Pueblo Chemical Depot in Pueblo, Colorado. The stockpile at this location consists of about 800,000 projectiles and mortars, all of which are filled with the chemical agent mustard. Under the direction of the Assembled Chemical Weapons Alternatives Program (ACWA), the Army has constructed the Pueblo Chemical Agent Destruction Pilot Plant (PCAPP) to destroy these munitions. The primary technology to be used to destroy the mustard agent at PCAPP is hydrolysis, resulting in a secondary waste stream referred to as hydrolysate. PCAPP features a process that will be used to treat the hydrolysate and the thiodiglycol - a breakdown product of mustard - contained within it. The process is a biotreatment technology that uses what are known as immobilized cell bioreactors. After biodegradation, the effluent flows to a brine reduction system, producing a solidified filter cake that is intended to be sent offsite to a permitted hazardous waste disposal facility. Water recovered from the brine reduction system is intended to be recycled back through the plant, thereby reducing the amount of water that is withdrawn from groundwater. Although biotreatment of toxic chemicals, brine reduction, and water recovery are established technologies, never before have these technologies been combined to treat mustard hydrolysate. At the request of the U.S. Army, Review Criteria for Successful Treatment of Hydrolysate at the Pueblo Chemical Agent Destruction Pilot Plant reviews the criteria for successfully treating the hydrolysate.
This report provides information on the composition of the hydrolysate and describes the PCAPP processes for treating it; discusses stakeholder concerns; reviews regulatory considerations at the federal, state, and local levels; discusses Department of Transportation regulations and identifies risks associated with the offsite shipment of hydrolysate; establishes criteria for successfully treating the hydrolysate and identifies systemization data that should factor into the criteria and decision process for offsite transport and disposal of the hydrolysate; and discusses failure risks and contingency options as well as the downstream impacts of a decision to ship hydrolysate offsite.
Simulators currently provide an alternative to aircraft when it comes to training requirements, both for the military and for commercial airlines. For the U.S. Air Force, in particular, simulation for training offers a cost-effective way, and in many instances a safer way in comparison with live flying, to replicate real-world missions. Current technical issues related to simulation for training include simulation fidelity and multi-level security, among others, which will need to be addressed in order for the Air Force to take full advantage of this technology. The workshop held in November 2014 examined the current status of simulation training, alternative uses, current and future technologies, and how the combination of simulation and live training can improve aircrew training. The scope of the workshop focused on technologies and practices that could be applicable to high-end aircraft simulations.
The National Research Council's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory, focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences. This report discusses the biennial assessment process used by ARLTAB and its five panels; provides detailed assessments of each of the ARL core technical competency areas reviewed during the 2013-2014 period; and presents findings and recommendations common across multiple competency areas.
Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges is an independent assessment regarding the transition of the National Nuclear Security Administration (NNSA) laboratories - Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories - to multiagency, federally funded research and development centers with direct sustainment and sponsorship by multiple national security agencies. This report makes recommendations for the governance of NNSA laboratories to better align with the evolving national security landscape and the laboratories' increasing engagement with the other national security agencies, while simultaneously encouraging the best technical solutions to national problems from the entire range of national security establishments. According to this report, the Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers. The NNSA laboratories will remain a critically important resource to meet U.S. national security needs for many decades to come. The recommendations of Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges will improve the governance of the laboratories and strengthen their strategic relationship with the non-DOE national security agencies.
In January 2014, the Board on Children, Youth, and Families of the Institute of Medicine and the National Research Council, in collaboration with the IOM Board on Global Health, launched the Forum on Investing in Young Children Globally. At this meeting, the participants agreed to focus on creating and sustaining, over 3 years, an evidence-driven community of stakeholders that aims to explore existing, new, and innovative science and research from around the world and translate this evidence into sound and strategic investments in policies and practices that will make a difference in the lives of children and their caregivers. Financing Investments in Young Children Globally is the summary of a workshop hosted by the Forum on Investing in Young Children Globally in August 2014. This workshop, on financing investments for young children, brought together stakeholders from such disciplines as social protection, nutrition, education, health, finance, economics, and law and included practitioners, advocates, researchers, and policy makers. Presentations and discussions identified some of the current issues in financing investments across health, education, nutrition, and social protection that aim to improve children's developmental potential. This report explores issues across three broad domains of financing: (1) costs of programs for young children; (2) sources of funding, including public and private investments; and (3) allocation of these investments, including cash transfers, microcredit programs, block grants, and government restructuring.
The mission of the Engineering Laboratory of the National Institute of Standards and Technology (NIST) is to promote U.S. innovation and industrial competitiveness through measurement science and standards for technology-intensive manufacturing, construction, and cyberphysical systems in ways that enhance economic prosperity and improve the quality of life. To support this mission, the Engineering Laboratory has developed thrusts in smart manufacturing, construction, and cyberphysical systems; in sustainable and energy-efficient manufacturing materials and infrastructure; and in disaster-resilient buildings, infrastructure, and communities. The technical work of the Engineering Laboratory is performed in five divisions: Intelligent Systems; Materials and Structural Systems; Energy and Environment; Systems Integration; and Fire Research; and two offices: Applied Economics Office and Smart Grid Program Office. An Assessment of the National Institute of Standards and Technology Engineering Laboratory Fiscal Year 2014 assesses the scientific and technical work performed by the NIST Engineering Laboratory. This report evaluates the organization's technical programs, portfolio of scientific expertise within the organization, adequacy of the organization's facilities, equipment, and human resources, and the effectiveness by which the organization disseminates its program outputs.
The National Institute of Standards and Technology's (NIST's) Material Measurement Laboratory (MML) is our nation's reference laboratory for measurements in the chemical, biological, and materials sciences and engineering. Staff of the MML develop state-of-the-art measurement techniques and conduct fundamental research related to measuring the composition, structure, and properties of substances. Tools that include reference materials, data, and measurement services are developed to support industries that range from transportation to biotechnology and to address problems such as climate change, environmental sciences, renewable energy, health care, infrastructure, food safety and nutrition, and forensics. This report assesses the scientific and technical work performed by NIST's Material Measurement Laboratory. In particular, the report assesses the organization's technical programs, the portfolio of scientific expertise within the organization, the adequacy of the organization's facilities, equipment, and human resources, and the effectiveness by which the organization disseminates its program outputs.
The 2012 National Research Council report Disaster Resilience: A National Imperative highlighted the challenges of increasing national resilience in the United States. One finding of the report was that "without numerical means of assessing resilience, it would be impossible to identify the priority needs for improvement, to monitor changes, to show that resilience had improved, or to compare the benefits of increasing resilience with the associated costs." Although measuring resilience is a challenge, metrics and indicators to evaluate progress, and the data necessary to establish the metric, are critical for helping communities to clarify and formalize what the concept of resilience means for them, and to support efforts to develop and prioritize resilience investments. One of the recommendations from the 2012 report stated that government entities at federal, state, and local levels and professional organizations should partner to help develop a framework for communities to adapt to their circumstances and begin to track their progress toward increasing resilience. To build upon this recommendation and begin to help communities formulate such a framework, the Resilient America Roundtable of the National Academies convened the workshop Measures of Community Resilience: From Lessons Learned to Lessons Applied on September 5, 2014, in Washington, D.C. The workshop's overarching objective was to begin to develop a framework of measures and indicators that could support community efforts to increase their resilience. The framework will be further developed through feedback and testing in pilot and other partner communities that are working with the Resilient America Roundtable. This report is a summary of the one-day workshop, which consisted of a keynote address and two panel sessions in the morning and afternoon breakout sessions that began the discussion on how to develop a framework of resilience measures.
Every year, the U.S. Army must select from an applicant pool in the hundreds of thousands to meet annual enlistment targets, currently numbering in the tens of thousands of new soldiers. A critical component of the selection process for enlisted service members is the formal assessments administered to applicants to determine their performance potential. Attrition for the U.S. military is hugely expensive. Every recruit that does not make it through basic training or beyond a first enlistment costs hundreds of thousands of dollars. Academic and other professional settings suffer similar losses when the wrong individuals are accepted into the wrong schools and programs or jobs and companies. Picking the right people from the start is becoming increasingly important in today's economy and in response to the growing numbers of applicants. Beyond cognitive tests of ability, what other attributes should selectors be considering to know whether an individual has the talent and the capability to perform as well as the mental and psychological drive to succeed? Measuring Human Capabilities: An Agenda for Basic Research on the Assessment of Individual and Group Performance Potential for Military Accession examines promising emerging theoretical, technological, and statistical advances that could provide scientifically valid new approaches and measurement capabilities to assess human capability. This report considers the basic research necessary to maximize the efficiency, accuracy, and effective use of human capability measures in the military's selection and initial occupational assignment process. The research recommendations of Measuring Human Capabilities will identify ways to supplement the Army's enlisted soldier accession system with additional predictors of individual and collective performance. Although the primary audience for this report is the U.S. 
military, this book will be of interest to researchers of psychometrics, personnel selection and testing, team dynamics, cognitive ability, and measurement methods and technologies. Professionals interested in the foundational science behind academic testing, job selection, and human resources management will also find this report of interest.
Hurricane- and coastal-storm-related losses have increased substantially during the past century, largely due to increases in population and development in the most susceptible coastal areas. Climate change poses additional threats to coastal communities from sea level rise and possible increases in strength of the largest hurricanes. Several large cities in the United States have extensive assets at risk to coastal storms, along with countless smaller cities and developed areas. The devastation from Superstorm Sandy has heightened the nation's awareness of these vulnerabilities. What can we do to better prepare for and respond to the increasing risks of loss? Reducing Coastal Risk on the East and Gulf Coasts reviews the coastal risk-reduction strategies and levels of protection that have been used along the United States East and Gulf Coasts to reduce the impacts of coastal flooding associated with storm surges. This report evaluates their effectiveness in terms of economic return, protection of life safety, and minimization of environmental effects. According to this report, the vast majority of the funding for coastal risk-related issues is provided only after a disaster occurs. This report calls for the development of a national vision for coastal risk management that includes a long-term view, regional solutions, and recognition of the full array of economic, social, environmental, and life-safety benefits that come from risk reduction efforts. To support this vision, Reducing Coastal Risk states that a national coastal risk assessment is needed to identify those areas with the greatest risks that are high priorities for risk reduction efforts.
The report discusses the implications of expanding the extent and levels of coastal storm surge protection in terms of operation and maintenance costs and the availability of resources. Reducing Coastal Risk recommends that benefit-cost analysis, constrained by acceptable risk criteria and other important environmental and social factors, be used as a framework for evaluating national investments in coastal risk reduction. The recommendations of this report will assist engineers, planners, and policy makers at national, regional, state, and local levels to move from a nation that is primarily reactive to coastal disasters to one that invests wisely in coastal risk reduction and builds resilience among coastal communities.
"The National Center for Science and Engineering Statistics (NCSES) of the National Science Foundation is responsible for national reporting of the research and development (R&D) activities that occur in all sectors of the United States economy. For most sectors, including the business and higher education sectors, NCSES collects data on these activities on a regular basis. However, data on R&D within the nonprofit sector have not been collected in 18 years, a period of dynamic and rapid growth for the sector. NCSES decided to design and implement a new survey of nonprofits, and commissioned this workshop to provide a forum to discuss conceptual and design issues and methods. Measuring Research and Development Expenditures in the U.S. Nonprofit Sector: Conceptual and Design Issues summarizes the presentations and discussion of the workshop. This report identifies concepts and issues for the design of a survey of R&D expenditures made by nonprofit organizations, considering the goals, content, statistical methodology, data quality, and data products associated with this data collection. The report also considers the broader usefulness of the data for understanding the nature of the nonprofit sector and its R&D activities. Measuring Research and Development Expenditures in the U.S. Nonprofit Sector will help readers understand the role of the nonprofit sector, given its enormous size and scope, as well as its contribution to identifying new forms of R&D beyond production processes and new technology."--
"Measuring the Risks and Causes of Premature Death is the summary of two workshops conducted by the Committee on Population of the National Research Council at the National Academies to address the data sources, science, and future research needs to understand the causes of premature mortality in the United States. The workshops reviewed previous work in the field in light of new data generated as part of the work of the NRC Panel on Understanding Divergent Trends in Longevity in High-Income Countries (NRC, 2011) and the NRC/IOM Panel on Understanding Cross-National Differences Among High-Income Countries (NRC/IOM, 2013). The workshop presentations considered the state of the science of measuring the determinants of the causes of premature death, assessed the availability and quality of data sources, and charted future courses of action to improve the understanding of the causes of premature death. Presenters shared their approaches to and results of measuring premature mortality and specific risk factors, with a particular focus on those factors most amenable to improvement through public health policy. This report summarizes the presentations and discussion of both workshops."--
"The American Community Survey (ACS) was conceptualized as a replacement for the census long form, which collected detailed population and housing data from a sample of the U.S. population, once a decade, as part of the decennial census operations. The long form was traditionally the main source of socio-economic information for areas below the national level. The data provided for small areas, such as counties, municipalities, and neighborhoods, are what made the long form unique, and what makes the ACS unique today. Since the successful transition from the decennial long form in 2005, the ACS has become an invaluable resource for many stakeholders, particularly for meeting national and state level data needs. However, due to inadequate sample sizes, a major challenge for the survey is producing reliable estimates for smaller geographic areas, which is a concern because of the unique role fulfilled by the long form, and now the ACS, of providing data with a geographic granularity that no other federal survey could provide. In addition to the primary challenge associated with the reliability of the estimates, this is also a good time to assess other aspects of the survey in order to identify opportunities for refinement based on the experience of the first few years. Realizing the Potential of the American Community Survey provides input on ways of improving the ACS, focusing on two priority areas: identifying methods that could improve the quality of the data available for small areas, and suggesting changes that would increase the survey's efficiency in responding to new data needs. This report considers changes that the ACS office should consider over the course of the next few years in order to further improve the ACS data.
The recommendations of Realizing the Potential of the American Community Survey will help the Census Bureau improve performance in several areas, which may ultimately lead to improved data products as the survey enters its next decade."--Publisher's description.
Over the past few decades there have been major successes in creating evidence-based interventions to improve the cognitive, affective, and behavioral health of children. Many of these interventions have been put into practice at the local, state, or national level. To reap what has been learned from such implementation, and to explore how new legislation and policies as well as advances in technology and analytical methods can help drive future implementation, the Institute of Medicine-National Research Council Forum on Promoting Children's Cognitive, Affective, and Behavioral Health held the workshop "Harvesting the Scientific Investment in Prevention Science to Promote Children's Cognitive, Affective, and Behavioral Health" in Washington, DC, on June 16 and 17, 2014. The workshop featured panel discussions of system-level levers and blockages to the broad implementation of interventions with fidelity, focusing on policy, finance, and method science; the role of scientific norms, implementation strategies, and practices in care quality and outcomes at the national, state, and local levels; and new methodological directions. The workshop also featured keynote presentations on the role of economics and policy in scaling interventions for children's behavioral health, and making better use of evidence to design informed and more efficient children's mental health systems. Harvesting the Scientific Investment in Prevention Science to Promote Children's Cognitive, Affective, and Behavioral Health summarizes the presentations and discussion of the workshop.
"Building Infrastructure for International Collaborative Research in the Social and Behavioral Sciences is the summary of a workshop convened by the National Research Council's Committee on International Collaborations in Social and Behavioral Sciences in September 2013 to identify ways to reduce impediments and to increase access to cross-national research collaborations among a broad range of American scholars in the behavioral and social sciences (and education), especially early career scholars. Over the course of two and a half days, individuals from universities, federal agencies, professional organizations, and other parties with interests in international collaboration in the behavioral and social sciences and education made presentations and participated in discussions. They came from diverse fields including cognitive psychology, developmental psychology, comparative education, educational anthropology, sociology, organizational psychology, the health sciences, international development studies, higher education administration, and international exchange."--Publisher's description.
"Since the early 1960s, the U.S. strategic nuclear posture has been composed of a triad of nuclear-certified long-range bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles. Since the early 1970s, U.S. nuclear forces have been subject to strategic arms control agreements. The large numbers and diversified nature of the U.S. nonstrategic (tactical) nuclear forces, which cannot be ignored as part of the overall nuclear deterrent, have decreased substantially since the Cold War. While there is domestic consensus today on the need to maintain an effective deterrent, there is no consensus on precisely what that requires, especially in a changing geopolitical environment and with continued reductions in nuclear arms. This places a premium on having the best possible analytic tools, methods, and approaches for understanding how nuclear deterrence and assurance work, how they might fail, and how failure can be averted by U.S. nuclear forces. U.S. Air Force Strategic Deterrence Analytic Capabilities identifies the broad analytic issues and factors that must be considered in seeking nuclear deterrence of adversaries and assurance of allies in the 21st century. This report describes and assesses tools, methods - including behavioral science-based methods - and approaches for improving the understanding of how nuclear deterrence and assurance work or may fail in the 21st century and the extent to which such failures might be averted or mitigated by the proper choice of nuclear systems, technological capabilities, postures, and concepts of operation of American nuclear forces. The report recommends criteria and a framework for validating the tools, methods, and approaches and for identifying those most promising for Air Force usage."--Publisher's description.
The Science of Responding to a Nuclear Reactor Accident summarizes the presentations and discussions of the May 2014 Gilbert W. Beebe Symposium titled "The Science and Response to a Nuclear Reactor Accident". The symposium, dedicated in honor of the distinguished National Cancer Institute radiation epidemiologist who died in 2003, was co-hosted by the Nuclear and Radiation Studies Board of the National Academy of Sciences and the National Cancer Institute. The symposium topic was prompted by the March 2011 accident at the Fukushima Daiichi nuclear power plant that was initiated by the 9.0-magnitude earthquake and tsunami off the northeast coast of Japan. This was the fourth major nuclear accident that has occurred since the beginning of the nuclear age some 60 years ago. The other three major accidents are the 1957 Windscale accident in the United Kingdom, caused by a fire in the reactor; the 1979 Three Mile Island accident in the United States, caused by mechanical and human errors; and the 1986 Chernobyl accident in the former Soviet Union, caused by a series of human errors during the conduct of a reactor experiment. The rarity of nuclear accidents and the limited experience assembled over the decades heighten the importance of learning from the past. The 2014 symposium promoted discussions among federal, state, academic, research institute, and news media representatives on current scientific knowledge and response plans for nuclear reactor accidents. The Beebe symposium explored how experiences from past nuclear plant accidents can be used to mitigate the consequences of future accidents, if they occur.
The Science of Responding to a Nuclear Reactor Accident addresses off-site emergency response and long-term management of the accident consequences; estimating radiation exposures of affected populations; health effects and population monitoring; other radiological consequences; and communication among plant officials, government officials, and the public and the role of the media.