ML20080N774

Uncertainty Analysis Technology
Issue date: 03/23/2020
From: Nathan Siu
Office of Nuclear Regulatory Research
To: N. Siu



Technology for the Treatment of Uncertainties:

History, Status, Commentary and Challenges

Nathan Siu, Senior Technical Adviser for PRA Analysis, U.S. Nuclear Regulatory Commission

Expanded version of a presentation originally developed for the CRIEPI/NRRC and OECD/NEA Workshop on the Proper Treatment of Uncertainties in Reactor Safety Assessment, March 2020

2 Foreword

On December 19, 2019, the Nuclear Risk Research Center (NRRC) of the Japan Central Research Institute of Electric Power Industry (CRIEPI) and the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) invited the author to participate in a workshop on the improvement and enhancement of risk-informed decision making (RIDM) processes in reactor safety assessment. The workshop, titled "A Workshop on the Proper Treatment of Uncertainties in Reactor Safety Assessment," was to be held on May 26-27, 2020 in Tokyo, Japan. At the request of the workshop organizers, the author's talk was to be titled "Technology for the Treatment of Uncertainties: History, Status, and Some Challenges." On March 12, due to travel restrictions arising from the COVID-19 pandemic, the author was directed to withdraw from the workshop. The following slides are an expanded version of the talk the author was planning to present.

3 Outline

  • Framework for discussion

- Parameter Uncertainties

- Model Uncertainties

- Completeness Uncertainties

- Communication

  • Current state of practice
  • History
  • Commentary and challenges

tech·nol·o·gy, n. the sum of techniques, skills, methods, and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. [Wikipedia]

In this talk:

technology = {methods, models, computational tools, guidance, data}

4 DISCUSSION FRAMEWORK What are we talking about?

5 Context for Treatment of Uncertainties: Risk-Informed Decisionmaking (RIDM)

P{X|C,H}

P: subjective probability; X: proposition; C: conditions; H: background knowledge

Discussion Framework. Adapted from NUREG-2150.

6 Parameter, Model, and Completeness Uncertainty: A Practical Categorization

M (Model of the World): scope, structure; θi: parameters

[Diagram labels: Universe; Known Unknowns; Unknown Unknowns]

mod·el, n. a representation of reality created with a specific objective in mind.

A. Mosleh, N. Siu, C. Smidts, and C. Lui, Model Uncertainty: Its Characterization and Quantification, Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also NUREG/CP-0138, 1994)

PRA models for NPPs: typically an assemblage of sub-models with parameters; implicitly include issues considered but not explicitly quantified.

Discussion Framework

7 Parameter, Model, and Completeness Uncertainty: A Practical Categorization (continued)

PRA models for NPPs: distinctions are not necessarily crisp. Regardless of allocation to categories, all need to be considered in the characterization of uncertainties.

Discussion Framework

8 Parameter Uncertainty: An Example

  • Parameter of interest: frequency of flooding (λ)
  • Prior state-of-knowledge: minimal
  • Evidence: 10 events over 1877-2017 (140 years)
  • Posterior state-of-knowledge:

Date       | Flood Height (ft)
3/19/1936  | 36.5
6/1/1889   | 34.8
10/16/1942 | 33.8
10/1/1896  | 33.0
11/6/1985  | 30.1
9/8/1996   | 29.8
1/21/1996  | 29.4
11/25/1877 | 29.2
4/27/1937  | 29.0
6/23/1972  | 27.7

[Plot: prior and posterior probability density vs. flood frequency (/yr); event timeline 1880-2000; return period = 12 yr]

λ05 = 0.040/yr, λ50 = 0.069/yr, λ95 = 0.11/yr, mean = 0.071/yr

Discussion Framework
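As a sketch of the Bayesian update behind these numbers: with a Poisson likelihood for 10 events in 140 years and an (assumed) noninformative prior, the posterior is approximately Gamma(shape = 10, rate = 140). The prior choice is an assumption on my part; the slide says only that the prior state-of-knowledge was minimal.

```python
import math

# Slide data: 10 flood events observed over 140 years (1877-2017).
events, years = 10, 140.0

# Assumed posterior: Gamma(shape=10, rate=140) for the flood frequency (/yr).
shape, rate = float(events), years

def posterior_pdf(lam):
    """Gamma(shape, rate) probability density at frequency lam."""
    return rate**shape * lam**(shape - 1) * math.exp(-rate * lam) / math.gamma(shape)

mean = shape / rate  # analytic posterior mean

def percentile(p, hi=1.0, n=200_000):
    """Invert the posterior CDF by trapezoidal integration on [0, hi]."""
    dx = hi / n
    cdf = 0.0
    prev = posterior_pdf(1e-12)
    for i in range(1, n + 1):
        lam = i * dx
        cur = posterior_pdf(lam)
        cdf += 0.5 * (prev + cur) * dx
        prev = cur
        if cdf >= p:
            return lam
    return hi

print(f"mean = {mean:.3f}/yr")
print(f"5th  = {percentile(0.05):.3f}/yr")
print(f"50th = {percentile(0.50):.3f}/yr")
print(f"95th = {percentile(0.95):.3f}/yr")
```

These come out near 0.039, 0.069, and 0.11/yr, matching the quoted values to within rounding and the exact prior choice.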

9 Hurricane Andrew: 8/22/1992, 1200 UTC (about 2 days before FL landfall)

Plot adapted from University of Wisconsin-Milwaukee (https://web.uwm.edu/hurricane-models/models/archive/)

Model Uncertainty:

Hurricane Example Discussion Framework

10 Completeness Uncertainty: Multiple Hurricane Example (A Known Unknown)

Irma, Jose, Katia; Turkey Point. (Image: https://en.wikipedia.org/wiki/Hurricane_Irma#/media/File:Irma,_Jose_and_Katia_2017-09-07.png)

Discussion Framework

11 Risk Communication (Internal)

Other considerations: current regulations, safety margins, defense-in-depth, monitoring (quantitative and qualitative). Adapted from NUREG-2150.

Discussion Framework

12 CURRENT STATE-OF-PRACTICE What do people do now?

13 State-of-Practice: Parameter Uncertainties

  • Treatment involves:

- Estimation (including expert elicitation)

- Propagation

  • Straightforward mathematics and mechanics
  • Some practical challenges

Current State of Practice
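As a minimal sketch of estimation-plus-propagation, the Monte Carlo fragment below pushes lognormal parameter uncertainties through a hypothetical two-train cut set; the initiating-event frequency, failure probabilities, and error factors are invented for illustration, not taken from any PRA.

```python
import math
import random

random.seed(42)

# Each input is lognormal, specified by a median and an error factor EF (= 95th/50th).
def lognormal_sampler(median, ef):
    sigma = math.log(ef) / 1.645  # 1.645 = z-value of the 95th percentile
    return lambda: random.lognormvariate(math.log(median), sigma)

ie = lognormal_sampler(1e-2, 3.0)  # hypothetical initiating-event frequency (/yr)
a  = lognormal_sampler(1e-2, 5.0)  # hypothetical train A failure probability
b  = lognormal_sampler(1e-2, 5.0)  # hypothetical train B failure probability

# Propagate: CDF contribution = IE frequency x P(A fails) x P(B fails).
samples = sorted(ie() * a() * b() for _ in range(100_000))
n = len(samples)
mean, median = sum(samples) / n, samples[n // 2]

print(f"mean = {mean:.2e}/yr (median = {median:.2e}/yr)")
print(f"5th/95th = {samples[n // 20]:.2e}, {samples[19 * n // 20]:.2e}")
```

Note that the mean lands well above the median: the product of skewed distributions is itself skewed, which is why mean-value results alone can mislead.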

14 State-of-Practice:

Model Uncertainties

  • Important to acknowledge and treat (in context of decision)
  • Multiple approaches

- Consensus model

- Sensitivity analysis

- Weighted alternatives (e.g., SSHAC)

- Output uncertainties

Current State of Practice

Hurricane Andrew 8/22/1992, 1200 UTC. Adapted from University of Wisconsin-Milwaukee (https://web.uwm.edu/hurricane-models/models/archive/)
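The "weighted alternatives" entry above can be sketched as a probability mixture: each candidate sub-model gets a subjective weight, and each output sample is drawn from a model chosen with that weight. The two models, their weights, and their distributions below are hypothetical.

```python
import math
import random

random.seed(0)

# (weight, sampler) pairs for two hypothetical alternative sub-models.
models = [
    (0.7, lambda: random.lognormvariate(math.log(1e-4), 0.5)),
    (0.3, lambda: random.lognormvariate(math.log(5e-4), 0.8)),
]

def sample_mixture():
    """Pick a model according to its weight, then draw from it."""
    u, acc = random.random(), 0.0
    for weight, draw in models:
        acc += weight
        if u <= acc:
            return draw()
    return models[-1][1]()  # guard against floating-point round-off

out = sorted(sample_mixture() for _ in range(50_000))
print(f"mixture mean = {sum(out) / len(out):.2e}")
print(f"mixture 95th = {out[int(0.95 * len(out))]:.2e}")
```

Whether such a weighted average has a well-defined subjective-probability meaning is a question the commentary section returns to.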

Adapted from: V.M. Andersen, Seismic Probabilistic Risk Assessment Implementation Guide, EPRI 3002000709, Electric Power Research Institute, Palo Alto, CA, December 2013; M.H. Salley and A. Lindeman, Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications, NUREG-1824 Supplement 1/EPRI 3002002182, November 2016.

15 State-of-Practice:

Completeness Uncertainties

  • Potential concerns

- Known gaps (missing scope)

  • Scenario categories
  • Contributors within categories

- Unknown gaps

- Heuristics/biases

  • Excessive amplification (fear of the dark)
  • Excessive discounting (out of sight, out of mind)
  • Treatment

- Analysis guidance

- Additional analysis/R&D

- Risk-informed decisionmaking

NUREG-1855 Rev. 1 (2017) options:

- Progressive analysis (screening, bounding, conservative, detailed)

- Change scope of risk-informed application

RG 1.174 Rev. 3 (2019)

Current State of Practice

16 State-of-Practice: Internal Risk Communication

  • Often implicit (focus on mean values)
  • Various graphic displays
  • Includes story as well as numbers

Current State of Practice

Documents and Presentations (Flatland) | Interactive Discussion (Storytelling)

Example risk matrix:

Severity Class | 5 (10^-5/yr) | 4 (10^-4/yr) | 3 (10^-3/yr) | 2 (10^-2/yr) | 1 (10^-1/yr)
A | Marginal  | Undesirable | Undesirable | Critical    | Critical
B | Marginal  | Marginal    | Undesirable | Undesirable | Critical
C | No Action | Marginal    | Marginal    | Undesirable | Undesirable
D | No Action | No Action   | Marginal    | Marginal    | Undesirable
E | No Action | No Action   | No Action   | Marginal    | Marginal
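The risk matrix above can be encoded as a simple lookup; the entries are from the slide, while the encoding and function name are mine.

```python
# Severity class -> actions for likelihood classes 5, 4, 3, 2, 1
# (class 5 = ~1E-5/yr, least frequent; class 1 = ~1E-1/yr, most frequent).
MATRIX = {
    "A": ["Marginal", "Undesirable", "Undesirable", "Critical", "Critical"],
    "B": ["Marginal", "Marginal", "Undesirable", "Undesirable", "Critical"],
    "C": ["No Action", "Marginal", "Marginal", "Undesirable", "Undesirable"],
    "D": ["No Action", "No Action", "Marginal", "Marginal", "Undesirable"],
    "E": ["No Action", "No Action", "No Action", "Marginal", "Marginal"],
}

def action(severity: str, likelihood_class: int) -> str:
    """Return the matrix entry for a severity class A-E and likelihood class 5-1."""
    return MATRIX[severity][5 - likelihood_class]

print(action("A", 1))  # most severe, most likely -> Critical
print(action("E", 5))  # least severe, least likely -> No Action
```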

17 A BRIEF HISTORY How did we get here?

18 A Series of Challenges and Responses

Timeline (1940-2020): Hanford to WASH-1400; Early PRAs; Expansion Across Industry; Modern Applications

History

19 From Hanford to WASH-1400

Technical challenges: 1) quantifying accident probability; 2) means to communicate risk ("not in the generation of the ACRS members present").

Timeline (1950-1980; organizations: Hanford, AEC/NRC, UKAEA): system reliability studies; Windscale; "Credible Accident"; recommendation for accident chain analysis; SGHWR analysis; WASH-740; Farmer Curve; WASH-1400 estimates - OpE (pessimistic), decomposition (optimistic); TMI-2.

For more information: T.R. Wellock, A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident, Technology and Culture, 58, No. 3, July 2017, pp. 678-721.

History

20 WASH-1400 Uncertainties (Level 1)

WASH-1400: "it is reasonable to believe that the core melt probability of about 5x10-5 per reactor-year predicted by this study should not be significantly larger and would almost certainly not exceed the value of 3x10-4 which has been estimated as the upper bound for core melt probability."

Risk Assessment Review Group (NUREG/CR-0400):

"We are unable to define whether the overall probability of a core melt given in WASH-1400 is high or low, but we are certain that the error bands are understated. We cannot say by how much."

[Plot: estimated* WASH-1400 CDF uncertainties (/ry), 1E-5 to 1E-3, for Surry and Peach Bottom, showing 5th, 50th, and 95th percentiles and mean]

  • Based on data from Tables V 3-14 (PWR) and 3-16 (BWR) of Appendix V, assuming distributions are lognormal; median values are somewhat higher than reported in Section 7.3.1 of the Main Report.

History

21 Some Early Developments and PRAs

Challenges: 1) filling known gaps (completeness uncertainty); 2) clarifying meaning: models and results.

Timeline (1975-1985; marker events: TMI-2, Chernobyl): Clinch River (LMFBR); Limerick; Millstone; Seabrook (full scope); Fleming (β-factor); Zion (full scope); TMI-1 (full scope); Oconee (full scope); Apostolakis (subjective probability); Forsmark; Koeberg (~WASH-1400); Super Phénix (FBR DHR); AIPA (HTGR); Kaplan/Garrick (risk); EC/JRC Benchmarks (systems, CCF, HRA); RSSMAP/IREP; Sizewell (+DI&C); Indian Point (full scope); Oyster Creek (+seismic); Biblis (+aircraft); NUREG/CR-2300. (Legend: USDOE, NRC, US Industry, International, Other Notable)

History

22 Sample Level 1 Results Display History

23 Sample Results - Sub-Model Uncertainty Effect

Effects of fire model (COMPBRN) uncertainty on fire growth time. N. Siu, "Modeling Issues in Nuclear Plant Fire Risk Analysis," in EPRI Workshop on Fire Protection in Nuclear Power Plants, EPRI NP-6476, J.-P. Sursock, ed., August 1989, pp. 14-1 through 14-16.

History

24 Sample Results - Model Uncertainty (User Effect)

Damage State Frequency (/yr), Review Damage State Frequency (/yr), Original 10-10 10-8 10-6 10-4 10-10 10-8 10-6 10-4 Early core melt, containment cooling Early core melt, no containment cooling Steam generator tube rupture Containment bypass Direct containment failure Late core melt, containment cooling Late core melt, no containment cooling 1.E-11 1.E-10 1.E-09 1.E-08 1.E-07 1.E-06 1.E-05 1.E-04 1.E-03 1.E-11 1.E-10 1.E-09 1.E-08 1.E-07 1.E-06 1.E-05 1.E-04 1.E-03 Original Review Internal Events 1.E-11 1.E-10 1.E-09 1.E-08 1.E-07 1.E-06 1.E-05 1.E-04 1.E-03 1.E-11 1.E-10 1.E-09 1.E-08 1.E-07 1.E-06 1.E-05 1.E-04 1.E-03 Original Review External Events Data source: G.J. Kolb, et al., Review and Evaluation of the Indian Point Probabilistic Safety Study, NUREG/CR-2934, December 1982.

(ML091540534)

History

25 Expansion Across Industry (US)

Technical challenges: 1) characterizing the fleet (variability); 2) developing confidence for mainstreaming RIDM.

Timeline (1982-2000; marker events: Chernobyl, 9/11): Severe Accident Policy Statement; Safety Goal Policy Statement; PRA Policy Statement; GL 88-20; GL 88-20 Supplement 4; IPEs; IPEEEs; NUREG-1150 (draft, final); NUREG-1560; NUREG-1742; ASP Plant Class Models; SPAR Models. (NRC, US Industry)

History

26 NUREG-1150 Estimated* Uncertainties (Level 1)

[Plot: hazard-specific and total CDF distributions, with model uncertainty indicated]

  • Notes:

1) NUREG-1150 does not aggregate the hazard-specific results. The totals shown are rough estimates assuming that the NUREG-1150 distributions are lognormal.

2) The WASH-1400 distributions are based on data from Tables V 3-14 (PWR) and 3-16 (BWR) of Appendix V, assuming that the distributions are lognormal. The median values are somewhat higher than reported in Section 7.3.1 of the Main Report.

History
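The rough aggregation described in Note 1 can be sketched by Monte Carlo: sample each hazard's (assumed lognormal) distribution and sum. The medians and error factors below are illustrative placeholders, not the NUREG-1150 values.

```python
import math
import random

random.seed(1)

# Hazard-specific CDF distributions, each assumed lognormal and specified by a
# (median /ry, error factor) pair. Illustrative values only.
hazards = {
    "internal events": (2e-5, 4.0),
    "fire":            (1e-5, 8.0),
    "seismic":         (5e-6, 10.0),
}

def draw(median, ef):
    """Sample a lognormal given its median and error factor (95th/50th)."""
    return random.lognormvariate(math.log(median), math.log(ef) / 1.645)

# Aggregate by sampling each hazard and summing, repeated many times.
totals = sorted(sum(draw(m, ef) for m, ef in hazards.values())
                for _ in range(50_000))
n = len(totals)
print(f"total mean = {sum(totals) / n:.1e}/ry")
print(f"total 5th/50th/95th = {totals[n // 20]:.1e}, {totals[n // 2]:.1e}, {totals[19 * n // 20]:.1e}")
```

The sum of lognormals is not itself lognormal, which is one reason such totals are only rough estimates.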

27 IPE/IPEEE - Variability Across Fleet

[Histograms: number of plants (BWR, PWR) vs. CDF (/ry), bins 1x10^-6 to 1x10^-3; left panel: internal events + internal floods; right panel: total]

History

28 The Modern Era (US)

Technical challenges: 1) RIDM issues (e.g., realism, heterogeneity, aggregation); 2) post-Fukushima issues (e.g., external hazards); 3) new/advanced reactors (e.g., conduct of operations).

Timeline (2000-2020; marker events: 9/11, Fukushima): SECY-98-144; RG 1.174; Risk-Informed ROP; ASME PRA Standard; 10 CFR 50.48(c) (Fire Protection); NFPA 805; NFPA 805 LARs (Fire Protection); Risk-Informed License Amendment Requests (LARs); SAMAs (Life Extension); SPAR Models; NUREG-1855; NUREG-2150; NTTF Request for Information (Reevaluations). (NRC, US Industry)

History

29 Variability in Recent Results (Level 1)

[Histogram: fraction of plants vs. CDF (per reactor year), 1E-6 to 1E-3]

Highest reported: 1.3x10^-4; lowest reported: 3.5x10^-6; population mean: 4.7x10^-5

History

30 Variability in Results - Comparison with IPE/IPEEE

[Left: histogram of fire CDF/internal events CDF (0.01 to 1000), fraction of PRAs, NFPA 805 vs. IPE/IPEEE; right: scatter plot of total CDF from recent LARs vs. total CDF from IPE + IPEEE, 1E-5 to 1E-3]

History

31 COMMENTARY AND CHALLENGES Where might we do better and how?

32 An Important Note

  • Challenges regarding the treatment of uncertainty in PRA and RIDM exist for non-probabilistic approaches as well; the PRA/RIDM approach acknowledges these challenges explicitly.
  • The following slides are not a critique of the overall PRA/RIDM philosophy - they should be viewed in the framework of continuous improvement.

Commentary and Challenges

33 A Changing World

  • Evolving situation*

- market forces

- new nuclear technologies

- new analytical methods and data

- new professionals

  • Increased reliance on risk models, characterization of uncertainties
  • See Applying the Principles of Good Regulation as a Risk-Informed Regulator, October 15, 2019 (ADAMS ML19260E683)

Commentary and Challenges

34 Reminder: Parameter Uncertainties and Mean Values

[Plot: probability density function vs. frequency (/yr); mean = 7.6x10^-5/yr, 95th = 2.6x10^-4/yr, 50th (median) = 3.9x10^-5/yr]

The mean is mathematically defined, is affected by the tail, and does not correspond to a specific percentile.

Commentary and Challenges: Parameter Uncertainties
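Assuming the displayed distribution is lognormal (an assumption on my part; the slide does not name the distribution), the quoted mean follows directly from the median and 95th percentile:

```python
import math

# Slide values: median 3.9e-5/yr, 95th percentile 2.6e-4/yr, mean 7.6e-5/yr.
median = 3.9e-5
p95 = 2.6e-4

sigma = math.log(p95 / median) / 1.645   # 1.645 = z-value of the 95th percentile
mean = median * math.exp(sigma**2 / 2)   # lognormal mean: pulled up by the tail

print(f"sigma = {sigma:.2f}")
print(f"mean  = {mean:.1e}/yr")  # prints 7.6e-05, consistent with the slide
```

Here the mean sits near the 70th percentile of the distribution; for a different sigma it would sit elsewhere, which is the slide's point that the mean corresponds to no fixed percentile.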

35 Parameter Uncertainties: Challenges

  • Quantification generally required, diverse views on value added
  • Technical challenges:

- Effect of data pre-processing

  • Selection
  • Interpretation

- Effect of analysis shortcuts

  • Standard prior distributions
  • Simplified expert elicitation
  • Independence assumption

- Ensuring correspondence with actual state-of-knowledge

  • Basic events (micro)
  • Overall results (macro)

Commentary and Challenges: Parameter Uncertainties

[Plot: normalized probability density functions of failure rate (/hr), 1E-9 to 1E-3, for service water, normally running, and standby pumps]

2015 industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/

Service Water Pumps: 2 failures in 16,292,670 hours
Normally Running Pumps: 225 failures in 59,582,350 hours
Standby Pumps (1st hour of operation): 48 failures in 437,647 hours
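Rough rate estimates can be formed from the counts above using a Jeffreys prior, under which the posterior mean is (failures + 0.5) / exposure hours. The prior is my assumption; the NRC industry-average estimates may use a different method.

```python
# (failures, exposure hours) for each pump group, from the slide.
data = {
    "service water":    (2,   16_292_670),
    "normally running": (225, 59_582_350),
    "standby (1st hr)": (48,  437_647),
}

# Jeffreys-prior posterior mean for a Poisson rate: (x + 0.5) / T.
rates = {name: (fails + 0.5) / hours for name, (fails, hours) in data.items()}

for name, rate in rates.items():
    print(f"{name:16s}: {rate:.1e}/hr")
```

The three groups differ by roughly three orders of magnitude, which is the slide's point: pooling them into one estimate would misstate the state of knowledge for any one group.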

36 Model Uncertainties - Commentary

  • Model uncertainties can be large; importance depends on decision
  • Some practical approaches (e.g., consensus models, deterministic screening) can understate uncertainties
  • Subjective probability framework =>

- Need to include user effect

- Raises question regarding fundamental meaning of weighted average approaches

Hurricane Irma: 9/8/2017, 0000 UTC (about 2 days before FL landfall)

Outer prediction is closest to actual course

37 Model Uncertainty User Effects: HRA Example 1

[Plot panels: same method, different teams; same team, different methods; all teams, all methods. Analyses: NRI, CREAM; NRI, DT+ASEP; NRC, SPAR-H; INL, SPAR-H]

A. Bye, et al., International HRA Empirical Study, NUREG/IA-0216, August 2011.

Commentary and Challenges: Model Uncertainties

38 Model Uncertainty User Effects: HRA Example 2

[Plot: human error probability (HEP), 1.0E-5 to 1.0E+0, for HFEs 2A, 1C, 1A, 3A, 1B (decreasing difficulty), as estimated by ASEP Teams 1-2, SPAR-H Teams 1-2, CBDT & HCR/ORE Teams 1-3, and ATHEANA Teams 1-2, compared with empirical 5th and 95th percentiles]

Adapted from NUREG-2156.

Commentary and Challenges: Model Uncertainties

39 Challenges: Quantification of Model Output Uncertainty

  • Bayesian methods

- Framework consistent with overall PRA

- Early approaches used in past PRAs

- Can address practical issues (e.g., non-homogeneous data)*

  • Challenges include

- Uncertainties in unmeasured parameters

- Sub-model limits of applicability

- Representativeness of computed results

  • See E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.

Commentary and Challenges: Model Uncertainties

Data:
Time (s) | Experiment (K) | DRM (K)
180      | 400            | 450
360      | 465            | 510
720      | 530            | 560
840      | 550            | 565

Output Uncertainty (Temperature, K):
Percentile | Assume Homogeneous Data | Assume Non-Homogeneous Data
1st        | 415.2                   | 372.8
5th        | 437.5                   | 400.7
50th       | 457.1                   | 470.5
95th       | 479.7                   | 559.4
99th       | 509.1                   | 608.7

40 Completeness Uncertainty

  • Sources

- Known gaps (missing scope)

- Unknown gaps

  • Concerns

- Excessive amplification ("fear of the dark")

- Excessive discounting (availability heuristic: "out of sight, out of mind")

"It would cease to be a danger if we could define it." - Sherlock Holmes (The Adventure of the Copper Beeches)

Commentary and Challenges: Completeness Uncertainties

B. Fischhoff, P. Slovic, S. Lichtenstein, Fault trees: Sensitivity of estimated failure probabilities to problem representation, Journal of Experimental Psychology: Human Perception and Performance, 4(2), May 1978, 330-344.

[Fault tree: Car Won't Start - battery charge insufficient; starting system defective; ignition system defective; fuel system defective; other engine problems; mischievous acts of vandalism; all other problems]

41 Known Gaps (Known Unknowns)

  • Broad scenario categories
  • Contributors within categories
  • Technology = {methods, models, tools, data}

Rationale | Common Example(s)
Out of scope | security/sabotage, operation outside approved limits
Low significance (pre-analysis judgment) | external floods (many plants pre-Fukushima)
Appropriate PRA technology* unavailable | management and organizational factors
PRA not appropriate | software, security

Category | Example(s)
External hazards | multiple coincident or sequential hazards
Human reliability | errors of commission, non-proceduralized recovery
Passive systems | thermal-hydraulic reliability

Commentary and Challenges: Completeness Uncertainties

42 Unknown Unknowns: You Say Tomāto

[Diagram: Model; Known Unknowns; Unknown Unknowns]

  • Explicit or implicit?
  • Extent of coverage?
  • Known by whom?
  • Known when?
  • Time from idea to theory to PRA implementation?

Viewpoint: precise classification is important only if it affects:

  • Understanding
  • Communication
  • Decision making

Commentary and Challenges: Completeness Uncertainties

43 Unknown Unknowns: A Demonstrated Problem?

[Diagram: Model; Known Unknowns; Unknown Unknowns]

Then (a surprise?) vs. now (treated in current PRAs?):

  • Browns Ferry fire (1975) - a long-recognized hazard; not in draft WASH-1400 but routinely treated now
  • Chernobyl (1986) - precursor at Leningrad (1975); non-routine test during shutdown - in any LPSD analyses?
  • TMI (1979) - precursors include Davis-Besse (1977); operator EOCs not in models; current recognition and some explorations
  • Blayais flood (1999) - external floods often screened at the time; current recognition, multi-hazard treatment under development
  • Maanshan HEAF/SBO (2001) - HEAF phenomenon known - in any PRAs at the time? Now included as an initiator; smoke effects?
  • Davis-Besse RPV corrosion (2002) - RPV failure analyses focused on crack propagation; M&O failure not in PRAs
  • Fukushima Daiichi (2011) - precursors: Blayais (1999), Indian Ocean (2004); hazard under review at the time; PRA models under development

Commentary and Challenges: Completeness Uncertainties

44 Illuminating Uncertainties: From Lampposts to Search Beacons

Where's the goat???

Commentary and Challenges: Completeness Uncertainties

45 What Can We (PRA R&D) Do?

  • Continue to develop technology to address known gaps

- Risk-informed prioritization

- Fully engage appropriate disciplines

- Take advantage of general computational and methodological developments

  • Facilitate re-emphasis on searching

- Demonstrate efficiency and effectiveness with current tools (e.g., MLD, HBFT) vs. checklist/screening

- Develop improved tools (including OpE mining)

External events (NUREG/CR-4839, 1992): aircraft impact; avalanche; coastal erosion; drought; external flooding; extreme winds and tornadoes; fire; fog; forest fire; frost; hail; high tide, high lake level, or high river stage

Commentary and Challenges: Completeness Uncertainties

46 Sources of Breakdowns: Risk Communication Between Risk Managers and Public*

  • Differences in perception of information

- Relevance

- Consistency with prior beliefs

  • Lack of understanding of underlying science
  • Conflicting agendas
  • Failure to listen
  • Trust

Commentary and Challenges: Internal Risk Communication
  • J.L. Marble, N. Siu, and K. Coyne, Risk communication within a risk-informed regulatory decision-making environment, International Conference on Probabilistic Safety Assessment and Management (PSAM 11/ESREL 2012), Helsinki, Finland, June 25-29, 2012. (ADAMS ML120480139)

47 Risk Information: Inherently Complex

Hyperdimensional

- Scenarios

- Likelihood

- Multiple consequence measures

Heterogeneous

- Qualitative and quantitative

- Multiple technical disciplines

Dynamic

- System changes (e.g., different operational modes, effects of decisions)

- Changing information (learning, adding/discounting data)

- New applications (and contexts)

Uncertain

- Sparse or non-existent data

- Outside range of personal experience

"Will somebody find me a one-handed scientist?!" - Senator Edmund Muskie (Concorde hearings, 1976)

I. Flatow, Truth, Deception, and the Myth of the One-Handed Scientist, October 18, 2012. Available from: https://thehumanist.com/magazine/november-december-2012/features/truth-deception-and-the-myth-of-the-one-handed-scientist

Commentary and Challenges: Internal Risk Communication

48 ... and the World is Changing

  • Experiences, knowledge
  • Information content and delivery preferences
  • Comfort with analytics, risk, probability

P.S. Dull, A Battle History of the Imperial Japanese Navy (1941-1945), Naval Institute Press, Annapolis, MD, 1978.

49 Addressing Complexity (and Escaping Flatland)

  • Tufte model: use rich displays and reports, encourage user to explore

- Promotes active involvement of decision maker

- Increases general trust?

  • A graduated technical approach to assist?

Interface / Interaction Mode (over time): hyperlinked dashboards and reports (manual); video; AI assist; visual immersion; multisensory immersion

Continuing challenges:

Target audience(s)

- Heterogeneous

- Changing

- Constrained resources

Schema

- No standards: currently an art

- Solutions being developed intuitively; no scientific testing

Commentary and Challenges: Internal Risk Communication

50 From Static to Interactive Dashboard to Sci-Fi?

Graphic adapted from https://www.flickr.com/photos/83823904@N00/64156219/ (permission CC-BY-2.0)

M. Korsnick, Risk Informing the Commercial Nuclear Enterprise, Promise of a Discipline: Reliability and Risk in Theory and in Practice, University of Maryland, April 2, 2014.

Commentary and Challenges: Internal Risk Communication

51 Closing Remarks

  • RIDM, enabled by PRA, provides a practical approach to safety-related decisionmaking under uncertainty
  • Appropriate application of RIDM requires appropriate characterization and communication of uncertainties, supported by technology
  • Moving forward: bold exploration or avoidance?

Jason/Momotarō or Pandora/Urashima Tarō?

Many calculations bring success; few calculations bring failure. No calculations at all spell disaster!

- Sun-Tzu (The Art of War)

52 Acknowledgments The author gratefully acknowledges helpful suggestions by G. Apostolakis, A. Mosleh, and M. Cheok on presentation structure, approach, and content, and technical information provided by M. Kazarians and J. Nakoski.

53 ADDITIONAL SLIDES

54 Reasonable Assurance of Adequate Protection

Timeline (1940-1970; USAEC/USNRC, UKAEA): Atomic Energy Act (AEA) - "protect health and minimize danger"; AEA (Amended) - "adequate protection to the health and safety of the public"; AEC Chairman - "recognize every possible event," "assure that the probability of a mishap is satisfactorily low"; AEC Staff - "Credible Accident"; MIT proposal for reactor risk study; UKAEA Staff - Farmer Curve; TRG Report 1949(R); SGHWR analysis; WASH-740; Atoms for Peace Conf.; UKAEA call for comprehensive safety assessment; NRC Letter - "reasonable assurance of adequate protection."

55 Parameter Uncertainties: Some Historical Results

Industry results from: Garrick, B.J., Lessons learned from 21 nuclear plant probabilistic risk assessments, Nuclear Technology, 84, No. 3, 319-339 (1989).

56 Uncertainty Reduction - Perspective Depends on Scaling

57 Early Views on Completeness

W. F. Libby (Acting Chairman, AEC) - March 14, 1956 response to Senator Hickenlooper: "it is incumbent upon the new industry and the Government to make every effort to recognize every possible event or series of events which could result in the release of unsafe amounts of radioactive material to the surroundings and to take all steps necessary to reduce to a reasonable minimum the probability that such events will occur in a manner causing serious overexposure to the public." [Emphasis added]

  • L. Silverman (Chairman, ACRS) - October 22, 1960 letter to AEC Chairman John A. McCone: "We believe that a searching analysis which is necessary at this stage [reactor siting approval] should be done independently by the owner of the reactor" [Emphases added]

58 ACRS Concerns with WASH-1400 Methodology*

Topic | Signature Events [1] | Post-WASH-1400
Accident initiator quantification (presumably external events) | Fukushima | Extensive treatment: fires, earthquakes; inconsistent treatment: floods
Atypical reactors | Fermi 1 [2] | Multiple PRAs for non-LWRs
Design errors | [3] | Many design and operational improvements identified by PRAs; database includes events involving design problems
Operator error quantification | TMI-2 | Multiple methods emphasizing importance of context; still an active area of development
Consequence modeling | Chernobyl, Fukushima | Continuing, evolutionary improvements (MACCS)
Data | Many | Improved hardware database; fits and starts with HRA; extreme natural hazards a continuing challenge

  • ACRS letter to Congressman Udall re: adequacy for estimating likelihood of low probability/high consequence events (Dec. 16, 1976)

Table Notes:

1. Events whose key characteristics (for the given topic) might not have been captured by a WASH-1400 vintage analysis.

2. Fermi 1 had limited fuel melting. However, without an analysis, it isn't clear if a WASH-1400 vintage analysis would have captured this scenario.

3. Design weaknesses have played a role in multiple events. More detailed review is needed to determine if: a) these are errors, and b) if they would have been missed by a WASH-1400 vintage analysis.

59 Empirical Experience

Accidents:
Year | Plant | Precursor?
1979 | TMI | Davis-Besse (1977)
1986 | Chernobyl | Leningrad (1975)
2011 | Fukushima | Blayais (1999)

Some Significant* U.S. Precursors:
Year | Plant | Notes
1975 | Browns Ferry | Worst precursor: fire => loss of U1 ECCS
1978 | Rancho Seco | Next worst precursor: human error (maintenance) => loss of NNI, LOFW
2002 | Davis-Besse | Most recent significant precursor: multiple human/organizational faults => RPV head corrosion

  • Per Accident Sequence Precursor (ASP) program

60 Some Other Interesting International Events

Year | Plant(s) | Scenario Type | Notes
1957 | Windscale 1 (UK) | Fire | Graphite fire in core, release to environment.
1975 | Greifswald 1 (East Germany) | Fire | Power cable fire, loss of main feedwater, pressurizer safety valves fail to re-seat.
1977 | Gundremmingen A (West Germany) | LOOP/LOCA | Partial loss of offsite power (LOOP) and subsequent loss-of-cooling accident (LOCA) with internal flooding.
1978 | Beloyarsk 2 (Soviet Union) | Fire | Turbine Building fire spreads into Main Control Room, collapses Turbine Building roof.
1981 | Hinkley Point A-1, A-2 (UK) | External Flood; LOOP (weather) | Severe weather LOOP and loss of ultimate heat sink (LOUHS).
1982 | Armenia 1 (Soviet Union) | Fire | Fire-induced station blackout (SBO).
1989 | Vandellos 1 (Spain) | Fire | Fire-induced internal flood.
1991 | Chernobyl 2 (Soviet Union) | Fire | Fire-induced Turbine Building roof collapse.
1993 | Narora 1 (India) | Fire | Fire-induced SBO.
1993 | Onagawa 1 (Japan) | Reactivity Excursion | Seismically-induced reactivity excursion.
1999 | Blayais 1, 2 (France) | External Flood | Severe weather LOOP and partial LOUHS.
2001 | Maanshan 1 (Taiwan) | LOOP (weather); Fire (HEAF) | Severe weather LOOP and subsequent SBO.
2003 | Pickering 4-8; Darlington 1, 2, and 4; Bruce 3, 4, and 6 (Canada); Fermi 2, Fitzpatrick, Ginna, Indian Point 2 and 3, Nine Mile Point 1 and 2, Oyster Creek, Perry (U.S.) | LOOP (weather) | Northeast Blackout.
2004 | Madras 2 (India) | External Flood | Tsunami-induced LOUHS.
2009 | Cruas 2-4 (France) | External Flood | LOUHS due to flood debris.
2011 | Fukushima Dai-ichi 5-6, Fukushima Dai-ni 1-4, Onagawa 1-3, Tokai Dai-ni, Higashidori 1-2 (Japan) | External Flood | Earthquake- and tsunami-induced incidents (in addition to accidents at Fukushima Dai-ichi 1-3).

61 External Hazards Scenario-Based Classification:

An Aid for Completeness?