ML22334A057: Difference between revisions


Revision as of 16:17, 15 November 2024

Draft NEI 99-02 Revision 8 – Alert and Notification System Reliability
ML22334A057
Person / Time
Issue date: 11/30/2022
From:
Policy and Oversight Branch
To:
References
NEI 99-02, Rev 8
Download: ML22334A057 (8)


Text

DRAFT NEI 99-02 [Revision 8]

XX/XX/2021

ALERT AND NOTIFICATION SYSTEM RELIABILITY

Purpose

This indicator monitors the reliability of the offsite Alert and Notification System (ANS), a critical link for alerting and notifying the public of the need to take protective actions. It provides the percentage of the sirens that are capable of performing their safety function based on regularly scheduled tests.

Indicator Definition

The percentage of ANS sirens that are capable of performing their function, as measured by periodic siren testing in the previous 12 months.

Periodic tests are the regularly scheduled tests (documented in the licensee's test plan or guidelines) that are conducted to actually test the ability of the sirens to perform their function (e.g., silent, growl, siren sound test). Tests performed for maintenance purposes should not be counted in the performance indicator database. Actions that could affect the as-found condition of sirens prior to testing are not allowed.

Data Reporting Elements

The following data are reported: (see clarifying notes)

  • the total number of ANS siren-tests during the previous quarter
  • the number of successful ANS siren-tests during the previous quarter

Calculation

The site value for this indicator is calculated as follows:

  number of successful siren-tests in the previous 4 qtrs
  --------------------------------------------------------  x 100
  total number of siren-tests in the previous 4 qtrs
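
As an illustration only (this sketch and its function name are not part of NEI 99-02), the calculation can be expressed in a few lines of Python:

    def ans_reliability_percent(successful_by_qtr, total_by_qtr):
        # successful_by_qtr / total_by_qtr: quarterly counts of successful and
        # total siren-tests for the previous 4 quarters, oldest to newest.
        if len(successful_by_qtr) != 4 or len(total_by_qtr) != 4:
            raise ValueError("expected exactly four quarters of data")
        return 100.0 * sum(successful_by_qtr) / sum(total_by_qtr)

    # 2Q/98 column of the Data Example later in this section:
    print(ans_reliability_percent([47, 48, 49, 49], [50, 50, 50, 50]))  # 96.5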

Definition of Terms

Siren-Tests: the number of sirens times the number of times they are tested. For example, if 100 sirens are tested 3 times in the quarter, there are 300 siren-tests.

Successful siren-tests are the sum of sirens that performed their function when tested. For example, if 100 sirens are tested three times in the quarter and the results of the three tests are:

first test, 90 performed their function; second test, 100 performed their function; third test, 80 performed their function. There were 270 successful siren-tests.
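
A short arithmetic sketch of the worked example above (illustrative only):

    # 100 sirens tested three times in the quarter
    sirens = 100
    results_per_test = [90, 100, 80]   # sirens that performed their function on each test

    siren_tests = sirens * len(results_per_test)      # 300 siren-tests
    successful_siren_tests = sum(results_per_test)    # 270 successful siren-tests
    print(siren_tests, successful_siren_tests)        # 300 270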

Clarifying Notes

The purpose of the ANS PI is to provide a uniform industry reporting approach; the PI is not intended to replace the FEMA Alert and Notification reporting requirement at this time.

For those sites that do not have sirens, the performance of the licensee's alert and notification system will be evaluated through the NRC baseline inspection program. A site that does not have sirens does not report data for this indicator. A site with a FEMA-approved primary public alerting method(s) that does not use sirens will not report data for this indicator and may stop reporting data beginning with the quarter the method is implemented (e.g., implementation of IPAWS to replace sirens as the primary alerting method). In this case, the licensee's ANS will be evaluated through the NRC baseline inspection program. When reporting ROP Cornerstone PI data, a licensee that does not use sirens as the primary public alerting method should leave the ANS PI data fields blank (i.e., no entries) and add a comment stating: "Sirens are not part of the site's primary ANS. ANS will be inspected via IP 71114.02."

If a siren is out of service for maintenance or is inoperable at the time a regularly scheduled test is conducted, then it counts as both a siren test and a siren failure. Regularly scheduled tests missed for reasons other than siren unavailability (e.g., out of service for planned maintenance or repair) should be considered non-opportunities. The failure to perform a regularly scheduled test should be noted in the comment field. Additionally, if the sirens are not available for operation because of intentional actions to disable them, and the area is deemed uninhabitable by State and/or Local agencies, then the siren(s) in question are not required to be counted in the numerator or denominator of the Performance Indicator for testing throughout the event. The conditions causing the suspension of testing, its duration, and the restoration are to be noted in the comment field for the indicator.

For plants where scheduled siren tests are initiated by local or state governments, if a scheduled test is not performed either intentionally or accidentally, the missed test is not considered a valid test opportunity. Missed test occurrences should be entered in the plant's corrective action program.

If a siren failure is determined to be due only to testing equipment, and subsequent testing shows the siren to be operable (verified by telemetry or simultaneous local verification) without any corrective action having been performed, the siren test should be considered a success.

Maintenance records should be complete enough to support such determinations and validation during NRC inspection.
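
Read together, the preceding paragraphs amount to a per-opportunity counting rule. The sketch below is one possible reading; the inputs are assumptions for illustration, not terms defined by NEI 99-02:

    def classify_opportunity(test_performed, siren_out_of_service, siren_sounded,
                             failure_due_only_to_test_equipment, operability_verified):
        # Returns "success", "failure", or "non-opportunity" for one regularly
        # scheduled test of one siren.
        if not test_performed:
            # Missed tests (including state/local-initiated tests not performed)
            # are not valid opportunities; note them in the comment field / CAP.
            return "non-opportunity"
        if siren_out_of_service:
            # Out of service or inoperable at test time: a siren-test and a failure.
            return "failure"
        if siren_sounded:
            return "success"
        if failure_due_only_to_test_equipment and operability_verified:
            # Failure attributable only to testing equipment, with operability
            # verified by telemetry or simultaneous local verification and no
            # corrective action performed: counted as a success.
            return "success"
        return "failure"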

A licensee may change ANS test methodology at any time consistent with regulatory guidance.

For the purposes of this performance indicator, only the testing methodology in effect on the first day of the quarter shall be used for that quarter. Neither successes nor failures beyond the testing methodology at the beginning of the quarter will be counted in the PI. (No actual siren activation data results shall be included in licensees' ANS PI data.) Any change in test methodology shall be reported as part of the ANS Reliability Performance Indicator effective the start of the next quarterly reporting period. Changes should be noted in the comment field.

Siren systems may be designed with equipment redundancy, multiple signals or feedback capability. It may be possible for sirens to be activated from multiple control stations or signals.

If the use of redundant control stations or multiple signals is in approved procedures and is part of the actual system activation process, then activation from either control station or any signal should be considered a success. A failure of both systems would only be considered one failure, whereas the success of either system would be considered a success. If the redundant control station is not normally attended, or requires setup or initialization, it may not be considered as part of the regularly scheduled test. Specifically, if the station is only made ready for the purpose of siren tests, it should not be considered as part of the regularly scheduled test.

Actions specifically taken to improve the performance of a scheduled test are not appropriate; such practices will result in an inaccurate indication of ANS reliability. The test results should indicate the actual as-found condition of the ANS.

Examples of actions that are NOT allowed and DO affect the as-found conditions of sirens (not an all-inclusive list):

o Preceding a scheduled test with an unscheduled test whose sole purpose is to validate that the siren is functional.

o Prior to a scheduled test, adjustment or calibration of siren system activation equipment that was not scheduled to support post maintenance testing.

o Prior to a scheduled test, testing siren system activation equipment or an individual siren(s) unless the equipment is suspected to be damaged from adverse weather, vandalism, vehicular strikes, etc.

o Prior to a scheduled test, testing siren system activation equipment or an individual siren(s) unless the equipment is suspected as being non-functional as a result of a computer hardware or software failure, radio tower failure, cut phone line, etc.

However, in no case should response preclude the timely correction of ANS problems and subsequent post-maintenance testing, or the execution of a comprehensive preventive maintenance program.

Testing opportunities that will be included in the ANS performance indicator are required to be defined in licensee ANS procedures. These are typically bi-weekly, monthly, quarterly, and annual tests. The site-specific ANS design and testing document approved by FEMA is a reference for the appropriate types of tests; however, licensees may perform tests in addition to what is discussed in the FEMA report.

Examples of actions that ARE allowed and do not affect the as-found conditions of sirens (not an all-inclusive list):

o Regardless of the time, an unscheduled diagnostic test and subsequent maintenance and repair followed by post maintenance testing after any event that causes actual or suspected damage, such as:

1. Severe/inclement weather (high winds, lightning, ice, etc.),
2. Suspected or actual vandalism,
3. Physical damage from impact (vehicle, tree limbs, etc.),
4. Computer hardware and software failures,
5. Damaged communication cables or phone lines.
6. Problems identified by established routine use of the siren feedback systems.

o Scheduled polling tests for the purpose of system monitoring to optimize system availability and functionality.

If a siren is out of service for scheduled planned refurbishment or overhaul maintenance performed in accordance with an established program, or for scheduled equipment upgrades, the siren need not be counted as a siren test or a siren failure. However, sirens that are out of service due to unplanned corrective maintenance would continue to be counted as failures. Unplanned corrective maintenance is a measure of program reliability. The exclusion of a siren due to temporary unavailability during planned maintenance/upgrade activities is acceptable due to the level of control placed on scheduled maintenance/upgrade activities. It is not the intent to create a disincentive to performing maintenance/upgrades to ensure the ANS performs at its peak reliability.

As part of a refurbishment or overhaul plan, it is expected that each utility would communicate to the appropriate state and/or local agencies the specific sirens to be worked and ensure that a functioning backup method of public alerting would be in place. The acceptable timeframe for allowing a siren to remain out of service for system refurbishment or overhaul maintenance should be coordinated with the state and local agencies. Based on the impact to their organization, these timeframes should be specified in upgrade or system improvement implementation plans and/or maintenance procedures. Deviations from these plans and/or procedures would constitute unplanned unavailability and would be included in the PI.
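
A minimal sketch of the counting distinction in the two preceding paragraphs (the argument name is an assumption for illustration):

    def out_of_service_counting(planned_refurbishment_or_upgrade):
        # Returns (count_as_siren_test, count_as_failure) for a siren that is out
        # of service when a regularly scheduled test is run.
        if planned_refurbishment_or_upgrade:
            # Scheduled refurbishment/overhaul or equipment upgrade performed under
            # an established, coordinated program: excluded from the PI entirely.
            return (False, False)
        # Unplanned corrective maintenance (or deviation from the coordinated
        # plan/procedure): still counted as a siren-test and a failure.
        return (True, True)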

Siren testing conducted at redundant control stations, such as county EOCs that are staffed during an emergency by an individual capable of activating the sirens, may be credited provided the redundant control station is in an approved facility as documented in the FEMA ANS design report.

In initiating EP03 reporting data for a new siren system (where there were no sirens previously), data is entered over the 12-month period starting from the system implementation date. Each quarter, results will be submitted in accordance with NEI 99-02. Zero should be entered for the 12-month quarterly average until four quarters of data have accumulated. The EP03 PI will be valid once four quarters of data have been accrued.
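
A sketch of this start-up rule for a new siren system (the data layout is assumed for illustration):

    def ep03_rolling_value(quarters):
        # quarters: list of (successful_siren_tests, total_siren_tests) tuples,
        # one per quarter since the system implementation date.
        if len(quarters) < 4:
            return 0.0   # report zero until four quarters of data have accumulated
        last4 = quarters[-4:]
        return 100.0 * sum(s for s, _ in last4) / sum(t for _, t in last4)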

FAQs incorporated into this revision of the ANS PI:

  • 14-05, Reporting New Siren System Data (ML16285A328)
  • 21-03, Reporting ANS Data Following a Transition to IPAWS (ML21244A427)

Data Example

Alert & Notification System Reliability

Quarter                                          3Q/97   4Q/97   1Q/98   2Q/98   3Q/98   4Q/98   Prev. Q
Number of successful siren-tests in the qtr        47      48      49      49      49      54      52
Total number of sirens tested in the qtr           50      50      50      50      50      55      55
Number of successful siren-tests over 4 qtrs                              193     195     201     204
Total number of sirens tested over 4 qtrs                                 200     200     205     210
Indicator expressed as a percentage of sirens                            96.5%   97.5%   98.0%   97.1%

Thresholds

Green    ≥ 94%
White    < 94%
Yellow   < 90%
Red      N/A
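
The rolling 4-quarter values in the Data Example can be reproduced with a short script (a sketch; the Green/White/Yellow band logic assumes Green means at or above 94%):

    successful = [47, 48, 49, 49, 49, 54, 52]   # 3Q/97 .. Prev. Q
    total      = [50, 50, 50, 50, 50, 55, 55]

    for i in range(3, len(successful)):          # first reportable value needs 4 quarters
        s4 = sum(successful[i - 3:i + 1])
        t4 = sum(total[i - 3:i + 1])
        pct = 100.0 * s4 / t4
        band = "GREEN" if pct >= 94.0 else ("WHITE" if pct >= 90.0 else "YELLOW")
        print(f"{s4}/{t4} = {pct:.1f}% ({band})")
    # 193/200 = 96.5% (GREEN)
    # 195/200 = 97.5% (GREEN)
    # 201/205 = 98.0% (GREEN)
    # 204/210 = 97.1% (GREEN)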

[Figure: ANS Reliability indicator by quarter (2Q/98 through Prev. Q), plotted against the Green/White/Yellow threshold bands. Note: No Red Threshold.]

NOTE: Licensees are in the process of replacing their offsite siren systems with the Integrated Public Alert and Warning System (IPAWS) as the primary method for performing prompt public alerting during an emergency. Once a site no longer uses sirens as a primary alerting method, it ceases to report ANS PI data. Below is the performance indicator proposed as a replacement for the current ANS PI. The proposed performance indicator is taken from ROP FAQ 22-01, Replace the ANS PI with an ERFER PI (ML22055A562). The PI described below may be revised if needed to address NRC staff comments or direction from the Commission. It is intended that the replacement of the ANS PI with the ERFER PI will affect all power reactor licensees at the same time, i.e., there will be one PI implementation (cutover) date for all sites, regardless of any given site's intent or status concerning IPAWS implementation.

Emergency Response Facility and Equipment Readiness

Purpose

The Emergency Response Facility and Equipment Readiness (ERFER) performance indicator measures licensee performance in maintaining the emergency response facilities and equipment of greater importance to the protection of public health and safety. It reflects the ability of the licensee to perform the surveillance, testing, inventory, and preventative and corrective maintenance activities that contribute to the availability of emergency response facilities and equipment necessary to implement Risk Significant Planning Standard (RSPS) functions and response actions.

Indicator Definition

The number of occurrences during a quarter that the Technical Support Center (TSC) or Emergency Operations Facility (EOF) is nonfunctional, or equipment necessary to implement the emergency plan is not available or functional, such that an RSPS function or response action could not be performed for greater than 168 continuous hours from the Time of Discovery (TOD) and no Compensatory Measure(s) was implemented.

Data Reporting Elements

The number of occurrences that the TSC or EOF is nonfunctional, or equipment necessary to implement the emergency plan is not available or functional, such that an RSPS function or response action could not be performed for greater than 168 hours from the TOD and no Compensatory Measure(s) was implemented.

Calculation

Count the number of occurrences that the TSC or EOF is nonfunctional, or equipment necessary to implement the emergency plan is not available or functional, such that an RSPS function or response action could not be performed for greater than 168 hours from the TOD and no Compensatory Measure(s) was implemented.
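
A minimal sketch of this count (the record structure and field names are assumptions, not defined by the PI):

    from datetime import timedelta

    def erfer_occurrences(issues):
        # issues: list of dicts, one per facility/equipment issue in the quarter:
        #   {"tod": datetime, "restored": datetime,
        #    "rsps_function_prevented": bool, "compensatory_measure": bool}
        limit = timedelta(hours=168)
        return sum(
            1
            for issue in issues
            if issue["rsps_function_prevented"]
            and not issue["compensatory_measure"]
            and (issue["restored"] - issue["tod"]) > limit
        )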

Definition of Terms

The definition of the terms Risk Significant Planning Standard function, Time of Discovery, and Compensatory Measure are those described in NRC Inspection Manual Chapter 0609, Appendix B, Emergency Preparedness Significance Determination Process.1

Clarifying Notes

The ERFER PI reflects the ability of a licensee to perform the surveillance, testing, inventory, and preventative and corrective maintenance activities that contribute to the availability of the facilities and equipment necessary to accomplish RSPS functions and response actions.

Consistent with the Indicator Definition, a facility or equipment issue must be impactful enough to prevent the performance of an RSPS function or response action; a degraded capability to perform a function or action should not be counted. A Compensatory Measure need not meet the same design or operating requirements as the methods normally used to perform an RSPS function or response action; however, its effectiveness should be sufficient to ensure that the supported function or action would be accomplished during an actual emergency, albeit in a possibly degraded manner.

To be counted towards the performance indicator, the occurrence of a given facility or equipment issue must exceed 168 hours during one continuous period (i.e., continuous hours) in one quarter.

The starting point of the issue should be determined in accordance with the Time of Discovery guidance in NRC Inspection Manual Chapter 0609, Appendix B. Further, if an equipment issue affects performance of an RSPS function or response action at multiple facilities (e.g., loss of common computer or communications system) but the impact started at different times depending on the facility, then the performance indicator assessment should use the longest out-of-service time.
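
For the multi-facility case described above, the assessment would use the longest out-of-service time, i.e., measure from the earliest Time of Discovery (argument names are assumptions for illustration):

    def assessed_out_of_service_time(facility_tods, restored):
        # facility_tods: Time of Discovery at each affected facility (datetimes);
        # restored: when the RSPS function or response action was recovered.
        return restored - min(facility_tods)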

A loss of the TSC or EOF, or associated equipment, that precludes the performance of an RSPS function or response action for 12 hours from TOD should be documented (e.g., in the licensee's corrective action program). The Compensatory Measure implemented in response to the facility or equipment issue should also be documented.

If the licensee reports a lost RSPS function or response action under this performance indicator but later determines that the capability was not lost (e.g., through a subsequent engineering analysis), then the performance indicator data should be revised accordingly. The basis for this determination should be documented and the documentation retained for inspection.

NOTE: The ROP ERFER PI and the ERFER PI described in NEI White Paper, Implementing a 24-Month Frequency for Emergency Preparedness Program Reviews, dated November 2019 (ML19344C419) use the same approach but with different threshold values, reflecting their different purposes. The NEI white paper is endorsed in Regulatory Guide 1.101, Emergency Planning and Preparedness for Nuclear Power Reactors. In addition to monitoring performance indicators, licensees implementing a 24-month review frequency, per 10 CFR 50.54(t)(1)(ii), will need to conduct periodic evaluations of the adequacy of interfaces with State and local governments as described in the NEI white paper.

1 See Inspection Manual Chapter 0609, Appendix B, Emergency Preparedness Significance Determination Process, Issue Date September 22, 2015, (ADAMS ML15128A462), Section 2.0, Definitions, Abbreviations, and Acronyms.

Data Example

Thresholds

  • White: 1/quarter
  • Yellow: 3/quarter
  • Red: N/A
