From Advanced Lightning Data to Automated IEC Compliance
Florent Giraudet (Metarresters) | Kelly Buza and Jim Grasty (Skytree Scientific) | March 2026
Introduction: Why Lightning Data Quality Matters Today
Lightning is one of the most frequent and disruptive natural hazards affecting power systems and critical infrastructure worldwide. Globally, an estimated 1.4 billion lightning flashes occur every year, the majority over land, where transmission lines, substations, renewable assets, and industrial facilities are exposed. For utilities, EPCs, and large infrastructure operators in the power systems industry, lightning is therefore not an exceptional risk—it is a recurring operational reality that must be managed.
The financial consequences are far from negligible. In the United States alone, lightning-related damage to electrical infrastructure is estimated to exceed one billion dollars per year when accounting for equipment damage, outage restoration, and indirect economic losses. Similar impacts are observed worldwide, particularly in regions with high lightning activity combined with expanding transmission networks and renewable integration. Even these figures do not fully capture downstream effects such as penalties for service interruptions, lost revenue, contractual non-compliance, or reputational damage.
Beyond direct repair costs, lightning-related outages disrupt power supply to customers, affect industrial processes, and increase operational and maintenance expenditures. For asset owners and system operators, this creates a clear need for due diligence: lightning risk must be quantified accurately, spatially resolved, and translated into informed engineering and investment decisions. Without a reliable understanding of actual lightning exposure, mitigation strategies risk being either insufficient—or unnecessarily costly.
At the same time, the operating environment itself is changing. Storm behavior is becoming less predictable in many regions, with changes observed in both the occurrence and severity of lightning-related storm activity, while infrastructure footprints continue expanding into areas with historically limited exposure data. Yet many risk assessments still rely on legacy ground flash density maps or long-term historical averages that may not reflect current or localized lightning patterns. With capital projects moving faster and uptime expectations tightening, relying on static or outdated inputs creates blind spots. Today’s risk landscape demands more granular, continuously updated lightning data that reflects real-world exposure, not just historical assumptions.
This is where modern Lightning Location Systems (LLS) and intelligent automation are reshaping the standard. High-resolution LLS networks provide precise, time-stamped strike data that can be directly integrated into structured lightning risk assessments. When paired with AI-supported platforms, this data can be processed more efficiently and translated into clear, defensible outputs – from streamlined calculations to multilingual compliance-ready reports and protection measure recommendations. The goal is not to replace engineering expertise, but to remove friction from the process and improve consistency, transparency, and speed. Together, high-quality detection data and intelligent tools enable a more responsive and scalable approach to lightning risk management.
The Old Standard: Keraunic Levels & Thunderstorm-Day Counts
Keraunic levels based on thunderstorm-day counts are considered one of the least reliable methods for estimating Ground Flash Density (GFD or Ng). This has been explicitly recognized by several reference standards, including those issued by IEC (e.g. IEC 62305), which discourage their use for engineering applications.
The keraunic level is a legacy metric developed at a time when satellite-based lightning detection and ground-based Lightning Location Systems (LLS) were not available. A thunderstorm day is defined as any day on which thunder is heard, regardless of how many lightning events actually occur during that day.
To calculate the keraunic level, one simply takes the average annual number of thunderstorm days for a given location. For example, if thunder is heard on 30 different days in a year, the keraunic level is 30.
This method has an inherently subjective nature: it depends entirely on human perception. Was thunder heard or not? By whom, where, and under what conditions? As a result, it provides no objective or reproducible measurement of lightning activity.
Unless a power system is located in an extremely remote region—such as polar areas—where LLS deployment is genuinely not feasible, both CIGRÉ (for power systems) and IEC recommendations strongly advise against using keraunic levels. In any country where LLS data is available, this approach should be considered obsolete.
Keraunic-level methodologies represent a serious engineering limitation. They offer very poor spatial resolution and completely lack strike-specific information such as precise timestamps, current magnitude estimation, lightning type, or polarity. In the context of modern power grids, this makes them unsuitable for any rigorous analysis and they should no longer be used.
The problem is that many utilities and engineering companies still base specifications, regulatory requirements, and internal processes on this outdated method.
A clear example can be found in Peru, where many utilities and engineering firms continue to rely on the Osinergmin isokeraunic map as a reference for Ground Flash Density. This map is used because it is a national regulatory tool that is free and easily accessible. It was likely adopted at a time when LLS data was not available or not widely accessible.
However, Osinergmin’s map is not based on direct lightning measurements. It relies on keraunic levels (Td)—thunderstorm days—converted into GFD using an outdated empirical formula from IEC 62305-2:2010, which has since been withdrawn.
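For reference, the legacy conversion itself is trivial to express. The sketch below implements the empirical relation commonly cited from the withdrawn IEC 62305-2:2010 annex (Ng ≈ 0.1 · Td); it is shown only to illustrate how coarse the proxy is, not as a recommended method:

```python
def ng_from_thunderstorm_days(td: float) -> float:
    """Approximate ground flash density Ng (flashes/km^2/year) from
    thunderstorm days per year (Td), using the empirical relation
    Ng ~= 0.1 * Td from the withdrawn IEC 62305-2:2010 annex.
    Illustration only: LLS-derived Ng should always be preferred."""
    return 0.1 * td
```

A site with 30 thunderstorm days per year maps to roughly 3 flashes per square kilometre per year, regardless of how many flashes actually occurred on each of those days — which is exactly the information the proxy discards.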
This approach comes with serious technical limitations:
- Subjective and indirect proxy (hearing thunder ≠ lightning impact)
- Dataset limited to the 2013–2018 period
- No differentiation between cloud-to-ground and intra-cloud lightning
- Strong, documented deviation from LLS-based GFD (up to ~7× overestimation in some regions)
- Results that cannot reliably support engineering-grade lightning studies
This is not merely a theoretical concern. When such metrics are used for electro-geometric method (EGM) studies, outage analysis, or asset protection design, the results often fail to match observed system behavior—particularly in regions with strong spatial variability and long transmission lines.
The good news is that Lightning Location System data does exist in Peru and should be used as the primary data source for any serious lightning risk assessment.
The Modern Standard: Lightning Location Systems (LLS)
The most accurate and reliable source for lightning data today is the ground-based Lightning Location System (LLS). These systems represent the current state of the art and are widely considered the gold standard for precise lightning analysis and risk assessment.
A modern Lightning Location System is a network of ground-based sensors designed to detect the electromagnetic signals emitted by lightning return strokes at the moment they occur, typically in the very low frequency (VLF) and low frequency (LF) bands. Each sensor continuously measures the electric and magnetic fields associated with a lightning discharge and timestamps the detected signal using a highly accurate reference such as GPS or GNSS.
When a lightning stroke occurs, its electromagnetic pulse propagates outward and is detected by multiple sensors at slightly different times. By comparing the time of arrival at three or more sensors and applying triangulation or multilateration algorithms, the central processing system determines the geographic location of the stroke with high precision. Individual strokes that are close in time and space are then grouped into flashes based on defined spatial and temporal criteria, allowing analyses to be performed at the flash level rather than using raw stroke counts.
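The time-of-arrival principle can be sketched in a few lines. The example below is a deliberately simplified 2D brute-force solver; production LLS processing uses nonlinear least-squares solvers, propagation corrections, and error ellipses, none of which are modeled here. It simply searches for the point at which all sensors imply the same emission time:

```python
import math

C = 299_792.458  # propagation speed, km/s (approx. speed of light)

def locate_toa(sensors, arrival_times, search_box, step=1.0):
    """Brute-force 2D time-of-arrival solver (illustrative sketch only).
    sensors: list of (x, y) positions in km; arrival_times in seconds.
    Returns the grid point minimizing the spread of implied emission
    times t0_i = t_i - distance_i / c across all sensors."""
    x0, x1, y0, y1 = search_box
    best, best_pt = float("inf"), None
    x = x0
    while x <= x1:
        y = y0
        while y <= y1:
            # Emission time implied by each sensor for a source at (x, y)
            t0s = [t - math.hypot(x - sx, y - sy) / C
                   for (sx, sy), t in zip(sensors, arrival_times)]
            spread = max(t0s) - min(t0s)
            if spread < best:
                best, best_pt = spread, (x, y)
            y += step
        x += step
    return best_pt
```

At the true source location the implied emission times agree exactly, so the spread collapses to zero; with measurement noise, the minimum-spread point approximates the stroke location.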
LLS datasets typically provide much more than location alone. They include precise timing (often with sub-millisecond accuracy), polarity, lightning type (cloud-to-ground or intra-cloud), and an estimate of peak current for each stroke and flash. This level of detail enables high-resolution lightning density mapping and supports engineering-grade lightning risk assessments.
Several major LLS data providers operate large-scale networks covering most regions worldwide, including:
- Vaisala Xweather (GLD360 globally and NLDN in the United States),
- Nowcast (LINET, available globally or as a deployable network),
- Météorage (ELDN in Europe), and
- Earth Networks (ENTLN, with global coverage and strong presence in the U.S. market).
These providers enable utilities and industrial users to access accurate, actionable lightning data at national and international scales.
When selecting an LLS provider or evaluating lightning datasets, several key aspects must be considered.
- Data quality is critical, including location accuracy, spatial resolution, and long-term reliability.
- Network coverage also matters, as some systems are truly global while others are regional.
- Finally, underlying technology differs between providers: sensor types, signal processing methods, and classification algorithms all influence detection efficiency and positional accuracy.
A representative example is the National Lightning Detection Network (NLDN), widely used across the United States and operated by Vaisala Xweather. According to published performance reports, the NLDN achieves:
- a median location accuracy of approximately 84 m,
- a cloud-to-ground lightning detection efficiency exceeding 95%, and
- long-term stability that enables more than a decade of consistent historical analysis.
Compared to global networks with limited capabilities, this results in denser, more precise, and more stable datasets, which are particularly valuable for infrastructure design and detailed engineering studies.
In practice, this level of lightning data quality means that lightning maps and statistics can—and should—be systematically integrated into lightning performance studies of overhead lines, insulation coordination and surge arrester application, asset risk ranking and prioritization, and post-event analysis used to validate mitigation measures. The use of lightning data is no longer an exception; in countries such as the United States, it should increasingly be considered the baseline.
For applications involving ground flash density calculations (Ng), it is essential to comply with the requirements defined in IEC 62858. This standard specifies minimum performance criteria for LLS networks used to compute Ng. In particular, the annual average detection efficiency for cloud-to-ground lightning must be at least 80% in the region of interest, the median location accuracy must be better than 500 m, and at least 85% of cloud-to-ground flashes must be correctly classified to limit errors in Ng estimation.
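A minimal screening check against these criteria might look like the following; the threshold values are those summarized above, and the standard itself contains additional requirements not modeled here:

```python
def meets_iec_62858(detection_efficiency: float,
                    median_location_accuracy_m: float,
                    cg_classification_rate: float) -> bool:
    """Screen an LLS dataset against the minimum performance criteria
    of IEC 62858 as summarized in this article:
      - annual average CG detection efficiency >= 80 %
      - median location accuracy better than 500 m
      - >= 85 % of CG flashes correctly classified."""
    return (detection_efficiency >= 0.80
            and median_location_accuracy_m < 500
            and cg_classification_rate >= 0.85)
```

For example, the NLDN figures quoted above (≈84 m median accuracy, >95 % CG detection efficiency) comfortably pass such a screen, whereas a keraunic-level proxy offers no comparable performance metrics at all.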
The value of such data for power utilities and system operators is significant. Consider the following use case of a long transmission line located in a region with relatively high lightning activity. Using a grid-based lightning map, the line can be divided into distinct sections.
- One section may exhibit a ground flash density close to zero flashes per square kilometre per year, effectively requiring little or no lightning protection investment.
- Another section may show Ng values in the range of 3.5 to 4 flashes per square kilometre per year, clearly indicating the need for targeted lightning mitigation measures.
For long transmission lines, obtaining accurate Ng values and understanding their spatial variation along the route is extremely valuable. It allows utilities to focus resources on critical areas, optimize investments in lightning protection systems, and perform more precise risk assessments while avoiding unnecessary expenditures on non-critical segments.
Beyond density analysis, LLS data also enable correlation studies between recorded lightning events and line tripping failures. By matching lightning stroke timestamps with utility outage records, it becomes possible to distinguish lightning-related outages from those caused by other factors. In one analyzed case, 81% of tripping events recorded between 2016 and 2024 were confirmed to be lightning-related in a specific line section with elevated Ng values. This correlation is not always systematic, however, as terrain effects such as altitude and topography—particularly in mountainous regions—can influence line performance independently of lightning density.
Such analyses are essential to pinpoint vulnerabilities along transmission lines, differentiate between lightning-related and non-lightning-related outages, improve the evaluation of protection system effectiveness, and support informed, targeted investment decisions for line upgrades and lightning protection strategies.
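A simple version of such a correlation can be sketched as follows. The ±2 s matching window is a hypothetical tolerance chosen for illustration; real studies tune both the temporal window and a spatial gate around the faulted line section:

```python
from datetime import datetime

def lightning_related(trip_times, stroke_times, window_s=2.0):
    """Flag each outage event that has at least one detected stroke
    within +/- window_s seconds (window_s is an assumed tolerance;
    real studies also apply a spatial gate around the faulted span)."""
    return [any(abs((t - s).total_seconds()) <= window_s
                for s in stroke_times)
            for t in trip_times]
```

Events without a matching stroke can then be investigated for non-lightning causes such as vegetation contact or equipment failure.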
Beyond Raw Data: Turning Lightning Location Records Into Actionable Risk Assessments
Lightning Location Systems provide precise parameters for every detected event – location, polarity, peak current estimate, classification, timestamp. That level of accuracy is a major step forward for the industry.
But even high-quality detection data is not decision-ready on its own.
A cloud-to-ground strike plotted on a map does not answer the questions infrastructure owners actually face:
- What is the measurable exposure at this specific site?
- How does this facility compare to others?
- Is the current protection level technically justified?
- What does IEC require based on actual density values?
Before lightning data can inform engineering decisions, it must be filtered, validated, and translated into standardized exposure metrics.
From Detection Data to IEC 62305-2:2024 Metrics
For many years, Ground Flash Density (GFD), commonly expressed as Ng, has been the primary environmental input used in lightning risk assessments under IEC 62305.
With the release of IEC 62305-2:2024, the standard now places formal emphasis on Ground Strike Point Density (Nsg), which more directly represents actual strike attachment locations at ground level. This evolution reflects the broader shift toward physically representative, measurement-based exposure modeling.
High-quality LLS datasets make this possible, but only when the data is processed correctly.
Raw strike parameters must be converted into compliant Ng and Nsg values using structured methodologies aligned with IEC 62305-2:2024 and IEC 62858 performance requirements. Without this step, even accurate detection data remains descriptive rather than engineering-grade.
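As an illustration of the difference between the two metrics, the sketch below counts flashes for Ng and distinct ground strike points for Nsg. The 100 m separation threshold used to cluster strokes into strike points is an assumed value for illustration, not one prescribed by the standard:

```python
import math

def ng_and_nsg(flashes, area_km2, years, sep_m=100.0):
    """Illustrative derivation of Ng (flash density) and Nsg (ground
    strike point density) from grouped LLS data.
    flashes: list of flashes, each a list of (x_m, y_m) stroke positions.
    Strokes within a flash farther than sep_m apart count as distinct
    ground strike points (sep_m is an assumed clustering threshold)."""
    n_flashes = len(flashes)
    n_strike_points = 0
    for strokes in flashes:
        points = []
        for (x, y) in strokes:
            # A stroke opens a new strike point only if it is farther
            # than sep_m from every strike point seen so far in the flash
            if all(math.hypot(x - px, y - py) > sep_m for (px, py) in points):
                points.append((x, y))
        n_strike_points += len(points)
    denom = area_km2 * years
    return n_flashes / denom, n_strike_points / denom
```

Because a single flash can attach to the ground at more than one point, Nsg is typically somewhat higher than Ng, which is why the 2024 edition treats it as the more physically representative exposure metric.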
Within LRA Plus™, validated LLS data is integrated into a standards-aligned calculation engine that now computes both Ng and Nsg in accordance with IEC 62305-2:2024. The result is consistent, traceable exposure metrics that can be used directly in structured lightning risk assessments.
Although lightning density metrics such as Ng and Nsg are used across multiple industries, their application differs depending on the engineering context. In power systems, lightning data is often used to study line performance, outage correlation, and surge protection along transmission networks. Infrastructure risk assessments under IEC 62305 follow a different approach: lightning exposure is evaluated at the scale of a specific structure or facility. Factors such as the footprint, height, surrounding environment, and protection systems are considered together to determine the overall lightning risk and the appropriate protection level.
Making Exposure Truly Site-Specific
Regional lightning maps are informative. Infrastructure decisions, however, are made at the site level.
A refinery, data center, manufacturing facility, or renewable plant occupies a defined footprint. Its exposure depends on lightning activity within that footprint – not across an entire province or grid zone.
Structured analysis links IEC-compliant Ng and Nsg values directly to defined site boundaries. Instead of relying on broad regional averages or legacy proxy maps, the assessment reflects measured lightning density at the actual location under evaluation.
That distinction becomes critical when protection design, investment decisions, or compliance documentation are involved.
From Density Values to Practical Risk Scoring
Once exposure is quantified correctly, it becomes actionable.
Site-specific Ng and Nsg values feed directly into structured lightning risk assessments, enabling:
- Practical risk scoring for individual facilities
- Clear comparison between candidate project locations
- Objective evaluation of whether protection measures are proportionate
- Alignment with IEC 62305-2:2024 design assumptions
Lightning exposure transitions from a background environmental statistic to a defined engineering parameter.
For developers evaluating multiple sites, lightning density becomes an additional siting input alongside grid access, environmental constraints, and constructability. For operators managing existing facilities, it provides clarity on where protection investment is justified, and where it may exceed measured exposure.
From Calculation to Compliance-Ready Documentation
Engineering analysis must ultimately be documented in a form that supports review, permitting, and audit.
Modern cloud-based platforms replace fragmented spreadsheet workflows with centralized data management and repeatable computation. This improves consistency, traceability, and version control, which is particularly important in regulated infrastructure sectors.
Within LRA Plus™, standardized processing pipelines generate structured, IEC-aligned reports. These reports can be produced in multiple languages, supporting international project teams and cross-border infrastructure development.
Automation here is practical rather than promotional. This is where AI is most useful: helping standardize inputs, streamline data processing, and generate consistent reporting outputs without altering the underlying engineering methodology. It ensures that exposure metrics and design parameters are reflected consistently in the final documentation, improving accuracy and reproducibility.
Why This Matters for Distributed Portfolios
The value of structured lightning exposure analysis increases significantly for organizations managing multiple assets, particularly when those assessments are performed consistently across a centralized digital platform.
Renewable developers, industrial operators, EPC contractors, and infrastructure owners rarely manage a single site. They manage portfolios across different climate regions.
When Ng and Nsg values are computed consistently across facilities, lightning exposure becomes comparable. Sites can be re-assessed periodically as updated LLS datasets reflect evolving strike patterns. Protection strategies can then be aligned with measured exposure rather than legacy averages.
This shifts lightning risk management from static compliance toward ongoing, portfolio-level risk oversight based on continuously updated lightning data.
From Exposure to Action: Practical Implications for Asset Owners
Quantifying lightning exposure is only the first step. The real value emerges when that exposure translates into clear engineering and investment decisions.
For infrastructure owners, lightning risk is not abstract. It affects capital budgets, uptime commitments, insurance discussions, and long-term asset reliability.
Consider a practical scenario: an industrial operator planning a new processing facility.
At first glance, the site appears viable:
- Grid access is available.
- Environmental constraints are manageable.
- Construction costs are competitive.
Yet when site-specific Ng and Nsg values are computed in accordance with IEC 62305-2:2024, the measured lightning density is materially higher than the regional average.
That finding changes the engineering conversation.
Informing Protection Design
Lightning protection level (LPL) selection directly influences:
- Air termination system design
- Down conductor configuration
- Surge protection coordination
- Grounding system dimensioning
If density inputs are based on simplified lightning maps or generalized averages, protection measures may be misaligned with actual exposure.
Measured Ng and Nsg values provide a defensible foundation for selecting the appropriate protection level. For example, a facility located in an area with elevated strike density may result in higher calculated risk values within the IEC 62305 assessment, which can support the selection of a higher Lightning Protection Level (LPL). In higher-density environments, this prevents under-design. In lower-density environments, it avoids unnecessary over-engineering.
The objective is proportionality – aligning protection with measurable environmental conditions.
Supporting Capital Allocation
Protection upgrades are rarely minor expenditures, particularly for data centers, renewable facilities, and high-value industrial assets.
When lightning exposure is quantified correctly, investment decisions can be justified clearly. For example, an LRA may show that a facility exceeds the acceptable IEC risk thresholds, supporting the installation of additional surge protection or structural lightning protection measures, while another site with lower calculated risk may require no additional investment.
An IEC-aligned LRA demonstrates:
- How exposure values were derived
- Why a given protection level was selected
- How environmental inputs influenced design parameters
This strengthens internal governance reviews and improves clarity in discussions with insurers and technical auditors.
It also reduces reliance on generalized “worst-case” assumptions that may not reflect actual site conditions.
Enabling Ongoing Risk Management
When risk assessments are built on measured inputs rather than fixed historical averages, they can be updated periodically using current datasets. If density values shift, protection strategies can be reassessed. If exposure remains stable, design criteria remain valid.
This creates a structured framework for ongoing due diligence rather than a one-time compliance exercise.
Documentation That Reflects the Analysis
Engineering decisions must ultimately be documented in a way that supports review, permitting, and audit.
In many organizations, lightning risk assessments are still spreadsheet-driven, with manual calculations and limited traceability. Modern cloud-based platforms centralize exposure inputs, apply standards-aligned calculations, and generate consistent reporting outputs.
Within LRA Plus™, IEC 62305-2:2024-aligned reports are generated directly from the underlying exposure calculations. Reports can be produced in multiple languages, supporting international project teams and cross-border infrastructure development.
The result is not simply a calculated density value, but structured documentation that links measured lightning exposure to defined protection decisions.
From Reactive Response to Structured Planning
Historically, lightning protection improvements were often reactive, triggered by failure events.
A strike occurs. Equipment is damaged. Protection is reinforced.
A structured, exposure-based approach shifts that dynamic.
When lightning density is quantified accurately and integrated into formal risk assessments, asset owners can anticipate vulnerability rather than respond to loss events. Protection strategies become aligned with measured environmental conditions and long-term reliability objectives.
Lightning risk management becomes part of structured infrastructure planning – not an afterthought.
Conclusion: LLS + Intelligent Automation = The Future of Lightning Risk Assessment
Lightning risk assessment is undergoing a measurable shift.
For decades, the industry relied on indirect estimation methods and simplified historical averages to approximate lightning exposure. Today, high-quality Lightning Location Systems provide precise, spatially resolved records of actual lightning activity. The conversation is no longer about whether measurement-based data is superior; it is about how that data is applied.
The release of IEC 62305-2:2024 reinforces this transition. With formal recognition of metrics such as Ground Strike Point Density (Nsg), standards are evolving to reflect physically representative, measurement-driven exposure modeling. Lightning density is no longer a generalized regional statistic; it is a site-specific engineering input.
At the same time, structured digital platforms are changing how this data is processed and applied. When validated LLS datasets are integrated into standards-aligned calculation frameworks, exposure metrics such as Ng and Nsg can be derived consistently, documented transparently, and incorporated directly into formal risk assessments.
The convergence of accurate detection networks and structured analytical platforms represents more than a technical improvement. It enables:
- Utilities to base reliability planning on measured environmental conditions
- EPCs to justify protection designs using IEC-aligned exposure inputs
- Infrastructure owners to align capital investment with actual site-specific risk
Lightning risk management shifts from estimation to measurement, and from reactive correction to structured planning.
Looking forward, the future of lightning risk assessment will not be defined by data alone, nor by software alone. It will be defined by the integration of high-quality detection networks, evolving international standards, and practical tools that translate exposure into defensible engineering decisions.
In that convergence of measurement, standardization, and structured risk intelligence, the industry moves toward a more accurate, transparent, and resilient approach to managing lightning risk.
For more information or to request a demo of LRA Plus™, contact Skytree Scientific here.
For lightning performance studies, surge arrester consulting, and software tools for overhead line analysis, contact Florent Giraudet at florent.giraudet@metarresters.com
About the Authors
Florent Giraudet is the founder of Metarresters, an independent consultancy based in Berlin, Germany, specializing in lightning performance of overhead transmission and distribution lines, surge arrester technology, and insulation coordination.
Jim Grasty is Chief Science Officer and Co-Founder of Skytree Scientific. He brings decades of hands-on experience in lightning protection and IEC 62305 risk assessment, helping organizations design safer, more resilient infrastructure.
Kelly Buza is a marketing leader at Skytree Scientific, where she focuses on bringing clarity to lightning risk assessment through standards-aligned communication and go-to-market strategy. With over 15 years of experience in technical industries, she works across product, sales, and operations to support adoption of modern IEC 62305 methodologies.



