
Introduction
Many utilities in North America are strategically undergrounding medium voltage lines as a resilience solution against extreme weather events and wildfires. Those opposed to undergrounding frequently cite costs and concerns about future outages. How can underground line scanning technologies address these concerns?
The Power Delivery Intelligence Initiative (PDI2.org) has done much to address these concerns and put forward solutions in a recent report called the “Utility Underground Life-Cycle Cost Guide.” For example, did you know that by deploying the PDI2 recommended cable commissioning specification (an offline 50 or 60Hz partial discharge (PD) test with 5pC sensitivity) and investing a few percent of construction cost, utilities can triple their cable life span, significantly reduce up-front costs, accelerate workforce training, reduce future operating costs and revenue loss, and improve reliability and safety by 10 times? This article covers some practical strategies for selecting an effective cable test specification and achieving these goals.
Cable Commissioning Helps Utilities Reach Life-Cycle Goals
The Utility Underground Life-Cycle Cost Guide’s cable commissioning specification is based on peer-reviewed industry experience, including thousands of defect dissections and hundreds of thousands of cable condition profile scans in the 5kV to 500kV class range over the last couple of decades. The cable scanning specification provides a proven solution for lowering upfront installation costs, extending cable life, and accelerating workforce training, all while eliminating future truck rolls, improving safety, and reducing O&M costs and revenue loss.
Lowering Installation Upfront Costs
Some utilities have outdated specifications that limit pull lengths, requiring splices every few hundred feet, and are reluctant to change for fear of damaging systems with longer pulls. These extremely conservative limits may have made some sense with legacy paper insulated lead covered (PILC) cable or unjacketed cable in dense urban areas with frequent load points and no method to scan the line for damage after the pull. However, with modern conduit and pulling techniques with lubricant, thousands of feet/meters can be pulled with relative ease. An accurate commissioning test typically finds that only about 1% of cable segments (not counting accessory defects) are damaged and need to be replaced; a defectively installed splice is 4 times more likely than a damaged cable. Imagine the savings and risk reduction of 3 times fewer vaults, splice boxes, and cable splices, as illustrated in the sketch below.
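To make the arithmetic concrete, the following minimal sketch compares splice counts for a hypothetical feeder at a legacy versus a modern pull length. The route length, pull lengths, and per-splice cost are illustrative assumptions, not figures from the Life-Cycle Cost Guide.

```python
# Illustrative splice-count arithmetic for longer pulls; every length and
# unit cost below is a hypothetical assumption for demonstration only.
ROUTE_FT = 15_000               # assumed feeder route length
LEGACY_PULL_FT = 500            # legacy spec: splices every few hundred feet
MODERN_PULL_FT = 1_500          # modern conduit + lubricant pull length
SPLICE_INSTALLED_COST = 4_000   # assumed per-splice cost incl. vault/box

def splice_count(route_ft: int, pull_ft: int) -> int:
    """Number of splices = number of pulled sections minus one."""
    sections = -(-route_ft // pull_ft)  # ceiling division
    return sections - 1

legacy = splice_count(ROUTE_FT, LEGACY_PULL_FT)   # 29 splices
modern = splice_count(ROUTE_FT, MODERN_PULL_FT)   # 9 splices
print(f"legacy splices: {legacy}, modern splices: {modern}")
print(f"splice reduction: {legacy / modern:.1f}x")          # ~3x fewer
print(f"avoided splice cost: ${(legacy - modern) * SPLICE_INSTALLED_COST:,}")
```

With these assumed numbers, tripling the pull length cuts the splice count by roughly a factor of three, consistent with the reduction described above.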
If a utility wants to further reduce costs while maintaining reliability, directly buried cable can be installed at a much lower cost than cable in conduit. Provided the cable system is commissioned with an effective technique, the cost savings can be realized without significant installation damage risk.
Extending Cable System Life
Most future failures are due to anomalies that exist at installation or are caused by extreme duty cycles during service. With proper specifications and operational procedures, utilities have control over these risks. Studies show that nearly 40% of new cable system segments (cable and accessories) have at least one defect, and the first cable failure is most often caused by an installation defect. The intermittent erosion process associated with modern solid dielectric insulation deterioration can take years, sometimes decades, to lead to failure. However, once an aged cable fails, studies show it is 10 times more likely to fail again due to the extreme voltage transients caused by the fault, fault location and reenergization, and finally the errors introduced during the emergency repair. Studies show that the failure risk during the first few years of service can be lowered by 100 times with effective commissioning and repair of anomalies. By removing the biggest risk, installation defects, and protecting the cable system from extreme voltage with proper arresters (IEEE C62.22.1-2024) and from extreme loading with proper overcurrent protection, utilities can improve reliability by 100 times. Economic returns on this approach are in excess of 500%, and simple paybacks can be achieved in a couple of years when comparing upfront commissioning costs with fully burdened future failure costs, as sketched below.
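As a rough illustration of the payback logic, the sketch below compares an upfront commissioning cost against the fully burdened cost of avoided failures. Every dollar figure and failure rate is a hypothetical assumption chosen only to show the calculation, not data from the studies cited above.

```python
# Hedged simple-payback illustration; all inputs are hypothetical
# assumptions, not figures from the Life-Cycle Cost Guide.
CONSTRUCTION_COST = 5_000_000     # assumed project construction cost
COMMISSIONING_FRACTION = 0.02     # "a few percent" of construction cost
FAILURE_COST = 250_000            # assumed fully burdened cost per failure
AVOIDED_FAILURES_PER_YEAR = 0.2   # assumed failures prevented annually

commissioning_cost = CONSTRUCTION_COST * COMMISSIONING_FRACTION
annual_savings = FAILURE_COST * AVOIDED_FAILURES_PER_YEAR
payback_years = commissioning_cost / annual_savings

HORIZON_YEARS = 15                # assumed evaluation horizon
roi = (annual_savings * HORIZON_YEARS - commissioning_cost) / commissioning_cost

print(f"commissioning cost : ${commissioning_cost:,.0f}")
print(f"annual avoided cost: ${annual_savings:,.0f}")
print(f"simple payback     : {payback_years:.1f} years")   # 2.0 years
print(f"{HORIZON_YEARS}-year return     : {roi:.0%}")       # 650%
```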
Workforce Training Acceleration
According to a 2023 study by the Center for Energy Workforce Development (CEWD), over 60% of line workers have less than 10 years of experience. Experts who train splicers frequently find they cannot identify common defects. One cannot completely blame installers, since training is often inconsistent and defects (incipient faults) can easily pass legacy commissioning tests such as DC HIPOT and VLF withstand tests. These tests detect less than 1% of defects, and utilities using them have unknowingly been giving workers a false sense of security and reinforcing poor workmanship. The Utility Underground Life-Cycle Cost Guide’s recommended practice of an offline 50 or 60Hz PD test with 5pC sensitivity objectively addresses the training issue. This approach locates over 99% of insulation defects and provides instant feedback to installers on the job site, so they can learn what a defect “looks like” and develop functional repair techniques. Utilities can now deploy junior employees with confidence that an effective commissioning test will give them the feedback they need and protect underground cable investments from future O&M costs and revenue loss.
Why the Offline 50 or 60Hz PD Test Specification?
The Utility Underground Life-Cycle Cost Guide recommends an offline 50 or 60Hz PD test with 5pC sensitivity because of the efficacy demonstrated over the last couple of decades in large-scale industry studies, from both a reliability and a return-on-investment point of view. Other specification options have been considered, but most are less than a 10% solution as compared to the cable and accessory manufacturers’ standards (Table I) and are thus not recommended. However, there is sometimes confusion about why a strict specification is needed, leading some cable owners to consider lower cost damped AC (DAC) or very low frequency (VLF) voltage sources, or a test with a detection sensitivity in the range of 100pC, without understanding the implications.

Table I: Field tests are limited to the level of system overvoltage protection (2 to 2.5 U0)
Test Voltage Frequency Implications
PD is a “micro-arcing” that does not bridge the insulation, and its behavior is sensitive to the frequency of the test voltage source. All manufacturers’ standards (Table I) require the cable be energized with a continuous 50Hz or 60Hz AC voltage, often applied for longer than 10 seconds but shorter than a minute. Testing at power frequency allows cable owners to observe the system under typical operating frequency conditions. The frequency of the test voltage source is a direct factor in a defect’s turn-on voltage, or PD inception voltage (PDIV), and its turn-off voltage, or PD extinction voltage (PDEV). PDIV and PDEV performance and measured values are the basis of all design and production quality control tests for modern power cable system components. VLF solutions, which test at very low frequencies (0.1 Hz to 1 Hz), act more like DC than power frequency and are unlikely to excite PD in many cases. Damped AC and cosine-rectangular voltage sources, with a controlled polarity reversal slope change, rely on charging the cable system with DC first, followed by a short impulse that is only a fraction of one 50/60Hz cycle. The DC associated with these voltage sources introduces so-called “space charges” in the insulation, causing a significantly different electric field distribution at the defect location. Additionally, the resulting impulse is less than a full 50/60Hz cycle, with frequency content typically in the hundreds of hertz. In summary, performing a PD test below 1Hz, charging with DC first, testing at hundreds of hertz, or testing for less than the typical standard duration (10 to 60 seconds) produces conditions that can shift the PDIV by 150 to 200% and render the test results erroneous and impossible to compare to industry standards.

Pre-detection Noise Removal
When manufacturers perform PD tests, the standards require each component be placed in an electrically shielded room (typically steel walled) that provides a radio frequency (RF) free environment prior to checking for PD, effectively providing “pre-detection noise removal.” In the field, the manufacturers’-standards-equivalent test emulates the shielded room by removing background RF noise through an assemblage of techniques and processes. This starts with a high-frequency measurement connection to the medium voltage cable, which is of course designed for power frequency (50/60Hz) operation. This is followed by a broadband signal coupler capable of decoupling the high-frequency signals while the cable is operating at power frequency during the test. The system digitizes the data at a rate high enough to support PD charge measurement, and finally the captured digital data is processed through advanced iterative digital signal processing (DSP). Together, these elements effectively replicate the “pre-detection noise removal” of the shielded room while preserving signal integrity for PD detection.

Most field PD tests, by contrast, employ generic connection techniques and couplers, often using the same equipment model common in low noise laboratory settings. In the field, this approach attempts to detect PD first and only then uses DSP to remove the remaining noise, effectively post-detection noise removal. This practice is significantly less sensitive, typically achieving only 100 to 200pC sensitivity and missing over 90% of cable defects as compared to the 5pC requirement of the cable and accessory manufacturers’ standards (Table I).
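As a rough illustration of the pre- versus post-detection distinction, the following minimal sketch removes two narrowband RF carriers from a synthetic coupler signal before any pulse detection is attempted. The sample rate, carrier frequencies, amplitudes, and filter settings are invented for the demonstration and do not represent any vendor’s implementation or the full iterative DSP described above.

```python
# Minimal sketch of "pre-detection" narrowband noise removal on a synthetic
# digitized coupler signal; all parameters are illustrative assumptions.
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 100e6                      # assumed sample rate, 100 MS/s
t = np.arange(200_000) / FS     # 2 ms record

# Synthetic signal: a small PD-like pulse buried under two strong
# narrowband RF carriers (e.g., broadcast interference).
pd_pulse = np.exp(-((t - 1e-3) ** 2) / (2 * (20e-9) ** 2))  # ~20 ns pulse
rf_noise = 8 * np.sin(2 * np.pi * 1.2e6 * t) + 6 * np.sin(2 * np.pi * 5.0e6 * t)
signal = pd_pulse + rf_noise + 0.05 * np.random.randn(t.size)

# Pre-detection removal: notch out the dominant carriers BEFORE any
# pulse detection threshold is applied.
cleaned = signal.copy()
for f0 in (1.2e6, 5.0e6):       # carriers assumed identified from the spectrum
    b, a = iirnotch(w0=f0, Q=30, fs=FS)
    cleaned = filtfilt(b, a, cleaned)

print("peak before cleaning:", np.abs(signal).max())   # dominated by RF
print("peak after cleaning :", np.abs(cleaned).max())  # PD pulse emerges
```

In this toy example the carriers are an order of magnitude larger than the pulse, so a detector applied to the raw signal would see only RF noise; removing the noise first is what makes the small pulse observable at all.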
The net result of VLF or DAC PD tests, which combine a non-standardized voltage source with a generic signal detector, is typically less than a 10% solution, and thus produces less than 10% of the reliability performance value and return on investment.

Example defects missed by a non-standardized PD test: cable defect (top), splice defect (bottom)

Comparison Case Studies and Results
Table II presents 13 case studies in which controlled, standardized offline 50 or 60Hz PD tests were compared to experimental VLF or DAC PD tests. In all 13 cases, the VLF or DAC PD test produced erroneous results. These are just a few cases, but they should give the reader sufficient comparison to understand why PDI2, American Clean Power (ACP), and the IEEE Std 1185 recommended practice specify a standardized offline 50 or 60Hz PD approach to commissioning.

Table II: Case studies comparing standardized offline 50 or 60Hz PD tests with experimental VLF and DAC PD tests
Conclusion
Utilities installing medium voltage lines underground are making massive investments. To lower costs and protect these investments against future O&M costs and revenue loss, an effective cable scanning commissioning test is recommended. The Power Delivery Intelligence Initiative (PDI2.org) report, the “Utility Underground Life-Cycle Cost Guide,” recommends an offline 50 or 60Hz partial discharge test with 5pC sensitivity. Best practice dictates that once a cable system passes this specification, reliable 40-year performance can be expected, provided no extreme, physically altering operational event damages it; if such an event occurs, another baseline scan is recommended. By deploying this specification for a few percent of construction cost, utilities can triple their cable life span, significantly reduce up-front costs, accelerate workforce training, reduce future operating costs and revenue loss, and improve reliability and safety by over 10 times.
