Drones are being used for overhead line inspection worldwide. They can cover a large area quickly, taking thousands of pictures and collecting gigabytes of video, spectral, and thermal data in a single flight. To take advantage of that visual power, Avitas Systems, a GE venture, created a platform that analyzes inspection data in real time, applying artificial intelligence to identify risk-based insights.
This innovative solution joins GE’s advanced non-intrusive inspection services for grid assets and complements the Asset Lifecycle Management Services offering, helping customers boost profitability, improve safety and compliance, and increase asset and operational efficiency.
The electrical grid in the United States is one of the largest, most complex systems ever built. There are over 200,000 miles of high-voltage transmission lines and nearly 6,000,000 miles of distribution lines. These interconnected lines link residential, commercial, and industrial users to thousands of generating stations of every type, from coal, oil, and gas to nuclear, hydroelectric, wind, and solar. This massive “system of systems” requires computers and millions of lines of software code to monitor and control the safe, uninterrupted delivery of electrons from one end of the system to the other. Inspecting and maintaining millions of miles of lines takes thousands of inspectors, who rely on expensive and hazardous methods, such as helicopter flights, to cover many of those tasks while keeping costs down and minimizing risk to people.
Historically, utilities have operated manually, by means of paper forms. In a typical manual process, inspectors fill out paper forms as they walk every inch of accessible lines and fly every inch of remote, hard-to-reach lines and assets. The data they collect generates more forms, including work orders that direct other teams to repair lines, replace components, and trim vegetation. Those workers in turn produce reports confirming the work is done, further work orders, and so on. This tedious manual process has seen few improvements over the years. Philip Schoonover II, Sr. Staff Solutions Engineer, notes, “Utilities have spent millions improving the overall process with some success, but still rely heavily on people in the field to collect the data, understand the issues, determine a course of action, and direct teams to perform the work needed.”
Only recently have utilities begun exploring a new “system of systems” to meet the industry’s challenges worldwide. This entails making the leap from manual to automated, from analog (pen and paper) to a fully digital twin that is monitored, assessed, and acted on by software-driven tools, artificial intelligence, and complex machine- and deep-learning algorithms. The algorithms capture the inspectors’ decades of accumulated experience.
Picture 1: Hype curve. Drones, AI, and machine learning are coming fast. Source: Gartner 2017
Gartner’s hype curve shows that machine-learning and deep-learning algorithms are at peak interest now and will soon be part of everyday use. In terms of adoption, drones are just ahead of them, and edge computing is close behind. This makes sense: as we learn how to use the analytics, we also learn that they can be applied to great advantage on the drone platform, driving intelligent decisions about what to inspect and what actions to take in real time or near real time.
As a total cost of ownership proposition, robotic inspection and maintenance combined with Artificial Intelligence (AI) presents a true disruption in the industry.
The most critical part of the process is data collection. Gathering multiple modalities of data at once democratizes knowledge of the grid’s physical state across all departments, from inspection and maintenance to engineering, vegetation management, fire hazard, emergency management, and loss prevention.
Schoonover II explains, “Drones are the primary platform of choice for data collection. They can carry a wide range of sensors, storage, communications, and, importantly, compute power. The advanced AI analytics can run ‘on the edge,’ locally on the drone, integrated with highly automated flight management software and sense-and-avoid software that allows the drone to fly autonomous routes [Figure 1(a)]. The drone can inspect assets, assess them against large databases, and, with finely tuned algorithms, make in-flight decisions to inspect in more detail [Figure 1(b)], deviate to additional assets, call out immediate repair teams, or report dangerous or inefficient conditions in the network based on complex rules engines and years of fused data.”
Figure 1: (a) A drone follows a pre-programmed flight path. (b) The drone uses image analytics to identify an abnormal condition and deviates from its pre-planned path to further investigate and document the anomaly.
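The in-flight decision-making described above can be pictured as a small rules engine running on the drone. The sketch below is purely illustrative: the defect names, confidence thresholds, and action labels are assumptions for this example, not Avitas Systems’ actual logic.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One finding from the on-board (edge) analytics."""
    asset_id: str
    defect: str        # e.g. "cracked_insulator", "corrosion" (illustrative)
    confidence: float  # 0.0-1.0 score from the on-board model


def next_action(d: Detection) -> str:
    """Map an edge-analytics detection to a flight decision.

    Thresholds are invented for illustration; a real rules engine would
    fuse many more signals and years of historical data.
    """
    if d.confidence >= 0.9 and d.defect in {"cracked_insulator", "broken_conductor"}:
        return "call_out_repair_team"     # immediate dispatch
    if d.confidence >= 0.6:
        return "deviate_for_detail"       # fly closer and re-image the asset
    if d.confidence >= 0.3:
        return "flag_for_cloud_analysis"  # upload frames for offline review
    return "continue_route"               # nothing actionable; stay on plan
```

A high-confidence cracked-insulator detection would trigger an immediate repair call-out, while a weak hit is merely flagged for later cloud-side review.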
Sensor data that isn’t immediately needed for analysis can be transmitted to local or cloud assets, where far more powerful computing platforms develop 3D digital baseline models and Light Detection and Ranging (LiDAR) point clouds; analyze multi-spectral (3–10 bands), hyper-spectral (hundreds or thousands of bands), infrared, and UV/corona data; and produce detailed reports. The data and metadata can be used to verify existing Geographical Information System (GIS) information against the latest GPS measurements and, finally, to make advanced predictive estimates of asset health, environmental conditions, and impacts on the overall network, such as corrosion, foundation washout, or damage from approaching storms. Network segments likely to see a storm can be inspected beforehand for integrity and to create a baseline. After a storm, the drones can retrace their steps and use the on-board analytics to discern changes in the assets and create a response request within hours of the storm passing. This directs emergency response teams to the most critical parts of the network and tells operators what equipment to send for the repair.
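The baseline-then-retrace workflow above boils down to comparing a pre-storm measurement with a post-storm one for each asset. A minimal sketch, where the asset names, lean-angle measurements, and tolerance are invented purely for illustration:

```python
# Hypothetical per-asset measurement (e.g. pole lean angle in degrees)
# derived from drone imagery before and after a storm.
baseline   = {"pole_17": 1.2, "pole_18": 0.8, "tower_5": 0.3}
post_storm = {"pole_17": 1.3, "pole_18": 6.4, "tower_5": 0.4}

TOLERANCE_DEG = 2.0  # assumed threshold for raising a response request


def changed_assets(before: dict, after: dict, tol: float) -> list:
    """Assets whose measurement shifted by more than `tol` --
    candidates for an emergency response request."""
    return sorted(a for a in before if abs(after[a] - before[a]) > tol)
```

Here only `pole_18` shifted beyond the tolerance, so only it would generate a response request, directing crews (and the right equipment) to that segment first.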
Picture 4: The Avitas Systems Platform network view with Points of Interest (POI).
Picture 5: Identifying vegetation encroachment using LiDAR, with color coding to show severity.
Using advanced image analytics, Avitas Systems can automatically identify asset components, enabling as-is versus as-designed comparisons for parts such as avian guards, pole caps, strain and pin insulators, transformers, and capacitors on distribution poles, as well as insulators, spacers, and vibration dampers on transmission towers. Other defect identification capabilities include corrosion detection and tracking over time, detection of foreign objects such as birds’ nests, and thermographic fault detection from overlaid inspection IR imagery.
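At its simplest, the as-is versus as-designed comparison is a set difference between the components the analytics detected and those in the design record. The component names come from the article; the single-pole scenario is an illustrative assumption:

```python
# Components the design record says should be on one distribution pole...
as_designed = {"avian_guard", "pole_cap", "pin_insulator", "transformer"}

# ...and components the image analytics actually detected on it.
as_is = {"pole_cap", "pin_insulator", "transformer", "birds_nest"}

missing = as_designed - as_is  # expected but not found (possible loss or damage)
foreign = as_is - as_designed  # found but not expected (e.g. a birds' nest)
```

A missing avian guard would surface as a maintenance item, while the foreign birds’ nest would be routed to the appropriate removal workflow.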
The use of drones will continue to evolve. In the U.S., the FAA and NASA are working to create a traffic management system for drones that will allow Beyond Visual Line of Sight (BVLOS) operation, reducing operating costs while allowing resident drones to remain at a facility and be dispatched automatically every day to do “rounds” in power stations, on top of towers, or even on distribution poles. Simple daily visual inspections do not need large, expensive drones, and the edge analytics can direct more detailed inspections by more capable drones when needed.
“Highly capable, advanced drone platforms, combined with autonomous flight management software and sense-and-avoid technology, performing real-time inspections with multiple modalities of sensors and assessing the data stream using the most advanced edge analytics, are poised to create a disruptive shift in the operation of transmission and distribution lines across the US and around the world,” concludes Schoonover II.
To inspect vertical assets, such as transmission towers, a specialized autonomous aerial inspection system has been developed. It integrates rotary-wing aerial robotics, sensor technology, and software applications.
This system provides an intuitive, software-based interface for inspection planning and data collection through 3D modeling. Inspectors can select points of inspection (POI) on digital 3D models of a vertical asset. Users simply select a POI on the model, change the perspective of the model to define the sensor angle, and indicate the size of the resolvable defect (e.g., a crack of 1 mm, a corrosion patch of 2 sq. inches) by extending the POI, which translates into the Unmanned Aerial Vehicle’s (UAV’s) standoff distance. This point-and-click method reduces inspection planning time from hours to five or ten minutes. The system autonomously converts this 3D modeling, integrated with existing requirements, into flight paths for data collection.
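The translation from resolvable-defect size to standoff distance can be modeled with simple pinhole-camera optics: the ground sample distance (meters per pixel) grows linearly with range. The function below is a sketch under that assumption; the camera parameters and the three-pixel coverage rule are illustrative, not the actual system specification.

```python
def max_standoff_m(defect_size_m: float, focal_length_mm: float,
                   pixel_pitch_um: float, pixels_across: int = 3) -> float:
    """Largest camera-to-asset distance at which a defect of the given size
    still spans `pixels_across` pixels, under a pinhole ground-sample-
    distance model: GSD = range * pixel_pitch / focal_length."""
    gsd_needed = defect_size_m / pixels_across  # required meters per pixel
    return gsd_needed * (focal_length_mm * 1e-3) / (pixel_pitch_um * 1e-6)


# Example (assumed camera): a 1 mm crack, 50 mm lens, 3.45 um pixels.
standoff = max_standoff_m(0.001, 50, 3.45)
```

With these assumed optics the drone must image the POI from within roughly 4.8 m to resolve a 1 mm crack, which is why extending the POI (a larger resolvable defect) relaxes the standoff distance.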
Human and asset safety is a top priority. The 3D model allows UAVs to navigate around obstacles so crash risk is avoided. Customers can define no-fly zones of any shape and size. No-fly zones are three-dimensional, meaning their length, width, and height can each be adjusted. For example, if the height of the no-fly zone is ten meters and the minimum safety standoff distance is seven meters, the UAV will fly at an altitude of at least 17 meters over the no-fly zone.
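The altitude rule in the example above is simple arithmetic: required altitude over the zone equals zone height plus the minimum safety standoff. A minimal check, using the article’s ten-meter zone and seven-meter standoff:

```python
def min_altitude_over_zone_m(zone_height_m: float, standoff_m: float) -> float:
    """Lowest altitude permitted directly above a no-fly zone:
    the zone's height plus the minimum safety standoff distance."""
    return zone_height_m + standoff_m


def clearance_ok(uav_altitude_m: float, zone_height_m: float,
                 standoff_m: float) -> bool:
    """True if the UAV's altitude over the zone satisfies the rule."""
    return uav_altitude_m >= min_altitude_over_zone_m(zone_height_m, standoff_m)
```

For the article’s numbers, a 10 m zone with a 7 m standoff yields a 17 m floor, so a UAV at 18 m passes the check and one at 15 m does not.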
The UAVs are equipped with a triple-redundant IMU (inertial measurement unit, a complement to GPS) and GPS receiver to maintain precise flight control in case of failure, which ensures mission robustness. Additionally, safety-critical communication is separated from mission-critical communication: independent communication systems ensure that, in case of unforeseeable circumstances, the operator on the ground can take control with 100 percent certainty.
While this system is entirely autonomous, it provides continuous situational awareness throughout the mission, including forward simulation, which compares the system’s intended action with the original plan so the operator can take corrective action if there is a discrepancy. The system also has live-feed capabilities, so operators receive real-time updates from the UAV’s camera.