Within the telecom industry, among others, we are seeing many inspection companies begin to use drones for inspections prior to a tower climb. Used correctly, drones are tremendously beneficial, allowing climbers to understand a tower’s condition in detail from the comfort of the ground. They are becoming an essential tool for tower climbers and asset owners to accelerate inspection schedules, reduce costs and improve safety. As someone heavily invested in unleashing the power of drone data, I am extremely optimistic about this growing part of our industry as enterprises embrace drones for inspections.
It goes without saying that aviation in any form can be a dangerous business. Of course, the level of danger varies depending on whether you are flying an airliner, a fighter jet, an attack helicopter, or a small commercial drone for inspections, but several common factors contribute to reducing the risk of any drone flight. The FAA provides extensive guidelines for commercial drone safety, but simply following these minimum guidelines isn’t enough to ensure safe operations. While a pilot’s skill level is very important to safe drone flight operations, it’s not as critical as the organization’s approach to training, supervision and, most importantly, creating a culture of learning from mistakes in a constructive manner.
A few years ago, I witnessed first-hand how organizational mismanagement can lead to a drone accident. I was talking to a drone pilot working for a local drone service provider (DSP), who had been tasked at short notice, at the end of a long day, with capturing imagery of a self-support tower. Crucially, while travelling between tasks he had failed to charge his drone remote control, and he started his flights with only one-third of its battery power remaining.
The electric utility industry is also adopting drones for inspections, requiring rigorous safety standards
He began collecting the required inspection imagery, but started to feel time pressure as the evening twilight progressed. His final serial required him to collect drone imagery of a sector on the opposite side of the tower. The first battery alarm triggered on his iPad as his remote hit 10% power; then the drone itself reached 20% battery, triggering a second alarm. Unfortunately, he assumed both alarms referred to the drone battery, not the remote. Judging that he had enough power to capture the remaining images, he pressed on. Approximately two minutes later, his remote shut down, and the drone began its pre-programmed Return-To-Home (RTH) mission. The drone had been programmed to RTH at its current altitude, so it turned directly towards the take-off position. Because it was on the opposite side of the tower from the pilot, tracking home meant flying towards the structure. In a state of panic, the pilot could only watch as his drone got within inches of the cell tower before its obstacle detection recognized the self-support framing. The drone initiated an avoidance maneuver and climbed towards a beam that lay in an obstacle-avoidance blind spot. The drone’s props contacted the beam, and it immediately began an uncontrolled tumbling descent from approximately 75 feet. Fortunately, it impacted open ground, but it came very close to landing on top of the ancillary equipment in the tower compound itself.
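The failure mode here is easy to reason about in code. The sketch below is purely illustrative (it is not any manufacturer’s actual firmware logic, and the function names, units and thresholds are hypothetical); it contrasts an RTH setting that returns at the drone’s current altitude with one that climbs to a pre-set safe altitude first, which would have carried the drone over the tower rather than into it.

```python
import math

def bearing(a, b):
    """Compass bearing in degrees from point a to point b (x, y in meters)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def rth_plan(drone_pos, drone_alt_ft, home_pos, mode, safe_alt_ft=0):
    """Return a simple RTH plan: heading home, and what altitude to transit at.

    mode="current" -> return at the current altitude (the incident setting):
                      any structure on the direct line home is a hazard.
    mode="climb"   -> climb to a pre-set safe altitude, above the tallest
                      known obstacle, before turning for home.
    """
    if mode == "climb":
        transit_alt = max(drone_alt_ft, safe_alt_ft)
    else:
        transit_alt = drone_alt_ft
    return {"heading_deg": bearing(drone_pos, home_pos),
            "transit_alt_ft": transit_alt}

# The incident: drone at 75 ft on the far side of the tower from home.
# "Current altitude" RTH flies it straight at the structure at 75 ft;
# a climb-first setting (e.g. a hypothetical 250 ft) would clear it.
print(rth_plan((100, 0), 75, (0, 0), mode="current"))
print(rth_plan((100, 0), 75, (0, 0), mode="climb", safe_alt_ft=250))
```

The point of the sketch is that the safe behaviour is a configuration choice made before take-off, not a piloting skill exercised in the moment, which is exactly why RTH programming belongs in training and SOPs.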
A “Just” Safety Culture includes training, SOPs, professional supervision and a learning environment
Safety Management Systems – The “Just” Culture
What lessons can be distilled from such a drone safety incident? It would be easy to rule that the cause of this drone accident was solely pilot error or incompetence. If this were our only conclusion, we would be ignoring a host of other issues and, more importantly, setting the organization up to repeat the failure in the future. The example shows that significant change must occur in the organization if it is to build a safety-focused program for managing drones for inspections. But it is not enough to create rules, impose processes, and run training programs. The biggest contributor to a long-term drone safety program is a cultural shift in attitude to mistakes, where an organization’s immediate reaction to an incident is not to look for blame, but to seek out the lessons and better prepare the team so it does not occur again. This requires an open and honest system of incident reporting, where those who make mistakes are supported, not punished, when they come forward after a safety occurrence. This “Just” culture should result in lessons identified that improve pilot training and standard operating procedures (SOPs), so they are constantly evolving to reflect best practices and prevent repeat safety incidents, occurrences, and accidents. In most modern air forces, accident investigators break crashes down into causal factors, with an emphasis on forensically identifying the contributing factors and the organizational changes that could stop a future occurrence.
Optelos Turnkey Drone Inspection Services follow strict safety protocols to create a “Just” Safety Culture
So, let’s break this example down into its three main failings:
- Pilot Training and Knowledge Deficiencies. In this situation, the pilot was Part 107 qualified, but his formal training was restricted to the technical aspects of capturing drone imagery. He had not been formally trained on drone RTH logic, nor on procedures to correctly program safe, mission-appropriate RTH settings. He also had not been trained on the basics of how obstructions can mask the control-link signal from the remote to the drone, or on the circumstances in which RTH can be initiated by remote control failures.
- Supervision Culture – SOPs and Flight Planning. The pilot was dispatched on this drone inspection without a plan that had been approved by company management, and management was not present during the flight. In short, execution was delegated entirely to the pilot, with no direct supervision. He was neither provided with cell-tower-specific checklists prior to the drone flight, nor did he have any formal SOPs to provide a framework for how to conduct flight profiles when operating drones for inspections. It is worth emphasizing that supervision does not need to be physically present: it can be done remotely if the correct procedures are in place, or through company-approved SOPs. Supervision should also cover important human factors, such as crew rest periods and whether the crew is too fatigued to fly, as well as the environment in which the drone pilot is operating. Will they be comfortable enough to operate drones for inspections safely in extreme temperatures? Do they have enough light?
- Supervision Culture – Time & Task Pressure. It would be fair to say that the pilot put himself under pressure to achieve the task, and he should have recognized he was beginning the flight in an unsafe condition, with batteries in a poor state and insufficient for the task. That being said, management was perfectly aware of the flight profile the drone inspection task dictated, and that the task had been added to the end of an already full day. The task was also unusual in that the flight profile was around a vertical structure, which carries many more risks than a typical inspection. A good supervisor would have recognized this and directed him to execute the flight the next day, with a fresh set of batteries, a clear head, and minimal pressure. Task pressures can often override immediate safety concerns in all walks of life, but in aviation disciplines it is especially critical to maintain a strong culture of safety. Professional supervision could have mitigated the safety incident in this case.
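Much of the supervision described above can be encoded as a simple go/no-go gate that is run before every flight. The sketch below is a hypothetical example of such an SOP check, with made-up thresholds chosen only to illustrate the idea; a real operator would set these from their own risk assessment. Note that it treats the remote’s battery as seriously as the drone’s, which is exactly the check that would have scrubbed the incident flight.

```python
# Hypothetical pre-flight go/no-go gate. All thresholds are illustrative,
# not taken from any regulation or manufacturer manual.
MIN_DRONE_BATT = 0.80    # start vertical-structure work with near-full packs
MIN_REMOTE_BATT = 0.50   # a dying controller is as dangerous as a dying drone
MAX_SHIFT_HOURS = 10     # crude fatigue proxy; real SOPs would be richer

def preflight_gate(drone_batt, remote_batt, hours_on_shift, twilight):
    """Return a list of reasons to scrub the flight (empty list = go)."""
    issues = []
    if drone_batt < MIN_DRONE_BATT:
        issues.append(f"drone battery {drone_batt:.0%} below minimum")
    if remote_batt < MIN_REMOTE_BATT:
        issues.append(f"remote battery {remote_batt:.0%} below minimum")
    if hours_on_shift > MAX_SHIFT_HOURS:
        issues.append("crew duty time exceeded")
    if twilight:
        issues.append("insufficient light for visual line of sight")
    return issues

# The incident flight: one-third remote battery, end of a long day, fading light.
print(preflight_gate(drone_batt=0.9, remote_batt=0.33,
                     hours_on_shift=11, twilight=True))
```

Whether the gate lives in software or on a laminated card matters far less than the cultural rule behind it: if any line of the checklist fails, the flight waits.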
Skydio autonomous drone conducting cell tower inspection
There is a well-worn phrase in Royal Air Force flight safety culture: if you put enough blocks of Swiss cheese together, eventually the holes will line up and something can pass straight through. It is a cumbersome analogy, but it illustrates the point that accidents occur when the gaps in an organization’s safety management policy line up. Drones for inspections are not inherently prone to accidents; the self-support tower crash occurred because the pilot was deployed to complete a task with inadequate training, no SOPs, and poor supervision. Management at the DSP pressed the pilot to fly the cell tower with little appreciation of the risks the drone inspection task entailed. The holes were numerous, and the outcome inevitable, whether today or tomorrow. Accidents will happen if this culture continues unchecked.
An Example of Open & Honest Reporting (and a donut penalty)
In my career with the Royal Air Force (RAF), I had a safety occurrence while flying an MQ-9 Reaper UAV on an operational mission in Afghanistan. I was turning toward a potential target and had to pull up a menu screen to change the aircraft’s weapon configuration. The menus were accessed through a mouse and keyboard, but shortcuts were available via the function keys. In this case, all I had to do was press F1, F2 and F3 in that order, then hit ENTER, to make the required setting. This was a technique taught in training and one I had performed successfully hundreds of times, both in the simulator and while airborne. On this occasion, in a combat-stressed and dark Ground Control Station at 0300 in the Nevada Desert, I made a mistake and pressed the F2 key first, followed by F3, F4 and ENTER. Imagine my surprise when the aircraft’s undercarriage began to deploy. In this incident, there were no safety consequences beyond the embarrassment it caused me and a slight delay to an operational outcome. The gear was raised, and no limits were exceeded. But the point is that it could have been worse: if the aircraft had been traveling 20 knots faster, or if the tactical situation had been more pressing, my error could have had a more severe result. Thankfully, my organization had a mature approach to such mistakes, and the process that followed was well rehearsed in ensuring the correct response. First, I completed incident paperwork covering the specifics of what occurred. Soon a Report Signal was circulated to the whole force describing what had happened, along with management’s recommendations to stop a recurrence. It was mandatory reading for all other pilots and all management. Training was adjusted to stop others making the same mistake, and many approached me afterwards to say they had been performing the same procedure, unaware of the risk of what could happen.
The incident was even used to inform the aircraft design team of the ergonomic issue with the keyboard shortcuts. Of course, I was also gently ridiculed in the Squadron bar and forced to purchase a box of donuts for the next shift, but the point is that the whole system was improved slightly as a result of my error. Over time, those errors and lessons from multiple pilots add up, leading to improved safety and outcomes for all by establishing and supporting a healthy safety culture. In summary, a “Just” safety culture is an important part of developing a drone program. Such a culture has its pedigree in manned aviation, but the practices transfer to those trying to establish viable long-term use of drones in any organization. Of course, telecom drone flights will not require such an extensive network of reporting and safety management as the military example, but with a small investment in time, training, and safety management, your organization will be better prepared for long-term drone use, will have a lower incident rate, and will earn a reputation for trouble-free operations when using drones for inspections.
USAF MQ-9 Reaper
Safety is critical for everyone who touches towers, and it requires commitment at all levels. Drones will continue to be a great tool for your organization to reduce costs and risk if well supported by an open and honest culture of continuous improvement. If this culture already exists broadly within your organization, and the effort is made to extend it to drones for inspections, you will experience increasingly safe operations through a “just” safety culture supported by training, professional supervision, documented standard operating procedures and shared learnings.
Director of Client Programs at Optelos
Former RAF helicopter and UAV pilot