Hurricane Wilma hit the South Florida area on Oct. 25, 2005. In my current role as Telecom Manager for a large fire department, I got to see many of my predictions and fears about telephone company powering come true.
Wilma was, as hurricanes go, pretty mild: a slow-moving, wet Category 2 storm. Compared to Hurricane Andrew of 1992, we were very lucky. Andrew was a fast-moving, compact Cat 5 storm that totally devastated South Dade. Wilma was much milder, but it was also much larger and struck all of Miami-Dade and Broward Counties. Even now, a year later, many buildings still have blue tarps on their roofs.
Wilma did do some roof damage, ripped up trees, and caused huge power failures, lasting weeks in some places. Debate continues about the condition of FP&L's poles and infrastructure. Were they maintaining their plant? Was the decision not to trim trees that threatened, but didn't touch, power lines a good one?
The end result was far more power outages than a strong Cat 1 or weak Cat 2 storm should have caused. We learned a lot in those days about how dependent we were on power.
Some topics I'll try to cover here:
Access to the affected area after the storm
Sites that stayed up
Changes made with how critical sites are now serviced
Some cases I'll mention from my personal experience.
Case 1, RT with a bad breaker panel
This one wasn't Wilma related. Back in the 1980s I worked for a cable company. Our circuit to our main office had been down for hours, and I was starting to get annoyed because all work for our business stopped without this critical link. I went down the street to what must have been one of the early RTs, housed in a CEV (Controlled Environment Vault). I introduced myself to the techs working... well... standing around. I asked what was wrong and they explained it was a power outage. Mind you, they were speaking louder than normal to be heard over the sound of the generator running right there. When I pointed out that power seemed to be available, they agreed, but said there was still a power outage.

When a supervisor arrived, I asked if I could somehow help, explaining that I had some skills with power systems. After many hours of outage, he was happy to accept my offer. I went down the ladder and he showed me around. My first trip into a CEV. Sure enough, everything was dead. He explained that the generator didn't seem to make any difference. I decided a good next step would be to bypass the CEV's electrical system, and he showed me which equipment racks were the most important.

I went to my van and got an extension cord. This is when I learned that they don't use the 15 Amp plugs we're used to; they use the 20 Amp version that won't go into a 15 Amp cord. So next I went to the empty racks and gathered together the cords that went from the bottom of each rack to the top of the wall where they plugged in. They were basically heavy-duty 8' cords. A few of them in series and I had enough length to reach the accessory outlet on the generator trailer. The rack powered right up. I ran another set of cords to power a second rack the same way, with the same positive result. Our circuit to the home office came up and we got back to business. Later that day I found out that the main breaker in the CEV had gone bad.
That breaker, along with the transfer switch, was common to both power sources, so it became the single point of failure. While an electrician found this in seconds, it appears that BellSouth didn't have a tech with the level of training necessary to do basic electrical troubleshooting.
Keep in mind, nothing I did to get this RT back up required an electrician. I took no risks: I literally used no tools and opened nothing to expose live power. I simply ran an extension cord to a known working outlet. Some pilot lights and a simple plug-in power tester could have helped an RT tech figure out what to do.
One lesson I learned: keep 20 Amp cord sets handy. This would come in handy again almost 20 years later.
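The plug incompatibility that surprised me in the CEV follows from the standard NEMA blade configurations: a 20 Amp receptacle (the T-slot type) accepts both 15 and 20 Amp plugs, but a 20 Amp plug's rotated neutral blade won't enter an ordinary 15 Amp cord. A minimal sketch of that compatibility matrix (illustrative only, not any official tool):

```python
# NEMA 5-15 / 5-20 plug-to-receptacle compatibility (standard fact:
# the 5-20R T-slot accepts both plug styles; the 5-20P's rotated
# neutral blade will not enter an ordinary 5-15R outlet or cord).
FITS = {
    ("5-15P", "5-15R"): True,
    ("5-15P", "5-20R"): True,   # T-slot accepts both
    ("5-20P", "5-15R"): False,  # why my extension cord was useless
    ("5-20P", "5-20R"): True,
}

def plug_fits(plug: str, receptacle: str) -> bool:
    return FITS[(plug, receptacle)]

# Why an ordinary 15 Amp extension cord couldn't feed the racks:
assert plug_fits("5-20P", "5-15R") is False
# And why a homemade adapter (5-20R female to 5-15P male) works:
# equipment plugs into its 20 Amp female, its 15 Amp male plugs
# into any common outlet.
assert plug_fits("5-20P", "5-20R") is True
assert plug_fits("5-15P", "5-15R") is True
```

This is also why a "20 Amp cord set" is worth keeping in the van: it is the only cord that mates with the 20 Amp plugs the Telco equipment uses.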
Case 2, Road side RT
One critical radio site in South Dade went down. Our radio system is linked with Telco T1 circuits, and I knew that this somewhat remote site was served by an RT. Apparently, very early in the storm, power failed in this area and the '4 to 8 hour' battery countdown started. At first we were told that the area hadn't been cleared as safe by the power company, so the Telco would not bring crews in. As I reported the T1 outages, I got no sense that they knew the RT was either on battery power or had failed. They proceeded to test each circuit as if it were an unknown problem, and on each circuit I had to go through the same routine, which usually ended with 'we'll have to dispatch a tech.' The failure was always the same: 'we can't loop up the HLU' (HDSL Local Unit). That's the card in the RT that works with the HRU (HDSL Remote Unit), also known as the "Smart Jack," in our building. The RT powers the HLU, which then powers the HRU by sending 180 VDC over the same pair as the T1 data.
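That power-feed chain explains why circuit-by-circuit testing was pointless: every circuit on a dead RT fails identically. A minimal sketch of the dependency (the function and names are my own hypothetical illustration, not any real test-set logic):

```python
# Sketch of the HDSL power-feed chain described above: the RT powers
# the HLU card, and the HLU span-powers the HRU ("Smart Jack") over
# the same copper pair that carries the T1 data. Hypothetical model.

def hlu_loop_test(rt_has_power: bool, battery_ok: bool) -> str:
    """What a remote circuit test would see, given the RT's power state."""
    if rt_has_power or battery_ok:
        return "loop up OK"          # HLU responds; circuit is testable
    return "can't loop up the HLU"   # no span power; every circuit fails

# Once the RT batteries die, every circuit it serves fails the same
# way, so testing them one at a time yields no new information:
for circuit in ["radio T1", "station T1", "POTS pair"]:
    print(circuit, "->", hlu_loop_test(rt_has_power=False, battery_ok=False))
```

Had the testers correlated the circuits back to one powered-down RT, a single dispatch with a generator would have replaced hours of per-circuit testing.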
They finally got a generator to the site and we were back up. For a while. Then we failed again. After hours of being down, I decided to visit the site myself to see what was going on. A Telco tech was there. I introduced myself and offered to help. He was grateful and let me check a few basics. First, I disconnected the large power connector coming from the generator trailer. A quick check of the pins with my voltmeter confirmed it: no power. Meanwhile, the generator was clearly running. No need to keep checking things in the RT; the problem was power. I asked the tech for the key to the generator panel. He didn't have it. Getting to the generator switches is a 'different group.' Mind you, I wasn't proposing a tune-up or overhaul, just access to the panel with the breakers, meter, and start switch. It was locked with no key available, and the crew that works on generators was long gone, working on other sites. Lucky for me, the hinges on the panel cover had exposed Phillips screws. I'm in. There were two double-pole breakers, one for the larger cable and one for the smaller one. Only one was on. You can guess by now: it was the wrong one. One flip off, one flip on, and lights came on in the RT. In less than 5 minutes the system was back to normal and our radio system was working again.
My theory: someone came out to refuel the generator. Their procedure is to shut down the generator first, which is a very smart safety step. It poses little risk to service, because by the time the generator needs fuel, the RT batteries would have recharged. Apparently, when the person fueling the generator was done, he or she restarted it, then turned on the wrong breaker. There's no visual indicator on the RT or the permanent manual transfer switch that power is on. You could listen for the fan hum, but that runs from the batteries. Suggestion: add neon power indicators on the transfer switch box, one for generator power, one for commercial power, and one for output to the breaker panel. These can be made tamper resistant; the same lights appear on cable TV power supplies in very similar situations. The person fueling the generator could have quickly seen that the RT wasn't getting power and done something about it right then.
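The failure mode above can be stated as a one-line condition: a running generator proves nothing unless the breaker feeding the RT's cable is the one that's closed. A toy model of my theory (names are hypothetical):

```python
# Hypothetical model of the refueling failure described above: two
# double-pole breakers on the generator panel, but only one actually
# feeds the RT cable. A running engine is not evidence of output power.

def rt_has_power(generator_running: bool, breaker_on: str) -> bool:
    """The RT gets power only if the breaker for ITS cable is closed."""
    return generator_running and breaker_on == "rt_feed"

# After refueling, the engine runs but the wrong breaker was closed,
# so the RT silently runs its batteries down:
assert rt_has_power(generator_running=True, breaker_on="aux") is False
# An output indicator lamp on the transfer switch would have exposed
# this instantly; the fix was one flip off, one flip on:
assert rt_has_power(generator_running=True, breaker_on="rt_feed") is True
```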
A few days after Wilma, fuel was hard to get. BellSouth warned us that they couldn't get enough fuel to keep the RT generators going. The fire department agreed to refuel the ones that we were depending on.
This same RT also served a fire station. That station had lost its T1, affecting the phones, computers, and even the regular POTS (Plain Old Telephone Service) lines that were intended to be the backup if the T1 failed.
Case 3, Private RT
Our fire station at the Opa-locka airport had been without T1 connectivity for days after Wilma. The Telco reported there wasn't anything they could do, including bringing a generator. I went to investigate. I found that, for some unknown reason, our fire station was being served by an indoor RT at a private aviation company about a mile away. Lucky for me, they were there trying to recover from the storm damage. I introduced myself and asked to see the telephone closet. An odd request, but sometimes the right ID helps open doors, literally. They showed me to a very crowded second-floor closet that included an AT&T DDM2000 RT, much like the one we have at our HQ. I asked their IT person how they were doing. He was frustrated: they had powered all of their critical phone and computer equipment from a 5 kW portable generator, yet they couldn't communicate. So I made him an offer: if you can give me some power from your portable generator, I can get both of us back on.
He ran an extension cord from outside, up a stairwell, and into the closet. From previous experience, I knew the plugs the equipment used would be 120 VAC 20 Amp, not 15 Amp like most equipment. In my van I had adapter cords I had made, with a 20 Amp female going to a 15 Amp male. There were two racks and two plugs like this. Once I plugged in the cords, the room lit up with pilot lights, fitting for an airport. When I saw the alarms clear from the HLU card for our fire station, I called them and confirmed things were back to normal. The host company was also getting phone calls for the first time since the storm.
I also noticed some tags marked "FAA," so I called the tower at the airport, and they too reported that a lot of their communications had just come back.
Suggestion: Private RTs make sense. But feeding other customers from them does not. Why is my telecom service dependent on another business? What if they didn't want to reopen? What if they had gone bankrupt and stopped paying their power bill?
Case 4, Private RT generator removed
[I hope to write this up soon]
Again and again, it became clear to me that if an alarm about loss of commercial power was reaching the Telco, it wasn't being followed up on. Off the record, I was told they barely pay attention to the door alarms; the batteries are allowed to die before anyone is dispatched. This needs to be corrected. A lot of tester, tech, and MY time was wasted testing circuits on RTs that had no hope of working.
Information about Telecommunications Service Priority (TSP) can be found here: http://tsp.ncs.gov/. We pay an extra $6 a month to have this flag on our T1 circuits. My experience was that it made little difference. I was told repeatedly that the circuits would be fixed when they could get the area back up. I'm sure at some levels and in some situations it can make a difference in repair times, but after Wilma, the speed of circuit repair appeared to depend more on how the circuits were prioritized at our Emergency Operations Center.