The Usual Tech Ramblings


What a week. Most of this week has been dedicated to two things: the data center move, and the office move. Neither of which is a fun task, to say the least. Especially not when AT&T are jerking your chain on installs for office connectivity, voice services, and other fun items.

Monday and Tuesday evenings of this week were spent load testing the new servers in the new data center to ensure they could meet or exceed the performance of the servers we currently use. Unfortunately that didn’t start off too well, with terrible performance, which resulted in two further days of load testing on Thursday and Friday (which I couldn’t attend). They seem to have straightened out some of the issues now, and performance is at least on par with what we get in our current environment.

Thursday marked the start of the “big move”. The moving company dropped off a boatload of red crates and labels, and we all got busy tagging. Friday saw more tagging, and several trips between the new office and the old one.

Friday evening was where the weekend fun began. We had to move the phone system Friday night, as AT&T said they would be unable to diagnose any issues due to a yearly server maintenance window occurring on Saturday (sound like BS to you?). So we had to pull down three servers (a domain controller and two phone servers), as well as all the phone switching equipment. This went relatively smoothly, that is, until we got it all to the new office.

The company that had built out our server room had not ordered big enough cabinets, despite visiting our existing server room and seeing all of our equipment. When we got the phone servers to the new building (Dell PowerEdge 1850s, for the curious folks), we found that the posts in the cabinet were positioned wrong, and with everything bolted in, the servers stuck out the back, so the doors are currently sitting at the back of the server room. Apart from that minor hiccup, the rest of the phone migration went very smoothly.

Saturday rolled in with an early start of 7am (not too bad, I guess). When I arrived, I started powering down all the non-critical servers to get them ready, and once I’d been given the go-ahead by our data team, pulled down the last four servers for the move. I’d been given a deadline of getting the routing and firewall equipment to the new office by 9:30am for a 10am conference call with AT&T to cut over the Internet services to the new building (we needed to keep our IP addresses to reduce customer and vendor impact).

We got the firewall and routers in place and started building up the switching equipment, when I received a call from the AT&T tech who was waiting. He said the conference bridge hadn’t been opened, and that he was going to handle it directly instead. He’d have to get some configs confirmed with the ENOC in California, and he’d call back in 15 minutes. Well, while waiting, I carried on working on the switches.

Then I got a call from a second ENOC technician, who said he’d been waiting on the bridge for 15 minutes and was about to head home. When he asked what I was trying to do, he told me it’d take him all of 30 seconds, and he’d be done… By the time I had a remote connection to the router up, he said he was already done, and sure enough, I could see the world again (hooray).

Then it was back to cabling whilst waiting for our servers to arrive. An hour or so later, in the middle of racking a server, I received a third AT&T call regarding our long distance T1 services, reminding me that I should be on a conference call with them. So I jumped on with a tech who had no idea how our phone system worked, or why the install people had configured the T1 circuit the way they did. Two hours later, with four servers on the rack, they finally went with the standard T1 office move option (yes, I was racking servers whilst on the call).

All in all, it didn’t go too badly. The only “bad” things about the whole weekend were the cabinet and a couple of bad cables. I will probably post some pictures of the new office and the new server room once we’ve finished tidying it all up.