|Peter Bowditch's Web Site|
Security experts. Right!
April 20, 2014
Everyone is concerned about the security of corporate data. We all do regular backups (and test that they work and store them in a safe place), we restrict physical access to places where people shouldn't go, and we all have good password management policies. Don't we? Here are some stories about security.
The secure premises.
I used to do some business with a company that ran an installation with very fast and powerful computers. They shared the time on these computers among their customers. These days we would call them a cloud provider but back then they were called a "service bureau" or "timeshare facility". The system was used for such complex tasks as modelling and balancing telephone network traffic, simulation of nuclear explosions, producing three-dimensional maps of the universe for research into gravity and the Big Bang (these last two for physicists at universities), complex financial and economic modelling for both companies and government, and a lot of work for the national defence people (I could know who the client was, but not what work was done for them).
This was obviously a very secure facility. I had reason to go there often and every time I had to go through the routine of confirming I had an appointment with someone, producing identification, and being issued with a visitor's pass. This pass was not issued at reception, as it is in most places I go, but from the installation security office, although I had to check in at reception first. After getting the pass I could return to the reception area and wait for whoever I was meeting to come out and escort me into the secure area.
I was waiting one day, watching people pass through the door into the inner sanctum. This was one of the first places I had seen card key access used, as the technology was in its relative infancy. People would approach the door, swipe their cards, wait for the buzz and click, and then proceed to open the door. Someone arrived in an obvious hurry and leant on the door as he started to reach for the card reader. The door opened. Investigation showed that the lock was broken, and even though it made all the right noises when a card was swiped the door was never actually locked. The wonderful, and worrying, thing was that there was no way to know how long it had been broken. The system obviously recorded every successful (and failed) card reading, but it had no way of recording that the door was opened without a card. Everyone had always used the correct swipe, wait, enter protocol until this day.
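The gap here was architectural: the system logged card swipes but had no independent record of the door actually opening. As a purely illustrative sketch (the event records and function names here are my own invention, not anything that system had), a door-position sensor cross-checked against the swipe log would have exposed the broken lock the first time someone pushed the door open:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: match each door-open event against a recent
# successful card swipe. Openings with no matching swipe indicate the
# door opened without the lock doing its job.

def unexplained_openings(swipes, openings, window_seconds=10):
    """Return door-open events not preceded by a successful swipe
    within the allowed window."""
    window = timedelta(seconds=window_seconds)
    return [o for o in openings
            if not any(timedelta(0) <= o - s <= window for s in swipes)]
```

With only the swipe log to audit, every record looked normal, which is exactly why nobody could say how long the lock had been broken.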
To say that there were red faces around the office that day was a serious understatement. I also believe that if I hadn't been there (I was the only outsider waiting in reception at the time) then nobody outside the place would have known there was ever any problem.
The backup backup
A client company was using ACT! in Melbourne and had decided to install it in their Sydney office. As it was a very large database and data communication over telephone lines was slow at the time, the database was sent to Sydney on a CD. I installed ACT! in all the relevant places, installed the database onto their Novell server, trained some of the staff, and banked the cheque for my consulting fees.
A few weeks later I received a panic phone call. The company's IT experts had upgraded the server over the weekend and all the work the staff had done with ACT! for more than a month had disappeared. I told the distraught sales manager to immediately go to the server room, get the backup tape that had been created on the Friday night before the upgrade, and lock it in his desk so there was no chance it could be overwritten. I then leapt into my car to drive to their office, which was about as far as you could get from my office and still be in Sydney.
By the time I got there the mystery of the disappeared data had been solved - the IT people had somehow found the CD from Melbourne and for no reason that anyone could understand had copied the files on it over those on the server. The sales manager was saying unpleasant things about the IT people. Whew! At least it wasn't an ACT! problem or anything that was my fault. And we had the backup from before the disaster. Didn't we?
The backup system that they were using had two parts. The actual backing up and recording of the relevant tape labels for the various daily, weekly, and monthly backups was run as a service on the Novell server. Restoration and general management of what was on the backup tapes was done using a program running on a Windows workstation. I asked where I could find a workstation with the relevant software installed, one obvious place being the sales manager's computer. I was given the alarming news that it wasn't installed anywhere. The backup had never been tested - not just the part where you check that the tapes can be read, but the part where you actually restore files from them. Nobody was available that day to install it. I drove home. The sales manager considered asking his doctor for blood pressure tablets.
Recognising the urgency of fixing a problem that was preventing the entire sales staff from doing their jobs, the IT people finally got around to installing the restore and management software about three days later. An expert rang me to ask how he should use the program. I'd never seen it, but I knew the principles so I told him to put the Friday backup tape in the server's tape drive and the workstation program would then show the local machine's drives (including server drives) in one window and the tape contents in another, because that's how all programs like this work. He could then copy the relevant files from the tape to where they should be and everyone would be happy again. After a few minutes he admitted defeat and said he couldn't work out how to do all this. I got back into my car, where I could swear without offending anyone.
Ten minutes after arriving I had the restore program open in front of me (four minutes to walk to and from the server room to check that the right tape was in the drive, one minute to start the program, five minutes patting the sales manager on the shoulder to stop him crying). I navigated to the appropriate place in the tape directory for the ACT! backup and found the four or five files that had been backed up before the weekend's disaster. The problem was that the version of ACT! that they were using always had 21-22 files to hold the data and configuration. Most of the database, including the file that actually contained the base data for all customers, had not been backed up. There is only one reason that a partial backup like this happens - files that are open and in use are skipped. We got all available backup tapes from the other building on the site where such things were stored and checked them all - none of them had a full backup of all files. The sales manager curled up in a foetal position in the corner of his office.
This company was security-conscious, so all workstations had screen savers which came on after one minute of operator inactivity and which required a password to start using the machine again. The staff understandably found this tedious, and a habit had developed of people not bothering to turn their computers off when they went home if the screen saver was on - they would just switch off the monitor, and if they had been using ACT! all its files would still be open. I assume the same thing was happening to other systems the company used such as accounts but this wasn't my problem.
Five minutes' work with a program that I was unfamiliar with showed that the backup program on the server had been specifically configured to not back up open files. The default, the way the program worked if not instructed to do otherwise, was to defer backup of open files for a few minutes and after a few attempts like this to back the files up anyway. I can't comment on any other systems which might have been affected, but the worst thing that could happen to ACT! was that database indexes could be corrupted; I always forced a rebuild after restoring from backup anyway. The sales manager stood silently behind his desk with his fists clenched for a few minutes and then told me that he was going to have words with the IT people and that I was to invoice the company for all the time I had spent looking at the problem, including travel.
And the business this company was in - secure storage of documents, archival material, and data backups. To get to the backup tapes that were not in the server room we had to enter a locked warehouse, go into a separate locked enclosure, and open a safe.
What's that they say about cobblers' shoes and mechanics' cars?
The super secret passwords
I was supporting ACT! for a company that had offices all over Australia, so the obvious way to do things was to use remote control software to get at the machines outside Sydney. The company had an external IT support organisation looking after everything else, and these professionals had a remote control system in place. They begrudgingly allowed me to use it on the first remote installation I had to do, but I was then told that I had to find my own way to access the interstate machines because of the security risk I posed by being able to log in to their system. I switched to GoToAssist (which was a much better system anyway), but I thought it would be pointless and mildly insulting to point out that any system I installed could be used to uninstall their client software on the remote machines and thereby block the other support company. I wouldn't do this, because I have ethics, and I thought they would be aware of the potential problem anyway as they were so security conscious.
Several of the staff across the country used Handheld Contact to synchronise the ACT! data to their phones. Managing Handheld Contact is done through a secure web site protected by user name and password. Anyone with access to this site can configure the cloud system to send the data to their phone, so it is a very sensitive area: anyone who can get in can take a copy of the complete ACT! data belonging to a target company. This might take thirty minutes for the whole hack. To the legitimate user it will appear as some sort of glitch which can be easily fixed and blamed on events such as the Internet connection dropping out during synchronisation.
The user name is always the person's email address, so the password is the real protection. All of the users had been set up with the same password, and recent research by a security investigator has shown that the one chosen ("letmein") is among the fifteen most commonly used passwords in the world. Remember that this had been set up by people who thought that allowing a consultant access to remote computers was a security risk.
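The kind of audit that would have caught this takes only a few lines. This is a minimal sketch; the password set here is a tiny illustrative sample, where a real audit would use one of the published lists of most-common leaked passwords:

```python
# A small sample of passwords that regularly appear on published
# most-common-password lists, "letmein" among them.
COMMON_PASSWORDS = {"123456", "password", "letmein", "qwerty", "abc123",
                    "111111", "monkey", "dragon", "iloveyou", "admin"}

def flag_weak_accounts(accounts):
    """Given (email, password) pairs, return the accounts whose
    password appears on the common-password list."""
    return [email for email, pw in accounts if pw.lower() in COMMON_PASSWORDS]
```

Since the user name is just the email address, a single shared password off this list meant every account in the company was effectively wide open.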
And speaking of that remote control system, you got to it by a publicly available link on the support company's web site, and all you had to do was enter the correct password to get in. That password was "Pa55word".
Experts. You have to love them.
|Copyright © 1998- Peter Bowditch|
Logos and trademarks belong to whoever owns them