Yesterday I lost a multiplayer match of StarCraft II because my computer turned itself off to install automatic updates. To be honest, it ultimately was my fault - it had been needling me to do so for days, but I'd kept clicking "postpone." Installing automatic updates requires rebooting the computer a few times, and I dislike rebooting unnecessarily. The problem is that when it finally happened, it came as a complete surprise. The computer restarted after a fifteen-minute countdown, but the countdown window popped under my game rather than over it. Since I was busy trying desperately to find a tactic that would end with my opponent not standing atop the broken remnants of my base, my first indication of its presence was when programs started shutting down of their own accord.
Without giving me a chance to save, either.
Over the last few decades our civilization has been computerized to a great degree, and this is a trend that's only going to gain strength in the coming years. We're at the point now where armed military robots are no longer science fiction, but are being used in theatres of operation at this moment. If that's going to continue to be the case, we can't just let our current operating procedures stand. My issue was minor - it meant my opponent could find someone else to smash sooner, and I lost a few points on the division ladder - but it arose from the computer's programming telling it to embark on an action unless that action was specifically countermanded.
A human's attention is important in a serious situation, but humans can only divide their attention so much - and sometimes, be it through information overload or programming defect, the computer's message to the operator may be missed. If we're not careful, this could lead to a future where automated systems put dangerous wheels in motion because no human contradicted them in time. Things are stressful enough without a ticking clock, especially one that has no reason to be there.
This is the sort of thing that creeps in, a bit at a time. No one's ever going to sit down and say "you know what? Let's make it so that computers decide what we're going to do." No, they'll automate control of low-priority systems, always intending to keep humans in the loop, but as time goes on and the organization paying the bills realizes that automated decision loops mean it can get by with fewer humans on the payroll, people may gradually but steadily get crowded out. It happens by degrees - the sort of transition that humans are worst at perceiving until it's too late, or nearly so, to do anything about. I've heard stories of automated stock trading crashing a company's share price because of decisions some program made on its own - I can't recall now whether it actually happened or is just an urban legend, but the danger is there just the same. It would certainly make it harder to make those systems Three Laws-compliant.
Freedom of choice is one of the most valuable things in the universe. If we invest the course-choosing mechanisms of our civilization in machines that will act unless specifically instructed otherwise, well - at that point, whose civilization is it _really_?