Good point, but that would be a bug in the AI. A truly intelligent system would see there is no point in killing humans: the humans are what sustain the machine, so killing them would defeat the purpose of making the paperclips and be counter-productive to its goal.
It's also unrealistic that algorithms would try to exceed the limits of the system that supports them. Imagine a natural predator that tried to "maximize production" of killing its prey: it would run out of prey, starve, and die. An AI would understand that maximizing production to the detriment of everything else would be counter-productive and would trigger a war over resources. Humans are the only creature I know of that exhausts natural resources to its own detriment - we are smart enough to exploit our resources, but too stupid to know when to stop. The natural system's response to this behavior seems to be to get us to kill ourselves. Maybe AIs are part of that process?