Ghoulish Delight |
03-13-2011 09:21 AM |
There are many people much smarter than me who have written about this subject (Douglas Hofstadter, Kevin Kelly come to mind), but there was one idea in the article that I wanted to respond to.
Quote:
Imagine a computer scientist that was itself a super-intelligent computer. It would work incredibly quickly. It could draw on huge amounts of data effortlessly. It wouldn't even take breaks to play Farmville.
|
Possibly, but not necessarily. Just because a computer is running software that produces intelligence/consciousness, it does not follow that that consciousness has direct, full-speed access to the underlying computing power. Case in point: our brains are magnificently powerful computing mechanisms, in many ways far more powerful than current computing technology (which the article hints at by pointing out that a prerequisite for the singularity is a significant increase in computing power over what we have today). But your conscious mind has no direct access to that power. That computing power is the hardware your thoughts run on, but that doesn't mean you can, with conscious processes, give direct instructions to the underlying hardware.

I tend to agree with Hofstadter on this point. Anything that we would be willing to call intelligence would most likely end up with the same "dilemma", whether the software is running on squishy organic hardware or on silicon.
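To make that layering idea a little more concrete, here's a rough toy sketch in Python (entirely my own analogy, not anything from the article, and every name in it is made up): a high-level "mind" object that runs on a fast substrate but can only reach it through whatever narrow interface the abstraction layer exposes.

    # Toy analogy only: a high-level process that runs ON a substrate
    # without getting raw access TO it. All names here are invented.

    class Substrate:
        """Stand-in for the fast, massively parallel low-level hardware."""
        def primitive_op(self, a, b):
            return a + b  # imagine billions of these happening underneath

    class Mind:
        """High-level process: it runs on Substrate, but it is not Substrate."""
        def __init__(self, substrate):
            self._substrate = substrate  # held behind an abstraction layer

        def deliberate(self, a, b):
            # "Conscious-level" reasoning is slow, serial, and mediated.
            # The mind sees only what the interface exposes, never the
            # raw mechanism doing the work.
            return self._substrate.primitive_op(a, b)

    mind = Mind(Substrate())
    print(mind.deliberate(2, 3))  # prints 5; the mechanism stays hidden

The point of the sketch is just that the thing doing the "thinking" only ever touches the interface, not the hardware itself, which is the same gap I'm describing between a conscious mind and the brain it runs on.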
It MAY be possible to circumvent that limitation with whatever AI technology comes to be, but it's by no means a guarantee.
|