Scientific progress is not just a question of intelligence.

Therefore, even if computers become more intelligent than humans, it is doubtful a "singularity" will occur.



If not, then what?

I agree with Alan Kay when he says that IQ << knowledge << outlook. But all three happen in the brain regardless.

Imagine we manage to build a machine that produces more insights than Newton. That particular form of intelligence would be quite likely to trigger a singularity, don't you think?


No, I don't, because you still need time and a certain amount of randomness to make discoveries.

Another way to put it: even if you are twice as intelligent as Newton, you won't discover twice as many things, or discover them twice as fast.


Sifting through the day's top-list on the AI appstore... WTH's this? "Newton AI. The power of a 1000 research assistants at the click of a button, and they'll run all day tirelessly.". "99c launch deal, just for today -- get it now!" Hmm. Click.

You're now waiting for it to download to your little AiPod that'll beam 'brain bits' to your home-bots, which are now busy sketching out the next Mona Lisa onto a couple of shiny new dreamPads.

Why wouldn't those startup dudes down the street try and build a beefier/faster "runs at 50x universe speed" AiPod for the AI platform you just bought your Newton AI app for?

-------

Why wouldn't AI be able to simulate 'regular universe time/ human time' faster? Why couldn't AI have stronger, more varied randomness?

The bottleneck would be interactions that require a peek into regular universe time: live human input (phone calls, emails), weather, biological data, etc.



