I agree with Alan Kay when he says that IQ << knowledge << outlook, but all three happen in the brain regardless.
Imagine we manage to build a machine that produces more insights than Newton. That particular form of intelligence would be quite likely to trigger a singularity, don't you think?
Sifting through the day's top-list on the AI appstore... WTH's this? "Newton AI. The power of 1,000 research assistants at the click of a button, and they'll run all day tirelessly." "99c launch deal, just for today -- get it now!" Hmm. Click.
You're now waiting for it to download to your little AiPod, which will beam 'brain bits' to your home-bots, which are busy sketching out the next Mona Lisa onto a couple of shiny new dreamPads.
Why wouldn't those startup dudes down the street try to build a beefier, faster "runs at 50x universe speed" AiPod for the AI platform you just bought your Newton AI app for?
-------
Why wouldn't an AI be able to simulate 'regular universe time' / 'human time' faster? Why couldn't it have stronger, more varied randomness?
The bottleneck would be interactions that require a peek into regular universe time: live human input (phone calls, emails), weather, biological data, etc.
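A rough back-of-the-envelope sketch of that bottleneck, in the spirit of Amdahl's law (the fractions and speedups below are made-up numbers, purely for illustration):

    # Toy model: even a "50x universe speed" AiPod buys little if some fraction of
    # the work has to wait on the real world (humans, weather, lab experiments).
    def effective_speedup(real_world_fraction: float, sim_speedup: float) -> float:
        """Overall speedup when only the simulated portion runs sim_speedup times faster."""
        return 1.0 / (real_world_fraction + (1.0 - real_world_fraction) / sim_speedup)

    # Hypothetical: 20% of the work is pinned to regular universe time.
    print(effective_speedup(0.20, 50))    # ~4.6x overall, despite the 50x box
    print(effective_speedup(0.20, 1e9))   # still capped near 5x (= 1 / 0.20)

Crank the simulated speed as high as you like; the overall ceiling stays at one over the fraction of work stuck at real-world pace.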
Therefore, as long as progress is gated by those real-time interactions, even if computers become more intelligent than humans, it is doubtful a "singularity" will occur.