On March 19, 2008 Richard Dawkins, the famous evolutionary biologist and popularizer of science, gave a public lecture at the University of Texas at Austin; it was preceded by a reception hosted by the Center for Inquiry Austin. Though I didn't have a chance to exchange more than a few sentences with Dawkins at the reception, I formed some kind of impression of him as a person. For example, he speaks in perfect phrases and is hip on technology. (Though I bet he would never use the word "hip". :-)) I found his lecture topics familiar, even though I haven't read the books where he expounds on them. I guess I've absorbed his ideas by osmosis. The audience's questions mostly revolved around whether atheists should adopt an in-your-face or a conciliatory tone with the general public; some were more unusual. (Would you ask a well-known skeptic to support his reasoning with astrology? :-)) Then someone asked what Dawkins thinks of transhumanist visions. Finally, he named the one concept he wanted us to take away from the lecture, if we took away nothing else: evolution is NOT the same as random chance.
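That last point is the one Dawkins's own "weasel" thought experiment from The Blind Watchmaker makes concrete: random mutation plus *cumulative selection* finds a target phrase in tens of generations, where pure random chance would need on the order of 27^28 tries. Here is a minimal sketch of the idea (the mutation rate and offspring count are my own illustrative choices, not Dawkins's):

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
MUTATION_RATE = 0.05   # chance each character flips when copied
OFFSPRING = 100        # copies made per generation

def score(candidate):
    # Fitness: number of positions already matching the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent):
    # Copy the parent, randomly flipping each character with small probability.
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in parent
    )

def weasel():
    # Start from a completely random string (pure chance).
    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generation = 0
    while parent != TARGET:
        generation += 1
        # Cumulative selection: keep the fittest of many mutated copies,
        # so partial matches are preserved rather than thrown away.
        parent = max((mutate(parent) for _ in range(OFFSPRING)), key=score)
    return generation

print(weasel(), "generations")
```

Because each generation keeps whatever already matches, the search converges quickly; that accumulation of small, selected improvements, not luck, is the engine of the process.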
What do armchair "scientists" get out of reading popular science magazines or websites? Mostly they get excited about "out there" speculation batted around on wired.com or io9.com. The holographic universe. A preferred direction in the universe.
And what is a real astrophysicist's answer to "what do you think about this?" Nothing.
Ten people were at the discussion. Some had heard more about the concept of the Singularity than others. Moderator Mike Ignatowski described two common Singularity scenarios:
-- "hard takeoff": a computer develops human-level AI, and then within a few hours doubles, quadruples it, etc., and very soon becomes intelligent beyond our comprehension and takes over the world;
-- "soft takeoff" -- technological advance is gradual enough so any given human does not lose comprehension of what's happening; however, in a few hundreds of years the society and technology nonetheless changes so much that it's incomprehensible to a modern-day human.
We examined some of those scenarios and objections to them.