As our “miracle” 21st century begins to unfold, a statement that has held true for most of human history is being seriously challenged: humans will always be battling sickness. Many scientists believe this statement could be overturned within the next three decades, and that much of the credit for the feat would belong to increases in computer power.
Today, medical researchers seeking cures for heart disease, cancer, obesity, Alzheimer’s disease, and many other human ills perform trial-and-error experiments in labs and conduct human clinical trials that yield excruciatingly slow results. An end to cancer deaths is predicted to be at least seven years away, and cures for other diseases are projected to be even more elusive.
But researchers say we could speed medical progress by preceding actual human trials with Clinical Trial Simulations (CTS): high-speed computer simulations that would let end results be reached much faster. Ronald Gieschke, of Hoffmann-La Roche in Switzerland, claims CTS will have a significant impact on the way drugs are developed in the future. “Human clinical trials will still be necessary,” Gieschke says, “but CTS will make them faster and more efficient.”
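To make the idea concrete, here is a minimal sketch of what a clinical trial simulation can look like: a Monte Carlo run over a virtual patient population, comparing simulated drug and placebo arms before any human is enrolled. The outcome model and every number in it are illustrative assumptions of mine, not anything from Hoffmann-La Roche.

```python
import random

def simulate_patient(effect, sd=1.0):
    """One virtual patient's outcome: assumed drug effect plus individual noise."""
    return random.gauss(effect, sd)

def simulate_trial(n_patients=500, drug_effect=0.3, n_runs=1000):
    """Run many simulated trials; return the fraction in which the drug arm
    outperforms the placebo arm (a crude estimate of the design's power)."""
    wins = 0
    for _ in range(n_runs):
        drug = [simulate_patient(drug_effect) for _ in range(n_patients)]
        placebo = [simulate_patient(0.0) for _ in range(n_patients)]
        if sum(drug) / n_patients > sum(placebo) / n_patients:
            wins += 1
    return wins / n_runs

if __name__ == "__main__":
    print(f"Fraction of simulated trials where the drug wins: {simulate_trial():.2f}")
```

A real CTS would model pharmacokinetics and disease progression rather than a single noisy number, but the payoff is the same: thousands of candidate trial designs can be screened in minutes before committing to one.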
Addressing the need for increased computer power, IBM has built its new “Roadrunner” for the US Department of Energy’s Los Alamos National Laboratory; it has achieved a performance of 1.026 petaflops (more than one quadrillion floating-point operations per second) and is now rated the fastest supercomputer in the world.
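A quick back-of-the-envelope calculation puts that number in perspective; the workload size below is a made-up example, chosen only to show the scale:

```python
# Back-of-the-envelope only: the 10^18-operation workload is a hypothetical example.
PETAFLOPS = 1.026e15      # Roadrunner's measured rate, operations per second
workload_ops = 1e18       # imaginary simulation requiring 10^18 operations

seconds = workload_ops / PETAFLOPS
print(f"Roadrunner: ~{seconds / 60:.0f} minutes")

desktop_flops = 1e10      # a generous ~10-gigaflop desktop of the same era
print(f"Desktop:    ~{workload_ops / desktop_flops / 86400 / 365:.1f} years")
```

The same job that would tie up a desktop for roughly three years finishes on Roadrunner in about a quarter of an hour.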
The DOE has announced that it will link the machine to other government labs and major research centers around the world, and scientists will gain access to the new supercomputer later this year, according to a LANL spokesman. The new machine is expected to enable breakthrough discoveries in biology that could fundamentally change medical science and its impact across society.
The future of computing has many different aspects, and it is not my intention with this post to provide a detailed explanation of each. Rather, I merely want to share some interesting and useful resources with readers who are curious about where computing is headed.
For those looking for a broad-based overview of how computers
will change our lives, I highly recommend this detailed report by
Microsoft Research entitled “Being
Human: Human-Computer Interaction in the Year 2020.” The second
chapter, in particular, is very insightful and documents five major
transformations: 1) The End of Interface Stability; 2) The Growth
of Techno-Dependency; 3) The Growth of Hyper-Connectivity; 4) The End of the Ephemeral; and 5) The Growth of Creative Engagement.
For readers seeking a slightly more technical understanding of
where computers are headed, I’d recommend this press release by
Gartner, Inc. It covers a number of “grand challenges” which will dramatically
alter how future computers operate and are used.
Succinctly, the major changes are:
1. Never having to manually recharge devices.
2. Parallel programming (see the sketch after this list).
3. Non-tactile, natural computing interfaces. (This corresponds with the Microsoft report.)
4. Automated speech translation.
5. Persistent and reliable long-term storage.
6. Increasing programmer productivity 100-fold.
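Of these, parallel programming (item 2) is the challenge everyday developers will feel most directly: hardware is adding cores faster than software can use them. As a minimal illustration of the idea (my own toy example, not anything from the Gartner release), here is how independent units of work can be fanned out across CPU cores in Python:

```python
import random
from multiprocessing import Pool

def simulate_chunk(seed):
    """Stand-in for one independent unit of work (e.g. one simulation run)."""
    rng = random.Random(seed)
    return sum(rng.gauss(0, 1) for _ in range(100_000))

if __name__ == "__main__":
    # Fan eight independent chunks out across the available CPU cores,
    # then combine the partial results.
    with Pool() as pool:
        results = pool.map(simulate_chunk, range(8))
    print(f"Combined result from 8 parallel chunks: {sum(results):.2f}")
```

The hard part Gartner is pointing at is that most real programs are not this neatly divisible; making ordinary code parallel-friendly remains an open problem.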
Human computation, which is based on discovering what tasks humans can do to make computers smarter, may someday be responsible for making computers not only smarter, but significantly smarter than humans.
Human computation has many applications. For example, computers aren’t very good at identifying what appears in an image, but humans are. To make online image searches more accurate, von Ahn developed the ESP Game, which led to the creation of Google’s Image Labeler and, finally, to a collection of five different games at Games With A Purpose (GWAP.com).
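The mechanic behind the ESP Game is simple to sketch: two players see the same image and type labels independently, and any label both produce is treated as a trustworthy tag. Here is a toy version of that matching step (the data and names are my own invention, not von Ahn’s code):

```python
def agreed_labels(player_a, player_b, taboo=()):
    """Labels both players typed independently; 'taboo' words already known
    for the image are excluded, which is how the game forces fresh tags."""
    return (set(player_a) & set(player_b)) - set(taboo)

# Two players label the same image without seeing each other's guesses.
a = ["dog", "puppy", "grass", "cute"]
b = ["dog", "lawn", "puppy"]
print(agreed_labels(a, b, taboo=["dog"]))  # {'puppy'} becomes a new tag
```

Because the players cannot communicate, a matching label is strong evidence that it genuinely describes the image.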
The model of game play works well. The games are fun, foster bonds and competition, and are free. These qualities have attracted large numbers of players, creating a sustained effort not only to make image searches more accurate, but also to bring computers closer to thinking like humans.
The question is, when will all our game playing lead to a smarter computer that no longer needs our help?
In June 2006, von Ahn was invited to the Google campus to give a TechTalk lecture on human computation, and he brought up some interesting points about the bond, and the tension, between humans and machines.
At one point von Ahn jokes that the interactions he’s created through GWAP could lead to a world similar to the one depicted in The Matrix; that is, one in which machines rule the universe and generate power from human brains.
Although his speculation appeared light-hearted, when I asked von Ahn what he thinks now, he asserted: “I completely believe computers will become every bit as intelligent as humans, possibly even more intelligent. I don’t see why not: the brain is a machine, we just don’t understand how it works yet.”
Wondering what all of the Alpha hype is about? Here's a dense 10-minute video snippet of the official Wolfram Alpha "computational knowledge engine" unveiling, presented by Stephen Wolfram himself at Harvard's Berkman Center.
What I found notable:
the label "computational knowledge engine" - reinfirces that we're moving from the information age to the knowledge age (and fairly quickly)
Alpha's ability to factor in the location of the user submitting the request into computation results
results that begin with a list of assumptions, essentially presenting your query back to you in more technical terms (an advanced "did you mean this?" feature); this makes a great deal of sense when dealing with machine data and knowledge, since it's like having a conversation about science and establishing basic consensus before venturing into complex and potentially unrelated ideas (a toy sketch of this pattern follows the list)
the program's seemingly robust ability to mix data from different sources to return logically related results
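To show what that assumptions-first pattern might look like in code, here is a toy sketch; the interpretation rules are entirely my own invention and have nothing to do with Wolfram's actual engine:

```python
def interpret(query):
    """Toy interpreter: turn a free-form query into an explicit list of
    assumptions before computing anything. The rules are invented examples."""
    assumptions = []
    q = query.lower()
    if "mercury" in q:
        assumptions.append('Assuming "mercury" means the chemical element, not the planet')
    if "boiling point" in q:
        assumptions.append("Assuming standard atmospheric pressure (1 atm)")
    return assumptions or ["No ambiguity detected; computing directly"]

# Surface the engine's interpretation before showing any results.
for line in interpret("mercury boiling point"):
    print(line)
```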
Conclusions: Upon launch, Wolfram Alpha will be a science researcher's dream if it can perform as effectively - for a wide range of queries - as it did in this demo. It'll also serve as a nice accelerative kick in the ass for Google. I can't wait to try this new quantification assistant.