
December 4, 2007

Exascale Computing and Enterprise Architecture

Faster computers mean more capability for solving some of our most difficult and challenging business, technical, and scientific problems.

The Washington Post, 3 December 2007, reports that by next year, the next generation of supercomputers will come online at Lawrence Livermore National Laboratory.

The computers will be petascale, capable of processing 1,000 trillion calculations per second! (Note: that is almost double the current top capability of 596 trillion calculations per second.)

Imagine: we don’t even have a term in everyday use for 1,000 trillion, yet we’ll be able to compute that many calculations every second!

That much processing power is the equivalent of 100,000 desktop computers combined.

IBM’s “Roadrunner” is the “leading candidate to become the first petascale machine,” and will enable computer simulations that will “shed new light on subjects such as climate change, geology, new drug development, dark matter, and other secrets of the universe, as well as other fields in which direct experimental observation is time-consuming, costly, dangerous, or impossible.”

Another area that supercomputers help with is in assessing “the reliability, safety, and performance of weapons in the U.S. nuclear stockpile” without any real-life testing necessary.

One big advantage of these powerful supercomputers is that rather than running physical experiments, we can simply simulate them. Computational science (powered by supercomputers) thus supplants, to some extent, observational or theoretical science.

What’s more amazing yet? Scientists are anticipating the exascale machine, yet another thousand times more powerful, by 2018. Now we’re talking a million trillion calculations per second. And that’s not “baby talk,” either.
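The scale jumps described above can be sketched in a few lines of arithmetic. The per-desktop rate here is an assumption chosen to match the article’s “100,000 desktops” comparison, not a figure from the article itself:

```python
# Scale arithmetic for petascale and exascale computing (illustrative).
PETASCALE = 1_000 * 10**12        # 1,000 trillion calculations per second
DESKTOP = 10 * 10**9              # assumed ~10 billion calc/sec per circa-2007 desktop
desktops_equivalent = PETASCALE // DESKTOP
print(desktops_equivalent)        # -> 100000, matching the "100,000 desktops" claim

EXASCALE = PETASCALE * 1_000      # exascale: a thousand times petascale
print(EXASCALE)                   # -> 10**18, a million trillion calculations per second
```

The point of the sketch is simply that each named tier (tera, peta, exa) is a factor of 1,000, so the 2018 exascale target is a full million times the desktop-cluster comparison above.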

From a User-centric enterprise architecture perspective, the importance of petascale and exascale supercomputing is that we need to think beyond the existing models of distributed computing and recognize the vast potential that supercomputers can provide. As architects, we need to envision the potential of future low-cost supercomputing power and the impact it can have on our organization’s ability to better perform its mission and achieve improved results.

One day, supercomputing will not only be for scientists; it will be employed by savvy organizations that can harness its processing power to deliver better, faster, and cheaper products and services to their users.

Someday, when we apply supercomputing power to everyday problems, we’ll be approaching Ray Kurzweil’s vision of the singularity, where machine intelligence surpasses human intelligence. But to me the real question isn’t who is smarter, man or machine, but rather whether we—in organizations of every size, in every industry, around the globe—can harness the power of the supercomputer to make the world a truly better place.

