Showing posts with label Fatalistic.

October 22, 2007

The Atomic Bomb and Enterprise Architecture

“J. Robert Oppenheimer (April 22, 1904 – February 18, 1967) was an American theoretical physicist, best known for his role as the director of the Manhattan Project, the World War II effort to develop the first nuclear weapons, at the secret Los Alamos laboratory in New Mexico. Known as "the father of the atomic bomb," Oppenheimer was shocked by the weapon's killing power after it was used to destroy the Japanese cities of Hiroshima and Nagasaki…After the war, Oppenheimer was a chief advisor to the newly created United States Atomic Energy Commission and used that position to lobby for international control of atomic energy and to avert the nuclear arms race with the Soviet Union.” (Wikipedia)

Oppenheimer believed that technology and science had their own imperatives, and that whatever could be discovered or done would be discovered and done. "It is a profound and necessary truth," he told a Canadian audience in 1962, "that the deep things in science are not found because they are useful; they are found because it was possible to find them." Because he believed that some country would build a nuclear bomb, he preferred that it be the United States, whose politics were imperfect but preferable to those of Nazi Germany or the Soviet Union…Oppenheimer was a fatalist about the evolution of technology and science…He looked to humanity's most progressive institutions to restrain the malignant use of technology. Oppenheimer was asked to build a nuclear bomb, and he hoped reason would dictate that it be used twice, in a just war, and then never again. (MIT Technology Review, “Oppenheimer's Ghost”, November/December 2007)

From a humanistic perspective—I am intrigued by the polarity of Oppenheimer’s position: he acknowledged that in building the atomic bomb and supporting its use against Japan in WWII he had blood on his hands, yet at the same time he believed that its use was justified to prevent further loss of life, and that its very existence would deter future conflicts.

From a User-centric EA perspective—I am interested in Oppenheimer’s fatalistic belief in the evolution of technology. Are technological advances predetermined and inevitable, as Oppenheimer believed, or is there an element of human control?

Of course, organizations determine through their IT governance (i.e., investment review boards and enterprise architecture strategy) which IT projects to invest in. However, looking beyond distinct individuals or organizations, it seems that nothing will truly impede global technological progress if there is any gain to be had economically, politically, socially, or otherwise. Net utility (cost-benefit analysis) determines whether an innovation is funded and pursued.
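The net-utility idea behind such investment reviews can be sketched in a few lines of code. This is a toy illustration only, not any real governance tool; the project names, benefit figures, and cost figures below are entirely hypothetical, invented to show the mechanics of ranking a portfolio by expected benefit minus expected cost.

```python
def net_utility(benefits, costs):
    """Expected total benefit minus expected total cost for one project."""
    return sum(benefits) - sum(costs)

def rank_projects(projects):
    """Sort project names by net utility, best candidate first."""
    return sorted(projects, key=lambda name: net_utility(*projects[name]), reverse=True)

# Hypothetical portfolio: name -> (benefit streams, cost streams), in $K
portfolio = {
    "legacy-modernization": ([400, 300], [250, 150]),  # net utility +300
    "new-crm":              ([500],      [600]),       # net utility -100
    "data-warehouse":       ([200, 200], [100, 100]),  # net utility +200
}

ranking = rank_projects(portfolio)
print(ranking)  # projects with positive net utility rank first
```

A real investment review board would weigh far more than two numbers (risk, strategic alignment, compliance), but the underlying calculus is the same: if the perceived gain exceeds the cost, the project tends to get funded.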

EA and IT governance can broker IT investments, but just like the building of the atomic bomb, if it can be done and it benefits someone, it will be done by someone, somewhere!