October 16, 2009

Seeing things Differently with Augmented Reality

One of the most exciting emerging technologies out there is Augmented Reality (AR). While the term has been around since approximately 1990, the technology is only really beginning to take off now for consumer uses.

In augmented reality, you layer computer-generated information over the real-world physical environment. This computer-generated imagery is seen through special eyewear such as contacts, glasses, or monocles, or is perhaps even projected as a 3-D image display in front of you.

With the overlay of computer information, important context can be added to the everyday content you are sensing: names and other information are layered over people, places, and things to give them meaning and greater value to us.

Augmented reality is really a form of mashup, where information from multiple sources is combined (i.e., content aggregation) to create a higher order of information with enhanced end-user value.
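To make the mashup idea concrete, here is a minimal, hypothetical sketch that combines two independent data feeds (a points-of-interest source and a ratings source) into a single annotated overlay, roughly the way an AR browser might. All of the function names and data are invented for illustration.

```python
# A minimal sketch (not any vendor's actual API) of the "mashup" idea behind AR:
# combine a device's location fix with ratings pulled from a separate source,
# producing a higher-order overlay annotation. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Place:
    name: str
    lat: float
    lon: float

def nearby_places(lat: float, lon: float) -> list[Place]:
    # Stand-in for a mapping/points-of-interest source.
    return [Place("Corner Cafe", lat + 0.0001, lon - 0.0002)]

def rating_for(place_name: str) -> float:
    # Stand-in for a separate ratings source (e.g., a review service).
    return {"Corner Cafe": 4.5}.get(place_name, 0.0)

def build_overlay(lat: float, lon: float) -> list[str]:
    # The "mashup": two independent feeds merged into one annotated view.
    return [f"{p.name}: {rating_for(p.name)} stars" for p in nearby_places(lat, lon)]

print(build_overlay(38.8895, -77.0353))
```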

In AR, multiple layers of information can be available and users can switch between them easily at the press of a button, swipe of a screen, or even a verbal command.

Fast Company, November 2009, provides some modern-day examples of how this AR technology is being used:

Yelp’s iPhone App—“Lets viewers point their phone down a street and get Yelp star ratings for merchants.”

Trulia for Android—“The real-estate search site uses Layar’s Reality Browser to overlay listings on top of a Google phone’s camera view. Scan a neighborhood’s available properties and even connect to realtors.”

TAT’s Augmented ID— “Point your Android phone at a long-lost acquaintance for his Facebook, Twitter, and YouTube activity.”

Michael Zollner, an AR researcher, puts it this way: “We have a vast amount of data on the Web, but today we see it on a flat screen. It’s only a small step to see all of it superimposed on our lives.”

Maarteen Lens-FitzGerald, a cofounder of Layar, said: “As the technology improves, AR apps will be able to recognize faces and physical objects [i.e. facial and object recognition] and render detailed 3-D animation sequences.”

According to Fast Company, it will be like having “Terminator eyes” that see everything, but with all the information about it running in real time over or alongside the image.

AR has been in use for fighter pilots and museum exhibits and trade shows for a number of years, but with the explosive growth of the data available on the Internet, mobile communication devices, and wireless technology, we now have a much greater capability to superimpose data on everything, everywhere.

The need to “get online” and “look things up” will soon be supplanted by the real time linkage of information and imagery. We will soon be walking around in a combined real and virtual reality, rather than coming home from the real world and sitting down at a computer to enter a virtual world. The demarcation will disappear to a great extent.

Augmented reality will bring us to a new level of efficiency and effectiveness in using information to act faster, smarter, and more decisively in all our daily activities, personally and professionally, and in matters of commerce and war.

With AR, we will never see things the same way again!



October 12, 2009

Timeouts for Professionals—Ouch

Experts have been teaching parents for years to discipline children, when needed, with timeouts. This is seen as a combined rehabilitative and punitive method to deal with “bad” behavior. The idea is that the child has time to reflect on what they did “wrong” and how they can do better in the future. It also functions as a way to sort of “punish” the child to teach them that there are consequences to their actions, like having to sit in inaction for a period of time. Of course, time-outs also serve the purpose of a “cooling off” period for both parent and child when things are heating up.

Interestingly enough, as with many things in life, adults, in a sense, are just big children. And the time-out method doesn’t end in childhood. This method of discipline is used in the workplace as well.

I have seen and heard story after story of people at work who do something “wrong” (whether as defined by objective policy or, more often it seems, by some subjective management whim) and get sidelined. They get moved off into a corner—with the proverbial dunce cap on their heads—where they can do no harm. They are, for all intents and purposes, ignored. They are not assigned any meaningful or significant work. They are neutered.

Unlike a child’s timeout though, an adult timeout may last for a set period of time or may be permanent—no one knows in advance.

Just as with a child, the adult timeout is both punitive and possibly rehabilitative. Punitively, it is supposed to take the “problem” worker out of the larger workplace equation, and it therefore hurts their career, personal and professional learning and growth, and their self-esteem. In terms of rehabilitation, I imagine some may think that like a child, the adult will have time to reflect on what they did wrong—if they even know what they did—and commit to never doing it again—to be a better employee in the future.

Well, why don’t employers just help employees do better in their jobs by coaching, mentoring, training, providing constructive feedback, counseling, and, if necessary, taking other corrective actions? Why the childish timeouts?

Perhaps, managers think it is easier to just “ignore” a problem—literally—or to handle it quietly and subtly, rather than “confronting” the employee and having to work with them over time to improve.

Unfortunately, this erroneous thinking—the desire to take the “easy way out”—is reinforced by often-archaic performance management systems that do not distinguish between levels of employee performance. They neither meaningfully reward or recognize good performance nor discourage poor performance.

Certainly, it is important to have fairness, objectivity, and controls in any performance management system, but this needs to be balanced with managing our human capital in a way that is good for the organization and good for the employee.

We cannot continue to manage our employees like children. We cannot punish people for honest mistakes at work that were unintentional, not malicious, and done in good faith and best effort in performance of their jobs.

Instead, we need to manage people with maturity. We need to identify where the issues are, empathize where appropriate, understand what can be done to correct problems, and work with employees on how they can learn and grow.

Alternatively, we need to handle true performance issues and not bury them indefinitely in timeouts. Our organizations and our employees need to move past childish modes of performance management and handle people decisively, with measured intent, and with absolute integrity.



October 10, 2009

Making Something Out of Nothing

At the Gartner Enterprise Architecture Summit this past week (October 7-9, 2009), I heard about this new math for value creation:

Nothing + Nothing = Something

At first, you sort of go, WHAT?

Then, it starts to make a lot of sense.

Seemingly nothings can be combined (for example, through mashups) to become something significant.

When you really think about it, doesn’t this happen all the time?

INFORMATION: You can have tens of thousands of data points, but it’s not till you connect the dots that you have meaningful information or business intelligence.

PEOPLE: Similarly, you can have individuals, but it’s not until you put them together—professionally or personally—that you really get sparks flying.

Harvard Business Review, October 2009, put it this way:

“Ants aren’t smart…ant colonies are…under the right conditions, groups—whether ant colonies, markets, or corporations—can be smarter than any of their members.” This is the “wisdom of crowds and swarm intelligence.”

PROCESS: We can have a workable process, but a single process alone may not produce diddly. However, when you string processes together—for example, in an assembly line—you can produce a complex product or service. Think of a car, a plane, or an intricate surgical procedure.

TECHNOLOGY: I am sure you have all experienced the purchase of hardware or software technologies that in and of themselves are basically useless to the organization. It’s only when we combine them into a workable application system that we have something technologically valuable to the end-user.

Whatever the combination, we don’t always know in advance what we are going to get when we make new connections—this is the process of ideation, innovation, and transformation.

Think of the chemist, engineer, or artist who combines chemicals, building-block elements, or colors, textures, and styles in new ways and gets something previously unimaginable or not anticipated.

In a sense, organizational and personal value creation is very much about creating relationships and associations between things. And a good leader knows how to make these combinations work:

Getting people and organizations to work together productively.

Generating new ideas for innovative business products or better ways of serving the customer.

Linking people, process, and technology in ever expanding ways to execute more effectively and efficiently than ever before.

Enterprise architecture shares this principle of identifying and optimizing relationships and associations between architectural entities such as business processes, data elements, and application systems. Typically, we perform these associations in architectural models, such as business process, data, and system models. Moreover, when we combine these models, we really advance the cause by determining what our processes are or should be, what information is needed to perform them, and what systems serve up this information. Models help architects to identify gaps, redundancies, inefficiencies, and opportunities between the nothings to improve the greater whole of the something.
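As a minimal, hypothetical illustration of that kind of model cross-walk, the sketch below relates processes to the information they need and to the systems that serve it, then flags gaps and redundancies; every entry is invented.

```python
# A minimal, hypothetical sketch of the model cross-walk described above:
# relate processes to the information they need and to the systems that
# serve it, then flag gaps and redundancies. All entries are invented.

process_needs = {                 # process -> information needed
    "Issue permit": ["applicant record", "fee payment"],
    "Renew permit": ["applicant record"],
}
system_serves = {                 # system -> information it provides
    "Legacy Permits DB": ["applicant record"],
    "New Case System": ["applicant record", "fee payment"],
}

# Gaps: information a process needs that no system provides.
provided = {info for infos in system_serves.values() for info in infos}
gaps = {(p, i) for p, infos in process_needs.items() for i in infos if i not in provided}

# Redundancies: the same information served by more than one system.
redundant = {i for i in provided
             if sum(i in infos for infos in system_serves.values()) > 1}

print("Gaps:", gaps or "none")
print("Redundant data services:", redundant)
```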

The real enterprise architect will make the leap from just describing many of these elements to making the real connections and providing a future direction (aka a target architecture) or at least recommending some viable options for one.

Nothing + Nothing (can) = Something. This will happen when we have the following:

  • The right touch of leadership skills to encourage, motivate and facilitate value creation.
  • The allocation of talented people to the task of combining things in new ways.
  • And the special sauce—which is everyone’s commitment, creativity, and hard work to make something new and wonderful emerge.



October 6, 2009

Constructive Truth Hurts, But Helps

It is pretty hard to give and to get honest feedback.

It is often acknowledged that performance reviews are one of the most difficult tasks for managers to perform. Managers don’t like to “get into it” with their employees, and employees often can’t deal with a straightforward evaluation from their supervisors. Plenty of sugarcoating seems to go on to make the process more digestible for all.

Similarly, people tend not to say what they “really think” in many situations at work. Either they feel that saying what they mean would be “politically incorrect,” or they fear it would be frowned upon, ignored, or may even get them in trouble. So people generally “toe the line” and “try not to rock the boat,” because the “nail that stands up gets hammered down hard.”

An article in the Wall Street Journal, 5 October 2009, reports a similar pattern of behavior with ratings on the Internet. “One of the Web’s little secrets is that when consumers write online reviews, they tend to be positive ratings: The average grade for things online is about 4.3 stars out of five.” On YouTube, the average rating for videos is even higher, at 4.6.

Ed Keller, the chief executive of Bazaarvoice, says that on average he finds that 65% of the word-of-mouth reviews are positive and only 8% are negative. Likewise, Andy Chen, the chief executive of Power Reviews, says “It’s like gambling. Most people remember the times they win and don’t realize that in aggregate they’ve lost money.”

Some people say that ratings are inflated because negative reviews are deleted, negative reviewers are given flak for their “brutal honesty,” or the ratings are tainted by overly positive, self-aggrandizing reviews that people write about themselves.

With product reviews or performance reviews, “it’s kind of meaningless if every one is great.”

I remember when I was in the private sector, as managers we had to do a “forced ranking” of our employees regardless of their performance ratings, in an effort to “get to truth” across the organization.

Generally speaking, performance systems have been lambasted for years for not recognizing and rewarding high performers and for not dealing with performance problems.

Whether it is products, people, or workplace issues, if we are not honest in measuring and reporting on what’s working and what's not—fairly and constructively—then we will continue to delude ourselves and each other and hurt future performance. We cannot improve the status quo if we don’t face up to real problems. We cannot take concrete, constructive action to learn and grow and apply innovative solutions if we don’t know or can’t acknowledge our fundamental weaknesses.

“Being nice” with reviews may avert a confrontation in the short-term, but it causes more problems in the long-term.

Being honest, empathetic, and offering constructive suggestions for improvement with a genuine desire to see the person succeed or product/service improve—and not because the manager is "going after" someone—can be a thousand times more helpful than giving the nod, wink, and look-away to another opportunity for learning, growth, and personal and professional success.

Measurement is Essential to Results

Mission execution and performance results are the highest goals of enterprise architecture.

In the book Leadership by Rudolph Giuliani, he describes how performance measurement in his administration as mayor of NYC resulted in tremendous improvements, such as drastic decreases in crime. He states: “Every time we’d add a performance indicator, we’d see a similar pattern of improvement.”

How did Giuliani use performance measures? The centerpiece of the effort to reduce crime was a process called Compstat in which crime statistics were collected and analyzed daily, and then at meetings these stats were used to “hold each borough command’s feet to the fire.”

What improvements did Giuliani get from instituting performance measurements? Major felonies fell 12.3%, murder fell 17.9%, and robbery 15.5% from just 1993-1994. “New York’s [crime] rate reduction was three to six times the national average…far surpassed that of any other American city. And we not only brought down the crime rate, we kept it down.”

How important was performance measurement to Giuliani? Giuliani states, “even after eight years, I remain electrified by how effective those Compstat meetings could be. It became the crown jewel of my administration’s push for accountability—yet it had been resisted by many who did not want their performance to be measured.”

From an architecture perspective, performance measurement is critical—you cannot manage what you don’t measure!

Performance measurement is really at the heart of enterprise architecture—identifying where you are today (i.e. your baseline), setting your goals where you want to be in the future (i.e. your targets), and establishing a plan to get your organization from here to there through business process improvement, reengineering, and technology enablement.

In the end, genuine leadership means we direct people, process, and technology towards achieving measurable results. Fear of measurement just won't make the grade!



October 3, 2009

Effective Presentation Skills

Watch this helpful video on effective presentations by Paul Maloney and Associates (a product of Gartner).

Understand and rectify the top 10 presenter mistakes:
  1. "Little audience contact
  2. Distracting habits and mannerisms
  3. Inadequate preparation
  4. Unclear purpose and objectives
  5. Failure to maintain presence
  6. Lack of organization
  7. Too few examples and illustrations
  8. Little vocal animation or variety
  9. Too much information
  10. Too many slides"
What effective presenters do:
  1. "Establish and maintain eye contact
  2. Take a steady stance
  3. Channel nervous energy
  4. Speak with animation and enthusiasm
  5. Reinforce the message
  6. Handle questions well"


October 1, 2009

Conversational Computing and Enterprise Architecture

In an article entitled “Intelligent, Chatty Machines” (MIT Technology Review, 19 September 2007), Kate Green describes advances in computers’ ability to understand and respond to conversation. No, really.

Conversational computing works by using a “set of algorithms that convert strings of words into concepts and formulate a wordy response.”

The software product that enables this is called SILVIA and it works like this: “during a conversation, words are turned into conceptual data…SILVIA takes these concepts and mixes them with other conceptual data that's stored in short-term memory (information from the current discussion) or long-term memory (information that has been established through prior training sessions). Then SILVIA transforms the resulting concepts back into human language. Sometimes the software might trigger programs to run on a computer or perform another task required to interact with the outside world. For example, it could save a file, query a search engine, or send an e-mail.”
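To make that flow a bit more tangible, here is a toy sketch of the described pipeline: words become concepts, the concepts are blended with short-term and long-term memory, and the result is rendered back as a reply or triggers a task. This is not SILVIA's actual code or API; every name and rule in it is a stand-in.

```python
# A toy sketch of the conversational flow described above (not SILVIA's actual
# code or API): words -> concepts, blended with short- and long-term memory,
# then rendered back into a reply, optionally triggering an action.

long_term_memory = {"greeting": "Hello! How can I help?"}   # from prior "training"
short_term_memory: list[str] = []                           # the current discussion

def to_concepts(utterance: str) -> set[str]:
    # Crude stand-in for extracting conceptual data from words.
    words = utterance.lower().split()
    return {"greeting"} if {"hi", "hello"} & set(words) else set(words)

def respond(utterance: str) -> str:
    concepts = to_concepts(utterance)
    short_term_memory.append(utterance)          # remember the current discussion
    for concept in concepts:
        if concept in long_term_memory:          # blend with prior knowledge
            return long_term_memory[concept]
        if concept == "search":                  # example of triggering a task
            return "Querying a search engine..."
    return "Tell me more."

print(respond("Hello there"))
print(respond("Please search for enterprise architecture"))
```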

There has been much research done over the years in natural-language processing technology, but the results so far have not fully met expectations. Still, the time will come when we will be talking with our computers, just like on Star Trek, although I don’t know if we’ll be saying quite yet “Beam me up, Scotty.”

From an enterprise architecture standpoint, the vision of conversational artificial intelligence is absolutely incredible. Imagine the potential! This would change the way we do everyday mission and business tasks. Everything would be affected, from how we execute and support business functions and processes to how we use, access, and share information. Just say the word and it’s done! Won't that be sweet?

I find it marvelous to imagine the day when we can fully engage with our technology on a more human level, such as through conversation. Then we can say goodbye to the keyboard and mouse, the way we did to the typewriter, which is now just a museum piece.



September 30, 2009

Conflict Management and Enterprise Architecture

What is conflict?

In the book Images of Organization by Gareth Morgan, the author states “Conflict arises whenever interests collide…whatever the reason, and whatever form it takes, its source rests in some perceived or real divergence of interests.”


Why does conflict occur?


Morgan continues: “People must collaborate in pursuit of a common task, yet are often pitted against each other in competition for limited resources, status, and career advancement.”


How does conflict manifest?


“The conflicting dimensions of organization are most clearly symbolized in the hierarchical organization chart, which is both a system of cooperation, in that it reflects a rational subdivision of tasks, and a career ladder up which people are motivated to climb. The fact that there are more jobs at the bottom than at the top means that competition for the top places is likely to be keen, and that in any career race there are likely to be far fewer winners than losers.”


How does User-centric EA help Manage Conflict?


Enterprise architecture is a tool for resolving organizational conflict. EA does this in a couple of major ways:

  1. Information Transparency: EA makes business and technical information transparent in the organization. And as they say, “information is power”, so by providing information to everyone, EA becomes a ‘great equalizer’, making information equally available throughout the organization. Additionally, when people have information, they can better resolve conflict through informed decision-making.
  2. Governance: EA provides for governance. According to Wikipedia, “governance develops and manages consistent, cohesive policies, processes and decision-rights for a given area of responsibility.” As such, governance provides a mechanism to resolve conflicts in an orderly fashion. For example, an IT Investment Review Board and supporting EA Review Board enable a decision process for authorizing, allocating, and prioritizing new IT investments, an otherwise highly contentious area for many sponsors and stakeholders in the organization (a simple prioritization sketch follows this list).
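As a purely illustrative example of the kind of decision support such a board might use, the sketch below ranks hypothetical IT investment proposals against weighted criteria; the criteria, weights, and proposals are all invented.

```python
# A hypothetical scoring sketch of how a review board might rank proposed IT
# investments on a few weighted criteria; the criteria, weights, and projects
# are all invented for illustration.

weights = {"mission_alignment": 0.4, "architecture_fit": 0.3, "cost_benefit": 0.3}

proposals = {
    "Case Management Upgrade": {"mission_alignment": 9, "architecture_fit": 8, "cost_benefit": 6},
    "Standalone Reporting Tool": {"mission_alignment": 5, "architecture_fit": 3, "cost_benefit": 7},
}

def score(criteria: dict) -> float:
    # Weighted sum of the criterion scores for one proposal.
    return sum(weights[name] * value for name, value in criteria.items())

ranked = sorted(proposals.items(), key=lambda item: score(item[1]), reverse=True)
for name, criteria in ranked:
    print(f"{name}: {score(criteria):.1f}")
```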

Conflict is inevitable; however, EA can provide both information and governance to help manage and resolve conflict.



September 29, 2009

Turning the Tables on Terrorists

Rep. Roscoe Bartlett (R-Md.) warns of the threat of an electromagnetic pulse (EMP): “it would bring down the whole [electrical] grid and cost between $1 trillion to $2 trillion” to repair, with full recovery taking up to 10 years!

“It sounds like a science-fiction disaster: A nuclear weapon is detonated miles above the Earth’s atmosphere and knocks out power from New York City to Chicago for weeks, maybe months. Experts and lawmakers are increasingly warning that terrorists or an enemy nation state could wage that exact type of attack, idling electricity grids and disrupting everything from communications networks to military defenses…such an attack would halt banking, transportation, food, water, and emergency services and might result in the defeat of our military forces.” (Federal Times—September 21, 2009)

The Federal Energy Regulatory Commission (FERC) says “the U.S. is ill-prepared to prevent or recover from an EMP”—it is asking Congress for authority to require power companies to take protective steps, such as building metal shields around sensitive computer equipment.

It is imperative for us to protect our critical infrastructure so that we are not vulnerable to the devastating effects of a potential EMP blast. We must think beyond simple guns and bullets and realize that our technological progress is, on one hand, a great advantage to our society, but on the other hand, can be a huge liability if our technical nerve centers are “taken out.” Our technology is a great strategic advantage for us, but it is also our soft underbelly; whether we are surprised by an EMP or by hard-hitting cyber warfare, we would be back to the Stone Age, and it would hurt.

It also occurs to me that the same tools terrorists use against others can also be used against them.



Embracing Instability and Enterprise Architecture

Traditional management espouses that executives are supposed to develop a vision, chart a course for the organization, and guide it to that future destination. Moreover, everyone in the enterprise is supposed to pull together and sing off the same sheet of music, to make the vision succeed and become reality. However, new approaches to organizational management acknowledge that in today’s environment of rapid change and the many unknowns that abound, executives need to be far more flexible and adaptable, open to learning and feedback, and allow for greater individualism and creativity to succeed.

In the book Managing the Unknowable by Ralph Stacey, the author states that “by definition, innovative strategic directions take an organization into uncharted waters. It follows that no one can know the future destination of an innovative organization. Rather, that organization’s managers must create, invent, and discover their destination as they go.”

In an environment of rapid change, the leader’s role is not to rigidly control where the organization is going, but rather to create conditions that foster creativity and learning. In other words, leaders do not firmly set the direction and demand a “cohesive team” to support it, but rather they create conditions that encourage and promote people to “question everything and generate new perspectives through contention and conflict.” The organization is moved from "building on their strengths and merely adapting to existing market conditions, [to instead] they develop new strengths and at least partly create their own environments.”

An organization just sticking to what they do best and incrementally improving on that was long considered a strategy for organizational success; however, it is now understood as a recipe for disaster. “It is becoming clearer why so many organizations die young…they ‘stick to their knitting’ and do better and better what they already do well. When some more imaginative competitors come along and change the rules of the game, such over-adapted companies…cannot respond fast enough. The former source of competitive success becomes the reason for failure and the companies, like animals, become extinct.”

Organizations must be innovative and creative to succeed. “The ‘new science’ for business people is this: Organizations are feedback systems generating such complex behavior that cause-and-effect links are broken. Therefore, no individual can intend the future of that system or control its journey to that future. Instead what happens to an organization is created by and emerges from the self-organizing interactions between its people. Top managers cannot control this, but through their interventions, they powerfully influence this.”

With the rapidly changing economic, political, social, and technological conditions in the world, “the future is inherently unpredictable.” To manage effectively then is not to set rigid plans and targets, but rather to more flexibly read, analyze, and adapt to the changes as they occur or as they can be forecast with reasonable certainty. “A ‘shared vision’ of a future state must be impossible to formulate, unless we believe in mystic insight.” “No person, no book, can prescribe systems, rules, policies, or methods that dependably will lead to success in innovative organizations. All managers can do is establish the conditions that enable groups of people to learn in each new situation what approaches are effective in handling it.”

For enterprise architecture, there are interesting implications from this management approach. Enterprise architects are responsible for developing the current and target architecture and transition plan. However, with the rapid pace of change and innovation and the unpredictability of things, we learn that “hard and fast” plans will not succeed; rather, EA plans and targets must remain guidelines only, modified by learning and feedback and in response to the end-user (i.e., user-centric). Secondly, EA should not become a hindrance to organizational innovation, creativity, and new paradigms for organizational success. EA needs to set standards and targets and develop plans and administer governance, but this must be done while simultaneously maintaining flexibility and harnessing innovation into a real-time EA as we go along. It’s not a rigid EA we need, but, as one of my EA colleagues calls it, an “agile EA.”



September 27, 2009

Rational Decision Making and Enterprise Architecture

In the book Images of Organization by Gareth Morgan, the Nobel Prize winner Herbert Simon is cited as exploring the parallels between human and organization decision making, as follows:

“Organizations can never be completely rational, because their members have limited information processing abilities…people

  • usually have to act on the basis of incomplete information about possible courses of action and their consequences

  • are able to explore only a limited number of alternatives relating to any given decision, and

  • are unable to attach accurate values to outcomes


...In contrast to the assumptions made in economics about the optimizing behavior of individuals, he concluded that individuals and organizations settle for a ‘bounded rationality’ of a good enough decision based on simple rules of thumb and limited search and information.”
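A tiny sketch may help make bounded rationality concrete: instead of scoring every possible option, the decision-maker examines only a few alternatives and stops at the first one that clears a rough "good enough" threshold (satisficing). The options, scores, and aspiration level below are invented.

```python
# A minimal sketch of Simon's "bounded rationality" as satisficing: rather than
# scoring every option, examine only a few alternatives and stop at the first
# one that is "good enough." Values and the aspiration level are made up.

def satisfice(options, score, aspiration, search_limit=3):
    """Return the first option whose estimated score meets the aspiration
    level, looking at no more than `search_limit` alternatives."""
    for option in options[:search_limit]:        # limited search
        if score(option) >= aspiration:          # simple rule of thumb
            return option
    return options[0] if options else None       # fall back rather than optimize

vendors = ["A", "B", "C", "D", "E"]
estimated_value = {"A": 0.55, "B": 0.72, "C": 0.90, "D": 0.95, "E": 0.99}.get
print(satisfice(vendors, estimated_value, aspiration=0.7))  # picks "B", never sees "D" or "E"
```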


While EA provides a way ahead for the organization, based on Herbert Simon's explanation we learn that there are really no 100% right answers. Organizations, like individuals, have limited ability to plan for the future, since they cannot adequately analyze potential outcomes of decisions in an uncertain environment with limited information.


Architects and the organizations they serve must recognize that the best laid plans are based on bounded rationality, and that there are no "right" or "wrong" answers, just rational planning and due diligence.



September 26, 2009

The Doomsday Machine is Real

There is a fascinating article in Wired (Oct. 2009) on a Doomsday Machine called “the Perimeter System” created by the Soviets. If anyone tries to attack them with a debilitating first strike, the doomsday machine will take over and make sure that the adversary is decimated in return.

“Even if the US crippled the USSR with a surprise attack, the Soviets could still hit back. It wouldn’t matter if the US blew up the Kremlin, took out the defense ministry, severed the communications network, and killed everyone with stars on their shoulders. Ground-based sensors would detect that a devastating blow had been struck and a counterattack would be launched.”

The Doomsday machine has supposedly been online since 1985, shortly after President Reagan proposed the Strategic Defense Initiative (SDI or “Star Wars”) in 1983. SDI was to shield the US from nuclear attack with space lasers (missile defense). “Star Wars would nullify the long-standing doctrine of mutually assured destruction.”

The logic of the Soviet’s Doomsday Machine was “you either launch first or convince the enemy that you can strike back even if you’re dead.”

The Soviet’s system “is designed to lie dormant until switched on by a high official in a crisis. Then it would begin monitoring a network of seismic, radiation, and air pressure sensors for signs of nuclear explosion.”

Perimeter had checks and balances to hopefully prevent a mistaken launch. There were four if/then propositions that had to be met before a launch.

Is it turned on?

Yes, then…

Had a nuclear weapon hit Soviet soil?

Yes, then…

Were there still communications links to the Soviet General Staff?

No, then launch authority is transferred to whoever is left in protected bunkers.

Will they press the button?

Yes, then devastating nuclear retaliation!
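Written out as a purely illustrative sketch (this is obviously not the actual Perimeter logic, just the four propositions as the article lays them out), the chain looks something like this:

```python
# Purely illustrative: the four if/then propositions described above,
# encoded as a simple decision chain. Nothing here reflects real systems.

def perimeter_decision(switched_on: bool,
                       detonation_detected: bool,
                       general_staff_reachable: bool,
                       bunker_officer_presses_button: bool) -> str:
    if not switched_on:
        return "dormant"                      # never activated by leadership
    if not detonation_detected:
        return "keep monitoring sensors"      # seismic, radiation, air pressure
    if general_staff_reachable:
        return "defer to the General Staff"   # normal chain of command intact
    if bunker_officer_presses_button:         # authority passed to the bunker
        return "launch retaliation"
    return "stand down"

print(perimeter_decision(True, True, False, True))  # -> "launch retaliation"
```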

The Perimeter System is the realization of the long-dreaded reality of machines taking over war.

The US never implemented this type of system for fear of “accidents and the one mistake that could end it all.”

“Instead, airborne American crews with the capacity and authority to launch retaliatory strikes were kept aloft throughout the Cold War.” This system relied more on people than on autonomous decision-making by machines.

To me, the Doomsday Machine brings the question of automation and computerization to the ultimate precipice of how far we are willing to go with technology. How much confidence do we have in computers to do what they are supposed to do, and also how much confidence do we have in people to program the computers correctly and with enough failsafe abilities not to make a mistake?

On one hand, automating decision-making can help prevent errors, such as a mistaken retaliatory missile launch in response to nothing more than a flock of geese or a malfunctioning radar. On the other hand, with the Soviets' Perimeter System, once activated, the entire launch sequence was put in the hands of a machine, up until the final push of a button by a low-level duty station officer, who has authority transferred to him/her and who is perhaps misinformed and blinded by fear, anger, and the urge to avenge the motherland in a 15-minute decision cycle—do or die.

The question of faith in technology is not going away. It is only going to get increasingly dire as we continue down the road of computerization, automation, robotics, and artificial intelligence. Are we safer with or without the technology?

There seems to be no going back—the technology genie is out of the bottle.

Further, desperate nations will take desperate measures to protect themselves and companies hungry for profits will continue to innovate and drive further technological advancement, including semi-autonomous and perhaps, even fully autonomous decision-making.

As we continue to advance technologically, we must do so with astute planning, sound governance, thorough quality assurance and testing, and always revisiting the technology ethics of what we are embarking on and where we are headed.

It is up to us to make sure that we take the precautions to foolproof these devices or else we will face the final consequences of our technological prowess.



September 25, 2009

The Window and the Mirror and Enterprise Architecture

I came across some interesting leadership lessons that can be helpful to enterprise architect leaders in the book Good to Great by Jim Collins.

At the most basic level, Collins says that a “level 5” executive or great leader is a “paradoxical blend of personal humility and professional will." “Level 5 leaders channel their ego away from themselves and into the larger goal of building a great company…their ambition is first and foremost for the institution, not themselves.”

Furthermore, level 5 great leaders differ from good leaders in terms of “the window and the mirror.”
  • Great leaders—“look out the window to attribute success to factors outside themselves, [and] when things go poorly, they look in the mirror and blame themselves.”
  • Good (non-great) leaders—“look in the mirror to take credit for success, but out the window to assign blame for disappointing results.”

Interestingly enough, many leaders attributed their company’s success to “good luck” and failures to “bad luck”. Collins writes: “Luck. What an odd factor to talk about. Yet, the good-to-great executives talked a lot about luck in our interviews. This doesn’t sound like Harvard or Yale MBAs talking, does it?”

Collins comments on this bizarre and repeated reference to luck and states: “We were at first puzzled by this emphasis on good luck. After all, we found no evidence that the good-to-great companies were blessed with more good luck than the comparison companies.”

What puzzles me is not only the lack of attribution of company success to global factors, general market conditions, competitive advantage, talented leadership, great architecture, astute planning, sound governance, great products/services, creative marketing, or amazing employees, but also that the study's good-to-great leaders make no mention of the benevolence of the Almighty G-d and show no apparent gratitude for their companies’ success. Instead, it's all about their personal brilliance or general good luck.

Where is G-d in the leaders' calculus for business success?

It seems that the same good-to-great leaders that “look out the window to attribute success to factors outside themselves,” also are looking down at superstitious or “Vegas-style” factors of luck, rather than looking out the window and up to the heavens from where, traditionally speaking, divine will emanates.

Perhaps, there should be a level 6 leader (after the level 5 great leader) that is “truly great” and this is the leader that not only has personal humility and professional will, but also belief in a power much higher than themselves that supersedes “good luck.”


Nanotechnology and Enterprise Architecture

“Nanotechnology is the engineering of functional systems at the molecular scale. In its original sense, 'nanotechnology' refers to the ability to construct items from the bottom up.” (Center for Responsible Nanotechnology)

Two examples of nanotechnology include the manufacturing of super strength polymers, and the design of computer chips at the molecular level (quantum computing). This is related to biotechnology, where technology is applied to living systems, such as recombinant DNA, biopharmaceuticals, or gene therapy.


How do we apply nanotechnology concepts to User-centric EA?
  • Integration vs. Decomposition: Traditional EA has looked at things from the top-down, where we decompose business functions into processes, information flows, and systems into services. But nanotechnology, from a process perspective, shows us that there is an alternate approach, where we integrate or build up from the bottom-up. This concept of integration can be used, for example, to connect activities into capabilities, and capabilities into competencies. These competencies are then the basis for building competitive advantage or carrying out mission execution.
  • Big is out, small is in: As we architect business processes, information sharing, and IT systems, we need to think “smaller”. Users are looking to shed the monolithic technology solutions of yesteryear for smaller, agile, and more mobile solutions today. For example, centralized cloud computing services replacing hundreds or thousands of redundant instances of individual systems and infrastructure silos, smaller-sized but larger-capacity storage solutions, and ever more sleek personal digital assistants that pack in the functionality of cellphones, email, web browsing, cameras, iPods, and more.
  • Imagination and the Future State: As architects, we are concerned not only with the as-is, but also with the to-be state (many would say this is the primary reason for EA, and I would agree, although you can't establish a very effective transition plan without knowing where you're coming from and going to). As we plan for the future state of things, we need to let our imagination soar. Moore’s Law, a view into the pace of technological change, holds that the number of transistors on an integrated circuit doubles every 24 months (see the sketch after this list). With the rapid pace of technological change, it is difficult for architects to truly imagine what the possibilities are 3-5 years out, but that can't stop us from trying, based on analysis, trends, forecasts, emerging technologies, competitive assessments, and best practice research.
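As a quick worked example of the doubling figure cited in the last bullet, here is a small calculation projecting a notional billion-transistor chip three and five years out; the starting count is made up for illustration.

```python
# A worked example of the Moore's Law figure cited above: transistor counts
# doubling every 24 months. The starting chip size is a made-up illustration.

def projected_transistors(current_count: int, months_ahead: int,
                          doubling_period_months: int = 24) -> int:
    return int(current_count * 2 ** (months_ahead / doubling_period_months))

today = 1_000_000_000                      # a notional 1-billion-transistor chip
for years in (3, 5):
    print(years, "years out:", projected_transistors(today, years * 12))
# Roughly 2.8 billion at 3 years and 5.7 billion at 5 years.
```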

The field of information technology, like that of nanotechnology and biotechnology, is not only evolving, but is moving so quickly as to seem almost revolutionary at times. So in enterprise architecture, we need to use lots of imagination in thinking about the future and target state. Additionally, we need to think not only in terms of traditional architecture decomposition (a top-down view), but also integration (a bottom-up view) of the organization, its processes, information sharing, and technologies. And finally, we need to remain constantly nimble and agile in the globalized, competitive marketplace where change is a constant.



September 24, 2009

Creating Win-Win and Enterprise Architecture

We are all familiar with conflict management and day-to-day negotiations in our everyday leadership role in our organizations, and the key to successful negotiation is creating win-win situations.

In the national bestseller Getting to Yes, by Fisher and Ury, the authors call out the importance of everyday negotiation and propose a new type of negotiation called "principled negotiation".


“Everyone negotiates something every day…negotiation is a basic means of getting what you want from others. It is a back-and-forth communication designed to reach an agreement when you and the other side have some interests that are shared and others that are opposed. More and more occasions require negotiation. Conflict is a growth industry…whether in business, government, or the family, people reach most decisions through negotiation.”


There are two standard ways to negotiate that involve trading off between getting what you want and getting along with people:


Soft—“the soft negotiator wants to avoid personal conflict and so makes concessions readily in order to reach agreement. He wants an amicable resolution yet he often ends up exploited and feeling bitter.”


Hard—“the hard negotiator sees any situation as a contest of wills in which the side that takes more extreme positions and holds out longer fares better. He wants to win yet he often ends up producing an equally hard response which exhausts him and his resources and harms his relationship with the other side.”


The third way to negotiate, developed by the Harvard Negotiation Project, is Principled Negotiation.


Principled Negotiation—“neither hard nor soft, but rather both hard and soft…decide issues on their merits rather than through a haggling process…you look for mutual gains wherever possible, and that where your interests conflict, you should insist that the results be based on some fair standards independent of the will of either side.”


In principled negotiation, the method is based on the following:

  1. People—participants are not friends and not adversaries, but rather problem solvers
  2. Goal—the goal is not agreement or victory, but rather a “wise outcome reached efficiently and amicably”
  3. Stance—your stance is “soft on the people, hard on the problem”
  4. Pressure—you don’t yield or apply pressure, but rather “reason and be open to reasons”
  5. Position—you don’t change your position easily or dig in, but rather you “focus on interests, not positions”
  6. Solution—the optimal solution is win-win; you develop “options for mutual gain”

In User-centric EA, there are many situations that involve negotiation, and using principled negotiation to develop win-win solutions for the participants is critical for developing wise solutions and sustaining important personal relationships.

  • Building and maintaining the EA—first of all, just getting people to participate in the process of sharing information to build and maintain an EA involves negotiation. In fact, the most frequent question from those asked to participate is “what’s in it for me?” So enterprise architects must negotiate with stakeholders to share information and participate and take ownership in the EA initiative.
  • Sound IT governance—second, IT governance, involves negotiating with program sponsors on business and technical alignment and compliance issues. Program sponsors and project managers may perceive enterprise architects as gatekeepers and your review board and submission forms or checklists as a hindrance or obstacle rather than as a true value-add, so negotiation is critical with these program/project managers to enlist their support and participation in the review, recommendation, and decision process and follow-up on relevant findings and recommendations from the governance board.
  • Robust IT planning—third, developing an IT plan involves negotiation with business and technical partners to develop vision, mission, goals, objectives, initiatives, milestones, and measures. Everyone has a stake in the plan and negotiating the plan elements and building consensus is a delicate process.
In negotiating for these important EA deliverables, it’s critical to keep in mind and balance the people and the problem. Winning the points and alienating the people is not a successful long-term strategy. Similarly, keeping your associates as friends and conceding on the issues will not get the job done. You must develop win-win solutions that solve the issues and which participants feel are objective, fair, and equitable. Therefore, using principled negotiation, being soft on people and hard on the problem, is the way to go.


September 23, 2009

Realistic Optimism and Enterprise Architecture

Optimism can be a key to success in your personal and professional life!

The Wall Street Journal reported in Nov. 2007 that optimism leads to action and that “if even half the time our actions work out well, our life is going to turn out for the better…if you are a pessimist, you are unlikely to even try,” says Dr. Phelps, an NYU neuroscientist. Similarly, Dr. Martin Seligman of the University of Pennsylvania observes that “optimists tend to do better in life than their talents alone may suggest.”

So while optimism is often “derided as a naïve, soft-soap disposition that distorts the realities of life,” Duke University researchers found that optimists actually lead more productive and, by some measures, more successful lives. For example, they found that optimists “worked longer hours every week, expected to retire later in life, were less likely to smoke and, when they divorced, were more likely to remarry. They also saved more, had more of their wealth in liquid assets, invested more in individual stocks, and paid credit-card bills more frequently.”

At the same time, overly optimistic people behaved in a counter-productive or destructive fashion. “They overestimated their own likely lifespan by 20 years or more…they squandered, they postponed bill paying. Instead of taking the long view, they barely looked past tomorrow.”

Overall though, “the influence of optimism on human behavior is so pervasive that it must have survival value, researchers speculate, and may give us the ability to act in the face of uncertain odds.”

Optimism coupled with a healthy dose of realism is the best way to develop and maintain the organization’s enterprise architecture plans and governance. Optimism leads the organization to “march on” and take prudent action. At the same time, realism keeps the enterprise from making stupid mistakes. An EA that is grounded in “realistic optimism” provides for better, sounder IT investments. Those investments proactively meet business requirements, but are not reliant on bleeding-edge technologies that are overly risky, potentially harmful to mission execution, and wasteful of valuable corporate resources.



September 22, 2009

Organizational Politics and Enterprise Architecture

Organizations are intrinsically political systems, “in the sense that ways must be found to create order and direction among people with potentially diverse and conflicting interests.”

“All organizational activity is interest-based…an organization is simultaneously a system of competition and a system of collaboration.” “Because of the diversity of interests…[the organization] always has a latent tendency to move in diverse directions, and sometimes to fall apart.”

Organizational politics is founded in Aristotle’s idea “that diversity of interests gives rise to the ‘wheeling and dealing’, negotiation, and other processes of coalition building and mutual influence that shape so much of organizational life.”

“Organizational politics arise when people think differently and want to act differently. This diversity creates a tension that must be resolved through political means…there are many ways in which this can be done: autocratically (‘We’ll do it this way’); bureaucratically (‘We’re supposed to do it this way’); technocratically (‘It’s best to do it this way’); or democratically (‘How shall we do it?’). In each case the choice between alternative paths of action usually hinges on the power relations between the actors involved.”

“Power is the medium through which conflicts of interest are ultimately resolved. Power influences who gets what, when, and how.” Organizational power is derived from formal authority, control of scarce resources, control of information, use of structure, policies, and rules, and so on.

(Adapted from Images of Organization by Gareth Morgan)

Recognizing the importance of organizational politics—individual, group, and special interests, as well as the resulting conflicts and their resolution through the levers of power—is critical in User-centric Enterprise Architecture.

EA works within a diverse organization, takes competing interests and organizational conflicts, and turns them into common objectives and goals and the striving toward their achievement.

Enterprise architects work across organizational boundaries to synthesize business and technology to create interoperability, standardization, efficiencies, enterprise and common solutions, and integration.

Through the target architecture and transition plan, EA seeks to transform the organization from its intrinsic conflicts into a force with unity of purpose and mind to achieve ever greater accomplishments.



September 21, 2009

Testing EA in Virtual Reality

In enterprise architecture, we develop IT targets and plans for the organization, but these are usually not tested in any meaningful or significant way, since they are “future tense”.

Wouldn’t it be incredible to be able to actually test EA hypotheses, targets, and plans in a virtual environment before actually setting off the organization in a specific direction that can have huge implications for its ability to conduct business and achieve results?

MIT Technology Review, in an article entitled “The Fleecing of the Avatars” (Jan/Feb 2008) addresses how virtual reality is being used to a greater extent to mimic and test reality.

One example of the booming virtual world is Second Life, run by Linden Labs. It has 10,000,000 subscribers and “about 50,000 are online at any one time.” In this virtual world, subscribers playing roles as avatars “gather to role-play reenactments of obscure digital Star Trek cartoon episodes, build and buy digital homes and furniture, and hang out on digital beaches.”

However, more and more virtual worlds, like Second Life, are being used by real world mainstream businesses. For example, many companies are developing a presence in the virtual world, such as Dell with a sales office in Second Life, Reebok a store, and IBM maintains business centers in this virtual world. Further, “the World Bank presented a report in Second Life about business development.”

“But big companies like Sun, Reebok, and IBM don’t really do business in virtual worlds; they ‘tunnel’ into them. [In other words,] To close a deal, you need to step out of the ‘sim’ and into the traditional Sun or Reebok or IBM website.”

The development of companies’ virtual presence online and their connection back to the real world is potentially a precursor to planning disciplines like EA testing out hypotheses of targets and plans in virtual reality and then actually implementing them back in the real organization.

Others are actually planning to use virtual worlds to test and conduct research, so there is precedent for other disciplines such as EA. For example, Cornell’s Robert Bloomfield, an experimental economist, “conducts lab research—allowing 20 students to make simulated stock trades using real money…and seeing how regulatory changes affect their behavior. He envisions a day when he can do larger studies by setting up parallel virtual worlds. ‘I could create two virtual worlds, one with one legal structure, one with another, and compare them…I might lower the capital-gains tax in one and see how business responds. There are things I can’t do with 20 people in a classroom but I can do with 2,000 or 20,000 people in a virtual world.’”

Could enterprise architecture do something similar in a virtual world? For example, could we test how business processes need to change when new technology is introduced, or how information sharing improves with better architectures for discovering and exchanging data? How about testing people’s reactions and behavior to new systems in a broader virtual world instead of with a more limited number of customers in user acceptance testing? Another possibility is testing the effectiveness of new IT security in a virtual world of gamers and hackers.

Modeling and simulation (M&S) can improve enterprise architecture by testing plans before deploying them. We need to hire and train people with knowledge, skills, and experience in the M&S discipline and with tools that support it. Then we can test the hypothetical return on investment for new IT investments before we open our organizational wallets.
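As a minimal sketch of what such testing could look like (assuming nothing about any particular M&S tool or virtual world platform), here is a toy Monte Carlo comparison of the same simulated user population under two hypothetical system settings; every number is invented.

```python
# A toy Monte Carlo sketch of the "parallel worlds" idea: simulate the same
# population of users under two hypothetical policy/system settings and
# compare an outcome metric. All numbers are invented for illustration.

import random

def simulate_world(adoption_rate: float, users: int = 2000, seed: int = 42) -> float:
    """Return the share of simulated users who complete a task under a
    given (assumed) adoption rate for a new system."""
    rng = random.Random(seed)
    completions = sum(1 for _ in range(users) if rng.random() < adoption_rate)
    return completions / users

world_a = simulate_world(adoption_rate=0.60)   # e.g., current architecture
world_b = simulate_world(adoption_rate=0.75)   # e.g., proposed target architecture
print(f"World A: {world_a:.2%}  World B: {world_b:.2%}")
```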



Leading Through Planning

Recently, I was reminded of two pointers in developing an effective IT strategic plan:
  1. Strategic planning is about leadership and setting direction—There is an interesting saying with respect to this that the manager ensures that you do things right, and the leader ensures that you do the right things. The strategic plan, including the vision, mission, and value statements are about leadership establishing and communicating what the ‘right thing’ is. An effective metaphor for this is that the manager ensures that you climb the ladder, but the leader ensures that the ladder is up against the “right” wall.
  2. Strategic planning goals, objectives, and initiatives have to be aligned and actionable—that means you need to set the strategic plan elements at an appropriate level of detail and in cascading fashion. One way to do this is to navigate up and down between goals, objectives, and initiatives in the following way: to navigate to higher elements of the plan hierarchy, ask why (Why do we do XYZ?); to navigate to lower levels of detail and specificity, ask how (How do or will we do XYZ?). A simple sketch of this navigation follows this list.
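Here is a minimal sketch of that cascading structure and the why/how navigation; the plan contents are entirely hypothetical.

```python
# A minimal sketch of a cascading plan structure, with "why" walking up the
# hierarchy and "how" walking down. The plan contents are hypothetical.

plan = {
    "goal": "Improve service to citizens",
    "objectives": [
        {"objective": "Cut application processing time by 30%",
         "initiatives": ["Automate intake forms", "Consolidate case systems"]},
    ],
}

def why(initiative: str) -> str:
    # Navigate up: which objective (and goal) does this initiative serve?
    for obj in plan["objectives"]:
        if initiative in obj["initiatives"]:
            return f'{obj["objective"]} -> {plan["goal"]}'
    return "unaligned (a red flag for the plan)"

def how(objective: str) -> list[str]:
    # Navigate down: what initiatives make this objective actionable?
    for obj in plan["objectives"]:
        if obj["objective"] == objective:
            return obj["initiatives"]
    return []

print(why("Automate intake forms"))
print(how("Cut application processing time by 30%"))
```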

Together, these two guidelines help to develop an IT strategic plan that is both effective in terms of goal setting and organizational focus as well as at the appropriate levels of detail and alignment to be truly actionable.



September 20, 2009

Is Free Worth the Price?

In the computer world, free is often the architecture and economic model of choice. Or is it?

We have various operating systems like Linux, Chrome, Android and more now costing nothing. Information is free on the Internet. Online news at no cost to the reader is causing shock waves in the print news world. There are thousands of free downloads available online for applications, games, music, and more.

What type of business model is free—where is the revenue generation and profit margin?

Yes, we know you can use giveaways to cross-sell other things, which is what Google does so well, making a boatload of money (billions) from its free search engine by selling ads. Others are trying to copy this model, but less successfully.

Also, sometimes companies give products away (or undercharge) in order to undermine their competitive challengers, steal market share, and perhaps even put their rivals out of business.

For example, some have accused Google of providing its Google Apps suite for free as a competitive challenge to Microsoft’s dominant and highly profitable Office suite, in order to shake one of Microsoft’s key product lines and throw it off-balance, deflecting from the other market fight going on in search between Google and Microsoft’s new Bing “decision engine.”

So companies have reasons for providing something for free and usually it is not pure altruism, per se.

But from the consumer's perspective, free is not always really free and is not always worth the trouble.

Fast Company has an interesting article (October 2009) called “The High Cost of Free.”

“The strategy of giving everything away often creates as many hassles as it solves.”

Linux is a free operating system, yet “netbooks running Windows outsell their Linux counterparts by a margin of nine to one.”

“Why? Because free costs too much, weighted down with hassles that you’ll happily pay a little to do without.”

For example, when you need technical support, what are the chances you’ll get the answers and help you need on a no-cost product?

That’s why “customers willingly pay for nominally free products, because they understand that only when money changes hands does the seller become reliably responsive to the buyer.”

And honestly, think about how often--even when you do pay--that trying to get good customer service is more an anomaly than the rule. So what can you really reasonably expect for nothing?

“Some companies have been at the vanguard of making a paying business of “free.” IBM, HP and other tech giants generate significant revenue selling consulting services and support for Linux and other free software to business.”

Also, when you decide to go with free products, you may not be getting everything you bargained for either in the base product or in terms of all the “bells and whistles” compared with what a paid-for-product offers. It’s reminiscent of the popular adages that “you get what you pay for” and “there’s no such thing as a free lunch.”

Sure, occasionally there is a great deal out there—like when we find a treasure at a garage or estate sale or even something that someone else discarded perhaps because they don’t recognize it’s true value—and we need to be on the lookout for those rare finds. But I think we’d all be hard pressed to say that this is the rule rather than the exception. If it were the rule, it would probably throw a huge wrench in the notion of market equilibrium.

And just like everyone savors a bargain, people are of course seriously enticed by the notion of anything that is free. But don't you think a healthy dose of skepticism is appropriate when something is free? Again, another old saying comes to mind: “if it’s too good to be true, it probably is.”

Remember, whoever is providing the “free” product or service, still needs to pay their mortgage and feed their family too, so you may want to ask yourself, how you or someone else is paying the price of “free,” and see if it is really worth it before proceeding.

From the organization’s perspective, we need to look beyond the immediate price tag (free or otherwise discounted) and determine the medium- to long-term costs that include operations and maintenance, upgrades, service support, interoperability with other products and platforms, and even long-term market competition for the products we buy.

So let’s keep our eyes open for a great deal or paradigm shift, but let’s also make sure we are protecting the vital concerns of our users for functionality, reliability, interoperability, and support.



September 18, 2009

The CIO Support Services Framework Improves IT Operations



What Stops Us From Going Cashless

How many of you ever wondered why we continue to use dollar bills and coins when we have credit and debit cards that make cash virtually obsolete?

I, for one, have long abandoned cash in favor of the ease of use, convenience, and orderliness of receiving monthly statements and paying electronically, and the cleanliness of not having to carry and handle the cold hard stuff.

Not that I am complaining about money at a time of recession, but seriously why do we not go dollar-digital in the “digital age”?

Before debit cards, I understood that some people unfortunately have difficulty getting the plastic because of credit issues. But now with debit cards, everyone can shop and pay digitally.

Even government-run programs like the Supplemental Nutrition Assistance Program (SNAP, aka food stamps) now use an electronic card for purchases rather than paper stamps.

It seems that credit/debit card readers are pretty much ubiquitous—in stores, of course, and online—and they are even the way to go on trains and buses and at candy machines.

From the taxman's perspective, I would imagine it is also better and more equitable to track genuine sales transactions in a documented, digital fashion than to enable funny “cash business.”

So why don’t we go paperless and coinless and fully adopt e-Commerce?

An interesting article in the Wall Street Journal, 11 Sept. 2009, described a trendy NYC restaurant that was doing just that.

“The high-end New York City restaurant said goodbye to dollars: Tip in cash if you like but otherwise, your money is no good here.”

Others have been going cashless for some time now.

“In the world of online and catalog retailing, credit and debit cards have long been king. And in recent years, a handful of airlines have adopted ‘cashless cabins.’”

As the NYC restaurant owner said, “Suddenly, it struck me how unnecessary cash was…[moreover,] the convenience and security of going cashless are well worth the added cost.”

Further, from the customer perspective, using a debit or credit card lets users optimize their cash flow and earn reward points.

I believe that the day is coming when bits and bytes are going to win over paper and coins.

This is going to happen when the IRS requires it, when the government stops printing money simply because it always has (i.e., inertia), when retailers recognize that the benefits of digital money outweigh the fees, and when resistance to change is defeated by the common sense of modernization.

