October 23, 2009

Business Process Reengineering and Enterprise Architecture

User-centric EA analyzes problem areas in the organization and uncovers gaps, redundancies, inefficiencies, and opportunities; EA uses this information to drive business process reengineering and improvement as well as to introduce new technologies to the enterprise.

According to the Office of Management and Budget (OMB) Circular A-130, Management of Federal Information Resources, business process reengineering needs to take place to achieve the benefits of new information technology: “Moreover, business process reengineering should accompany all attempts to facilitate a transaction through information technology. Often the full benefits will be realized only by restructuring the process to take advantage of the technology. Merely moving an existing paper based process to an electronic one is unlikely to reap the maximum benefits from the electronic system.”

In the book The 21st Century Organization by Bennis and Mische, the authors explain how organizations can reinvent themselves through reengineering.

What exactly is reengineering?

“Reengineering is reinventing the enterprise by challenging its existing doctrines, practices, and activities and then innovatively redeploying its capital and human resources into cross-functional processes. This reinvention is intended to optimize the organization’s competitive position, its value to shareholders, and its contribution to society.”

What are the essential elements of reengineering?

There are five:

  1. “A bold vision
  2. A systemic approach
  3. A clear intent and mandate
  4. A specific methodology
  5. Effective and visible leadership”

What activities are involved in reengineering?

  • “Innovating
  • Listening to customers
  • Learning
  • Generating ideas
  • Designing new paradigms
  • Anticipating and eclipsing competitors
  • Contributing to the quality of the workplace and the community
  • Constructively challenging established management doctrines”

“Reengineering the enterprise is difficult. It means permanently transforming the entire orientation and direction of the organization. It means challenging and discarding traditional values, historical precedents, tried-and-true processes, and conventional wisdom and replacing them with entirely different concepts and practices. It means redirecting and retraining workers with those new concepts and practices...The very cultural fiber of the enterprise must be interrogated and redefined. Traditional work flows must be examined and redesigned. Technology must be redirected from supporting individual users and departments to enabling cross-functional processes.”

What are the goals of reengineering?

  • “Increasing productivity
  • Optimizing value to shareholders
  • Achieving quantum results
  • Consolidating functions
  • Eliminating unnecessary levels of work”

“Reengineering seeks to increase productivity by creating innovative and seamless processes…the paradigms of vertical ‘silo’ tasks and responsibilities are broken down and replaced with a cross-functional, flatter, networked structure. The classical, top-down approach to control is replaced with an approach that is organized around core processes, is characterized by empowerment, and is closer to the customer....Reengineering constructively challenges and analyzes the organization’s hierarchy and activities in terms of their value, purpose, and content. Organizational levels and activities that represent little value to shareholders or contribute little to competitiveness are either restructured or eliminated.”

What is the role of EA?

EA is the discipline that synthesizes key business and technology information across the organization to support better decision-making. EA develops and maintains the current and target architectures and the transition plan for the organization. As OMB recommends, in setting enterprise targets, EA should focus first and foremost on business process reengineering and only then on technology enablement. If the organization does not do process reengineering first, it risks not only failing to achieve the benefits of introducing new IT, but also causing actual harm to the organization's existing processes and results. For example, adding a new technology without reengineering the process can add layers of staff and management to implement, maintain, and operate the technology, instead of creating a net resource savings for the organization from more efficient operations. Similarly, without doing reengineering before IT implementation, the enterprise may actually implement IT that conflicts with existing processes and thus either requires time-consuming and costly system customization or ends up adversely impacting process cycle times, delaying shipments, harming customer satisfaction, bloating inventories, and so on.

Bennis and Mische predict that in the 21st century “to be competitive, an organization will have to be technology enabled…the specific types of technology and vendors will be unimportant, as most organizations will have access to or actually have similar technologies. However, how the organization deploys its technological assets and resources to achieve differentiation will make the difference in whether it is competitive.”



Stairway to User-centric Heaven



This video was sent to me and I do not know the original source (except VW), but it's great.

It shows what happens when you take the most ordinary daily activity (in this case a simple flight of stairs) and make it user-centric.

Even more, people will walk "the extra mile" when something is appealing to them.

Notice how an unused staircase becomes the preferred method--down and even up--over the escalator when people have a user-centric reason to switch.

This is brilliant and the true essence of what it means to enterprise architect our organizations, products, services, policies, plans, and so forth in a way that people can really use.

Further, technology is not only bits and bytes, but any tool we use to get the job done.

Life truly can be healthy, meaningful, and fun when it's user-centric, visionary, and innovative.



October 20, 2009

What We Lose When We Lie

If you watch House M.D. on TV, you know that House always says something sort of striking: “everyone lies.”

Today, an article in the Wall Street Journal, 20 October 2009, says something similar: that we all lie, even (some, though not me, would say “especially”) in our closest relationship, marriage.

“We fib to avoid conflict. To gain approval. To save face. Or just to be kind.”

Some claim lying is a survival mechanism because “they [lies] allow us to avoid conflict.”

Others feel that it’s okay to lie in order to be tactful with others. For example, a retired financial executive explained that “when his wife asks how she looks, he always tells her she is beautiful. ‘A bad hair day isn’t going to change your life. What’s to be gained by saying something negative to someone that is of such fleeting importance?’”

Even those who supposedly don’t lie have all these little twists:

One man when asked about lying said: “I don’t lie, I tell the truth…slowly.”

George Costanza on Seinfeld used to say: “It’s not a lie if you believe it.”

In society, we’ve even come up with a term for lies that are small or harmless, and we call those “white lies.”

Even in courtrooms, we don’t trust that people will tell the truth; rather, we have to literally ask them, “Do you swear to tell the truth, the whole truth, and nothing but the truth, so help you G-d?”

Many people have pointed out that even in the Ten Commandments, we are not commanded directly not to lie, but rather “you shall not bear false witness against your neighbor.”—Hey, just for the record, that’s close enough for me!

Not surprisingly, the mixed thinking about whether it is okay to lie in certain “charged” situations carries over into our organizations.

On one hand, many of our organizations, especially in the public sector, have wonderful core values such as truth, justice, integrity, and so on. Moreover, for certain national security positions, we even give people lie detector (polygraph) tests to ensure their personal truthfulness.

Yet, on the other hand, we all have heard of project managers who lie in order to cover up failing or failed projects—and many implicitly accept this behavior.

I read that the Standish Group recently reported that 82% of our organizational projects are failing or seriously challenged, i.e., they are over budget, behind schedule, or not meeting customer requirements. Moreover, we have for years seen numerous projects end up on watch lists for failing projects and even have websites that track these.

Yet, ask many project managers how their projects are doing and you get the spectrum of whitewash answers like “everything is great,” “we’re right on track,” “no problem,” “everyone’s working hard,” or sometimes simply “nothing to report.”

Perhaps, project managers are afraid to tell the truth for fear of retribution, punishment, or other negative impacts to their career, those that work for them, or others who are “implicated.”

As one psychologist says about little white lies: “If you don’t fib, you don’t live.”

How unfortunate this thinking is—rather than encouraging honesty, we develop cultures of fear, where cover-ups are routine and truth in reporting is practically a misnomer.

By creating a culture where lying is endemic to reporting, we are harming our people and our organizations. Organizationally, we can only manage if we can measure, and we can only measure if people are honest as to what is working and what isn’t. Personally, we hurt our own integrity as human beings by lying (or being dishonest, deceiving, whitewashing or whatever you want to call it) and then justifying it in so many little and big ways.

Sure, there is such a thing as tact, but you can be tactful and truthful at the same time!

Some of this may come down to improving communication and people skills, and this needs to be emphasized in our training plans. Of course, we need to work with each other in socially appropriate ways.

But at the end of the day, people need to maintain what is really important—their integrity—while at the same time moving the organization to make the right decisions, and this can only be done by being frank and honest with ourselves and with each other.

My suggestion is for leaders to surround themselves with those who are not only “the best and the brightest,” but also those with the most honesty and integrity.



“The Happiness Myth” and Enterprise Architecture


Recently, I was reminded of an interesting article that appeared in The Wall Street Journal (20 Dec 2007) arguing that what really matters in life is not happiness, but rather peace of mind.

Generally speaking, people “are consumed by the pursuit of happiness,” and this fact is codified in our very Declaration of Independence, which states: “that all men are created equal, that they are endowed with certain unalienable rights, that among these are life, liberty, and the pursuit of happiness.”

However, absolute happiness is often in conflict with the "reality on the ground".

These are some of the inherent conflicts we deal with in enterprise architecture (sort of like the Murphy's Law of EA).

Here are some typical user wants (often associated with problematic architectures):
  • A baseline, target, and transition plan without their having to provide virtually any input or to collaborate whatsoever.
  • An architecture roadmap that they do not have to actually follow or execute on.
  • A platform for information sharing and access to information 24/7, but they also want to hoard “their information”, and keep it secure and private, on a need-to-know only basis, which they subjectively decide.
  • A structured IT governance process to ensure sound IT investments for the organization, but also leeway to conduct their own affairs, their way, in which they buy what they want, when they want, how they want, from whomever they want, with whatever funds they can scrounge up.
  • A requirements generation and management process that captures and aligns specific functional requirements all the way up to the organization’s strategic plan, mandates and legislation, but that they don't have to be bothered with identifying, articulating, or aligning.


The world of EA is filled with conflicting user demands and polarizing directions from users who want and expect to have it all. While EA certainly wants and strives to meet all reasonable user requirements and to satisfy the user community and “make them happy,” at some point there comes the realization that you can’t (no matter how hard you try) make everyone happy all of the time.

People want it all, want it now, and often when you give them what they want, they realize that it wasn’t “really” what they had wanted anyway.

So the way ahead is to understand and take into account your user requirements, but more importantly to do the “right” thing for the organization based on best practices, common sense, and initiatives that will truly drive improved performance and mission results.

The WSJ states: “Dad told me: ‘Life isn’t built around “fun.” It’s built around peace of mind.’ Maybe Dad sensed the paradox of happiness: those most desperate for it run a high risk of being the last to find it. That’s because they make foolish decisions. They live disorderly lives, always chasing the high of the moment.”

In User-centric EA, we don’t “chase the high of the moment,” or look to satisfy each and every user whim, but rather we keep the course to developing sound IT planning and governance and to enhancing organizational decision-making capabilities for our end users. EA is a discipline that ultimately strives to ensure peace of mind for the enterprise through the provision of vital "insight" and "oversight" functions.



October 16, 2009

Paper Catalogs Have Seen Their Day

Every day in the mail come oodles of consumer catalogs: printed on quality stock paper, glossy, and many almost as thick as the community phone book.

Often, right in the mailroom, there is a huge recycle bin, and just about everybody drops the catalogs from their mailbox straight into the “trash.”

Who needs these expensive and wasteful printed catalogs that typically go from mailbox to recycle bin or garbage can without anyone even breaking the binding on them? With the Internet, the same information—and more—is available online. Moreover, online you can comparison shop between stores for the best prices, shipping, and return policies, and you can typically get product and vendor ratings too, to make sure that you are not buying a dud from a dud!

Despite this, according to the Wall Street Journal, 16 October 2009, “more than 17 billion catalogs were mailed in the U.S. last year--about 56 for every American.”

Read again—56 for every American! This is obscene.
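As a quick back-of-the-envelope check (the population comparison is my own; the article quotes only the two figures), the numbers are consistent with each other:

```python
# Sanity-check the WSJ figures: 17 billion catalogs, about 56 per American.
catalogs_mailed = 17_000_000_000
per_american = 56

implied_population = catalogs_mailed / per_american
# Roughly 304 million, in line with the U.S. population at the time
print(f"Implied population: {implied_population / 1e6:.0f} million")
```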

Here are some basic statistics on the wastefulness of these catalogs:

“Catalogs account for 3% of the roughly 80 million tons of paper products.”

“Making paper accounted for 2.4% of U.S. energy use in 2006.”

“The paper typically used in catalogs contains about 10% recycled content…far less than paper in general, which typically contains about 30%...[and] for newspapers, the amount of recycled content is roughly 40%.”

“The average U.S. catalog retailer reported mailing about 21 million catalogs in 2007.”

“The National Directory of Catalogs…lists 12,524 catalogs.”

YET…

“Only 1.3% of those catalogs generated a sale.”

So why do printed paper catalogs persist?

Apparently, “because glossy catalog pages still entice buyers in a way that computer images don’t.” Moreover, marketers say that catalogs, at an average cost of slightly over $1.20 each, “drive sales at web sites.”

And of course, the U.S. Postal Service “depends on catalogs as an important source of revenue.”

However, in the digital era, it is time for us to see these paper catalogs converted en masse into e-catalogs. Perhaps a paper copy can still be made available to consumers upon request, so those who really want them and will use them can still get them, but on a significantly more limited basis.

Sure, catalogs are nice to leaf through, especially around the holiday time. But overall, they are a profligate waste of money and a drain on our natural resources. They fill our mailboxes with mostly “junk” and typically are completely unsolicited. With the advent of the Internet, paper catalogs are “overcome by events” (OBE), now that we have vast information rich, e-commerce resources available online, all the time.

Normally, I believe in taking a balanced approach to issues, and moderating strong opinions. However, in this case, we are talking about pure waste and harm to our planet, just because we don’t have the capacity to change.

We need to stop persisting in old ways of doing business when they are no longer useful. This is just one example, and businesses that don’t transition to digital modernity in a timely fashion risk becoming obsolete along with their catalogs that go from the mailbox right into the trash.



Seeing Things Differently with Augmented Reality

One of the most exciting emerging technologies out there is Augmented Reality (AR). While the term has been around since approximately 1990, the technology is only really beginning to take off now for consumer uses.

In augmented reality, you layer computer-generated information over the real-world physical environment. This computer-generated imagery is seen through special eyewear such as contacts, glasses, or monocles, or perhaps even projected as a 3-D image display in front of you.

With the overlay of computer information, important context can be added to everyday content that you are sensing. This takes place when names and other information are layered over people, places, and things to give them meaning and greater value to us.

Augmented reality is really a form of mashup, where information is combined (i.e., content aggregation) from multiple sources to create a higher order of information with enhanced end-user value.

In AR, multiple layers of information can be available and users can switch between them easily at the press of a button, swipe of a screen, or even a verbal command.
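As a rough sketch of the idea (the layer names, point of interest, and data below are all invented for illustration, not taken from any real AR product), each layer is an independent annotation source over the same real-world object, and switching layers simply swaps which source renders:

```python
# Hypothetical AR layers: each maps a point of interest (POI) to overlay text.
poi = {"name": "Joe's Cafe", "lat": 38.90, "lon": -77.04}

def ratings_layer(p):
    # Stub for a review-service lookup (e.g., star ratings)
    return f"{p['name']}: 4.5 stars"

def real_estate_layer(p):
    # Stub for a property-listing lookup
    return f"{p['name']}: building listed at $450k"

LAYERS = {"ratings": ratings_layer, "real_estate": real_estate_layer}

def annotate(p, active_layer):
    """Return the overlay for whichever layer the user has switched to."""
    return LAYERS[active_layer](p)

print(annotate(poi, "ratings"))      # user taps to switch layers...
print(annotate(poi, "real_estate"))  # ...and a different source renders
```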

Fast Company, November 2009, provides some modern day examples of how this AR technology is being used:

Yelp’s iPhone App—“Lets viewers point their phone down a street and get Yelp star ratings for merchants.”

Trulia for Android—“The real-estate search site uses Layar’s Reality Browser to overlay listings on top of a Google phone’s camera view. Scan a neighborhood’s available properties and even connect to realtors.”

TAT’s Augmented ID— “Point your Android phone at a long-lost acquaintance for his Facebook, Twitter, and YouTube activity.”

Michael Zollner, an AR researcher, puts it this way: “We have a vast amount of data on the Web, but today we see it on a flat screen. It’s only a small step to see all of it superimposed on our lives.”

Maarten Lens-FitzGerald, a cofounder of Layar, said: “As the technology improves, AR apps will be able to recognize faces and physical objects [i.e., facial and object recognition] and render detailed 3-D animation sequences.”

According to Fast Company, it will be like having “Terminator eyes,” that see everything, but with all the information about it in real time running over or alongside the image.

AR has been in use by fighter pilots and at museum exhibits and trade shows for a number of years, but with the explosive growth of data available on the Internet, mobile communication devices, and wireless technology, we now have a much greater capability to superimpose data on everything, everywhere.

The need to “get online” and “look things up” will soon be supplanted by the real time linkage of information and imagery. We will soon be walking around in a combined real and virtual reality, rather than coming home from the real world and sitting down at a computer to enter a virtual world. The demarcation will disappear to a great extent.

Augmented reality will bring us to a new level of efficiency and effectiveness in using information to act faster, smarter, and more decisively in all our daily activities, personally and professionally, and in matters of commerce and war.

With AR, we will never see things the same way again!



October 12, 2009

Timeouts for Professionals—Ouch

Experts have been teaching parents for years to discipline children, when needed, with timeouts. This is seen as a combined rehabilitative and punitive method to deal with “bad” behavior. The idea is that the child has time to reflect on what they did “wrong” and how they can do better in the future. It also functions as a way to sort of “punish” the child to teach them that there are consequences to their actions, like having to sit in inaction for a period of time. Of course, time-outs also serve the purpose of a “cooling off” period for both parent and child when things are heating up.

Interestingly enough, like many things in life, adults, in a sense, are just big children. And the time-out method doesn’t end in childhood. This method of discipline is used in the workplace as well.

I have seen and heard story after story of people at work who do something “wrong” (whether as defined by objective policy or, more often it seems, by some subjective management whim) and get sidelined. They get moved off into a corner—with the proverbial dunce cap on their heads—where they can do no harm. They are for all intents and purposes ignored. They are not assigned any meaningful or significant work. They are neutered.

Unlike a child’s timeout though, an adult timeout may be for a period of time or this may be permanent—no one knows in advance.

Just as with a child, the adult timeout is both punitive and possibly rehabilitative. Punitively, it is supposed to take the “problem” worker out of the larger workplace equation, and it therefore hurts their career, personal and professional learning and growth, and their self-esteem. In terms of rehabilitation, I imagine some may think that like a child, the adult will have time to reflect on what they did wrong—if they even know what they did—and commit to never doing it again—to be a better employee in the future.

Well, why don’t employers just help the employee to do better in their jobs by coaching, mentoring, training, providing constructive feedback, counseling and if necessary taking other corrective actions--why the childish timeouts?

Perhaps, managers think it is easier to just “ignore” a problem—literally—or to handle it quietly and subtly, rather than “confronting” the employee and having to work with them over time to improve.

Unfortunately, this erroneous thinking—the desire to take the “easy way out”—is reinforced by often-archaic performance management systems that do not distinguish between levels of employee performance. They neither meaningfully reward and recognize good performance nor discourage poor employee performance.

Certainly, it is important to have fairness, objectivity, and controls in any performance management system, but this needs to be balanced with managing our human capital in a way that is good for the organization and good for the employee.

We cannot continue to manage our employees like children. We cannot punish people for honest mistakes at work that were unintentional, not malicious, and done in good faith and best effort in performance of their jobs.

Instead, we need to manage people with maturity. We need to identify where the issues are, empathize where appropriate, understand what can be done to correct problems, and work with employees on how they can learn and grow.

Alternatively, we need to handle true performance issues and not bury them indefinitely in timeouts. Our organizations and our employees need to move past childish modes of performance management and handle people decisively, with measured intent, and with absolute integrity.



October 10, 2009

Making Something Out of Nothing

At the Gartner Enterprise Architecture Summit this past week (October 7-9, 2009), I heard about this new math for value creation:

Nothing + Nothing = Something

At first, you sort of go, WHAT?

Then, it starts to make a lot of sense.

Seemingly nothings can be combined (for example, through mashups) to become something significant.

When you really think about it, doesn’t this happen all the time?

INFORMATION: You can have tens of thousands of data points, but it’s not till you connect the dots that you have meaningful information or business intelligence.
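For example (the sales records and product names below are made up for illustration), raw data points only become information once they are connected and aggregated:

```python
from collections import defaultdict

# Isolated "nothings": raw transaction records and a product lookup table
sales = [("A-12", 3), ("B-07", 5), ("A-12", 2)]   # (product id, units sold)
catalog = {"A-12": "widget", "B-07": "gizmo"}     # product id -> product name

# Connecting the dots turns them into information: total units per product
totals = defaultdict(int)
for product_id, units in sales:
    totals[catalog[product_id]] += units

print(dict(totals))  # {'widget': 5, 'gizmo': 5}
```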

PEOPLE: Similarly, you can have individuals, but it’s not until you put them together—professionally or personally—that you really get sparks flying.

Harvard Business Review, October 2009, put it this way:

“Ants aren’t smart…ant colonies are…under the right conditions, groups—whether ant colonies, markets, or corporations—can be smarter than any of their members.” This is the “wisdom of crowds” and “swarm intelligence.”

PROCESS: We can have a workable process, but a single process alone may not produce diddly. However, when you string processes together—for example, in an assembly line—you can produce a complex product or service. Think of a car or a plane or an intricate surgical procedure.

TECHNOLOGY: I am sure you have all experienced the purchase of hardware or software technologies that in and of themselves are basically useless to the organization. It’s only when we combine them into a workable application system that we have something technologically valuable to the end-user.

Whatever the combination, we don’t always know in advance what we are going to get when we make new connections—this is the process of ideation, innovation, and transformation.

Think of the chemist, engineer, or artist who combines chemicals, building-block elements, or colors, textures, and styles in new ways and gets something previously unimaginable or unanticipated.

In a sense, organizational and personal value creation is very much about creating relationships and associations between things. And a good leader knows how to make these combinations work:

Getting people and organizations to work together productively.

Generating new ideas for innovative business products or better ways of serving the customer.

Linking people, process, and technology in ever expanding ways to execute more effectively and efficiently than ever before.

Enterprise architecture shares this principle of identifying and optimizing relationships and associations between architectural entities such as business processes, data elements, and application systems. Typically, we perform these associations in architectural models, such as business process, data, and system models. Moreover, when we combine these models, we really advance the cause by determining what our processes are/should be, what information is needed to perform these, and what are the systems that serve up this information. Models help architects to identify gaps, redundancies, inefficiencies, and opportunities between the nothings to improve the greater whole of the something.
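As a toy sketch (all entity names below are invented), the association models can be represented as simple mappings between processes, data, and systems, and the gap and redundancy analysis then falls out of set operations across them:

```python
from collections import Counter

# Toy association models: data each process needs, data each system serves
process_to_data = {
    "claims_intake": {"claim", "customer"},
    "payment":       {"claim", "account"},
    "reporting":     {"claim"},
}
system_to_data = {
    "CRM":          {"customer"},
    "ClaimSys":     {"claim"},
    "LegacyClaims": {"claim"},   # overlaps ClaimSys
}

# Gap: data a process needs that no system serves
served = set().union(*system_to_data.values())
gaps = {p: need - served for p, need in process_to_data.items() if need - served}

# Redundancy: data served by more than one system
counts = Counter(d for data in system_to_data.values() for d in data)
redundant = sorted(d for d, n in counts.items() if n > 1)

print(gaps)       # {'payment': {'account'}}
print(redundant)  # ['claim']
```

In practice these associations live in repository tools and formal models rather than dictionaries, but the analysis pattern is the same: cross-reference the layers and look for what is missing and what is duplicated.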

The real enterprise architect will make the leap from just describing many of these elements to making the real connections and providing a future direction (aka a target architecture) or at least recommending some viable options for one.

Nothing + Nothing (can) = Something. This will happen when we have the following:

  • The right touch of leadership skills to encourage, motivate and facilitate value creation.
  • The allocation of talented people to the task of combining things in new ways.
  • And the special sauce—which is everyone’s commitment, creativity, and hard work to make something new and wonderful emerge.



October 6, 2009

Constructive Truth Hurts, But Helps

It is pretty hard to give and to get honest feedback.

It is often acknowledged that performance reviews are one of the most difficult tasks for managers to perform. Managers don’t like to “get into it” with employees, and employees often can’t deal with a straightforward evaluation from their supervisors. Plenty of sugarcoating seems to go on to make the process more digestible for all.

Similarly, people tend not to say what they “really think” in many situations at work. Either they feel that saying what they mean would be “politically incorrect,” or that it would be frowned upon, ignored, or might even get them in trouble. So people generally “toe the line” and “try not to rock the boat,” because the “nail that stands up gets hammered down hard.”

An article in the Wall Street Journal, 5 October 2009, reports a similar pattern of behavior with ratings on the Internet. “One of the Web’s little secrets is that when consumers write online reviews, they tend to be positive ratings: The average grade for things online is about 4.3 stars out of five.” On YouTube, the average review for videos is even higher, at 4.6.

Ed Keller, the chief executive of Bazaarvoice, says that on average he finds that 65% of the word-of-mouth reviews are positive and only 8% are negative. Likewise, Andy Chen, the chief executive of Power Reviews, says “It’s like gambling. Most people remember the times they win and don’t realize that in aggregate they’ve lost money.”

Some people say that ratings are inflated because negative reviews are deleted, negative reviewers are given flak for their “brutal honesty,” or the reviews are tainted with overly positive self-aggrandizing reviews done on themselves.

With product reviews or performance reviews, “it’s kind of meaningless if every one is great.”

I remember when I was in the private sector, as managers we had to do a “forced rankings” of our employees regardless of their performance rating, in an effort to “get to truth” across the organization.

Generally speaking, performance systems have been lambasted for years for not recognizing and rewarding high performers or for dealing with performance problems.

Whether it’s products, people, or workplace issues, if we are not honest in measuring and reporting on what’s working and what's not—fairly and constructively—then we will continue to delude ourselves and each other and hurt future performance. We cannot improve the status quo if we don’t face up to real problems. We cannot take concrete, constructive action to learn and grow and apply innovative solutions if we don’t know or can’t acknowledge our fundamental weaknesses.

“Being nice” with reviews may avert a confrontation in the short-term, but it causes more problems in the long-term.

Being honest, empathetic, and offering constructive suggestions for improvement with a genuine desire to see the person succeed or product/service improve—and not because the manager is "going after" someone—can be a thousand times more helpful than giving the nod, wink, and look-away to another opportunity for learning, growth, and personal and professional success.

Measurement is Essential to Results

Mission execution and performance results are the highest goals of enterprise architecture.

In the book Leadership by Rudolph Giuliani, he describes how performance measurement in his administration as mayor of NYC resulted in tremendous improvements, such as drastic decreases in crime. He states: “Every time we’d add a performance indicator, we’d see a similar pattern of improvement.”

How did Giuliani use performance measures? The centerpiece of the effort to reduce crime was a process called Compstat in which crime statistics were collected and analyzed daily, and then at meetings these stats were used to “hold each borough command’s feet to the fire.”

What improvements did Giuliani get from instituting performance measurements? Major felonies fell 12.3%, murder fell 17.9%, and robbery 15.5% from just 1993-1994. “New York’s [crime] rate reduction was three to six times the national average…far surpassed that of any other American city. And we not only brought down the crime rate, we kept it down.”

How important was performance measurement to Giuliani? Giuliani states, “even after eight years, I remain electrified by how effective those Compstat meetings could be. It became the crown jewel of my administration’s push for accountability—yet it had been resisted by many who did not want their performance to be measured.”

From an architecture perspective, performance measurement is critical—you cannot manage what you don’t measure!

Performance measurement is really at the heart of enterprise architecture—identifying where you are today (i.e. your baseline), setting your goals where you want to be in the future (i.e. your targets), and establishing a plan to get your organization from here to there through business process improvement, reengineering, and technology enablement.
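
The baseline-to-target gap described above can be sketched in a few lines of code. This is only an illustration: the metric names and numbers are hypothetical, not drawn from any real EA program.

```python
# Sketch of baseline-vs-target performance measurement (hypothetical data).
# Each metric has a current baseline and a desired future target; the gap
# shows how far the transition plan still has to take the organization.

def performance_gaps(baseline, target):
    """Return the remaining gap (target - baseline) for each shared metric."""
    return {m: target[m] - baseline[m] for m in baseline if m in target}

baseline = {"systems_consolidated": 12, "processes_reengineered": 3}
target = {"systems_consolidated": 40, "processes_reengineered": 15}

for metric, gap in performance_gaps(baseline, target).items():
    print(f"{metric}: {gap} remaining to reach target")
```

Tracking these gaps over time—just as Compstat tracked crime statistics—shows whether the transition plan is actually moving the organization toward its targets.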

In the end, genuine leadership means we direct people, process, and technology towards achieving measurable results. Fear of measurement just won't make the grade!



October 3, 2009

Effective Presentation Skills

Watch this helpful video on effective presentations by Paul Maloney and Associates (a product of Gartner).

Understand and rectify the top 10 presenter mistakes:
  1. "Little audience contact
  2. Distracting habits and mannerisms
  3. Inadequate preparation
  4. Unclear purpose and objectives
  5. Failure to maintain presence
  6. Lack of organization
  7. Too few examples and illustrations
  8. Little vocal animation or variety
  9. Too much information
  10. Too many slides"
What effective presenters do:
  1. "Establish and maintain eye contact
  2. Take a steady stance
  3. Channel nervous energy
  4. Speak with animation and enthusiasm
  5. Reinforce the message
  6. Handle questions well"


October 1, 2009

Conversational Computing and Enterprise Architecture

In MIT Technology Review, 19 September 2007, in an article entitled “Intelligent, Chatty Machines” by Kate Green, the author describes advances in computers’ ability to understand and respond to conversation. No, really.

Conversational computing works by using a “set of algorithms that convert strings of words into concepts and formulate a wordy response.”

The software product that enables this is called SILVIA and it works like this: “during a conversation, words are turned into conceptual data…SILVIA takes these concepts and mixes them with other conceptual data that's stored in short-term memory (information from the current discussion) or long-term memory (information that has been established through prior training sessions). Then SILVIA transforms the resulting concepts back into human language. Sometimes the software might trigger programs to run on a computer or perform another task required to interact with the outside world. For example, it could save a file, query a search engine, or send an e-mail.”
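
The pipeline quoted above—words in, concepts mixed with short- and long-term memory, words out, with the option to trigger an action—can be sketched as a toy program. To be clear, this is not SILVIA's actual implementation; every class, rule, and response below is invented purely to illustrate the described flow.

```python
# Toy sketch of the conversational pipeline described above -- NOT SILVIA's
# real implementation. Words are turned into crude "concepts," mixed with
# short-term and long-term memory, and turned back into a response.

class ConversationalAgent:
    def __init__(self):
        self.short_term = []  # concepts from the current discussion
        self.long_term = {"weather": "I have no sensors, but I can guess."}

    def words_to_concepts(self, utterance):
        # Crude stand-in for real natural-language understanding.
        return [w.lower().strip("?.!,") for w in utterance.split()]

    def respond(self, utterance):
        concepts = self.words_to_concepts(utterance)
        self.short_term.extend(concepts)      # update short-term memory
        for c in concepts:                    # mix with long-term memory
            if c in self.long_term:
                return self.long_term[c]
        return "Tell me more."

agent = ConversationalAgent()
print(agent.respond("What is the weather?"))
```

A real system would replace the word-splitting and dictionary lookup with genuine natural-language understanding, and could return an action (save a file, query a search engine, send an e-mail) instead of a sentence.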

There has been much research done over the years in natural-language processing technology, but the results so far have not fully met expectations. Still, the time will come when we will be talking with our computers, just like on Star Trek, although I don’t know if we’ll be saying quite yet “Beam me up, Scotty.”

From an enterprise architecture standpoint, the vision of conversational artificial intelligence is absolutely incredible. Imagine the potential! This would change the way we do everyday mission and business tasks. Everything would be affected, from how we execute and support business functions and processes, to how we use, access, and share information. Just say the word and it’s done! Won't that be sweet?

I find it marvelous to imagine the day when we can fully engage with our technology on a more human level, such as through conversation. Then we can say goodbye to the keyboard and mouse, the way we did to the typewriter—which is just a museum piece now.



September 30, 2009

Conflict Management and Enterprise Architecture

What is conflict?

In the book Images of Organization by Gareth Morgan, the author states “Conflict arises whenever interests collide…whatever the reason, and whatever form it takes, its source rests in some perceived or real divergence of interests.”


Why does conflict occur?


Morgan continues: “People must collaborate in pursuit of a common task, yet are often pitted against each other in competition for limited resources, status, and career advancement.”


How does conflict manifest?


“The conflicting dimensions of organization are most clearly symbolized in the hierarchical organization chart, which is both a system of cooperation, in that it reflects a rational subdivision of tasks, and a career ladder up which people are motivated to climb. The fact that there are more jobs at the bottom than at the top means that competition for the top places is likely to be keen, and that in any career race there are likely to be far fewer winners than losers.”


How does User-centric EA help Manage Conflict?


Enterprise architecture is a tool for resolving organizational conflict. EA does this in a couple of major ways:

  1. Information Transparency: EA makes business and technical information transparent in the organization. And as they say, “information is power,” so by making information equally available to everyone throughout the organization, EA becomes a ‘great equalizer.’ Additionally, when people have information, they can better resolve conflict through informed decision-making.
  2. Governance: EA provides for governance. According to Wikipedia, “governance develops and manages consistent, cohesive policies, processes and decision-rights for a given area of responsibility.” As such, governance provides a mechanism to resolve conflicts, in an orderly fashion. For example, an IT Investment Review Board and supporting EA Review Board enables a decision process for authorizing, allocating, and prioritizing new IT investments, an otherwise highly contentious area for many sponsors and stakeholders in the organization.

Conflict is inevitable; however, EA can provide both information and governance to help manage and resolve conflict.



September 29, 2009

Turning the Tables on Terrorists

Rep. Roscoe Bartlett (R-Md) said that an Electromagnetic Pulse (EMP) attack “would bring down the whole [electrical] grid and cost between $1 trillion to $2 trillion” to repair, with full recovery taking up to 10 years!

“It sounds like a science-fiction disaster: A nuclear weapon is detonated miles above the Earth’s atmosphere and knocks out power from New York City to Chicago for weeks, maybe months. Experts and lawmakers are increasingly warning that terrorists or an enemy nation state could wage that exact type of attack, idling electricity grids and disrupting everything from communications networks to military defenses…such an attack would halt banking, transportation, food, water, and emergency services and might result in the defeat of our military forces.” (Federal Times—September 21, 2009)

The Federal Energy Regulatory Commission (FERC) says “the U.S. is ill-prepared to prevent or recover from an EMP”—they are asking Congress for authority to require power companies to take protective steps to build metal shields around sensitive computer equipment.

It is imperative for us to protect our critical infrastructure so that we are not vulnerable to the devastating effects of a potential EMP blast. We must think beyond simple guns and bullets and realize that our technological progress is, on one hand, a great advantage to our society, but on the other hand, can be a huge liability if our technical nerve centers are “taken out.” Our technology is a great strategic advantage for us, but it is also our soft underbelly; whether we are surprised by an EMP or some hard-hitting cyber warfare, we would be back to the Stone Age, and it will hurt.

It also occurs to me that the same tools terrorists use against others can also be used against them.



Embracing Instability and Enterprise Architecture

Traditional management espouses that executives are supposed to develop a vision, chart a course for the organization, and guide it to that future destination. Moreover, everyone in the enterprise is supposed to pull together and sing off the same sheet of music, to make the vision succeed and become reality. However, new approaches to organizational management acknowledge that in today’s environment of rapid change and the many unknowns that abound, executives need to be far more flexible and adaptable, open to learning and feedback, and allow for greater individualism and creativity to succeed.

In the book Managing the Unknowable by Ralph Stacey, the author states that “by definition, innovative strategic directions take an organization into uncharted waters. It follows that no one can know the future destination of an innovative organization. Rather, that organization’s managers must create, invent, and discover their destination as they go.”

In an environment of rapid change, the leader’s role is not to rigidly control where the organization is going, but rather to create conditions that foster creativity and learning. In other words, leaders do not firmly set the direction and demand a “cohesive team” to support it, but rather they create conditions that encourage and promote people to “question everything and generate new perspectives through contention and conflict.” The organization is moved from "building on their strengths and merely adapting to existing market conditions, [to instead] they develop new strengths and at least partly create their own environments.”

An organization just sticking to what they do best and incrementally improving on that was long considered a strategy for organizational success; however, it is now understood as a recipe for disaster. “It is becoming clearer why so many organizations die young…they ‘stick to their knitting’ and do better and better what they already do well. When some more imaginative competitors come along and change the rules of the game, such over-adapted companies…cannot respond fast enough. The former source of competitive success becomes the reason for failure and the companies, like animals, become extinct.”

Organizations must be innovative and creative to succeed. “The ‘new science’ for business people is this: Organizations are feedback systems generating such complex behavior that cause-and-effect links are broken. Therefore, no individual can intend the future of that system or control its journey to that future. Instead what happens to an organization is created by and emerges from the self-organizing interactions between its people. Top managers cannot control this, but through their interventions, they powerfully influence this.”

With the rapidly changing economic, political, social, and technological conditions in the world, “the future is inherently unpredictable.” To manage effectively, then, is not to set rigid plans and targets, but rather to more flexibly read, analyze, and adapt to the changes as they occur or as they can be forecast with reasonable certainty. “A ‘shared vision’ of a future state must be impossible to formulate, unless we believe in mystic insight.” “No person, no book, can prescribe systems, rules, policies, or methods that dependably will lead to success in innovative organizations. All managers can do is establish the conditions that enable groups of people to learn in each new situation what approaches are effective in handling it.”

For enterprise architecture, there are interesting implications from this management approach. Enterprise architects are responsible for developing the current and target architecture and transition plan. However, with the rapid pace of change and innovation and the unpredictability of things, we learn that “hard and fast” plans will not succeed; rather, EA plans and targets must remain guidelines only, modified by learning, feedback, and response to the end-user (i.e., User-centric). Secondly, EA should not become a hindrance to organizational innovation, creativity, and new paradigms for organizational success. EA needs to set standards and targets and develop plans and administer governance, but this must be done while simultaneously maintaining flexibility and harnessing innovation into a real-time EA as we go along. It’s not a rigid EA we need, but as one of my EA colleagues calls it, an “agile EA.”



September 27, 2009

Rational Decision Making and Enterprise Architecture

In the book Images of Organization by Gareth Morgan, the Nobel Prize winner Herbert Simon is cited as exploring the parallels between human and organization decision making, as follows:

“Organizations can never be completely rational, because their members have limited information processing abilities…people

  • usually have to act on the basis of incomplete information about possible courses of action and their consequences

  • are able to explore only a limited number of alternatives relating to any given decision, and

  • are unable to attach accurate values to outcomes


...In contrast to the assumptions made in economics about the optimizing behavior of individuals, he concluded that individuals and organizations settle for a ‘bounded rationality’ of a good enough decision based on simple rules of thumb and limited search and information.”
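
Simon's "bounded rationality"—settling for a good-enough choice rather than exhaustively optimizing—is often illustrated as a satisficing search: stop at the first option that meets your aspiration level. The sketch below makes that concrete; the vendor names, scores, and threshold are hypothetical.

```python
# Satisficing vs. optimizing (hypothetical scores). A boundedly rational
# decision maker stops at the first option that is "good enough" rather
# than evaluating every alternative.

def satisfice(options, evaluate, good_enough):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if evaluate(option) >= good_enough:
            return option
    return None  # no acceptable option found within the limited search

options = ["vendor_a", "vendor_b", "vendor_c", "vendor_d"]
scores = {"vendor_a": 55, "vendor_b": 72, "vendor_c": 90, "vendor_d": 88}

choice = satisfice(options, scores.get, good_enough=70)
print(choice)  # stops at vendor_b even though vendor_c scores higher
```

Note that the search never even looks at vendor_c: with limited time and information, "good enough" wins over "best possible"—exactly Simon's point.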


While EA provides a way ahead for the organization, based on Herbert Simon's explanation we learn that there are really no 100% right answers. Organizations, like individuals, have limited ability to plan for the future, since they cannot adequately analyze potential outcomes of decisions in an uncertain environment with limited information.


Architects and the organizations they serve must recognize that the best laid plans are based on bounded rationality, and there are no "right" or "wrong" answers, just rational planning and due diligence.



September 26, 2009

The Doomsday Machine is Real

There is a fascinating article in Wired (Oct. 2009) on a Doomsday Machine called “the Perimeter System” created by the Soviets. If anyone tries to attack them with a debilitating first strike, the doomsday machine will take over and make sure that the adversary is decimated in return.

“Even if the US crippled the USSR with a surprise attack, the Soviets could still hit back. It wouldn’t matter if the US blew up the Kremlin, took out the defense ministry, severed the communications network, and killed everyone with stars on their shoulders. Ground-based sensors would detect that a devastating blow had been struck and a counterattack would be launched.”

The Doomsday machine has supposedly been online since 1985, shortly after President Reagan proposed the Strategic Defense Initiative (SDI or “Star Wars”) in 1983. SDI was to shield the US from nuclear attack with space lasers (missile defense). “Star Wars would nullify the long-standing doctrine of mutually assured destruction.”

The logic of the Soviet’s Doomsday Machine was “you either launch first or convince the enemy that you can strike back even if you’re dead.”

The Soviet’s system “is designed to lie dormant until switched on by a high official in a crisis. Then it would begin monitoring a network of seismic, radiation, and air pressure sensors for signs of nuclear explosion.”

Perimeter had checks and balances intended to prevent a mistaken launch. There were four if/then propositions that had to be met before a launch.

  1. Is it turned on? Yes, then…
  2. Had a nuclear weapon hit Soviet soil? Yes, then…
  3. Were there still communications links to the Soviet General Staff? No, then launch authority is transferred to whoever is left in protected bunkers.
  4. Will they press the button? Yes, then devastating nuclear retaliation!
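
The four if/then propositions above amount to a simple decision chain, which can be sketched as code. This is only an illustration of the article's description, not any real system; the state labels are invented for clarity.

```python
# The four if/then checks described above, sketched as a decision chain.
# An illustration of the article's description only -- not a real system.

def perimeter_decision(switched_on, nuclear_hit_detected,
                       general_staff_reachable, button_pressed):
    if not switched_on:
        return "dormant"              # check 1: system not activated
    if not nuclear_hit_detected:
        return "monitoring"           # check 2: no detonation on Soviet soil
    if general_staff_reachable:
        return "defer"                # check 3: normal chain of command intact
    # Communications severed: launch authority passes to the bunker crew.
    return "launch" if button_pressed else "hold"   # check 4: the human push

print(perimeter_decision(True, True, False, True))
```

Notice that even in this sketch, the final step is a human decision—the machine only decides when that human gets the authority.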

The Perimeter System is the realization of the long-dreaded reality of machines taking over war.

The US never implemented this type of system for fear of “accidents and the one mistake that could end it all.”

“Instead, airborne American crews with the capacity and authority to launch retaliatory strikes were kept aloft throughout the Cold War.” This system relied more on people than on autonomous decision-making by machines.

To me, the Doomsday Machine brings the question of automation and computerization to the ultimate precipice of how far we are willing to go with technology. How much confidence do we have in computers to do what they are supposed to do, and also how much confidence do we have in people to program the computers correctly and with enough failsafe abilities not to make a mistake?

On one hand, automating decision-making can help prevent errors, such as a mistaken retaliatory missile launch in response to nothing more than a flock of geese or a malfunctioning radar. On the other hand, with the Soviet’s Perimeter System, once activated, the entire launch sequence is in the hands of a machine, up until the final push of a button by a low-level duty station officer, who has authority transferred to him/her and who is perhaps misinformed and blinded by fear, anger, and the urge to avenge the motherland, in a 15-minute decision cycle—do or die.

The question of faith in technology is not going away. It is only going to get increasingly dire as we continue down the road of computerization, automation, robotics, and artificial intelligence. Are we safer with or without the technology?

There seems to be no going back—the technology genie is out of the bottle.

Further, desperate nations will take desperate measures to protect themselves and companies hungry for profits will continue to innovate and drive further technological advancement, including semi-autonomous and perhaps, even fully autonomous decision-making.

As we continue to advance technologically, we must do so with astute planning, sound governance, thorough quality assurance and testing, and always revisiting the technology ethics of what we are embarking on and where we are headed.

It is up to us to make sure that we take the precautions to foolproof these devices or else we will face the final consequences of our technological prowess.



September 25, 2009

The Window and the Mirror and Enterprise Architecture

I came across some interesting leadership lessons that can be helpful to enterprise architect leaders in the book Good to Great by Jim Collins.

At the most basic level, Collins says that a “level 5” executive or great leader is a “paradoxical blend of personal humility and professional will." “Level 5 leaders channel their ego away from themselves and into the larger goal of building a great company…their ambition is first and foremost for the institution, not themselves.”

Furthermore, level 5 great leaders differ from good leaders in terms of “the window and the mirror.”
  • Great leaders—“look out the window to attribute success to factors outside themselves, [and] when things go poorly, they look in the mirror and blame themselves.”
  • Good (non-great) leaders—“look in the mirror to take credit for success, but out the window to assign blame for disappointing results.”

Interestingly enough, many leaders attributed their company’s success to “good luck” and failures to “bad luck”. Collins writes: “Luck. What an odd factor to talk about. Yet, the good-to-great executives talked a lot about luck in our interviews. This doesn’t sound like Harvard or Yale MBAs talking, does it?”

Collins comments on this bizarre and repeated reference to luck and states: “We were at first puzzled by this emphasis on good luck. After all, we found no evidence that the good-to-great companies were blessed with more good luck than the comparison companies.”

What puzzles me is not only the lack of attribution for company success to global factors, general market conditions, competitive advantage, talented leadership, great architecture, astute planning, sound governance, great products/services, creative marketing, or amazing employees, but also that there is no mention or recognition in the study of good-to-great leaders of the benevolence of the Almighty G-d, and no apparent gratitude shown for their companies’ success. Instead, it's all about their personal brilliance or general good luck.

Where is G-d in the leaders' calculus for business success?

It seems that the same good-to-great leaders that “look out the window to attribute success to factors outside themselves,” also are looking down at superstitious or “Vegas-style” factors of luck, rather than looking out the window and up to the heavens from where, traditionally speaking, divine will emanates.

Perhaps, there should be a level 6 leader (after the level 5 great leader) that is “truly great” and this is the leader that not only has personal humility and professional will, but also belief in a power much higher than themselves that supersedes “good luck.”


Nanotechnology and Enterprise Architecture

“Nanotechnology is the engineering of functional systems at the molecular scale. In its original sense, 'nanotechnology' refers to the ability to construct items from the bottom up.” (Center for Responsible Nanotechnology)

Two examples of nanotechnology include the manufacturing of super strength polymers, and the design of computer chips at the molecular level (quantum computing). This is related to biotechnology, where technology is applied to living systems, such as recombinant DNA, biopharmaceuticals, or gene therapy.


How do we apply nanotechnology concepts to User-centric EA?
  • Integration vs. Decomposition: Traditional EA has looked at things from the top-down, where we decompose business functions into processes, information flows, and systems into services. But nanotechnology, from a process perspective, shows us that there is an alternate approach, where we integrate or build up from the bottom-up. This concept of integration can be used, for example, to connect activities into capabilities, and capabilities into competencies. These competencies are then the basis for building competitive advantage or carrying out mission execution.
  • Big is out, small is in: As we architect business processes, information sharing, and IT systems, we need to think “smaller”. Users are looking to shed the monolithic technology solutions of yesteryear for smaller, agile, and more mobile solutions today. For example: centralized cloud computing services replacing hundreds and thousands of redundant instances of individual systems and infrastructure silos, smaller-sized but larger-capacity storage solutions, and ever more sleek personal digital assistants that pack in the functionality of cellphones, email, web browsing, cameras, iPods, and more.
  • Imagination and the Future State: As architects, we are concerned not only with the as-is, but also with the to-be state (many would say this is the primary reason for EA, and I would agree, although you can't establish a very effective transition plan without knowing where you're coming from and where you're going). As we plan for the future state of things, we need to let our imagination soar. Moore’s Law, a view into the pace of technological change, holds that the number of transistors on an integrated circuit doubles roughly every 24 months. With the rapid pace of technological change, it is difficult for architects to truly imagine the possibilities 3-5 years out—but that can't stop us from trying, based on analysis, trends, forecasts, emerging technologies, competitive assessments, and best practice research.
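
The doubling described by Moore's Law compounds quickly, which is exactly why 3-5 year projections stretch the imagination. A small calculation makes the point; the starting transistor count is illustrative, not any real chip's figure.

```python
# Projecting Moore's Law: transistor counts double roughly every 24 months.
# The starting count below is illustrative, not a real chip's figure.

def project_transistors(start_count, years, doubling_period_years=2):
    """Exponential projection: one doubling per doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

start = 1_000_000_000  # assume a 1-billion-transistor chip today
for years in (2, 4, 6):
    projected = project_transistors(start, years)
    print(f"in {years} years: ~{projected:,.0f} transistors")
```

In six years the count is eight times today's—a reminder that target architectures written for that horizon must plan for capabilities that do not yet exist.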

The field of information technology, like that of nanotechnology and biotechnology, is not only evolving, but is moving so quickly as to seem almost revolutionary at times. So in enterprise architecture, we need to use lots of imagination in thinking about the future and target state. Additionally, we need to think not only in terms of traditional architecture decomposition (a top-down view), but also integration (a bottom-up view) of the organization, its processes, information sharing, and technologies. And finally, we need to constantly remain nimble and agile in the globalized, competitive marketplace where change is a constant.

