February 27, 2009

Lessons from Space for CIOs



There are no CIOs in space. At least not yet. Someday, as we colonize space, there will be. And information technology will be more important than ever, as communications, information sharing, collaboration, and new ways of doing things enable people to live and work across distances that are now just the realm of science fiction.


As I read about space tourism in MIT Technology Review, January/February 2009, I realized there are already lessons for CIOs from space travel even in its nascent stages.

  • Modernize, as needed—as technologists, some erroneously think that everything has to be swapped out and modernized every few years (for example, many organizations are on a 3-year refresh cycle—whether they need it or not!), but the Russian space program teaches us differently. They modernize not on a fixed schedule, but rather as needed. They work by the principle “if it’s not broken, don’t fix it.” Here’s an excerpt: “You can look at the original Soyuz, and the same physical design—same molds, even—appears to have been used throughout its history…But anything that has ever gone wrong or failed, they fix. Or if there is some new technology that comes along that would be of significant benefit, they change it also.” Isn’t this a novel principle that we can adapt for sound IT investment management?

  • Functional minimalism--for many organizations and individuals, there is a great desire to have the latest and greatest technology gadgets and tools. Some call these folks technology enthusiasts or cutting-edge. And while IT is incredibly exciting, and some missions really do need to be cutting-edge (to safeguard lives, for example), many others don’t need a closet with one of every software package, hardware gadget, or new tool out there. I’ve seen mid-size organizations that literally have thousands of software products—almost as many as people in the entire company! However, on the Russian Soyuz space vehicle, we see a different way. One space tourist noted: “It’s sort of a functional minimalism.” You don’t need tons of gadgets, just what is operationally necessary. CIOs, as IT strategists and gatekeepers for sound IT investing, should keep this principle in mind and spend corporate investment dollars wisely, strategically, and with careful selection criteria. We don’t need one of everything, especially when half of the investments are sitting in a closet somewhere collecting organizational dust!

  • Technology is 3-D—Our IT environment is still mostly stuck in a two-dimensional paradigm. Our user interfaces, controls, and displays are still primarily flat. Of course, many have conceived of IT in a more real three-dimensional portrayal, for example using 3-D graphics, modeling and simulation, holograms, virtual controls, and even virtual worlds in gaming and online. As CIOs, we need to encourage the IT industry to continue the rapid transformation from a 2-D to a 3-D technology paradigm. As a corollary, in space, where there is little to no gravity, such as on the International Space Station, “It is cluttered, but then after a while you realize, well that’s true if you’re thinking in 2-D, but once your brain shifts to 3-D, you realize that it isn’t.”

  • Think strategic and global—The CIO and his/her staff get lots of calls every day about operational issues, from simple password resets to the dreaded “the network is down.” When firefighting, it is easy to fall into a purely operational way of thinking: how am I going to get this or that user back up? But getting all consumed by operational issues is counterproductive to long-term planning, strategy, and monumental shifts and leaps in technology and productivity. One space tourist looking out the window in space summed it up nicely for CIOs (and others) to get perspective: “You’re out there in space looking back at Earth, and in a way, you’re also looking back at your life, yourself, your accomplishments. Thinking about everything you own, love, or care for, and everything else that happens in the world. Thinking bigger picture. Thinking in a more global fashion.” Maybe every CIO needs a picture-window view from the International Space Station to keep perspective?


February 25, 2009

Security Architecture Q&A

Recently, I was interviewed on the subject of Security Architecture and was given permission to share the Q&A:

In general, what kinds of information security issues does an organization face?

The overarching information security issue in any organization is the tension between communication, collaboration, and transparency on the one hand, and the need to protect information from being compromised on the other. Information security is about more than just "stopping leaks." It is also about making sure that people don't intercept, interject or otherwise manipulate agency information for their own ends.

A related issue has to do with protecting the agency's critical IT infrastructure from physical or cyber attack. It's the age-old conflict: If you lock it down completely, then you're protecting it, but you also can't use it. And if you open yourself up altogether, then obviously it won't be long before somebody takes aim.

Finally, the largest threat to an organization's information is clearly from insiders, who have the "keys to the kingdom." And so one must pay great attention to not only the qualifications, but also the background, of the employees and contractors entrusted with access to IT systems. Additionally we must institute checks and balances so that each person is accountable and is overseen.

How do leaders demonstrate security leadership?

Leadership in the area of security is demonstrated in a variety of ways. Obviously the primary method for demonstrating the importance of this function is to formalize it and establish a chief information security officer with the resources and tools at his or her disposal to get the job done.

But security leadership also means building an awareness of risk (and countermeasures) into everything we do: education, awareness, planning, designing, developing, testing, scanning and monitoring.

When new applications or services are being planned and rolled out, does security have a seat at the table?

I can't imagine any organization these days that doesn't consider security in planning and rolling out new applications or services. The real question is, does the organization have a formal process in place to provide certification and accreditation for IT systems? By law, federal agencies are required to do this.

Would you say that information security is generally tightly integrated into organizational culture?

I think that a security mindset and culture predominate in professions where security is paramount, such as law enforcement, defense and intelligence, for obvious reasons.

But the larger question is, how would other organizations make the transition to a culture of greater information security? And this is actually a really important question in today's age of transparency, social networking, Web 2.0, etc., where so much information is freely flowing in all directions. One approach that I have adopted as a culture-changing mechanism is to treat key initiatives as products to be marketed to a target audience. The IT security professional needs to be a master communicator as well as a technical expert, so that employees not only grudgingly comply with necessary measures, but are actively engaged with, and support, their implementation.

At the end of the day, the organization's information security is only as strong as its weakest link. So security has to be as deeply ingrained into the culture and day-to-day operations as possible.

Is information security an inhibitor to new initiatives?

Information security is one of many requirements that new initiatives must meet. And of course there will always be people who see compliance as an inhibitor. But the reality is that security compliance is an enabler for initiatives to achieve their goals. So the key for IT security professionals is to keep educating and supporting their stakeholders on what they need to do to achieve success and security at the same time.



February 22, 2009

Disruptive Technologies

When companies get cozy, the marketplace gets innovative and from out of nowhere...a disruptive technology upends things.

We've seen this happen countless times in big ways.

In the auto industry, 50 years ago neither GM nor Ford would have ever dreamed that they would lose their virtual monopoly on the U.S. auto industry to foreign car companies that would dislodge them with compact vehicles and hybrid engine technologies.

More recently in the music industry, Apple seized the day by combining functionality, stylishness and price on their iPod player with an accessible online iTunes music store.

More generally, the whole world of e-Commerce has stolen much of the show from the brick and mortar retail outlets with internet marketing, online transaction processing, supply chain management and electronic funds transfer.

Now, another disruption is occurring in the computer market. For years, the computer industry has made every effort to provide more raw computing power, memory, and functionality with every release of their computers. And Moore’s law encapsulated this focus with its prediction that transistor counts roughly double every two years.

Now, on the scene comes the Netbook—a simpler, less powerful, less capable computing device that is taking off. Yes, this isn’t the first time that we’ve had a drive toward smaller, sleeker devices (phones, computers, and so on), but usually the functionality is still growing or at the very least staying the same. But with Netbooks, smaller truly does mean less capable.

Wired magazine, March 2009, states: “The Netbook Effect: Dinky keyboard. Slow chip. Tiny hard drive. And users are going crazy for them.”

How did we get here?

“For years now, without anyone really noticing, the PC industry has functioned like a car company selling SUVs: It pushed absurdly powerful machines because the profit margins were high, and customers lapped up the fantasy that they could go off-roading, even though they never did.”

So what happened?

“What netbook makers have done is turn back the clock: Their machines perform the way laptops did four years ago. And it turns out that four years ago (more or less) is plenty.”

“It turns out that about 95%...can be accomplished through a browser…Our most common tasks—email, Web surfing, watching streaming videos—require very little processing power.”

The netbook manufacturers have disrupted the computer market by recognizing two important things:

  1. Computer users already have adequate computing power for their favorite tasks; what they really want now is more convenience, at a price that says “buy me.”
  2. Cloud computing is no longer an idea full of hot air; it is a technology that is here now and can do the job for consumers. We can get our applications over the web and do not have to run them on our client machines. We can afford to have computers that do less, because the cloud can do more! (A minimal sketch of this thin-client idea follows below.)
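
Here is that thin-client idea as a minimal sketch in Python (the cloud endpoint, payload, and spellcheck service are hypothetical stand-ins, not any particular vendor's API): instead of installing and running a heavyweight application locally, the modest client simply hands the work to a hosted service over HTTP.

import json
import urllib.request

# Hypothetical hosted application; substitute whatever cloud service you actually use.
CLOUD_ENDPOINT = "https://example-cloud-app.invalid/api/spellcheck"

def spellcheck_in_the_cloud(text):
    """Send the document to the hosted service and return its response.

    The client only needs a browser-grade HTTP stack; the heavy processing
    happens on the provider's servers, not on the netbook.
    """
    payload = json.dumps({"text": text}).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)

if __name__ == "__main__":
    print(spellcheck_in_the_cloud("Netbooks are plenty for email and the Web."))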

The result?

Foreign companies are running away with the Netbook market. “By the end of 2008, Asustek had sold 5 million netbooks, and other brands together had sold 10 million…In a single year, netbooks had become 7 percent of the world’s entire laptop market. Next year it will be 12%.”

“And when Asustek released the Eee netbook, big firms like Dell, HP, and Apple did nothing for months.” They were taken off guard by miscalculation and complacency.

The future?

Of course, the big boys of computing are hoping that the netbook will be a “secondary buy—the little mobile thing you get after you already own a normal-size laptop. But it’s also possible that the next time you’re replacing an aging laptop, you’ll walk into the store and wonder, ‘why exactly am I paying so much for a machine that I use for nothing but email and the Web?’ And Microsoft and Intel and Dell and HP and Lenovo will die a little bit inside that day.”

Implications for CIOs?

  • End complacency and always be on the lookout for disruptive technologies and ways of doing business. There is always a better way!
  • Hardware becomes a commodity over time and supplying the infrastructure for the organization is moving the way that electricity generation did at the turn of the 20th century—to outside vendors that can do it more effectively and efficiently.
  • Cloud computing means that commonly used software applications are available over the internet and can provide the foundational business functionality for the organization.

The important future value-add from the Office of the CIO is in IT strategy, planning, governance, and mission-focused solutions. We need CIOs who are true leaders, innovative, and focused on the business and not just on the technology.



February 21, 2009

No Choice But to Change

It’s easy to get into a rut and just follow the status quo that we’re used to.

People do it all the time. It’s doing what we know. It’s comfortable. It’s less challenging. It feels less risky. It doesn’t “cause waves” with various stakeholders.

Don’t we often hear people say, “don’t fix it, if it ain’t broke”?

Here’s another more arrogant and obnoxious version of the anti-change sentiment: “don’t mess with perfection!”

And finally, the old tried-and-true line from the naysayer crowd: “we tried that one before.”

Unfortunately, what many of these die-hard obstructionists fail to acknowledge is that time does not stand still for anyone; “Time marches on.” Change is a fact of life, and you can either embrace it or make a futile attempt to resist.

If you embrace it and moreover become a champion of it, you can influence and shape the future—you are not simply a victim of the tide. However, if you resist change, you are standing in front of a freight train that will knock you out and drag you down. You will lose and lose big: Change will happen without you and you will be run over by it.

In short, it is more risky to avoid change than to embrace it.

Therefore, as a leader in an organization, as The Total CIO, you have an obligation to lead change:

  • to try to foresee events that will impact the organization, its products/services, its processes, its technology, and its people.
  • to identify ways to make the most of changing circumstances—to take advantage of opportunities and to mitigate risks, to fill gaps and to reduce unnecessary redundancies.
  • to develop and articulate a clear vision for the organization (especially in terms of the use of information technology) and to steer the organization (motivate, inspire, and lead) towards that end state.
  • to course correct as events unfold; the CIO is not a fortuneteller with all-knowing premonition. Therefore, the CIO must be prepared to adjust course as more information becomes available. Sticking to your guns is not leadership; it’s arrogance.
  • to integrate people, process, technology, and information; the CIO is not siloed to technology issues. Rather, the CIO must look across the enterprise and develop enterprise solutions that integrate the various lines of business and ensure true information sharing, collaboration, and streamlined integration and efficiency. The CIO is a unifier.
  • to institutionalize structured planning and governance to manage change. It’s not a fly-by-night, put-your-finger-up-to-see-which-way-the-wind-is-blowing type of exercise. Change management is an ongoing programmatic function that requires clear processes, roles and responsibilities, timelines, and a decision framework.
  • to bring in management best practices to frame the change process. Change is not an exact science, but we can surely learn from how others have been, and are, successful at it and try to emulate best practices, so we are not reinventing the wheel.

Change is a fact of life, even if it is often painful.

I’d like to say that maybe it doesn’t have to be, but I think that would be lying, because it would be denying our humanity—fear, resistance, apathy, weariness, physical and mental costs, and other elements that make change difficult.

But while the CIO cannot make change pain-free, he or she can make change more understandable, more managed (and less chaotic), and the results of change more beneficial to the long-term future of the organization.



February 16, 2009

It's Not The Systems, Stupid

Being a CIO is not just about information technology—IT is a service. The real job of a CIO is truly understanding the IT needs of their customers (those who actually carry out the mission of the organization) and leading the IT people to fulfill those needs.

In essence, the CIO leads his IT staff to deliver on the mission needs of the organization. So being the CIO is far from being just a technical job; it is very much a people job.

To deliver IT then, the CIO must understand how to effectively lead and motivate his people.

There is a terrific book on this subject called “What People Want” by Terry Bacon that identifies 7 primary needs of people in work relationships and particularly how an effective leader can fulfill those needs and in so doing build a high performing workforce.

Here are the primary people needs in relationships:

TRUST—“the most fundamental relationship need. Without trust, there will not be much of a basis for a relationship at all.”
CHALLENGE/GROWTH—“with rare exception, people are not content in trivial, boring, or stagnant jobs…they need to feel that their work is challenging and that they are developing their skills, capabilities, and possibilities.”
SELF-ESTEEM—"appearance, intelligence, talents, autonomy, integrity, awards, titles, positions, job responsibilities, memberships in special groups, acceptance or recognition.”
COMPETENCE—“people want to be expert at something.”
APPRECIATED—“feel pride in who they are and be genuinely accepted for what they contribute.”
EXCITED—“people want to be energized and enthused…it’s more fun than the alternative.”
RELEVANT—“contributing to something they believe in.”

You’ll notice that monetary compensation and benefits are not mentioned here, because that’s not what this is about. Yes, we all need to be able to pay our bills at the end of the month, but beyond that we have basic human needs (trust, challenge, self-esteem…) that are fundamental to people being effective on the job through their interactions with others.

And indeed, every leader can become a better, more effective leader by understanding these relationship needs and developing the ability to genuinely help people feel fulfilled in these areas.

For the CIO, I think it is very easy—too much so—to focus on technology. The field is technically intriguing, quickly changing, futuristic, and fundamental to mission. Intentionally or not, the CIO can easily overlook the people that are behind the technical solutions—those that he/she depends on to really tech-enable the organization (it’s not the systems, stupid).

CIOs, take care of your hard-working and talented people—develop their trust, provide challenging work, grow their self-esteem, help them to mature their competences, appreciate them, inspire and excite, and show them they are contributing to something important. And you and they will be more than the sum of the parts and deliver IT solutions to the organization that will truly amaze!

February 14, 2009

The Stimulus Plan and User-Centric Enterprise Architecture

Just something I am thinking about...

Per the Wall Street Journal, 14-15 February 2009, Stimulus Plan = 1,073 pages.

Imagine this...alternative stimulus plan--one sentence: Give everyone a debit card for $2500 that is good for 3 months.

(That's for every man, woman, and child in this country!)

Result: spending will be pervasive and immediate, jump-starting the stalled economy.

(This can still be supplemented by long-term infrastructure projects and national investments as appropriate.)


The point is that the enemy of problem-solving is over-complexity.

We start with a problem that is so complex almost no one can understand it. For example, the financial market melt-down was tied in large part to dizzyingly constructed financial instruments that confounded, and some say manipulated, even the most sophisticated investors.

And then an answer was developed to respond to the problem. Sure, a complex problem may deserve a multi-faceted and even a thousand-page answer.

But, perhaps it is time to step out of the trees and look at the forest. Is it time for a little simplicity?

Even if the answer is ok, maybe it needs to be communicated simply and straightforwardly--it's got to be user-centric!

Obviously, the point is not to over-simplify and miss the mark, but to be direct and to draw a clear relationship between problem and solution. Have we done that?



February 10, 2009

Reflections On The Role of a Federal Chief Technology Officer

I had an interview today with Federal News Radio on the role of the Chief Technology Officer.

Here are some key points:

1. The CTO is a subject matter expert on technology modernization, transformation and deployment of new technology in the agency.

2. The CTO is responsible for working with the lines of business and IT to ensure that technology is meeting the needs of customers, that enterprise architecture and governance are in place, and that the agency is incorporating best practices from all sources into technology operations.

3. The CIO's focus is on the business while that of the CTO is technology. Everything the CTO does is to support the CIO to operationalize his or her decisions and those of the senior leadership team. The CTO also serves as principal advisor to the CIO on IT management best practices, so that these get incorporated into the decision process.

4. A federal CTO would be a positive development because it would give a more prominent voice to the nation’s technology needs.

If there had been more time I would have added that in my view, the most important issue for a federal CTO is to address the need to raise the technology competitiveness of the United States. Technology is our future and we need to be number one.

February 8, 2009

Change Agents--Poisoned or promoted?

Let’s fantasize for a moment about what it must be like to be an enterprise architect/change agent.

Here we go.

Our stereotypical organization, let’s call it ABC Company, has a talented group of enterprise architects. They have worked hard, built partnerships, learnt the organization and its needs, and done a remarkable job working with leadership, subject matter experts, and other stakeholders: identifying an accurate baseline, determining a promising target, and helping the organization navigate a well-thought-out transition plan. The organization reaches its target—success—and the process continues.

Hooray for the architects. Praise and promotion be upon ABC company’s enterprise architects.

Wait. Not so fast. Let’s back up. Rewind and see what often really happens when architects or anyone else for that matter tries to change the status quo:

R—E—S—I—S—T—A—N—C—E!!

Research shows that change agents are often scorned by their organizations and their peers. In immature organizations that do not embrace constructive change, change agents like enterprise architects are often not looked upon favorably.

Remember what happened to Socrates more than two millennia ago (and to countless other innovators, inventors, and thought leaders since)?

Strategy + Business Magazine, Issue 53, has an article called “Stand by Your Change Agent.”

The article states: “research shows that most transformation leaders go unpromoted, unrecognized, and unrewarded. And their companies suffer in the long run.”

In a study of 84 major change initiatives at Fortune 500 companies between 1995 and 2005, “some 70 percent of executives who led these major transformations went unrewarded or were sidelined, fired, or spurred to leave.”

Why are change agents treated adversely?

The research shows that “deep down, a great many people and organizations fear change. People do not like to move out of their comfort zones. Powerful institutional forces help maintain the status quo. In such companies, change simply has no constituency.”

In these change-averse organizations, change agents often “find their efforts impeded, undermined, or rejected outright. Change agents may also suffer from the delusion that others see the urgent need for action just as they do, and may be frustrated to discover how little key stakeholders care about the initiatives and outcomes they hold dear.”

What is the impact to companies that treat their change agents this way?

Both the companies and people suffer. Change initiatives remain unfinished. Investments do not see their payback. Highly talented change agents are lost. And worse, other potential leaders will think many times over before taking on a change effort that “could derail their careers.”

Well, which companies did best with change?

“Companies that scored highest in leadership development and embracing change were most likely to improve performance.”

The lesson is clear: If companies want to grow, mature, and improve performance, then they need leaders who are visionaries and change agents to step up to the plate.

Those organizations that recognize this truth will embrace their change agents—encourage, recognize, reward, promote, and retain them.

Talented and motivated change agents (like enterprise architects) are an organization’s best hope for innovation, energizing creative potential, and long-term organizational success.



February 7, 2009

The Perilous Pitfalls of Unconscious Decision Making

Every day as leaders, we are called upon to make decisions—some more important than others—but all having impacts on the organization and its stakeholders. Investments get made for better or worse, employees are redirected this way or that, customer requirements get met or are left unsatisfied, suppliers receive orders while others get cancelled, and stakeholders far and wide have their interests fulfilled or imperiled.

Leadership decisions have a domino effect. The decisions we make today will affect the course of events well into the future--especially when we consider a series of decisions over time.

Yet leadership decisions span the continuum from being made in a split second to those that are deliberated long and hard.

In my view, decision makers can be categorized into three types: “impulsive,” “withholding,” and “optimizers.”

  1. Impulsive leaders jump the gun and make a decision without sufficient information—sometimes possibly correctly, but often risking harm to the organization because they don’t think things through.
  2. Withholding leaders delay making decisions, searching for the optimal decision or Holy Grail. While this can be effective to avoid overly risky decisions, the problem is that they end up getting locked into “analysis paralysis”. They never get off the dime; decisions linger and die while the organization is relegated to a status quo—stagnating or even declining in times of changing market conditions.
  3. Optimizers rationally gather information, analyze it, vet it, and drive towards a good enough decision; they attempt to do due diligence and make responsible decisions in reasonable time frames that keep the organization on a forward momentum, meeting strategic goals and staying competitive. But even the most rational individuals can falter in the face of an array of data.

So it is clear that whichever mode decision makers assume, many decisions are still wrong. In my view, this has to do with the dynamics of the decision-making process. Even if they think they are being rational, in reality leaders too often make decisions for emotional or even unconscious reasons. Even optimizers can fall into this trap.

CIOs, who are responsible for substantial IT investment dollars, must understand why this happens and how they can use IT management best practices, structures, and tools to improve the decision-making process.

An insightful article that sheds light on unconscious decision-making, “Why Good Leaders Make Bad Decisions,” was published this month in Harvard Business Review.

The article states: “The reality is that important decisions made by intelligent, responsible people with the best information and intentions are sometimes hopelessly flawed.”

Here are two reasons cited for poor decision making:

  • Pattern Recognition—“faced with a new situation, we make assumptions based on prior experiences and judgments…but pattern recognition can mislead us. When we’re dealing with seemingly familiar situations, our brains can cause us to think we understand them when we don’t.”
  • Emotional Tagging—“emotional information attaches itself to the thoughts and experiences stored in our memories. This emotional information tells us whether to pay attention to something or not, and it tells us what sort of action we should be contemplating.” But what happens when emotion gets in the way and inhibits us from seeing things clearly?

The authors note some red flags in decision making: the presence of inappropriate self-interest, distorting attachments (bonds that can affect judgment—people, places, or things), and misleading memories.

So what can we do to make things better?

According to the authors of the article, we can “inject fresh experience or analysis…introduce further debate and challenge…impose stronger governance.”

In terms of governance, the CIO certainly comes with a formidable arsenal of IT tools to drive sound decision making. In particular, enterprise architecture provides for structured planning and governance; it is the CIO’s disciplined way to identify a coherent and agreed-upon business and technical roadmap and a process to keep everyone on track. It is an important way to create order out of organizational chaos by using information to guide, shape, and influence sound decision making instead of relying on gut, intuition, politics, and subjective management whim—all of which are easily biased and flawed!

In addition to governance, there are technology tools for information sharing and collaboration, knowledge management, business intelligence, and yes, even artificial intelligence. These technologies help to ensure that we have a clear frame of reference for making decisions. We are no longer alone out there making decisions in an empty vacuum; rather, we can now reach out, far and wide, to other organizations, leaders, subject matter experts, and stakeholders to get and give information, to analyze, to collaborate, and to take what would otherwise be sporadic and random data points and instead connect the dots, leading to a logical decision.

To help safeguard the decision process (and no, it will never be failsafe), I would suggest greater organizational investments in enterprise architecture planning and governance, and in technologies that make heavily biased decisions largely a thing of the past.



January 24, 2009

Vision and The Total CIO

Vision is often the telltale demarcation between a leader and a manager. A manager knows how to climb a ladder, but a leader knows where the ladder needs to go—leaders have the vision to point the organization in the right direction!

Harvard Business Review, January 2009, asks “what does it mean to have vision?”

First of all, HBR states that vision is the “central component in charismatic leadership.” They offer three components of vision, and here are my thoughts on these:

  1. “Sensing opportunities and threats in the environment”—(recognizing future impacts) this entails “foreseeing events” and technologies that will affect the organization and one’s stakeholders. This means not only constantly scanning the environment for potential impacts, but also making the mental connections between internal and external factors, the risks and opportunities they pose, and the probabilities that they will occur.
  2. “Setting strategic direction”—(determining plans to respond) this means identifying the best strategies to get out ahead of emerging threats and opportunities and determining how to mitigate risks or leverage opportunities (for example, to increase mission effectiveness, revenue, profitability, market share, and customer satisfaction).
  3. “Inspiring constituents”—(executing on a way ahead) this involves assessing change readiness, “challenging the status quo” (being a change agent), articulating the need and “new ways of doing things,” and motivating constituents to take necessary actions.

The CIO/CTO is in a unique position to provide vision and leadership in the organization, since he or she can bring alignment between business needs and the technologies that can transform the enterprise.

The IT leader cannot afford to get bogged down in firefighting the day-to-day operations to the exclusion of planning for the future of the enterprise. Firefighting is mandatory when there is a fire, but the fire must eventually be extinguished, and the true IT leader must provide a vision that goes beyond tomorrow’s network availability and application up-time. Sure, the computers and phones need to keep working, but the real value of the IT leader is in providing a vision of the future and not just more of the status quo.

The challenge for the CIO/CTO is to master the business and the technical, the present and the future—to truly understand the mission and the stakeholders as they are today, as well as the various technologies and management best practices available and emerging to modernize and reengineer. Armed with business and technical intelligence and a talent to convert the as-is to the to-be, the IT leader can increase organizational efficiency and effectiveness, help the enterprise better compete in the marketplace, and more fully satisfy customers now and in the future.


January 18, 2009

Information: Knowledge or B.S.?

With modern technology and the Internet, there is more information out there than ever before in human history. Some argue there is too much information or that it is too disorganized and hence we have “information overload.”

The fact that information itself has become a problem is validated by the fact that Google is the world’s #1 brand with a market capitalization of almost $100 billion. As we know, the mission statement of Google is “to organize the world's information and make it universally accessible and useful.”

The key to making information useful is not just organizing it and making it accessible, but also to make sure that it is based on good data—and not the proverbial, “garbage in, garbage out” (GIGO).

There are two types of garbage information:

  1. Incorrect, incomplete, or dated
  2. Misleading /propagandistic or an outright lie

When information is not reliable, it causes confusion rather than bringing clarity. The information can then actually result in worse decision making than if you didn’t have it in the first place. This makes for an enterprise architecture that is not only worthless, but harmful, even poisonous, to the enterprise.

Generally, in enterprise architecture, we are optimistic about human nature and focus on #1, i.e., we assume that people mean to provide objective and complete data and try to ensure that they can do that. But unfortunately there is a darker side to human nature that we must grapple with, and that is #2.

Misinformation, by accident or by intent, is used in organizations all the time and leads to poor investment decisions. Just think how many non-standardized, non-interoperable, costly tools your organization has bought because someone provided “information” or developed a business case which “clearly demonstrated” that it was a great investment with a high ROI. Everyone wants their toys!

Wired Magazine, February 2009, talks about disinformation in the information age in “Manufacturing Confusion: How more information leads to less knowledge” (Clive Thompson).

Thompson writes about Robert Proctor, a historian of science from Stanford, who coined the word “Agnotology,” or “the study of culturally constructed ignorance.” Proctor theorizes that “people always assume that if someone doesn’t know something, it’s because they haven’t paid attention or haven’t yet figured it out. But ignorance also comes from people literally suppressing truth—or drowning it out—or trying to make it so confusing that people stop hearing about what’s true and what’s not.” Thompson offers as examples:

  1. “Bogus studies by cigarette companies trying to link lung cancer to baldness, viruses—anything but their product.”
  2. Financial firms creating fancy-dancy financial instruments like “credit-default swaps [which] were designed not merely to dilute risk but to dilute knowledge; after they had changed hands and been serially securitized, no one knew what they were worth.”

We have all heard the saying that “numbers are fungible” and we are also all cautious about “spin doctors” who appear in the media telling their side of the story rather than the truth.

So it seems that despite the advances wrought by the information revolution, we have some new challenges on our hands: not just incorrect information but people who literally seek to promote its opposite.

So we need to get the facts straight. And that means not only capturing valuable information, but also eliminating bias so that we are not making investment decisions on the basis of B.S.



January 17, 2009

Decentralization, Technology, and Anti-Terror Planning


Even though there hasn’t been a successful terrorist attack against the United States since 9/11, we are all aware that terrorists continue to seek ways to harm us. Of course, we have assets deployed nationally as well as internationally to protect our interests. However, there is always more that can be done. And one thing that immediately comes to my mind is decentralization.

The concept of decentralization is very simple. Rather than concentrating all your vital assets in one place, you spread them out so that if one is destroyed, the others remain functional. The terrorists already do this by operating in dispersed “cells.” Not only that, but we know that very often one “cell” doesn’t know what the other one is doing or even who they are. All this to keep the core organization intact in case one part of it is compromised.

Both the public and private sectors understand this and often strategically decentralize and have backup and recovery plans. However, we still physically concentrate the seat of our federal government in a geographically close space. Given that 9/11 represented an attack on geographically concentrated seats of U.S. financial and government power, is it a good enterprise architecture decision to centralize many or all government headquarters in one single geographic area?

On the one hand the rationale for co-locating federal agencies is clear: The physical proximity promotes information-sharing, collaboration, productivity, a concentrated talent pool, and so on. Further, it is a signal to the world that we are a free and proud nation and will not cower before those who threaten us.

Yet on the other hand, technology has advanced to a point where physical proximity, while a nice-to-have, is no longer an imperative to efficient government. With modern telecommunications and the Internet, far more is possible today than ever before in this area. Furthermore, while we have field offices dispersed throughout the country, perhaps having some headquarters outside DC would bring us closer to the citizens we serve.

On balance, I believe that both centralization and decentralization have their merits, but that we need to more fully balance these. To do this, we should explore the potential of decentralization before automatically reverting to the former.

It seems to me that decentralization carries some urgency given the recent report “World At Risk,” by the Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism. It states that “terrorists are determined to attack us again—with weapons of mass destruction if they can. Osama bin Laden has said that obtaining these weapons is a ‘religious duty’ and is reported to have sought to perpetrate another ‘Hiroshima.’”

Moreover, the report goes on to state that the commission “believes that unless the world community acts decisively and with great urgency, it is more likely than not that a weapon of mass destruction will be used in a terrorist attack somewhere in the world by the end of 2013.”

Ominously, the report states: “we know the threat we face. We know our margin of safety is shrinking, not growing. And we know what we must do to counter the risk.”

Enterprise architecture teaches us to carefully vet and make sound investment decisions. Where should we be investing our federal assets—centrally or decentrally, and how much in each category?

Obviously, changing the status quo is not cheap and would be especially difficult in the current global economic reality. But it is still something we should carefully consider.



January 11, 2009

Choice Architecture and Enterprise Architecture

In a free society like America, we are generally all strong believers in our rights and freedoms—like those often cited from the Bill of Rights-- speech, press, religion, assembly, bearing arms, due process and so forth. More broadly, we cherish our right and freedom to choose.

According to a recent article in Harvard Business Review, December 2008, one way that enterprises can better architect their products and services is by “choice architecture.”

Choice Architecture is the “design of environments in order to influence decisions.” By “covertly or overtly guiding your choices,” enterprises “benefit both company and consumer by simplifying decision making, enhancing customer satisfaction, reducing risk, and driving profitable purchases.”

For example, companies set “defaults” for products and services that are “the basic form customers receive unless they take action to change it.”

“At a basic level, defaults can serve as manufacturer recommendations, and more often than not we’re happy with what we get by accepting them. [For example,] when we race through those software installation screens and click ‘next’ to accept the defaults, we’re acknowledging that the manufacturer knows what’s best for us.”

Of course, defaults can be nefarious as well. “They have caused many of us to purchase unwanted extended warranties or to inadvertently subscribe to mailing lists.”

“Given the power of defaults to influence decisions and behaviors both positively and negatively, organizations must consider ethics and strategy in equal measure in designing them.”

Here are some interesting defaults and how they affect decision making:

Mass defaults—“apply to all customers…without taking customers’ individual preferences into account.” This architecture can result in suboptimal offerings and therefore some unhappy customers.

Some mass defaults have hidden options—“the default is presented as a customer’s only choice, although hard-to-find alternatives exist.” For example, computer industry vendors, such as Microsoft, often use hidden options to keep the base product simple, while at the same time having robust functionality available for power users.

Personalized defaults—“reflect individual differences and can be tailored to better meet customers’ needs.” For example, information about an individual’s demography or geography may be taken into account for product/service offerings.

One type of personalized default is adaptive defaults—which “are dynamic: they update themselves based on current (often real-time) decisions that a customer has made.” This is often used in online retailing, where customers make a series of choices.

There are other default types, such as benign, forced, random, persistent, and smart, each limiting or granting greater amounts of choice to decision makers.
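
To make the distinctions concrete, here is a rough sketch in Python of how a mass default, a personalized default, and an adaptive default might be computed; the option names, customer attributes, and rules are invented purely for illustration.

# All names and rules here are invented for illustration.
MASS_DEFAULT = {"plan": "standard", "newsletter": True}   # one size fits all

def personalized_default(profile):
    """Tailor the starting choice to what we know about the customer."""
    default = dict(MASS_DEFAULT)
    if profile.get("country") == "DE":
        default["newsletter"] = False          # respect stricter opt-in norms
    if profile.get("segment") == "power_user":
        default["plan"] = "professional"
    return default

def adaptive_default(recent_choices):
    """Update the default dynamically from the customer's latest decisions."""
    default = dict(MASS_DEFAULT)
    if recent_choices:
        default.update(recent_choices[-1])     # start from the last thing they picked
    return default

# A returning customer who just chose the "basic" plan sees that choice
# pre-selected next time, instead of the mass default.
print(adaptive_default([{"plan": "basic"}]))   # {'plan': 'basic', 'newsletter': True}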

When we get defaults right (whether we are designing software, business processes, other end-user products, or supplying services), we can help companies and customers to make better, faster, and cheaper decisions, because there is “intelligent” design to guide the decision process. In essence, we are simplifying the decision making process for people, so they can generally get what they want in a logical, sequenced, well-presented way.

Of course, the flip side is that when choice architecture is done poorly, we unnecessarily limit options, drive people to poor decisions, and people are dissatisfied and will seek alternative suppliers and options in the future.

Certainly, we all love to choose what we want, how we want, when we want, and so on. But as all of us have probably experienced at one time or another, when there are too many choices, unconstrained, not guided, not intelligently presented, consumers/decision makers can be left dazed and confused. That is why we can benefit from choice architecture (when done well) to help make decision making simpler, smarter, faster, and generally more user-centric.



January 10, 2009

Why We Make Bad Decisions and Enterprise Architecture

With the largest Ponzi scheme in history ($50 billion!!) still unfolding, and savvy investors caught off guard, everyone is asking how can this happen—how can smart, experienced investors be so gullible and make such big mistakes with their savings?

To me the question is important from an enterprise architecture perspective, because EA seeks to help organizations and people make better decisions and not get roped into decision-making by gut, intuition, politics, or subjective management whim. Are there lessons to be learned from this huge and embarrassing Ponzi scheme that can shed light on how people get suckered in and make the wrong decision?

The Wall Street Journal, 3-4 January, has a fascinating article called the “Anatomy of Gullibility,” written by one of the Madoff investors who lost 30% of their retirement savings in the fund.

Point #1—Poor decision-making is not limited to investing. “Financial scams are just one of the many forms of human gullibility—along with war (the Trojan Horse), politics (WMD in Iraq), relationships (sexual seduction), pathological science [people are tricked into false results]…and medical fads.”

Point #2—Foolish decisions are made despite information to the contrary (i.e., warning signs). “A foolish (or stupid) act is one in which someone goes ahead with a socially or physically risky behavior in spite of danger signs or unresolved questions.”

Point #3—There are at least four contributors to making bad decisions.

  • SITUATION—There has to be an event that requires a choice (i.e., a decision point). “Every gullible act occurs when an individual is presented with a social challenge that he has to solve.” In the enterprise, there are situations (economic, political, social, legal, personal…) that necessitate decision-making every day.
  • COGNITION—Decision-making requires cognition, whether sound or unsound. “Gullibility can be considered a form of stupidity, so it is safe to assume deficiencies in knowledge and/or clear thinking are implicated.” In the organization and personally, we need lots of good useful and usable information to make sound decisions. In the organization, enterprise architecture is a critical framework, process, and repository for the strategic information to aid cognitive decision-making processes.
  • PERSONALITY—People and their decisions are influenced positively or negatively by others (this includes the social effect…are you following the “in-crowd”?). “The key to survival in a world filled with fakers…or unintended misleaders…is to know when to be trusting and when not to be.” In an organization and in our personal lives, we need to surround ourselves with those who can be trusted to provide sound advice and guidance and genuinely look after our interests.
  • EMOTION—As humans, we are not purely rational beings; we are swayed by feelings (including fear, greed, compassion, love, hate, joy, anger…). “Emotion enters into virtually every gullible act.” While we can never remove emotion from the decision-making process, nor is it even desirable to do so, we do need to identify the emotional aspects and put them into perspective. For example, the enterprise may feel threatened and competitive in the marketplace and feel a need to make a big technological investment; however, those feelings should be tempered by an objective business case including cost-benefit analysis, analysis of alternatives, risk determination, and so forth.

Hopefully, by better understanding the components of decision-making and what makes us as humans gullible and prone to mistakes, we can better structure our decision-making processes to enable more objective, better vetted, far-sighted and sound decisions in the future.



January 4, 2009

The Need for Control and Enterprise Architecture

Human beings have many needs and these have been well documented by prominent psychologists like Abraham Maslow.

At the most basic level, people have physiological needs for food, water, shelter, and so on. Then “higher-level” needs come into play including those for safety, socializing, self-esteem, and finally self-actualization.

The second-order need for safety incorporates the human desire to feel a certain degree of control over one’s life, and to know that there are, from the macro perspective, elements of predictability, order, and consistency in the world.

Those of us who believe in G-d generally attribute “real” control over our lives and world events to being in the hands of our creator and sustainer. Nevertheless, we see ourselves having an important role to play in doing our part—it is here that we strive for control over our lives in choosing a path and working hard at it. A lack of any semblance of control over our lives makes us feel like sheer puppets without the ability to affect things positively or negatively. We are lost in inaction and frustration that whatever we do is for naught. So the feeling of being able to influence or impact the course of our lives is critical for us as human beings to feel productive and a meaningful part of the universe that we live in.

How does this impact technology?

Mike Elgan has an interesting article in Computerworld, 2 January 2009, called “Why Products Fail,” in which he postulates that technology “makers don’t understand what users want most: control.”

Of course, technical performance is always important, but users also have a fundamental need to feel in control of the technology they are using. The technology is a tool for humans and should be an extension of our capabilities, rather than something that, like in the movie Terminator, runs rogue and out of the control of the human beings who made it.

When do users feel that the technology is out of their control?

Well, aside from getting the blue screen of death: when they are left waiting for the computer to do something (especially when they don’t know how long it will be), and when the user interface is complicated and not intuitive, and they cannot find or easily understand how to do what they want to do.

Elgan says that there are a number of elements that need to be built into technology to help users feel in control.

Consistency—“predictability…users know what will happen when they do something…it’s a feeling of mastery of control.”

Usability—“give the user control, let them make their own mistakes, then undo the damage if they mess something up” as opposed to the “Microsoft route—burying and hiding controls and features, which protects newbies from their own mistakes, but frustrates the hell out of experienced users.”

Simplicity—“insist on top-to-bottom, inside-and-outside simplicity,” rather than “the company that hides features, buries controls, and groups features into categories to create the appearance of few options, without actually reducing options.”

Performance/Stability—“everyone hates slow PCs. It’s not the waiting. It’s the fact that the PC has wrenched control from the user during the time that the hourglass is displayed.”
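
To illustrate that last point, here is a minimal, console-only sketch in Python (the task, timings, and cancel gesture are invented) of one way to avoid wrenching control from the user: do the slow work on a background thread, report progress as it goes, and honor a cancel request at any time.

import threading
import time

cancel_requested = threading.Event()

def long_running_task(steps=10):
    """Stand-in for slow work; checks for cancellation and reports progress."""
    for step in range(1, steps + 1):
        if cancel_requested.is_set():
            print("\nTask cancelled at the user's request.")
            return
        time.sleep(0.5)                                   # pretend to work
        print("\rProgress: {}%".format(step * 100 // steps), end="", flush=True)
    print("\nDone.")

worker = threading.Thread(target=long_running_task)
worker.start()

# The main ("UI") thread stays responsive and honors a cancel (Ctrl+C here).
try:
    worker.join()
except KeyboardInterrupt:
    cancel_requested.set()
    worker.join()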

Elgan goes on to say that vendors’ product tests “tend to focus on enabling users to ‘accomplish goals’…but how the user feels during the process is more important than anything else.”

As a huge proponent of user-centricity, I agree that people have an inherent need to feel they have some sort of control in their lives, with the technology they use, and over the direction that things are going in (i.e., enterprise architecture).

However, I would disagree that how the user feels is more important than how well we accomplish goals; mission needs and the ability of the user to execute on these must come first and foremost!

In performing our mission, users must be able to do their jobs, using technology, effectively and efficiently. So really, it’s a balance between meeting mission requirements and considering how users feel in the process.

Technology is amazing. It helps us do things better, faster, and cheaper than we could ever do by ourselves. But we must never forget that technology is an extension of ourselves and as such must always be under our control and direction in the service of a larger goal.



January 3, 2009

Embedded Systems and Enterprise Architecture

Information technology is not just about data centers, desktops, and handheld devices anymore. These days, technology is everywhere—embedded in all sorts of devices from cars and toaster ovens to traffic lights and nuclear power plants. Technology is pervasive in every industry from telecommunications to finance and from healthcare to consumer electronics.

Generally, embedded systems are dedicated to specific tasks, while general-purpose computers can be used for a variety of functions. In either case, the systems are vital for our everyday functioning.

Government Computer News, 15 December 2008, reports that “thanks to the plummeting cost of microprocessors, computing…now happens in automobiles, Global Positioning Systems, identification cards and even outer space.”

The challenge with embedded systems is that they “must operate on limited resources—small processors, tiny memory and low power.”

Rob Oshana, director of engineering at Freescale Semiconductor, says that “With embedded it’s about doing as much as you can with as little as you can.”

What’s new—haven’t we had systems embedded in automobiles for years?

“Although originally designed for interacting with the real world, such systems are increasingly feeding information into larger information systems,” according to Wayne Wolf, chair of embedded computing systems at Georgia Institute of Technology.

According to Wolf, “What we are starting to see now is [the emergence] of what the National Science Foundation is calling cyber-physical systems.”

In other words, embedded systems are used for command and control or information capture in the physical domain (like in a car or medical imaging machine), but then they can also share information over a network with others (think OnStar or remote medical services).

When the information is shared from the car to the OnStar service center, information about an accident can be turned into the dispatch of life-saving responders. Similarly, when scans from a battlefield MRI are shared with medical service providers back in the States, quality medical services can be provided, when necessary, from thousands of miles away.
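
As a minimal sketch of that cyber-physical pattern (the endpoint, sensor, and threshold below are hypothetical stand-ins, and real firmware on a constrained device would likely be written in C), an embedded node might capture a reading locally and share only the meaningful events with a central service over the network.

import json
import random
import time
import urllib.request

# Hypothetical central service for the larger information system.
TELEMETRY_ENDPOINT = "https://example-fleet-service.invalid/telemetry"

def read_sensor():
    """Pretend to sample a local sensor (for example, an impact detector)."""
    return {"timestamp": time.time(), "impact_g": round(random.uniform(0, 3), 2)}

def share_reading(reading):
    """Feed the local measurement into the central system for analysis and dispatch."""
    payload = json.dumps(reading).encode("utf-8")
    request = urllib.request.Request(
        TELEMETRY_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=5)

if __name__ == "__main__":
    reading = read_sensor()
    if reading["impact_g"] > 2.0:   # escalate only the events worth sharing
        share_reading(reading)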

As we should hopefully have all come to learn after 9-11, information hoarding is faux power. But when information is shared, the power is real because it can be received and used by others and others, so that its influence is exponential.

Think, for example, of the Mars Rover, which has embedded systems for capturing environmental samples. Left alone, the information is contained to a physical device millions of miles away, but by sharing the information back to remote tracking stations here on Earth, it can be analyzed, shared, studied, and so forth, with almost endless possibilities for ongoing learning and growth.

The world has changed from embedded systems to a universe of connected systems.

Think distributed computing and the internet. With distributed computing alone, we are left with silos or separate domains of information; but by connecting those islands of information, using the internet for example, we can all harness the vast amounts of information out there, process it within our own lives, and contribute information back to others.

The connection and sharing is our strength.

In the intelligence world, information is often referred to as dots, and it is the connection of the dots that makes for viable and actionable intelligence.

As people, we are also proverbially just little dots in this big world of ours.

But as we have learned with social media, we are able to grow as individuals and become more potent and more fulfilled human beings by being connected with others—we’ve gone from doing this within our limited physical geographies to doing it with a much larger population in cyberspace.

In the end, information resides in people or can be embedded in machines, but connecting that information with other humans and machines is the true power of information technology.



January 2, 2009

It’s Time to Stop the Negativity and Move Toward Constructive Change

Recently, there was an article in Nextgov (http://techinsider.nextgov.com/2008/12/the_industry_advisory_council.php) about the Industry Advisory Council (IAC), a well-respected industry-government consortium under the auspices of the American Council for Technology, which recommended to the incoming Obama Administration the stand-up of an innovation agency under the new Chief Technology Officer.
The Government Innovation Agency “would serve as an incubator for new ideas, serve as a central repository for best practices and incorporate an innovation review in every project. As we envision it, the Government Innovation Agency would house Centers of Excellence that would focus on ways to achieve performance breakthroughs and leverage technology to improve decision making, institute good business practices and improve problem solving by government employees.”
While I am a big proponent of innovation and leveraging best practices, what was interesting to me was not so much the proposal from IAC (which, by the way, I am not advocating for) as one of the blistering comments posted anonymously by one of the readers, under the pseudonym “concerned retiree,” which I am posting in its entirety as follows:
“Hmmmmm...."innovation"..."central repository of new ideas"......can this be just empty news release jargon? Just more slow-news day, free-range clichés scampering into the daily news hole?.. .or perhaps this item is simply a small sized news item without the required room to wisely explicate on the real life banalities of the government sponsored “innovation” world...such as: 1)patent problems - is the US going to be soaking up, or handing out patent worthy goodies via the "innovation" czar or czarina? Attention patent attorneys, gravy train a comin’ 2)"leverage technology to improve decision making" – wow! a phrase foretelling a boon-doggle bonanza, especially since it’s wonderfully undefined and thereby, prompting generous seed money to explore it’s vast potential (less just fund it at say, $20-30 million?); 3) the "Government Innovation Agency" - -well now, just how can we integrate this new member to the current herd of government “innovation” cows, including: A) a the Dod labs, like say the Naval Research Lab, or the Dept of Commerce lab that produced the Nobel prize winner (oh, I see now, the proposal would be for “computer” type innovation pursuits – oh, how wise, like the health research lobbyists, we’re now about slicing “innovation” and/or research to match our vendor supplier concerns, how scientific!, how MBAishly wise); B) existing labs in private industry (e.g. former Bell Labs. GM-Detroit area "labs"/innovation groups), C) university labs – currently watered by all manner of Uncle Sam dollars via the great roiling ocean of research grants. Finally - given the current Wall Street melt-down and general skepticism for American business nimbleness (this too will pass, of course) -- what's the deal with all the Harvard Grad School-type hyper-ventilation on the bubbling creativity (destructive or otherwise) of American capitalism - -surely the GAO/Commerce/SEC could pop out some stats on the progressive deterioration of expenditures -- capital and otherwise--on "innovation". Or perhaps the sponsors of the "Government Innovation Agency" - will be happy to explain at the authorization hearing - how all the dough to date spent to date on development of the green automobile has yet to put a consumer friendly one on the road from a US corp -- a fact that argues either for a vast expansion of the GIA, or, the merciful euthenasiaing of this dotty idea. See you all at the authorizing hearing?”
What’s so disheartening about this retiree’s comments?
It’s not that there isn’t some truth intermixed with the blistering comments; it is the sheer magnitude of the cynicism, bitterness, negativity, resistance to “new” (or at times reformulated) ideas, and “been-there-done-that” attitude that unfairly gives a bad name to other government workers who are smart, innovative, positive, and hard-charging, and who want to continuously improve the effectiveness and efficiency of government for the benefit of the nation and to serve our citizens.
Sure, we need to listen and learn from those who preceded us--those with age, experience, expertise, and certainly vast amounts of wisdom. And yes, those of us who do not learn from the mistakes of the past are doomed to repeat them. So we must be mindful to be respectful, collaborative, inclusive, and careful to vet new ideas and changes.
However, those that have served before or have been serving a long time now should also give hope, innovation, change (not for change’s sake, but based on genuine learning and growth) and continuous improvement a chance.
It is always easier to be a naysayer, a doomsday prognosticator, and to tear down and destroy. It is much, much harder to be positive, hopeful, and constructive—to seek to build a brighter future rather than rest on the laurels of the past.
Unfortunately, many people have been hurt by past mistakes, false leaders, broken promises, and dashed hopes, so they become resistant to change, in addition to, of course, fearing change.
Those of us in information technology and other fields (like science, engineering, product design and development, and so many others—in fact, all of us can make a difference) need to stay strong amid the harsh rhetoric of negativity and pessimism, and instead continue to strive for a better tomorrow.


January 1, 2009

Scrapping the Landlines and The Total CIO

It’s long overdue. It’s time to get rid of the landline telephones from the office (and also from our homes, if we still have them). Wireless phones are more than capable of doing the job, and just think: you probably already have at least one for business and one for personal use, so redundancy is built in!
Getting rid of the office phones will save the enterprise money, reduce a maintenance burden (like for office moves) and remove some extra telejunk clutter from your desk. More room for the wireless handheld charger. :-)
USA Today, 20 December 2008, reports that according to Forrester Research, an “estimated 25% of businesses are phasing out desk phones in [an] effort to save more money.”
Additionally, “more than 8% of employees nationwide who travel frequently have only cellphones.”
Robert Rosenberg, president of The Insight Research Corp., stated: “U.S. businesses are lagging behind Europe and Asia in going wireless, because major cellular carriers…are also earning money by providing landlines to businesses—an $81.4 billion industry in 2008.”
“In Washington, D.C., the City Administrator’s office launched a pilot program in October in which 30 employees with government-issued cellphones gave up their desk phones, said deputy mayor Dan Tangherlini. Because the government has issued more than 11,000 cellphones to employees, the program could multiply into significant savings.”
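To put a rough number on the potential savings (the per-line cost here is purely an illustrative assumption on my part, not a figure from the article): if each desk line costs on the order of $35 a month in service and maintenance, then 11,000 lines work out to roughly 11,000 x $35 x 12, or about $4.6 million a year, money that could be redirected once employees rely on the cellphones they already carry.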
A study by the National Center for Health Statistics between January and June found that more than 16% of families “have substituted a wireless telephone for a land line.”
So what’s stopping organizations from getting rid of the traditional telephones?
The usual culprits: resistance to change, fear of making a mistake, and not wanting to give up something we already have—“old habits die hard,” and people don’t like to let go of their little treasures—even a bulky old desk phone (with the annoying cord that keeps getting twisted).
Things are near and dear to people, and they clutch on to them with their last breath—in their personal lives (think of all the attics, garages, and basements full of items people can’t let go of—yard sale, anyone?) and in their professional lives (things equate to stature, tenure, turf—a bigger rice bowl, sound familiar?).
Usually the best way to get rid of something is to replace it with something better, so the Total CIO needs to tie the rollout of new handheld devices to people turning in their old devices--landlines, pagers, and even older cell phones (the added benefit is more room and less weight pulling on your belt).
By the way, we need to do the same thing with the new application systems that we roll out: when the new one is fully operational, then the old system needs to be retired. Now, how often does that typically happen?
Folks, times are tough, global competition is not going away, and we are wasting too much money and time maintaining legacy stuff we no longer need. We need to let go of the old and progress with the new and improved.
