August 30, 2009

Privacy vs. Exhibitionism

We are a nation torn between wanting our privacy safeguarded on the one hand and, on the other, wanting to share ourselves openly and often on the Internet—through social media, e-commerce, e-mail, and so forth.

These days, we have more information about ourselves available to others than at any time in history. We are information exhibitionists—essentially an open book—sharing virtually everything about ourselves with everybody.

Online, we have our personal profile, photos, videos, likes and dislikes, birth date, addresses, email and phone contacts, employer, resume, friends and family connections, banking information, real estate transactions, legal proceedings, tax returns, and more. In a sense, we have become an exhibitionist nation.

While we continue to friend, blog, tweet, and post our thoughts, feelings, and personal information online, we are shocked and dismayed when there is a violation of our privacy.

How did we get to this point? Here are some major milestones on privacy (in part from MIT Technology Review, July/August 2009):

1787—“Privacy” does not appear in the Constitution, but the concept is embedded in protections such as the restriction on quartering soldiers in private homes (Third Amendment), the prohibition against unreasonable search and seizure (Fourth Amendment), and the prohibition against forcing a person to be a witness against himself (Fifth Amendment).

1794—Telegraph invented

1876—Telephone invented

1890—Boston lawyers Samuel Warren and Louis Brandeis wrote in the Harvard Law Review of “the right to be let alone” and warned that invasive technologies threatened to take “what was whispered in the closet” and have it “proclaimed from the house-tops.”

1914—Federal Trade Commission Act prohibits businesses from engaging in “unfair or deceptive acts or practices”; has been extended to require companies to write privacy policies describing what they do with personal information they collect from customers and to honor these policies.

1934—Federal Communications Act limits government wiretapping

1969—ARPANet (precursor to the Internet) went live

1970—Fair Credit Reporting Act regulates the collection, dissemination, and use of consumer information, including credit information

1971—First e-mail sent.

1973—Code of Fair Information Practices limits secret data banks, requires that organizations ensure they are reliable and protected from unauthorized access, provides for individuals to be able to view their records and correct errors.

1974—Privacy Act prohibits disclosure of personally identifiable information held by federal agencies.

1988—Video Privacy Protection Act protects against disclosure of video rentals and sales.

1996—Health Insurance Portability and Accountability Act (HIPAA) protects against disclosures by health care providers.

1999—Scott McNealy, CEO of Sun Microsystems, states: “You have zero privacy anyway. Get over it.”

2000—Children’s Online Privacy Protection Act prohibits intentional collection of information from children 12 or younger

2001—USA Patriot Act expands government’s power to investigate suspected acts of terrorism

2003—Do Not Call Implementation Act limits telemarketing calls

2006—Google Docs released for creating and editing documents online

2009—Facebook becomes the 4th most popular website in the world

As anyone can see, there is a long history of protecting privacy. Obviously, we want to be protected. We need to feel secure. We fear our information being misused, exploited, or otherwise getting out of our control.

Yet, as technology progresses, the power of information sharing, collaboration, and online access is as endlessly enticing as it is useful, convenient, and entertaining. We love to go online and communicate with people near and far, conduct e-commerce for any product nearly seamlessly, and work more and more productively and creatively.

The dichotomy between privacy and exhibitionism is strong and disturbing. How do we ensure privacy when we insist on openness?

First, let me say that I believe the issue here is greater than the somewhat simplistic answers currently out there. Obviously, we must rely on a combination of common sense and technology.

From a common sense perspective, we need to personally safeguard truly private information—Social Security numbers and mother’s maiden name are just the obvious examples. We need to be concerned not only about distinct pieces of information, but about information in the aggregate. In other words, individual pieces of information may not be easily exploitable, but when aggregated with other publicly available information, you may be truly exposed.
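
To make the aggregation risk concrete, here is a minimal, hypothetical sketch (all names, fields, and values are made up) of how an “anonymized” record can be re-identified by joining it with a separate, publicly available list on quasi-identifiers such as ZIP code, birth date, and gender:

```python
# Hypothetical illustration of re-identification by aggregation.
# Neither data set alone names the person behind the sensitive record,
# but joining them on ZIP + birth date + gender narrows it to one match.

public_directory = [  # e.g., scraped profiles or public records (made-up data)
    {"name": "Pat Smith", "zip": "20001", "birth_date": "1970-05-02", "gender": "F"},
    {"name": "Lee Jones", "zip": "20001", "birth_date": "1982-11-17", "gender": "M"},
    {"name": "Sam Brown", "zip": "22102", "birth_date": "1970-05-02", "gender": "F"},
]

anonymized_record = {  # "de-identified" data with the name stripped out
    "zip": "20001", "birth_date": "1970-05-02", "gender": "F", "diagnosis": "...",
}

matches = [
    person for person in public_directory
    if all(person[k] == anonymized_record[k] for k in ("zip", "birth_date", "gender"))
]

if len(matches) == 1:
    print(f"Re-identified: {matches[0]['name']}")  # the aggregate exposes the individual
```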

In terms of technology, we need to invest more time, money, and effort into securing our systems and networks. Unfortunately, businesses are more concerned with quarterly revenue and profit targets than with securing our personal information. We have got to incentivize every business, organization, and government entity to put security and privacy first. Just as we teach our children “safety first,” we need to change our adult priorities as well or risk serious harm to ourselves and our nation from cyber criminals, terrorists, and hostile nation-states.

But the real issue is, why do we continue to treat technology as if it were more secure and private than it truly is? In a sense, we shut our eyes to the dangers that we know are lurking and tell ourselves, “it only happens to somebody else.” How do we temper our enthusiasm for technological progress with a realistic recognition of the very real dangers that persist?



August 29, 2009

Information Stats to Scare

We all know that we are generating and receiving more information than ever. Good thing? I like to think so, but sometimes you can have too much of even a good thing.

Certainly, information is a strategic asset—it’s vital to making sound decisions, essential for effective communications, and critical for expanding our thinking, breaking paradigms, performing predictive analysis, and helping us to innovate.

But when information is too voluminous, too disorganized, too frequent, or too disruptive, its value is diminished and organizations and individuals suffer negative effects.

Here are some information stats to scare from Harvard Business Review (September 2009):

  • 60%—Those who have checked email in the bathroom (and 15% even admitted to checking it while in church)
  • 20—Average hours per week spent by knowledge workers on email
  • 85%—Computer users who would take a laptop on vacation
  • 1/3—Emails considered unnecessary
  • 300—Number of emails executives get a day
  • 24—Minutes it takes a worker to recover from being interrupted by an email notification
  • 40—Number of websites employees visit on an average day
  • 26%—People who want to delete all their emails (declare “e-mail bankruptcy”) and start over
  • 3—Number of minutes before knowledge workers switch tasks
  • ~$1 trillion—Cost to the economy of information overload
  • 85%—Emails opened within 2 minutes
  • 27%—Amount of the workday eaten up by interruptions
  • 2.8 trillion gigabytes—Size of digital information by 2011
  • 31%—Workers whose quality of life is worsened by email

Some interesting antidotes offered by HBR:

  • Balance—weigh the costs and benefits before sending another email
  • Reply to all—disable the reply-all button
  • Five sentences—keep emails to 5 sentences or less
  • Allots—affix virtual currency from a fixed daily amount to each email based on its importance
  • IM Savvy—a program by IBM that senses when you are busy by detecting your typing patterns and tells would-be interrupters that you are busy
  • BlackBerry Orphans—to regain the attention of their parents, children are flushing their parents’ BlackBerrys down the toilet

While the issues and proposed remedies for information overload are thought-provoking (and somewhat humorous), what is fascinating to me is how technology and the speed of its advancement and adoption are affecting people and organizations positively, but also—less often discussed—negatively.

It seems like life keeps accelerating—faster and faster—but the quality is deteriorating in terms of fuzzy boundaries between work and life, weakening of our closest relationships, burnout of our best and hardest-working people, and unrealistic expectations that people be always on—just like the email account that keeps spitting out new messages.

Somewhere along the line, we need to hit the proverbial “reset button” and recognize that information and communication are truly strategic assets and as such need to be used intelligently and with good measure or else we risk cheapening their use and limiting their effectiveness.



August 28, 2009

Minimalism (short title intentional)

The question of the day—is less really more?

I don’t know a lot about art (except that I appreciate it when it’s good). But I remember often hearing subtle advice about leaving plenty of “white space”—i.e. don’t clutter up the work, because less is more.

Recently, I heard some manager at work say: “I don’t care what it looks like…just give me content, content, content.” Again, to me the theme was the same—as they say, keep it simple stupid (a.k.a. KISS).

It reminded me of what one of my high school teachers used to say about class assignments: Just give me the “meat and potatoes”.

Then, I read an interesting article in Wired (September 2009) about Craig Newmark and his company, Craigslist, which is the epitome of minimalism, when it comes to design, features, and functions.

“Besides offering nearly all of its features for free, it scorns advertising, refuses investments, ignores design, and does not innovate.”

Craigslist looks like no other website that I’ve ever seen on the Internet. It has no graphics. No pictures (unless it’s associated with a listing). Little real text. It’s basically just layers upon layers of links, until you get to a particular listing. The site seems to disregard all the accepted standards of website design, navigation, and functionality.

“Craigslist is one of the strangest monopolies in history, where customers are locked in by fees set at zero and where the ambiance of neglect is not a way to extract more profit but the expression of a world view.”

And what is Craig Newmark’s world view?

Minimalism and simplicity.

And in the crazy world we live in today of hyper consumerism, accumulation of wealth, ever-increasing productivity, acceleration of communications, boosting of processing power, aggregation of data, and doing more with less—the simple and minimalistic approach of Craigslist is an oasis in a desert of often meaningless greed and gluttony.

Newmark says: “People are good and trustworthy and generally just concerned with getting through the day.”

Therefore, “All you have to do to serve them well is build a minimal infrastructure allowing them to get together and work things out for themselves. Any additional features are almost superfluous and could even be damaging.”

So how is Craigslist doing with such a simple approach—is it being overrun by the more aggressive web builders and entrepreneurs of our time?

Au contraire. “Craigslist gets more traffic than either eBay or Amazon.com. eBay has more than 16,000 employees. Amazon has more than 20,000. Craigslist has 30.”

Moreover, according to its fact sheet, Craigslist has more than 20 billion page views per month, and more than 50 million people use it in the U.S. alone.

Estimates are that Craigslist generates more than $100 million in revenue and is worth billions.

While I can't say that I am a big user myself, these are some pretty amazing stats for a site that is bare bones and maybe more than a little awkward.

Newmark’s philosophy is: why add the “bells and whistles” if the user doesn’t want or need them?

In a sense, Craig Newmark is one of the most user-centric enterprise architects of our time. He genuinely seeks to understand his customers’ needs and to serve them in a way that meets them in an almost primal fashion.

Newmark has architected Craigslist in a uniquely user-centric way, undeterred that it runs counter to almost all conventional website wisdom.



August 23, 2009

E-memory and Meat Memory

As we move towards a “paperless society” and migrate our data to the computer and the Internet, our personal profiles, resumes, photos, videos, emails, documents, presentations, news items, scanned copies of diplomas and awards, contact lists, and even financial, tax, and property records can all be found online.

People have so much information on the web (and their hard drive) these days that they fear one of two things happening:

  1. Their hard drive will crash and they will lose all their valuable information.
  2. Someone will steal their data and their identity (identity theft).

For each of these, people are taking various precautions to protect themselves such as backing up their data and regularly and carefully checking financial and credit reports.

Despite some risks of putting “too much information” out there, the ease of putting it there, and the convenience of having it there—readily available—is driving us to make the Internet our personal storage device.

One man is taking this to an extreme. According to Wired Magazine (September 2009), Gordon Bell is chronicling his life—warts and all—online. He is documenting his online memory project—MyLifeBits—in a book called Total Recall.

“Since 2001, Bell has been compulsively scanning, capturing and logging each and every bit of personal data he generates in his daily life. The trove includes Web Sites he’s visited (22,173), photos taken (56,282), docs written and read (18,883), phone conversations had (2,000), photos snapped by SenseCam hanging around his neck (66,000), songs listened to (7,139) and videos taken by (2,164). To collect all this information, he uses a staggering assortment of hardware: desktop scanner, digicam, heart rate monitor, voice recorder, GPS logger, pedometer, Smartphone, e-reader.”

Mr. Bell’s thesis is that “by using e-memory as a surrogate for meat-based memory, we free our minds to engage in more creativity, learning, and innovation.”

Honestly, with all the time that Bell spends capturing and storing his memories, I don’t know how he has any time left over for anything creative or otherwise.

Some may say that Gordon Bell has a sort of obsessive-compulsive disorder (OCD)—you think? Others say that he is some sort of genius who is teaching the world to be free and open to remembering—everything!

Personally, I don’t think that I want to remember “everything.” I can dimly remember some embarrassing moments in elementary school and high school that I sure as heck want to forget. And then there are some nasty people who would be better off buried in the sands of time. Also, some painful times of challenge and loss—which, while they may be considered growth experiences, are not something that I really want on the tip of my memory, in a file folder on my hard drive, or in a record in a database.

It’s good to remember. It’s also sometimes good to forget. In my opinion, what we put online should be things that we want or need to remember or access down the road. I for one like to go online every now and then and do some data cleanup (and in fact there are now some programs that will do this automatically). What I thought was worthwhile, meaningful, or important 6 months or a year ago, may not evoke the same feelings today. Sometimes, like with purchases I made way back when, I think to myself, what was I thinking when I did that? And I quickly hit the delete key (wishing I could do the same with those dumb impulse purchases!). Most of the time, I am not sorry that I did delete something old and I am actually happy it is gone. Occasionally, when I delete something by accident, then I start to pull my hair out and run for the backup—hoping that it really worked and the files are still there.

In the end, managing the hard drive takes more work than managing one’s memories, over which we have little conscious control. Between the e-memory and the meat memory, perhaps we can have more of what we need and want to remember and can let go of and delete the old and undesired—and let bygones be bygones.

August 22, 2009

Technology, A Comfort to the Masses

Typically, as technologists, we like to point out the great things that technology is doing for us—making us more productive, facilitating more convenience, allowing us to perform feats that humans alone could not do, and enabling us to connect with others almost without regard to space and time. And truly, we are fortunate to live in a time in history with all these new unbelievable capabilities—our ancestors would be jealous in so many ways.

Yet, there is a flip side to technology—what some refer to as the 24x7 society—“always on”—that we are creating, in which life is a virtual non-stop deluge of emails, voicemails, videoconferencing, messaging, Friending, Linking-in, blogging, tweeting, YouTubing, and more.

We are becoming a society of people living in a Matrix-type virtual world, where we go around addicted to the online cyber world and yet in so many ways are unconscious to the real-world relationships that are suffering in neglect and silence.

A fascinating article in the Wall Street Journal (22-23 August 2009) entitled “Not So Fast,” by John Freeman, states that “we need to protect the finite well of our attention if we care about our relationships.”

Certainly, online communications and connections are valuable, and in many ways are meaningful to us. They can create wonderful opportunities to bond with those near and far, including those who would normally be beyond our reach geographically and temporally. For me it’s been great reconnecting with old friends from schools, jobs, and communities. And yes, who would have thought that Sylvester Stallone and Arnold Schwarzenegger would be but a Facebook message away for me?

Yet while all the online interaction is fulfilling for us in so many ways—filling voids of all sorts in our lives—in reality the connections we make in the virtual world are but a tiny fraction of the real world human-to-human relationships we have in terms of their significance and impact.

The Journal article puts it this way: “This is not a sustainable way to live. This lifestyle of being constantly on causes emotional and physical burnout, workplace meltdowns, and unhappiness. How many of our most joyful memories have been created in front of a screen?”

One of the biggest fears that people have is not their own mortality, but being left alone in the corporeal world—for each of us, while a world unto ourselves, is small in the vastness of all that is around us. Perhaps to feel less alone, people amass and encircle themselves with great numbers of familiar, comforting, and loving people and things. And while people have these, they are connected, grounded, loved, and comforted that they are not alone.

But the harsh reality is that no matter how much we have in our lives, people are beings unto themselves, and over time, unfortunately and extremely painfully, all worldly things are ultimately lost.

The Journal states: “We may rely heavily on the Internet, but we cannot touch it, taste it, or experience the indescribable feeling of togetherness that one gleans from face-to-face interaction.”

Connections are great. Virtual relationships can be satisfying and genuine. All the technology communication mechanisms are fast, efficient, and powerful in their ability to reach people anytime and anywhere. Yet, we must balance all these with the people we care about the most. We cannot sacrifice our deepest and most intimate relationships by sitting in front of a computer screen morning, noon, and night and walking around with the BlackBerry taking phone calls and emails at our kids' school play, on their graduation day, and during their wedding recital. We are missing the boat on what is really important. We have forgotten how to balance. We have gone to extremes. We are hurting the ones we truly love the most.

“We need to uncouple our idea of progress from speed, separate the idea of speed from efficiency, pause and step back enough to realize that efficiency may be good for business and governments, but does not always lead to mindfulness and sustainable, rewarding relationships.”

Finally, with all the technology, we are in a sense becoming less human and more mechanical—like the Borg, in Star Trek—with BlackBerrys and Netbooks as our implants. Let’s find some time to pull the plug on these technologies and rediscover the real from the virtual.



August 21, 2009

Taking the Politics out of Enterprise Decision Making

Some people say power is primarily exerted through military might (“hard power”); others say it is exerted through diplomacy—communications, economic assistance, and investing in the global good (“soft power”). Then there is the newer concept of employing the optimal mix of military might and diplomacy (“smart power”).

It’s interesting to me how the Department of Defense (military) approach and the Department of State (diplomatic) approach to getting what we want are as alive and well in our enterprises as they are in the sphere of world politics.

At work, for example, people vie—some more diplomatically and some more belligerently—for resources and influence to advance their agendas, programs, projects, and people. This is symptomatic of the organizational and functional silos that continue to predominate in our organizations. And as in the world of politics, there are often winners and losers, rather than winners and winners. Those who are the “experts” in the arts of diplomacy and war (i.e. in getting what they want) get the spoils, but often at the expense of what may be good for the organization as a whole.

Instead of power politics (hard, soft, or smart), organizations need to move to more deliberate, structured, and objective governance mechanisms. Good governance is defined more by quantifiable measures than by qualitative conjecture. Sound governance is driven by return on investment, risk mitigation, strategic business alignment, and technical compliance rather than by “I need,” “I want,” “I like,” “I feel,” and so forth. Facts need to rule over fiction. Governance should not be a game of power politics.

Henry Mintzberg, the well-known management scholar, identified three mechanisms for managers to exert influence in the organization (Wall Street Journal, 17 August 2009):

1. Managing action—“managers manage actions directly. They fight fires. They manage projects. They negotiate contracts.” They get things done.

2. Managing people—“managers deal with people who take the action, so they motivate them and they build teams and they enhance the culture and train them and do things to get people to take more effective actions.”

3. Managing information—“managers manage information to drive people to take action—through budgets and objectives and delegating tasks and designing organization structure.”

It is in the third item—managing information—that we have the choice of building sincere business cases and creating a genuine call to action, or devolving into power politics, exerting hard, soft, and smart influence to get what we want, when we want it, and how we want it.

When information is managed through the exertion of power, it can be skewed and distorted. Information can be manipulated, exaggerated, or even buried. Therefore, it is imperative to build governance mechanisms that set a level playing field for capturing, creating, calculating, and complying with a set of objective parameters that can be analyzed and evaluated in more absolute terms.

When we can develop decision support systems and governance mechanisms that take the gut, intuition, politics, and subjective management whim out of the process, we will make better and more productive decisions for the enterprise.
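
As a purely illustrative sketch of what such objective parameters might look like (the criteria, weights, and scores below are made up, not any official scoring model), a governance board could score competing proposals on the same quantifiable factors and rank them by the result rather than by whoever argues loudest:

```python
# Hypothetical weighted-scoring sketch for IT investment governance.
# Criteria, weights, and scores are illustrative only; a real board defines its own.

criteria_weights = {
    "return_on_investment": 0.35,
    "strategic_alignment": 0.30,
    "technical_compliance": 0.20,
    "risk_mitigation": 0.15,
}

proposals = {
    "Project A": {"return_on_investment": 7, "strategic_alignment": 9,
                  "technical_compliance": 8, "risk_mitigation": 6},
    "Project B": {"return_on_investment": 9, "strategic_alignment": 5,
                  "technical_compliance": 6, "risk_mitigation": 7},
}

def weighted_score(scores: dict) -> float:
    """Combine 1-10 criterion scores into a single comparable number."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

# Rank proposals by their weighted score, highest first.
for name, scores in sorted(proposals.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```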



August 20, 2009

Andy Blumenthal Talks about Cloud Computing

Here is the podcast from the MeriTalk Silverlining Series (August 2009).



August 18, 2009

DHS OIG Report on My User-centric EA Implementation at the Coast Guard

Just learned of a new Department of Homeland Security (DHS) Office of Inspector General (OIG) report documenting the significant progress of the Enterprise Architecture and IT Governance program at the U.S. Coast Guard, which I led up to and during the majority of the audit.

I am pleased at the recognized progress and at the terrific work that my team accomplished there—I am very proud of all of them!

Of course, there is more work to be done, but the right EA infrastructure has been put in place to accomplish the goals and objectives set out.

Here is the link to the report: http://sites.google.com/site/thetotalcio/Home/links/EAOIGReport-July2009.pdf?attredirects=0

"The Coast Guard has made progress in developing its enterprise architecture by defining its enterprise architecture framework [User-centric EA] in alignment with both federal and DHS architectures. In addition, its enterprise architecture is aligned with the Coast Guard's IT strategy. These achievements have been possible because of executive support for the enterprise architecture effort."

August 16, 2009

Vision is not a Business Only Matter

At an enterprise architecture conference a number of weeks ago, the audience was asked how many saw themselves as technology people—about half raised their hands. Then the audience was asked how many saw themselves more as business people—and about half raised their hands. And of course, there were a handful of people who raised their hands as being “other.”

Then the dialogue with the audience of architects proceeded to the notion that, regardless of whether you consider yourselves more business-oriented or more technology-oriented, enterprise architects must get the vision from the business people in the organization, so the architects can then help the business people develop the architecture. It was clear that many people felt that we had to wait for the business to know what their vision was and what they wanted before we could help them fulfill their requirements. Well, this is not how I see it.

From my experience, many business (and technology) people do not have a “definitive vision” or know concretely what they want, especially when it comes to how technology can shape the business. Yes, of course, they know they have certain gaps or that they want to improve things. But no, they can’t always know or envision what the answer looks like. They just know that things either aren’t working “right,” or competitor so-and-so is rolling out something new or upgrading system ABC, or “there has just got to be a better way” to do something.

If we plan to wait for the business to give us a definitive “this is what I want,” I think in many cases, we’ll be waiting a very long time.

The role of the CIO and CTO, as well as enterprise architects and other IT leaders, is to work with the business people to collaboratively figure out what’s wrong and what can be improved, and then provide solutions on how to get there.

Vision is not a business-only matter—it is a broad leadership and planning function. IT leaders should not absolve themselves of visioning, strategy, and planning and rely only on the business for this. To the contrary, IT leaders must be an integral part of forging the business vision and must come up with an enabling “technology vision” for the organization. These days, business is more and more reliant on technology for its success, and a business vision without thought and input from the technology perspective would be superficial at best and dead wrong at worst.

Moreover, visioning is neither purely an art nor purely a science—it is both—and not everyone is good at it. That is why open communication and collaboration are critical for developing and shaping the vision for where the organization must go.

Early on in my career, in working with my business counterparts, I asked, “What are you looking to do and how can I help you?” And my business partner responded, opening my eyes: “You tell me—what do you think we need to do. You lead us and we will follow.”

Wow! That was powerful.

“You tell me.”

“What do you think we need to do.”

“You lead us and we will follow.”

The lesson is simple. We should not and cannot wait for the business. We, together with our operational counterparts, are “the business”. Technology is not some utility anymore, but rather it is one of the major underpinnings of our information society; it is the driving force behind our innovation, the core of our competitive advantage, and our future.



August 12, 2009

Andy's Cloud Computing Presentation on MeriTalk

Introduction

First, let me start out by saying that cloud computing brings us closer than ever to providing IT as a utility, like electricity, where users no longer need to know or care about how IT services are provided, and only need to know that they are reliably there, just like turning on the light. This is the subscription approach to using information technology, where base services are hosted and shared, and you pay only for what you need and use.

In cloud computing, there are a number of basic models. First, in public clouds, we have a multi-tenant, shared services environment with access provided over a secure Internet connection. In contrast, in a private cloud, the shared IT services are behind the company’s firewall and are controlled by in-house staff. Then there is also a community cloud, an extension of the private cloud, where IT resources are shared by several organizations that make up a specific community.

The advantage to cloud computing—whether public or private—is that you have a shared, enterprise-wide solution that offers a number of distinct advantages:

  1. Efficiency—with cloud computing, we build once and reuse multiple times—i.e., we share resources—rather than everyone having their own.
  2. Flexibility—we are more nimble and agile when we can quickly expand or contract capacity on demand, as needed—what some call rapid elasticity. Moreover, by outsourcing the utility computing elements of our IT infrastructure, we can focus our internal efforts on building our core mission areas.
  3. Economy (or economy of scale)—it’s cheaper and more cost effective when we can tap into larger pools of common resources maintained by companies with subject matter expertise. They then are responsible for ensuring that IT products are patched, upgraded, and modernized. Moreover, we pay only for what we actually use (see the rough cost sketch after this list).
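
To put rough numbers behind paying only for what you actually use, here is a back-of-the-envelope sketch; the rates and capacity figures are entirely hypothetical, not vendor pricing:

```python
# Back-of-the-envelope comparison of fixed peak capacity vs. pay-per-use.
# All numbers below are hypothetical, chosen only to illustrate the arithmetic.

hours_per_month = 730
peak_servers = 100                  # capacity you must own to survive the busiest hour
avg_servers_in_use = 25             # what you actually need most of the time
owned_cost_per_server_hour = 0.50   # hardware, power, space, staff amortized
cloud_cost_per_server_hour = 0.70   # premium hourly rate, but billed only when used

fixed_cost = peak_servers * hours_per_month * owned_cost_per_server_hour
elastic_cost = avg_servers_in_use * hours_per_month * cloud_cost_per_server_hour

print(f"Own for peak: ${fixed_cost:,.0f}/month")   # ~$36,500
print(f"Pay per use:  ${elastic_cost:,.0f}/month") # ~$12,775
```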

Issue

So cloud computing sounds pretty good, doesn’t it? What then is the big issue? Plain and simple, it comes down to this: Is cloud computing effective for the organization? And what I mean by that is a few things:

  • First is customization, personalization and service: when you buy IT computing services in this shared services model, do you really get what you need and want – or are you just getting a canned approach, like the Model T that came in one color, black? For example, when you purchase Software as a Service are you getting the solution you need for your agency or the one built for someone else?
  • Next is security, privacy, and disaster recovery. This is a big deal because in a public cloud, you are capturing, processing, sending, and storing data outside of your proprietary infrastructure. This opens the door for theft, manipulation, or other ways of our data being compromised by criminals, cyber-terrorists, and even hostile nation-states.
  • Third, and maybe most important, is culture, especially in a very individualistic society like ours, where people are used to getting what they want, when they want it, without having to share. For example, we prefer owning our own vacation home to having a time-share. We love the concept of a personal home theater. Everyone now has a personal cell phone, and the old public telephones that were once on every corner are now practically extinct. And most people prefer driving their own cars to work rather than using mass transit—even though it’s not environmentally friendly. So the idea of giving up our proprietary data centers, application systems, and the control of our data, in a cloud computing model, is alien to most and possibly even frightening to many.

The Reality

So how do we harmonize the distinct advantages of cloud computing—efficiency, flexibility, and economy—with the issues of customization, security, and culture?

The reality is that regardless of customization issues, we can simply no longer afford for everyone to have their own IT platforms—it’s wasteful. We are recovering from a deep financial recession, the nation has accumulated unprecedented levels of debt, and we are competing in a vast global economy, where others are constantly raising the bar—working faster, better, and cheaper.

Moreover, from a technology standpoint, we have advanced to where it is now possible to build an efficient cloud computing environment using distributed architecture, virtualization/consolidation, and grid computing.

Thirdly, on a cultural level, as individualistic as we are, it is also true that we now recognize the importance of information sharing and collaboration. We are well aware of the fact that we need to break the stovepiped verticals and build and work horizontally. This is exemplified by things like Google Docs, SharePoint, Wikipedia, and more.

In terms of security, I certainly understand people’s concern, and it is real. However, we are all already using the cloud. Are you using online banking? Are you ordering things online through Amazon, Overstock, or other e-commerce vendors? Do you use Yahoo or Google email? Then you are already using the cloud, and most of us don’t even realize it. The bottom line on security is that every agency has to decide for itself in terms of its mission and its ability to mitigate any risks.

How to Choose

So there are two questions then. Assuming—and I emphasize assuming—that we can solve the security issues with a “Trusted Cloud” that is certified and accredited, can we get over the anxiety of moving towards cloud computing as the new standard? I believe that since the use case—for flexibility, economy, and efficiency—is so compelling, that the answer is going to be a resounding yes.

The next question is, once we accept the need for a cloud computing environment, how do we filter our choices among the many available?

Of course I’m not going to recommend any particular vendor or solution, but what I will do is advocate for using enterprise architecture and sound IT governance as the framework for the decision process.

For too many years, we based our decisions on gut, intuition, politics, and subjective management whim, which is why statistics show that more than 82% of IT projects are failing or seriously challenged.

While a full discussion of the EA and governance process is outside the scope of this talk, I do want to point out that to appropriately evaluate our cloud computing options, we must use a strong framework of architecture planning and capital planning and investment control to ensure the strategic alignment, technical compliance, return on investment, and risk mitigation—including of course security and privacy—necessary for successful implementation.

How Cloud Computing fits with Enterprise Architecture:

As we move to cloud computing, we need to recognize that this is not something completely new, but rather an extension of Service Oriented Architecture (SOA), where there are service providers and consumers, and applications are built by assembling reusable, shared services that consumers can search, access, and utilize. Only now, with public cloud computing, we are sharing services beyond the enterprise, to include applications, data, and infrastructure.
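
As a minimal sketch of the provider and consumer pattern described above (the service URL and response fields are hypothetical, not a real endpoint), a consuming application simply calls a shared, reusable service rather than building and hosting that capability itself:

```python
# Minimal consumer-side sketch of calling a shared (SOA/cloud) service.
# The endpoint URL and response fields are hypothetical placeholders.

import json
import urllib.request

SHARED_SERVICE_URL = "https://services.example.gov/customer-lookup"  # hypothetical

def lookup_customer(customer_id: str) -> dict:
    """Consume a reusable shared service instead of building the capability in-house."""
    with urllib.request.urlopen(f"{SHARED_SERVICE_URL}?id={customer_id}") as response:
        return json.load(response)

# A consuming application assembles shared services rather than owning the logic:
# profile = lookup_customer("12345")
# print(profile.get("name"))
```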

In terms of a transition strategy, cloud computing is a natural evolution in IT service provision.

At first, we did everything in-house, ourselves—with our own employees, equipment, and facilities. This was generally very expensive in terms of finding and maintaining employees with the right skill sets, and developing and maintaining all our own systems and technology infrastructure, securing it, patching it, upgrading it, and so on.

So then came the hiring of contractors to support our in-house staff; this helped alleviate some of the hiring and training burden on the organization. But it wasn’t enough to make us cost-efficient, especially since we were still managing all our own systems and technologies for our organization, as a stovepipe.

Next, we moved to a managed services model, where we out-sourced vast chunks of our IT—from our helpdesk to desktop support, from data centers to applications development, and even to security and more.

Finally, the realization has emerged that we do not need to provide IT services either with our own or contracted staff, but rather we can rely on IT cloud providers who can offer an array of IT services, on demand, and who will manage our information technology and that of tens, hundreds, and thousands of others and provide it seamlessly over the Internet, so that we all benefit from a more scalable and unified service provision model.

Of course, from a target architecture perspective, cloud computing really hits the mark, because it provides for many of the inherent architecture principles that we are looking to implement, such as: services interoperability and component reuse, and technology standardization, simplification, and cost-efficiency. And on top of all that—using services on a subscription or metered basis is convenient for the end-user.

Just one last thing I would like to point out is that sound enterprise architecture and governance must be user-centric. That means that we only build decision products that are valuable and actionable to our users—no more ivory tower efforts or developing shelfware. We need to get the right information to the right decision makers to get the mission accomplished with the best, most agile and economical support framework available.

