Showing posts with label integration. Show all posts

March 18, 2012

Your Leadership Ticket Is Waiting

A lot of colleagues tell me that they hate office politics, and for many it represents a one-way ticket to ongoing bickering, infighting, and a virtually endless cycle of unsatisfied wants and unhappiness.

Office politics is where the interests of multiple parties either converge or collide--convergence occurs through feelings of interdependence (i.e. enterprise) and acts of teamwork, while collisions arise from stressing independence (i.e. isolationism) and head-butting.

This is where good and bad leadership can make a huge difference.

- On one hand, a bad leader sees the world of the office as "us versus them" and fights almost indiscriminately for his/her share of scope, resources, influence, and power.

- On the other hand, a good leader looks out for the good of the organization and its mission, and works to ensure the people have what they need to get their jobs done right, regardless of who is doing it or why.

Thus, good leaders inspire trust and confidence, because they, without doubt, put the mission front and center--and egos are left at the door.

Harvard Business Review (January-February 2011), in an article called "Are You A Good Boss--Or A Great One?", identifies two key elements that inherently create opposition and competitiveness within the enterprise:

1) Division of Labor--This is where we define that I do this and you do that. This has the potential to "create disparate groups with disparate and even conflicting goals and priorities." If this differentiation is not well integrated back as interrelated parts of an overall organizational identity and mission, then feelings of "us versus them"--and even arguments over whose jobs and functions are more important and should come first in the pecking order--will tear away at the organizational fiber and chances of success.

2) Scarce Resources--This is where limited resources to meet requirements and desirements impact the various parts of the organization, because not everyone's wishes can be pursued at the same time, or necessarily at all. Priorities need to be set and tradeoffs made in what will get done and what won't. Again, without a clear sense of unity versus disparity, scarcity can quickly unravel the organization through people's feelings of unfairness, dissatisfaction, and unrest--and potentially even "mob rule" when people feel threatened.


Hence, a bad leader works the system--seeing it as a win-lose scenario--where his/her goals and objectives are necessarily more important than everyone else's, and getting the resources (i.e. having a bigger sandbox or "building an empire") is seen as not only desirable but critical to personal success--here, identity and loyalty are to one's particular niche silo.

However, a good leader cares for the system--looking to create win-win situations--where no one element is better or more important than another, but rather where they all must work together synergistically for the greater good of the organization. In this case, resources go not to whoever fights dirtier, but to whoever will most benefit the mission with them--here, allegiance and duty are to the greater enterprise and its mission.

HBR states well that "In a real team [with a real leader], members hold themselves and one another jointly accountable. They share a genuine conviction they will succeed or fail together."

Organizations need not be snake pits with cutthroat managers wanting to see others fail and waiting to take what they can for themselves. Rather, there is another way, and that is to lead with a shared sense of purpose, meaning, and teamwork.

And this is achieved through creating harmony among organizational elements and not class warfare between them.

The type of leader who creates unity builds enduring strength--and holds the ticket we need to organizational success.

(Source Photo: Andy Blumenthal)


March 1, 2012

Dashboarding The Information Waves

I had an opportunity to view a demo of a dashboarding product from Edge called AppBoard, and while this is not a vendor or product endorsement, I think it is a good example to briefly talk about these types of capabilities. 

Dashboard products enable us to pull from multiple data sources, make associations, see trends, identify exceptions, and get alerts when there are problems.

Some of the things that I look for in dashboard tools are the following:

- Ease of use of connecting to data 

- Ability to integrate multiple stovepiped databases

- A variety of graphs, charts, tables, and diagrams to visualize the information

- Use of widgets to automatically manipulate the data and create standardized displays

- Drag and drop ability to organize the dashboard in any way you like to see it

- Drill down to get more information on the fly 
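
To make the idea concrete, here is a rough Python sketch of what the integration behind these features boils down to: joining stovepiped data sources on a common key and flagging exceptions for alerting. The host names, data sources, and threshold are all made up for illustration.

```python
# Minimal sketch of dashboard-style integration: join two "stovepiped"
# data sources on a shared key, then flag exceptions for an alert panel.
# All source names and thresholds here are hypothetical.

inventory = {"srv-01": 120, "srv-02": 45, "srv-03": 210}            # source A: load
status = {"srv-01": "up", "srv-02": "up", "srv-03": "degraded"}     # source B: health

LOAD_THRESHOLD = 200  # hypothetical alert threshold

def build_dashboard_rows(load_by_host, status_by_host, threshold):
    """Merge the two sources and mark rows that should raise an alert."""
    rows = []
    for host in sorted(set(load_by_host) | set(status_by_host)):
        load = load_by_host.get(host)
        state = status_by_host.get(host, "unknown")
        alert = (load is not None and load > threshold) or state != "up"
        rows.append({"host": host, "load": load, "status": state, "alert": alert})
    return rows

rows = build_dashboard_rows(inventory, status, LOAD_THRESHOLD)
alerts = [r["host"] for r in rows if r["alert"]]
print(alerts)  # srv-03 is both over the load threshold and degraded
```

A real dashboard product layers visualization, widgets, and drill-down on top, but the join-then-flag pattern above is the core of it.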

While there are many tools to consider that provide dashboards, information visualization, and business intelligence, I think one of the most important aspects of these is that they be user-centric and easy to implement and customize for the organization and its mission.

When making critical decisions (especially those involving life and death) and when time is of the essence--we need tools that can be easily navigated and manipulated to get the right information and make a good decision, quickly. 
 
As a fan of information visualization tools, I appreciate tools like this that can help us get our arms around the "information overload" out there, and I hope you do too.

(All Opinions my own)


January 29, 2012

Platforms - Open or Closed

Ever since the battles of Windows versus Linux, there have been two strong competing philosophies on systems architecture.

Many have touted the benefits of open architecture--where system specifications are open to the public to view and to update.  

Open sourced systems provide for the power of crowdsourcing to innovate, add on, and make the systems better, as well as providing less vendor lock-in and lower costs.  

Open Source -----> Innovation, Choice, and Cost-Savings

While Microsoft--with its Windows and Office products--was long the poster child for closed or proprietary systems and has a history of success with these, it has also come to be viewed, as TechRepublic (July 2011) points out, as having an "evil, monopolistic nature."

However, with Apple's rise to the position of the World's most valuable company, closed solutions have made a strong philosophical comeback.

Apple has a closed architecture, where they develop and strictly control the entire ecosystem of their products. 

Closed systems provide for a planned, predictable, and quality-controlled architecture, where the whole ecosystem--hardware, software, and customer experience--can be taken into account and controlled in a structured way.  

Closed Systems -----> Planning, Integration, and Quality Control

However, even though Apple has a closed solution architecture for its products, it does open up development of Apps to other developers (for use on the iPhone and iPad). This enables Apple to partner with others and win mind share, while still retaining control of what ends up getting approved for sale at the App Store. 
I think what Apple has done particularly well then is to balance the use of open and closed systems--by controlling their products and making them great, but also opening up to others to build Apps--now numbering over 500,000--that can leverage their high-performance products.

Additionally, the variety and number of free and 99-cent apps, for example, show that even closed systems, by opening up parts of their vertical model to partners, can achieve cost-savings for their customers. 

In short, Apple has found that "sweet spot"--of a hybrid closed-open architecture--where they can design and build quality and highly desirable products, but at the same time, be partners with the larger development community. 

Apple builds a solid and magnificent foundation with their "iProducts," but then they let customers customize them with everything from the "skins" or cases on the outside to the Apps that run on them on the inside. 

Closed-Open Systems -----> Planned, Integrated, and Quality PLUS Innovation, Choice, and Cost-Savings

Closed-Open Systems represent a powerful third model for companies to choose from in developing products, whose benefits include those of both open and closed systems.


November 1, 2011

Replacing Yourself, One Piece at a Time

Here is a wonderful idea to help people who use prosthetics--a smartphone built right into the artificial limb.

What was once a challenging task to hold a smartphone and make calls, write emails and texts, or just search the web is now just a push of a button or voice command away.

This is a user-centric and functional integration of technology with medical science to help those who have either lost limbs or been born without them.

While a step forward for the disabled, perhaps this is also a move towards future technological augmentation of regular body parts as well.

What was once a tattoo or body piercing on the periphery may soon become an implanted smartphone in the body part of your choosing.

The concept reminds me of the MTV show "Pimp My Ride" where run-of-the-mill cars are completely made over into new awesome vehicles by stripping them and rebuilding them with better, cooler parts.

Is this where we are going with our human bodies--where one day we are an old beat-up minivan, only to have our parts swapped out and replaced with biotechnology to become a new hotrod convertible once again?

Now we are moving from leveraging technology for medical purposes to tinkering with our physical bodies, using technology, for preference.

Yes, this is already being done with facelifts and other cosmetic surgery, but how about replacing entire body parts not because they are diseased, but because you want or can afford an upgrade?

Lots of exciting and scary implications to think about with this one--as our body parts become replaceable almost like Legos--snap on and off.

In the future, becoming a better, stronger, faster person may not be just a function of what you do, but how much you can afford to replace.


August 21, 2011

Deus Ex-Overtaken By Technology

Deus Ex is an action role-playing game (RPG) and first-person shooter. It had sold more than a million copies as of 2009 and was named "Best PC Game of All Time."

A prequel, Deus Ex: Human Revolution, is due to be released this month (August 2011).

You play a coalition anti-terrorist agent in a world slipping further and further into chaos.

The time is 2052, and you are in a dystopian world where society has progressed faster technologically than it has evolved spiritually--people are struggling to cope with technological change and are abusing new technology.

The challenges portrayed in the trailer show people using/abusing technological augmentation--the integration of technology with their human bodies--replacing damaged limbs, adding computer chips, and even "upgrading themselves".

There are many issues raised about where we are going as a society with technology:

1) Are we playing G-d--when we change ourselves with technology, not because we have to (i.e. because of sickness), but rather because we want to--at what point are we perhaps overstepping theologically, ethically, or otherwise?

2) Are we playing with fire--when we start to systematically alter our makeup and change ourselves into some sort of half-human, half-machine entities or creatures, are we tempting nature, fate, and evolution with respect to what we may finally become? As the end of the trailer warns: "Be human, remain human"--imagine what type of cyborg creatures we may become if we let things go to extremes.

3) Technology may never be enough--As we integrate technology into our beings, where does it stop? The minute we stop, others continue, and we risk being "less intelligent, less strong, and less capable than the rest of the human race." In short, are we facing a technological race toward dehumanization and enhanced machines?

4) Drugs and other vices follow--To prevent the body from rejecting technological augmentation, mankind relies on ever larger and more potent doses of drugs. We not only risk losing elements of our humanity to technology, but also to drugs and other vices that make us forget the pain of change and rejection (physical and perhaps emotional).

Deus Ex literally is Latin for "G-d out of the machine." Perhaps, future dystopian society starts out by people trying to play G-d, but I think the risk is that it ends with the proverbial devil displacing the best laid intentions.

While technology holds the most amazing of promises, from curing disease, to solving world hunger, to endless innovations (even including developing the archetype bionic man/woman--"We can rebuild him...we have the technology"), without a solid moral compass and frequent check-ins, we run the risk of technology getting away from us and even doing more harm than good.


August 14, 2011

Images, Alive And Profitable

"There are nearly 4 trillion images on the Internet and 200 million new ones being added each day," according to the Chief Revenue Officer (CRO) of Luminate.

Luminate (formerly Pixazza) has the vision of making all those images interactive through image recognition algorithms and human-assisted crowdsourcing to identify objects and tag the images with content.
They "transform static images into interactive content," according to the Luminate website.

The way it works:

1) Icon--look for the Luminate icon in the lower left corner of the image; it means the image is interactive.

2) Mouse--mouse over the image to choose from the interactive image apps.

3) Click--click on items in the photo to shop and buy ("Get The Look"), share information (e.g. Facebook, Twitter, email), or navigate (click on contextual hyperlinks from Wikipedia and other sources).
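
Under the hood, a service like this amounts to attaching structured data to regions of an image. Here is a hypothetical Python sketch of what that data might look like--the regions, labels, and links are invented for illustration, not Luminate's actual format:

```python
# Hypothetical sketch of the kind of data an image-tagging service might
# attach to a photo: rectangular regions mapped to identified objects,
# each carrying shopping/sharing/navigation links.

tags = [
    {"box": (40, 60, 120, 200), "label": "handbag",
     "links": {"shop": "https://example.com/buy/handbag",
               "info": "https://en.wikipedia.org/wiki/Handbag"}},
    {"box": (150, 30, 260, 180), "label": "sunglasses",
     "links": {"shop": "https://example.com/buy/sunglasses"}},
]

def tag_at(x, y, tag_list):
    """Return the label of the tagged region containing point (x, y), if any."""
    for t in tag_list:
        x1, y1, x2, y2 = t["box"]
        if x1 <= x <= x2 and y1 <= y <= y2:
            return t["label"]
    return None

print(tag_at(100, 100, tags))  # a click inside the first region -> handbag
```

When the user mouses over or clicks a tagged region, the front end looks up the region under the cursor and surfaces its links--which is how a static photo becomes a "gateway" to commerce and content.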

According to Forbes (27 July 2011), Luminate already "has more than 4,000 publishers, 150 million unique visitors per month, and more than 20 million products catalogued."

The image-tagging platform provides context and information for consumers and revenue generating opportunities for producers--so it is a win-win for everyone in the marketplace!

By connecting end-user Internet images on the front-end with advertisers and commerce on the back-end, Luminate has found a way to integrate web-surfers and industry--no longer are advertisements on the web disconnected as pop-ups, banners, or lists from the Internet content itself.

Right now, there are apps for annotations, advertisements, commerce, and social media. Luminate plans to open up development to others to create their own for things such as apps for donations for disaster relief images or mapping and travel apps for images of places.

Luminate, as a photo-tagging and application service, is advancing our experience with the Internet by creating a richer experience, where a photo is not just a photo, but rather a potential gateway into everything in the photo itself.

In my view, this is a positive step toward a vision of fully augmented reality, where we have a truly information-rich "tagged environment"--where everything around us that we see and experience is identified, analyzed, and sourced, and where the images of the world are alive no matter how or from what angle we look at them.

Lastly, my gut tells me that Google is heavily salivating over where this company is going and future developments in this field.

(Source Photo: here)


August 4, 2011

Google+ And A History of Social Media


Bloomberg Business (25-31 July 2011) tells in biblical terms the history of social media leading up to the recent release of Google+:

"In the beginning, there was Friendster; which captivated the web'ites before it was smitten by slow servers and exiled to the Far East. And then a man called Hoffman begat LinkedIn, saying "This name shall comfort professionals who want to post their resumes online," and Wall Street did idolize it. And then Myspace lived for two thousand and five hundred days and worshipped flashy ads and was subsumed by News Corp., which the L-rd hath cursed. And Facebook emerged from the land of Harvard and forsook the flashy ads for smaller ones and welcomes vast multitudes of the peoples of the world. And it was good."

With the "genesis" of Google+, there is now a new contender in virtual land with a way to share posts, pictures, videos, etc. with limited groups--or circles of friends--and an advance in privacy features has been made.

According to the article, even Mark Zuckerberg and some 60 other Facebook employees have signed up for Google+.

With all this confusion brewing in social media land, one wonders exactly why Randi Zuckerberg (Mark's sister) recently headed for the exits--a better offer from Google? :-)

Google+ has many nice features, especially in terms of integration with everything else Google. On one hand, this is a plus in terms of potential simplicity and user-centricity, but on the other hand it can be more than a little obtrusive and scary, as it can link and share everything from your profile, contacts, pictures (Picasa), videos (YouTube), voice calls (Google Voice), geolocation (Google Maps), Internet searches, and more.

Google owns a lot of Internet properties, and this enables them to bundle solutions for the end-user. The question to me is: will something as basic as Circles for grouping friends really help keep what's private, private?
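
For what it's worth, the Circles model itself is simple to state: every post carries a list of circles, and a viewer sees a post only if they belong to one of them. Here is a minimal Python sketch (the circle names and contacts are invented):

```python
# Rough sketch of per-circle sharing, the privacy model described above.
# Circle names and contacts are invented for illustration.

circles = {
    "family":  {"mom", "dad"},
    "friends": {"alice", "bob"},
    "work":    {"carol"},
}

posts = []  # each post records which circles may see it

def share(author, text, to_circles):
    """Publish a post visible only to the named circles."""
    posts.append({"author": author, "text": text, "circles": set(to_circles)})

def visible_to(viewer):
    """Return the posts a viewer can see, based on circle membership."""
    return [p["text"] for p in posts
            if any(viewer in circles.get(c, set()) for c in p["circles"])]

share("me", "Vacation photos", ["family"])
share("me", "Big news!", ["family", "friends"])
print(visible_to("mom"))    # sees both posts
print(visible_to("carol"))  # the work circle sees neither
```

The mechanism is straightforward; the harder question, as above, is whether the surrounding integration with everything else Google leaves those boundaries intact.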

It seems like we are putting a lot of information eggs in the Google basket, and while they seem to have been a force for good so far, we need to ensure that remains the case and that our privacy is held sacred.

(Source Photo, With All Due Respect To G-d: here)


May 13, 2011

Who's On First

I have a new article in Public CIO Magazine (April/May 2011) on the topic of Accountability In Project Management:

We've all been to "those" kinds of meetings. You know the ones I'm talking about: The cast of characters has swelled to standing-room only and you're beginning to wonder if maybe there's a breakfast buffet in the back of the room.

It seems to me that not only are there more people than ever at today's meetings, but meetings are also more frequent and taking up significantly more hours of the day.

I'm beginning to wonder whether all these meetings are helping us get more work done, or perhaps helping us avoid confronting the fact that in many ways we're stymied in our efforts.

Read the rest of the article at
Government Technology.



May 1, 2011

Social Networking the Pepsi Way





On April 27, 2011, Pepsi announced the launch of its state-of-the-art "Social Vending Machine."


It's a touch-screen, networked machine that, aside from enabling the purchase of soft drinks and providing nutrition information online, also enables users to "gift" a drink to a friend by entering the recipient's name, mobile number, and a personalized text message (there is even an option to personalize it with a short recorded video).

The recipient of the Pepsi gift simply enters the redeem code at a Pepsi Social Vending Machine to get their soda. They can also return a thank you gift to the sender or "pay it forward" and give a gift to someone else.
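
The gift-and-redeem flow described is, at heart, a small transaction protocol. Here is a hypothetical Python sketch of how such a flow could work--the function names and in-memory "database" are my own invention, not Pepsi's actual system:

```python
import secrets

# Hypothetical sketch of a gift/redeem flow like the one described:
# a sender creates a gift with a message, the system issues a redeem
# code, and the recipient enters the code at any machine to claim it.

gifts = {}  # redeem code -> gift record (stands in for a backend database)

def send_gift(sender, recipient_phone, message):
    """Create a gift and return the redeem code to text to the recipient."""
    code = secrets.token_hex(4)  # short random redeem code
    gifts[code] = {"from": sender, "to": recipient_phone,
                   "message": message, "redeemed": False}
    return code

def redeem(code):
    """Redeem a code at a machine; returns the gift record once, then fails."""
    gift = gifts.get(code)
    if gift is None or gift["redeemed"]:
        return None
    gift["redeemed"] = True
    return gift

code = send_gift("Andy", "555-0100", "Enjoy a cold one!")
print(redeem(code)["message"])  # prints the gift message
print(redeem(code))             # a second attempt fails: None
```

"Paying it forward" would simply be the recipient calling something like `send_gift` again with a new recipient--each gift is its own one-time-use code.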

In addition, the machine makes use of advanced telemetry to remotely measure and report on inventory, manage delivery scheduling, and update content on the machines. This machine is alive with changeable content and interactive communication between users.

As the Chief Innovation Officer of PepsiCo Foodservice states: "Social vending extends our consumers' social networks beyond the confines of their own devices and transforms a static, transaction-oriented experience into something fun and exciting they'll want to return to, again and again."

Additionally, Mashable reports that in phase 2, Pepsi is planning to integrate their Social Vending concept with other social media such as Facebook, extending the reach of product placement and gifting even further through cyberspace and social networking.

While many companies continue to struggle to figure out how to integrate social networking into a company's operations and make it profitable, PepsiCo has a simple formula for how it engages its customers, promotes sales, and makes it all seem completely natural--like it belonged there all along.

Great job PepsiCo!


January 29, 2011

The iWatch Does It All

Forget James Bond gadgets or Dick Tracy 2-way wrist-watches, the new concept iWatch is the one to drool over.

This is the vision of Italy's ADR Studio, but I believe it is just "around the corner" for all of us.

Fuse the design of an iPod Shuffle/Nano with the functionality of an iTouch/iPhone and voila--the new iWatch.

Clock, calendar, calculator, and weather--that's nice, but frankly it's child's play. Think more in terms of:

- News
- Stock quotes
- Social networking
- Music, videos, and games
- Google
- GPS
- 300,000 App Store downloads (and growing)

Unload some smartphone "baggage" from your belt and bag and integrate on your wrist.

There is a reason this concept keeps coming back in ever cooler ways--it makes sense functionally and feels right ergonomically.
I envision this working one day with virtual display and controls, so that physical "size doesn't matter."

We will walk on the moon again, or on some other distant planet, but we will always be connected to each other--not just in spirit, but with our iWatches.

"Slide to unlock" now, please!


August 15, 2010

Engineering An Integrated IT Solution

Traditionally, the IT market has been deeply fragmented, with numerous vendors offering countless products, and IT leaders have been left holding the proverbial bag of varied and mixed technologies with which to interoperate, integrate, optimize, and solve complex organizational problems.

While competition is a great thing in driving innovation, service, and cost efficiencies, the result of the current fragmented IT market has been that organizations buy value or best-of-breed technologies from across the vendor universe, only to find that they cannot make them work with their other IT investments and infrastructure.

This has contributed to IT execution becoming notorious for an 82% project failure rate, as reported by the Standish Group.

Typically, what follows numerous attempts to resuscitate a code-blue IT project is the eventual abandonment of the investment, only to be followed by the purchase of a new one, with hopes of doing it “right” the next time. However, based on historical trends, there is a 4-out-of-5 chance we will run into the same project integration issues again and again.

Oracle and other IT vendors are promoting an integration strategy to address this.

Overall, Oracle’s integration strategy envisions organizations “buying the complete IT stack” and standing up “engineered systems” more quickly and cheaply than if they had to purchase individual components and try to integrate them themselves. Some examples of this are their Exadata Storage Servers and Fusion Applications.

Oracle is not the first company to try this integration/bundling approach, and in fact many companies have succeeded by simplifying the consumer’s experience--such as Apple bringing together iTunes software with the iPod/iPad/Mac hardware, or more generally the creation of the smartphone with the integration of phone, web, email, business productivity apps, GPS, games, and more. Similarly, Google is working on its own integration strategy of business and personal application utilities, from Google Docs to Google Me.

Of course, the key is to provide a sophisticated level of integration, simplifying and enhancing the end-user experience, without becoming, more generally, anticompetitive.

On the other hand, not all companies with integration strategies and product offerings are successful. Some are more hype than reality and are used to drive sales rather than actually deliver on the integration promise. In other words, just having an integration strategy does not integration make.

For the IT leader, choosing best of breed or best of suite is not an easy choice. We want to increase capabilities to our organizations, and we need a solutions strategy that will deliver for our end users now.

While an integration strategy by individual companies can be attractive to simplify our execution of projects, in the longer term cloud computing offers an alternative model, whereby we attach to infrastructure and services outside our own domains on a flexible, as-needed basis, and where, in theory at least, we no longer need to make traditional IT investments on this scale at all.

In the end, a lot of this discussion comes down to security and trust in the solution/vendor and the ability to meet our mission needs cost-effectively without a lot of tinkering to try to put the disparate pieces together.



August 7, 2010

No Real Solution Without Integration

Emergency Management Magazine (July/August 2010) has an article called “Life Savers” that describes how a convergence of new technologies will help protect and save first responder lives. These new technologies can track first responders’ location (“inside buildings, under rubble, and even below ground”) and monitor their vital signs and send alerts when their health is in danger.

There are numerous technologies involved in protecting our first responders and knowing where they are and that their vitals are holding up:

  • For locating them—“It will likely take some combination of pedometers, altimeters, and Doppler velocimeters…along with the kinds of inertial measurement tools used in the aerospace industry.”
  • For monitoring health—“We’ve got a heart monitor; we can measure respiration, temperature. We can measure how much work is being done, how much movement.”

The key is that none of the individual technologies alone can solve the problem of first responder safety. Instead, “All of those have to be pulled together in some form. It will have to be a cocktail solution,” according to the Department of Homeland Security (DHS), Science and Technology (S&T) Directorate that is leading the effort.

Aside from the number of technologies involved in protecting first responders, there is also the need to integrate the technologies so they work flawlessly together in “extreme real world conditions,” so for example, we are not just monitoring health and location at the scene of an emergency, but also providing vital alerts to those managing the first responders. This involves the need to integrate the ability to collect inputs from multiple sensors, transmit it, interpret it, and make it readily accessible to those monitoring the scene—and this is happening all under crisis situations.
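
The fusion-and-alert step described above can be sketched briefly. This is a minimal Python illustration of the idea--combining location and vitals readings from one responder and raising alerts when a vital leaves a safe range; the thresholds, field names, and sample reading are hypothetical, not DHS S&T's actual design:

```python
# Sketch of the "cocktail" integration described: fuse readings from
# several sensors on one responder and raise alerts when vitals leave a
# safe range. Thresholds and sensor fields are hypothetical.

SAFE_RANGES = {               # hypothetical safe ranges per vital sign
    "heart_rate": (50, 180),  # beats per minute
    "resp_rate":  (8, 40),    # breaths per minute
    "temp_c":     (35.0, 39.5),
}

def check_responder(reading):
    """Return a list of alert strings for any vital outside its safe range."""
    alerts = []
    for vital, (low, high) in SAFE_RANGES.items():
        value = reading.get(vital)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{reading['id']}: {vital}={value} outside {low}-{high} "
                          f"at location {reading['location']}")
    return alerts

reading = {"id": "FR-7", "location": (38.89, -77.03, -2),  # floor -2: below ground
           "heart_rate": 195, "resp_rate": 30, "temp_c": 38.0}
print(check_responder(reading))  # heart rate is out of range
```

The hard engineering is everything around this loop--rugged sensors, reliable transmission from under rubble, and presentation to the incident commander--which is exactly why the article calls it a "cocktail solution."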

While, for the first responder technology, “ruggedized vital-sign sensors could begin in two years and location tracking in less than a year,” the following lessons are clear:

  • The most substantial progress to the end-user is not made from lone, isolated developments of technology and science, but rather from a convergence of multiple advances and findings that produce a greater synergistic effect. For example, it clearly takes the maturity of numerous technologies to enable the life saving first responder solution envisioned.
  • Moreover, distinct technical advances from the R&D laboratory must be integrated into a solution set that performs in the real world for the end-user; this is when product commercialization becomes practical. In the case of the first responder, equipment must function in emergency, all hazard conditions.
  • And finally, to bring the multiple technologies together into a coherent end-user solution, someone must lead and many parties must collaborate (often taking the form of a project sponsor and an integrated project team) to advance and harmonize the technologies, so that they can perform as required and work together seamlessly. In the case of the first responder technology, DHS S&T took the lead to come up with the vision and make it viable and that will save lives in the future.



June 25, 2010

TEAM: Together Everyone Achieves More

People are selfish; they think in terms of win-lose, not win-win. The cost of this kind of thinking is increasingly unacceptable in a world where teamwork matters more than ever.

Today, the problems we face are sufficiently complex that it takes a great deal more collaboration than ever to yield results. For example, consider the recent oil spill in the Gulf, not to mention the ongoing crises of our time (deadly diseases, world hunger, sustainable energy, terrorism).

When we don’t work together, the results can be catastrophic. Look at the lead-up to 9-11, the poster child for what can happen when we fail to connect the dots.

A relay race is a good metaphor for the consequences of poor teamwork. As Fast Company (“Blowing the Baton Pass,” July/August 2010) reports, in the 2008 Beijing Olympics, the USA’s Darvis Patton was on the third leg of the race, running neck and neck with a runner from Trinidad when he and his relay partner, Tyson Gay, blew it:

“Patton rounded the final turn, approaching…Gay, who was picking up speed to match Patton. Patton extended the baton, Gay reached back, and the baton hit his palm. Then, somehow it fell. The team was disqualified.”

Patton and Gay were each world-class runners on their own, but the lack of coordination between them resulted in crushing defeat.

In the business realm, we saw coordination breakdown happen to JetBlue in February 2007, when “snowstorms had paralyzed New York airports, and rather than cancel flights en masse, Jet Blue loaded up its planes…and some passengers were trapped for hours.”

Why do people in organizations bicker instead of team? According to FC, it’s because we “underestimate the amount of effort needed to coordinate.” I believe it’s really more than that – we don’t underestimate it, but rather we are too busy competing with each other (individually, as teams, as departments, etc.) to recognize the overarching importance of collaboration.

This is partly because we don’t see others as helping us. Instead we (often erroneously) see them as potential threats to be weakened or eliminated. We have blinders on, and these blinders are facilitated and encouraged by a reward system in our organizations that promotes individualism rather than teamwork. (In fact, all along the way, we are taught that we must compete for scarce resources – educational slots, marriage partners, jobs, promotions, bonuses and so on.)

So we think we are hiring the best and the brightest. Polished resume, substantial accomplishments, nice interview, solid references, etc. And of course, we all have the highest expectations for them. But then even the best employees are challenged by organizational cultures where functional silos, “turf wars”, and politicking prevail. Given all of the above, why are we surprised by their failure to collaborate?

Accordingly, in an IT context, project failure has unfortunately become the norm rather than the exception. We can have individuals putting out the best widgets, but if the widgets don’t neatly fit together, aren’t synchronized for delivery on schedule and within budget, don’t meet the intent of the overall customer requirements, and don’t integrate with the rest of the enterprise—then voilà, another failure!

So what do we need to become better at teamwork?

  • Realize that to survive we need to rely on each other and work together rather than bickering and infighting amongst ourselves.
  • Develop a strong, shared vision and a strategy/plan to achieve it—so that we all understand the goals and are marching toward them together.
  • Institute a process to ensure that the contributions of each person are coordinated— the outputs need to fit together and the outcomes need to meet the overarching objectives.
  • Reward true teamwork and disincentivize people who act selfishly, i.e. not in the interest of the team and not for the sake of mission.

Teamwork has become a cliché, and we all pay lip service to it in our performance appraisals. But if we don’t put aside our competitiveness and focus on the common good soon, then we will find ourselves sinking because we refused to swim as a team.



June 12, 2010

Managing Change The Easy Way


We all know that change is not easy, even when it's necessary.

As human beings, we question change, fear change, and at times resist change.

Often, change is timely or even overdue, and is needed to remain fresh, competitive, and in sync with changes in the external and internal environment.

At other times, change could be conceived of for selfish, arbitrary, politically motivated, or poorly thought out reasons.

People often react to change negatively, saying things such as:

- “Everything is really fine, why are you rocking the boat?”

- “This will never work” or “We’ve already tried that and it didn’t work.”

- “This is just the pendulum swinging back the other way again.”

- “Things are now going to be even worse than before.”

- “I’ll never do that!”

The key to dealing with change is not to dismiss people’s feelings, but to take the time to thoroughly understand them, to take input from them for change, and to explain what is changing (precisely), for whom, when, where, and why.

The more precise, timely and thorough the communications with people, the better people will be able to deal with change.

To successfully plan and implement change, we need people to be engaged and on-board rather than to ignore or subvert it.

Below is a nice “change model” from http://www.changecycle.com/changecycle.htm that helps explain the stages of change that people go through: loss, doubt, discomfort, discovery, understanding, and integration.

To me the keys to managing through these six stages of change are solid information, clear communications, and people working together.

The Change Cycle™ Model

(All of the text below is quoted)

Stage 1 – Loss to Safety

In Stage 1 you admit to yourself that, regardless of whether or not you perceive the change to be “good” or “bad,” there will be a sense of loss of what “was.”

Stage 2 – Doubt to Reality

In this stage, you doubt the facts, doubt your doubts and struggle to find information about the change that you believe is valid. Resentment, skepticism and blame cloud your thinking.

Stage 3 – Discomfort to Motivation

You will recognize Stage 3 by the discomfort it brings. The change and all it means has now become clear and starts to settle in. Frustration and lethargy rule until possibility takes over.

The Danger Zone

The Danger Zone represents the pivotal place where you make the choice either to move on to Stage 4 and discover the possibilities the change has presented or to choose fear and return to Stage 1.

Stage 4 – Discovery to Perspective

Stage 4 represents the "light at the end of the tunnel." Perspective, anticipation, and a willingness to make decisions give a new sense of control and hope. You are optimistic about a good outcome because you have choices.

Stage 5 - Understanding

In Stage 5, you understand the change and are more confident, think pragmatically, and your behavior is much more productive. Good thing.

Stage 6 - Integration

By this time, you have regained your ability and willingness to be flexible. You have insight into the ramifications, consequences and rewards of the change -- past, present, and future.
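The six stages above can be read as a simple state machine. Here is a minimal sketch in Python, where the stage names come from the quoted model, but the transition logic (including the Danger Zone retreat from Stage 3 back to Stage 1) is my own illustrative assumption, not part of the trademarked model itself:

```python
# The six Change Cycle stages, in order.
STAGES = ["Loss", "Doubt", "Discomfort", "Discovery", "Understanding", "Integration"]

def next_stage(current: str, chooses_fear: bool = False) -> str:
    """Advance one stage; at Discomfort (the Danger Zone), fear sends a person back to Loss."""
    if current == "Discomfort" and chooses_fear:
        return "Loss"  # retreat instead of moving on to Discovery
    if current == "Integration":
        return "Integration"  # terminal stage: the change has been absorbed
    return STAGES[STAGES.index(current) + 1]

# Walking the happy path from Loss to Integration:
stage = "Loss"
path = [stage]
while stage != "Integration":
    stage = next_stage(stage)
    path.append(stage)
print(path)  # ['Loss', 'Doubt', 'Discomfort', 'Discovery', 'Understanding', 'Integration']
```

The point of the sketch is the managerial one made above: people do not jump straight to integration; they move through the stages one at a time, and without solid information and communication they can slide backward at the pivot point.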



May 22, 2010

Staying Open to Open Source

I don’t know about you, but I have always been a pretty big believer that you get what you pay for.

That is until everything Internet came along and upended the payment model with so many freebies including news and information, email and productivity tools, social networking, videos, games, and so much more.

So when it comes to something like open source (“free”) software, is this something to really take seriously for enterprise use?

According to a cover story in ComputerWorld, 10 May 2010, called “Hidden Snags In Open Source” 61% say “open source has become more acceptable in enterprises over the past few years.” And 80% cited cost-savings as the driving factor or “No. 1 benefit of open-source software.”

However, many companies do not want to take the risk of relying on community support and so “opt to purchase a license for the software rather than using the free-of-charge community version…to get access to the vendor’s support team or to extra features and extensions to the core software, such as management tools.”

To some degree, then, the license cost keeps open source from being a complete freebie for the enterprise (even if it is still cheaper than buying commercial software).

The other major benefit called out from open source is its flexibility—you’ve got the source code and can modify as you like—you can “take a standard install and rip out the guts and do all kinds of weird stuff and make it fit the environment.”

The article notes a word of caution on using open source from Gartner analyst Mark Driver: “The key to minimizing the potential downside and maximizing the upside is governance. Without that you’re shooting in the dark.”

I think that really hits the target on this issue, because to take open source code and make it work in an organization, you have got to have mature processes (such as governance and a system development life cycle, SDLC) in place for working with that code, modifying it, and ensuring that it meets the enterprise requirements, integrates well, tests out, complies with security, privacy, and other policies, and can be adequately supported over its useful life.

If you can’t do all that, then the open source software savings ultimately won’t pan out and you really will have gotten what you paid for.

In short, open source is fine, but make sure you’ve got good governance and strong SDLC processes; otherwise you may find that the cowboys have taken over the Wild West.



November 7, 2009

A Vision of User-centric Communication Design

[Authored by Andy Blumenthal and published in Architecture and Governance Magazine November 2009]

As technology has advanced in leaps and bounds over the last 30 years, so has the number of information devices—from phones to faxes, pagers to PDAs, desktops to Netbooks—and it goes on and on.

Some devices, despite having outlived their useful lives, have been slow to disappear from the scene completely. For example, fax machines are still in our offices and homes, although now often combined with other devices such as the “all-in-one” copier, printer, scanner, and fax. However, with the ability to scan and e-mail attachments, why do we even need to fax at all anymore?

Similarly, at one time, pagers were all the rage for reaching someone 911. Then cell phones and PDAs took over the scene. Nevertheless, paging never fully went away; instead, it was repackaged as “press 1 to send this person a page.” However, why do we need to page them at all anymore, if we can just leave them a voice mail or instant message?

It seems as if legacy technology often just doesn’t want to die, and instead of sun-setting it, we just keep packaging it into the next device, like the phone that comes with e-mail, instant messaging, texting, and more. How many ways do we need to say hello, how are you, and what time will you be home for dinner?

When is technology enough and when is it too much?

Of course, we want and love choice—heck, we’re consumers to the core. Technology choice is like having the perfect outfit for every occasion; we like to have the “right” technology to reach out to others in a myriad of different ways for every occasion.

Should I send you an e-mail on Facebook or should I “poke” you or perhaps we should just chat? Or maybe I should just send you a Tweet or a “direct message” on Twitter? No, better yet, why don’t I send you a message on LinkedIn? Anyway, I could go on for about another three paragraphs at least on how I should/could contact you. Maybe I’ll hit you up on all of them at the same time and drive you a little nuts, or maybe I’ll vary the communications to appear oh so technically versatile and fashionable.

Yes, technology choice is a wonderful thing. But it comes at a price. First, all the communication mediums start to become costly after a while. I can tell you from my cell phone bill that the cost of all these options—e-mail, texting, Internet, and so on—definitely starts to add up. And don’t forget all the devices that we have to schlep around on our belts (I have one cell phone on each side—it’s so cool, like a gunslinger from the Wild West), pockets, and bags—where did I leave that device? Let’s not forget the energy consumption and eco-unfriendliness of all these gadgets and all the messy wires.

Additionally, from a time-is-precious perspective, consider the time sinkhole we have dug for ourselves by trying to maintain a presence on all of these devices and social networking sites. How many hours have we spent trying to keep up and check them all (I’m not sure I can fully remember all my e-mail accounts anymore)? And if you don’t have single sign-on, then all the more hassle— by the way, where did I hide my list of passwords?

Next out of the gate is unified communications. Let’s interoperate all those voice mail accounts, e-mail accounts, IM, presence, and social media communications. Not only will your phone numbers ring to one master, but also your phone will transcribe your voice mails—i.e., you can read your voice mail. Conversely, you can listen to your e-mail with text-to-speech capability. We can run voice-over-IP to cut the traditional phone bill and speed up communications, and we can share non-real-time communications such as e-mail and voice mail with real-time communication systems like our phone.

So, we continue to integrate different communication mediums, but still are not coalescing around a basic device. I believe the “communicator” on Star Trek was a single device to get to someone on the Enterprise or on the planet surface with just the tap of a finger. Perhaps our reality will someday be simpler and more efficient, too. When we tire of playing with our oodles of technology “toys” and signing up for myriad user accounts, we will choose eloquence and simplicity over disjointed—or even unified—communications.

As the founder of User-centric Enterprise Architecture, my vision is to have one communicator (“1C”) device, period. 1C is an intelligent device. “Contact John,” okay—no phone number to dial and no e-mail to address. 1C knows who John is, how to reach him, the best way to contact him, and if he is available (“present”) at the moment or not. 1C can take a message, leave a message, or communicate in any way (voice, text, video, virtual) that an individual prefers and that is appropriate for each portion of a particular communication to ensure that the communication intended is the communication received. 1C is not limited to one-on-one communication, but is open to conferencing—as needed. Mention the need for Cindy to be in on the communication and instantaneously, Cindy is on and then off again. 1C is ubiquitous in time and space—I can send you a communication to arrive now or next week, when you’re here or there, when you’re in country or out, in a car, on a flight, on a ship, or underwater—it doesn’t matter. Like telepathy, the communication reaches you effortlessly. And, of course, 1C translates languages, dialects, acronyms, or concepts, as needed—truly it’s a “universal communicator.”
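The core of the 1C vision is that the user supplies only the “who” and the “what,” and the device resolves the “how.” A purely hypothetical toy sketch of that idea follows; the `Contact` record and the `pick_channel` logic are invented for illustration only, not a real product or API:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    name: str
    present: bool   # is the person reachable ("present") right now?
    preferred: str  # the channel this person prefers when available

def pick_channel(contact: Contact, urgent: bool) -> str:
    """Resolve the 'how' of a communication so the user never has to.

    A present contact gets reached on their preferred channel; an absent
    one gets an asynchronous message, with urgency deciding the medium.
    """
    if contact.present:
        return contact.preferred
    return "voice message" if urgent else "text message"

# "Contact John" -- no number dialed, no address typed:
john = Contact(name="John", present=False, preferred="video")
print(pick_channel(john, urgent=True))    # voice message

cindy = Contact(name="Cindy", present=True, preferred="voice")
print(pick_channel(cindy, urgent=False))  # voice
```

The real 1C would of course fold in presence detection, conferencing, and translation; the sketch only shows the central design choice, that channel selection belongs to the device, not the user.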

The closest we’ve come so far is probably the Apple iPhone, but with some 50,000 apps and counting, it is again too focused on the application or technology to be used, rather than on the user and the need.

In the end, it’s not how many devices or how many accounts or how many mediums we have to communicate with, but it is the communication itself that must be the focus. The 1C of the future is an enabler for the communication—anytime, anywhere, the right information to the right people. The how shouldn’t be a concern for the user, only the what.



September 25, 2009

Nanotechnology and Enterprise Architecture

“Nanotechnology is the engineering of functional systems at the molecular scale. In its original sense, 'nanotechnology' refers to the ability to construct items from the bottom up.” (Center for Responsible Nanotechnology)

Two examples of nanotechnology include the manufacturing of super strength polymers, and the design of computer chips at the molecular level (quantum computing). This is related to biotechnology, where technology is applied to living systems, such as recombinant DNA, biopharmaceuticals, or gene therapy.


How do we apply nanotechnology concepts to User-centric EA?
  • Integration vs. Decomposition: Traditional EA has looked at things from the top-down, where we decompose business functions into processes, information flows, and systems into services. But nanotechnology, from a process perspective, shows us that there is an alternate approach, where we integrate or build up from the bottom-up. This concept of integration can be used, for example, to connect activities into capabilities, and capabilities into competencies. These competencies are then the basis for building competitive advantage or carrying out mission execution.
  • Big is out, small is in: As we architect business processes, information sharing, and IT systems, we need to think “smaller.” Users are looking to shed the monolithic technology solutions of yesteryear for smaller, more agile, and more mobile solutions today. Examples include centralized cloud computing services replacing hundreds and thousands of redundant instances of individual systems and infrastructure silos, smaller-sized but larger-capacity storage solutions, and ever more sleek personal digital assistants that pack in the functionality of cell phones, e-mail, web browsing, cameras, iPods, and more.
  • Imagination and the Future State: As architects, we are concerned not only with the as-is, but also with the to-be state (many would say this is the primary reason for EA, and I would agree, although you can't establish a very effective transition plan without knowing where you're coming from and going to). As we plan for the future state of things, we need to let our imagination soar. Moore’s Law, a view into the pace of technological change, holds that the number of transistors on an integrated circuit doubles roughly every 24 months. With the rapid pace of technological change, it is difficult for architects to truly imagine what the possibilities are 3-5 years out--but that can't stop us from trying, based on analysis, trends, forecasts, emerging technologies, competitive assessments, and best practice research.
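The Moore's Law pacing mentioned above makes the 3-5 year planning horizon concrete with a quick back-of-the-envelope calculation; the starting transistor count here is an arbitrary round number, not a real chip:

```python
def transistors(start: float, months: int, doubling_period: int = 24) -> float:
    """Project a transistor count forward, given a fixed doubling period in months."""
    return start * 2 ** (months / doubling_period)

start = 1_000_000_000  # 1 billion transistors today (illustrative)
for years in (3, 5):
    projected = transistors(start, years * 12)
    print(f"In {years} years: ~{projected / start:.1f}x today's count")
# Roughly 2.8x in 3 years and 5.7x in 5 years -- which is why a 3-5 year
# target architecture has to leave room for capabilities that don't exist yet.
```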

The field of information technology, like those of nanotechnology and biotechnology, is not only evolving, but is moving so quickly as to seem almost revolutionary at times. So in enterprise architecture, we need to use lots of imagination in thinking about the future and target state. Additionally, we need to think not only in terms of traditional architecture decomposition (a top-down view), but also integration (a bottom-up view) of the organization, its processes, information shares, and technologies. And finally, we need to remain constantly nimble and agile in the globalized, competitive marketplace where change is a constant.



August 31, 2008

“Design Thinking” and Enterprise Architecture

Ranked as one of the most innovative companies in the world, IDEO is an innovation and design firm founded in 1991. Its client list includes heavy hitters such as Microsoft, Intel, Nokia, Nestlé, and Procter & Gamble.

According to their website, they specialize in helping organizations to “Visualize new directions for companies and brands and design the offerings - products, services, spaces, media, and software - that bring innovation strategy to life.”

Harvard Business Review, June 2008, has an article by their CEO and President, Tim Brown.

First, how IDEO defines innovation:

“Innovation is powered by a thorough understanding, through direct observation, of what people want and need in their lives and what they like or dislike about the way particular products are made, packaged, marketed, sold, and supported.”

“Leaders now look to innovation as a principal source of differentiation and competitive advantage; they would do well to incorporate design thinking into all phases and processes.”

“Rather than asking designers to make an already developed idea more attractive to consumers, companies are asking them to create ideas that better meet consumers’ needs and desires.”

The three phases of design:

  • Inspiration—the problem or opportunity that is driving the creative design process.
  • Ideation (brainstorming)—“the process of generating, developing, and testing ideas that may lead to solutions.”
  • Implementation—“executing the vision or how we bring the design concept to market.”

How can you be a design thinker?

A people-first approach—based on keen observation and noticing things that others do not, you can use insights to inspire innovative ideas that meet explicit and implicit needs. This is similar to a user-centric enterprise architecture approach, where we drive business process improvement and the introduction of new technologies based on genuine user/business requirements and a strategic understanding of the performance, business, information, systems, technologies, security, and human capital aspects of the organization.

Integration—To develop innovative solutions, you need to integrate “sometimes contradictory aspects of a confounding problem and create novel solutions that go beyond and dramatically improve on existing alternatives.” Integration is an important aspect of EA, not only in terms of synthesizing business and technology to enable creative architecture plans that drive the organization into the future, but also in terms of breaking down structural and process silos and building a more holistic, synergistic, interoperable, and capable organization.

Experimentation—there are “endless rounds of trial and error—the ‘99% perspiration’ in [Thomas] Edison’s famous definition of genius.” Most great ideas don’t “pop up fully formed out of brilliant minds”—“they are not a sudden breakthrough nor the lightning strike of genius,” but rather “the result of hard work augmented by a creative human-centered discovery process and followed by iterative cycles of prototyping, testing, and refinement.” While enterprise architecture is not, generally speaking, a discipline based in experimentation, part of EA planning should focus on market and competitive research, including best practices identification and sponsorship, that will be used to drive modernization and transformation of the enterprise. Additionally, the EA should include research and development efforts in the plans to acknowledge the ongoing innovation required for the organization to grow, mature, and compete.

Collaboration—“the increasing complexity of products, services, and experiences has replaced the myth of the lone creative genius with the reality of the enthusiastic interdisciplinary collaborator.” As an enterprise architect, I am an ardent proponent of this principle. In the large and complex modern-day organization of the 21st century, we need both breadth and depth of subject matter experts to build the EA, govern it, manage change, and drive modernization in our enterprises. As any half-decent architect knows, ivory tower planning efforts are bound for failure. We must work collaboratively with the business and technology experts and give all our stakeholders a voice at the table—this gives change and innovation the best chance of real success.

Tim Brown says that “design thinking can lead to innovation that goes beyond aesthetics…time and again we see successful products that were not necessarily the first to market, but were the first to appeal to us emotionally and functionally…as more of our basic needs are met, we increasingly expect sophisticated experiences that are emotionally satisfying and meaningful.”

I believe this is a lesson for EA as well:

For enterprise architecture to be successful, it is not enough for it to be functional (i.e., to set a good plan); rather, it has to embody design thinking and be useful and usable to our end-users (i.e., User-centric). By incorporating innovative thinking not only into the EA plans, but also into how we reach out and collaborate with our stakeholders to build the plans (i.e., by sparking the innovative process and creative juices through constructive challenging of the status quo with a broad array of subject matter experts), and into the way we employ design to portray and communicate these plans (i.e., with profiles, models, and inventories, for example), we will have an architecture that truly represents the organization, is understood by it, and serves its needs and aspirations.



August 3, 2008

Texting Gone Wild and Enterprise Architecture

Information availability and communication mobility are all the craze. We are connected everywhere we go. We have our phones, PDAs, and laptops as part of our everyday gear. We wouldn’t leave the house without one or more of them or a converged device like the iPhone or Sidekick. And people are walking and driving around yapping on the phone or typing out text messages. Even at work, people are answering the phone and texting in the stall. What is it about being connected with these devices that we literally can’t let go?

The Wall Street Journal, 25 July 2008, reports that “Emailing on the Go Sends Some Users Into Harm’s Way.”

These multi-taskers “ram into walls and doorways or fall down stairs. Out on the streets, they bump into lampposts, parked cars, garbage cans, and other stationary objects.”

Are people getting hurt?

You bet. James Adams, chairman of emergency medicine at Northwestern Memorial Hospital, says he has “treated patients involved in texting incidents nearly every day this summer.”

Things have gotten so out of control that one London company began “outfitting lampposts with padded bumpers in the East End to cut down on injuries to errant texters.”

The stories go on and on about texters who bump into brides at weddings, fall off curbs into construction barricades, walk into two-by-fours toted by construction workers, knock into bikers, and fall down staircases.

As a student of organizational behavior and an enterprise architect, I ask myself what is going on that people feel such a compelling need to be in touch literally every second. Are people craving intimacy? Are they insecure? Do they get a high by connecting with others and just can’t stop? Is this good thing for society and our organizations?

Certainly, the ability to communicate anytime, anywhere is a good thing. It makes us more capable. It can make us more productive (if we don’t end up killing ourselves in stupid accidents doing it irresponsibly). But like all good things, we need to learn to control our appetite for them. It’s the difference between eating thoughtfully or eating thoughtlessly, like a glutton. Or between taking medicine when needed to treat a legitimate medical condition or using it recklessly, like an addict.

Part of good enterprise architecture is building balance into the organization. Architects introduce new technologies to enable performance, but should also help develop policies and ensure training for responsible usage.

It’s terrific to bring new capabilities to the organization and society, but our role as architects does not end there. The human capital perspective of the enterprise architecture comes into play and demands that we go beyond the pure business requirements and technology solutions, and explore the impact of the technology on the people who will use it. The human capital perspective of the architecture provides a lens through which we can manage the integration of people and technology.

I believe that we should educate people to use technology more responsibly, rather than outfit every lamppost and tree with bumper pads!



March 14, 2008

Conflict Theory and Enterprise Architecture

“Conflict theory states that the society or organization functions so that each individual participant and its groups struggle to maximize their benefits… The essence of conflict theory is best epitomized by the classic ‘pyramid structure’ in which an elite dictates terms to the larger masses. All major institutions, laws, and traditions in the society are created to support those who have traditionally been in power, or the groups that are perceived to be superior in the society according to this theory. This can also be expanded to include any society’s ‘morality’ and by extension their definition of deviance. Anything that challenges the control of the elite will likely be considered ‘deviant’ or ‘morally reprehensible.’” (Wikipedia)

In the organizations that we work in today—modern times—is everything copacetic, or is there inherent conflict? How does this affect EA, and how is this impacted by EA?

We all hear and read the message from the top—from the executive(s) in charge—messages of unity of command, unity of purpose, and unity of structure. “We’re all in this together!”

However, the reality is that there are power struggles up and down, sideways, and on the diagonals of the organization—this is conflict theory! Those at the top wish to stay there. Those at the lower rungs wish to climb up and check out the view. The organization is a pyramid, with fewer and fewer senior-level positions as you go higher and higher up. Everyone in the organization is evaluated by measures of performance and is competing for resources, power, influence, and advancement.

I remember learning at Jewish day school, that people are half animal and half angel. Sort of like the age old conflict of good and evil. Freud, for the individual, put it in terms of the id and superego.

On one hand, conflict theory pits egocentric and selfish behavior against the greater needs of the organization (and the goals of EA) to share, collaborate, integrate, and go forward as the army slogan states, “an army of one!” The individual or group in the enterprise wants to know the proverbial, “what’s in it for me?”

On the other hand, User-centric EA is about collaboration: collaboration between business and IT, collaboration within the business, collaboration within IT, and even collaboration outside the agency (such as through alignment to the department, the federal EA, and so on). The collaboration takes the form of information sharing, structured governance, an agreed on target and plan, and the building of interoperability, standards, efficiencies, enterprise solutions, and overall integration!

It is not easy for EA to be a counterbalance to conflict theory. The organization needs to provide incentives for positive behavior (and disincentives for negative behavior), so that everyone is encouraged to team, collaborate, share, and look at the bigger picture for the success of the overall enterprise!

I’ve seen organizations take steps toward building unity through team awards, criteria in everyone’s performance evaluation for teamwork, and actual mandates to share information. These are positive steps, but more needs to be done to make the enterprise flatter, more collaborative, and remind all employees that they work for the end-user.