Showing posts with label Knowledge Management.

August 4, 2018

Things Look Different Up Close



So this was interesting. 

I was coming up the highway. 

In the distance, it looked like a large tractor-trailer was heading towards me.

I had to do a double take, because this truck was on my side of the divider...Oh shit!

It was only as I got closer that I could see that the truck was really being towed in reverse by a tow truck. 

Yes, "seeing is believing!"

This is a lesson in life:

Things may look one way from a distance, and very different up close. 

Sometimes, my wife tells me:

"Andy, don't look too close!" lol

But the truth is that you may not really see what you're heading towards until it's right in front of your eyes.

So it's important to look out over the horizon and study what is coming your way. 

But don't take your eye off the ball (or Mack Truck as it may be). 

Your perspective can change the closer you get. ;-)

(Source Photo: Andy Blumenthal)

May 1, 2018

Information Is Power

Just wanted to share something I heard and liked about data and information:
"Everything is a record, record, record
in a table, table, table."
Can everything in life really be reduced to lines of records, with fields of data in tables of information?

This is the information age!

Analytics and Big Data rule!

Knowledge is power!

In any conflict, we seek information dominance and supremacy!

Artificial intelligence is the future!

Records are unique with their own sys.id.

Creativity and innovation are also records in the table--even if they are the one in a million. 

The more records and tables--the more dots and connections between them--the more intelligence we can glean. 
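
To make the metaphor concrete, here is a minimal sketch (my own illustration, not from any particular system--the table names, the sys_id column, and the sample rows are all made up): every record gets a unique ID, and joining tables is how the dots get connected.

```python
import sqlite3

# Two tiny tables; each record has its own unique ID (called sys_id here
# purely for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE people (sys_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE ideas  (sys_id INTEGER PRIMARY KEY, person_id INTEGER, idea TEXT);
    INSERT INTO people VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO ideas  VALUES (10, 1, 'Analytical engine notes'), (11, 2, 'Compiler');
""")

# The join is where the "intelligence" comes from--the connections between records.
for name, idea in conn.execute(
    "SELECT p.name, i.idea FROM people p JOIN ideas i ON i.person_id = p.sys_id"
):
    print(name, "->", idea)
```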

Yes, everything is a record, record, record in a table, table, table. ;-)

(Source Photo: Andy Blumenthal)

January 8, 2016

We Just Keep Giving It All Away

How do these things keep happening to us?

We lost a high-tech Hellfire air-to-ground missile, accidentally sending it to Cuba, likely compromising critical sensor and GPS targeting technology to China, Russia, and/or North Korea. 

But it's not all that different from many other examples, such as: 

- Chinese cyber espionage snared critical design secrets to the 5th generation F-35 Joint Strike Fighter.

- Iran captured and purportedly decoded an RQ-170 Sentinel high-altitude reconnaissance drone.

- Russian spies stole U.S. nuclear secrets helping them to build their first atomic bomb.

We are the innovator for high-tech bar none, which is beautiful and a huge competitive advantage. 

But what good is it when we can't protect our intellectual property and national security secrets. 

The U.S. feeds the world not only with our agriculture, but with our knowledge.

Knowledge Management should be a mindful exercise that rewards our allies and friends and protects us from our enemies--and not a free-for-all where we can't responsibly control our information. ;-)

(Source Photo: here with attribution to James Emery)

November 1, 2013

Why Memorize?

G-d, I remember as a kid in school having to memorize everything for every class--that was the humdrum life for a schoolchild.

Vocabulary words, grammar rules, multiplication tables, algebraic and geometric equations, scientific formulas, historical events, famous quotes, states and capitals, presidents, QWERTY keys, and more. 

It was stuff it in, spit it out, and basically forget it.

This seemed the only way to make room for ever more things to memorize and test out. 

In a way, you really had to memorize everything, because going to a reference library and having to look things up in the stacks of endless shelves or on microfiche machines was a pain in the you-know-what. 

Alternatively, the home dictionary, thesaurus, and encyclopedia were indispensable, but limited, slow, dated, and annoying. 

But as the universe of knowledge exploded, became ever more specialized, and the Internet was born, looking something up was a cinch and often necessary. 

All of a sudden, memorization was out and critical thinking was in. 

That's a good thing, especially if you don't want people who are simple repositories of stale information, but rather those who can question, analyze, and solve problems. 

Albert Einstein said, "Never memorize something that you can look up."

But an interesting editorial in the Wall Street Journal by an old school teacher questions that logic. 

David Bonagura Jr. proposes that critical thinking and analysis "is impossible without first acquiring rock-solid knowledge of the foundational elements upon which the pyramid of cognition rests."

He says, "Memorization is the most effective means to build that foundation."

As a kid, I hated memorization and thought it was a waste of time, but looking back I find that more things stayed in that little head of mine than I had thought. 

I find myself relying on those foundations every day...in writing, speaking, calculating, and even remembering an important story, principle, saying, or even song lyrics.

These come out in my work--things that I thought were long lost and forgotten, but are part of my thinking, skills, and truly create a foundation for me to analyze situations and solve problems. 

In fact, I wish I knew more and retained it all, but short-term memory be damned. 

We can't depend on the Internet for all the answers--in fact, someday it may not be there for us when we need it. 

We must have core knowledge that is vital for life and survival, and this is slowly being lost and eroded as we depend on the Internet to be our alternate brain. 

No, memorizing for memorization's sake is a waste of time, but building a foundation of critical skills has merits. 

Who decides what is critical and worthwhile is a whole other matter to address.

And are we building human automatons full of worthless information that is no longer relevant to today's lifestyles and problems, or are we teaching what's really important and useful to the human psyche, soul, and evolution? 

Creativity, critical thinking, and self-expression are vital skills to our ability to solve problems, but these can't exist in a vacuum of valuable brain matter and content.

It's great to have a readily available reference of world information at our fingertips online, but unless you want to sound (and act) like an idiot, you'd better actually know something too. ;-)

(Source Photo: here with attribution to Chapendra)

February 29, 2012

Progressing From Data to Wisdom

I liked this explanation (not verbatim) by Dr. Jim Chen of data, information, knowledge, and wisdom.

- Data: This is an alphanumeric entity and/or symbol (ABC, 123, !@#...)

- Information: This is when entities are related/associated to each other and thereby derive meaning. (Information = Data + Meaning)

- Knowledge: This is information applied to context. (Knowledge = Data + Meaning + Context)

- Wisdom: This is knowledge applied to multiple contexts. (Wisdom = Data + Meaning + (Context x N cases)).
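
Here is a tiny, illustrative sketch of that progression (my own toy model, not Dr. Chen's)--each layer simply adds something to the one below it:

```python
from dataclasses import dataclass, field

# Data: raw entities/symbols
@dataclass
class Data:
    value: str

# Information = Data + Meaning
@dataclass
class Information(Data):
    meaning: str

# Knowledge = Data + Meaning + Context
@dataclass
class Knowledge(Information):
    context: str

# Wisdom = Knowledge applied across N contexts/cases
@dataclass
class Wisdom(Knowledge):
    cases: list = field(default_factory=list)

w = Wisdom(value="98.6", meaning="body temperature (F)",
           context="adult patient at rest",
           cases=["clinic visits", "home readings", "sports physicals"])
print(w)
```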

I'd like to end this blog with a short quote that I thought sort of sums it up:

"A man may be born to wealth, but wisdom comes only with length of days." - Anonymous
(Source Photo: here)


January 2, 2012

The Internet Lives


While the Internet, with all its information, is constantly changing with updates and new content, what is great to know is that it is being preserved and archived, so present and future generations can "travel back," see what it looked like at earlier points in time, and have access to the wealth of information contained in it.

This is what the Internet Archive does--this non-profit organization functions as the Library of the Internet. It is building "permanent access for researchers, historians, scholars, people with disabilities, and the general public to historical collections that exist in digital format."

In the Internet Archive you will find "texts, audio, moving images, and software as well as archived web pages" going back to 1996 until today.

I tested the Archive's Wayback Machine with my site, The Total CIO, and was able to see what it looked like back on October 24, 2010.
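
For the technically curious, the Wayback Machine also exposes an availability endpoint that can be queried programmatically. Here is a minimal sketch (my own, based on the API as I understand it--the exact response fields may differ, and the URL below is just a placeholder):

```python
import json
import urllib.parse
import urllib.request

def closest_snapshot(url: str, timestamp: str) -> dict:
    """Ask the Wayback Machine for the archived snapshot closest to YYYYMMDD."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    with urllib.request.urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        payload = json.load(resp)
    # 'archived_snapshots.closest' holds the nearest capture, if one exists.
    return payload.get("archived_snapshots", {}).get("closest", {})

# Placeholder URL--substitute your own site.
snap = closest_snapshot("example.com", "20101024")
print(snap.get("url", "no snapshot found"))
```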

It is wonderful to see our digital records being preserved by the Internet Archive, just like our paper records are preserved in archives such as The Library of Congress, which is considered "the world's most comprehensive record of human creativity and knowledge"; The National Archives, which preserves government and historical records; and The National Security Archive, a research institute and library at The George Washington University that "collects and publishes declassified documents through the Freedom of Information Act...[on] topics pertaining to national security, foreign, intelligence, and economic policies of the United States."

The Internet Archive is located in San Francisco (and my understanding is that there is a backup site in Egypt).

The Internet Archive is created using spider programs that crawl the publicly available pages of the Internet and then copy and store the data, which is indexed three-dimensionally to allow browsing over multiple periods of time.

The Archive now contains roughly 2 petabytes of information, and is growing by 20 terabytes per month. According to The Archive, the data is stored on hundreds (by my count, it should be about 2,000) of slightly modified x86 machines running Linux, with each storing approximately a terabyte of data.

According to the FAQs, it does take some time for web pages to show up--somewhere between 6 months and 2 years--because of the process of indexing and transferring to long-term storage. Hopefully the process will get faster, but in my opinion, having an organized collection and archive of the Internet is well worth the wait.

Ultimately, the Internet Archive may someday be (or be part of) the Time Capsule of human knowledge and experience that helps us survive human or natural disaster by providing the means to reconstitute the human race itself.

(Source Photo: here)


April 29, 2011

A Place for Answers


First there was Wikipedia and now there is Quora.

On January 15, 2011, Wikipedia celebrated its 10-year birthday, and according to Bloomberg Businessweek, it now has more than 17 million entries (compared with only 120,000 for Encyclopedia Britannica) in 250 languages and is one of the most visited sites on the Internet. Moreover, the accuracy of the crowd-sourced Wikipedia has generally been found to be as good as that of traditional encyclopedias.

But despite the phenomenal growth of Wikipedia, a new site, Quora is finding a place for itself in online knowledge management, as one of the key question and answer (Q&A) destinations of the web (others being Answers.com, Yahoo Answers, and more--which were apparently found lacking by the founders of Quora).

According to Wired (May 2011), Quora is only 2 years old and already has about 200,000 people visiting the site each month. The approach of Quora is to create a searchable knowledge market based on merging verifiable facts with people's personal experiences and observations or what Wired calls "the large expanse between...the purely objective [e.g. Wikipedia] and the purely subjective [e.g. Facebook, Twitter, etc.]."

Quora is looking to capture what it believes is the "Ninety percent of information people have [that] is still in their heads and not on the web."

The site is also creating a community of people who participate in asking and answering questions, and can select to follow topics and people of interest, and vote on whether answers are helpful ("voted up") or not to push answers up or down the page.

Similar to Wikipedia, answers can be "trimmed, corrected, or otherwise massaged by one of the rigorous volunteers" (of which there are now more than 100--Quora only has 18 employees). Answers are "written for the world, and for anyone who has that same question for the rest of time." And even questions can get "extensively reworded."

Wired asks whether this is just another popularity contest on the web or self-promotion for the self-proclaimed experts. One of the volunteers responds that "This isn't about job searching. It's not about raising money. Most of us who are heavy users can already do that without help. It's a sense of sharing what we know, and it's being part of a community."

Of course, while critics may call them pedantic or petty, the Quora participants are on a mission to build a vital and timeless knowledge repository--"the modern-day equivalent to the Library of Alexandria", so perhaps the people chic has to be balanced with information usability.

On January 21, 2011, TechCrunch awarded Quora "best new startup for 2010."

It will be interesting to see where this goes...the funny thing for me was that I ended up looking Quora up in Wikipedia. :-)


April 23, 2011

Information-Free Is Invaluable

At first, I admit, I didn't really get Google; I mean, what is this G-o-o-g-l-e and the shtick about "doing search"?

But the writing was on the wall all along with their incredible mission statement of: "to organize the world's information and make it universally accessible and useful."

So search is just the beginning of a long list of now amazingly valuable Google properties and services (now valued at a market capitalization of almost $169 billion):

- Search (Google Search, Google Search Appliance, Google Desktop)
- Cloud Computing (Google Apps Engine, Google Storage for Developers, Chrome Notebooks)
- Advertising Technology (Adwords, AdSense, DoubleClick)
- Website Analytics (Google Analytics)
- Operating Systems (Chrome OS, Android, Honeycomb)
- Web Browser (Google Chrome)
- Productivity Software (Gmail, Google Calendar, Google Apps Suite)
- Social Computing (Google Wave, Google Talk, Orkut, Buzz)
- News Aggregator (Google News, Google Reader)
- Translation (Google Translate)
- Telecommunication (Google Voice)
- Clean Energy (Google Energy)
- Geospatial (Google Maps, Google Earth)
- Video (YouTube)
- Photos (Picasa)
- Electronic Books (Google Books)
- Blogs (Blogger)

What Google seems to intuitively get is that their free, powerful web services create invaluable consumer market share and mind share--like a honey pot. Once consumers come on board--like good little bees--they are ripe for companies to reach out to via advertising for any and every sort of product and service under the sun. And according to their 1998 revenue breakdown, as much as 99% of Google's revenue is associated with advertising!

Google is brilliant and successful for a number of reasons:

1) Google is consumer-oriented and knows how to attract the crowd with free services, and they let others (the advertisers) concern themselves with monetizing them.

2) Google is incredibly innovative and provides the breadth and depth of technology services (from cloud to productivity to search to video) that consumers need and that are easy for them to use.

3) Google is information rich, but they share this broadly and freely with everyone. While some have complained about the privacy implications of this information bounty, so far Google seems to have managed to maintain a healthy balance of information privacy and publicity.

4) Google values their people, as their "owners manual" reads: "our employees...are everything. We will reward them and treat them well." And to help retain their talent, Google just gave their employees a 10% raise in January.

5) Google wants to be a force for good--their creed is "Don't be evil." They state in their manual: "We believe strongly that in the long term, we will be better served- as shareholders and in all other ways--by a company that does good things for the world, even if we forgo some short-term gains."

Do not underestimate Google--as the Wall Street Journal (23-24 April 2011) summarizes, they are not a conventional company.

At the end of the day, if Google is successful in their business of making information universally accessible and useful, then we are talking about making an invaluable difference in the lives of humanity--where information builds on itself, and knowledge--like the Tree of Knowledge in the Book of Genesis--is alive and constantly growing for all to benefit from in the Garden of Eden we call Earth.

(Source Picture: Honeybird)


April 10, 2011

The Twitter Miracle

Twitter is a crazy thing--little blue birdie...tweet, tweet, tweet.
Why do we even do it (tweet)?
Here are the "4 Stages of Getting Twitter" (Credit: Andfaraway):
  • Stage 1--It starts with utmost skepticism and even denigrating the tool (e.g. it's stupid, dumb, a time-waster...)
  • Stage 2--Then it moves to well why don't I just try it and see what all the commotion is all about--maybe I'll like it?
  • Stage 3--As the interaction with others (RT's, @'s and messages) start to flow, you have the ah ha moment--I can communicate with just about anyone, globally!
  • Stage 4--I like this (can anyone say addiction!). I can share, collaborate, influence--way beyond my traditional boundaries. This is amazing--this is almost miraculous.
Here are some other things I like about Twitter:
1) Like a journal, it's a way to capture your thoughts, experiences, feelings, likes/dislikes. (One thing I don't like about Twitter is that there is no good way that I know of to archive or print your tweets--I hope they fix this, please).
2) Another thing about Twitter (and Blogger and Wikipedia for that matter)...I imagine sometimes that this is an incredible social time capsule (i.e. knowledge repository) that we are putting together (almost unknowingly) that will carry humankind forward past any future natural or man-made disasters. Years ago, people would bury a few mementos in a treasure chest or something, as a time capsule, and what a find this would be for people years later when they would open it up and learn firsthand what life was like in "those days." Now, imagine the treasure trove of the exabytes of information contributed to by hundreds of millions of people from around the world. What is also fascinating to me is that people contribute enormous amounts of their time and energy and all for free--hey, this is even less than what Amazon's Mechanical Turks could do it for! :-)
Clearly, people want to express themselves and connect with others--and social media gives ever new meaning to this beyond physical space and time.


March 14, 2011

Watson Can Swim

With IBM's Watson beating the pants off Jennings and Rutter in Jeopardy, a lot of people want to know: can computers really think?

Both sides of this debate have shown up in the last few weeks in some fascinating editorials in the Wall Street Journal.

On one hand, on 23 February 2011, John Searle of the University of California, Berkeley, wrote that "IBM invented an ingenious program--not a computer that can think." According to Searle, Watson (or any computer for that matter) is not thinking but is simulating thinking.

In his most passionate expression, Searle exclaims: "Watson did not understand the questions, nor its answers, nor that some of its answers were right and some wrong, nor that it was playing a game, nor that it won--because it doesn't understand anything."

On the other hand, today, 14 March 2011, Stephen Baker, author of "Final Jeopardy--Man vs. Machine and the Quest to Know Everything," took the opposing view and stated: "Watson is an early sighting of a highly disruptive force...one that can handle [information] jobs held by people."

To the question of whether machine thinking is "real" thinking, Baker quotes David Ferrucci, IBM's chief scientist, who, when asked if Watson can think, responded, "Can a submarine swim?"

The analogy is a very good one.

Just because a submarine doesn't swim like a fish or a person, doesn't mean it can't swim. In fact and in a sense, for the very reason that it doesn't swim exactly like a fish or person, it actually can swim better.

So too with computers: just because they don't "think" like humans doesn't mean they don't think. They just think differently, and again, in a sense, maybe for the very same reason, in certain ways they can think better.

How can a computer sometimes think better than a person? Well here are just some possible examples (non-exhaustive):

- Computers can evaluate options purely based on facts (and not get "bogged down" in emotions, conflict, ego, and so forth like human beings).

- Computers can add processing power and storage at the push of a button, like in cloud computing (people have the gray matter between their ears that G-d gave them, period).

- Computers do not tire by a problem--they will literally mechanically keep attacking it until solved (like cracking a password).

- Computers can be upgraded over time with new hardware, software, and operating systems (unlike people who age and pass).

At the same time, it is important to note that people still trump computers in a number of facets:

- We can evaluate things based on our conscience and think in terms of good and evil, and faith in a higher power (a topic of a prior blog).

- We can care for one another--especially children and the needy--in an altruistic way that is not based on information or facts, but on love.

- We can work together like ants in a colony or bees in a hive or crowdsourcing on- or off-line to get large jobs done with diversity and empowerment.

- We are motivated to better ourselves and our world--to advance ourselves, families, and society through continuous improvement.

Perhaps, like the submarine and the fish, both of which can "swim" in their own ways, so too both computers and people can "think"--each in their own capacity. Together, computers and people can augment the other--being stronger and more effective in carrying out the great tasks and challenges that confront us and await.


March 12, 2011

Civic Commons-A Lesson In Sharing

Love this concept of Civic Commons, which was presented at the Gov 2.0 Summit (2010) and is now becoming a reality.

Presented by Bryan Sivak, the CTO for DC, Civic Commons is about governments collaborating and building technology once and reusing it multiple times, rather than reinventing the wheel over and over again--a critical enterprise architecture principle.

Governments have similar needs and can share common solutions--products and projects--for these.

A number of successful examples:

1) DC and San Francisco building Open 311 (which I wrote about in a prior blog).
2) Minnesota building a $50 million unemployment insurance system and then sharing it with Iowa, which implemented it for less than half that.

Some initial products that have been committed:

1) White House IT Federal Dashboard
2) Track DC (Operational Dashboard)
3) San Francisco Master Address Database Geocoder
4) New York Senate's Open Legislation Application

And more will be coming...all of which can be used and improved upon.

It is great to see so many city and state governments collaborating across the nation--from Seattle to LA, Boston, San Francisco, NY, and Chicago. Moreover, they are coordinating with the Federal Government, as well as with supporting organizations, such as OpenPlans, Code for America, O'Reilly Media, and more, that are helping with coordination, facilitation, and support.

This is another great step in breaking down the silos that separate us and becoming more efficient in working together and learning to share what can benefit many.


February 3, 2011

Leading With Business Intelligence

Check out this great video on Mobile Business Intelligence (BI) put out by MicroStrategy (Note: this is not an endorsement of any particular vendor or product).

Watch the user fly through touchscreen tables, charts, graphs, maps, and more on an iPhone and iPad--can it really be this easy?

This fits in with my firm belief that we've got to use business analytics, dashboarding, and everything "information visualization" (when done in a user-centric way) to drive better decision-making.

This is also ultimately a big part of what knowledge management is all about--we turn data into actionable insight!

What is so cool about this Mobile BI is that you can now access scorecards, data mining, slicing and dicing (Online Analytical Processing--OLAP), alerting, and reporting all from a smartphone or tablet.
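
To demystify the "slicing and dicing" a bit, here is a small sketch (with made-up data, not tied to MicroStrategy or any particular product) of the kind of aggregation an OLAP-style tool does under the hood:

```python
import pandas as pd

# A toy sales "cube": rows are facts, columns are dimensions plus a measure.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "month":   ["Jan", "Jan", "Jan", "Feb", "Feb"],
    "revenue": [100, 150, 120, 130, 90],
})

# "Dice" revenue by region and product...
print(sales.pivot_table(values="revenue", index="region",
                        columns="product", aggfunc="sum", fill_value=0))

# ...and "slice" a single dimension: the West region, broken out by month.
print(sales[sales["region"] == "West"].groupby("month")["revenue"].sum())
```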

This integrates with Google Maps and is being used by major organizations such as the U.S. Postal Service and eBay.

Running a business, I would want this type of capability...wouldn't you?

As Federal Judge John E. Jones said: "What gets measured gets done, what gets measured and fed back gets done well, and what gets rewarded gets repeated."


January 2, 2011

The Robots Are Coming

Forget waiters and waitresses: the new Japanese Hajime Robot restaurant in Bangkok, Thailand, invested almost $1 million in 4 robotic waitstaff.

You order your food by touch screen computer, and there is a countdown on the screen for when the food is ready and the robot brings it out to you.

While the samurai clad robots are not the best looking—their huge eyes are a little cartoonish—they are certainly quite dexterous and able as they nimbly serve the food in this restaurant and dance for the customers in between courses without missing a beat.

Initially, automation affected the jobs of blue-collar workers in manufacturing and mechanical work as robots displaced people on the “assembly line.” Now we see the trend continuing and expanding, with automation entering the service industry and affecting jobs involving customer interaction, entertainment, and retail. This is happening not only in restaurants, but also in elder care (like the uBot5 robot being developed at the University of Massachusetts), and in major retail establishments, such as in warehouse automation with Kiva Systems robots being employed by major companies like Gap, Staples, and Zappos.

Further, the expansion of robots into traditional human work is also happening in our military—think Unmanned Aerial Vehicles (UAVs or drones) like the Predators and Reapers, robotic pack animals that can carry hundreds of pounds of gear (like Big Dog), and various bomb disposal robots. This is just the beginning.

We are witnessing the transformation of our workforce from traditional blue- and now even white-collar jobs to those with an emphasis on knowledge management (think engineers and technology professionals working at companies like iRobot, Intel, and Apple). This has obvious implications for selection of education pursuits and availability of professional opportunities in the future for our children and grandchildren.

The robots are coming. The robots ARE coming!



October 31, 2009

Complexity, plain and simple

There is an old saying that rings true for basic leadership: “Keep It Simple, Stupid” (or KISS), yet for various reasons people and organizations opt or are compelled toward complexity.

And when things are complex, the organization is more prone to mistakes, people to misunderstandings, and leadership to mismanagement--all are points of failure in the dynamics of running an organization.

Mistakes can be costly from both a strategic and operational standpoint; misunderstandings between people are a cause of doubts, confusion, and hostility; and mismanagement leads to the breakdown of solid business process and eventually everything goes to pot.

An interesting article in the Wall Street Journal, 26 October 2009, defines four types of complexity:

Dysfunctional—This is the de facto complexity. It “makes work harder and doesn’t create value…research suggests that dysfunctional complexity creeps into a company over years through the perpetuation of practices that are no longer relevant, the duplication of activities due to mergers or reorganizations, and ambiguous or conflicting roles.”

Designed—This is an odd one…why would you design in complexity? “Executives may deliberately increase the complexity of certain activities or they may broaden the scope of their product offering, because they expect the benefits of those changes to outweigh the costs.” Example cited: “Dell believes that configuring each product to individual specs, rather than creating them all the same, makes customers more likely to buy from the company.”

Inherent—I guess this is the nothing I can do about it category—it just is hard! “The difficulty of getting the work done.” Plain and simple, some jobs are highly complex, Mr. Rocket Scientist.

Imposed—This is the why are they doing this to us category—external factors. This “is largely out of the control of the company. It is shaped by such entities as industry regulators, non-governmental organizations and trade unions.” I would assume competitors’ misdeeds would fall into this one as well.

Whatever the reason for the complexity, we know implicitly that simplification, within the realm of what’s possible, is the desired state. Even when the complexity is so to say “designed in” because of certain benefits like with the Dell example, we still desire to minimize that complexity, to the extent that we can still achieve the organization’s goals.

I remember years ago reading about the complexity of some companies’ financial reports (income statements, balance sheets, statements of cash flows…) and news commentators questioning the authenticity of their reporting. In other words, if you can’t understand it—how do we know if it is really truthful, accurate, or the full story? Well-publicized accounting scandals like Enron, HealthSouth, and many others since around the mid-1990’s come to mind.

Generally, we know that when something is veiled in a shroud of complexity, there is often mismanagement or misconduct at play.

That is not to say that everything in life is simple—it isn’t. Certainly advances in the sciences, technology, and so on are not simple. Knowledge is incremental and there is certainly lots of it out there to keep us all occupied in the pursuit of life-long learning. But regardless of how complex things get out there—whether dysfunctional, designed, inherent, or imposed—we should strive to make things easier, more straightforward, and as effortless and trouble-free as possible.

Will simplification get more difficult as a goal as our society continues to advance beyond the common man’s ability to understand it?

Yes, this is going to be a challenge. It used to be that graduating from high school was the farthest most people went with their education. Then college became the goal and norm for many. And now graduate and post-graduate studies are highly desirable and expected for many professional careers. It is getting difficult for people to keep up with the pace of change, breadth and depth of knowledge, and the advancement in technical fields.

One of the antidotes to the inherent complexity seems to be greater specialization such as in medicine, technology, engineering and so forth. As knowledge advances, we need to break it up into smaller chunks that people can actually digest and handle. The risk is that the pieces become so small eventually that we can lose sight of the bigger picture.

Complexity is here to stay in various forms, but we can and must tackle at the very least the dysfunctional complexity in our organizations. Some ways we can do this include:

- Breaking down the silos that impede our collaboration and information sharing
- Architecting simplification into our strategic, operational, and tactical plans
- Building once and reusing multiple times (i.e., through enterprise and common solutions)
- Filling gaps, reducing redundancies, and eliminating inefficiencies
- Reengineering our business processes as a regular part of “what we do,” constantly innovating better, faster, and cheaper ways of doing things
- Thinking and acting user-centric, and improving the way we treat our people
- And of course, being honest, transparent, and upright in our dealings and communications



February 7, 2009

The Perilous Pitfalls of Unconscious Decision Making

Every day as leaders, we are called upon to make decisions—some more important than others—but all having impacts on the organization and its stakeholders. Investments get made for better or worse, employees are redirected this way or that, customer requirements get met or are left unsatisfied, suppliers receive orders while others get cancelled, and stakeholders far and wide have their interests fulfilled or imperiled.

Leadership decisions have a domino effect. The decisions we make today will affect the course of events well into the future--especially when we consider a series of decisions over time.

Yet leadership decisions span the continuum from being made in a split second to those that are deliberated long and hard.

In my view, decision makers can be categorized into three types: “impulsive,” “withholding,” and “optimizers.”

  1. Impulsive leaders jump the gun and make a decision without sufficient information—sometimes possibly correctly, but often risking harm to the organization because they don’t think things through.
  2. Withholding leaders delay making decisions, searching for the optimal decision or Holy Grail. While this can be effective to avoid overly risky decisions, the problem is that they end up getting locked into “analysis paralysis”. They never get off the dime; decisions linger and die while the organization is relegated to a status quo—stagnating or even declining in times of changing market conditions.
  3. Optimizers rationally gather information, analyze it, vet it, and drive towards a good enough decision; they attempt to do due diligence and make responsible decisions in reasonable time frames that keep the organization on a forward momentum, meeting strategic goals and staying competitive. But even the most rational individuals can falter in the face of an array of data.

So it is clear that whichever mode decision makers assume, many decisions are still wrong. In my view, this has to do with the dynamics of the decision-making process. Even if they think they are being rational, in reality leaders too often make decisions for emotional or even unconscious reasons. Even optimizers can fall into this trap.

CIOs, who are responsible for substantial IT investment dollars, must understand why this happens and how they can use IT management best practices, structures, and tools to improve the decision-making process.

An insightful article that sheds light on unconscious decision-making, “Why Good Leaders Make Bad Decisions,” was published this month in Harvard Business Review.

The article states: “The reality is that important decisions made by intelligent, responsible people with the best information and intentions are sometimes hopelessly flawed.”

Here are two reasons cited for poor decision making:

  • Pattern Recognition—“faced with a new situation, we make assumptions based on prior experiences and judgments…but pattern recognition can mislead us. When we’re dealing with seemingly familiar situations, our brains can cause us to think we understand them when we don’t.”
  • Emotional Tagging—“emotional information attaches itself to the thoughts and experiences stored in our memories. This emotional information tells us whether to pay attention to something or not, and it tells us what sort of action we should be contemplating.” But what happens when emotion gets in the way and inhibits us from seeing things clearly?

The authors note some red flags in decision making: the presence of inappropriate self-interest, distorting attachments (bonds that can affect judgment—people, places, or things), and misleading memories.

So what can we do to make things better?

According to the authors of the article, we can “inject fresh experience or analysis…introduce further debate and challenge…impose stronger governance.”

In terms of governance, the CIO certainly comes with a formidable arsenal of IT tools to drive sound decision making. In particular, enterprise architecture provides for structured planning and governance; it is the CIO’s disciplined way to identify a coherent and agreed-to business and technical roadmap and a process to keep everyone on track. It is an important way to create order out of organizational chaos by using information to guide, shape, and influence sound decision making instead of relying on gut, intuition, politics, and subjective management whim—all of which are easily biased and flawed!

In addition to governance, there are technology tools for information sharing and collaboration, knowledge management, business intelligence, and yes, even artificial intelligence. These technologies help to ensure that we have a clear frame of reference for making decisions. We are no longer alone out there making decisions in an empty vacuum, but rather now we can reach out—far and wide—to other organizations, leaders, subject matter experts, and stakeholders to get and give information, to analyze, to collaborate and to perhaps take what would otherwise be sporadic and random data points and instead connect the dots leading to a logical decision.

To help safeguard the decision process (and no it will never be failsafe), I would suggest greater organizational investments in enterprise architecture planning and governance and in technology investments that make heavily biased decisions largely a thing of the past.



May 28, 2008

Blogs and Enterprise Architecture

Well this is interesting to write: a blog about blogging ;-)

Blogs are becoming a great new tool for enterprise communications and an alternative to clogging up already-full email boxes.

CIO Magazine, 15 January 2008, states that “enterprise users can get lost in storms of ‘reply-all’ e-mails while trying to manage projects. Blogs offer a better way.”

The group president of systems and technology for Bell Canada says that “email, used by itself just doesn’t cut it anymore for project management and interoffice communication.”

What’s the interest level and use of blogs?

Forrester Research reports that “54% of IT decision makers expressed interest in blogs. Of companies that had piloted or implemented blogs, nearly two-thirds (63%) said they used them for internal communications. Fifty percent said they used blogs for internal knowledge management—and these companies are leading the way of the future.”

A social software consultant says that “traditional enterprise solutions were designed to keep IT happy. They’re not usually designed with any thought to the user, like a blog is.” What a nice user-centric EA concept: design technical solutions that meet user requirements; let business drive technology, rather than doing technology for technology’s sake.

Why do people resist blogs?

“People are hung up on this concept of the blog as a diary and as an external marketing medium,” rather than understanding its criticality as a tool for communications and knowledge management.

How can you advance the use of blogs in your organization?

  1. Calming the troops—if people are nervous about blogs, consider avoiding the term blog and call it an ideaboard or some other non-technical and non-threatening name.
  2. Security and compliance—build the blog behind the corporate firewall and “establish rules of engagement,” so that proper social and legal etiquette is not violated and passive-aggressive behavior or “web rage” is mitigated.
  3. Start small—“blogs catch on virally, when you need to introduce the idea to the right test group, which will evangelize the idea to the rest of the enterprise.”
  4. Tagging—have people “tag their posts with keywords that will help later with search and discovery needs.”
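
As a simple illustration of why that tagging pays off later (my own sketch, not from the article, with hypothetical post titles and tags): even a bare-bones inverted index of tag-to-posts turns "search and discovery" into a quick lookup.

```python
from collections import defaultdict

# Hypothetical posts with author-supplied tags.
posts = {
    "Blogs and Enterprise Architecture": ["blogs", "knowledge management", "EA"],
    "Watson Can Swim": ["AI", "knowledge management"],
    "The Internet Lives": ["archiving", "knowledge management"],
}

# Invert tags -> posts so later discovery is a simple dictionary lookup.
index = defaultdict(list)
for title, tags in posts.items():
    for tag in tags:
        index[tag].append(title)

print(index["knowledge management"])
```
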
From an EA perspective, blogs are not a substitute for email; we need email (some of us desperately, like a morning cup of joe), but blogs are a great complementary tool for participatory communications that involve discussion type interaction by more than two users or for capturing enterprise knowledge and making it available for discovery. Also, blogs are a tool that gives a voice to people, who may otherwise remain part of the silent masses; people feel freer to express themselves in blogs, and through freedom of expression comes advancement of ideas, greater buy-in, and better enterprise decision-making.



April 11, 2008

Google and Enterprise Architecture

User-centric Enterprise architecture is about capturing, processing, organizing, and effectively presenting business and technology information to make it valuable and actionable by the organization for planning and governance.

Google is a company that epitomizes this mission.

After reading a recent article in Harvard Business Review, April 2008, I came to really appreciate their amazing business practices and found many connections with User-centric EA.

  1. Organizing information—Google’s mission [is] “to organize the world’s information and make it universally accessible and useful.” Similarly, in User-centric EA, we seek to organize the enterprise’s information and make it useful, usable, easy to understand, and readily accessible to aid decision making.
  2. Business and technology go hand-in-hand—“Technology and strategy, at Google, are inseparable and mutually permeable—making it hard to say whether technology is the DNA of its strategy or the other way around.” Similarly, EA is the synthesis of business and technology in the organization, where business drives technology, rather than doing technology for technology’s sake.
  3. Long-term approach—“CEO Eric Schmidt has estimated that it will take 300 years to achieve the mission of organizing the world’s information…it illustrates Google’s long-term approach to building value and capability.” Similarly, EA is a planning and governance function. EA plans span many years, usually at least 5 years, but depending on the mission, as long as 20 years for business/IT projects with long research and development cycles like in military and space domains.
  4. Architectural control—“Architectural control resides in Google’s ability to track the significance of any new service, its ability to choose to provide or not provide the service, and its role as a key contributor to the service’s functional value.” This is achieved by a network infrastructure consisting of approximately one million computers and a target audience of 132 million customers globally, on which they can test and launch applications. In EA, control is exercised through a sound governance process that ensures that sound IT investments are selected and unsound ones are not.
  5. Useful and usable—“The emphasis in this process is not on identifying the perfect offering, but rather on creating multiple potential useful offerings and letting the market decide which is best…among the company’s design principles are…usefulness first, usability later.” In User-centric EA, we also focus on the useful and usable products (although not in sequence). The point being that the EA must have clear value to the organization and its decision makers; we shun developing organizational shelfware or conducting ivory tower efforts.
  6. Data underscores decision making—“A key ingredient of innovation at the company is the extensive, aggressive use of data and testing to support ideas.” EA also relies on data (business and technical) for planning and governance. This is the nature of developing, maintaining, and leveraging use of EA through information products that establish the baseline, target, and transition plan of the organization. A viable plan is not one that is pulled from a hat, but one that is data-driven and vetted with executives, subject matter experts, and other stakeholders. Further, EA provides business intelligence for governance and decision making.
  7. Human capital—“If a company actually embraced—rather than merely paid lip service to—the idea that its people are its most important asset, it would treat employees much the way Google does.” This concept is embedded in User-centric EA, where the architecture is driven by the needs and requirements of the users. Further, Human Capital is a distinct perspective in User-centric EA, where people are viewed as the hub for all business and IT success.

In short, Google is a highly User-centric EA-driven organization and is a model for many of its core tenets.



April 6, 2008

Total Recall and Enterprise Architecture

Enterprise architecture plays an important role in corporate knowledge management. EA captures, analyzes, catalogues, and serves up information to end-users. In many cases, where more general KM endeavors fail, User-centric EA succeeds because it is a focused effort, with a clear value proposition for making information useful and usable.

Now, KM is being taken to whole new level. And rather than capturing information with clearly defined users and uses, the aim is total recall.

ComputerWorld, 6 April 2008, reports on an initiative for “storing every life memory in a surrogate [computer] brain.”

“Gordon Bell, a longtime veteran of the IT industry and now principal researcher at Microsoft Corp.’s research arm, is developing a way for everyone to remember those special moments. Actually, Bell himself wants to remember—well, everything...he wants the ability to pull up any picture, phone call, e-mail, or conversation any time he wants”

“The nine-year project, called MyLifeBits, has Bell supplementing his own memory by collecting as much information as he can about his life. He’s trying to store a lifetime on his laptop.”

“The effort is about not forgetting, not deleting, and holding onto all the bits of your life. In essence, it’s about immortality.”

What about privacy of your personal information?

It “isn’t about plastering a Myspace or Facebook page with information…[It’s] immensely personal...you will leave a personal legacy—a record of your life [on a personal computer].”

And Bell is not discerning—he stores painful memories as he does happy ones; this “would actually let people see who he was as a person.”

Certainly people have strived for eternal life from the time of the first man and woman—Adam and Eve eating of the forbidden apple in their quest for immortality—and ever since, with the search for the “fountain of youth” and other elixirs to prolong life. Similarly, people have sought to live eternally by leaving a legacy—whether great men or nefarious ones—from rulers and inventors to conquerors and hate mongers. The desire to influence and be remembered forever is as potent as the most parched thirst of man.

Bell has gone to extremes collecting and storing his memories—good and bad—from “every webpage he has ever visited and television shows he has watched…videos of lectures he’d given, CDs, correspondence, and an avalanche of photos…he has also recorded phone conversations, images and audio from conference sessions, and his e-mail and instant messages.”

In fact, Bell wears a SenseCam around his neck, a digital camera that automatically takes a photo every 30 seconds or whenever someone approaches.

“Bell figures that he could store everything about his life, from start to finish, using a terabyte of storage.”

“In 20 years, digitizing our memories will be standard procedure,” according to Bell. “It’s my supplemental memory and brain. It’s one of my most valuable possessions. I look at this thing and think, ‘My whole life is there.’”

So is that what a human life comes down to—a terabyte of stored information?

While maybe a noble effort at capturing memory, this seems to miss the mark on what a human being is really about. A person is much more than that which can be captured by a photo or sound bite of the external circumstances and events that take place around us. The essence of a person is about the deep challenges that go on inside us. The daily struggles and choices we make through our inner conscience—to choose right from wrong and to sacrifice for our creator, our loved ones, our nation, and our beliefs. Yes, you can see the resulting actions, but you don’t see the internal struggles of heart, mind, and soul.

Also, while capturing every 30 seconds of a person’s life may be sacred to the person whose life is being stored, who else really cares? The highlights of a person’s life are a lesson for others; the minutiae of their days are personal, for their own growth and reckoning.

From a User-centric EA perspective, I believe we should steer KM initiatives for both organizations and individuals away from wholesale data dumps and toward truly meaningful endeavors that have a clarity of purpose and respect the dignity of the human beings being recorded.



March 12, 2008

Knowledge Management and Enterprise Architecture

Enterprise architecture is a major contributor to knowledge management.

  • EA documents and communicates the baseline, target, and transition plan for the organization.
  • User-centric EA categorizes, analyzes, and visualizes the information to make it useful and usable.
  • Further, User-centric EA develops information products to enable better decision making, and it makes these readily accessible to end-users.

The Wall Street Journal, 10 March 2008, reports that “knowledge management [KM] can make a difference—but it needs to be more pragmatic.”

What is KM?

“A concerted effort to improve how knowledge is created, delivered, and used.”

“Over the past 15 years or so, many large organizations have embraced the idea that they could become more productive and competitive by better managing knowledge—the ideas and expertise that originate in the human mind.”

But many KM programs have failed miserably or just gone nowhere—why?

“Some firms stumbled by focusing their knowledge management efforts on technology at the expense of everything else, while others failed to tie knowledge programs to overall business goals or the organization’s other activities.”

Here’s how to do KM right:

  1. Creation—Organizations “define in advance the type of information they need and why they need it—say to improve customer service or to develop easier to use products. They solicit ideas, insights, and innovations from rank-and-file workers, customers, and business partners, rather than relying solely on R&D staff to come up with the ideas.” Web 2.0 technologies like blogs, wikis, and collaborative websites encourage broader participation.
  2. Dissemination—“the focus is on putting in one place all the content a specific group of workers need, regardless of its source. To that end, many organizations are using Web portals, or intranet sites as one stop information shops.”
  3. Application—"obtaining and sharing knowledge is beneficial only if employees use it to get better at what they do—that is, they learn from it." Creating communities of interest (COIs) helps foster "social learning that occurs when people with a common interest in some subject or problem are brought together to collaborate over an extended period to share ideas, solutions, and innovations."

EA contributes to all three areas:

  1. EA identifies information needs by the business and IT areas and captures, processes, and serves up this information to stakeholders.
  2. EA disseminates information products through the EA website, handbook, EA repository, and other media to make it accessible to end-users.
  3. EA that is User-centric focuses on providing information that is actionable—useful and usable by the business and IT executives and staffs. Only products with clear uses and users are developed, maintained, and shared. EA is focused on delivering value (shelfware is a dirty word in User-centric EA).

EA can be a shining example of KM, when it is User-centric!

