Showing posts with label Risk Management. Show all posts

July 15, 2012

Resilient To The Core

I circled back to an article that I saved away 10 years ago (5 years before I started blogging, and practically before blogging really even existed)!

It is from Harvard Business Review and it is called How Resilience Works (May 2002). 

It is an incredible article about what differentiates those who fall apart and seemingly give up under immense stress from those who use it as a stepping stone to future success and greatness. 

Resilience is "the skill and capacity to be robust under conditions of enormous stress and change."

Literally, resilience means "bouncing back"--as opposed, perhaps, to jumping through a plate-glass wall on the 50th story. 

Everyone has their tests in life--whether loss, illness, accident, abuse, incarceration, poverty, divorce, loneliness, and more. 

But resilience is how we meet head-on these challenges, and it "can be learned."

The article looks at individual and organizational "survivors" of horrible things like the Holocaust, being a prisoner of war (POW), and terrorist attacks such as 9/11, and basically attributes resilience to three main things:

1) Acceptance--rather than slip into denial, despair, or wishful thinking, resilience means we see the situation exactly for what it is and make the most of it--or as they say, "make lemonade out of lemons."

2) Meaning--utilizing a strong system of values, we find meaning and purpose even in the darkest of situations--even if it is simply to learn and grow from it!

3) Ingenuity--this is the capacity to invent, improvise, imagine possibilities, make do with what you have, and generally solve the problems at hand. 

Those who accept, find meaning, and improvise can succeed, where others fail. 

Now come forward a decade in time, and another article at CNN (9 July 2012) called Is Optimism Really Good For You? comes to similar conclusions.

The article describes how optimism works for an individual or an organization only when it is based on "action, common sense, resourcefulness, and considered risk-taking."

"It's the opposite of defeatism"--we recognize that there are things not in our control and that don't always turn out well, but we use that as an opportunity to come back and find a "different approach" and solve the problem. 

The article calls this "action-oriented optimists" and I like this concept--it is not blind hope nor is it giving-up, but rather it is a solid recognition that we can do and must do our part in this world. 

Fortune Magazine summed this up well in an article a few months back: there are three kinds of people--"those who make it happen, those who watch it happen, and those who wonder how the heck it happened."

When things happen in your life--to you--which of these types of people will you be? 

(Source Photo: Andy Blumenthal)


July 12, 2012

100% Burglar Proof--Tell Me Another One

So I saw this advertisement for a "100% burglar proof" system and I was just bewildered.

Does anyone really think we can be 100% sure of anything--let alone security?

Every day, thieves rob the safest banks, cyber criminals hack the most secure systems, and crooks break into the most secure sites.

Everything we do comes down to risk management--assessing and classifying risk, selecting controls to mitigate risk, and monitoring those for effectiveness and necessary modifications. 

For children, maybe things are basic black and white--it's simpler that way: "good guys," "bad guys," and so on. But as adults, we know there are at least "50 shades of grey," and that means there are no certainties in life--whether in security, sure financial bets, or perfect opportunities--everything is a gamble in some respect. 

I remember someone once joked about even marriage being somewhat chancy, since "you never really know the person until you wake up with them in the morning every day."

With 20-20 hindsight, all the pundits seem brilliant, but only the prophets can predict the future with accuracy. 

As for any product or vendor that markets itself as having a 100% success rate, you had better get yourself a money-back guarantee, because you will definitely need it! ;-)

(Source Photo: Andy Blumenthal)


June 16, 2012

Securing Transport To The Cloud

A new article by Andy Blumenthal on cyber security and cloud computing appears in Public CIO Magazine (June 2012): Securing Cloud Data Means Recognizing Vulnerabilities.

"It’s the principle of inertia: An object in motion stays in motion unless disturbed. Just like a car on a highway, everything zips along just fine until there’s a crash. This is similar with information on the superhighway."

Let's all do our part to secure cyberspace.

Hope you enjoy!

(Source Photo: here with attribution to Kenny Holston 21)


June 3, 2012

Raising The Bar On Cybersecurity



Good video by The Washington Post (2 June 2012) on the importance and challenges of cybersecurity. 

There are 12 billion devices on the Internet today and this is projected to soar to 50 billion in the next decade.

Cybersecurity is paramount to protecting the vast amounts of critical infrastructure connected to the Internet.

There is a lot riding over the Internet--power, transportation, finance, commerce, defense, and more--and the vulnerabilities inherent in this are huge!

Some notable quotes from the video:

- "Spying, intrusions, and attacks on government and corporate networks occur every hour of every day."

- "Some sort of cyberwar is generally considered an inevitability."

- "Cyberwar, although a scary term--I think it is as scary as it sounds."

- "Right now the bar is so low, it doesn't take a government, it doesn't take organized crime to exploit this stuff--that's what's dangerous!"

We all have to do our part to raise the bar on cybersecurity--and let's do it--now, now, now.


May 29, 2012

A Cyber Security House Of Cards

Yesterday there were reports of a new "massive cyber attack" called the Flame.

A U.N. Spokesperson called it "the most powerful [cyber] espionage tool ever."

The Flame ups the cyber warfare ante and is "one of the most complex threats ever discovered"--20 times larger than Stuxnet--and essentially an "industrial vacuum cleaner for sensitive information."

Unlike prior cyber attacks that targeted computers to delete data ("Wiper"), steal data ("Duqu"), or to disrupt infrastructure ("Stuxnet"), this malware collects sensitive information. 

The malware can record audio, take screenshots of items of interest, log keyboard strokes, sniff the network, and even add-on additional malware modules as needed. 

Kaspersky Lab discovered the Flame virus; more than 600 targets have been infected in more than 7 countries over the last 2 years, with the greatest concentration in Iran. 

This is reminiscent of Operation Shady RAT, a 5-year cyber espionage attack discovered by McAfee in 2011, involving malware that affected more than 72 institutions in 14 countries. 

Separately, an attack on the U.S. Federal government's retirement investments--the Thrift Savings Plan--resulted in "unauthorized access" to the privacy and account information of 123,000 participants, and was reported just last week after being discovered as far back as July 2011.

Regardless of where these particular cyber attacks originate, given their scale and potential impact, it is time to take cyber security seriously and adopt a proactive rather than a reactive posture.

One can only wonder how many other cyber attacks are occurring that we don't yet know about, and perhaps never will.

We can't afford to fumble the countermeasures to the extraordinary risk we face in the playing fields of cyber warfare. 


We have to significantly strengthen our cyber defenses (and offenses)--or else risk having this "cyber house of cards" come crashing down. 

It's time for a massive infusion of funds, talent, tools, and leadership to turn this around and secure our nation's cyber infrastructure.   

(Source Photo: here with attribution to Dave Rogers)


May 19, 2012

Those In The Know, Sending Some Pretty Clear Warnings

There have been a number of leaders who have stepped up to tell people the real risks we are facing as a nation. 

They are not playing politics--they have left the arena. 

And as we know, it is much easier to be rosy and optimistic--let's face it, this is what people want to hear. 

But these leaders--national heroes--sacrifice themselves to provide us an unpopular message, at their own reputational risk. 

That message is that poor leadership and decision-making in the past is threatening our present and future. 

Earlier this week (15 May 2012), I blogged about a documentary called I.O.U.S.A. featuring David Walker, the Comptroller General of the United States for 10 years!

Walker was the head of the Government Accountability Office (GAO)--the investigative arm of Congress--has testified before Congress, and has toured the country warning of the dire fiscal situation confronting us from our proclivity to spend future generations' money today--the spiraling national deficit.

Today, I read in Fortune (21 May 2012) an interview with another national hero, former Admiral Mike Mullen, who was chairman of the Joint Chiefs of Staff (2007-2011).

Mullen warns bluntly of a number of "existential threats" to the United States--nukes (which he feels are more or less "under control"), cyber security, and the state of our national debt. 

Similarly, General Keith Alexander, the Director of the National Security Agency (NSA) and the head of the Pentagon's Cyber Command has warned that DoD networks are not currently defensible and that attackers could disable our networks and critical infrastructure underpinning our national security and economic stability.

To me, these are well-respected individuals who are sending some pretty clear warning signals about cyber security and our national deficit, not to cause panic, but to inspire substantial change in our national character and strategic priorities.

In I.O.U.S.A., after one talk by Walker on his national tour, the video shows that the media does not even cover the event.

We are comfortable for now and the messages coming down risk shaking us from that comfort zone--are we ready to hear what they are saying?

(Source Photo: here with attribution to Vagawi)



May 5, 2012

Understanding Risk Management

Information Security, like all security, needs to be managed on a risk management basis.  

This is a fundamental principle previously advocated for the Department of Homeland Security by former Secretary Michael Chertoff.  

The basic premise is that we have limited resources to cover ever-changing and expanding risks, and that, therefore, we must direct our security resources to the greatest risks first.

Daniel Ryan and Julie Ryan (1995) came up with a simple formula for determining risks, as follows:

Risk = [(Threats x Vulnerabilities) / Countermeasures]  x  Impact

Where:

- Threats = those who wish to do you harm.

- Vulnerabilities = inherent weaknesses or design flaws.

- Countermeasures = the things you do to protect against the dangers posed.

[Together, threats and vulnerabilities, offset by any countermeasures, give the probability or likelihood of a potential (negative) event occurring.]

- Impacts = the damage or potential loss that would be done.
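As a toy illustration of how the Ryans' formula can be used to prioritize, here is a short sketch. All the 1-10 ratings and system names below are invented for the example, not taken from their paper:

```python
# Hypothetical 1-10 ratings for three example systems; highest risk gets fixed first.
def risk_score(threats, vulnerabilities, countermeasures, impact):
    """Risk = [(Threats x Vulnerabilities) / Countermeasures] x Impact"""
    return (threats * vulnerabilities) / countermeasures * impact

systems = {
    "public web server": risk_score(8, 6, 4, 7),   # -> 84.0
    "internal wiki": risk_score(3, 5, 5, 2),       # -> 6.0
    "payroll database": risk_score(5, 4, 2, 9),    # -> 90.0
}

# Highest-risk first: the prioritized worklist described below.
for name, score in sorted(systems.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.1f}")
```

Note how stronger countermeasures shrink the score while higher impact multiplies it, which is why the payroll database outranks the more-threatened web server in this made-up example.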

Of course, in a perfect world, we would like to reduce risk to zero and be completely secure, but in the real world, the cost of achieving total risk avoidance is prohibitive. 

For example, with information systems, the only way to hypothetically eliminate all risk is by disconnecting (and turning off) all your computing resources, thereby isolating yourself from any and all threats. But as we know, this is counterproductive, since there is a positive correlation between connectivity and productivity. When connectivity goes down, so does productivity.

Thus, unable to completely eliminate risk, we are left with managing it--particularly for critical infrastructure protection (CIP)--by prioritizing the highest security risks and securing those first, working down the list until we exhaust the resources available for countermeasures.

In a sense, being unable to "get rid of risk" or fully secure ourselves from anything bad happening to us is a philosophically imperfect answer and leaves me feeling unsatisfied--in other words, what good is security if we can't ever really have it anyway?

I guess the ultimate risk we all face is the risk of our own mortality. In response all we can do is accept our limitations and take action on the rest.

(Source Photo: here with attribution to martinluff)


May 4, 2012

Leadership Cloud or Flood Coming?

I came across two very interesting and concerning studies on cloud computing--one from last year and the other from last month.

Here is a white paper by London-based Context Information Security (March 2011)

Context rented space from various cloud providers and tested their security. 

Overall, it found that the cloud providers failed in 41% of the tests and that tests were prohibited in another 34% of the cases--leaving a pass rate of just 25%!

The major security issue was a failure to securely separate client nodes, resulting in the ability to "view data held on other service users' disk and to extract data including usernames and passwords, client data, and database contents."

The study found that "at least some of the unease felt about securing the Cloud is justified."

Context recommends that clients moving to the cloud should:

1) Encrypt--"Use encryption on hard disks and network traffic between nodes."

2) Firewall--"All networks that a node has access to...should be treated as hostile and should be protected by host-based firewalls."

3) Harden--"Default nodes provisioned by the Cloud providers should not be trusted as being secure; clients should security harden these nodes themselves."
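As a rough sketch of what Context's three recommendations might look like on a Linux cloud node--device names, subnets, ports, and service names below are examples only, not prescriptions, and would need adapting to your provider and distribution:

```shell
# 1) Encrypt: full-disk encryption on a data volume with LUKS (device name is an example)
cryptsetup luksFormat /dev/xvdb
cryptsetup open /dev/xvdb securedata
mkfs.ext4 /dev/mapper/securedata

# 2) Firewall: treat every network the node touches as hostile -- default-deny host firewall
ufw default deny incoming
ufw default deny outgoing
ufw allow out 443/tcp                                    # permit only explicitly needed traffic
ufw allow in from 10.0.0.0/24 to any port 22 proto tcp   # example admin subnet for SSH
ufw enable

# 3) Harden: don't trust the provider's default image
passwd -l root                                # lock direct root login
systemctl disable --now avahi-daemon cups     # disable unneeded services (examples)
```

This is a checklist-style fragment, not a complete hardening guide; encrypting traffic between nodes (e.g., with TLS or a VPN) would also be needed to fully cover the first recommendation.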

I found another interesting post on "dirty disks" by Context (24 April 2012), which describes another cloud vulnerability: remnant client data left behind on disk, which then becomes vulnerable to harvesting and exploitation by others.

In response to ongoing fears about the cloud, some are choosing to have separate air-gapped machines--even caged off--at their cloud provider's facilities, in order to physically separate their infrastructure and data. But if that is what it takes to secure the data today, is this really even cloud--or should we more accurately call it a faux cloud? 

While Cloud Computing may hold tremendous cost-saving potential and efficiencies, we need to tread carefully, as the skies are not yet all clear from a security perspective with the cloud. 

Clouds can lead the way--as for the Israelites traveling with G-d through the desert for 40 years--or they can bring terrible destruction, as when it rained for 40 days and nights in the Great Flood in the time of Noah. 

The question for us is: are we traveling the cloud computing road to the promised land, or does great destruction await in a still immature and insecure cloud computing playing field? 

(Source Photo: here with attribution to freefotouk)



April 21, 2012

Don't Throw Out The Pre-Crime With the Bathwater

The Atlantic (17 April 2012) has an article this week called "Homeland Security's 'Pre-Crime' Screening Will Never Work." 

The Atlantic mocks the Department of Homeland Security's (DHS) Future Attribute Screening Technology (FAST) for attempting to screen terrorists based on physiological and behavioral cues to analyze and detect people demonstrating abnormal or dangerous indicators.

The article calls this "pre-crime detection" similar to that in Tom Cruise's movie Minority Report, and labels it a  "super creepy invasion of privacy" and of "little to no marginal security" benefit.

They base this on a 70% success rate in "first round of field tests" and the "false-positive paradox," whereby there would be a large number of innocent false positives and that distinguishing these would be a "non-trivial and invasive task." 
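The false-positive paradox the article invokes is really the base-rate fallacy, and it can be made concrete with Bayes' theorem. The numbers below are assumptions for illustration (1 terrorist per million travelers, a screen that is 70% accurate both ways, loosely echoing the field-test figure), not DHS data:

```python
# Illustrative only: all rates below are assumptions, not DHS figures.
base_rate = 1 / 1_000_000   # P(terrorist) among screened travelers
sensitivity = 0.70          # P(flagged | terrorist)
false_alarm = 0.30          # P(flagged | innocent)

# Bayes' theorem: P(terrorist | flagged) = P(flagged | terrorist) P(terrorist) / P(flagged)
p_flagged = sensitivity * base_rate + false_alarm * (1 - base_rate)
p_terrorist_given_flag = (sensitivity * base_rate) / p_flagged

print(f"Travelers flagged per million: {p_flagged * 1_000_000:,.0f}")
print(f"P(flagged person is a terrorist): {p_terrorist_given_flag:.2e}")
```

With these made-up rates, roughly 300,000 of every million travelers get flagged, and essentially all of them are innocent--which is exactly why the screen can only be one layer, with validation tests behind it, rather than a standalone verdict.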

However, I do not agree that they are correct for a number of reasons: 

1) Accuracy Rates Will Improve--the current accuracy rate is no predictor of future accuracy rates. With additional research and development and testing, there is no reason to believe that over time we cannot significantly improve the accuracy rates to screen for such common things as "elevated heart rate, eye movement, body temperature, facial patterns, and body language" to help us weed out friend from foe. 

2) False-Positives Can Be Managed--Just as in disease detection and medical diagnosis, there can be false-positives, and we manage these by validating the results through repeating the tests or performing additional corroborating tests; so too with pre-crime screening, false-positives can be managed with validation testing, such as through interviews, matching against terrorist watch lists, biometric screening tools, scans and searches, and more. In other words, pre-crime detection through observable cues are only a single layer of a comprehensive, multilayer screening strategy.

Contrary to The Atlantic's claim that pre-crime screening is "doomed from the word go by a preponderance of false-positives," terrorist screening is actually a vital and necessary part of a defense-in-depth strategy and is based on risk management principles. To secure the homeland with finite resources, we must continuously narrow in on the terrorist target by screening and refining results through validation testing, so that we can safeguard the nation while protecting the privacy and civil liberties of those who are not a threat to others. 

Additionally, The Atlantic questions whether subjects used in experimental screening can accurately mimic the cues that real terrorists would exhibit in the field. However, with the wealth of surveillance we have gathered of terrorists planning or conducting attacks--especially over the last decade in the wars in Iraq and Afghanistan--as well as reams of scientific study of the mind and body, we should be able to distinguish someone about to commit mass murder from someone simply visiting their grandmother in Miami. 

The Atlantic's position is that terrorist screening's "(possible) gain is not worth the cost." However, this is ridiculous, since the only alternative to pre-crime detection is post-crime analysis--where, rather than trying to prevent terrorist attacks, we let the terrorists commit their deadly deeds and clean up the mess afterwards. 

In an age when terrorists will stop at nothing to hit their targets and hit them hard, and when shoe and underwear bombs are serious issues and not late-night comedy, we must invest in technology tools like pre-crime screening to help us identify those who would do us harm, and continuously work to filter them out before they attack. 

(Source Photo: here with attribution to Dan and Eric Sweeney)


December 2, 2011

Who Will Protect Those Who Protect Us?

This is a video that the Federal Law Enforcement Officers Association (FLEOA) sent to Congress to appeal to them not to cut funding to all the activities that our law enforcement officers do for us.

While the functions of government can always be more efficient--and we should constantly work to achieve these--federal law enforcement is incredibly important.

From the FBI to the Secret Service and from Border Patrol to DEA, we need to support all our federal law enforcement efforts.

These agents and officers risk their lives every day for all of us, and it's time that we stand by them to protect their mission and jobs.


November 3, 2011

Cloud, Not A Slam Dunk


Interesting article in Nextgov about deep skepticism of cloud computing among corporate IT pros.

The vast majority of IT practitioners questioned did not "believe so-called infrastructure-as-a-service providers protect e-mail, documents and other business data.”

So while many business people think that Cloud Computing is more or less safe, the IT community is not so sure.

Of 1,018 professionals surveyed (about 60% of them from IT), only one-third of the IT professionals thought the cloud was secure, versus 50% of the business compliance supervisors.

Cloud is not a slam dunk and we need to evaluate every implementation very carefully.

(Source Photo: here)


September 25, 2011

They're Not Playing Ketchup

I wouldn't necessarily think of Heinz as a poster child for a company that is strategic and growing, and was therefore, somewhat surprised to read an impressive article in Harvard Business Review (October 2011) called "The CEO of Heinz on Powering Growth in Emerging Markets."

Heinz, headquartered in Pittsburgh, PA, is ranked 232 on the Fortune 500, with $10.7B in sales, $864M in profits, and 35,000 employees. They have increased their revenue from emerging markets from 5% a few years ago to more than 20% today.

Bill Johnson, the CEO of Heinz, explains his 4 As for success--which I really like:


1) Applicability--Your products need to suit local culture. For example, while Ketchup sells in China, soy sauce is the primary condiment there, so in 2010, Heinz acquired Foodstar in China, a leading brand in soy sauce.

2) Availability--You need to sell in channels that are relevant to the local populace. For example, while in the U.S., we food shop predominantly in grocery stores, in other places like Indonesia, China, India, and Russia, much food shopping is done in open-air markets or corner groceries.

3) Affordability--You have to price yourself in the market. For example, in Indonesia, Heinz sells more affordable small packets of soy sauce for 3 cents apiece rather than large bottles, which would be unaffordable for many--and people don't necessarily have refrigerators to store them anyway.

4) Affinity--You want local customers and employees to feel close with your brand. For example, Heinz relies mainly on local managers and mores for doing business, rather than trying to impose a western way on them.

Heinz has a solid strategy for doing business overseas, which includes "buy and build"--so that they acquire "solid brands with good local management that will get us into the right channels...then we can start selling other brands."

Heinz manages by being risk aware and not risk averse, diversifying across multiple markets, focusing on the long-term, and working hard to build relationships with the local officials and managers where they want to build businesses.

"Heinz is a 142-year-old company that's had only five chairmen"--that's fewer than the number of CEOs that H-P has had in the last 6 years alone.

I can't help but wonder about the impact of Heinz's stability and laser focus on its ability to develop a solid strategy--something a mega-technology company like H-P has been struggling with for some time now.

If H-P were to adopt a Heinz-type strategy, then perhaps they would come off as a little more strategic and less flighty in their decisions to acquire and spin off business after business (e.g., PCs, TouchPads, WebOS), and to change leadership as often as they do with seemingly little due diligence.

What is fascinating about H-P today is how far they have strayed from the roots of their founders, Bill and Dave, who had built an incredibly strong organizational culture that bred success for many years.

So at least in this case, is it consumer products or technology playing catch-up (Ketchup) now?

P.S. I sure hope H-P can get their tomatoes together. ;-)

(Source Photos: Heinz here and H-P here)


September 9, 2011

Visualizing IT Security


I thought this infographic on the "8 Levels of IT Security" was worth sharing.

While I don't see each of these as completely distinct, I believe they are all important aspects of enterprise security, as follows:

1) Risk Management - With limited resources, we've got to identify and manage the high probability, high impact risks first and foremost.

2) Security Policy - The security policy sets forth the guidelines for what IT security is and what is considered acceptable and unacceptable user behavior.

3) Logging, Monitoring, and Reporting - These are the eyes, ears, and mouth of the organization in terms of watching over its security posture.

4) Virtual Perimeter - This provides for the remote authentication of users into the organization's IT domain.

5) Environment and Physical - This addresses the physical protection of IT assets.

6) Platform Security - This provides for the hardening of specific IT systems around aspects of its hardware, software, and connectivity.

7) Information Assurance - This ensures adequate countermeasures are in place to protect the confidentiality, integrity, availability, and privacy of the information.

8) Identification and Access Management - This prevents unauthorized users from getting to information they are not supposed to access.

Overall, this IT security infographic is interesting to me, because it's an attempt to capture the various dimensions of the important topic of cyber security in a straightforward, visual presentation.

However, I think an even better presentation of IT security would be using the "defense-in-depth" visualization with concentric circles or something similar showing how IT security products, tools, policies, and procedures are used to secure the enterprise at every level of its vulnerability.
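One way to see why the defense-in-depth picture matters: if the layers fail roughly independently, an attacker must penetrate every one of them, so the chance of a full breach is the product of the per-layer breach probabilities. The numbers below are made up purely to illustrate the multiplication, and real layers are rarely fully independent:

```python
# Illustrative sketch of defense-in-depth: assumed, independent per-layer breach odds.
import math

layer_breach_probability = {
    "perimeter firewall": 0.20,
    "platform hardening": 0.30,
    "access management": 0.25,
    "data encryption": 0.10,
}

# Probability an attack gets through EVERY layer (independence assumed).
p_full_breach = math.prod(layer_breach_probability.values())
print(f"P(full breach through all layers) = {p_full_breach:.4f}")
```

Even with individually weak layers (10-30% breach odds each), the stacked defenses drive the full-breach probability down to a fraction of a percent in this toy model--the intuition behind the concentric-circles view.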

IT security is not just a checklist of dos and don'ts; rather, it is based on a truly well-designed and comprehensive security architecture and its meticulous implementation for protecting our information assets.

Does anyone else have any other really good visualizations on cyber security?

(Source Photo: here)


September 4, 2011

9/11 - A Lesson In Risky Business

Corresponding to the 10th anniversary of 9/11, Bloomberg BusinessWeek (5-11 Sept 2011) has a great article on risk management called The G-d Clause.

When insurers take out insurance--this is called reinsurance, and reinsurers are "on the hook for everything, for all the risks that stretch the limits of the imagination"--that's referred to as The G-d Clause--whatever the almighty can come up with, the "reinsurers are ultimately responsible for" paying for it.

And obviously, when insurers and reinsurers don't well imagine, forecast, and price for risky events--they end up losing money and potentially going out of business!

Well, when it came to 9/11, insurers lost fairly big financially--to the tune of $23 billion (it is, in fact, the 4th costliest disaster since 1970, after Japan's tsunami, earthquake, and Fukushima nuclear disaster ($235B), and hurricanes Katrina ($72B) and Andrew ($25B) in the U.S.).

Even Lloyd's--which "invented the modern profession of insurance [and] publishes a yearly list of what it calls 'Realistic Disaster Scenarios'"--had imagined 2 airliners colliding over a city, yet even they failed to anticipate the events of September 11, 2001.

According to the article, even insurers that make their living forecasting risks, "can get complacent."

And the psychology of the here and now, where "people measure against the perceived reality around them and not against the possible futures" is the danger we face in terms of being unprepared for the catastrophic events that await, but are not foretold.

In a sense, this is like enterprise architecture on steroids, where we know our "as-is" situation today and we try to project our "to-be" scenario of the future; if our projection is too far off the mark, then we risk failing at our mission and/or losing money, market share, or competitive advantage.

The ability to envision future scenarios, balancing reality and imagination, is critical to predict, preempt, prepare for, and manage the risks we face.

Post 9/11, despite the stand-up of a sizable and impressive Department of Homeland Security, I believe that our Achilles' heel is that we continue to not be imaginative enough--and that is our greatest risk.

For example, while on one hand, we know of the dangers of weapons of mass destruction--including nuclear, chemical, biological, and radiological devices--as well as new cyber weapons that can threaten us; on the other hand, we have trouble imagining and therefore genuinely preparing for their actual use.

Perhaps, it is too frightening emotionally or we have trouble coping practically--but in either case, the real question is are we continuing to proceed without adequate risk-loss mitigation strategies for the future scenarios we are up against?

Frankly, living in the suburbs of our nation's capital, I am fearful of what may await us, when something as basic as our power regularly goes out whenever we get just a moderate rainstorm in this area. How would we do in a real catastrophe?

In my mind, I continue to wonder what will happen to us, if we proceed without taking to heart the serious threats against us--then the tragic events of 9/11 will have unfortunately been lost on another generation.

Like with the reinsurers, if we do not open our minds to perceive the catastrophic possibilities and probabilities, then the risky business that we are in, may continue to surprise and cost us.

(All opinions my own)

(Source Photo: here)


August 28, 2011

Can't Live With Them, Can't Live Without Them

I remember years ago, my father used to joke about my mother (who occasionally got on his nerves :-): "you can't live with them, and you can't live without them."

Following the frequently dismal state of IT project performance generally, I'm beginning to think that way about technology projects.

On one hand, technology represents innovation, automation, and the latest advances in engineering and science--and we cannot live without it--it is our future!

On the other hand, the continuing poor track record of IT project delivery is such that we cannot live with it--they are often highly risky and costly:
  • In 2009, the Standish Group reported that 68% of IT projects were failing or seriously challenged--over schedule, behind budget, and not meeting customer requirements.

  • Most recently, according to Harvard Business Review (September 2011), IT projects are again highlighted as "riskier than you think." Despite efforts to rein in IT projects, "New research shows surprisingly high numbers of out-of-control tech projects--ones that can sink entire companies and careers."

  • Numerous high profile companies with such deeply problematic IT projects are mentioned, including: Levi Strauss, Hershey's, Kmart, Airbus, and more.

  • The study found that "Fully one in six of the projects we studied [1,471 were examined] was a black swan, with a cost overrun of 200% on average, and a schedule overrun of almost 70%."

  • In other words there is a "fat tail" to IT project failure. "It's not that they're particularly prone to high cost overruns on average...[rather] an unusually large proportion of them incur massive overages--that is, there are a disproportionate number of black swans."

  • Unfortunately, as the authors state: "these numbers seem comfortably improbable, but...they apply with uncomfortable frequency."
In recent years, the discipline of project management and the technique of earned value management have been in vogue to better manage and control runaway IT projects.

At the federal government level, implementation of such tools as the Federal IT Dashboard for transparency and TechStats for ensuring accountability have course-corrected or terminated more than $3 billion in underperforming IT projects.

Technology projects, as R&D endeavors, come with inherent risk. Yet even if the technical aspect is successful, the human factors are likely to get in the way. In fact, they may be the ultimate IT "project killers"--organizational politics, technology adoption, change management, knowledge management, etc.
Going forward, I see the solution as two-pronged:
  • On the one hand we must focus on enhancing pure project management, performance measurement, architecture and governance, and so on.

  • At the same time, we also need to add more emphasis on people (our human capital)--ensuring that everyone is fully trained, motivated, empowered and has ownership. This is challenging considering that our people are very much at a breaking point with all the work-related stress they are facing.
These days organizations face daunting challenges: the rapid pace of change, cutthroat global competition at our doorsteps, a failing education system, high unemployment, and mounting deficits. All can be helped through technology, but for that to happen we must have the project management infrastructure and the human factors in place to make it work.

If our technology is to bring us the next great breakthrough, we must help our people to deliver it collaboratively.

The pressure is on--we can't live with it and we cannot live without it. IT project failures are a people problem as much as a technology problem. However, once we confront them as such, I believe we can expect the metrics on IT projects to shift significantly from failure to success.

(Source Photo: here)



May 6, 2011

Avoiding The Ultimate In Surprise


Everyone remember the I Love Lucy show? Well, that show really epitomized what it meant to surprise and be surprised, with all the antics that the main character, Lucy, got into--show after show.

One thing that's very clear is that no one really likes surprises (except maybe for comic relief, which is one reason I believe the show stayed popular season after season).

So what's the problem with surprises? They are not inherently bad--there can be good surprises and bad ones.

The issue is really that people want to be prepared for whatever is coming their way.

Even surprise parties or gifts somehow seem sweeter when the recipient isn't completely "taken by surprise."

One of my bosses used to often repeat to the team, "I don't like surprises!"

Hence, the importance of what we all got in the habit of saying--communicate, communicate, communicate--early and often.

With the tragic tornadoes that struck across the South last week, killing some 329 people, we are reminded how important early warning of life's surprises can be.

The Wall Street Journal reports today that new technologies are being developed for early warning of these tornadoes, such as:

- Visual cues--antennas that can track cloud-to-cloud lightning, which is often invisible from the ground but "drops sharply in a storm just before a tornado develops," and can therefore provide early detection.

- Sound waves--Using "infrasonic microphones" we can pick up storm sounds from as far as 500 miles away at frequencies too low to be detected by the human ear and can filter out the noise to track the storm's severity and speed, and therefore hear in advance if it is turning dangerous.

Early warning saves lives...even a few extra minutes can provide the much needed time for a person to get to a shelter.

After the 2004 Indian Ocean earthquake and tsunami, which killed more than 230,000 people, an early warning system was put in place there, and again with the recent Japanese earthquake and tsunami of 2011, we see the ongoing need for these efforts to advance globally.

These efforts for early detection and alerts have always been around.

Already thousands of years ago, settlers built lookout towers and fire signals to get and give early notice of an advancing army, marauders, dangerous beasts, or other pending dangers.

Nowadays, we have satellites and drones providing "eyes in the sky" and other technologies (like the proverbial trip wires and so on) are being developed, refined, and deployed to protect us.

Advance warning and preparation is important for risk management and life preservation and leveraging technology to the max for these purposes is an investment that is timeless and priceless.

The challenge is in identifying the greatest risks (i.e., those with the highest probability of happening and the biggest impact if they do) so that we can make our investments in the technologies to deal with them wisely.
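
That risk calculus--probability times impact--can be sketched in a few lines of Python (the risks and numbers below are made up purely for illustration):

```python
# Toy risk-scoring model: rank risks by expected impact so that
# early-warning investments go to the highest-scoring risks first.
# Probabilities and impact ratings below are hypothetical.

risks = [
    {"name": "tornado",      "probability": 0.02, "impact": 9},
    {"name": "flood",        "probability": 0.10, "impact": 6},
    {"name": "power outage", "probability": 0.30, "impact": 3},
]

for r in risks:
    r["score"] = r["probability"] * r["impact"]  # expected impact

ranked = sorted(risks, key=lambda r: r["score"], reverse=True)
print([r["name"] for r in ranked])  # ['power outage', 'flood', 'tornado']
```

Note how the ranking can be counterintuitive: the dramatic, high-impact event does not automatically top the list once its low probability is factored in.
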


April 2, 2011

The Cost of Underestimating Technology


While research is important and I respect the people who devote themselves to doing this, sometimes they risk being disconnected from reality and the consequences associated with it.

From the Wall Street Journal, 2 April 2011--two economists calculated that "$1,700 is the benefit the average American derives from personal computers each year."

They call this the "benefit we get from computers above and beyond what we pay for them."

To me, this figure seems inconsistent with common sense and the realities on the ground.

In an information age, where we are connected virtually 24 x 7 and can download hundreds of thousands of apps for free, endlessly surf the internet, shop and bank online, get much of our entertainment, news, and gaming on the web, and communicate around the globe by voice, video, and text for the cost of a monthly high-speed connection, I say hogwash.

Moreover, we need to factor in that many of us are now information workers (about 20% of the workforce) or depend on technology to perform our jobs every day and earn our living.

Just yesterday in fact, the Wall Street Journal reported that more people work for the government (22.5 million--forget the private sector information workers for the moment) than in construction, farming, fishing, forestry, manufacturing, mining, and utilities combined!

Additionally, at work, we are using computers more and more not only for transaction processing, but also for content management, business intelligence, collaboration, and mobility (and robotics and artificial intelligence are coming up fast).

Finally, technology enables breakthroughs--in medicine, energy, environment, education, materials sciences, and more--the impact of technology to us is not just now, but in the potential it brings us for further innovations down the road.

So is the benefit that you get from computers really less than $5 a day?

I know for me that's the understatement of a lifetime.

Apparently, technology continues to be misunderstood and undervalued by some, and therefore risks being underinvested in, which harms our nation's competitiveness and our collective future.

As much respect as I have for economics, it doesn't take an economist to think with common business sense.


March 20, 2011

Fixing The Information Flow

So check this out--H2Glow has an LED faucet light that is temperature-sensitive and turns blue for cold water and red for hot.

When I saw this, I thought this would be a great metaphor for managing the information flow from our organizations--where we could quickly and simply see whether the information flowing was sharable and for public consumption ("blue") or whether something was private and proprietary ("red").

The Economist, 24 February 2011, in an article called "The Leaky Corporation" writes: "Digital information is easy not only to store, but also to leak. Companies must decide what they really need to keep secret, and how best to do so."

Like a faucet that gushes water, our organizations are releasing information--some with intent (where we are in control) and much without (due to spillage and pilferage).

In the age of WikiLeaks, computer hackers, criminals, terrorists, and hostile nation states, as well as the insider threat, information is leaking out uncontrollably from our organizations, and this puts our vital competitive information, national secrets, and personal privacy information at risk (i.e., health, financial, identity, and so on).

Of course, we want the proverbial blue light to go on and information to be shared appropriately for collaboration and transparency, but at the same time, we need to know that the light will turn red and the information will stop, when information is justifiably private and needs to be kept that way.

Being an open and progressive society doesn't mean that there is only cold water and one color--blue. Rather, it means that we can discern the difference between cold and hot, blue and red, and turn the faucet on and off accordingly.

Information is proliferating rapidly, and according to IDC, a market research firm, the "digital universe" is expected to "increase to 35 zettabytes by 2020"--a zettabyte is 1 trillion gigabytes, or the equivalent of about 250 billion DVDs.
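
A quick back-of-the-envelope check on those figures (assuming a 4.7 GB single-layer DVD, which lands in the same ballpark as the quoted number):

```python
# Sanity-check the "digital universe" units: a zettabyte is 10**21 bytes,
# and a single-layer DVD holds roughly 4.7 GB.

ZETTABYTE = 10**21
GIGABYTE = 10**9
DVD_BYTES = 4.7 * GIGABYTE

gb_per_zb = ZETTABYTE // GIGABYTE    # 1 trillion gigabytes per zettabyte
dvds_per_zb = ZETTABYTE / DVD_BYTES  # roughly 213 billion DVDs per zettabyte

print(f"{gb_per_zb:,} GB per ZB")
print(f"{dvds_per_zb:.3g} DVDs per ZB")
```

At 35 zettabytes, that is on the order of 7 trillion DVDs--content on a scale that clearly cannot be filtered by hand.
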

Therefore, the necessity of filtering all this digitally available information for inside use and outside consumption is going to become more and more critical.

According to The Economist article, we will need to employ the latest techniques and automation tools in:

- Enterprise Content Management--to "keep tabs on digital content, classify it, and define who has access to it."

- Data Loss Prevention--using "software that sits at the edge of a firm's network and inspects the outgoing data traffic."

- Network Forensics--"keep an eye on everything in a corporate network and thus...detect a leaker."
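
A toy sketch of the data loss prevention idea--scanning outgoing content for sensitive ("red") patterns before it leaves the network--might look like this (the patterns here are hypothetical; real DLP products sitting at the network edge do far more):

```python
import re

# Toy data-loss-prevention filter: classify outgoing text as "red" (block)
# or "blue" (allow). The patterns below--an SSN-like number and a
# "CONFIDENTIAL" tag--are illustrative placeholders, not a real rule set.

RED_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US Social Security number format
    re.compile(r"\bCONFIDENTIAL\b"),       # document classification marking
]

def classify_outbound(text):
    """Return 'red' if any sensitive pattern matches, else 'blue'."""
    for pattern in RED_PATTERNS:
        if pattern.search(text):
            return "red"
    return "blue"

print(classify_outbound("Quarterly newsletter draft"))  # blue
print(classify_outbound("Employee SSN: 123-45-6789"))   # red
```

The hard part in practice is not the matching but deciding, per the article, "what they really need to keep secret"--i.e., maintaining the rule set.
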

Of course, as Cisco's chief security officer says: "technology can't solve the problem, just lower the probability of accidents."

In the end, we need to make sure people understand the vulnerability and the dangers of sharing the "red" information.

We can focus our employees on protecting the most critical information elements of the organization by using a risk management approach, so that information with a high probability of leaking and the greatest potential negative impact to the organization is filtered and protected the most.

The leaky faucet is a broken faucet and in this case we are all the plumbers.


August 29, 2010

Why EA and CPIC?

Note: This is not an endorsement of any vendor or product, but I thought this short video on enterprise architecture planning and capital planning and investment control/portfolio management was pretty good.



May 15, 2010

What’s Lurking In The Update?

In defense, it is a well-known principle that you determine your critical infrastructure, and then harden those defenses—to protect it.

This is also called risk-based management, because you determine your high-impact assets and the probability that they will be "hit," and deem those the high-risk ones that need to be most protected.

In buttressing the defenses of our critical infrastructure, we make sure to only let in trusted agents. That’s what firewalls, anti-virus, spyware, and intrusion prevention systems are all about.

In so-called “social engineering” scams, we have become familiar with phony e-mails that contain links to devastating computer viruses. And we are on the lookout for whether these e-mails are coming from trusted agents or people we don’t know and are just trying to scam us.

What happens though when like the Trojan Horse in Greek times, the malware comes in from one of the very trusted agents that you know and rely on, for example, like from a software vendor sending you updates for your regular operating system or antivirus software?

ComputerWorld, 10 May 2010, reports that a “faulty update, released on April 21, [by McAfee] had corporate IT administrators scrambling when the new signatures [from a faulty antivirus update] quarantined a critical Windows systems file, causing some computers running Windows XP Service Pack 3 to crash and reboot repeatedly.”

While this particular flawed security file wasn't the result of an action by a cyber-criminal, terrorist, or hostile nation state, but rather a "failure of their quality control process," it raises the question: what if it had been malicious rather than accidental?

The ultimate Trojan Horse for our corporate and personal computer systems is the regular updates we get from vendors to "patch" or upgrade our systems. The doors of our systems are flung open to these updates, and the strategic placement of a virus into updates that have free rein over our core systems could cause unbelievable havoc.

Statistics show that the greatest vulnerability to systems is the "insider threat"--a disgruntled employee, a disturbed worker, or perhaps someone unscrupulous who has circumvented or deceived their way past the security clearance process for employees and contractors (or was never screened at all) and now has access from the inside.

Any well-placed “insider” in any of our major software providers could potentially place that Trojan Horse in the very updates that we embrace to keep our organizations secure.

Amrit Williams, the CTO of BIGFIX Inc. stated with regards to the faulty McAfee update last month, “You’re not talking about some obscure file from a random third party; you’re talking about a critical Windows file. The fact that it wasn’t found is extremely troubling.”

I too find this scenario unnerving and believe that our trusted software vendors must increase their quality assurance and security controls to ensure that we are not laid bare like the ancient city of Troy.

Additionally, we assume that the profit motive of our software vendors themselves will keep them as organizations “honest” and collaborative, but what if the “payoff” from crippling our systems is somehow greater than our annual license fees to them (e.g., terrorism)?

For those familiar with the science fiction television series Battlestar Galactica, what if there is a "Baltar" out there ready and willing to bring down our defenses to some lurking computer virus--whether for some distorted ideological reason, a fanatical drive for revenge, or the belief in some grand payoff?

“Trust but verify” seems the operative principle for us all when it comes to the safety and security of our people, country and way of life—and this applies even to our software vendors who send us the updates we rely on.

Ideally, we need to get to the point where we have the time and resources to test the updates that we get prior to deploying them throughout our organizations.
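
As a modest step in that direction, an administrator can at least verify an update's integrity against a checksum the vendor publishes out-of-band before deploying it. A minimal sketch (the file name and contents here are hypothetical stand-ins):

```python
import hashlib

# "Trust but verify" for vendor updates: recompute the update file's
# SHA-256 digest and compare it to the checksum published by the vendor
# through a separate channel. A mismatch means the file was corrupted
# or tampered with in transit and should not be deployed.

def verify_update(path, expected_sha256):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Demo with a stand-in "update" file:
with open("update.bin", "wb") as f:
    f.write(b"signature definitions v1234")

expected = hashlib.sha256(b"signature definitions v1234").hexdigest()
print(verify_update("update.bin", expected))  # True
```

Checksums only prove the file arrived intact, of course--they do nothing against a malicious insider at the vendor who poisons the update before the checksum is computed, which is why testing updates in a staging environment still matters.
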

