
April 19, 2021

How Many Is Too Many?

Interesting sign:

Take Risks Make Mistakes.

And the "i" is missing in Mistakes!

I didn't even take the risks, but I still made the mistakes.  ;-)

(Credit Photo: Andy Blumenthal)



April 29, 2016

Losing Our Tech-osterone

So a vendor comes in and does a pitch and demo for a product we're interested in. 

But this technology vendor, a Fortune 100 company, can't figure out how to plug in their laptop for the demonstration. 

The presenter is holding his plug from the computer and comparing it to the ports on the monitor and going, "Is it a male or is it a female?"

It's almost like he's going innie or outie...

And he's repeating this over and over again as he keeps trying to plug in his cord to the various openings. 

Everyone is sitting sort of uncomfortably at this point, and so I try to break the tension and say, "I didn't know we were going to be getting an anatomy lesson today."

Well, we got the guy some technical help--the government to the rescue--and before long, he figured out the males and the females and the presentation was on the screen. 

The only problem: the title slide of his presentation misspelled the name of the product they were selling. 

At this point, all I can say is, this is why American business is getting soft!  ;-)

(Source Photo: Andy Blumenthal)

June 4, 2015

Losing Deadly Control

So today we hear that there was a horrible mistake in which at least 52 sites (in 18 states here and 3 other countries) were inadvertently sent LIVE anthrax!!!

This comes after a prior incident in December, when Ebola was mishandled and a technician was potentially exposed. 

Then again last August, they announced that a lab had accidentally cross-contaminated a benign bird flu virus with a deadly strain. 

And there have been at least five other major mishaps just since 2009, including more with anthrax and bird flu, as well as with Brucella and botulism--these involved everything from improper sterilization and handling techniques to inadvertent shipments of deadly live germs. 

Also in July, the CDC discovered six vials of LIVE smallpox in an unused storage room at the NIH.

This is reminiscent of similar gaffes by the military, such as the Air Force's inadvertent shipment of six nuclear warheads in 2007 on a plane whose crew was unaware they were even carrying them.

And here we go again (a doozy this time): information disclosed in 2013 revealed that in 1961 we nearly nuked ourselves (specifically North Carolina) with two hydrogen bombs, each about 260 times more powerful than the bomb exploded over Hiroshima. 

Yes, mistakes happen, but these are weapons of mass destruction we are talking about here, with layers of safeguards that are supposed to be strictly in place. 

After each incident, it seems that some official acknowledges the mistakes made, says sorry, and claims things are going to be cleaned up now. 

But if the same or similar mistakes are made over and over again, then what are we really to believe, especially when millions of lives are at stake?

We have too much faith in the large bureaucratic system called government, which, despite how well it could be run, very often isn't--and is prone to large and dangerous errors and miscalculations.

With all due respect for our experts in these areas, we need to spend a lot more time and effort to ensure the safety of our most dangerous stockpiles--be it of nuclear, chemical, biological, or radiological origin. 

We can't afford any more mistakes--or the next one could be far more than just an embarrassment.

What good is all the preparation to win against our enemies, if we are our own worst enemy--we have met the enemy, and it is us! ;-)

(Source Photo: Andy Blumenthal)

October 12, 2013

Parole By Analytics

Interesting article in the Wall Street Journal about parole boards using software to predict repeat offenders before letting someone go free. 

What used to be a decision based on good behavior during time served, showing remorse to the parole board, and intuition is being augmented with "automated assessments" that include inmate interviews, age of first arrest, type of crime, and so forth.

At least 15 states have adopted "modern risk assessment methods" to determine the potential for recidivism. 

Individuals are marked as higher risk if they (a rough scoring sketch follows the list):

- Are young--age 18-23 (and impulsive)
- Committed a drug-related offense
- Were suspended or expelled from school
- Quit a job prior to having another one 
- Are single or separated
- Have been diagnosed with a mental disorder
- Believe that it's not possible to overcome their past. 
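
To make this concrete, here is a minimal sketch in Python of how such an additive "automated assessment" might work. The factor weights and score bands below are purely hypothetical illustrations I made up for this post--they are not Compas's actual (proprietary) model.

# Hypothetical weights for the risk factors listed above.
HYPOTHETICAL_WEIGHTS = {
    "age_18_to_23": 2,                 # youth and impulsivity
    "drug_related_offense": 1,
    "suspended_or_expelled": 1,
    "quit_job_without_another": 1,
    "single_or_separated": 1,
    "mental_disorder_diagnosis": 1,
    "believes_past_insurmountable": 1,
}

def risk_score(inmate):
    """Sum the weights of every factor that applies to this inmate."""
    return sum(weight for factor, weight in HYPOTHETICAL_WEIGHTS.items()
               if inmate.get(factor))

def risk_band(score):
    """Map a raw score to the coarse band a parole board might see."""
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example: a young, single inmate with a drug-related offense.
inmate = {"age_18_to_23": True, "drug_related_offense": True,
          "single_or_separated": True}
print(risk_band(risk_score(inmate)))  # prints "medium" (score = 4)

Even a toy like this makes the point visible: the output is only as good as the weights and the data that go in.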

Surprisingly, violent criminals (rapists and murderers) are actually considered lower risk than those guilty of nonviolent property crimes--the thinking being that someone convicted of robbery is more likely to repeat the criminal behavior because the crime is one that "reflects planning and intent."

Honestly, I think it is more than ridiculous that we should rank violent criminals as less risky than thieves and release them because their crime is considered an "emotional outburst."

Would you rather have some thieves back on the street or murderers and rapists--rhetorical question!

But it just shows that even the best systems--ones that are supposed to help us make better decisions--can instead be misused or abused.

This happens when there is either bad data (such as from data-entry mistakes, deceptive responses, and missing relevant information) or poorly designed decision rules/algorithms.

The Compas system is one of the main correctional software suites in use, and its maker, Northpointe (a unit of Volaris), itself advises that officials should "override the system's decisions at rates of 8% to 15%."

While even a roughly 1-in-7 error rate (the top of that 8% to 15% range) may be an improvement over intuition, we still need to do better--especially if that one person commits a hideous violent crime that hurts someone else in society, and it could've been prevented. 

It's certainly not easy to expect a parole board to decide in 20 minutes whether to set someone free, but think about the impact on someone hurt or killed, or on their family, if the wrong decision is made. 

This is a critical governance process that needs:

- Sufficient time to make important decisions
- More investment in tools to aid the decision process
- Refinement of the rules that support release or imprisonment
- Collection of a broad base of interviews, history, and relevant data points tied to repeat behavior
- Validation of information to limit deception or error.

Aside from predicting whether someone is likely to be a repeat offender, parole boards also need to consider whether the person has been both punished in accordance with the severity of the crime and rehabilitated to lead a productive life going forward. 

We need to decide people's fates fairly for them, justly for the victims, and safely for society--systems can help, but it's not enough to just "have faith in the computer." ;-)

(Source Photo: Andy Blumenthal)

May 25, 2011

Apples or Oranges

There are lots of biases that can get in the way of sound decision-making.

A very good article in Harvard Business Review (June 2011) called "Before You Make That Big Decision" identifies a dozen of these biases that can throw leaders off course.

What I liked about this article is how it organized the subject into a schema for interrogating an issue to get to better decision-making.

Here are some of the major biases that leaders need to be aware of and inquire about when they are presented with an investment proposal:

1) Motivation Errors--do the people presenting a proposal have a self-interest in the outcome?

2) Groupthink--are dissenting opinions being actively solicited and fairly evaluated?

3) Salient Analogies--are analogies and examples being used really comparable?

4) Confirmation Bias--have other viable alternatives been duly considered?

5) Availability Bias--has all relevant information been considered?

6) Anchoring Bias--can the numbers be substantiated (i.e. where did they come from)?

7) Halo Effect--is success from one area automatically being translated to another?

8) Planning Fallacy--is the business case overly optimistic?

9) Disaster Neglect--is the worst-case scenario imagined really the worst?

10) Loss Aversion--is the team being overly cautious, conservative, and unimaginative?

11) Affect Heuristic--are we exaggerating or emphasizing the benefits and minimizing the risks?

12) Sunk-Cost Fallacy--are we basing future decision-making on past costs that have already been incurred and cannot be recovered?

To counter these biases, here are my top 10 questions for getting past the b.s. (applying enterprise architecture and governance); a simple checklist sketch follows the list:

1) What is the business requirement--justification--and use cases for the proposal being presented?

2) How does the proposal align to the strategic plan and enterprise architecture?

3) What is the return on investment, and what is the basis for the projections?

4) What alternatives were considered and what are the pros and cons of each?

5) What are the best practices and fundamental research in this area?

6) What are the critical success factors?

7) What are the primary risks and planned mitigations for each?

8) What assumptions have been made?

9) What dissenting opinions were there?

10) Who else has been successful implementing this type of investment and what were the lessons learned?
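
Just to illustrate (this is my own hypothetical framing, not a standard tool), here is a minimal Python sketch of how a governance board might track these ten questions as a structured checklist and flag what is still unanswered before a decision is made:

# The ten vetting questions, abbreviated as checklist items.
REVIEW_QUESTIONS = [
    "business requirement, justification, and use cases",
    "alignment to strategic plan and enterprise architecture",
    "return on investment and basis for projections",
    "alternatives considered, with pros and cons",
    "best practices and fundamental research",
    "critical success factors",
    "primary risks and planned mitigations",
    "assumptions made",
    "dissenting opinions",
    "comparable implementations and lessons learned",
]

def open_items(answers):
    """Return the questions that still have no substantive answer."""
    return [q for q in REVIEW_QUESTIONS if not answers.get(q, "").strip()]

# Example: a proposal that has answered only the first question.
answers = {"business requirement, justification, and use cases":
           "Replace the legacy case-management system; see use cases 1-4."}
remaining = open_items(answers)
print(f"{len(remaining)} of {len(REVIEW_QUESTIONS)} questions still open")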

While no one can remove every personal or organizational bias from the decision-making equation, it is critical for leaders to get beyond the superficial to the "meat and potatoes" of the issues.

This can be accomplished by leaders interrogating the issues themselves, as well as by establishing appropriate functional governance boards with diverse personnel to fully vet the issues, solve problems, and move the organization toward decision and execution.

Whether the decision is apples or oranges, the wise leader gets beyond the peel.
