Showing posts with label Analysis. Show all posts

January 30, 2018

Take Off The Halo and Horn

Thought this was a learning moment. 

The halo and horn effects. 

This has to do with generalizing about people, things, places, or events. 

With the halo effect, if we like (are positive about) one or a few things about something, we may put a proverbial halo on it and treat or rate everything about it as great.

Similarly, with the horn effect, if we dislike (are negative about) one or a few things about something, we may put a proverbial horn on it and treat or rate everything about it as horrible. 

This means we're not really being objective or balanced in our assessment. 

Usually, it's not all just good or bad, black or white--but good AND bad, black AND white.  

And obviously, this can cause us to make bad decisions based on poor analysis and judgment. 

Hence the importance of taking a step back, looking holistically at all the facts, and evaluating things for what they really are, rather than making snap judgments--and poor ones at that! ;-)

(Source Photo: here with attribution to darksouls1)

July 27, 2017

When You Need To BLUF

Most professional (and even personal) communications should start with...
________________________

BLUF (Bottom Line Up Front).

This means that you start with the ending--in mind, on paper, verbally, and in digital format. 

You provide the conclusion and/or recommendations right up front.

Rather than first wading through all the details--context, analysis, considerations, assumptions, risks, etc. 

Let the reader know right away what it is you want. 

Generally, this is different from an abstract or summary, which provides a synopsis and leading evidence for the argument put forward. 

Tell me what I need to know and get right to the point! ;-)

(Source Photo: Andy Blumenthal)

May 28, 2017

Arguing The Negative

I thought this was an interesting sign this gentleman had.

It says:


"Those who reject Jesus do so because of sin, not science or evidence."

Overall, religion is a matter of personal faith not to be argued but, when directed toward good, to be wholly respected. 

This argument, though, was basically saying not to reject this particular tenet of faith of a major religion, since there is no "science or evidence" on which to base the rejection.

But usually, don't we look for science or evidence to accept or do something? 

In other words, the default usually is that if you want me to believe in something or somebody, prove to me why I should.

It's a bad argument when you ask me to prove to you why you shouldn't believe in something. 

Very often this is the same argument people use in relationships and in organizations.

We do the same thing every day or over and over again, and we often don't ask ourselves why we do it this way or believe this is a good way of doing something...we just do it. 

And in fact, when someone new comes in with "fresh eyes" and questions why we do it a certain way or have we considered another approach, we ask them to prove to us with "science or evidence" why their way is better, rather than reexamine our own ways and means.

I'm not in any way questioning here G-d or religion, but rather simply our approach to self-examination, introspection, and betterment.

Don't ask me to prove to you why you should reject something, but rather be prepared to defend your hypothesis. ;-)

(Source Photo: Andy Blumenthal)

January 23, 2014

From Memorization To Thinking

Our education system continues to suffer as we rank somewhere between 17th and 20th globally. 

This means that our economy will assuredly suffer in the future from the global competition that strangles us.

Some prominent experts in the field, like Walter Isaacson, say that innovation occurs at the intersection of arts and humanities meeting science and math--and I really like that. 

Personally, this inspires me to think about whether education reform is perhaps focused too much on the teachers, tests, and core curriculum, and less on changing the way we are approaching education in the first place. 

For as long as I can remember (i.e. even when I was in school way back when), we based our education on lots of memorization--multiplication tables, periodic tables, vocabulary, history, and much more. 

For those with a great short-term memory, you could do very well--memorize, spit it out, and forget it, so you can start all over again with the next great wave of facts and figures. 

The emphasis on memorizing the basics is important for building a foundation of knowledge, but seems to me to come at the expense of critical thinking and problem-solving skills. 

From my own experience and watching my kids in school, I often see boredom at raw facts, and excitement and self-satisfaction at figuring something out. 

Yet, too often students are asked to do rote memorization and test accordingly, rather than really think. 

You can't memorize innovation, but rather you need to be able to apply learning. 

In this day and age, where facts are but a Google search away, memorization is less important and real analytical, reasoning, problem solving, and communication skills (all anchored in solid core values) are more relevant to our national and personal success. 

Yet, have our schools caught up with this?

Unfortunately, it seems most have not, and perhaps that is one reason that many of our preeminent innovators are dropouts--from Steve Jobs to Bill Gates, Mark Zuckerberg, Larry Ellison, Michael Dell, Henry Ford, Walt Disney, Richard Branson, Ted Turner, etc. 

Will we ever get away completely from memorizing the basics? Certainly not. Do we need to spend so much of K-12 education and even college years playing instant recall? What a waste!

The best experience that I remember from my younger daughter in school was her activities in the Ethics Bowl, where schools competed in analyzing ethically challenging situations and arguing the merits of the various sides. They learned to think and articulate their reasoning and conclusions and that is the best education that I can imagine. 

Until we stop using education techniques from the dinosaur age--memorizing species and trying to recall where the eggs are buried--I fear we are doomed to subpar educational performance--in a boring, memorizing, and non-thinking way. 

No wonder the kids want to develop the next great iPhone app and use their textbooks as a handy-dandy booster seat. ;-)

(Source Photo: here with attribution to Lansing Public Library)


November 1, 2013

Why Memorize?

G-d, I remember as a kid in school having to memorize everything for every class--that was the humdrum life for a schoolchild.

Vocabulary words, grammar rules, multiplication tables, algebraic and geometric equations, scientific formulas, historical events, famous quotes, states and capitals, presidents, QWERTY keys, and more. 

It was stuff it in, spit it out, and basically forget it.

This seemed the only way to make room for ever more things to memorize and test out. 

In a way, you really had to memorize everything, because going to a reference library and having to look things up in the stacks of endless shelves or on microfiche machines was a pain in the you know what. 

Alternatively, the home dictionary, thesaurus, and encyclopedia were indispensable, but limited, slow, dated, and annoying. 

But as the universe of knowledge exploded, became ever more specialized, and the Internet was born, looking something up was a cinch and often necessary. 

All of a sudden, memorization was out and critical thinking was in. 

That's a good thing, especially if you don't want people who are simple repositories of stale information, but rather those who can question, analyze, and solve problems. 

Albert Einstein said, "Never memorize something that you can look up."

But an interesting editorial in the Wall Street Journal by an old school teacher questions that logic. 

David Bonagura Jr. proposes that critical thinking and analysis "is impossible without first acquiring rock-solid knowledge of the foundational elements upon which the pyramid of cognition rests."

He says, "Memorization is the most effective means to build that foundation."

As a kid, I hated memorization and thought it was a waste of time, but looking back I find that more things stayed in that little head of mine than I had thought. 

I find myself relying on those foundations every day...in writing, speaking, calculating, and even remembering an important story, principle, saying, or even song lyrics.

These come out in my work--things that I thought were long lost and forgotten, but are part of my thinking, skills, and truly create a foundation for me to analyze situations and solve problems. 

In fact, I wish I knew more and retained it all, but short-term memory be damned. 

We can't depend on the Internet for all the answers--in fact, someday, it may not be there working for us all, when we need it. 

We must have core knowledge that is vital for life and survival and these are slowly being lost and eroded as we depend on the Internet to be our alternate brains. 

No, memorizing for memorization's sake is a waste of time, but building a foundation of critical skills has merits. 

Who decides what is critical and worthwhile is a whole other matter to address.

And are we building human automatons full of worthless information that is no longer relevant to today's lifestyles and problems, or are we teaching what's really important and useful to the human psyche, soul, and evolution? 

Creativity, critical thinking, and self-expression are vital skills to our ability to solve problems, but these can't exist in a vacuum of valuable brain matter and content.

It's great to have a ready reference of world information at our fingertips online, but unless you want to sound (and act) like an idiot, you better actually know something too. ;-)

(Source Photo: here with attribution to Chapendra)

February 23, 2013

Analyzing The Law


So I am back in school AGAIN (I'm a life-long learner), augmenting my not so slow-paced job.

Let's just say that at this point, I recognize that the more I know, the more I don't know anything. 

The class that I am taking now is Cyberlaw, and while I did take law in business school--many moons ago--that was more focused on contracts and business organizations. 

This class looks interesting from the perspective of the legal and regulatory structure to deal with and fight cybercrime, -terrorism, and -war.

One interesting thing that I already learned was a technique for evaluating legal cases called IRAC, which stands for:

- Issues--the underlying legal matters that the case is addressing.

- Rules--what legal precedents can be applied.

- Analysis--whether those rules apply or not, in this case.

- Conclusion--rendering an opinion on the case.

This is a structured way to analyze any legal case. 

Of course, before you do these, you have to look at the facts--so that is the very first section. 

The problem with that is then you have F-IRAC and that can definitely be taken the wrong way. ;-)
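As a toy illustration (mine, not from the class), the F-IRAC structure can be captured in a simple data structure to keep a case brief organized--all of the class names, fields, and case details below are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class CaseBrief:
    """Hypothetical container for an F-IRAC case analysis."""
    facts: str        # F: what actually happened
    issues: list      # I: the underlying legal questions
    rules: list       # R: precedents/statutes that may apply
    analysis: str     # A: whether and how those rules apply here
    conclusion: str   # C: the opinion rendered on the case

    def outline(self) -> str:
        # Render the brief in F-IRAC order for review.
        return "\n".join([
            "Facts: " + self.facts,
            "Issues: " + "; ".join(self.issues),
            "Rules: " + "; ".join(self.rules),
            "Analysis: " + self.analysis,
            "Conclusion: " + self.conclusion,
        ])


brief = CaseBrief(
    facts="Defendant accessed a company server without authorization.",
    issues=["Does the access violate the Computer Fraud and Abuse Act?"],
    rules=["18 U.S.C. 1030 prohibits intentional unauthorized access"],
    analysis="The access was intentional and unauthorized, so the rule applies.",
    conclusion="Liability is likely under the CFAA.",
)
print(brief.outline())
```

The payoff of the structure is that no step can be silently skipped: the brief forces you to state the facts before you ever argue the rules.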

(Source Photo: Andy Blumenthal)


February 15, 2013

The Counterterrorism Calendar


The National Counterterrorism Center (NCTC) "leads our nation's efforts to combat terrorism at home and abroad by analyzing the threat, sharing that information with our partners, and integrating all instruments of national power to ensure unity of effort."  The NCTC is part of the Office of the Director of National Intelligence (ODNI). 

Not since the playing cards used in the 2003 invasion of Iraq, with the most-wanted identified on the cards, have I seen the use of such a common tool for sharing such important information--until now, with the NCTC's development of a Counterterrorism Calendar.

Typically, pin-up calendars have been devoted to beautiful models, Dilbert cartoons, and areas of personal interests and hobbies--such as cars, sports, aircraft, boats, or whatever.

I was impressed to see this concept used for sharing counterterrorism information; really, this is something that we should be mindful of every day--it's about our safety and national security.

The counterterrorism calendar has both a website and a PDF download.

The website has an interactive timeline, map, and terrorist profiles--so you can learn about terrorism by time and space and those who commit the atrocities. 

Timeline--you can view by month and day the major terrorist acts that have occurred--many days have more than one terrorist act associated with them, and only seven days out of the whole calendar year have no terrorist acts listed--so for those who are focused on just 9/11, there is a whole calendar waiting for you to view. 

Map--the map allows you to see the home base and geographical sphere of influence of many terrorist organizations--17 of them--along with a profile of each of those terrorist groups. There is also a button on the bottom of the page to see all the countries impacted with victims from 9/11--there are 91 countries shown with victims from this single catastrophic event alone.

Terrorists--the site has a list of terrorists with their profiles, identifying information, what they are wanted for, and the amount of reward offered, or whether they have already been captured or killed. There is also a list of the 10 most wanted off to the right side of the page--with a reward of $25 million listed for the #1 spot for Ayman al-Zawahiri.

The downloadable version packs this information into a 160-page color calendar--so large that I don't think you could actually hang it, because no regular push pins could hold it.

So if you can pull yourself away from the stereotypical Sports Illustrated Swimsuit Calendar, then you may actually be able to learn a lot about what our counterterrorism efforts are all about. ;-)


December 1, 2012

The Future In Good Hands



I had the distinct honor to attend the first Washington D.C. High School Ethics Bowl at American University.

There were eight teams competing from local schools in the D.C., Maryland, and Virginia areas.

My daughter's team won 2nd place!

(Note: the trophies were identical except for the engraving of first, second, and third places.)

I was so proud to see that the schools are educating our students in ethics--both the theory and the practice.

The student teams prepared and competed using 10 case study scenarios that covered everything from oil drilling in Alaska to the death penalty. 

In lieu of the education of yesteryear that relied all too heavily on rote memorization, it was awesome instead to see the students analyzing real life scenarios, using critical thinking, debating ethical and philosophical considerations, and making policy recommendations. 

The students were sensitive to and discussed the impact of things like income inequality on college admission testing, the environmental effects of offshore drilling versus the importance of energy independence, the influence of race on criminal sentencing, and the moral implications of the Red Cross teaching first aid to named terrorist groups like the Taliban. 

I was truly impressed at how these high school students worked together as a team, developed their positions, and presented them to the moderator, judges and audience--and they did it in a way that could inspire how we all discuss, vet, and decide on issues in our organizations today.

- They didn't yell (except a few that were truly passionate about their positions and raised their voices in the moment), instead they maturely and professionally discussed the issues.

- They didn't get personal with each other--no insults, put-downs, digs, or other swipes (with the exception of when one team member called his opponents, in good-natured jest, "the rivals"), instead they leveraged the diversity of their members to strengthen their evaluation of the issues.

- They didn't push an agenda in a winner takes all approach--instead they evaluated the positions of the competing teams, acknowledged good points, and refined their own positions accordingly to come up with even better proposals. 

- They didn't walk away from the debate bitter--but instead not only shook hands with their opponents, but I actually heard them exchange appreciation of how good each other did and what they respected about each other.

I'll tell you, these kids--young adults--taught me something about ethics, teamwork, critical thinking, presentation, and debate, and I truly valued it and actually am enthusiastic about this next generation coming up behind us to take the reins. 

With the many challenges facing us, we need these smart and committed kids to carry the flag forward--from what I saw today, there is indeed hope with our children. ;-)

(Source Photo: Andy Blumenthal)


September 27, 2008

Intel is King of Change and Enterprise Architecture

Intel is one of the most amazing companies. They are the world’s largest semiconductor company, and the inventor of the popular x86 microprocessor series found in most PCs. Intel has around $40 billion in annual revenue, and ranked 62nd in the Fortune 500 last year.

The Wall Street Journal of 27-28 September 2008 has an interview with Intel CEO Paul Otellini that offers some useful lessons for enterprise architects:

  • Plan for change—“A CEO’s main job, because you have access to all of the information, is to see the need to change before anyone else does.” It’s great when the CEO has access to the information for seeing ahead and around the curves, but many do not. Information is critical, and leaders need plenty of it to keep from steering the enterprise off a cliff. An important role of enterprise architects is to provide business and technical information to the CEO and other executives to give them a clear vision of the changes needed to grow and safeguard the business. (Perhaps better information would have prevented or reduced the damage to so many companies in the dot-com bubble a few years ago and the financial crisis afflicting Wall Street today!)
  • Question repeatedly—a prior CEO of Intel, Andrew Grove, taught him “Ask why, and ask it again five more times, until all of the artifice is stripped away and you end up with the intellectually honest answer.” It’s easy to accept things at face value or to make snap judgments, but to really understand an issue, you need to get below the surface, and the way you do this is to question and dig deeper. I think this is critical for enterprise architects who are evaluating business and technology and providing recommendations to the business that can potentially make or break change efficacy. Architects should not just capture information to plunk into the architecture repository, but should question what they are seeing and hearing about the business, validate it, categorize it, and analyze it, to add value to it before serving that information up to decision makers.
  • Measure Performance—“we systematically measured the performance of every part of the company to determine what was world class and what wasn’t. Then, as analytically as possible, we made the cuts…and saved $3 billion in overall spending.” Measuring performance is the only way to effectively manage performance. If decisions are to be anything more than gut and intuition, they need to be based on quantifiable measures and not just subjective management whim. Enterprise architects need to be proponents for enterprise-wide performance measurement. And not just at the top level either. Performance measures need to be implemented throughout the enterprise (vertically and horizontally), and dashboard views need to be provided to executives to make the measures visible and actionable.
  • Communicate, communicate—“I made it my job to communicate, communicate, communicate the positive message. I did open forums, I did Webcasts, I told the employees to send me questions via email and I’d answer them...you have to convince them through reasoning and logic, the accuracy of your claims.” Good communication is one of those areas that is often overlooked and underappreciated. Leadership often just assumes that people will follow because they are “the leaders”. NOPE! People are not sheep. They will not follow just because. People are intelligent, want to be respected, and want to be told why….communication early and often is the key. The approach to architecture that I espouse, User-centric EA, focuses on the users and effectively communicating with them—each the way they need to absorb the information and at the level that is actionable to them. Making architecture information easy to understand and readily available is essential to making it valuable and actionable to the users. User-centric EA uses principles of communication and design to do this.
Intel, in its 40-year history, has repeatedly planned for change, measured it, and managed it successfully. Intel’s co-founder and former CEO, Gordon Moore, is the epitome of driving change. Moore, the originator of Moore’s Law, captured the exponential change/improvement in silicon chip performance—identifying that the number of transistors packed on a silicon chip would double every two years. Intel’s subsequent obsession with Moore’s Law has kept them the dominant player in computer processors and may lead them to dominance in cell phones and other mobile devices as well.
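The performance-measurement lesson lends itself to a quick sketch: partition units by a benchmark and make the cuts analytically rather than by gut. The unit names, scores, and threshold below are all made up for illustration:

```python
# Hypothetical performance scores per business unit (0-100 scale)
scores = {"fabrication": 92, "logistics": 61, "marketing": 78, "support": 55}

WORLD_CLASS = 75  # assumed benchmark for "world class" performance

# Partition units into world class vs. candidates for cuts/improvement.
world_class = {unit: s for unit, s in scores.items() if s >= WORLD_CLASS}
below_par = {unit: s for unit, s in scores.items() if s < WORLD_CLASS}

print("World class:", sorted(world_class))
print("Needs attention:", sorted(below_par))
```

The point is not the arithmetic but the discipline: decisions about where to cut flow from visible, quantifiable measures instead of management whim.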

August 23, 2008

Building Enterprise Architecture Momentum

Burton Group released a report entitled “Establishing and Maintaining Enterprise Architecture Momentum” on 8 August 2008.

Some key points and my thoughts on these:

  • How can we drive EA?

Value proposition—“Strong executive leadership helps establish the enterprise architecture, but…momentum is maintained as EA contributes value to ongoing activities.”

Completely agree: EA should not be a paper or documentation exercise, but must have a true value proposition where EA information products and governance services enable better decision making in the organization.

  • Where did the need for EA come from?

Standardization—“Back in the early days of centralized IT, when the mainframe was the primary platform, architecture planning was minimized and engineering ruled. All the IT resources were consolidated in a single mainframe computer…the architecture was largely standardized by the vendor…However distributed and decentralized implementation became the norm with the advent of personal computers and local area networks…[this] created architectural problems…integration issues…[and drove] the need to do architecture—to consider other perspectives, to collaboratively plan, and to optimize across process, information sources, and organizations.”

Agree. The distributed nature of modern computing has resulted in issues ranging from unnecessary redundancy, to a lack of interoperability, component re-use, standards, information sharing, and data quality. Our computing environments have become overly complex and require a wide range of skill sets to build and maintain, and this has an inherently high and spiraling cost associated with it. Hence, the enterprise architecture imperative to break down the silos, more effectively plan and govern IT with an enterprise perspective, and link resources to results!

  • What are some obstacles to EA implementation?

Money rules—“Bag-O-Money Syndrome Still Prevails…a major factor inhibiting the adoption of collaborative decision-making is the funding model, in which the part of the organization that brings the budget makes the rules.”

Agree. As long as IT funding is not centralized with the CIO, project managers with pockets of money will be able to go out and buy what they want, when they want, without following the enterprise architecture plans and governance processes. To enforce the EA and governance, we must centralize IT funding under the CIO and work with our procurement officials to ensure that IT procurements that do not have approval of the EA Board, IT Investment Review Board, and CIO are turned back and not allowed to proceed.

  • What should we focus on?

Focus on the target architecture—“Avoid ‘The Perfect Path’…[which] suggests capturing a current state, which is perceived as ‘analyze the world then figure out what to do with it.’ By the time the current state is collected, the ‘as-is’ has become the ‘as-was’ and a critical blow has been dealt to momentum…no matter what your starting point…when the program seems to be focused on studies and analysis…people outside of EA will not readily perceive its value.”

Disagree with this one. Collecting a solid baseline architecture is absolutely critical to forming a target architecture and transition plan. Remember the saying, “if you don’t know where you are going, then any road will get you there.” Similarly, if you don’t know where you are coming from, you can’t lay in a course to get there. For example, try getting directions on Google Maps with only a to and no from location. You can’t do it. Similarly, you can’t develop a real target and transition plan without identifying and understanding your current state and capabilities to determine gaps, redundancies, inefficiencies, and opportunities. Yes, the ‘as-is’ state is always changing. The organization is not static. But that does not mean we cannot capture a snapshot in time and build off of it. Just like configuration management, you need to know what you have in order to manage change to it. And the time spent on analysis (unless we’re talking analysis paralysis) is not wasted. It is precisely the analysis and recommendations to improve the business processes and enabling technologies that yield the true benefits of the enterprise architecture.
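The baseline argument can be made concrete with a little set arithmetic: without an "as-is" inventory, neither the gaps nor the redundancies are computable. The system names below are invented for illustration:

```python
# Hypothetical inventories of capabilities/systems
as_is = {"payroll", "email", "crm_legacy", "crm_new", "file_shares"}  # baseline
to_be = {"payroll", "email", "crm_new", "collaboration_suite"}        # target

gaps = to_be - as_is           # capabilities we must build or buy
retire = as_is - to_be         # systems to consolidate or retire
carry_forward = as_is & to_be  # capabilities that transition unchanged

print("Gaps:", sorted(gaps))
print("Retire/consolidate:", sorted(retire))
print("Carry forward:", sorted(carry_forward))
```

Drop the `as_is` set and every one of these outputs becomes undefined, which is exactly the "directions with no from location" problem.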

  • How can we show value?

Business-driven —“An enterprise architect’s ability to improve the organization’s use of technology comes through a deep understanding of the business side of the enterprise and from looking for those opportunities that provide the most business value. However, it is also about recognizing where change is possible and focusing on the areas where you have the best opportunity to influence the outcome.”

Agree. Business drives technology, rather than doing technology for technology’s sake. In the enterprise architecture, we must understand the performance results we are striving to achieve, the business functions, processes, activities, and tasks to produce to results, and the information required to perform those functions before we can develop technology solutions. Further, the readiness state for change and maturity level of the organization often necessitates that we identify opportunities where change is possible, through genuine business interest, need, and desire to partner to solve business problems.



May 24, 2008

The Business Analyst and Enterprise Architecture

A business analyst or "BA" is responsible for analyzing the business needs of their clients and stakeholders to help identify business problems and propose solutions. Within the systems development life cycle domain, the business analyst typically performs a liaison function between the business side of an enterprise and the providers of [IT] services to the enterprise. (Wikipedia)

Business analysis is critical to enterprise architecture, because it derives the business functions, processes, activities, and tasks. Coupled with some basic data and systems analysis, BA determines the information requirements of the business and the systems (manual or automated) that serve those up. Through business analysis, we identify gaps, redundancies, roadblocks, and opportunities which are used by enterprise architecture to drive business process improvement, reengineering, and the introduction of new technologies.

Where does the business analyst reside in the organization—in the business or in IT?

The answer is yes to both. The business analyst resides in the business and works on segment architecture for their lines of business and on defining functional requirements. Some business analysts also reside in IT as a relationship manager to translate business-speak to the techies and vice versa. Also, the LOBs may not have business analysts on staff and may request this service be performed by the IT shop. For example, this may be done from the enterprise architecture function to support segment architecture development or alignment to the enterprise architecture. Or it may be done by the IT centers of excellence that develop the systems solutions. If they can’t get the functional requirements from the LOBs, they may send in their own BAs to work with the programs to help capture this information.

ComputerWorld Magazine, 12 May 2008, asks “Is there a place for business analysts in IT today?” And answers, “Not if their primary function is just to analyze business needs…business people want more than analysis; they want workable solutions.”

So aside from business analysis what do you need to come up with a technical solution?

  • Resources—$$$$, smart people, the right infrastructure! (this one’s mine, the other two below are from ComputerWorld)
  • Creativity—“come up with ideas…to create systems that can meet performance requirements.”
  • Synthesis—“best ideas are evaluated and modified until good solutions are found.”

According to the ComputerWorld article, a single person who does the analysis, the creativity, and the synthesis is called a systems designer, but I disagree with this. The analysis and development of the requirements is “owned” by the business (even if IT is called upon to help with this function), while the creativity and synthesis, which constitute the technical solution, are “owned” by IT. Further, it is typically not a “single person” who develops the requirements and comes up with the solution. The solutions provider (IT) is generally distinct from the business that has the needs, even if sometimes it is difficult for the business to articulate these into functional requirements.

ComputerWorld specifies four techniques for identifying requirements and developing a solution:

  1. Group facilitation—“getting input from everyone who might have relevant information and insights on a business process.”
  2. Process mapping—“create diagrams that capture task sequences for existing and new workflows.” (I believe we in EA all know this as Business Modeling).
  3. Data modeling—“diagram the structure of the data those workflows operate in.”
  4. User interface prototyping—“use prototypes of user interface screens to illustrate how people can interact with the system to do their jobs.” (Frankly, I don’t believe this one fits with the other three, since prototyping comes somewhat down the road in the SDLC after conceptual planning, analysis, and design. I would replace prototyping with some core system modeling to fill out the business, data, and system model set, so that we can see what systems are currently in use and where the gaps and redundancies are, and where there is potential for component or system re-use and building interoperability.)
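Techniques 2 and 3 above (process mapping and data modeling) can be roughed out even in plain code before any diagramming tool is opened; the workflow, entities, and mappings below are all hypothetical:

```python
# Toy process map: task sequence for a hypothetical "new hire" workflow
process_map = ["submit_request", "approve_request", "provision_accounts", "notify_manager"]

# Toy data model: entities and the attributes that describe them
data_model = {
    "Request": ["id", "employee_name", "status"],
    "Account": ["id", "request_id", "system_name"],
}

# Trace which modeled entity each workflow step touches (hypothetical mapping)
touches = {
    "submit_request": "Request",
    "approve_request": "Request",
    "provision_accounts": "Account",
    "notify_manager": "Request",
}

for step in process_map:
    print(step, "->", touches[step])
```

Even a sketch this small surfaces the analysis questions a BA asks: does every step touch a modeled entity, and does every entity have a step that creates or updates it?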

I would suggest that 1 and 2 (the facilitation and business modeling) are the functions of the business analyst, but that 3 and 4 (data and systems modeling) are the responsibility of the IT function. Again, it is the business that “brings” the requirements and the IT department that comes up with the technical solution to meet those requirements.

Another thought: Perhaps the organization is struggling with defining the business analyst and those that develop the technical solution because it is really the synthesis of the two that is needed. It is similar to enterprise architecture itself, which is the synthesis of business and technology to enable better decision making. I can envision the further development of segment and solutions architecture to become just such a function that merges the requirements (business) and solutions (IT).

