Showing posts with label Interoperability. Show all posts

March 21, 2018

Measurement And Standards Are Our Friends

So I learned that metrology is the science of measurement.

And measurement is the foundation of scientific research and creating standards. 

Scientific research and measurement are about exploration, discovery, and innovation.

Further, it is about finding the facts; it is objective; it is truth; it is essential to maintaining integrity. 

Standards also help to ensure dependability, because there is a common reference and you know what you are getting. 

A great true story that demonstrates the importance of measurements and standards is the Great Baltimore Fire of 1904.

This was the third worst urban inferno in American history. 

It destroyed over 1,500 buildings across 140 acres. 

Fire engines responded from as far as New York and Virginia. 

But the problem was that they invariably could not help. 

Why?  

Because their fire hose couplings could not fit on the Baltimore fire hydrants--they were not standardized.
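The coupling mismatch can be sketched as a toy compatibility check; the thread sizes below are made up for illustration, not the actual 1904 specifications:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CouplingSpec:
    diameter_in: float      # nominal diameter in inches (hypothetical)
    threads_per_inch: int   # thread pitch (hypothetical)

def can_connect(hydrant: CouplingSpec, hose: CouplingSpec) -> bool:
    """Connection succeeds only when both specs agree exactly --
    i.e., when both sides follow the same standard."""
    return hydrant == hose

# Baltimore's hydrants vs. an out-of-town engine's hose (made-up numbers):
baltimore = CouplingSpec(diameter_in=2.5, threads_per_inch=8)
visiting = CouplingSpec(diameter_in=2.5, threads_per_inch=7)

print(can_connect(baltimore, baltimore))  # True: same standard
print(can_connect(baltimore, visiting))   # False: no standard, no connection
```

Modeling the standard as an equality check on the full spec makes the point bluntly: a near-match is still a mismatch when water has to flow.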

Without standards, we don't have interoperability. 

We don't have a reference that everyone can go by. 

It's as if we're all working on our own desert islands. 

This defeats the power in numbers that makes us together greater than the sum of our individual parts. 

Science and technology help us advance beyond just ourselves and today. 

Measurement and standardization help us to build a better and stronger society. ;-)

(Source Photo: Andy Blumenthal)

June 10, 2014

I Like That Technology

Christopher Mims in the Wall Street Journal makes the case for letting employees go rogue with IT purchases.

It's cheaper, it's faster, "every employee is a technologist," and those organizations "concerned about the security issues of shadow IT are missing the point; the bigger risk is not embracing it in the first place."


How very bold or stupid? 


Let everyone buy whatever they want when they want--behavior akin to little children running wild in a candy store. 


So I guess that means...


  • Enterprise architecture planning...not important.
  • Sound IT governance...hogwash.
  • A good business case...na, money's no object.
  • Enterprise solutions...what for? 
  • Technical standards...a joke.
  • Interoperability...who cares? 
  • Security...ah, it just happens!

Well, Mims just got rid of decades of IT best practices, because he puts all his faith in the cloud.

It's not that there isn't a special place for cloud computing, BYOD, and end-user innovation; it's just that creating enterprise IT chaos and security cockiness will most assuredly backfire. 


From my experience, a hybrid governance model works best--where the CIO provides for the IT infrastructure, enterprise solutions, and architecture and governance, while the business units identify their specific requirements on the front line and ensure these are met timely and flexibly.


The CIO can ensure a balance between disciplined IT decision-making and agility on day-to-day needs. 


Yes, the heavens will not fall down when the business units and IT work together collaboratively. 


While it may be chic to do what you want when you want with IT, there will come a time when people like Mims will be crying for the CIO to come save them from their freewheeling, silly little indiscretions. 


(Source Photo: Andy Blumenthal)


July 4, 2012

Electronic Health Records, Slow But Steady

The best article I have seen on the subject of Electronic Health Records (EHR) was in Bloomberg BusinessWeek (21 June 2012) called "This machine saves lives so why don't more hospitals use it."

What I liked about this article was how straightforwardly it explained the marketplace, the benefits, the resistance, and the trends.  

Some basic statistics on the subject of EHR:

The healthcare industry is $2.7 trillion annually or ~18% of GDP.

Yet we continue to be quite inefficient, with only about half of hospitals and doctors projected to be using EHR by the end of 2012.

Annual spending on EHR is expected to reach $3.8 billion by 2015.

Basically, EHR is the digitization of our medical records and automation of medical services so that we can:
 
- Schedule medical appointments online

- Check medical records including lab and test results
- Communicate with our doctors by secure messaging/email
- Send prescriptions into the pharmacy electronically
- Automatically keep track of dosage and refills
- Get alerts as to side effects or interactions of medication
- Analyze symptoms and suggest diagnosis
- Receive prompts as to the latest medical treatments
- Recognize trends like flu outbreaks or epidemics
- File and speed claim processing
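As a rough sketch of how one of these capabilities might work, here is a toy medication-interaction alert; the drug pair and warning are made-up placeholders, since a real EHR would consult a curated clinical interaction database:

```python
# Toy medication-interaction alert (hypothetical drug pair and warning --
# a real EHR consults a curated clinical interaction database).
known_interactions = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
}

def check_interactions(prescriptions: list) -> list:
    """Return alerts for any prescribed pair with a known interaction."""
    prescribed = set(prescriptions)
    alerts = []
    for pair, warning in known_interactions.items():
        if pair <= prescribed:  # both drugs of the pair are prescribed
            alerts.append(warning)
    return alerts

print(check_interactions(["warfarin", "aspirin", "metformin"]))
# ['increased bleeding risk']
```

The value is not in the lookup itself but in having the patient's full medication list in one digital record, so the check can run at all.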

So why do many doctors seem to resist moving to EHR?
 
- Cost of conversion in terms of both money and time

- Concern that it can be used against them in medical malpractice suits
- Potential loss of patient privacy
- Lack of interoperability between existing systems (currently, "there are 551 certified medical information software companies in the U.S. selling 1,137 software programs"--the largest of which are from GE and Epic.)

The government is incentivizing the health care industry to make the conversion:

- HITECH Act (2009) "provides $27 billion in financial incentives" including $44K from Medicare and $63K from Medicaid over 5 years for outpatient physicians who can demonstrate "that they are using the technology to improve care."
- Patient Protection and Affordable Care Act (2010)--a.k.a. Obamacare--calls for "accountable care organizations" to receive extra money from Medicare and Medicaid for keeping patients healthy, rather than by procedure--"they are expected to do so using computers."

The big loophole in EHR right now seems to be:

- The lack of standards for EHR systems from different vendors to be compatible, so they can "talk" to each other.
- Without interoperability, we risk having silos of physicians, hospitals, labs, and so on that cannot share patient and disease information.

So, we need to get standards or regulations in place in order to ensure that EHR is effective on a national, and then even a global level. 
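To illustrate what a shared standard buys, here is a hedged sketch in the spirit of HL7 FHIR's Patient resource (the field names mimic FHIR's, but this is a simplified illustration, not a conforming implementation): any system that agrees on the schema can read a record regardless of which vendor produced it.

```python
import json

# A minimal, FHIR-style patient record (illustrative only; real EHR
# exchange standards such as HL7 FHIR define far richer schemas).
record_from_vendor_a = json.dumps({
    "resourceType": "Patient",
    "id": "example-123",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1970-01-01",
})

def parse_patient(payload: str) -> dict:
    """Any system that agrees on the same schema can read the record,
    no matter which vendor's software wrote it."""
    data = json.loads(payload)
    if data.get("resourceType") != "Patient":
        raise ValueError("not a Patient resource")
    return data

patient = parse_patient(record_from_vendor_a)
print(patient["name"][0]["family"])  # Doe
```

Without an agreed schema, each of those 1,137 software programs would need a custom translator for every other one; with a standard, each needs only one.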

A number of months ago, I went to a specialist for something and saw him a few times; what he didn't tell me when I started seeing him was that he was retiring within only a few months.
Aside from being annoyed at having to find another doctor and change over, I felt that the doctor was not too ethical in not disclosing his near-term intention to close up shop and not giving me the choice of whether I still wanted to see him. 


But what made matters worse is that I got a letter in the mail with the notification--not even in person--along with a form to fill out to request a copy of my medical records at a cost per page, so that I could transfer them--hardcopy--elsewhere. 

Of course, this was also the doctor who still hand-wrote prescriptions and wasn't able to get test results online. 

To me, seeing someone with a great amount of experience was really important, but the flip side was that in terms of organization, he was still in the "dark ages" when it came to technology. 

I look forward to the day when we can have both--senior medical professionals who also have the latest technology tools at their disposal for serving the patients. 

In the meantime, the medical profession still seems to have some serious catching up to do with the times technologically. 

Let's hope we get there soon so that we not only have the conveniences of modern technology, but also the diagnostic benefits and safeguards. 

(Source Photo: Andy Blumenthal)



January 22, 2012

Work Off Of Standards, But Stay Flexible to Change

Interesting book review in the Wall Street Journal (18 January 2012) on Standards: Recipes for Reality by Lawrence Busch.
Standards are a fundamental principle of enterprise architecture, and they can mean many things to different people--they can imply what is normal or expected and even what is considered ethical.
Reading and thinking about this book review helped me to summarize in my own mind, the numerous benefits of standards:
- Predictability--You get whatever the standard says you get.
- Quality--By removing the deviation and defects, you produce a consistently higher quality.
- Speed--Taking the decision-making out of the routine production of standardized parts (i.e. we don't have to "reinvent the wheel each time"), helps us to move the production process along that much faster.
- Economy--Standardizing facilitates mass production and economies of scale lowering the cost of goods produced and sold.
- Interoperability--Creating standards enables parts from different suppliers to inter-operate and work seamlessly and this has allowed for greater trade and globalization.
- Differentiation--Through the standardization of the routine elements, we are able to focus on differentiating other value-add areas for the consumer to appeal to various tastes, styles, and genuine improvements.
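The interoperability and differentiation points can be sketched in code: a standard interface lets parts from different suppliers be used interchangeably, while each supplier remains free to differentiate elsewhere (the battery example and names are hypothetical):

```python
from abc import ABC, abstractmethod

# A standard interface: any supplier's part that implements it is
# interchangeable from the device's point of view.
class Battery(ABC):
    @abstractmethod
    def voltage(self) -> float: ...

class SupplierABattery(Battery):
    def voltage(self) -> float:
        return 1.5

class SupplierBBattery(Battery):
    # Differentiates on capacity, but honors the same standard contract.
    capacity_mah = 3000

    def voltage(self) -> float:
        return 1.5

def power_device(cell: Battery) -> str:
    # The device is coded to the standard, not to any one supplier.
    return "on" if abs(cell.voltage() - 1.5) < 0.1 else "off"

print(power_device(SupplierABattery()))  # on
print(power_device(SupplierBBattery()))  # on
```

The device never mentions a supplier by name; that is exactly what allows suppliers to compete on the value-add areas while the standardized element "just works."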
While the benefits of standards are many, there are some concerns or risks:
- Boring--This is the fear of the Ford Model-T that came in only one color, black--if we standardize too much, then we understate the importance of differentiation and as they say "variety is the spice of life."
- Stagnation--If we over-standardize, then we run the risk of stifling innovation and creativity, because everything has to be just "one way."
- Rigidity--By standardizing and requiring things like 3rd-party certification, we risk becoming so rigid in what we do and produce that we may become inflexible in addressing specific needs or meeting new requirements.
The key then when applying standards is to maximize the benefits and minimize the risks.
This requires maintaining a state of vigilance as to what consumers are looking for and the corollary of what is not important to them or what they are not keen on changing. Moreover, it necessitates using consumer feedback to continuously research and develop improvements to products and services. Finally, it is important to always be open to introducing changes when you are reasonably confident that the benefits will outweigh the costs of moving away from the accepted standard(s).
While it's important to work off of a standard, it is critical not to become inflexible to change.
(Source Photo: here)


September 20, 2009

Is Free Worth the Price?

In the computer world, free is often the architecture and economic model of choice or is it?

We have various operating systems like Linux, Chrome, Android and more now costing nothing. Information is free on the Internet. Online news at no cost to the reader is causing shock waves in the print news world. There are thousands of free downloads available online for applications, games, music, and more.

What type of business model is free—where is the revenue generation and profit margin?

Yes, we know you can use giveaways to cross-sell other things, which is what Google does so well, making a boatload of money (billions) from its free search engine by selling ads. Others are trying to copy this model, but less successfully.

Also, sometimes, companies give product away (or undercharge) in order to undermine their competitive challengers, steal market share, and perhaps even put their rivals out of business.

For example, some have accused Google of providing the Google Apps suite for free as a competitive challenge to Microsoft’s dominant and highly profitable Office suite, in order to shake one of Microsoft’s key product lines and get them off-balance amid the other market fight going on in search between Google and Microsoft’s new Bing “decision engine.”

So companies have reasons for providing something for free and usually it is not pure altruism, per se.

But from the consumer’s perspective, free is not always really free and is not worth the trouble.

Fast Company has an interesting article (October 2009) called “The High Cost of Free.”

“The strategy of giving everything away often creates as many hassles as it solves.”

Linux is a free operating system, yet “netbooks running Windows outsell their Linux counterparts by a margin of nine to one.”

“Why? Because free costs too much weighted down with hassles that you’ll happily pay a little to do without.”

For example, when you need technical support, what are the chances you’ll get the answers and help you need on a no-cost product?

That’s why “customers willingly pay for nominally free products, because they understand that only when money changes hands does the seller become reliably responsive to the buyer.”

And honestly, think about how often--even when you do pay--that trying to get good customer service is more an anomaly than the rule. So what can you really reasonably expect for nothing?

“Some companies have been at the vanguard of making a paying business of “free.” IBM, HP and other tech giants generate significant revenue selling consulting services and support for Linux and other free software to business.”

Also, when you decide to go with free products, you may not be getting everything you bargained for either in the base product or in terms of all the “bells and whistles” compared with what a paid-for-product offers. It’s reminiscent of the popular adages that “you get what you pay for” and “there’s no such thing as a free lunch.”

Sure, occasionally there is a great deal out there—like when we find a treasure at a garage or estate sale or even something that someone else discarded perhaps because they don’t recognize it’s true value—and we need to be on the lookout for those rare finds. But I think we’d all be hard pressed to say that this is the rule rather than the exception. If it were the rule, it would probably throw a huge wrench in the notion of market equilibrium.

And just like everyone savors a bargain, people are of course seriously enticed by the notion of anything that is free. But do you think a healthy dose of skepticism is appropriate toward something that is free? Again, another old saying comes to mind: “if it’s too good to be true, it probably is.”

Remember, whoever is providing the “free” product or service, still needs to pay their mortgage and feed their family too, so you may want to ask yourself, how you or someone else is paying the price of “free,” and see if it is really worth it before proceeding.

From the organization’s perspective, we need to look beyond the immediate price tag (free or otherwise discounted) and determine the medium- to long-term costs that include operations and maintenance, upgrades, service support, interoperability with other products and platforms, and even long-term market competition for the products we buy.

So let’s keep our eyes open for a great deal or paradigm shift, but let’s also make sure we are protecting the vital concerns of our users for functionality, reliability, interoperability, and support.



October 19, 2008

Net-centricity and Enterprise Architecture

See video on Department of Defense (DoD) vision for Net-Centricity:



Source: Department of Defense

August 24, 2008

FDCC and Enterprise Architecture

Setting standards helps us to reduce complexity, contain costs, build interoperability, and secure the enterprise.

The Air Force is leading the way in setting standard configurations for the Federal government for computers, servers, printers, and cell phones.

Government Computer News, 4 August 2008, reports that “The Air Force started taking delivery in July on the first of 150,000 new PCs…the first to come equipped with their Windows Vista operating systems, including Internet Explorer 7, preset to meet Federal Desktop Core Configuration (FDCC) 2.1 standards.”

The FDCC is an outgrowth of the Air Force’s IT Commodity Council (ITCC) “efforts with Microsoft in 2006 to test and develop a standard software configuration.” This was coordinated with NIST, NSA, DISA, and other agencies. Further, OMB “required agencies to implement FDCC’s Windows XP and Vista standards by Feb. 1, 2008.”

Now ITCC is working with DISA, NSA, the Army, Navy, Marine Corps, and Coast Guard to build server configurations. Microsoft is taking these base configurations and “will develop configurations for ‘roles placed on top,’” says Michael Harper, Microsoft Service Director.

“Those will include the file and print servers, the domain controller, Exchange, SQL server, SharePoint, Web, and Windows deployment services.”

FDCC is “forcing the software industry to pay greater attention to the default settings of its products.” This is helping to reduce security vulnerabilities and costs.

Some examples of reducing costs and achieving other benefits from FDCC include:

  • “Preinstalling software at the factory rather than retrofitting a machine.”
  • Reducing energy costs by “preconfiguring Vista’s energy management settings.”
  • “Streamlining the number of…device categories.”
  • “Standardizing…software…makes it easier to manage network and document security.”
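A minimal sketch of what a standard-configuration audit involves (the setting names and values here are hypothetical, not the actual FDCC baseline):

```python
# Toy configuration audit against a standard baseline (hypothetical
# setting names and values -- the real FDCC baseline is far larger).
fdcc_like_baseline = {
    "firewall_enabled": True,
    "guest_account_disabled": True,
    "screen_lock_minutes": 15,
}

def audit(machine_settings: dict) -> list:
    """Return the settings that deviate from the standard baseline."""
    return [key for key, required in fdcc_like_baseline.items()
            if machine_settings.get(key) != required]

# One desktop's current settings, with a single deviation:
pc = {"firewall_enabled": True,
      "guest_account_disabled": False,
      "screen_lock_minutes": 15}

print(audit(pc))  # ['guest_account_disabled']
```

The point of a shared baseline is exactly this: deviations become mechanically detectable across an entire fleet, instead of requiring machine-by-machine judgment.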

FDCC has been so successful that ITCC is now moving forward with doing the same standardization for mobile devices.

FDCC is a step forward in terms of inter-agency collaboration, working with the vendor community, and creating an enterprise architecture that hits the mark for improved IT planning and governance.



August 23, 2008

Building Enterprise Architecture Momentum

Burton Group released a report entitled “Establishing and Maintaining Enterprise Architecture Momentum” on 8 August 2008.

Some key points and my thoughts on these:

  • How can we drive EA?

Value proposition—“Strong executive leadership helps establish the enterprise architecture, but…momentum is maintained as EA contributes value to ongoing activities.”

Completely agree: EA should not be a paper or documentation exercise, but must have a true value proposition where EA information products and governance services enable better decision making in the organization.

  • Where did the need for EA come from?

Standardization—“Back in the early days of centralized IT, when the mainframe was the primary platform, architecture planning was minimized and engineering ruled. All the IT resources were consolidated in a single mainframe computer…the architecture was largely standardized by the vendor…However distributed and decentralized implementation became the norm with the advent of personal computers and local area networks…[this] created architectural problems…integration issues…[and drove] the need to do architecture—to consider other perspectives, to collaboratively plan, and to optimize across process, information sources, and organizations.”

Agree. The distributed nature of modern computing has resulted in issues ranging from unnecessary redundancy, to a lack of interoperability, component re-use, standards, information sharing, and data quality. Our computing environments have become overly complex and require a wide range of skill sets to build and maintain, and this has an inherently high and spiraling cost associated with it. Hence, the enterprise architecture imperative to break down the silos, more effectively plan and govern IT with an enterprise perspective, and link resources to results!

  • What are some obstacles to EA implementation?

Money rules—“Bag-O-Money Syndrome Still Prevails…a major factor inhibiting the adoption of collaborative decision-making is the funding model in which the part of the organization that brings the budget makes the rules.”

Agree. As long as IT funding is not centralized with the CIO, project managers with pockets of money will be able to go out and buy what they want, when they want, without following the enterprise architecture plans and governance processes. To enforce the EA and governance, we must centralize IT funding under the CIO and work with our procurement officials to ensure that IT procurements that do not have approval of the EA Board, IT Investment Review Board, and CIO are turned back and not allowed to proceed.

  • What should we focus on?

Focus on the target architecture—“Avoid ‘The Perfect Path’…[which] suggest capturing a current state, which is perceived as ‘analyze the world then figure out what to do with it.’ By the time the current state is collected, the ‘as-is’ has become the ‘as-was’ and a critical blow has been dealt to momentum…no matter what your starting point…when the program seems to be focused on studies and analysis…people outside of EA will not readily perceive its value.”

Disagree with this one. Collecting a solid baseline architecture is absolutely critical to forming a target architecture and transition plan. Remember the saying, “if you don’t know where you are going, then any road will get you there.” Similarly, if you don’t know where you are coming from, you can’t lay in a course to get there. For example, try getting directions on Google Maps with only a to and no from location. You can’t do it. Similarly, you can’t develop a real target and transition plan without identifying and understanding your current state and capabilities to determine gaps, redundancies, inefficiencies, and opportunities. Yes, the ‘as-is’ state is always changing. The organization is not static. But that does not mean we cannot capture a snapshot in time and build off of it. Just like configuration management, you need to know what you have in order to manage change to it. And the time spent on analysis (unless we’re talking analysis paralysis) is not wasted. It is precisely the analysis and recommendations to improve the business processes and enabling technologies that yield the true benefits of the enterprise architecture.

  • How can we show value?

Business-driven —“An enterprise architect’s ability to improve the organization’s use of technology comes through a deep understanding of the business side of the enterprise and from looking for those opportunities that provide the most business value. However, it is also about recognizing where change is possible and focusing on the areas where you have the best opportunity to influence the outcome.”

Agree. Business drives technology, rather than doing technology for technology’s sake. In the enterprise architecture, we must understand the performance results we are striving to achieve, the business functions, processes, activities, and tasks to produce to results, and the information required to perform those functions before we can develop technology solutions. Further, the readiness state for change and maturity level of the organization often necessitates that we identify opportunities where change is possible, through genuine business interest, need, and desire to partner to solve business problems.



July 20, 2008

A Net-centric Military and Enterprise Architecture

Information is central to the Department of Defense’s arsenal for fighting and defeating our enemies, and the ability to share information across interoperable systems is the way ahead.

National Defense, March 2008 reports that while a net-centric military is our goal, the transformation is a work in progress.

Brig. Gen. David Warner, director of command and control at DISA stated: “in this war, information is truly our primary weapon. You can’t move, you can’t shoot, if you can’t communicate.”

Yet, “the Defense Department continues to acquire stovepiped systems…the requirements change, the system grows, and then there are cost overruns. One of the first items to cut from the budget is interoperability.”

Air Force Gen. Lance L. Smith says, “the dream of a truly net-centric U.S. military will not happen overnight. But progress could be achieved within the next five to 10 years. It will be a matter of waiting for the stovepiped legacy systems to come to the end of their lifespan. If the services get onboard and stop building non-interoperable technologies now, then the new generation of net-centric communications can take over and become the norm.”

This sounds to me like the problem isn’t limited to legacy systems, but that there are still cultural, project management, and change management issues that are obstacles to achieving the net-centric goal.

The challenges are even greater and more complex when it comes to sharing information with “federal civilian agencies and foreign allies…NATO, for example, has no mechanism to ensure its members are interoperable with each other.”

“Today the normal way to do business is to ‘exchange hostages,’ which means sending personnel from one service, agency, or coalition partner to each other’s command centers so they can verbally relay information.” This typically takes the form of an interagency operation command center, and is not very net-centric.

So we continue to have stovepipes for “communications or data sharing systems built by different agencies, armed services, or coalition partners that cannot link to each other…[yet] the U.S. military is trying to make itself more lethal, faster, and more survivable. [And] the key to doing that is the ability to share information.”

Net-centricity, interoperability, and information sharing are true cornerstones to what enterprise architecture is about, and it is where we as architects are needed to take center stage now and in the years ahead in the war on terrorism and the other challenges we will face.

From an EA perspective, we need to ensure that all of our agencies’ targets, transition plans, and IT governance structures not only include, but emphasize net-centricity and enforce it through the EA review processes and the investment review board. There is no excuse for these stovepipes to persist.

May 31, 2008

Occam’s Razor and Enterprise Architecture

“Occam's razor (sometimes spelled Ockham's razor) is a principle attributed to the 14th-century English logician and Franciscan friar William of Ockham…The principle is often expressed in Latin as the lex parsimoniae (‘law of parsimony’ or ‘law of succinctness’)…This is often paraphrased as ‘All other things being equal, the simplest solution is the best.’…It is more often taken today as a heuristic maxim (rule of thumb) that advises economy, parsimony, or simplicity.” (Wikipedia)

In Occam’s razor, “razor refers to the act of shaving away unnecessary assumptions to get to the simplest explanation.”

Thomas Aquinas made a similar argument in the 13th century: "If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices." (Pegis, A. C., translator (1945). Basic Writings of St. Thomas Aquinas. New York: Random House, 129.)

The principle of Occam’s razor is very applicable to enterprise architecture—how?

Occam’s razor is a call for simplicity, and this principle is a foundation for enterprise architecture in terms of consolidation, integration, and cost efficiency. It takes specific form in:

  • Systems interoperability and component re-use
  • Technology standardization and simplification

Paul O’Neill, the former Secretary of the Treasury, was a true advocate of Occam’s razor and frequently asked “if not one, why not one?”
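O'Neill's question can even be sketched as a simple check against an application inventory: flag any business function served by more than one system as a consolidation candidate (the inventory below is hypothetical):

```python
from collections import defaultdict

# Hypothetical application inventory: (business function, system) pairs.
inventory = [
    ("email", "SystemA"),
    ("email", "SystemB"),
    ("payroll", "SystemC"),
]

# Group systems by the business function they serve.
by_function = defaultdict(list)
for function, system in inventory:
    by_function[function].append(system)

# "If not one, why not one?" -- functions served by multiple systems
# are candidates for consolidation.
redundant = {f: s for f, s in by_function.items() if len(s) > 1}
print(redundant)  # {'email': ['SystemA', 'SystemB']}
```

The razor does not say the answer is always one system; it says each extra system must justify its existence.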

“The philosopher of science Elliott Sober once argued along the same lines as Popper, tying simplicity with ‘informativeness’: The simplest theory is the more informative one, in the sense that less information is required in order to answer one's questions.” (Wikipedia)

In this sense, Occam’s razor is a validation for User-centric Enterprise Architecture, which seeks to make information simpler, easier to understand, and generally more valuable and actionable to the end-user to enhance decision making. Moreover, Occam’s razor is also evident in User-centric EA’s application of principles of communication and design, like simplifying complex information and maximizing the use of information visualization, in order to more effectively communicate EA information.

Occam’s razor makes clear the need to transform from traditional EA’s development of “artifacts” that are often difficult for the user to understand and apply, and instead develop more useful and usable information products and governance services.



February 29, 2008

A Pocket Printer and Enterprise Architecture

Ever wonder what happened to the old Polaroid cameras—you know, point, click, shoot, and out pops your photo? Very cool technology for a society that expects, no, demands, instant gratification.

Polaroid photos were great while they lasted, but their pictures have become obsolete with new digital photography.

However, Polaroid has a new architecture to transform itself. They have developed a pocket printer to enable the printing of digital photos from cell phones and cameras.

MIT Technology Review, 7 January 2008, reports that Polaroid’s “new handheld printers produce color photos using novel thermal-printing technology developed at Polaroid spinoff Zink Imaging…[and] will be priced at less than $150.”

How does the pocket printer work?

“The printer is about the size of a deck of cards. A user who takes a picture on a cell phone or camera can wirelessly send the file to the printer using Bluetooth, a common short-range wireless technology used in cell phones, or PictBridge, a wireless technology found in a number of cameras. The result is a two-inch-by-three-inch photo printed on paper engineered by Zink.”

Where does the printer cartridge go in the small pocket printer?

“The printing technology is similar to that of a common thermal printer…since Zink's technology eliminates the need for printer cartridges...it has led to the smallest printers on the market, and it could eventually be integrated into cell phones and cameras. It would also dispense with the inconvenience of ink cartridges that unexpectedly begin to run out of ink, and which have to be replaced. ‘When you go to replace an ink-jet cartridge today, it's in the $40 range,’ Herchen says. With Zink, a person pays only by the print. Polaroid expects to sell the photo paper for $0.30 a page.”

What challenges does the pocket printer face?

“People are accustomed to e-mailing pictures to each other or sending them to each other's phones, and they probably won't want to carry around another gadget just to print pictures on the spot.” But this concern can be obviated if the printer can be integrated into the cell phone or camera, in essence creating a modern digital Polaroid camera equivalent.

From a User-centric EA perspective, you’ve got to hand it to Polaroid to extend their expertise in instant photography to the digital photo age. They have come up with a novel idea and have executed on it, so that it is standards-based (Bluetooth and PictBridge), interoperable with other technologies (cell phones and cameras), small and affordable—thus, appealing to end-users. It would be nice to see the pocket printer work with MS Office applications, so I can print my blog and other work on the go.



February 26, 2008

Microsoft Reveals Secrets and Enterprise Architecture

This week Microsoft said they had a big announcement, and that it wasn’t about Yahoo! It turns out that Microsoft decided to reveal some of their technical documents for Microsoft Vista, Office, and other applications.

Why would a company like Microsoft reveal their technical secrets to partners and rivals alike? How is this decision a good architecture move, especially by the master architect himself, Bill Gates?

We all know that companies strive to achieve strategic competitive advantage and that one major way to do this is by product differentiation. The goal is to develop a unique product offering that customers want and need and then build market share. In some cases, this results in a situation like Microsoft’s virtual monopoly status in desktop operating systems and productivity suites.

So why give up the keys to the Microsoft kingdom?

Well, they are not giving up the keys, maybe just giving a peek inside. And an article in The Wall Street Journal, 22 February 2008, tells us why Microsoft is doing this:

  1. Internet Revolution—“For 30 years, Microsoft has…tightly held onto the technical details of how its software works… [and] it became one of the most lucrative franchises in business history. But Microsoft’s traditional products aren’t designed to evolve via add-ons or tweaks by thousands of non-Microsoft programmers. Nor can they be easily mixed or matched with other software and services not controlled by Microsoft or its partners. Now the Internet is making that kind of evolution possible, and transforming the way software is made and distributed.” As Ray Ozzie, chief software architect of Microsoft, states: “The world really has changed.”
  2. Do or die—Microsoft’s prior business model was leading it down a path of eventual extinction. “The more people use these applications [free technologies and shareware], the less need they have for Microsoft’s applications.” Microsoft is hoping to maintain its relevance.
  3. Antitrust ruling—“Last September, an appeals court in Luxembourg ruled against Microsoft in a long-running European case that forced Microsoft to announce a month later that it would drop its appeals and take steps to license information to competitors.”
  4. Interoperability—“Microsoft announced in July 2006 [its “Windows Principles”]…such as a commitment to providing rival developers with access to interfaces that let their products talk with Windows.” The key here is customer requirements for systems interoperability and Microsoft is begrudgingly going along.

Is this fifth such announcement on sharing by Microsoft the charm? I suppose it all hinges on how much marketplace and legal pressure Microsoft is feeling to divulge its secrets.

So is this the right User-centric EA decision?

If Microsoft is listening to their users, then they will comply and share technical details of their products, so that new technology products in the market can develop that add on to Microsoft’s and are fully interoperable. The longer Microsoft fights the customer, the more harm they are doing to their brand.

At the same time, no one can expect Microsoft to do anything that will hurt their own pocketbook, so as long as they can successfully maintain their monopoly, they will. Not that Microsoft is going away, but they are holding onto a fleeting business model. In the information age, Microsoft will have to play ball and show some goodwill to their users.


Share/Save/Bookmark

February 2, 2008

Simplification and Enterprise Architecture

Enterprise architecture seeks to simplify organizational complexity through both business processes reengineering and technology enablement. Technology itself is simplified through standardization, consolidation, interoperability, integration, modernization, and component reuse.

Harvard Business Review, December 2007, reports on simplifying the enterprise.

“Large organizations are by nature complex, but over the years circumstances have conspired to add layer upon layer of complexity to how businesses are structured and managed. Well-intentioned responses to new business challenges—globalization, emerging technologies, and regulations…—have left us with companies that are increasingly ungovernable, unwieldy, and underperforming. In many, more energy is devoted to navigating the labyrinth than to achieving results.”

Having worked for a few large organizations myself, I can “feel the pain.” Getting up to 8 levels of signature approval on routine management matters is just one such pain point.

What causes complexity?

Complexity is the cumulative byproduct of organizational changes, big and small, that over the years weave complications (often invisibly) into the ways that work is done.

What is sort of comical here is that the many change management and quality processes that are put in place or attempted may actually do more harm than good, making changes at the fringes rather than truly simplifying and reengineering processes at the core of the enterprise.

Here is a checklist for cutting complexity out of your organization:

  • “Make simplification a goal, not a virtue—include simplicity…[in] the organization’s strategy; set targets for reducing complexity; create performance incentives that reward simplicity.
  • Simplify organizational structure—reduce levels and layers…consolidate similar functions.
  • Prune and simplify products and services—employ product portfolio strategy; eliminate, phase out, or sell low-value products; counter feature creep.
  • Discipline business and governance processes—create well-defined decision structures (councils and committees); streamline operating processes (planning, budgeting, and so on).
  • Simplify personal patterns—counter communication overload; manage meeting time; facilitate collaboration across organizational boundaries.”

Leading enterprise architecture and IT governance for a number of enterprises has shown me that these initiatives must be focused on the end-user and on simplifying process and improving results, rather than creating more unnecessary complexity. The chief architect needs to carefully balance the need for meaningful planning, helpful reviews, and solid documentation and an information repository with simplifying, streamlining, consolidating, reengineering, and facilitating an agile, nimble, and innovative culture.


Share/Save/Bookmark

January 3, 2008

Customization and Enterprise Architecture

While User-centric EA seeks to provide useful and useable products and services to the end-user, the heavy customization of major application systems to meet user needs is a huge mistake. Major customization of IT systems is a killer: it is a killer of the application and a killer of the underlying business processes.

What do I mean? When you heavily customize an application system (and I am not talking about changing settings), you do the following:
  • You greatly increase the implementation cost of the system, since you have now added all sorts of modifications to the system.
  • You greatly increase your maintenance burden, because new versions of the software will often need to be recoded.
  • You hamper the ability of the system to interoperate with other systems that it was designed to work with (even when it is built with open standards), since you have tinkered and tweaked away at it.
  • You missed one of the biggest opportunities to improve and reengineer your business processes; instead of aligning your business processes with those identified (by usually hundreds, if not thousands of other organizations) as best practices and written into the software, you have made your enterprise the odd man out and overwrote the best practices in the application system with your specific way of doing things. That’s a big no-no.

Let’s face it, most (and there are exceptions to every rule) organizations are close to identical in their fundamental “business” (not mission) practices. Areas like finance, human capital, and even IT are considered utilities to the organization. These areas are often run in ways that exploit enterprise solutions for large organizations (for example, one timekeeping system, one payroll system, or one general ledger system), and these functions are the first to be looked at for integration and downsizing on the corporate side during mergers and acquisitions.

Instead of insisting that your processes are so different, see why others are doing it another way and whether there is merit in it, before you go and customize and chip away at the system—you may be doing yourself more harm than good. Generally (and there are exceptions to every rule), you’re better off changing your business processes to fit widely used and verified software.


Share/Save/Bookmark

November 19, 2007

iPod Versus Zune and Enterprise Architecture

Zune has been playing catch-up with the iPod in the music player business, but from a User-centric enterprise architecture standpoint, Microsoft has it all wrong!

The Wall Street Journal (WSJ), 14 November 2007 reports that Microsoft has retooled the Zune so that it “marks a vast improvement; however, it’s still no iPod.”

Where is Microsoft going wrong?
  1. An inferior product—“last year when Microsoft Corp. introduced its Zune music player to take on Apple’s iPod juggernaut, the software giant struck out. While the Zune had a good user interface, it was bigger and boxier, with clumsier controls, weaker battery life, and more complex software. Its companion online music store has a much smaller catalogue, a more complicated purchase process, and no videos for sale. And the Zune’s most innovative feature, built-in Wi-Fi networking, was nearly useless.” So much for competing on product quality!
  2. Underestimating the competition—Microsoft is “back with a second improved round of Zunes…Apple hasn’t been standing still…the 80-gigabyte Classic, which costs the same as the 80-gigabyte Zune, is slimmer than the Zune and has a flashy new interface, if a smaller screen. And the eight-gigabyte nano, which costs the same as the eight-gigabyte Zune, now plays videos and is much smaller—yet it has a larger screen. In addition, Apple has spiffed up its iTunes software…and Apple still trounces Microsoft in the selection of media it sells…more than six million songs, about double what the Zune marketplace offers, and dwarfs Microsoft’s selection of Podcasts and music videos as well.”

The WSJ concludes, “Microsoft has greatly improved the Zune hardware and software this time. But it seems to be competing with Apple’s last efforts, not its newest ones.”

In spite of these explanations, I think we’re missing something else here. If you compare the Microsoft desktop software to Apple’s, Microsoft also has a worse product, yet is the hands-down market leader. So why is Microsoft struggling with Zune?

Maybe functionality is part of the equation, but not the whole thing. It’s interesting to me that neither the article nor advertisements I see for Zune address anything about the interoperability of the product with Apple’s iTunes. Interoperability is not only a major enterprise architecture issue, but from a consumer standpoint, do you really expect people to dump their investment in their iTunes music library when they buy a Zune?

Looking at Yahoo Answers online, I see consumers share this concern:

“Can you use the iTunes’ software with the Microsoft Zune? I am torn between which to buy, if you can use itunes with the Zune then that’s the one I’ll get, but if you can’t then I’m getting an iPod, help me decide please.”

“Best Answer - Chosen by Voters

No you cannot. iTunes only works with the iPod, Zune is a completly different player made by Microsoft, it has its own music program and marketplace called the Zune Marketplace. The Zune Software can automatically copy songs that have not been purchased from iTunes (because ones that are have copy protection on them) and put them in the Zune Program.”

Until Microsoft acts as the architect par excellence that it is, works out the all-important EA interoperability issues of its product, and communicates this to its customers, the Zune will continue to be second-rate, functionality notwithstanding.
Share/Save/Bookmark

September 20, 2007

Microsoft and Enterprise Architecture

Microsoft—with 79,000 employees in 102 countries and global annual revenue of $51.12 billion as of 2007—is the company every consumer loves and hates.

On one hand, Microsoft’s products have transformed the way we use the computer (yeah, I know Apple got there first, but it’s the Microsoft products that we actually use day-to-day). Everything from the Microsoft operating system, office suite, web browser, media player, and so on has made computers understandable and useable by millions, if not billions, of people around the world. The positive impact (excluding the security flaws and pricing) has dramatically improved our lives!

On the other hand, Microsoft is a “near-monopoly,” with an estimated 90%+ market share for Office and the Windows O/S. Near-monopolies like Microsoft are feared to stifle competition, reduce innovation and consumer choice, and drive up prices.

Microsoft has been convicted by the European Commission of having “improperly bundled a media player with its Windows operating system and denied competitors information needed to make their computers work with Microsoft software…fines and penalties could reach…$2.77 billion.” “EU officials praised the decision…for protecting consumers,” while “Microsoft backers said the ruling will stifle innovation by making it tougher to design products with new features.” Additionally, “the EU is reviewing complaints about Microsoft’s Office software and concerns over how Microsoft bundled encryption and other features in its new Vista operating system.” (Wall Street Journal, 18 September 2007)

From a User-centric EA perspective, there is a similar love-hate relationship with Microsoft. As architects, we believe in and preach standardization, consolidation, interoperability, and integration—all things that Microsoft’s array of products help us achieve ‘relatively’ easily (imagine if, instead of an integrated Office suite, as well as mail, calendar, task list, and underlying operating system, we had to use an array of non-integrated products—yikes!). However, also as architects, we look to acquire innovative technology solutions for our organizations that will help us achieve superior mission performance, and to acquire products at prices that produce the best value for the enterprise. To achieve that, we need a marketplace based on healthy competition that drives innovation and price competition.

So we love and need Microsoft, but we fear and loathe the ramifications of such market dominance.

Share/Save/Bookmark

August 28, 2007

Data Architecture Done Right

Data architecture done right provides for the discovery and exchange of data assets between producers and consumers of data.

Data discovery is enabled by data that is understandable, trusted, and visible.

Data exchange is facilitated by data that is accessible and interoperable.

Together, data discovery and exchange are the necessary ingredients for information sharing.

Why is it so hard?

Primarily, it’s a coordination issue. We need to coordinate not only internally in our own organization (often already large and complex), but also externally, between organizations — horizontally and vertically. It’s quite a challenge to get everyone describing data (metadata) and cataloging data in the same way. Each of us, each office, each division, and so forth has its own standards and ways of communicating. As the saying goes, “you say poTAYtos, and I say poTAHtos.”

Can we ever get everyone talking the same language? And even if we could, do we really want to limit the diversity and creativity by which we express ourselves? One way to state a social security number is helpful for interoperability, but is there really only one "right" way to say it? How do we create data interoperability without creating only one right way and many wrong ways to express ourselves?
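To make the interoperability point concrete, here is a minimal sketch in Python. Everything in it (the field names, formats, and the crosswalk table) is hypothetical, invented for illustration: three data producers each describe a social security number with a different field name and format, and a small metadata "crosswalk" translates them into one canonical vocabulary so consumers can discover and exchange the data consistently.

```python
import re

# Hypothetical illustration: three producers publish the same field
# (a social security number) under different names and formats.
records = [
    {"ssn": "123-45-6789"},        # dashed string
    {"soc_sec_no": "123456789"},   # bare digits, different field name
    {"SSN": "123 45 6789"},        # spaces, different casing
]

# A tiny metadata crosswalk: map each producer's field name to one
# canonical name agreed on by all parties.
FIELD_ALIASES = {"ssn": "ssn", "soc_sec_no": "ssn", "SSN": "ssn"}

def normalize(record):
    """Translate one producer's record into the canonical vocabulary."""
    out = {}
    for name, value in record.items():
        canonical = FIELD_ALIASES.get(name, name)
        if canonical == "ssn":
            digits = re.sub(r"\D", "", value)  # strip dashes/spaces
            value = f"{digits[0:3]}-{digits[3:5]}-{digits[5:9]}"
        out[canonical] = value
    return out

normalized = [normalize(r) for r in records]
# All three records now share one field name and one format.
assert all(r == {"ssn": "123-45-6789"} for r in normalized)
```

Of course, the crosswalk itself must be a shared, governed artifact, which is exactly the coordination challenge described above.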

Perhaps, the future will bring artificial intelligence closer to being able to interpret many different ways of communicating and making them interoperable. Sort of like the universal translator on Star Trek.

Share/Save/Bookmark