July 19, 2009

Battle of the Tech Titans

Google and Microsoft are going head-to-head, and they are going for the jugular.

ComputerWorld stated in its July 6/July 13, 2009 issue: “Google Set to Wage OS War with Microsoft.” Wired wrote in its August 2009 issue that, according to CEO Eric Schmidt, Google is the “anti-Microsoft.”

According to Wired, the two companies are fighting for the title: King of Technology.

Here’s a quick breakdown:

Category                      Google                             Microsoft

Web Browser                   Chrome (& Firefox distribution)    Internet Explorer

Operating System              Android, Chrome OS                 Windows (XP, Vista, Mobile)

Business Productivity Suite   Apps Suite                         Office

Search                        Google                             Bing

Online Advertising            AdWords, AdSense, DoubleClick      aQuantive

On one hand, Google is the undisputed master of the Internet, delivering 78.5% of search results in the U.S. (versus 8.2% for Microsoft) and pulling in $22 billion in revenue in 2008 for text ads. On the other hand, Microsoft owns the personal computer environment, with 90% of the operating systems for all laptops and desktops, yielding $16 billion in 2008 sales plus $14.3 billion in nine months for its productivity applications (versus Google, which mostly gives away its email and other online applications). Further, Microsoft has 70% of the browser market to Google’s 2% for Chrome. (Wired, July 13, 2009)

So is there really a full tech war going on, or are Microsoft and Google just chipping away at the edges of each other’s territory, using so-called guerrilla warfare tactics?

It’s a little of each. Both companies are technology behemoths trying to be the king of the tech jungle. But they have very different approaches. Microsoft believes that computer software is the key to the tech kingdom, while Google believes that the Internet is the path to people’s technology hearts.

Google is willing to give away software to challenge Microsoft on its home turf, and Microsoft is investing in its new search engine to erode the core strength of its competitor. It’s a jab-for-jab face-off, where I would imagine we will continue to see the corporate fists flying for as long as the two are standing.

From a strategic point of view, Microsoft has such a dominant position on our computers, both in our homes and businesses, that it is hard to imagine it being easily dethroned. Microsoft also has a war chest, and the ability to replenish it, to fight a darn good fight. But many companies have been smug and have lost to a determined challenger.

Google, for its part, is coming on strong with its innovativeness and its hard-to-turn-down offers of free products. And if the television business is any predictor of a winner-take-all outcome, advertising revenue alone built an incredible entertainment industry that we all enjoy and that still largely dominates today.

And now I think I will go watch 60 Minutes on my big flat-screen TV.



July 18, 2009

IT as a Surrogate Weapon

There is a fascinating controversy going on now over the CIA plans to kill known al Qaeda terrorists. Should we “stoop to their level” and take them out or is this “assassination” style technique out of bounds for a free and democratic society?

Wow. I don’t think too many Americans the day after 9/11 would have been asking that question.

We are quickly swayed by the events of the times and our emotions at play.

When 3,000 people—mostly civilians—were killed in a vicious surprise attack on our financial and military hubs in this country; when the Twin Towers were still burning and crashing down; when smoke was rising out of the Pentagon; and when a plane crashed in Pennsylvania—I think most of us would say, these terrorists need to be dealt a severe and deadly blow.

Who would’ve thought that a mere 8 years later, questions would abound on the righteousness of killing the terrorists who planned, executed, and supported these murderous attacks and still seek every day to do us incredible harm—quite likely with chemical, nuclear, biological, or radiological (CNBR) weapons—if they could pull it off in the future.

We are a society with a short-term memory. We are a reactive society. As some have rightly said, we plan to fight the wars of the past, rather than the wars of the future.

We are also a doubting society. We question ourselves, our beliefs, and our actions. And to some extent this is a good thing. It elevates our humanity, our desire to do what is right, and to improve ourselves. But it can also be destructive, because we lose heart, we lose commitment, we change our minds, we are swayed by political currents, and to some extent we swing back and forth like a pendulum—not knowing where the equilibrium really is.

What makes the current argument really fascinating to me from an IT perspective is that we are okay with drones targeting missiles at terrorists (even with a certain degree of civilian “collateral damage”) from miles in the sky, but we recoil at the idea of the CIA hunting down and putting bullets in the heads of the terrorists who committed the atrocities and are unwavering in their desire to attack again and again.

Is there an overreliance on technology to do our dirty work, and an abdication of the hands-on business process of doing it ourselves, with our own “boots on the ground”?

Why is it okay to pull the trigger on a missile coming from a drone, but it is immoral to do it with a gun?

Why is it unethical to fight a war that we did not choose and do not want, but are victims of?

Why are we afraid to carry out the mission to its rightful conclusion?

The CIA, interrogators, military personnel, and so forth are demonized for fighting our fight. When they fight too cautiously, they have lost their will and edge, our nation’s safety suffers the consequences, and we call them incompetent. When they fight too vigorously, they are immoral legal violators who should be prosecuted. We are putting “war” under a huge microscope—can anyone come out looking sharp?

The CIA is now warning that if these reputational attacks continue, morale will suffer, employees will become risk-averse, people will quit, and the nation will be at risk.

Do we want our last lines of defense to be gun-shy when the terrorists come hunting?

According to the Wall Street Journal, “one former CIA director once told me that the ‘CIA should do intelligence collection and analysis, not covert actions. Covert actions almost never work and usually get the Agency in trouble.’”

The Journal asks whether “perhaps covert action should be done by someone else.” But who is this someone else?

Perhaps we need more technology, more drones to carry out the actions that we cannot bear to face?

I believe that we should not distinguish between pulling the trigger on a drone missile and doing the same on a sniper rifle. Moreover, a few hundred years ago the rifle was the new technology of its time, making killing less brutal and more impersonal. Now we have substituted sophisticated drones, with the latest communication, navigation, and weapons technologies, for the rifle. Let’s be honest about what we are doing – and what we believe needs to be done.

(As always, my views are my own and do not represent those of any other entity.)



July 14, 2009

A Call to IT Arms

Recently, I heard a colleague say that we should view IT not as a cost center, but as a resource center—and I really liked that.

In fact, IT is a cost center and a resource center, but these days there is an overemphasis on it being a cost center.

On the negative side, people seem to like to criticize IT and point out the spectacular failures there have been; in fact, according to Public CIO, “a recent study by the Standish Group showed that 82% of all IT projects were either failures or were considered challenged.”

This is the dark side of IT that many would like to dwell on.

However, I would argue that while we must constantly improve on IT project delivery, an IT failure can be just a point in time on the way to tremendous success, and there are many IT successes that we benefit from in big and small ways every day.

Moreover, it may take 1000 failures to achieve that one great breakthrough success. That is the nature of innovation and experimentation.

Of course, that does not mean we should do stupid or negligent things that result in failed IT projects—we must do our best to be responsible and professional stewards. But we should not be afraid to experiment and fail as a healthy part of the creative process.

Thomas Edison said: “I have not failed. I’ve just found 10,000 ways that won’t work.”

So why are we obsessed with IT failures these days?

Before the dot-com bust, when technology was all the rage and we enjoyed the bounty of new technologies like the computer, cell phones, handhelds, electronics galore, the Internet, and all the email, productivity software, e-commerce, and business applications you could ask for, the mindset was “technology is the engine that drives business.” In fact, many companies were even changing their names to include “.com” to reflect this. The thinking was that if you didn’t realize the power and game-changing nature of technology, you could just as well plan to be out of business in the near future. The technologies that came out of those years were amazing, and you and I rely on them every day.

Then, after the dot-com bust, the pendulum swung the other way—big time! IT came to be viewed as an overzealous function—unstructured and rampant, with runaway costs that had to be contained. People were disappointed with the perceived broken promises and failed projects, and IT people were pejoratively labeled geeks or techies and viewed as outside the norm—sort of the societal flunkies who started businesses out of home garages. IT project failures seemed to be everywhere. The corporate mindset changed to “business drives technology.”

Now, I agree that business drives technology in terms of requirements coming from the business and technology providing solutions to it and enabling it. But technology is also an engine for growth, a value creator, and a competitive advantage!

Further, while some would argue these days that IT is “just a tool,” I would counter that IT is a true strategic asset to those who understand its role in the enterprise. I love IT, and I believe we all do; this is supported by the fact that we have become basically insatiable for it. Forrester predicts U.S. IT budgets in 2009 will be in the vicinity of $750 billion. (http://it.tmcnet.com/topics/it/articles/59200-it-market-us-decline-51-percent-2009-researchers.htm) Think about what you want for the holidays—does it have IT in it?

A recent article in the Wall Street Journal was about how the homeless are so tied to technology that many have a computer with Internet access, even when they don’t have three square meals a day or a proper home to live in.

Another sign of how critical IT has become is that we recently stood up a new Cyber Command to protect our defense IT establishment. We are reliant indeed on our information technology and we had better be prepared to protect and defend it.

The recent White House 2009 Cyberspace Policy Review states: “The globally-interconnected digital information and communications infrastructure known as “cyberspace” underpins almost every facet of modern society and provides critical support for the U.S. economy, civil infrastructure, public safety, and national security.”

It's time for the pendulum to swing back in the other direction and to view IT as the true strategic asset that it is.



July 12, 2009

Information Management Framework

The Information Management Framework (IMF) provides a holistic view of the categories and components of effective information architecture.

These categories include the following:

Information-sharing--Enable information sharing by ensuring that information is visible, accessible, understandable, and interoperable throughout the enterprise and with external partners.

Efficiency--Improve mission efficiency by ensuring that information is requirements-based, non-duplicative, timely, and trusted.

Quality--Promote information quality, making certain that information provided to users is valid, consistent, and comprehensive.

Compliance--Achieve compliance with legislation and policy providing for privacy, freedom of information, and records management.

Security-- Protect information assets and ensure their confidentiality, integrity, and availability.

All areas of the framework must be managed as part of effective information architecture.


July 11, 2009

Adaptive Leaders Rule The Day

One of the key leadership traits is, of course, agility. No single course of action—no matter how intelligent or elegant—will be successful in every situation. That’s why effective leaders need to be able to quickly adapt and to apply situation-appropriate behaviors (situational leadership) to the circumstances as they arise.

Leaders need a proverbial "toolkit" of successful behaviors to succeed, and even more so the ability to adapt and create innovative new tools to meet new, uncharted situations.

Harvard Business Review, July/August 2009, has an interesting article called “Leadership in a (Permanent) Crisis” that offers up some useful insights on adaptive leadership.

But first, what is clear is that uncertainty abounds and leadership must adapt and meet the challenges head on:

“Uncertainty will continue as the norm even after the recession ends. Economics cannot erect a firewall against intensifying global competition, energy constraints, climate change, and political instability.”

But some things that effective leaders can do in challenging and uncertain times are as follows:

“Foster adaptation”—leaders need to be able to function in two realities—today and tomorrow. They “must execute in order to meet today’s challenges and they must adapt what and how things get done in order to thrive in tomorrow’s world.” Or to put it another way: leaders “must develop ‘next practices’ while excelling at today’s best practices.”

Stabilize, then solve—in uncertain times, when an emergency situation arises, first stabilize the situation and then adapt by tackling the underlying causes and building capacity to thrive in a new reality.

Experiment—don’t be afraid to experiment and try out new ways of doing things, innovate products and services, or field new technologies. “The way forward will be characterized by constant midcourse corrections.” But that is how learning occurs and that’s how success is bred—one experience and experiment at a time.

“Embrace disequilibrium”—Often people and organizations won’t or can’t change until the pain of not adapting is greater than the pain of staying the course. Too little pain and people stay in their comfort zone. Too much change, and people “fight, flee, or freeze.” So we have to be ready to change at the tipping point when the discomfort opens the way for change to drive forward.

Make people safe to question—unfortunately, too often [poor] leadership is afraid of or threatened by those who question or seek alternative solutions. But effective leaders are open to new ideas, constructive criticism, and innovation. Leaders need to be confident and “create a culture of courageous conversations”—where those who can provide critical insights “are protected from the organizational pressure to remain silent.”

Leverage diversity—the broader the counsel you have, the better the decision you are likely to make. “If you do not engage in the widest possible range of life experiences and views—including those of younger employees—you risk operating without a nuanced picture of the shifting realities facing the business internally and externally.”

To me, while leaders may intuitively fall back on tried-and-true techniques that have worked for them in the past, adaptive leaders need to overcome that tendency and think creatively and in situation-appropriate ways to be most effective. The adaptive leader doesn’t just do what is comfortable or known, but rather synthesizes speed, agility, and courage in confronting new and evolving challenges. No two days or situations are the same, and leadership must stand ready to meet the future by charting creative new ways ahead.



July 10, 2009

The Microgrid Versus The Cloud

It’s strange how the older you get, the more you come to realize that life is not black and white. However, when it comes to technology, I once held out hope that the way to the future was clear.

Then things started to get all gray again.

First, I read a few weeks ago about the trends in wired and wireless technologies. On one hand, phones have been going from wired to wireless (many people are even giving up their landlines altogether). Yet on the other hand, television has been going the other way—from wireless (antennas) to wired (cable).

Okay, I thought this was an aberration; generally speaking, technology advances—maybe with some thrashing about—but altogether in a specific direction that we can clearly define and get our arms around.

Well, then I read another article—this one in Fast Company, July/August 2009—about the microgrid. Here’s what it’s all about:

“The microgrid is simple. Imagine you could go to Home Depot and pick out a wind or solar appliance that’s as easy to install as a washer/dryer. It makes all the electricity your home needs and pays for itself in just a few years. Your home still connects to the existing wires and power plants, but is a two-way connection. You’re just as likely to be uploading power to the grid as downloading from it. Your power supply communicates with the rest of the system via a two-way digital smart meter, and you can view your energy use and generation in real time.”
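To make the two-way smart meter concrete, here is a minimal sketch of the basic net-metering arithmetic (my own illustration, not from the Fast Company article; all names and numbers are hypothetical): in any interval, a home's net draw is simply consumption minus generation, and a negative number means power is being uploaded back to the grid.

```python
def net_flow(consumed_kwh, generated_kwh):
    """Net energy drawn from the grid in one interval.
    Positive = downloading power from the grid; negative = uploading to it."""
    return consumed_kwh - generated_kwh

# A few hourly readings for a hypothetical solar-equipped home.
consumption = [1.2, 1.0, 0.8, 2.5]   # kWh used in each interval
generation  = [0.0, 1.5, 2.0, 0.5]   # kWh produced by the rooftop panels

hourly_net = [net_flow(c, g) for c, g in zip(consumption, generation)]
uploaded   = -sum(n for n in hourly_net if n < 0)   # kWh pushed back to the grid
downloaded =  sum(n for n in hourly_net if n > 0)   # kWh drawn from the grid
```

A smart meter doing exactly this, interval by interval, is what turns a one-way consumer into the two-way participant the article describes.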

Is this fantasy or reality for our energy markets?

Reality. “From the perspective of both our venture capital group and some senior people within GE Energy, distributed generation is going to happen in a big way.” IBM researchers agree—“IBM’s vision is achieving true distributed energy on a massive scale.”

And indeed, we see this beginning to happen in the energy industry with our own eyes, as “going green” environmentalism and alternative energy have become important to all of us.

To summarize the result: in the energy markets, we are going from centralized power generation to a distributed model. Yet there is another trend in the works on the information technology side of the house, and that is cloud computing, where we are moving from distributed applications, platforms, storage, and so forth (in each organization) to a more centralized model where these are provisioned by service providers such as Amazon, Google, Microsoft, and IBM, to name just a few. So in the energy markets we will often be pushing energy back to the grid, while in information technology we will be receiving metered services from the cloud.

The takeaway for me is that progress can be defined in many technological ways at one time. It’s not black or white. It’s not wired or wireless. It’s not distributed or centralized services. Rather, it’s whatever meets the needs of the particular problem at hand. Each must be analyzed on its own merits and solved accordingly.



July 4, 2009

CIO Support Services Framework

The CIO Support Services Framework (CSSF) has 5 major components:
  1. Enterprise Architecture--for strategic, tactical, and operational planning
  2. Capital Planning & Investment Control (or IT governance)--for managing the IT investment decision process (i.e. "putting those plans to work")
  3. Project Management (or a project management office)--to effectively execute on the programs and projects in the transition strategy
  4. Customer Relationship Management (or IT service management)--for managing service and support to our customer (i.e. with a single--belly button; one call does it all)
  5. Business Performance Management--how we measure & drive performance (like with an IT executive dashboard--so we know whether we are hitting the target or not!)
Together these five areas make up a holistic and synergistic set of CIO support functions.

These five functions move the mindset of the CIO from fighting day-to-day operational problems to strategically managing IT service provision through:
  • Planning
  • Investing
  • Executing
  • Servicing
  • Measuring
This is how we are going to achieve genuine success for the CIO in the 21st century and beyond.



July 3, 2009

Industry Architecture—What’s in a Name?

ComputerWorld, 22 June 2009, has an opinion piece called “The Benefits of Working Together,” about developing an “Industry Architecture” (IA)—in this particular case, for the hotel industry.

It takes the concept of a company or organizational architecture and applies it across an entire industry.

“In difficult economic times, every company seeks cost reductions and process improvements. But now an entire industry has banded together to help its constituents maximize their IT-based assets.”

I can see how from a private sector approach, IA is a way for companies to work together and benefit their overall industry through:

  • Improved IT products—“a clear architectural roadmap allows suppliers to focus efforts on the capabilities most important to customers.”
  • Lower IT product costs—standardized products from suppliers are generally less costly to produce than customized ones (but they are also less differentiated and may be less exciting and inviting to customers). The IA also facilitates component reuse, standardized interfaces, and so forth.
  • Lower training costs—IA could reduce training costs, since there are standard processes and products spanning the entire industry meaning that employees can move more seamlessly between companies and not have to learn a whole new way of doing things.
  • Improved agility—industry standards allow for faster deployments and configurations of IT.
  • Increased buyer confidence—industry architectures could provide for a “product certification program”, so buyers can have confidence that IT products meet guidelines and are interoperable with other IA certified products.
  • Improved security—IA can incorporate IT security standards, resulting in companies being more secure than if they had “conflicting security approaches.”

From a public sector perspective, the Federal Enterprise Architecture (FEA) is similar to Industry Architecture in the private sector. Ideally, the FEA looks across all the federal departments (like an IA looks across the various companies in an industry) and creates a roadmap, standards, certification programs, interoperability, component reuse, umbrella security, and more, resulting in lower IT costs, more agility, and improved service to the citizen.

In terms of naming conventions, we can come up with all types of architectures: from company architectures to industry architectures, from solution architectures (for meeting specific requirements) to segment architectures (for specific lines of business). We can develop horizontal architectures (across entities in the same stage of production or service provision) or vertical architectures (in entities that span different stages of production or service provision). We can create national architectures (as it looks like we may end up doing for the financial services sector now) or perhaps even global architectures (such as through environmental, economic, or military agreements and treaties).

Whatever we call the various levels of architecture, they are all enterprise architectures (just with the “enterprise” representing different types or levels of entities). In other words, an enterprise can be a company or industry, an agency or a department in the federal government. Some enterprise architectures are bigger than others. Some are more complex. But what all these enterprise architectures have in common is that they seek to provide improved IT planning and governance resulting in cost savings, cost avoidance, and performance improvement for the enterprise in question.

So, we must at all levels continue to plan, develop and implement our enterprise architectures so that we realize the benefits – from the micro to the macro environment – of both private and public sector best practices. 



June 27, 2009

Now We All Have Skin In The Game

It used to be that cybersecurity was something we talked about, but took for granted. Now, we’re seeing so many articles and warnings these days about cybersecurity. I think this is more than just hype. We are at a precipice, where cyberspace is essential to each and every one of us.

Here are some recent examples of major reviews in this area:

  • The White House released its 60-day Cyberspace Policy Review on May 29, conducted under the auspices of Melissa Hathaway, the Cybersecurity Chief at the National Security Council; the report states: “Cybersecurity risks pose some of the most serious economic and national security challenges of the 21st century…the nation’s approach to cybersecurity over the past 15 years has failed to keep pace with the threat.”
  • The Center for Strategic and International Studies’ Commission on Cybersecurity for the 44th President wrote in a December 2008 report: “America’s failure to protect cyberspace is one of the most urgent national security problems facing the new administration…It is a battle we are losing.”

Cyberspace is becoming a more dangerous place as the attacks against it are growing. Federal Computer Week, June 2009, summarized the threat this way:

“Nation states are stealing terabytes of sensitive military data, including some of the most advanced technology. Cybercrime groups are taking hundreds of millions of dollars from bank accounts and using some of that money to buy weapons that target U.S. soldiers. The attacks are gaining in sophistication and the U.S. defenses are not keeping up.”

Reviewing the possibilities as to why this is happening: Have we dropped our guard or diverted resources or know-how away from cybersecurity in a tight budgetary environment, and now have to course-correct? Or have our adversaries become more threatening and more dangerous to us?

I believe that the answer is neither. While our enemies continue to gain in sophistication, they have always been tenacious against us and our determination has never wavered to overcome those who would threaten our freedoms and nation. So what has happened?

In my view, the shift has to do with our realization that technology and cyberspace have become more and more vital to us and underpin everything we do--so that we would be devastated by any serious disruption. As the Cyberspace Policy Review states definitively: “The globally-interconnected digital information and communications infrastructure known as ‘cyberspace’ underpins almost every facet of modern society and provides critical support for the U.S. economy, civil infrastructure, public safety, and national security.”

We rely on cyberspace in every facet of our lives, and quite honestly, most would be lost without the connectivity, communications, commerce, productivity, and pleasure we derive from it each and every day.

The result is that we now have some serious “skin in the game.” We have something to lose--things that we deeply care about. Thus, we fear for our safety and survival should something bad happen. We think, consciously or subconsciously, about how we would survive without the technology, Internet, and global communications that we have come to depend upon.

Let’s think for a second:

What if cyberspace was taken down or otherwise manipulated or controlled by hostile nation states, terrorists, or criminals?

Would there be a breakdown in our ability to communicate, share information, and learn? Would there be interruptions to daily life activities, disruptions to commerce, finance, medicine and so forth, concerns about physical safety or “accidents”, risks to critical infrastructure, and jeopardy to our ability to effectively protect ourselves and country?

The point here is not to scare, but to awaken to the new realities of cyberspace and technology dependence.

Safeguarding cyberspace isn’t a virtual reality game. Cyberspace has physical reality and implications for all of us if we don’t protect it. Cyberspace is a critical national asset, and we had better start treating it as such if we don’t want our fears to materialize.



June 26, 2009

The Cloud is a Natural Evolution of IT


Cloud computing is bringing us closer than ever to providing IT as a utility, where users no longer need to know or care about how the IT services are provided, and only want to know that they are reliably there—just like turning on the light.
This rent-an-IT model of cloud computing can apply to any portion of an organization’s IT architecture, as follows:
  • Service architecture—for application systems, there is “software as a service” (SaaS) such as Google Apps suite for office-productivity or Salesforce.com for customer relationship management. And for developing those systems, there is “platform as a service” (PaaS) such as Google Apps Engine (GAE) or the Defense Information Systems Agency (DISA) Rapid Access Computing Environment (RACE).
  • Information architecture—for storing the data used in systems, there is “storage as a service” such as Amazon’s Simple Storage Service (S3).
  • Technology architecture—for hosting systems, there is “infrastructure as a service” (IaaS) such as Amazon’s Elastic Compute Cloud (EC2).
The big advantage of using hosted IT or cloud computing is that it provides on-demand information technology—again, like your electricity usage; the juice is there when you need it. Additionally, by outsourcing to specialist IT providers, you can generally get more efficiency, economy, and agility in providing IT to your organization.
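To illustrate the utility analogy, here is a minimal sketch (entirely my own, with hypothetical names and rates, not any real provider's API) of metered, pay-per-use IT: you consume capacity on demand, and the bill reflects only what you actually used, just like the electric meter.

```python
class MeteredCloudService:
    """Toy model of on-demand, pay-per-use IT: capacity is consumed as needed,
    and the bill covers only what was actually used (like electricity)."""

    def __init__(self, rate_per_unit_hour):
        self.rate = rate_per_unit_hour
        self.usage_log = []          # (units, hours) consumed on demand

    def consume(self, units, hours):
        # In a real cloud, this is where instances or storage would be provisioned.
        self.usage_log.append((units, hours))

    def bill(self):
        return sum(u * h for u, h in self.usage_log) * self.rate

# A spiky workload: near-idle baseline all month, plus a month-end burst.
compute = MeteredCloudService(rate_per_unit_hour=0.10)
compute.consume(units=2, hours=720)    # baseline servers running all month
compute.consume(units=50, hours=8)     # burst capacity for a month-end batch run
monthly_charge = compute.bill()        # pay for the burst only while it ran
```

The point of the sketch is the contrast with the in-house model: with owned infrastructure, you would have paid for those 50 burst units all month whether you used them or not.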
Of course, there are challenges that include ownership, security, privacy, and a cultural shift from a vertical (stovepiped) to horizontal (enterprise and common services) mindset.
From my perspective, cloud computing is a natural evolution in our IT service provision:
  1. At first, we did everything in-house, ourselves—with our own employees, equipment, and facilities. This was generally very expensive in terms of finding and maintaining employees with the right skill sets, and developing and maintaining all our own systems and technology infrastructure, securing it, patching it, upgrading it, and so on.
  2. So then came the hiring of contractors to support our in-house staff; this helped alleviate some of the hiring and training burden on the organization. But it wasn’t enough to make us cost-efficient, especially since we were still managing all our own systems and technologies for our organization as a stovepipe.
  3. Next, we moved to a managed services model, where we out-sourced vast chunks of our IT—from our helpdesk to desktop support, from data centers to applications development, and even to security and more. But apparently that didn’t go far enough, because we were still buying, building, and maintaining our own IT instances for our organization, but now employing call centers and data centers in far-flung places.
  4. And finally, the realization has emerged that we do not need to provide IT services either with our own or contracted staff, but rather we can rely on IT cloud providers who will manage our information technology and that of tens, hundreds, and thousands of others and provide it seamlessly over the Internet, so that we all benefit from a more scalable and unified service provision model.
The cloud computing model takes CIOs/CTOs and their staffs out of the fire-fighting mode of IT management and into the driver’s seat for managing IT strategically, innovatively, and with a focus on the specific mission needs of their organization.


June 21, 2009

Making More Out of Less

One thing we all really like to hear about is how we can do more with less. This is especially the case when we have valuable assets that are underutilized or potentially even idle. This is “low-hanging fruit” for executives to repurpose and achieve efficiencies for the organization.

In this regard, there was a nifty little article in Federal Computer Week, 15 Jun 2009, called “Double-duty COOP” about how we can take continuity of operations (COOP) failover facilities and use them for much more than just backup and business recovery purposes in the case of emergencies. 

“The time-tested approach is to support an active production facility with a back-up failover site dedicated to COOP and activated only during an emergency. Now organizations can vary that theme”—here are some examples:

Load balancing—“distribute everyday workloads between the two sites.”

Reduced downtime—“avoid scheduled outages” for maintenance, upgrades, patches and so forth.

Cost effective systems development—“one facility runs the main production environment while the other acts as the primary development and testing resource.”

Reduced-risk data migration—when moving facilities, rather than physically transporting data and risking some sort of data loss, you can instead mirror the data to the COOP facility and upload the data from there once “the new site is 100 percent operational.”
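The load-balancing idea above—distributing everyday workloads between the production site and the COOP failover site—can be sketched in a few lines. This is a minimal illustration only; the site names, weights, and health flags are invented for the example, not taken from the article, and a real deployment would use DNS, a hardware load balancer, or similar infrastructure.

```python
import random

# Hypothetical sites: names and weights are illustrative assumptions.
# In everyday operation, most traffic goes to production and some to
# the COOP site, which keeps the failover facility exercised and useful.
SITES = {
    "production": {"weight": 0.7, "healthy": True},
    "coop_failover": {"weight": 0.3, "healthy": True},
}

def pick_site(sites):
    """Weighted random choice among healthy sites."""
    healthy = {name: s for name, s in sites.items() if s["healthy"]}
    if not healthy:
        raise RuntimeError("no healthy site available")
    names = list(healthy)
    weights = [healthy[n]["weight"] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

# Everyday operation: traffic splits roughly 70/30 across the two sites.
counts = {"production": 0, "coop_failover": 0}
for _ in range(10_000):
    counts[pick_site(SITES)] += 1

# Emergency: production goes down, and all traffic shifts to the COOP
# site automatically -- the classic failover role, with no idle asset.
SITES["production"]["healthy"] = False
emergency_site = pick_site(SITES)
```

The same health-check-plus-weights structure covers both the dual-duty case (both sites healthy, workload shared) and the traditional COOP case (one site down, the other carries everything).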

It’s not that any of these ideas are so innovatively earth shattering, but rather it is their sheer simplicity and intuitiveness that I really like.

COOP is almost the perfect example of resources that can be dual-purposed, since they are there “just in case.” While the COOP site must be ready for the looming contingency, it can also be used prudently for assisting day-to-day operational needs.

As IT leaders, we must always look for improvements in the effectiveness and efficiency of what we do. There is no resting on our laurels. Whether we can do more with less, or more with more, either way we are going to advance the organization and keep driving it to the next level of optimization. 



June 20, 2009

Who Says Car Companies Can't See?


Check out the concept for the new "Local Motors" car company:

  • "Vote for the designs you want. If you are a designer, you can upload your own. Either way, you help choose which designs are developed and built by the Local Motors community. Vote for competition designs, Checkup critiques, or portfolio designs.
  • Open Development, sort of like open source. Once there is enough support for any single design, Local Motors will develop it openly. That means that you not only choose which designs you want to drive, you get to help develop them - every step of the way.
  • Choose the Locale. During the development process, help choose where the design should be made available. Local Motors is not a big car company, we are Local. The community chooses car designs with local regions in mind; where will this design fit best? You tell us. We make it happen.
  • Build your Local Motors vehicle. Then, once the design and engineering is fully developed you can go to the Local Motors Micro-Factory and build your own - with our help, of course. See the "Buy" page for purchase and Build Experience details.
  • Drive your Local Motors car, the one you helped design and build, home."

I like this user-centric approach to car design and development. This is how we really put the user in the driver's seat.

This is the type of opportunity where we go from Henry Ford's one-car-for-the-masses approach to a more localized implementation.

While I don't know the specific economics of this approach for a car company, it seems like it has bottom-line potential since they will only proceed with car development once they have enough demand identified.

Why build cars that no one wants or likes and why pay for internal design and market research studies, when people will willingly participate for free in order to get what they really want?

Finally, this is a terrific example of open source development and crowdsourcing: getting the masses to contribute and making something better and better over time. More minds to the task, more productivity and quality as a result.

