Showing posts with label graphical user interface.

June 11, 2013

Apple Designers Lost In The Imagination Orchid


Apple, under competitive pressure to come up with something new since Steve Jobs, its chief and master innovator, passed away, seems like a deer in the headlights: unable to sprint forward to the next innovation, it sits paralyzed in fear and stares dumbly at the oncoming Mack truck called Google and Samsung.

Apple, the pioneer of mobile icons on your smartphone and tablet that look like what they are, has lost its way, big time.

Their new iOS 7 abandons this intuitive, user-centric design approach of skeuomorphism for a more amorphous look and feel, where the user has to guess what an icon is supposed to represent (check out the unintelligible icons for Newsstand or the Passbook mobile wallet).

In other cases, there is virtually no perceptible change at all (see Messages and iTunes, which are just a little bigger), or there are changes that actually detract from what was in iOS 6 (see Reminders without the check marks, Notes without the notepad look, Settings without the gears, and the addition of clouds to the Weather icon).

I love Apple products, but just as they are flailing with a backwards-leaning new graphical user interface and Siri, the useless automated personal assistant, they are behind in the wearable technology arena, where Google Glass is almost off and running.

There is a reason Apple stock has tanked from over $700 to hovering in the low-to-mid $400 range: without the brilliance of Jobs's imagination, a laser focus on perfecting their products, future-thinking functionality, and sleek, elegant design, Apple is in trouble.

Will an Apple watch or television be unveiled soon and save the day?

It would extend Apple's winning streak, but their distinctive culture of creativity and excellence had better emerge in more ways than an iWatch or iTV if Apple is to hold its crown of technology glory. ;-)

(Source Photo: Facebook Fan's of Apple)


June 9, 2013

Turnkey Cyberwar

Interesting article by Noah Shachtman in Wired about how the Pentagon is gearing up for cyberwar.

It's called Plan X and it's being pursued by the Defense Advanced Research Projects Agency (DARPA).

The idea is for cyber warfare to be conducted like traditional kinetic warfare--where "munitions made of 1s and 0s [are] to be as simple to launch as ones made of metal and explosives."

Cyberspace is considered a domain of warfare similar to land, sea, air, and space, and it is necessary to be able to craft offensive capabilities where "a military operator can design and deploy a cyber effect, know what it's going to accomplish...and take the appropriate level of action."

We can't fly by the seat of our pants in cyberspace any longer; we've got to have turnkey solutions ready to launch in order to defend our people and interests. 

To accomplish this, we need:

1) Surveillance: A good map of cyberspace detailing enemy cyber outposts and threats akin to the geographical maps we have identifying physical targets and dangerous movements.

2) Weapons: Reliable cyber weapons ready to take on and take out enemy networks similar to kinetic weapons ready to destroy their military hardware and infrastructure.

3) Launch protocols: The rules of engagement for attack and counterattack, and the ability to intuitively and securely unleash them even faster than the turnkey capabilities with which we can respond with traditional military might.

Whether the cyber weapon looks like Angry Birds or some other point (at the target) and swipe (to launch) interface is almost beside the point--what is key is that we are ready to fight like hell in cyberspace, win uncontested, and keep the peace again. ;-)

(Source Photo: here with attribution to Great Beyond)

December 6, 2007

An Online Only World and Enterprise Architecture

How long will it be before the internet becomes our primary means of storing personal data and running software applications (web-based)?

MIT Technology Review, 3 December 2007, reports that one core vision for the evolution of technology (that of Google) is that we are moving from a computer-based technical environment to an online-only world, where “digital life, for the most part, exists on the Internet”—this is called cloud computing.

Already, users can perform many applications and storage functions online. For example:

  • “Google Calendar organizes events,
  • Picasa stores pictures,
  • YouTube holds videos,
  • Gmail stores email, and
  • Google Docs houses documents, spreadsheets, and presentations.”

Moreover, MIT Technology Review reports that it is rumored that Google is working on an umbrella application that will pull these disparate offerings together for a holistic cloud computing solution.

What’s the advantage of cloud computing?

A computer hard drive is no longer important. Accessibility to one’s information is limited only by one’s access to the internet, which is becoming virtually ubiquitous, and information can be shared with others easily. “The digital stuff that’s valuable… [is] equally accessible from his home computer, a public internet café, or a web-enabled phone.”
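The "equally accessible from anywhere" advantage described above can be sketched with a toy example: a document lives only on a server, and any connected client retrieves it over HTTP, with no dependency on a local hard drive. This is an illustrative sketch using Python's standard library, not Google's actual services; the store, path, and document contents are invented for demonstration.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A toy "cloud" document store: the document exists only on the server,
# not on any client's hard drive. (Hypothetical path and contents.)
DOCUMENTS = {"/docs/notes.txt": b"Meeting notes stored in the cloud"}

class CloudHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = DOCUMENTS.get(self.path)
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to port 0 so the OS picks any free port.
server = HTTPServer(("127.0.0.1", 0), CloudHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any "client" -- home computer, internet cafe, web-enabled phone --
# fetches the same document over the network; no local copy required.
url = f"http://127.0.0.1:{server.server_port}/docs/notes.txt"
with urlopen(url) as resp:
    content = resp.read()

print(content.decode())
server.shutdown()
```

The point of the sketch is that the client code is identical no matter which machine runs it; the only requirement is connectivity, which is exactly the trade-off the article's "Connectivity" concern later raises.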

What are some of the issues with cloud computing?


  • Privacy—“user privacy …becomes especially important if Google serves ads that correspond to all personal information, as it does in Gmail.”
  • Encryption—“Google’s encryption mechanisms aren’t flawless. There have been tales of people logging into Gmail and pulling up someone else’s account.”
  • Copyright—“one of the advantages of storing data in the cloud is that it can easily be shared with other people, but sharing files such as copyrighted music and movies is generally illegal.”
  • Connectivity—“a repository [of] online data isn’t useful if there’s no Internet connection to be had, or if the signal is spotty.”

Still, Google’s vision is that by “moving applications and data to the internet, Google is helping make the computer disappear.” Human-computer interaction has evolved from command lines to graphical user interfaces to the web browser environment. “It’s about letting the computer get out of our way so we can work with other people and share our information.”

Of course, Google’s vision of an online-only world isn’t without challenge: Microsoft counters that “it’s always going to be a combination of [online and offline], and the solution that wins is going to be the one that does the best job with both.” So Microsoft is building capability for users “to keep some files on hard drives, and maintain that privacy, while still letting them access those files remotely.”

I will not predict a winner-take-all in this architecture battle of online versus offline data and applications. However, I will say that we can definitely anticipate that information sharing, accessibility, privacy, and security will be centerpieces of what consumers care about and demand in a digital world. Online or offline, these expectations will drive future technology evolution and implementation.