
December 30, 2015

Simplify Me

So here's the monitor in the "modern" and beautiful Fort Lauderdale International airport. 

Can you see the number of electrical plugs, wires, connections, input/output ports, etc. on this device?

Obviously, it is comical and a farce as we near the end of 2015. 

Think about the complexity in building this monitor...in connecting it...in keeping it operational.

Yes, we are moving more and more to cellular and wireless communications, to miniaturization, to simple and intuitive user interfaces, to paperless processing, to voice recognition, to natural language processing, and to artificial intelligence.

But we are not there yet.

And we need to continue to make major strides to simplify the complexity of today's technology. 

- Every technology device should be fully useful and usable by every user on first contact. 

- Every device should learn upon interacting with us and get better and better with time. 

- Every device should have basic diagnostic and self-healing capability. 

Any instructions that are necessary should be provided by the device itself--such as the device telling you step by step what to do to accomplish the task at hand--no manual, no Google instructions, no Siri questions...just you and the device interacting as one. 

User friendly isn't enough anymore...it should be completely user-centric, period. 

Someday...in 2016 or beyond, we will get there, please G-d. ;-)

(Source Photo: Andy Blumenthal)

June 15, 2015

Ex Machina Will Even Turn The Terminator

So this was a really cool display at the Movie theater yesterday...

They had this head of the Terminator in an enclosed case and roped off. 

Shiny metal alloy skull, bulging bright evil red eyes, and really grotesque yellowed teeth. 

This certainly gets the attention of passersby for the upcoming new movie, Terminator Genisys (coming out July 1). 

Anyway, the Terminator is the ugly dude, especially when compared with Ava, the robot/artificial intelligence in Ex Machina, which we saw yesterday. 

The Turing test is nothing for Ava!

She can not only fool humans as to her own humanity, but also outmaneuver them with her wit, sexuality, and a good dose of deceit and manipulation. 

Frankly, I think AI Ava could even turn the terrible Terminator to her side of things--my bet is that movie comes out in 2017. 

(Source Photo: Andy Blumenthal)

September 29, 2014

Talk To The Hand

So you know the saying "Talk to the hand, because the face ain't home..."?

Well IPSoft has an artificial intelligence agent called Amelia that handles service requests. 

Instead of talking to a human customer service rep, you get to talk to a computer. 

The question is whether Amelia is like talking to a hand, or whether someone is really home when using AI to adroitly address your service issues.

Now apparently, according to the Wall Street Journal, this computer is pretty smart and can ingest every single manual and prior service request and learn how to answer a myriad of questions from people. 

On one hand, maybe you'll get better technical knowledge and more consistent responses by talking to a computerized service representative.

But on the other hand, if the interactive voice response systems--with their dead-end menus of call options, endless mazes of "If you want to reach X, press Y now," and all the disconnects after you've already been on for 10 minutes--are any indication of what this will be like, I am leery to say the least. 

The Telegraph does say that Amelia can service customers in 20 languages and, after 2 months, can resolve 64% of "the most common queries" independently, so this is hopeful and maybe even inspiring of what is to come. 

These days, based on how much time we spend online in the virtual world, I think most people would actually prefer to talk to a knowledgeable computer than a smart alec human who doesn't want to be handling annoying customer calls all day, anyway. 

The key to whether Amelia and her computerized brothers and sisters of the future will be successful is not only how quickly they can find the correct answer to a problem, but also how well they can understand and address new issues that haven't necessarily come up the same way before, and how they handle the emotions of the customer on the line who wishes they didn't have the problem needing this call to begin with. ;-)

(Source Photo: here with attribution to Vernon Chen)

December 8, 2013

Amazon Delivery - By Crunk-Car, If You Like

Jeff Bezos of Amazon is one very smart guy, and when he announces that he is interested in drones delivering your next online order, that makes for a lot of grandstanding. 

But really, how is a dumb drone delivering an order of diapers or a book so exciting? 

Aside from putting a lot of delivery people at USPS, UPS, and FedEx out of work, what does the consumer get out of it? 

Honestly, I don't care if the delivery comes by Zike-Bike, Crunk-Car, Zumble-Zay, Bumble-Boat, or a Gazoom, as Dr. Seuss would say--I just care that it gets here fast, safely, and cheaply. 

Will a drone be able to accomplish those things? Likely--so great, send the drone over with my next order. But this doesn't represent the next big technological leap. 

It doesn't give us what the real world of robotics in the future is offering: artificial intelligence, natural language processing, augmentation of humans, or substitution by robots altogether--to do things stronger, faster, and more precisely, and perhaps even to provide companionship to people. 

Turning surveillance and attack drones into delivery agents is perhaps a nice gesture to make a weapon into an everyday service provider. 

And maybe the Octocopters will even help get products to customers within that holy grail, one-day timeframe that all the retailers are scrambling for.

It's certainly a great marketing tool--because it's got our attention and we're talking about it.

But I'll take a humanoid robot sporting a metallic smile that can actually interact with people, solve problems, and perform a multitude of useful everyday functions--whether a caregiver, a bodyguard, or even a virtual friend (e.g. Data from Star Trek)--over a moving thingamajig that Dr. Seuss foresaw for Marvin K. Mooney. ;-)

November 16, 2013

Web 1-2-3

The real cloud computing is not where we are today.

Utilizing infrastructure and apps on demand is only the beginning. 

What IBM has emerging--and what sets it above the other cloud providers--is the real deal: the Watson cognitive computing system.

In 2011, Watson beat the human champions of Jeopardy; today, according to CNBC, it is being put online with twice the power. 

Using computational linguistics and machine learning, Watson is becoming a virtual encyclopedia of human knowledge and that knowledge-base is growing by the day.

But moreover, that knowledge can be leveraged by cloud systems such as Watson to link troves of information together, process it to find hidden meanings and insights, make diagnoses, provide recommendations, and generally interact with humans.

Watson can read all medical research, up-to-date breakthroughs in science, or all financial reports and so on and process this to come up with information intelligence. 

In terms of cognitive computing, think of Apple's Siri--but Watson doesn't just tell you where the local pizza parlors are, it can tell you how to make a better pizza. 

In short, we are entering the 3rd generation of the Internet:

Web 1.0 was the read-only, Web-based Information Source. This includes all sorts of online information available anytime and anywhere--typically, organizational Webmasters publishing online content to the masses. 

Web 2.0 is the read-write, Participatory Web. This is all forms of social computing and very basic information analytics. Examples include: email, messaging, texting, blogs, twitter, wikis, crowdsourcing, online reviews, memes, and infographics.

Web 3.0 will be think-talk, Cognitive Computing. This incorporates artificial intelligence and natural language processing and interaction. Examples: Watson, or a good-natured HAL 9000.

In short, it's one thing to move data and processing to the cloud, but when we get to genuine artificial intelligence and natural interaction, we are at a whole new computing level. 

Soon we can usher in Kurzweil's Singularity with Watson leading the technology parade. ;-)

(Source Photo: Andy Blumenthal)

October 19, 2013

What If They Can Read Our Redactions?

The New Yorker has a fascinating article about technology advances being made to un-redact classified text from government documents. 

Typically, classified material is redacted from disclosed documents with black bars that are technologically "burnt" into the document.

With the black bars, you are not supposed to be able to see/read what is behind it because of the sensitivity of it. 

But what if our adversaries have the technology to un-redact or un-burn and autocomplete the words behind those black lines and see what it actually says underneath?

Our secrets would be exposed!  Our sensitive assets put in jeopardy!

Already a Columbia University professor is working on a Declassification Engine that uses machine learning and natural language processing to determine semantic patterns that could give the ability "to predict content of redacted text" based on the words and context around them. 

In this case, declassified information in the document is used in aggregate to "piece together" or uncover the material that is blacked out. 

In an earlier case, a doctoral candidate at Dublin City University in 2004 used "document-analysis technologies" to decrypt critical information related to 9/11. 

This was done by analyzing syntax and structure, estimating the size of the blacked-out word, and then using automation to run through dictionary words to see which would fit--along with another "dictionary-reading program" to filter the result set down to the likely missing word(s). 
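The length-plus-context approach described above can be sketched in a few lines of Python. This is purely an illustrative toy--none of the function names, the tiny dictionary, or the co-occurrence scores come from the actual research tools; they just show the two-step idea of filtering by word length and then ranking by surrounding context:

```python
# Hypothetical sketch of guessing a redacted word: filter a dictionary by
# the estimated character count of the redaction, then rank candidates by
# a simple co-occurrence score with the preceding word. All names and data
# here are illustrative, not from the actual declassification research.

def candidates_by_length(dictionary, redacted_len):
    """Step 1: keep only dictionary words matching the estimated length."""
    return [w for w in dictionary if len(w) == redacted_len]

def rank_by_context(candidates, left_word, cooccurrence):
    """Step 2: rank candidates by how often they follow the preceding
    word (a toy stand-in for the 'dictionary-reading program')."""
    scored = [(cooccurrence.get((left_word, w), 0), w) for w in candidates]
    return [w for score, w in sorted(scored, reverse=True) if score > 0]

# Toy example: "... flew from <REDACTED> airport", redaction ~6 characters
dictionary = ["london", "boston", "madrid", "banana", "wrench"]
cooccurrence = {("from", "london"): 12, ("from", "boston"): 9,
                ("from", "madrid"): 7}

cands = candidates_by_length(dictionary, 6)
ranked = rank_by_context(cands, "from", cooccurrence)
print(ranked)  # most plausible fills first; implausible words score 0 and drop out
```

A real system would of course use statistical language models over large corpora rather than a hand-built score table, but the shape of the attack--shrink the candidate set, then let context pick the winner--is the same.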

The point here is that, with the right technology, redacted text can be un-redacted. 

Will our adversaries (or even allies) soon be able to do this, or perhaps, someone out there has already cracked this nut and our secrets are revealed?

(Source Photo: here with attribution to Newspaper Club)

February 5, 2013

From Holocaust To Holograms


My father told me last week how my mom had awoken in the middle of night full of fearful, vivid memories of the Holocaust. 

In particular, she remembers when she was just a six year-old little girl, walking down the street in Germany, and suddenly the Nazi S.S. came up behind them and dragged her father off to the concentration camp, Buchenwald--leaving her alone, afraid, and crying on the street. And so started their personal tale of oppression, survival, and escape. 

Unfortunately, with an aging generation of Holocaust survivors--soon there won't be anyone to tell the stories of persecution and genocide for others to learn from.

In light of this, as you can imagine, I was very pleased to see the University of Southern California (USC) Institute for Creative Technologies (ICT) and the USC Shoah Foundation collaborating on a project called "New Dimensions In Testimony" to use technology to maintain the enduring lessons of the Holocaust into the future.

The project involves developing holograms of Holocaust survivors giving testimony about what happened to them and their families during this awful period of discrimination, oppression, torture, and mass murder.

ICT is using a technology called Light Stage that uses multiple high-fidelity cameras and lighting from more than 150 directions to capture 3-D holograms. 

There are some interesting videos about Light Stage (which has been used for many familiar movies from Superman to Spiderman, Avatar, and The Curious Case of Benjamin Button) at their Stage 5 and Stage 6 facilities. 

To make the holograms into a full exhibit, the survivors are interviewed and their testimony is combined with natural language processing, so people can come and learn in a conversational manner with the Holocaust survivor holograms. 

Mashable reports that these holograms may be used at the U.S. Holocaust Museum in Washington, D.C. where visitors will talk "face-to-face" with the survivors about their personal experiences--and we will be fortunate to hear it directly from them. ;-)

(Photo from USC ICT New Dimensions In Technology)


January 15, 2013

Challenging The Dunbar 150


Today, Facebook announced its new search tool called Graph Search for locating information on people, places, interests, photos, music, restaurants, and more. 

Graph Search is still in beta, so you have to sign up in Facebook to get on the waiting list to use it. 

But Facebook is throwing down the gauntlet to Google by using natural language queries--searching by just asking the question in plain language, like "my friends that like Rocky"--and up come those smart ladies and gents. 

But Graph Search is not just a challenge to Google, but to other social media tools and recommendation engines like Yelp and Foursquare, and even LinkedIn, which is now widely used for corporate recruiting. 

Graph Search uses the Bing search engine, and its secret sauce, according to CNN, is that it culls information from over 1 billion Facebook accounts, 24 billion photos, and 1 trillion connections--so there is an enormous and growing database to pull from. 

So while the average Facebook user has about 190 connections, some people have as many as 5,000, and like the now antiquated business card file or Rolodex, all the people in your social network can provide important opportunities to learn and share. And while, with six degrees of separation, in the aggregate none of us is too far removed from everyone else anyway, we can still only Graph Search people and content in our own network.

Interestingly enough, while Facebook rolls out Graph Search to try to capitalize on its treasure trove of personal data and seemingly infinite connections, Bloomberg BusinessWeek (10 January 2013) ran an article called "The Dunbar Number" about how the human brain can only handle up to "150 meaningful relationships."

Whether in hunter-gatherer clans, military units, corporate divisions, or an individual's network of family, friends, and colleagues--our brain "has limits," and 150 is it when it comes to substantial real-world or virtual relationships. Our brains have to process all the facets involved in social interactions, from working together against outside "predators" to guarding against "bullies and cheats" from within the network. 

According to Dunbar, digital technologies like the Internet and social media, while enabling people to grow their virtual Rolodexes, do not really increase our social relationships in the real meaning of the word. 

So with Graph Search, while you can mine your network for great talent, interesting places to visit, or restaurants to eat at, you are still fundamentally interacting with your core 150 when it comes to sharing the joys and challenges of everyday life. ;-)

(Source Photo: Andy Blumenthal)


April 3, 2012

Robot Firefighters To The Rescue


Meet Octavia, a new firefighting robot from the Navy's Laboratory for Autonomous Systems Research (LASR) in Washington, D.C.

Octavia and her brother Lucas are the latest in firefighting technology. 

These robots can hear commands, see through infrared cameras, identify patterns, and algorithmically make decisions on diverse information sets.

While the current prototypes move around like a Segway, future versions will be able to climb ladders and get around naval vessels.

It is pretty cool seeing this robot spray flame retardant to douse the fire, and you can imagine similar type robots shooting guns on the front line at our enemies.

Robots are going to play an increasingly important role in all sorts of jobs, and not only the repetitive ones where we put automatons, but also the dangerous situations (like the bomb disposal robots), where robots can get out in front and safeguard human lives.

While the technology is still not there yet--and the robot seems to need quite a bit of instruction and hand waving--you can still get a decent glimpse of what is to come.

Robots with artificial intelligence and natural language processing will be putting out those fires all by themselves...and then some. 

Imagine: a robot revolution is coming, and what we now call mobile computing is going to take on a whole new meaning, with robots on the go--autonomously capturing data, processing it, and acting on it.

I never did see an iPhone or iPad put out a fire, but Octavia and brother Lucas will--and in the not too distant future!


March 10, 2012

Robots, Coming to An Agency Near You Soon

There is an article today in the Wall Street Journal (10-11 March 2012) about how an Anybot robot attended a wedding party in Paris, dressed up as the man's 82-year-old mother, who logged on from her home in Las Vegas and, by proxy of the robot, moved and even danced around the party floor and conversed with guests--she was the hit of the party. 

While sort of humorous, this is also amazingly incredible--through robotics, IT, and telecommunications, we are able to close the gap in time and space and "be there," even from half a world away.

The QB Anybot robot is life size, rolls around on 2 wheels like a Segway, and has glowing blue eyes and a telescreen for a forehead on a long skinny cylindrical body that can be controlled remotely and costs only $9,700.

While this is the story of a robot "becoming the life of the party," I believe that we are at the cusp of when robots will be reporting for duty at our agencies and organizations. 
 
The function of robots in the workplace has been tested, with them performing everything from menial office tasks (like bringing the coffee and donuts) to actually representing people at meetings and around the office floor--not only keeping an electric eye on things, so to speak, but actually Skyping back and forth with the boss, for example. 

As robots become more dexterous, autonomous, and with better artificial intelligence, and abilities to communicate with natural language processing, we are going to see an explosion of these in the workplace--whether or not they end up looking like a Swiffer mop or something a little more iRobot-like. 

So while we are caught up in deficit-busting times and the calls for everything from "Cloud First" to "Share First" in order to consolidate, save, and shrink, maybe what we also need is a more balanced approach that takes into account not only efficiencies, but effectiveness through innovation in our workplaces--welcome to the party, Robots!

(Source Photo: Andy Blumenthal)
