
August 5, 2011

Facial Recognition Goes Mainstream


Facial recognition applications are no longer just for the military and law enforcement to identify hostiles or criminals; the technology is going mainstream.

The Wall Street Journal (5 August 2011) reports that from the bar scene to the television, and from vampire gaming to celebrity match-ups, facial recognition software is now part of our everyday technology mix.

Facial recognition is "at a tipping point where some of these face-recognition technologies are not just gimmicks, but are becoming useful." Moreover, the technology has become quite good: with "frontal face images, the error rate of rejecting a legitimate claim--when the face image and name match--decreased to 0.29% in 2010 from a rate of 79% in 1993."
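To see what that false-rejection rate means in practice, here is a toy verification loop in Python. It is purely illustrative--the embedding vectors and the threshold are made up, not any vendor's actual pipeline: a claimed identity is rejected when the probe face's embedding is not similar enough to the enrolled one, and the rejection rate is just the fraction of genuine attempts that fall below the threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def false_rejection_rate(genuine_pairs, threshold):
    """Fraction of legitimate (same-person) attempts rejected at this threshold."""
    rejected = sum(
        1 for enrolled, probe in genuine_pairs
        if cosine_similarity(enrolled, probe) < threshold
    )
    return rejected / len(genuine_pairs)

# Made-up embeddings: three genuine attempts by the same enrolled person.
genuine_pairs = [
    ([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]),  # clean frontal shot
    ([0.9, 0.1, 0.4], [0.85, 0.15, 0.38]),  # slight pose change
    ([0.9, 0.1, 0.4], [0.20, 0.90, 0.10]),  # badly captured image
]

# Only the bad capture falls below the threshold: 1 rejection out of 3.
print(false_rejection_rate(genuine_pairs, threshold=0.95))
```

Driving the error rate from 79% down to 0.29% amounts to the embeddings becoming so discriminative that the threshold can be set very strictly and still almost never reject a genuine frontal match.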

So here are some examples of how facial recognition is being used:

- SceneTap: Free app for iPhone and Droid "displays real time stats on the local bar scene...shows the number of people at the bar, the male-to-female ratio, and the average age of the patron"--all from facial recognition--this is not bad except for the bartender on a slow night.

- TVs with Viewdle: TV set-top boxes with facial recognition can "identify who is sitting in front of the TV then customize programming accordingly...displaying most recently watched or recorded shows"--can anyone say America's Got Talent!

- Third Eye: Facebook game that based on facial recognition identifies people as either vampires or slayers. Even without the app, I'd bet I'm one of the slayers :-)

- FaceR Celebrity: This iPhone app uses a picture and facial recognition software to determine which celebrities you most closely resemble. For me, it's Sylvester Stallone, all the way--I'm sure of it.
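Aggregate stats like SceneTap's can, at least in principle, be produced without identifying anyone: each detected face is reduced to anonymous attributes (estimated age, estimated sex), and only the summary is kept. A minimal sketch--the `Detection` type and all the values are hypothetical, not SceneTap's actual data model:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Detection:
    """One anonymized face detection: estimated attributes only, no identity."""
    age: int
    sex: str  # "M" or "F"

def bar_stats(detections):
    """SceneTap-style aggregate stats from anonymized face detections."""
    if not detections:
        return {"headcount": 0, "male_to_female": None, "avg_age": None}
    males = sum(1 for d in detections if d.sex == "M")
    females = len(detections) - males
    return {
        "headcount": len(detections),
        "male_to_female": males / females if females else None,
        "avg_age": mean(d.age for d in detections),
    }

crowd = [Detection(24, "M"), Detection(27, "F"),
         Detection(31, "M"), Detection(22, "F")]
print(bar_stats(crowd))
```

Note that nothing in this summary step requires storing the face images themselves--which is exactly the design question the privacy debate below turns on.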

A lot of people are concerned about the privacy implications of facial recognition--collecting and storing images of faces, using them for surveillance and tracking, and getting into your business...like knowing what bars, or wherever else, you are going to.

But apps like SceneTap say they don't collect personal information, nobody sees the video feed, and they don't match the images to photos on the web or Facebook to identify exactly who is entering the bar. This is sounding a little like TSA and the body imaging scanners they use--i.e. don't worry nobody sees your privates! :-)

But perhaps whether or not they do isn't the point--they could, and that is the privacy concern.

Even though facial recognition technology is used in gaming, it is not kid's play, and it should be regulated to avoid a society where an Internet "big brother" has virtually unlimited capability to track and match each and every face--including yours!

(Source Photo: here)


February 6, 2011

Apple: #1 Super Bowl Commercial Of All Time


Rated the #1 Super Bowl Commercial of all time, this advertisement was used by Apple to introduce its Macintosh computer in 1984 during Super Bowl XVIII.
Apple showed the world its understanding that:

- The "drone" nature of how we did business--"just follow the leader"--was not going to make us great.

- The other "blah"--not user-centric--technology offered by the "Big Brother(s)" of the time was seducing the masses into a blind morass--a kind of enslavement of our productive energies.
Apple was not, and is not, afraid to come out and break the paradigm, and that is what makes it a great company.
Innovate, innovate, innovate for a better future for mankind.
In life, there is always a choice between what is and what could be, and that's what drives our competitive juices.


March 31, 2010

Balancing Freedom and Security

There is a new vision for security technology that blends high-tech with behavioral psychology, so that we can seemingly read people’s minds as to their intentions to do harm or not.

There was a fascinating article (8 January 2010) by AP via Fox News called “Mind-Reading Systems Could Change Air Security.”

One Israeli-based company, WeCU (read: "we see you") Technologies, "projects images onto airport screens, such as symbols associated with a certain terrorist group or some other image only a would-be terrorist would recognize."

Then hidden cameras and sensors monitoring the airport pick up on human reactions such as "darting eyes, increased heartbeats, nervous twitches, faster breathing," or rising body temperature.

According to the article, a more subtle version of this technology, called Future Attribute Screening Technology (FAST), is being tested by the Department of Homeland Security—travelers can either be passively scanned as they walk through security or, when pulled aside for additional screening, be subjected to "a battery of tests, including scans of facial movements and pupil dilation, for signs of deception. Small platforms similar to balancing boards…would help detect fidgeting."

The new security technology combined with behavioral psychology aims to detect those who harbor ill will through the “display of involuntary physiological reactions that others—such as those stressed out for ordinary reasons, such as being late for a plane—don’t.”
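Conceptually, separating ordinary travel stress from extreme involuntary reactions is a thresholding problem. The sketch below--with invented population baselines and a made-up threshold, emphatically not FAST's real algorithm--scores how far a traveler's vitals deviate from the norm and flags only large combined deviations, so that someone merely late for a plane stays under the bar:

```python
import statistics

# Hypothetical population baselines for travelers: (mean, standard deviation).
BASELINES = {
    "heart_rate":  (75.0, 10.0),   # beats per minute
    "breath_rate": (14.0, 3.0),    # breaths per minute
    "body_temp":   (36.8, 0.3),    # degrees Celsius
}

def anomaly_score(readings):
    """Average absolute z-score of the readings against the baselines."""
    zs = []
    for signal, value in readings.items():
        mean, std = BASELINES[signal]
        zs.append(abs(value - mean) / std)
    return statistics.mean(zs)

def flag_for_screening(readings, threshold=2.0):
    """Flag only combined deviations well beyond ordinary travel stress."""
    return anomaly_score(readings) > threshold

# A merely stressed traveler vs. an extreme physiological outlier.
stressed = {"heart_rate": 88.0, "breath_rate": 16.0, "body_temp": 36.9}
extreme  = {"heart_rate": 125.0, "breath_rate": 26.0, "body_temp": 37.9}

print(flag_for_screening(stressed))  # False
print(flag_for_screening(extreme))   # True
```

Where exactly that threshold sits is the whole game: set it low and you generate the false positives and hardships listed below; set it high and you miss the people you built the system to catch.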

While the technology married to psychology is potentially a potent mix for detecting terrorists or criminals, there are various concerns about the trend with this, such as:

1) Becoming Big Brother—As we tighten up the monitoring of people, are we becoming an Orwellian society, where surveillance is ubiquitous?

2) Targeting “Precrimes”—Are we moving toward a future like the movie Minority Report, where people come under fire for just thinking about breaking the law?

3) Profiling—How do we protect against discriminatory profiling, but ensure reasonable scanning?

4) Hardships—Will additional security scanning, searches, and interrogations cause delays and inconvenience to travelers?

5) Privacy—At what point are we infringing on people’s privacy and being overly intrusive?

As a society, we are learning to balance the need for security with safeguarding our freedoms and fundamental rights. Certainly, we don’t want to trade our democratic ideals and the value we place on our core humanity for a totalitarian state with rigid social controls. Yet, at the same time, we want to live in peace and security, and must commit to stopping those with bad intentions from doing us harm.

The duality of security and freedom that we value and desire for ourselves and our children will no doubt arouse continued angst as we must balance the two. However, with high-technology solutions supported by sound behavioral psychology and maybe most importantly, good common sense, we can continue to advance our ability to live in a free and secure world—where “we have our cake and eat it too.”



August 2, 2008

Big Brother and Enterprise Architecture

When people work from home, should their employers simply set performance goals for them and then evaluate them on whether or not they met these, or should employers monitor employees’ work at home to ensure that employees are where they say they are and doing what they say they are doing?

The Wall Street Journal, 30 July 2008, reports that “companies are stepping up electronic monitoring and oversight of tens of thousands of home-based independent contractors.”

Home-based workers have been increasing steadily over the years, with over 16 million home-based workers now in the U.S. That is huge!

But work is not care-free for these home workers. They can’t be sitting around working in their underwear, watching YouTube, or playing Sudoku. Employers are more often monitoring their employees by “taking photos of workers’ computer screens at random, counting keystrokes and mouse clicks, and snapping photos of them at their computers.”

That’s the visual inspection going on; then there is the audio piece. Companies are “plying sophisticated technology to instantaneously detect anger, raised voices, or children crying in the background on workers’ home-office calls. Others are using Darwinian routing systems to keep calls coming so fast workers have no time to go to the bathroom.”

Is this big brother watching mentality too invasive or is it appropriate when we’re on the clock?

Well, even well-intentioned monitoring of home employees can certainly be taken to an extreme. One company, Arise-com, “keeps its 8,000 at-home agents so tightly tethered to their phones that they have to schedule unpaid time off to go to the bathroom.”

From an enterprise architecture perspective, I believe it’s important to consider not only the performance aspect to the organization in terms of productivity and cost-effectiveness of these workers, but also to look at it from a human-capital perspective with respect to treating the employees with trust, respect, and integrity.

I believe that people should be given the benefit of the doubt and treated kindly and humanely, not subjected to undue and invasive monitoring like being photographed on a webcam. Instead, let’s set ambitious but realistic performance goals for our employees. Most of the time, work-at-home employees end up exceeding performance expectations. For those that don’t meet their goals, additional monitoring is appropriate to further assess their performance and to decide whether the privilege of working at home should be continued.

Trust but verify. Let’s start off with a core dose of trust, but have the verify ready to go for those that abuse it.



January 25, 2008

Big Brother is Watching and Enterprise Architecture

The enterprise architecture for law enforcement and security in the next decade will be focused on using technology to identify the bad guys and stop them in their tracks.
ComputerWorld, 14 January 2008, reports that “Homeland Security is bankrolling futuristic technology to nab terrorists before they strike.”
Here’s the target architecture:
“The year is 2012 [probably a bit optimistic on the date]. As soon as you walk into the airport, the machines are watching. Are you a tourist—or a terrorist posing as one? As you answer a few questions at the security checkpoint, the systems begin sizing you up. An array of sensors—video, audio, laser, infrared—feeds a stream of real-time data about you to a computer that uses specially developed algorithms to spot suspicious people. The system interprets your gestures and facial expressions, analyzes your voice and virtually probes your body to determine your temperature, heart rate, respiration rate, and other physiological characteristics—all in an effort to determine whether you are trying to deceive. Fail the test, and you’ll be pulled aside for a more aggressive interrogation and searches.”
Last July, the Department of Homeland Security’s “human factors division asked researchers to develop technologies to support Project Hostile Intent, an initiative to build systems that automatically identify and analyze behaviors and physiological cues associated with deception.”

The intent is to use these screening technologies at airports, border crossings, as well as possibly in the private sector for building access control and candidate screening.
Sharla Rausch, director of DHS’s human factors division, says that “in controlled lab setting, accuracy rates are in the range of 78% to 81%.”
Where is the current research focused?
  1. Recognition of gestures and microfacial expressions
  2. Analysis of variations in speech (e.g., pitch, loudness)
  3. Measurement of physiological characteristics
The hope is that by combining all three modalities, “the overall predictive accuracy rate” will improve.
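The intuition that fusing modalities raises accuracy can be checked with a simple probability calculation: if three independent detectors are each about 80% accurate (the lab range quoted above), a majority vote among them is right almost 90% of the time. The independence assumption is generous--real modalities are correlated--so treat this as an upper-bound illustration, not a prediction about FAST itself:

```python
from itertools import product

def majority_vote_accuracy(accuracies):
    """Probability that a majority of independent detectors is correct."""
    total = 0.0
    # Enumerate every pattern of correct/incorrect calls across detectors.
    for outcomes in product([True, False], repeat=len(accuracies)):
        p = 1.0
        for correct, acc in zip(outcomes, accuracies):
            p *= acc if correct else (1.0 - acc)
        if sum(outcomes) > len(accuracies) / 2:
            total += p
    return total

# Three modalities, each at the article's ~80% lab accuracy.
print(round(majority_vote_accuracy([0.80, 0.80, 0.80]), 3))  # 0.896
```

The arithmetic behind that 0.896: the vote is right when all three agree correctly (0.8^3 = 0.512) or exactly two do (3 x 0.8^2 x 0.2 = 0.384). Correlated errors between modalities would pull the real figure back toward the single-detector 80%.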
What are some of the challenges with these technologies?
  1. Currently, too many false positives
  2. Existing technologies, like the polygraph, have “long been questioned by scientists…and remain inadmissible in court.”
  3. Ability of algorithms to “correctly interpret” suspicious behavior or cues
  4. Profiling is continuously objected to on discriminatory grounds
  5. Privacy concerns about the personal data collected
  6. Testing is limited by security concerns in the field
  7. Deployment will be limited due to cost, leaving soft targets potentially at risk
Will this Big Brother screening technology come to fruition?
Absolutely. The challenges with the technologies will be resolved, and putting aside the profiling and privacy issues, these screening technologies will become essential to protecting ourselves.
