Showing posts with label Profiling. Show all posts

August 27, 2019

Selling By Customer Stereotypes

Saw this displayed on the wall inside a Free People clothing store...

It categorizes their female shoppers into 4 types:

1. Candy (hearts): Sweet, girly, flirty, whimsy, and femme  

2. Ginger (cherries): Sexy, confident, edgy, attitude, and mysterious

3. Lou (baseball): Cool, tomboy, laid back, tough, minimal

4. Meadow (sunshine): Flowy, bohemian, embellished, pattern, worldly

So this is how they stereotype their customers "to be helpful"?

It's also interesting that they don't see that people can be complex, with multiple traits that cross categories, or that fit into no category at all.

Moreover, people can have different sides to themselves and reflect these in different situations. 

Perhaps in an effort to market and sell more, what they've done is reduce people to these lowest-common-denominator, idiot categories.

And what makes this worse yet is that it seems to be based on just a snap judgment of how someone looks coming into the store, and all the biases that entails.

How about we look at people with a little more sophistication than this and treat them as individuals, with real personalities, and not just as another empty label?  ;-)

(Credit Photo: Andy Blumenthal)

August 15, 2018

Floppy Disk Earrings

So this was an interesting technology fashion statement.

This lady in Washington, D.C. has earrings that are floppy disks. 

One full diskette on each ear!

I guess not only can she wear them, but she can plug them into her computer at work and save or transfer files (that is, if you can still find a computer that actually uses these). 

It makes you think, though, from a cybersecurity perspective: what other devices can people "wear" to work and use for good or malicious purposes? 

Another scary thought came to mind: suicide/homicide bombers strap vests with explosives to their bodies too--do terrorists also adhere to a certain "style," even for murdering people? 

Anyway, fashion can be almost anything apparently...if you can find a way to put it on your body. ;-)

(Source Photo: Dannielle Blumenthal)

December 24, 2013

To Archive Or Not

Farhad Manjoo had a good piece in the Wall Street Journal on the Forever Internet vs. the Erasable Internet.

The question he raises is whether items on the Internet should be archived indefinitely or whether we should be able to delete postings. 

Manjoo uses the example of Snapchat, where messages and photos disappear a few seconds after the recipient opens them--a self-destruct feature.

It reminded me of Mission Impossible, where each episode started with the tape recording of the next mission's instructions that would then self-destruct in five seconds...whoosh, gone. 

I remember seeing a demo years ago of an enterprise product that did this for email messages--where you could lock down or limit the capability to print, share, screenshot, or otherwise retain messages that you sent to others. 
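The self-destruct idea behind products like these can be sketched in a few lines. Everything below is illustrative (the demo product isn't named in this post, and the EphemeralMessage class is made up), assuming the message's lifetime starts when the recipient first opens it:

```python
import time
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EphemeralMessage:
    body: str
    ttl_seconds: float                     # lifetime after the first open
    _opened_at: Optional[float] = field(default=None, repr=False)

    def read(self) -> Optional[str]:
        """Return the body, or None once the message has expired."""
        now = time.monotonic()
        if self._opened_at is None:
            self._opened_at = now          # the clock starts on first open
        if now - self._opened_at > self.ttl_seconds:
            self.body = ""                 # "self-destruct": drop the content
            return None
        return self.body

msg = EphemeralMessage("what I really think...", ttl_seconds=5.0)
print(msg.read())   # readable within the window; None (and gone) after it
```

The hard part, of course, isn't the timer--it's preventing copies, which is why such products also have to lock down printing, forwarding, and screenshots on the recipient's side.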

It seemed like a pretty cool feature in that you could communicate what you really thought about something--instead of an antiseptic version--without being in constant fear that it would be used against you by some unknown individual at some future date. 

I thought, wow, if we had this in our organizations, perhaps we could get more honest ideas, discussion, vetting, and better decision making if we just let people genuinely speak their minds. 

Isn't that what the First Amendment is really all about--"speaking truth to power" (of course, with appropriate limits--you can't just provoke violence, incite illegal actions, or damage or defame others)?

Perhaps not everything we say or do needs to be kept for eternity--even though both public- and private-sector organizations benefit from using this data for "big data" analytics, for everything from marketing to national security. 

Like Manjoo points out, keeping each and every utterance, photo, video, and audio clip creates a situation where you have to "constantly police yourself, to create a single, stultifying profile that restricts spontaneous self-expression."

While on one hand it is good to think twice before you speak or post--so that you act with decency and civility--on the other hand, it is also good to be free to be yourself and not a virtual fake, online and in the office. 

Some things are worth keeping--official records of people, places, things, and events--especially those of operational, legal or historical significance and even those of sentimental value--and these should be archived and preserved in a time appropriate way so that we can reference, study, and learn from them for their useful lives. 

But not everything is records-worthy, and we should be able to decide--within common-sense guidelines for records management, privacy, and security--what we save and what we discard, online and off. 

Some people are hoarders and others are neat freaks, but the point is that we have a choice--we have freedom to decide whether to put that old pair of sneakers in a cardboard box in the garage, trash it, or donate it. 

Overall, to sum it up with the photo of the vault boxes in this post: there is no need to store your umbrella there--it isn't raining indoors. ;-)

(Source Photo: here with attribution to Spinster Cardigan)

December 10, 2012

I'm Looking At You Looking At Me Looking At You

Almax, the Italian maker of mannequins, has a new high-tech version that does more than stand around and look pretty.

The EyeSee Mannequin has a camera built into its eye that watches you while you shop. 

According to Bloomberg Businessweek (6 December 2012), the EyeSee Mannequin sells for about $5,130 and conducts consumer profiling--using technology designed to identify criminals, it determines your age, gender, and race and tracks your shopping patterns. 

Newer versions of EyeSee will likely have a sensor for hearing you as well, so it can "eavesdrop on what shoppers say about the mannequin's attire."

Next to these mannequins, you have to consider who are the real dummies, when everything you do and say can be monitored. 

Next time you're peering at that mannequin, be careful--it may be peering right back at you--and when it says something, be ready to jump. ;-)

(Source Photo: Andy Blumenthal)


July 6, 2012

The Information Is On You

There was a fascinating article in the New York Times (17 June 2012) called "A Data Giant Is Mapping and Sharing the Consumer Genome."

It is about a company called Acxiom--with revenues of $1.13 billion--that develops marketing solutions for other companies based on its enormous data collection of everything about you!
 
Acxiom has more than 23,000 servers "collecting, collating, and analyzing consumer data...[and] they have amassed the world's largest commercial database on consumers."

Their "surveillance engine" and database on you is so large that:

- They "process more than 50 trillion data 'transactions' a year."
- Their "database contains information about 500 million active consumers."
- They hold "about 1,500 data points per person."
- They have been collecting data for 40 years!
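Taking the article's figures at face value, a quick back-of-the-envelope calculation (mine, not the article's) shows just how big those numbers are:

```python
# Scale of the database, using the figures quoted in the article
active_consumers  = 500_000_000          # 500 million active consumers
points_per_person = 1_500                # about 1,500 data points per person
transactions_year = 50_000_000_000_000   # 50 trillion data "transactions" a year

total_points = active_consumers * points_per_person
per_second   = transactions_year / (365 * 24 * 3600)

print(f"{total_points:,} data points overall")        # 750,000,000,000
print(f"~{per_second:,.0f} transactions per second")  # ~1,585,490
```

That is three-quarters of a trillion data points on consumers alone, and over 1.5 million transactions every second, around the clock.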

Acxiom is the slayer of the consumer big data dragon--doing large-scale data mining and analytics using publicly available information and consumer surveys.

They collect data on demographics, socio-economics, lifestyle, and buying habits and they integrate all this data.

Acxiom generates direct marketing solutions and predictive consumer-behavior information.

They work with 47 of the Fortune 100, as well as with the government after 9/11.

There are many concerns raised by both the size and scope of this activity.
 
First, as to the information itself, there are concerns about its:

- Privacy
- Security

Second, regarding the consumer, there are concerns about potential: 

- Profiling
- Espionage
- Stalking
- Manipulation 

Therefore, the challenge of big data is a double-edged sword: 

- On one hand, we have the desire for data intelligence to make sense of all the data out there and use it to maximum effect.
- On the other hand, we have serious concerns about privacy, security, and the potential abuse of power that the information enables. 

How we harness the power of information to help society, but not hurt people is one of the biggest challenges of our time. 

This will be an ongoing tug of war between the opposing camps until, hopefully, the pendulum settles in the healthy middle--our collective information sweet spot. 

(Source Photo: Andy Blumenthal)



March 31, 2010

Balancing Freedom and Security

There is a new vision for security technology that blends high-tech with behavioral psychology, so that we can seemingly read people’s minds as to their intentions to do harm or not.

There was a fascinating article (8 January 2010) by AP via Fox News called “Mind-Reading Systems Could Change Air Security.”

One Israeli-based company, WeCU (read as "we see you") Technologies, "projects images onto airport screens, such as symbols associated with a certain terrorist group or some other image only a would-be terrorist would recognize."

Then hidden cameras and sensors monitoring the airport pick up on human reactions such as "darting eyes, increased heartbeats, nervous twitches, faster breathing," or rising body temperature.

According to the article, a more subtle version of this technology, called Future Attribute Screening Technology (FAST), is being tested by the Department of Homeland Security--travelers can either be passively scanned as they walk through security or, when pulled aside for additional screening, be subjected to "a battery of tests, including scans of facial movements and pupil dilation, for signs of deception. Small platforms similar to balancing boards…would help detect fidgeting."

The new security technology combined with behavioral psychology aims to detect those who harbor ill will through the “display of involuntary physiological reactions that others—such as those stressed out for ordinary reasons, such as being late for a plane—don’t.”
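The distinction the article draws--involuntary reactions that ordinary stressed-out travelers don't show--is, at heart, an anomaly-detection problem. Here is a toy illustration of the idea (entirely my own sketch; it reflects nothing about how WeCU or FAST actually work), flagging a reading that deviates sharply from the crowd's baseline:

```python
from statistics import mean, stdev

def flag_outliers(readings, threshold=3.0):
    """Return indices of readings far from the group baseline,
    measured in standard deviations (a simple z-score test)."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if abs(r - mu) / sigma > threshold]

# Heart rates for a line of travelers, one sharply elevated:
heart_rates = [72, 75, 70, 74, 71, 73, 69, 118, 72, 74]
print(flag_outliers(heart_rates, threshold=2.0))  # [7]
```

The catch is that a single elevated cue can't distinguish a terrorist from someone sprinting to make a connection--which is exactly why false positives are such a concern.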

While the technology married to psychology is potentially a potent mix for detecting terrorists or criminals, there are various concerns about this trend, such as:

1) Becoming Big Brother—As we tighten up the monitoring of people, are we becoming an Orwellian society, where surveillance is ubiquitous?

2) Targeting “Precrimes”—Are we moving toward a future like the movie Minority Report, where people come under fire for just thinking about breaking the law?

3) Profiling—How do we protect against discriminatory profiling, but ensure reasonable scanning?

4) Hardships—Will additional security scanning, searches, and interrogations cause delays and inconvenience to travelers?

5) Privacy—At what point are we infringing on people’s privacy and being overly intrusive?

As a society, we are learning to balance the need for security with safeguarding our freedoms and fundamental rights. Certainly, we don’t want to trade our democratic ideals and the value we place on our core humanity for a totalitarian state with rigid social controls. Yet, at the same time, we want to live in peace and security, and must commit to stopping those with bad intentions from doing us harm.

The duality of security and freedom that we value and desire for ourselves and our children will no doubt arouse continued angst as we must balance the two. However, with high-technology solutions supported by sound behavioral psychology and maybe most importantly, good common sense, we can continue to advance our ability to live in a free and secure world—where “we have our cake and eat it too.”



January 25, 2008

Big Brother is Watching and Enterprise Architecture

The enterprise architecture for law enforcement and security in the next decade will be focused on using technology to identify the bad guys and stop them in their tracks.

ComputerWorld, 14 January 2008, reports that “Homeland Security is bankrolling futuristic technology to nab terrorists before they strike.”

Here’s the target architecture:

“The year is 2012 [probably a bit optimistic on the date]. As soon as you walk into the airport, the machines are watching. Are you a tourist—or a terrorist posing as one? As you answer a few questions at the security checkpoint, the systems begin sizing you up. An array of sensors—video, audio, laser, infrared—feeds a stream of real-time data about you to a computer that uses specially developed algorithms to spot suspicious people. The system interprets your gestures and facial expressions, analyzes your voice and virtually probes your body to determine your temperature, heart rate, respiration rate, and other physiological characteristics—all in an effort to determine whether you are trying to deceive. Fail the test, and you’ll be pulled aside for a more aggressive interrogation and searches.”

Last July, the Department of Homeland Security’s human factors division “asked researchers to develop technologies to support Project Hostile Intent, an initiative to build systems that automatically identify and analyze behaviors and physiological cues associated with deception.”

The intent is to use these screening technologies at airports and border crossings, as well as possibly in the private sector for building access control and candidate screening.

Sharla Rausch, director of DHS’s human factors division, says that “in controlled lab setting, accuracy rates are in the range of 78% to 81%.”
Where is the current research focused?
  1. Recognition of gestures and microfacial expressions
  2. Analysis of variations in speech (i.e. pitch, loudness)
  3. Measurement of physiological characteristics
The hope is that by combining all three modalities, “the overall predictive accuracy rate” will improve.
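The hoped-for gain from combining modalities can be shown with a toy model. This is a generic naive-Bayes-style fusion in log-odds space--not DHS's actual algorithm--and it assumes the cues are independent (which gestures, speech, and physiology are not); the inputs echo the 78% to 81% single-modality accuracy quoted above:

```python
import math

def fuse(probabilities):
    """Combine independent detector probabilities in log-odds space."""
    log_odds = sum(math.log(p / (1 - p)) for p in probabilities)
    return 1 / (1 + math.exp(-log_odds))

# Three modest per-modality scores fuse into a much stronger one:
print(fuse([0.78, 0.79, 0.81]))  # ≈ 0.98
```

Under the (unrealistic) independence assumption, three weak detectors combine into one strong signal; with correlated cues, the real improvement would be smaller.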
What are some of the challenges with these technologies?
  1. Currently, too many false positives
  2. Existing technologies, like the polygraph, have “long been questioned by scientists…and remain inadmissible in court.”
  3. Ability of algorithms to “correctly interpret” suspicious behavior or cues
  4. Profiling is continuously objected to on discriminatory grounds
  5. Privacy concerns about the personal data collected
  6. Testing is limited by security concerns in the field
  7. Deployment will be limited due to cost, leaving soft targets potentially at risk
Will this Big Brother screening technology come to fruition?
Absolutely. The challenges with the technologies will be resolved, and putting aside the profiling and privacy issues, these screening technologies will become essential to protecting ourselves.
