Showing posts with label False Positives. Show all posts

June 6, 2018

Radio-Activity

So earlier in the week, I had a great opportunity to visit the NIST Center for Neutron Research (NCNR). 

It was fascinating to see the reactor, control room, and all the cool experiments--not things you see every day, right? 

For safety, we had to wear devices that measured radioactivity and also go through machines that checked us afterward. 

When one person in our group went through the scanner, it set off a red alert, and the poor individual obviously got really scared--like, OMG, is there contamination on me or something?

But they went through again and it turned out it was just a false positive, thank G-d. 

I guess these really can be dangerous substances to work around, but it's still marvelous how the scientists harness these neutron beams and direct them to all sorts of fascinating scientific experiments. 

Being around all this science makes me wonder--thinking aloud--whether, if I could do it all again, I would pursue an education in one of these amazing scientific disciplines and work in the lab like a "mad scientist"--exploring and discovering new things, figuring out the mysteries of the universe and how the world really works. 

What a fun, fun field to work in!  ;-)

(Source Photo: Andy Blumenthal and Art by 4th grader, Phillip Kenney)

January 31, 2015

You Can't Hide Your Feelings


You can try to hide your feelings, but it won't work...

Your emotions are now an open book to anyone with facial-recognition software, such as from Emotient, Affectiva, and Eyeris.

This video from Emotient shows Dr. Marian Bartlett demonstrating how the system picks up on her expressions of joy, sadness, surprise, anger, fear, disgust, and contempt. 

From broad displays of emotion, to subtle, spontaneous, natural displays, to fast, involuntary micro-expressions, the system detects and clearly displays them all. 

Described in the Wall Street Journal, the software, in real time, successfully uses "algorithms to analyze people's faces" and is based on the work of Dr. Paul Ekman, who pioneered the study of facial expressions, creating a catalog in the 1970s with "more than 5,000 muscle movements" linked to how they reveal your emotions. 

A single frame of a person's face can be used to extract 90,000 data points from "abstract patterns of light to tiny muscle movements, which get sorted by emotional categories."

With databases of billions of expressions from millions of faces in scores of countries around the world, the software works across ethnically diverse groups. 

Emotion-detection has a myriad of applications from national security surveillance and interrogation to in-store product marketing and generally gauging advertising effectiveness, to helping professionals from teachers to motivational speakers, executives, and even politicians hold people's attention and improve their messaging.

Then imagine very personal uses such as the software being used to evaluate job applicants or to tell if a spouse is lying about an affair...where does it end?

Of course, there are serious privacy issues in reading people's faces unbeknownst to or unwanted by them as well as possibilities for false positives, so that people's feelings are wrongly pegged or interpreted. 

In the end, unless you wear a physical mask or can spiritually transcend yourself above it all, we can see you and soon we will know not just what you are feeling, but also what you are thinking as well...it's coming. ;-)

April 21, 2012

Don't Throw Out The Pre-Crime With the Bathwater

The Atlantic (17 April 2012) has an article this week called "Homeland Security's 'Pre-Crime' Screening Will Never Work." 

The Atlantic mocks the Department of Homeland Security's (DHS) Future Attribute Screening Technology (FAST) for attempting to screen terrorists based on physiological and behavioral cues to analyze and detect people demonstrating abnormal or dangerous indicators.

The article calls this "pre-crime detection" similar to that in Tom Cruise's movie Minority Report, labels it a "super creepy invasion of privacy," and claims it provides "little to no marginal security" benefit.

They base this on a 70% success rate in the "first round of field tests" and on the "false-positive paradox," whereby there would be a large number of innocent false positives, and distinguishing these would be a "non-trivial and invasive task." 

However, I do not agree that they are correct for a number of reasons: 

1) Accuracy Rates Will Improve--the current accuracy rate is no predictor of future accuracy rates. With additional research and development and testing, there is no reason to believe that over time we cannot significantly improve the accuracy rates to screen for such common things as "elevated heart rate, eye movement, body temperature, facial patterns, and body language" to help us weed out friend from foe. 

2) False-Positives Can Be Managed--Just as in disease detection and medical diagnosis, there can be false positives, and we manage these by validating the results through repeating the tests or performing additional corroborating tests; so too with pre-crime screening, false positives can be managed with validation testing, such as interviews, matching against terrorist watch lists, biometric screening tools, scans and searches, and more. In other words, pre-crime detection through observable cues is only a single layer of a comprehensive, multilayer screening strategy.
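The base-rate arithmetic behind the "false-positive paradox"--and why a validation layer helps--can be sketched with Bayes' theorem. All the numbers below are hypothetical assumptions chosen for illustration (only the 70% figure comes from the article; the prevalence and false-positive rate are made up):

```python
# Illustrative base-rate arithmetic for the "false-positive paradox" and how
# a second, independent validation test manages it. Numbers are hypothetical.

def posterior(prior, sensitivity, false_positive_rate):
    """P(actual threat | positive screen), via Bayes' theorem."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

prior = 1 / 1_000_000   # assumed rarity of actual attackers among travelers
sensitivity = 0.70      # the article's "70% success rate"
fpr = 0.30              # assumed false-positive rate

p1 = posterior(prior, sensitivity, fpr)
print(f"After one screen:  {p1:.8f}")  # tiny -- most alarms are innocent people

# Validation layer: treat first-screen positives as the new prior and apply a
# second, independent test (interview, watch-list match, biometrics, etc.).
# Each independent screen multiplies the odds by sensitivity/fpr (~2.3x here).
p2 = posterior(p1, sensitivity, fpr)
print(f"After two screens: {p2:.8f}")
```

The point of the sketch: a single screen's positives are indeed dominated by false alarms (the paradox), but each additional independent layer multiplies the odds that a flagged person is a real threat, which is exactly the defense-in-depth argument.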

Contrary to The Atlantic's claim that pre-crime screening is "doomed from the word go by a preponderance of false-positives," terrorist screening is actually a vital and necessary part of a defense-in-depth strategy based on risk management principles. To secure the homeland with finite resources, we must continuously narrow in on the terrorist target by screening and refining results through validation testing, so that we can safeguard the nation while protecting the privacy and civil liberties of those who are not a threat to others. 

Additionally, The Atlantic questions whether subjects used in experimental screening can accurately mimic the cues that real terrorists would exhibit in the field. However, with the wealth of surveillance we have gathered of terrorists planning or conducting attacks, especially over the last decade in the wars in Iraq and Afghanistan, as well as reams of scientific study of the mind and body, we should be able to distinguish someone about to commit mass murder from someone simply visiting their grandmother in Miami. 

The Atlantic's position is that terrorist screening's "(possible) gain is not worth the cost." However, this is ridiculous, since the only alternative to pre-crime detection is post-crime analysis--where rather than trying to prevent terrorist attacks, we let the terrorists commit their deadly deeds and clean up the mess afterwards. 

In an age when terrorists will stop at nothing to hit their targets and hit them hard, and when shoe and underwear bombs are serious threats rather than late-night comedy, we must invest in technology tools like pre-crime screening to help identify those who would do us harm, and continuously work to filter them out before they attack. 

(Source Photo: here with attribution to Dan and Eric Sweeney)

Share/Save/Bookmark