Sept 7, 2012
Anonymous recently released a video advising people how to avoid being scanned by facial recognition technology; the advice boils down to ‘tilt your head at an angle’ or ‘wear a specially made cap’. They did not go so far as to tell people how to actually make such a cap, nor did they point out that the technology has been developed to capture faces even when they are at an angle.
The powers that be are highly aware that not all subjects are compliant, or even aware, that their facial features are being ‘captured’, so they have developed the capacity to identify people even when their faces are not in the ‘perfect’ position.
A recent report notes rapid advances in the accuracy of face trackers: they have become highly tolerant of different angles and poses, of the time that has elapsed between photos (e.g. ageing differences), and of lower-resolution photos. In other words, the technology has vastly improved, producing far fewer false alarms, and it just keeps getting better. People can even be identified from a 4 x 4 pixel array of their face, although the analysis of just one iris in the eye has been shown to be even more effective than face recognition.
Apparently, the iris has more unique information than any other single organ in the body, providing robust identification potential, second only to DNA. Iris recognition technology can be used to determine ethnicity, is considered to be ideal for situations involving uncooperative individuals, and has been successfully deployed at London’s Gatwick Airport.
BIMA (the Biometrics Identity Management Agency) is funding research at Carnegie Mellon University to develop a method of capturing iris biometrics from uncooperative subjects; researchers can now identify people up to 12 metres away. No special equipment is needed either; the researchers used commercial, off-the-shelf photographic equipment to capture iris biometrics, and the approach worked with both stationary and moving subjects. Cameras can then be mounted on roving vehicles, scanning and recording irises for what the military call ‘tactical non-cooperative biometrics’.
In addition to this huge increase in capabilities, those in the Identity Management (IdM) industry are advising the implementation of multi-modal biometrics; in other words, to make up for the fact that no method is always 100% accurate, the plan is to capture several types of biometric data for citizen IdM. The type required will depend on the nature of the transaction, and the context in which it is taking place – virtual authentication may require stronger credentials than when a person is physically present, for instance.
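How a multi-modal scheme might combine its readings can be sketched as simple score-level fusion. Everything below – the modality names, the weights, and the acceptance threshold – is an illustrative assumption, not a description of any deployed IdM system:

```python
# Score-level fusion of several biometric matchers (a sketch, not a real
# system): each matcher returns a similarity score in [0, 1], and a weighted
# average decides accept/reject. Weights and threshold are invented here.

WEIGHTS = {"face": 0.3, "iris": 0.5, "voice": 0.2}  # iris trusted most

def fuse_scores(scores, weights):
    """Weighted average of the match scores for whichever modalities were captured."""
    total = sum(w for m, w in weights.items() if m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

def authenticate(scores, threshold=0.7):
    """Context could raise the threshold, e.g. for remote/virtual transactions."""
    return fuse_scores(scores, WEIGHTS) >= threshold
```

Note how a strong iris match can carry a mediocre face match past the threshold, which is the point of going multi-modal: no single method needs to be 100% accurate.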
Yet while identity management calls for the use of biometrics to protect against identity theft, the trouble is that many biometric applications can be ‘spoofed’, meaning that ID theft becomes a whole new problem, left unsolved by the current range of biometric applications. Fingerprints, voice prints, and the like can all be ‘stolen’ and used by an impersonator. Just think how you leave your fingerprints all over the place, and how many photos of yourself are on the Internet. Research shows how easy it is for almost anyone to be identified from a photo, using simple Internet data-mining techniques and easily available facial recognition software.
In other words, our faces have become our identities, and there is little hope of remaining anonymous in a world where billions of photographs are taken and posted online every month.
Proving you’re not a dog is fast becoming simply not enough. You’re going to have to prove you’re alive and present at the time of authentication. So you have to prove you’re a human being, that you are who you say you are, and that you are indeed alive and well. There are a number of applications which are able to do this, and they embody the most recent, and still developing, field of cognitive biometrics. This refers to testing and measuring your affective state; i.e. emotions and behaviour, psychological profile, and ‘vital signs’. The way you respond to certain stimuli says a lot about who you are. It can be read as a unique biological signature, which is measurable and universal in application, unlike other biometrics which do not account for human differences: not everyone can walk, not everyone has fingers, etc. The greatest advances in the field of cognitive biometrics have been made in the pursuit of covert biometrics, which are being developed to identify ‘uncooperative’ individuals. Similar information was sought by DARPA in the Future Attribute Screening Technology (FAST) program, which was said to be looking for cues of ‘mal-intent’: pre-crime detection, in other words.
These techniques are more insidious when they can be done at a distance, without the need for contact with the individual, so the biometric data can be taken without the subject’s knowledge. This also makes them all the more terrifying for the largely compliant and innocent, would-be private, global population.
It also turns out we are scattering our biological signatures all over the place: fingerprints and palm prints, voice recordings, digital photos, and videos showing our ears, irises, tattoos, mannerisms, and the way we walk and move about.
Even the way we smell can be used to identify us. And of course, there’s our DNA.
I don’t want someone to have a record of me. (I know they cannot steal my soul, but they are trying to steal my sovereignty!) So the ones that bother me most, the ones that Anonymous strangely fail to mention, are the ones that can be done at a distance. Whilst identification from DNA involves contact and takes around 90 minutes to get results, the features discussed below can be tracked from a distance, covertly.
A European Union Working Party on data protection recently released an official Opinion on developments in biometric technologies, in which it was stated,
Biometric technologies that once needed significant financial or computational resources have become dramatically cheaper and faster. The use of fingerprint readers is now commonplace. For example, some laptops include a fingerprint reader for biometric access control. Advances in DNA analysis mean that results are now available within a few minutes. Some of the newly developed technologies such as vein pattern recognition or facial recognition are already developed to maturity. Their use in various places of our everyday life is just around the corner… Every individual is likely to be enrolled in one or several biometric systems. (my italics)
It turns out that people are even supplying personal information on the way they move via the accelerometer which is now a common feature of many smartphones and games consoles; a profile of your gait can then be used to identify you in video sequences. A recent study monitored subjects with an Android smartphone in their pocket, and found,
With a reasonably sized dataset (36 subjects), we show preliminary results indicating that not only can smartphones be used to identify a person based on their normal gait but also that there is potential to match gait patterns across different speeds.
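The pipeline behind such studies can be sketched very simply: reduce the stream of accelerometer readings to a small feature vector and compare it against an enrolled template. The features and tolerance below are illustrative assumptions; real gait systems segment individual gait cycles and use far richer descriptors:

```python
import math
from statistics import mean, stdev

def gait_features(samples):
    """samples: list of (x, y, z) accelerations -> small summary feature vector."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    m = mean(mags)
    # mean-crossings per sample as a crude cadence (step-rhythm) proxy
    crossings = sum(1 for a, b in zip(mags, mags[1:]) if (a - m) * (b - m) < 0)
    return [m, stdev(mags), crossings / len(mags)]

def distance(f1, f2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

def same_walker(samples, template, tol=0.5):
    """Match if feature distance falls under an (assumed) tolerance."""
    return distance(gait_features(samples), template) < tol
```

Enrollment would simply store `template = gait_features(enrollment_samples)`; thereafter any phone in your pocket can quietly keep checking whether the walk still matches.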
Freaky huh? Well, that’s not all … your smartphone, your games equipment, and the stuff you put on the Internet can all be used to log the metrics of your unique physiological and emotional characteristics, which can be used to build your identity profile and serve as a ‘scientifically acceptable’ means of biometric authentication. Just like Pavlov’s dogs, we are to be assessed according to the behaviourist’s favourite ‘stimulus-response paradigm’, and virtual reality is the vehicle. Data that was once collected with wired devices in psychologists’ offices can now be supplied by games equipment employing ‘haptic’ technology; this adds to the sense of reality in the game, as it allows the user to experience the sense of touch, with feedback such as ‘feeling’ the recoil of a gun as it is fired.
Biometrics experts are looking for ways to make this data a valid form of identity authentication. A recent study claims,
Haptics can be seen as a mechanism to extract behavioral features that characterize a biometric profile for an identity authentication process. Generally, the haptic data captured during an individual interaction are very large (measured every few milliseconds) and with a high number of attributes (position, velocity, force, angular orientation of the end-effector and torque data, among others). Therefore, the behavioral haptic data that describe users are defined in terms of a large number of features, which adds complexity to the analysis.
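The ‘large number of features’ problem the study describes is commonly tamed by collapsing each high-rate channel into a few summary statistics, yielding one fixed-length behavioural profile per session. The sketch below is an illustrative assumption, not the study’s actual method; the channel names and sample values are invented:

```python
from statistics import mean, stdev

def profile(session):
    """session: dict of channel name -> list of raw millisecond samples.
    Returns a flat feature dict: three summary statistics per channel."""
    features = {}
    for channel, samples in session.items():
        features[channel + "_mean"] = mean(samples)
        features[channel + "_std"] = stdev(samples)
        features[channel + "_range"] = max(samples) - min(samples)
    return features

session = {
    "force": [0.8, 1.1, 0.9, 1.3, 1.0],          # e.g. stylus/controller pressure
    "velocity": [12.0, 15.5, 11.2, 14.8, 13.0],  # e.g. end-effector speed
}
vec = profile(session)  # six features from two channels
```

Two channels of raw samples become a six-number vector; the real attribute list in the quote (position, velocity, force, angular orientation, torque) would simply add more channels to the same scheme.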
‘Affective’ haptic technology can also be used to feed back information regarding the player’s emotional reactions. This enriches the identity profile considerably, as well as providing statistical data on the user’s unique characteristics which can also be used for authentication in IdM. The more games incorporate ‘readings’ from the player, the more private data (about our physical and mental health) we give away, contributing to a future where IdM could become a whole new ballgame:
… the wealth of stimuli suitable for cognitive biometrics provides a wealth of authentication schemes – game playing, listening to music, short video clips, as well as more traditional behavioural biometric approaches provide virtually an infinite amount of input stimuli for use as an authentication scheme. This holds for both static and continuous authentication modes – though the latter provides many more opportunities to validate the user under a wide variety of stimulus challenges. This is one of the major advantages of the cognitive approach compared to anatomical biometrics such as finger prints and retinal scans. Further, this approach may suit more closely future person–computer interaction schema that may attempt to minimise traditional input devices such as keyboards and mice. As Julia Thorpe and colleagues proposed in 2005 – authenticating with our minds might be a reality in the near future – and certainly emotion based interactive gaming is already here (Thorpe et al., 2005).
In light of these capabilities, the readings which could be acquired from Guardian Angels take on a whole new meaning. In fact, they would embody the ideal version of what is being dubbed ‘continuous and unobtrusive authentication’ – by remotely monitoring brain and heart signatures from electroencephalogram (EEG) and electrocardiogram (ECG) readings, taken, for instance, from sensors in a cap, and a shirt. The European Council have helped fund research in this area, with the result that a company called Starlab has developed a product called ENOBIO which takes recordings of EEG and ECG from an individual wearing one sensor on the wrist, and one on the earlobe. This means the individual is constantly sending out a signal of who they are, something said to be important in high security areas, which are “safety critical”, such as transportation, laboratories, airports, etc.
DARPA has begun a four year research program called ‘Active Authentication’, which will focus on methods to validate identity online using cognitive biometrics, such as computers which ‘recognise’ the operator by assessing their habitual movements, and comparing it to their stored profile to validate identity in real time.
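One concrete form such real-time validation could take is a rolling comparison of input rhythm against an enrolled profile. The sketch below is an illustrative assumption, not DARPA’s method: it watches inter-keystroke intervals and flags the session when the recent average drifts too far from the enrolled one:

```python
from collections import deque
from statistics import mean

class ContinuousAuthenticator:
    """Toy continuous authentication on typing rhythm (assumed thresholds)."""

    def __init__(self, enrolled_mean, window=10, tolerance=0.4):
        self.enrolled = enrolled_mean      # seconds, from an enrollment phase
        self.recent = deque(maxlen=window) # sliding window of observed intervals
        self.tolerance = tolerance         # assumed acceptable relative drift

    def observe(self, interval):
        """Feed one inter-keystroke interval; return True while still trusted."""
        self.recent.append(interval)
        drift = abs(mean(self.recent) - self.enrolled) / self.enrolled
        return drift <= self.tolerance
```

An operator enrolled at a 0.2-second rhythm keeps validating while they type normally; hand the keyboard to someone who types at half the speed and, once the window fills with their intervals, the session is flagged.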
Late last year, DARPA announced another research project called ‘Human Identification at a Distance’, which includes developing techniques to covertly collect reliable heart biometrics to verify identity without any contact with the person being tracked. Wired.com announced,
“In 2006, Darpa developed Radar Scope, which used radar waves to sense through walls and detect the movements associated with respiration. A year later, the Army invested in LifeReader, a system using Doppler radar to find heartbeats. More recently, the military’s been using devices like the AN/PPS-26 STTW (“Sense Through the Wall”) and TiaLinx’s Eagle scanner, which can sense the presence of humans and animals through walls.
Handy though these gadgets may be, Darpa wants to one-up them with some new and better capabilities.
First off, Darpa wants its biometric device to be able to work from farther away. Right now, it says the accuracy of most systems taps out at around eight meters. And while some see-through devices can see through up to eight inches of concrete, they don’t do as well in locations with more or thicker walls. So Darpa’s looking for the next system to push that range past 10 meters, particularly in cluttered urban areas.”
However, whilst the technology exists to detect the heartbeat, the ability to take reliable biometric readings without contact is still being worked on. Although readings taken from sensors placed on the body are successful at proving identity, no research has fully overcome the problems brought about by distance. What’s really getting in the way is environmental ‘noise’ creating interference with the signal being received, meaning that detection of heartbeat biometrics is highly context-dependent.
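To see why noise is the sticking point, consider one simple way a heart rate can be pulled from a periodic signal: find the lag at which the signal best matches itself (autocorrelation). The sample rate, rate bounds, and test frequency below are illustrative assumptions; the point is that heavy broadband noise flattens the autocorrelation peak and degrades the estimate:

```python
import math

FS = 50  # samples per second (assumed sensor rate)

def autocorr_peak_lag(signal, min_lag, max_lag):
    """Lag (in samples) at which the signal is most self-similar."""
    m = sum(signal) / len(signal)
    x = [s - m for s in signal]          # remove the DC offset
    def r(lag):
        return sum(a * b for a, b in zip(x, x[lag:]))
    return max(range(min_lag, max_lag + 1), key=r)

def heart_rate_bpm(signal):
    # search lags corresponding to a plausible 40-180 bpm range
    lag = autocorr_peak_lag(signal, min_lag=FS * 60 // 180, max_lag=FS * 60 // 40)
    return 60.0 * FS / lag

# clean 1.2 Hz (72 bpm) stand-in for a 'heartbeat' component
clean = [math.sin(2 * math.pi * 1.2 * i / FS) for i in range(FS * 10)]
```

Feeding it the clean test signal recovers a rate close to 72 bpm; add enough environmental noise and the peak (and with it the estimate) deteriorates, which is exactly the context-dependence described above.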
The best results are achieved by using sensors in contact with the body, with willing subjects. These include ECG, pulse oximetry, and blood pressure. The pursuit of non-contact biometric acquisition continues, however, since it represents the ultimate aim of criminal detection. Heart sounds can be acquired acoustically with success, but methods utilising radar and laser Doppler vibrometry are less effective if the subject is moving. Motion imagery, looking at skin colour fluctuations, is an experimental technology. A recent study revealed just how long-range the technology is for reading vital signs:
Researchers at Georgia Technology Research Institute (GTRI) used an active radar to note the changes (in) heart volume over time (Greneker, 1997; Geisheimer and Greneker III, 1999). The physical deformation provides extensive information about the individual and the relative health of the heart itself along with respiration and other body movements and muscle flexor noises. This work for human identification is impressive because the potential standoff ranges are in excess of 1 km. The GTRI work formed the basis for Mazlouman et al. (2009) to characterise cardiac performance using microwave Doppler radar. Instead of attempting to provide surrogate ECG information, these researchers looked in the infrasonic range, i.e. < 20 Hz through ultra-wide band radar > 2 MHz. The researchers continually were able to collect reliable data between 2 and 10 metres … Another measure of standoff cardiac measure was provided by Parra and Da Costa (2001). Interferometric data were collected from the pulsing of the carotid artery over time. The measurements were collected with an eye-safe laser…
With AISight’s ability to monitor your gait and habitual movements, all of which can be used to identify you, combined with an abundance of other data collected without your consent, the gaze of Big Brother, imbued with ‘intelligence’, seems inescapable. The very idea that this could be happening is often enough to cause changes in people’s behaviour; when they know they’re being watched, people adapt to fit in and have an easy life. This is known as ‘anticipatory conformity’, and is effectively self-surveillance leading to self-censorship. It means learning to think before you speak, trying to control the look on your face, being careful to always do ‘the right thing’.
The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it; moreover, so long as he remained within the field of vision which the metal plate commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live – did live, from habit that became instinct – in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.
Guardian Angels hover over us ominously – threatening to tangle us tightly in the hugely complex global web. A sea of contextual data in a world full of sensors, each of us mirrored in the virtual world as the controllers feast on their wireless feedback, the increasingly intimate personal data they are stealing from us. Every piece of data they have from us becomes a valuable asset. Even without an implant, we can be identified and tracked from RFID tags in the things we buy, together with the biometric scanners and device detectors.
The confluence of IdM, surveillance, and complexity modelling are what Big Brother is made of. You will only be understood and acknowledged by the system based on your recorded metrics.
The technology has been developed to achieve all of these things, but as yet they are patchwork projects just getting the feel of things. What we must fear is ‘global interoperability’ – all systems interacting together: Watson with AISight and smart dust omniscience, able to learn and remember, analyse, predict and decide, the Grand Master of Complexity.
So, yes, watch out for Anonymous, indeed. Just as their iconic face masks symbolise anonymity, the very thing businesses and governments are seeking to end in this era of accountability, so too do their numerous claims of successful hacks of top-level domains. Anonymous are helping to bring in the new era of identity control; their appearance heralds the end of privacy and anonymity. Nowhere left to hide; the last of the days of the drifter.