Google has said it will bar facial recognition features from its head-mounted computer, Google Glass, for the time being. The move comes in response to “concern” from the public, according to the company.
“Many have expressed both interest and concern around the possibilities of facial recognition in Glass,” Google said in a statement. “As Google has said for several years, we won’t add facial recognition features to our products without having strong privacy protections in place. With that in mind, we won’t be approving any facial recognition Glassware at this time.”
Google has been pilot-testing Glass with developers and some members of the public since its announcement at the Google I/O conference last year, under what it calls the Explorer Programme.
While the company has yet to officially indicate when Glass might be made generally available, the device has already raised questions from several quarters, including amongst privacy advocates concerned that it could be used for covert filming.
In May, the US Congress’ Bi-Partisan Privacy Caucus sent a letter to Google chief executive Larry Page raising a number of privacy questions, including whether Glass permits facial recognition and the cross-referencing of images with other data sources. A response from Google is expected by 14 June, but Google’s message on facial recognition appears to have been prompted by the Congressional letter.
The company updated its developer policy to reflect its facial recognition stance. A section of the policy addressing what is and isn’t permitted in Glass software now indicates that developers may not “use the camera or microphone to cross-reference and immediately present personal information identifying anyone other than the user, including use cases such as facial recognition and voice print”, adding that applications which do so will not be approved.
Another section of the policy prohibits software that allows Glass’ camera to operate while the device’s display is switched off. The clause is meant to make covert filming more difficult, since when the display is in use it is visible to those nearby.
Facial recognition start-up Lambda Labs last month adapted its open source facial recognition API for use with Glass. A beta-test version of the API was released last year for other platforms and is in use by about 1,000 developers, according to Lambda.
Last month, Google announced four new apps for Glass, including those from social networks Facebook and Twitter.
A new ethical dilemma: is it wrong to use people’s DNA ‘leftovers’ to create works of art (or for any other purpose)? – Eleonore Pauwels
Your DNA is as personal as you can get. It has information about you, your family and your future. Now, imagine it is used – without your consent – to create a mask of your face. Working with the DNA bits left behind by strangers, a Brooklyn artist makes us think about issues of privacy and genetic surveillance.
Heather Dewey-Hagborg, a 30-year-old PhD student studying electronic arts at Rensselaer Polytechnic Institute, has the weird habit of gathering the DNA people leave behind, from cigarette butts and fingernails to used coffee cups and chewing gum. She goes to Genspace (New York City’s community biolab) to extract DNA from the detritus she collects and to sequence specific genomic regions from her samples. The data are then fed into a computer program, which churns out a facial model of the person who left the hair, fingernail, cigarette or gum behind. Using a 3D printer, she creates life-sized masks – some of which are coming to a gallery wall near you.
Such a process might seem artistically cutting-edge to some. But, for most of us, the “yuck!” factor quickly kicks in. One of my most horrifying nightmares is the fear of being accused of a crime I did not commit. Picture the scene: you were in the wrong place at the wrong time, and circumstantial evidence builds against you. As in any dream, you try to shout out that you are innocent, but no sound comes out. In my nightmare, the last chance to be saved always comes from DNA testing. After comparing my DNA to that found at the crime scene, I am finally freed. In many ways, DNA has been seen in a very positive light, but that is starting to change as more ethical questions arise.
For my generation, born after DNA profiling began in 1987 and raised on films like Gattaca, developments in human genetics have directly influenced self-perceptions and experiences. One positive example of this influence is the do-it-yourself biology movement. Genspace allows lab members to design workshops, train students and innovate with new technologies.
Whether you find what Heather Dewey-Hagborg does cool or creepy, her DNA-profiling experiment raises a number of legal and ethical questions that no one knows how to handle. To what degree does the DNA we leave behind in public spaces belong to us? Does a facial mask without a name raise the same issues as a photo? In either case, what exactly is our expectation of privacy?
Just because an individual sheds DNA in a public space does not mean that the individual does not care about preserving the privacy of the data in the DNA. There was no informed consent given to access that data. On the other hand, some might say the major problem is not unauthorized access to data but misuse of data. It is easy to imagine a scenario where someone could inadvertently have their genome sequenced from a cigarette butt they left behind. If the person who tested the cigarette found a risk gene for a mental disorder and posted the results on Facebook, the information could affect the smoker’s social and professional life.
Of course, Dewey-Hagborg is not looking for degenerative diseases or mental disorders in the bits of DNA she picks up off New York’s sidewalks. But still, when the sequences come back from the lab, she compares them to those found in human genome databases. Based on this comparison, she determines the person’s ancestry, gender, eye color, propensity to be overweight and other traits related to facial morphology.
Beyond privacy, this search raises questions about our ability to identify someone from the DNA traces they leave behind. To what extent do genetic traits (such as ancestry) tell you how a person looks? Based on the analysis of these genetic traits, how accurate is the 3D facial model produced by the computer? At the request of a Delaware forensic practice, Dewey-Hagborg has been working on a sculpture from a DNA sample to help identify the remains of an unidentified woman. This opens another black box at the intersection of law enforcement and what we might call “DIY forensic science”: here, what is the role of the state versus that of the individual?
In the UK, the Human Tissue Act 2004 prohibits private individuals from covertly collecting biological samples (hair, fingernails, etcetera) for DNA analysis, but provides exemptions for medical and criminal investigations. The situation is more of a patchwork in the US. According to a 2012 report from the Presidential Commission for the Study of Bioethical Issues, only about half of the states have laws that prevent testing someone’s DNA without their knowledge. It is encouraging, at least, to see that many lawmakers at the state level have begun to discuss the question of privacy and genome sequencing.
In the near future, the Wilson Center in Washington, DC, will provide a forum for further policy questions on this issue. Dewey-Hagborg has been invited to discuss her research and motivations in a talk about privacy and genetic surveillance on 3 June 2013. Another discussion will follow on 13 June 2013 at Genspace in Brooklyn. Perhaps with the help of these and other academics, artists and policymakers, we can begin reaching a consensus about what boundaries we want to set for ourselves, before we accidentally end up in a Gattaca of our own creation.