October 14, 2011
A few months back, I wrote about Facebook’s new facial recognition technology that analyzes uploaded photos and automatically tags Friends that the program recognizes.
By January 2012, the FBI plans to launch its own version of a photo recognition program.
That system, though, does quite a bit more than look at photos.
The program, the Next Generation Identification (NGI) system, will analyze several different types of biometric data compiled from sources across the country at all levels of government.
What kinds of biometric data?
In addition to facial data, NGI will collect data on individual irises, palm prints, and voices (through the collection of audio samples).
NGI also collects and analyzes photos of potentially identifying marks, such as tattoos and scars.
And while the FBI already collects fingerprint data, NGI will link that data with the newly-collected data listed above, and make the database accessible to local law enforcement across the country.
Despite the fact that the FBI avers it will not retain non-criminal justice data, the implications here reach beyond basic law enforcement.
NGI presents some very real privacy issues for, as the FBI calls them, “innocent citizens.”
For one, as discussed in the Michigan Telecommunications and Technology Law Review back in 2008 when the plan was first announced, there could be serious legal repercussions for employers.
Employers hiring for particular positions are required by certain states to collect fingerprints of job applicants and run a criminal background check through a nationwide database (and some employers just do it even when not required by law).
Through NGI’s new “rap-back” component, the FBI will offer to keep the fingerprint records submitted by job applicants.
Then, the FBI can inform an employer of any later criminal activity of an employee, including arrests and criminal charges, regardless of whether a conviction results.
Even though “rap-back” is voluntary on the part of the employer, MTTLR argues that it could still create an affirmative duty for the employer to participate in the program, since under many circumstances an employer can be held liable for an “unfit” employee’s actions.
If the employer had no notice that its employee was unfit, it probably won’t be held liable.
But NGI offers an easy way for employers to keep tabs on their employees’ criminal activities, so a court could very easily construe an employer’s refusal to engage in the program as willful ignorance, which is no defense to liability.
It’s hard to say whether this affirmative duty for employers will actually materialize, but it is just one of a slew of issues the program presents, especially considering the sources of data the FBI will have access to through NGI.
On top of the government sources listed above (law enforcement, local DMVs), the agency can also use images from “seized systems” and “open sources.”
The “open sources” category is probably the most expansive by far, since it includes any images publicly available on the Internet, as well as unmanned/unmonitored surveillance cameras (e.g., traffic cameras).
If you weren’t at least a little unnerved before, you should be after that last bit.
The FBI will soon have the ability to identify and track an individual’s every movement, and the only things really stopping it from doing so are its own discretion and an outdated law (the Privacy Act of 1974), which doesn’t provide much protection.
While the practical implementation of the system probably won’t be as intrusive as it could be, that’s not the point.
A single entity will have the ability to institute a national surveillance system with face, iris, and voice recognition capability, and our current privacy laws aren’t strong enough to stop it.