June 16, 2016 – You’ve probably never heard of the Next Generation Identification-Interstate Photo System (NGI-IPS), a facial recognition database used by the FBI. The database currently contains more than 30 million photos and, through agreements with a number of states, can access more than 400 million photos in total. That number is likely to grow as the feds continue to negotiate agreements with individual states. When all is said and done, there is a very good chance that nearly every American will find their picture in the database. And now, according to a critical report from the Government Accountability Office, there is a fairly decent chance that the matches it returns are inaccurate.
NGI-IPS started out using mug shots of criminals. But the FBI has so far negotiated agreements with 16 states to use driver’s license pictures and, in some cases, mug shots from state and local law enforcement agencies. It is now trying to negotiate agreements with 16 more states.
In addition to the state agreements, the database is constantly being fed new photographs from more obscure sources, such as surveillance cameras. By the time the FBI is done, it is likely to have pictures of every person living in the United States.
The very existence of a database like NGI-IPS, and the fact that it is operated by the federal government, is more than a little disconcerting. But the fact that the FBI is using the database to fight crime might give you some solace. That is, it might give you some solace if you have confidence that the database returns accurate results when queried. Unfortunately, according to the GAO report, the FBI has never tested it for accuracy. The implications of that are astounding.
NGI-IPS doesn’t work the same way as the facial recognition databases you see on crime dramas. In those, law enforcement feeds in a picture of a suspect and the database quickly returns an accurate match. With NGI-IPS, a picture is fed in and the database returns 50 possible matches. Human reviewers then examine those 50 pictures to see whether they agree with the computer that any of them actually match.
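To make the candidate-list idea concrete, here is a minimal, purely illustrative Python sketch of how a generic top-k face search works. It is not the FBI’s code, and the GAO report does not describe the system at this level of detail; the function name, the cosine-similarity comparison of face "embeddings," and the random stand-in gallery are all assumptions used only to show that a shorter candidate list is just a truncation of the same untested ranking.

```python
import numpy as np

def top_k_candidates(probe_embedding, gallery_embeddings, k=50):
    """Return indices and scores of the k gallery faces most similar to the probe.

    A generic nearest-neighbor search, not the FBI's actual system.
    Similarity here is cosine similarity between face embeddings.
    """
    # Normalize so the dot product equals cosine similarity.
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    gallery = gallery_embeddings / np.linalg.norm(gallery_embeddings, axis=1, keepdims=True)

    scores = gallery @ probe            # one similarity score per enrolled photo
    ranked = np.argsort(scores)[::-1]   # best scores first
    return ranked[:k], scores[ranked[:k]]

# Illustrative gallery of 1,000 random "embeddings" standing in for enrolled photos.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))
probe = rng.normal(size=128)

top50, _ = top_k_candidates(probe, gallery, k=50)
top5, _ = top_k_candidates(probe, gallery, k=5)

# A 5-photo list is simply the first 5 entries of the 50-photo list; without
# accuracy testing, nothing says whether the true match falls inside either
# cutoff or somewhere below it.
assert list(top5) == list(top50[:5])
```

In other words, narrowing the list does not add information; it only hides the lower-ranked candidates, which matters for the problems described next.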
This methodology actually has the potential to hinder investigations in several ways. First, since the agency has never tested its facial recognition software for accuracy, if the police try to narrow the search – meaning they only want the database to return four or five possible matches – there is no way to know whether those four or five pictures are any more likely to contain the right person than the 45 or so pictures that were not displayed.
Second, when the database does return 50 photos, there is no way to know how many more potential matches the database contains. There could be hundreds of additional photos in the database that are just as likely to be a match as the first 50 pictures selected.
And finally, if the person reviewing the photos makes a mistake and identifies the wrong person as a match, it could lead an investigation in the wrong direction. And for the person wrongly identified, it could mean both financial and legal jeopardy.
The GAO made six recommendations to the Department of Justice concerning the database. It is troubling that the DOJ agreed to only two of them. Among those it rejected was a recommendation that the system be tested for accuracy and for its rate of “false positive” identifications. You can find the entire GAO report, along with the DOJ’s responses to it, here.
by Jim Malmberg