April 24, 2019 - Artificial intelligence, more commonly referred to as AI, is already in use in everyday life. For many of us, it makes life easier. Companies use it to suggest purchases based on our past behavior, and to recommend books, movies, and music based on the entertainment choices we've already made. But that's really just the tip of the iceberg for AI. While most of the AI applications you're aware of in everyday life seem pretty benign, governments around the world are also beginning to use it, and some of those uses are downright frightening. Even here in the United States, some of the AI uses being discussed raise real privacy and security concerns, and current laws are nowhere close to ready to address the issues coming up.
If you want to look for Orwellian uses of AI, just take a peek at China. The People's Republic has implemented something it calls a "social score," and every Chinese citizen has one.
Social scores are assigned in much the same way as credit scores are here in the United States, but the score is based on a lot more than whether you pay your bills (although debt payment is very much part of what the Chinese government looks at). If you make anti-government comments, talk about things like the uprising at Tiananmen Square, or just use offensive language on social media, your social score drops. Stop paying your bills, and it drops even more. In general, if you're a thorn in the side of either other citizens or the government, you'll wind up with a low social score. For those who find themselves in that position, that's a very bad thing.
A low social score can get you fired from your job. You may also find yourself in a position where the government won't let you board a train or a flight, international or domestic. In short, a low social score can pretty well ruin your life, and you can qualify for one just by being a loudmouth about things considered politically incorrect. That would be a real problem for someone like me.
Enter AI. If you are a Chinese citizen who actually lives in China, you're being watched. That means your social media is being reviewed, along with all of your other computer habits. You're being photographed and videoed when you're out in public. And it's a pretty safe bet that anything you say on the phone is subject to review too.
All of that information can't possibly be reviewed by human eyes. But for computers, it's another story, and the evidence suggests that's exactly what's happening. Like I said, it's Orwellian.
If you think that can't possibly happen here in the US, you might be wrong. The Department of Homeland Security is rolling out facial recognition at the nation's airports as we speak. The plan is to have it in place at pretty much all commercial airports within the next four years. According to DHS, the technology has already identified 7,000 people with questionable legal status passing through US airports. What we don't know is how accurate those identifications were; there are a variety of known issues with facial recognition, which is a form of AI.
But facial recognition by law enforcement is a passive use of AI: you use it to pick people out of a crowd and identify anyone who has already committed a crime. What if you could take it to the next level? What if you could look at a crowd of people and predict who was going to commit a crime? If that sounds far-fetched, it's exactly what certain companies are doing. And those companies want to sell their AI to the US government. What could possibly go wrong?
A recent article in DefenseOne - a site that covers security and military issues - goes into detail about some of the projects in development. Some companies are developing software that looks for people carrying guns. One company mentioned is pairing thermal imaging with its cameras to see concealed weapons. Others may simply be looking for a bulge in clothing.
But other companies are going even further. They look at your behavior when you're in public and, based on their AI, make some pretty bold predictions about the people they observe and their potential for criminal behavior. They don't necessarily need to see that a person has a gun. Instead, they look for behavior that doesn't quite fit what everyone else is doing. If you're walking slowly while everyone else is going fast, that could be enough to get you singled out for further review. So could walking against the flow of traffic, or simply loitering in a particular area too long for comfort.
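To see how crude that "doesn't fit the crowd" logic can be, here's a toy sketch in Python. Everything in it is hypothetical and invented for illustration: the flag_outliers function, the threshold, and the walking-speed numbers are all made up, and real products are proprietary and far more complex. The sketch simply flags anyone whose speed deviates sharply from the crowd average.

```python
# A purely illustrative sketch: flag anyone whose walking speed deviates
# sharply from the crowd average. All names and numbers are made up.
from statistics import mean, stdev

def flag_outliers(speeds, threshold=1.5):
    """Return the IDs of people whose speed is more than `threshold`
    standard deviations from the crowd's mean speed."""
    avg = mean(speeds.values())
    sd = stdev(speeds.values())
    return [pid for pid, s in speeds.items() if abs(s - avg) > threshold * sd]

# Hypothetical observations: meters per second for five tracked pedestrians.
observed = {"p1": 1.4, "p2": 1.5, "p3": 1.3, "p4": 1.4, "p5": 0.3}
print(flag_outliers(observed))  # ['p5'] -- the slow walker gets singled out
```

Notice how arbitrary that threshold is: nudge it down a little and perfectly innocent stragglers get flagged too, which is exactly the worry.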
It might surprise you, but there are absolutely no laws on the books in the United States that deal with this type of data collection and its use. There's a pretty good chance that if a local, state, or federal government gathered this type of data without a warrant, it would be inadmissible in court. But if private companies collect it, there are no rules about what they can do with it. They can sell it to the government. Or they can publish it. The uses for this type of data are already vast, and they are only going to grow in the future.
Just imagine what could happen to your reputation if the wrong assumptions were made about you and the data became public. Just think about how an unscrupulous politician could use this type of data against an opponent. Or how it could be used to defame a business competitor. The imagination runs wild.
While the United States is unlikely to ever have a social score like China's, we could easily become just as vulnerable to outside pressures. Better watch what you say in public, because there's a camera hooked up to an AI platform that's reading your lips. Better not make a scene at a restaurant, because there's a camera feeding your image into a facial recognition database that's also matching your face with other information, like your name. The next time you call for a reservation, you won't be able to get one… in any restaurant… because the company managing the camera is now selling the information it collects to any restaurant that wants it. And you're on the naughty list.
Again, it's Orwellian. But in this case, it's here in the United States. Unless the law starts to catch up quickly, there's a pretty good chance that we're all going to be enslaved by our own inventions, and that should frighten everyone.