History has a habit of repeating itself. Or, perhaps more accurately, of revolving in cycles. Which is perhaps why the writer and philosopher George Santayana said, “Those who cannot remember the past are condemned to repeat it”.
The pages of history are dotted with the regret of great scientists whose ideas were taken and weaponised. Oppenheimer and Einstein both came to regret their roles in the development of nuclear weapons. While nuclear technology holds the potential for great good – if we ever work out the waste disposal and safety issues, nuclear power could become widely viable – its capacity for misuse and for creating weapons of mass destruction took the world from the devastation of World War II to the brink of global annihilation.
Looking at the proliferation of technology over the last decade, it’s easy to draw parallels between the rapid development of nuclear capability in the 1940s and the use of facial recognition and artificial intelligence today.
Proliferation of facial recognition is unfettered
Facial recognition and the use of biometric devices to track people aren’t new; varying forms have been around for some time. But the technology has become mainstream in consumer electronics, through smartphones that use facial recognition or fingerprints as security devices, as well as in commercial and government applications. For example, my local state government has instituted a poorly tested facial recognition system for identifying citizens accessing government services.
And while projects like India’s AADHAAR seek to use biometric identification as a way of helping citizens, China’s social credit system is being used to socially engineer specific behaviours.
One could look at it this way: India’s AI-powered system is like clean nuclear power, while China – if we’re to believe the commentary coming from the West – is building an AI-powered bomb.
The challenge the world faces with the use of AI and facial recognition systems is that their use is almost completely unregulated. There is nothing stopping a consumer from buying a camera with facial recognition features. For example, the Nest Cam IQ I reviewed a while ago offers facial recognition so it can tell you who has walked through its field of vision, letting you know if there’s a stranger on your property.
We have governments large and small, consumers and private companies who can now capture and store almost unlimited amounts of data using cloud services, and use libraries of stored images along with easy-to-access algorithms to track anything from who walks through a passport control gate at an airport to neighbours walking past the front gate. But the real issue is that governments, under the guise of civil protection, are creating an apparatus so vast and powerful that individuals will not be able to move without being virtually followed.
The risk equation
The traditional view of risk management plots two key variables: likelihood and impact.
In the nuclear world, the likelihood of a Chernobyl or Fukushima incident is usually categorised as very low but the impact as very high. We saw that with both incidents, where the cleanup and isolation efforts are still ongoing.
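To make the likelihood-and-impact idea concrete, here is a minimal sketch of the classic risk matrix. The 1–5 scales, multiplication rule and rating thresholds are illustrative assumptions of mine, not drawn from any particular standard.

```python
# Illustrative sketch of a likelihood x impact risk matrix.
# Scales (1-5) and thresholds are assumptions for demonstration only.

def risk_score(likelihood: int, impact: int) -> int:
    """Combine likelihood and impact ratings (each 1-5) into one score."""
    return likelihood * impact

def risk_rating(score: int) -> str:
    """Bucket a combined score into a coarse rating band."""
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A Chernobyl-style event: very low likelihood (1), very high impact (5).
print(risk_rating(risk_score(1, 5)))
```

Notably, a simple multiplicative score can understate rare catastrophes: a likelihood of 1 and impact of 5 lands mid-table, which is why many frameworks treat maximum-impact events as a special case.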
The databases being created through the mass collection of facial recognition data, along with the other data governments are collecting, have created a massive honeypot.
Governments are quick to say that their systems are secure and there is no risk of a breach, but anyone who has been involved, even tangentially, in cybersecurity knows there is no such thing as an unbreachable system – something the US intelligence community learned through the actions of Chelsea Manning and WikiLeaks, and Edward Snowden.
So, what’s the likelihood of that data being compromised? It’s probably very low in some countries. But the impact of some sort of compromise is extremely high.
If we take China’s social credit system, there’s no need to steal data. All someone has to do is manipulate some records and suddenly an upstanding citizen could find themselves banned from public transport.
At the moment, some governments are resisting the urge to use facial recognition. San Francisco has banned the use of facial recognition technology by law enforcement – an interesting response from a hub of the technology’s development. Others, like Victoria, are being a little more ham-fisted about the use of the technology.
Many western governments, emboldened by the rise of terrorism in some parts of the world, say it’s an essential weapon for the protection of citizens. But scope creep is common with law enforcement and governments, so we need to be wary of such claims and call our governments to account over such matters.
In Hong Kong, where police are using facial recognition to track protestors, lasers are being used to thwart the systems.
Fuelled by war, nuclear weapons became the ultimate destructive force, and following the Second World War we were plunged into the Cold War, with the threat of nuclear annihilation a real concern. Today, with the war on terror still ongoing, we see governments turning to a new technology that has great capacity for misuse and introduces significant risk to society.