And finally, sophistication: AI-assisted hacks open the door to complex strategies beyond those that the unassisted human mind can devise. AIs’ advanced statistical analysis can uncover relationships between variables, and thus possible exploits, that the best strategists and experts might never have recognized. That sophistication allows AIs to deploy strategies that undermine multiple levels of the target system. For example, an AI designed to maximize a political party’s vote share could determine the precise combination of economic variables, campaign messages, and procedural voting adjustments that makes the difference between electoral victory and defeat, extending the revolution that mapping software brought to gerrymandering into every aspect of democracy. Not to mention the hard-to-discover tricks an AI could propose for manipulating the stock market, legislative systems, or public opinion.
With computer speed, scale, scope, and sophistication, hacking will become a problem that we as a society can no longer handle.
I am reminded of a scene in the movie The Terminator, in which Kyle Reese describes to Sarah Connor the cyborg that hunts her: “It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever…” We’re not dealing with literal cyborg assassins, but as AI becomes our adversary in the world of social hacking, we may find it just as hard to keep up with its inhuman ability to prey on our vulnerabilities.
Some AI researchers are concerned about the extent to which powerful AIs can overcome their human-imposed limitations and – potentially – come to dominate society. While this may seem like wild speculation, it’s a scenario that’s at least worth considering and avoiding.
Today and for the foreseeable future, however, the hacking described in this book will be perpetrated by the powerful against the rest of us. All the AIs out there, whether on your laptop, online, or embodied in a robot, have been programmed by other people, usually for their benefit and not yours. While an internet-connected device like Alexa can mimic a trusted friend, you should never forget that it’s designed to sell Amazon’s products. And just as Amazon’s website urges you to buy its house brands instead of competitors’ higher-quality products, it won’t always act in your best interest. It will hack your trust in Amazon for the sake of its shareholders.
In the absence of any meaningful regulation, there’s really nothing we can do to prevent AI hacking from unfolding. We must accept that it is inevitable and build robust governance structures that can respond quickly and effectively, normalizing the useful hacks into the system and neutralizing the malicious or unintentionally harmful ones.
This challenge raises deeper, more difficult questions than how AI will evolve or how institutions can respond to it: Which hacks count as useful? Which are harmful? And who decides? If you think government should be small enough to drown in a bathtub, then you probably think that hacks that reduce the government’s ability to control its citizens are usually good. But you still may not want to replace political overlords with technological overlords. If you believe in the precautionary principle, you want as many experts as possible to test and review hacks before they are included in our social systems. And you might want to apply that principle further upstream, to the settings and structures that enable those hacks.
The questions continue. Should AI-created hacks be managed locally or globally? By administrators or by referendum? Or is there a way we can let the market or social groups decide? (Current efforts to apply governance models to algorithms are an early indication of how this will go.) The governance structures we design will empower some people and organizations to determine the hacks that will shape the future. We must ensure that that power is used wisely.
Extract from A Hacker’s Mind: How the Powerful Bend Society’s Rules, and How to Bend Them Back by Bruce Schneier. Copyright © 2023 by Bruce Schneier. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.