AI and crypto become enemies of regulation
In 1865, Britain passed its infamous “Red Flag” law, copied in many other places, to regulate self-propelled vehicles. It required a crew of three for each vehicle, one of whom had to walk 60 yards ahead carrying a red flag to warn horses and riders of the vehicle’s approach. It also imposed a speed limit of four miles per hour, or two miles per hour in populated areas.
Why is this relevant now? Because, some 158 years later, attempts to regulate cryptocurrencies and artificial intelligence will look just as ridiculous to future generations. Technology transforms society according to its capabilities and what people want to do with it, not according to cautious regulations passed by clueless civil servants.
The collapse of the crypto exchange FTX in November 2022, capping an annus horribilis for digital currencies, combined with the release of ChatGPT’s demo that same month, sent venture capital fleeing crypto for AI. A longer-term, harder-to-measure trend among academics and top developers appears to favor the steady, quiet progress of AI work over the scandals, booms, and busts of crypto. These trends are more consequential for the future than anything done in Washington, the ups and downs of Bitcoin, or how venture capital is allocated. The automobile, like radio, the Internet, and genetic engineering, transformed society in fundamental ways that had nothing to do with the wishes of regulators, stock prices, or whatever else the media was covering at the time.
The competition between crypto and AI for the hearts and minds of tech innovators, and the wallets of venture capitalists, reflects a broader dichotomy. AI is traditionally centralized: its routines gobble up data from everywhere and make decisions on behalf of a small group of human designers, or, in the dystopian science-fiction limit, no humans at all. Crypto is radically decentralized: all usable information is held by dispersed individuals in private keys, and no one controls the system.
It’s no coincidence that crypto entered mainstream consciousness amid the massive, interconnected failures of the 2008 financial crisis, while AI took off after the 2020 global pandemic reminded people that we are all connected, whether we like it or not. Crypto scares people because it threatens the ability of centralized human institutions to collect taxes and regulate behaviors such as drug use, sex, gambling, pornography, sedition, and more. AI scares people because it threatens individual human agency and privacy, whether by regulating all human behavior in a nightmarish totalitarian regime or by replacing humans altogether. Another problem with traditional AI is that when you hoover up all the information, you hoover up bias, intolerance, and errors along with the good information.
But a closer look at recent events reveals a more complex situation. Hot areas of AI are using cryptographic technology to create decentralized controls. First-generation AI approaches, which try to leverage all information, fail because the entities that control the information today are unwilling to cede it to a faceless algorithm beyond their control. Homomorphic encryption allows information holders to benefit from AI analysis without the AI routine itself, or its creators, ever accessing the underlying information. Federated learning allows independent, decentralized actors to build and use a common, robust AI tool without sharing data. Many of the most exciting AI projects are meant to be deployed and controlled by individuals to gather information and make decisions, without exposing anything about the individual to the Internet at large.
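To make the federated-learning idea above concrete, here is a minimal, illustrative sketch of federated averaging in Python. All names and numbers are invented for the example, and real frameworks (TensorFlow Federated, Flower and the like) add secure aggregation, client sampling, and far more machinery; the point is only that clients exchange model parameters, never raw data.

```python
# Minimal sketch of federated averaging (FedAvg): each client fits a model
# on its own private data and shares only model parameters with the server.
# Purely illustrative, not a production federated-learning system.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's private training step: linear regression via gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w  # only parameters leave the client

def federated_round(global_w, clients):
    """Server averages client updates without ever seeing client data."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three clients, each holding its own private dataset.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)
print(w)  # approaches true_w even though no raw data was ever pooled
```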
At the same time, many crypto projects leverage AI to create complex structures from decentralized parts. A fundamental goal of most cryptocurrencies is composability: once an application is created, it should be easy to integrate as a modular component of any larger application. First-generation crypto projects were built with human developers in mind, but AI offers heady possibilities for assembling much bigger and better structures from decentralized, composable applications on the fly. What we call a “smart” contract in crypto today is actually stupid, in that it consists of rigid rules chosen by human counterparties. String together enough stupid rules and the contract may seem smart, but it’s an illusion; complexity is not intelligence. Moreover, humans are not good at anticipating all possible future scenarios. AI could create truly smart contracts, which would transform many areas of human interaction.
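As a rough illustration of the composability point and the “stupid rules” critique, the Python sketch below treats a contract as a fixed list of hand-written if-then rules and composition as concatenating those lists. Everything here is hypothetical (no real blockchain or smart-contract API is involved); it only shows how combining rigid clauses yields something more complicated, not more intelligent.

```python
# Illustrative sketch: a "contract" as a pile of fixed human-written rules,
# and composability as bolting those piles together. The composed contract
# is bigger, but it still cannot handle scenarios its authors never wrote down.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

State = Dict[str, float]
Rule = Callable[[State], State]

@dataclass
class Contract:
    rules: List[Rule] = field(default_factory=list)

    def execute(self, state: State) -> State:
        for rule in self.rules:  # each rule is a fixed if-then clause
            state = rule(state)
        return state

    def compose(self, other: "Contract") -> "Contract":
        # Composability: the combined contract is simply both rule lists in order.
        return Contract(self.rules + other.rules)

# Two "stupid" human-authored rules for a toy escrow arrangement.
def pay_if_delivered(state: State) -> State:
    if state.get("delivered", 0) >= 1:
        state["buyer_balance"] -= 100
        state["seller_balance"] += 100
    return state

def penalty_if_late(state: State) -> State:
    if state.get("days_late", 0) > 0:
        state["seller_balance"] -= 5 * state["days_late"]
    return state

escrow = Contract([pay_if_delivered]).compose(Contract([penalty_if_late]))
print(escrow.execute({"delivered": 1, "days_late": 3,
                      "buyer_balance": 500, "seller_balance": 0}))
# {'delivered': 1, 'days_late': 3, 'buyer_balance': 400, 'seller_balance': 85}
```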
Most utopian science fiction imagines computers with access to all information that slavishly follow human instructions. Some classic science-fiction devices, notably Isaac Asimov’s Three Laws of Robotics, grapple with the contradiction implicit in this view. But now that we are building the algorithms that already run much of the world, and may soon run all of it, choosing the right mix of centralization and decentralization could be the most important social question of our time, more important than policy or innovations in non-technology fields. And whatever choice we make will have to be designed very carefully, so that it can be broken neither by decentralized human activity descending into anarchy nor by centralized AI algorithms turning humans into slaves.
© 2023 Bloomberg LP