But when I first started hearing about HFT and the market distortions it seems to produce, what came to mind for me wasn't the implications for high finance, but the implications for the military, and, ultimately, for all of our security.
HFT systems were designed to operate in an environment where even the lowliest day-traders had access to powerful computers and high-bandwidth connections. That is, the competition for HFT isn't just the human making buy and sell decisions, but the computerized system augmenting that trader: popping up alerts, executing buy and sell orders based on pre-arranged triggers, and gradually reducing the number of decisions the human operator needs to make over the course of a trading session. It's an arms race, of sorts, and one that is nowhere near over.
Take this one example of HFT in action:
Soon, thousands of orders began flooding the markets as high-frequency software went into high gear. Automatic programs began issuing and canceling tiny orders within milliseconds to determine how much the slower traders were willing to pay. The high-frequency computers quickly determined that some investors’ upper limit was $26.40. The price shot to $26.39, and high-frequency programs began offering to sell hundreds of thousands of shares.
How long, do you imagine, would it be before traders were using systems that could detect that HFT price sniffing was underway, and adjust their limits accordingly? Of course, that doesn't solve the problem; it just takes human decision-making further and further out of the loop.
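The probing tactic in the quoted example can be sketched as a simple loop. This is a toy illustration, not any real exchange API: `probe_upper_limit` and its inputs are invented for the example, and the rapid issue-and-cancel of real HFT systems is compressed into a one-tick-at-a-time search against a list of resting bids.

```python
# Hypothetical sketch of the limit-probing described above. A real HFT
# system would issue and cancel tiny live orders within milliseconds;
# here the slower traders' resting bids are just a list of prices.

def probe_upper_limit(bids, tick=0.01):
    """Step a tiny test offer upward one tick at a time until no
    resting bid will meet it, revealing the slower traders' limit."""
    price = round(min(bids), 2)
    while any(b >= round(price + tick, 2) for b in bids):
        # A bid at or above the next tick means the probe order would
        # fill there, so keep stepping the price upward.
        price = round(price + tick, 2)
    return price

# As in the quoted example: if investors' bids top out at $26.40,
# the probe finds that limit, and the system then offers to sell
# one tick below it, at $26.39.
limit = probe_upper_limit([26.10, 26.25, 26.40])   # 26.40
sell_price = round(limit - 0.01, 2)                # 26.39
```

The counter-move suggested above would be a second algorithm watching for this pattern of tiny, short-lived orders marching up the book, which is exactly how the humans drop out of the loop on both sides.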
But that situation–humans on one side, humans + computer/robot systems on the other–won’t last. And when both sides of a conflict have digitally-augmented combat systems, the side that keeps humans too much in the loop is at a distinct tactical disadvantage. We could easily find ourselves giving our military robots the power to make the kill decision not because we think it’s wise, but because that may be the only guarantee that they can act in time.
This isn’t a “computers are taking over” fear, or an “unfriendly AI” fear–these systems could barely be called artificial intelligence. And that’s precisely the problem. We’re increasingly giving autonomy to computerized systems that lack anything other than simplistic algorithms for decision-making. A functionally autonomous high-frequency trading system executes what it’s been programmed to execute, without any awareness of the larger economy or even of what’s happening off the trading floor that may be driving prices; an autonomous military system would fire a shot based on the rules it’s been given, but without any sense of context or tactics, let alone ethics.
As we change financial rules to reduce the effect of HFT on prices and trading, we might take a moment to think about the bigger picture. What other systems are we going to give autonomy without intelligence?
Read more of Jamais Cascio’s Open the Future blog.