Algorithmic trading
Algorithmic trading is a method of executing orders using automated pre-programmed trading instructions accounting for variables such as time, price, and volume. This type of trading attempts to leverage the speed and computational resources of computers relative to human traders. In the twenty-first century, algorithmic trading has been gaining traction with both retail and institutional traders. A study in 2019 showed that around 92% of trading in the Forex market was performed by trading algorithms rather than humans.
It is widely used by investment banks, pension funds, mutual funds, and hedge funds that may need to spread out the execution of a larger order or perform trades too fast for human traders to react to. However, it is also available to private traders using simple retail tools. Algorithmic trading is widely used in equities, futures, crypto and foreign exchange markets.
The term algorithmic trading is often used synonymously with automated trading system. These encompass a variety of trading strategies, some of which are based on formulas and results from mathematical finance, and often rely on specialized software.
Examples of strategies used in algorithmic trading include systematic trading, market making, inter-market spreading, arbitrage, or pure speculation, such as trend following. Many fall into the category of high-frequency trading, which is characterized by high turnover and high order-to-trade ratios. HFT strategies utilize computers that make elaborate decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe. As a result, in February 2013, the Commodity Futures Trading Commission formed a special working group that included academics and industry experts to advise the CFTC on how best to define HFT. Algorithmic trading and HFT have resulted in a dramatic change of the market microstructure and in the complexity and uncertainty of the market macrodynamic, particularly in the way liquidity is provided.
Machine learning integration
Before machine learning, the early stages of algorithmic trading consisted of pre-programmed rules designed to respond to specific market conditions. Traders and developers coded instructions based on technical indicators, such as the relative strength index and moving averages, to automate long or short orders. A pivotal shift came with the adoption of machine learning, specifically deep reinforcement learning (DRL), which allows systems to adapt dynamically to current market conditions. Unlike previous models, DRL uses simulations to train algorithms, enabling them to learn and optimize their strategies iteratively. A 2022 study by Ansari et al. showed that a DRL framework "learns adaptive policies by balancing risks and reward, excelling in volatile conditions where static systems falter". This self-adapting capability gives algorithms a significant edge over traditional rule-based systems when markets shift.
Complementing DRL, directional change (DC) algorithms represent another advancement, reacting to core market events rather than fixed time intervals. A 2023 study by Adegboye, Kampouridis, and Otero explains that "DC algorithms detect subtle trend transitions, improving trade timing and profitability in turbulent markets". By detecting transitions such as the onset of an uptrend or a reversal, this approach captures the natural flow of market movement from highs to lows.
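The early rule-based approach described at the start of this section can be illustrated with a minimal sketch: a simple moving-average (SMA) crossover signal. The window lengths and price series here are hypothetical, not taken from any real trading system.

```python
# Illustrative rule-based signal: go long when the short-term simple moving
# average (SMA) crosses above the long-term SMA, and short on the reverse
# cross. Window lengths are hypothetical.

def sma(prices, window):
    """Simple moving average of the trailing `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short_window=3, long_window=5):
    """Return 'long', 'short', or 'hold' based on an SMA crossover."""
    if len(prices) < long_window + 1:
        return "hold"  # not enough price history yet
    prev = prices[:-1]
    short_now, long_now = sma(prices, short_window), sma(prices, long_window)
    short_prev, long_prev = sma(prev, short_window), sma(prev, long_window)
    if short_prev <= long_prev and short_now > long_now:
        return "long"   # short SMA just crossed above the long SMA
    if short_prev >= long_prev and short_now < long_now:
        return "short"  # short SMA just crossed below the long SMA
    return "hold"
```

Real systems of this era combined several such indicator rules and routed the resulting orders automatically; the point of the sketch is only that the logic was fixed in advance rather than learned.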
In practice, the DC algorithm defines two trends, upward and downward, which are triggered when the price moves beyond a set threshold and is then confirmed over a confirmation period. This structure allows traders to pinpoint the stabilization of trends with greater accuracy. By aligning trades with these basic market rhythms, DC enhances precision, especially in volatile markets where traditional fixed-interval algorithms tend to misjudge momentum.
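The threshold mechanism described above can be sketched as follows. This is a minimal event detector, not the algorithm from the cited studies: the 1% threshold, the initial-trend convention, and the price series are all illustrative assumptions.

```python
# Minimal sketch of a directional-change (DC) event detector: the trend
# flips only when the price moves beyond a fixed relative threshold from
# its most recent extreme, rather than at fixed time intervals.

def dc_events(prices, threshold=0.01):
    """Return a list of (index, 'up' or 'down') confirmed DC events."""
    events = []
    trend = "up"        # assumed initial trend (a common convention)
    extreme = prices[0] # running extreme since the last confirmed event
    for i, p in enumerate(prices[1:], start=1):
        if trend == "up":
            extreme = max(extreme, p)              # track the running high
            if p <= extreme * (1 - threshold):     # drop confirms a downturn
                trend, extreme = "down", p
                events.append((i, "down"))
        else:
            extreme = min(extreme, p)              # track the running low
            if p >= extreme * (1 + threshold):     # rise confirms an upturn
                trend, extreme = "up", p
                events.append((i, "up"))
    return events
```

Because events fire only on moves larger than the threshold, small oscillations within a trend are ignored, which is what lets DC-based systems time entries around genuine reversals rather than noise.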
Ethical implications and fairness
The technical advancement of algorithmic trading comes with profound ethical challenges concerning fairness and market equity. A key concern is unequal access to the technology. High-frequency trading, one of the leading forms of algorithmic trading, relies on ultra-fast networks, co-located servers, and live data feeds that are available only to large institutions such as hedge funds and investment banks. This creates a gap among market participants, as retail traders cannot match the speed and precision of these systems.
Beyond this inequality, another issue is the potential for market manipulation. Algorithms can, for example, place and cancel orders rapidly to mislead other participants. The 2010 flash crash, in which markets plunged and then partially recovered amid heavy algorithmic activity, demonstrates such effects. Because these systems execute at speeds beyond human oversight, they blur the lines of accountability: when such crashes occur, it is unclear who bears responsibility, the developers, the institutions using the systems, or the regulators.
These systems can also increase market volatility, leaving retail traders vulnerable to sudden price swings that they lack the tools to navigate. Some argue this concentrates wealth among a handful of powerful firms, potentially widening economic gaps, as individuals or firms with the necessary resources profit from rapid trades while smaller traders are sidelined. On the other hand, algorithmic trading is also claimed to boost market liquidity and cut transaction costs. This creates an ethical tug of war: does the pursuit of an efficient market outweigh the risk of entrenching inequality?
Efforts in the European Union to address these concerns have led to regulatory action. These rules mandate rigorous testing of algorithmic trading systems and require firms to report significant disruptions. This approach aims to minimize manipulation and enhance oversight, but enforcement remains a challenge. As algorithmic trading evolves, the ethical stakes grow higher.
History
Early developments
Computerization of the order flow in financial markets began in the early 1970s, when the New York Stock Exchange introduced the "designated order turnaround" (DOT) system. SuperDOT was introduced in 1984 as an upgraded version of DOT. Both systems allowed for the routing of orders electronically to the proper trading post. The "opening automated reporting system" aided the specialist in determining the market clearing opening price.
With the rise of fully electronic markets came the introduction of program trading, which is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million total. In practice, program trades were pre-programmed to automatically enter or exit trades based on various factors. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets in a strategy known as index arbitrage.
At about the same time, portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model.
Both strategies, often simply lumped together as "program trading", were blamed by many people for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer-driven trading on stock market crashes is unclear and widely discussed in the academic community.
Refinement and growth
The financial landscape was changed again with the emergence of electronic communication networks in the 1990s, which allowed for trading of stock and currencies outside of traditional exchanges. In the U.S., decimalization changed the minimum tick size from 1/16 of a dollar to US$0.01 per share in 2001, and may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage, thus increasing market liquidity.
This increased market liquidity led to institutional traders splitting up orders according to computer algorithms so they could execute orders at a better average price. These average price benchmarks are measured and calculated by computers by applying the time-weighted average price or, more usually, the volume-weighted average price.
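The two benchmarks mentioned above are straightforward to state. A minimal sketch, with hypothetical trade data:

```python
# Illustrative computation of the two common execution benchmarks:
# time-weighted average price (TWAP) and volume-weighted average price
# (VWAP). Prices and volumes here are hypothetical.

def twap(prices):
    """TWAP: plain average of prices sampled at regular time intervals."""
    return sum(prices) / len(prices)

def vwap(trades):
    """VWAP: average price weighted by each trade's volume.

    `trades` is a list of (price, volume) pairs.
    """
    notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return notional / total_volume
```

An order slicer judged against VWAP tries to distribute its child orders in proportion to the market's traded volume, so that its own average fill price tracks the benchmark; a TWAP slicer simply spreads fills evenly over time.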
A further encouragement for the adoption of algorithmic trading in the financial markets came in 2001 when a team of IBM researchers published a paper at the International Joint Conference on Artificial Intelligence where they showed that in experimental laboratory versions of the electronic auctions used in the financial markets, two algorithmic strategies could consistently out-perform human traders. MGD was a modified version of the "GD" algorithm invented by Steven Gjerstad & John Dickhaut in 1996/7; the ZIP algorithm had been invented at HP by Dave Cliff in 1996. In their paper, the IBM team wrote that the financial impact of their results showing MGD and ZIP outperforming human traders "...might be measured in billions of dollars annually"; the IBM paper generated international media coverage.
In 2005, the Regulation National Market System was put in place by the SEC to strengthen the equity market. This changed the way firms traded with rules such as the Trade Through Rule, which mandates that market orders must be posted and executed electronically at the best available price, thus preventing brokerages from profiting from the price differences when matching buy and sell orders.
As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers, as they can react rapidly to price changes and observe several markets simultaneously.
Many broker-dealers offered algorithmic trading strategies to their clients – differentiating them by behavior, options and branding. Examples include Chameleon, Stealth, Sniper and Guerilla. These implementations adopted practices from the investing approaches of arbitrage, statistical arbitrage, trend following, and mean reversion.
In modern global financial markets, algorithmic trading plays a crucial role in achieving financial objectives. For nearly 30 years, traders, investment banks, investment funds, and other financial entities have utilized algorithms to refine and implement trading strategies. The use of algorithms in financial markets has grown substantially since the mid-1990s, although its exact contribution to daily trading volumes remains difficult to measure precisely.
Technological advancements and algorithmic trading have facilitated increased transaction volumes, reduced costs, improved portfolio performance, and enhanced transparency in financial markets. According to the Foreign Exchange Activity in April 2019 report, foreign exchange markets had a daily turnover of US$6.6 trillion, a significant increase from US$5.1 trillion in 2016.