One of the most cited reasons algorithms perform better than human traders is that they don’t have emotions. A trading program doesn’t get depressed if it loses. It doesn’t have a mortgage to think about or kids that might go hungry if daddy blows out in a convergence trade.
On the surface, this argument seems eminently reasonable. Upon further inspection, this viewpoint fails to account for the humanity we impart to our strategies as their lords and creators. We create algorithms in our own image, attempting to encapsulate our ideal strategy. But by doing so, we create strategies capable of startlingly human reactions.
Consider the popular GARCH family used in volatility modeling. GARCH provides a framework for capturing the stylized fact that volatility tends to cluster. Volatility begets volatility. There is evidence that GARCH produces biased forecasts. Moreover, like any model with a rolling memory, it falls victim to a very human bias: the availability heuristic. By its very nature, GARCH can lull your strategy into a false confidence. Because it models volatility endogenously, driven only by its own recent history, the model can miss the external signs of an imminent volatility shock. Thus a mathematical model can exhibit classic human errors such as overconfidence and projection bias.
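The recursion behind this is short enough to show. Below is a minimal sketch of a GARCH(1,1) one-step-ahead variance forecast; the parameter values are illustrative, not fitted to any data, and the function name is mine. The point is the availability heuristic in action: after a long calm stretch, the forecast drifts well below the unconditional variance, which is exactly the false confidence described above.

```python
def garch_forecast(returns, omega=1e-5, alpha=0.08, beta=0.90):
    """One-step-ahead conditional variance under GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Parameters are illustrative, not estimated."""
    # Seed at the unconditional (long-run) variance.
    sigma2 = omega / (1 - alpha - beta)
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return sigma2

# A long run of quiet markets pulls the forecast far below the long-run level;
# the model has "forgotten" that volatility can return.
uncond = 1e-5 / (1 - 0.08 - 0.90)
calm = garch_forecast([0.0] * 500)
```

Only yesterday's realized returns feed the forecast, so any exogenous warning sign (a scheduled announcement, a funding squeeze) is invisible to it until it shows up in the returns themselves.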
Risk management logic can go one step further. A seemingly reasonable reaction to a series of bad trades might be to automatically size down. Now your strategy has the capacity to get depressed. Each additional level of logical complexity creates the capacity for actions that very much resemble human emotions.
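A de-risking rule of this kind fits in a few lines. The sketch below is hypothetical: the streak threshold and the halving are assumptions for illustration, not a recommendation. It is the algorithmic equivalent of a trader losing their nerve after a bad run.

```python
def position_size(pnl_history, base_size=100, loss_streak_limit=3):
    """Hypothetical de-risking rule: halve position size once the
    trailing streak of losing trades reaches loss_streak_limit."""
    streak = 0
    for pnl in pnl_history:
        # A losing trade extends the streak; any winner resets it.
        streak = streak + 1 if pnl < 0 else 0
    return base_size // 2 if streak >= loss_streak_limit else base_size
```

Three straight losers and the strategy trades half size; one winner and its mood recovers. Whether that rule is prudence or depression depends entirely on whether losing streaks actually predict more losses for this strategy.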
There is a further vector by which we transmit our emotions into our machines: many strategies have hyper-parameters which augment their overall behavior. Think of these as the strategy's knobs or levers, controlled by human traders watching the market. Market makers fearful of low liquidity after a surprise announcement, for example, will adjust their strategies to widen spreads. More generally, maximum levels of risk are expanded or contracted based on a trader's outlook. The human may not be in the loop, but he is usually on top of the loop to a large degree (Skynet levels of self-awareness notwithstanding).
This raises the discomfiting thought that what we call human emotions might be the result of a series of overlapping heuristics. If simple rules can bake emotions into our trading strategies, is the spectrum of human emotions a result of the complex interplay between a collection of ever-changing rules for living / procreating?
If so, perhaps emotions evolved as a low-latency approximation of rational thought. Emotions such as fear can be extremely beneficial in a life-threatening situation. However, heuristics that evolved for the savanna might not be valuable in a post-industrial modern society. This leads to the concept of a beneficial emotion, or an emotion that provides positive value over time. Degenerate emotions are those which inspire reactions that are sub-optimal for the current environment.
Returning to the example of the depressed trading strategy, its melancholy could be beneficial if your threshold for downsizing does a good job of identifying bad runs for the strategy. If sizing down lets you side-step days which are ill-suited for your strategy, depression can be a money spinner. If you picked the threshold in order to avoid talking with your risk manager, however, you might as well be click trading. Your algorithm can easily inherit your fear.