Many years ago, when stockbrokers received orders from their customers to buy or sell stocks, there was trust that the broker would work in the best interests of his client: he would find the best price in the market and execute the trade there. Some brokers abused this trust and routed orders to markets that paid them better kickbacks rather than to whichever was best for their customer. Not good. So Congress stepped in and passed a law requiring each broker to find the best price for his client when selling stock. That seems fine, until some people realized it meant they could place small orders on certain exchanges as a way to get a tip that a big transaction was coming, then beat the broker to the other exchanges (by having a faster network connection or being more closely co-located) and drive the price up (or down) so they could skim some money off the transaction. This doesn’t make the market better, but the brokers’ hands were tied to prevent it as well.
Michael Lewis outlines this in his new book “Flash Boys”. As he unfolds the mystery of how we got into this situation, he reveals that most of it was the result of Congress repeatedly enacting laws to protect the consumer in one way without realizing how each might open up a new form of abuse somewhere else. Intervening in a complex system often introduces new wrinkles that end up causing more harm than what we were trying to avoid.
Bruce Schneier, a security expert, outlines a five-step process for evaluating a proposed solution to a perceived security risk:
1) What problem does it solve?
2) How well does it solve the problem?
3) What new problems does it add?
4) What are the economic and social costs?
5) Given the above, is it worth the costs?
The third question is the one we most often miss. When we change a system, what new problems is the change likely to create that didn’t exist before?
Even though most of us are not drafting legislation or solving security problems, this process is useful any time we act out of a fear or worry we might have. We shouldn’t only think about what we are trying to avoid; we should also think about how the action we take to avoid it may create new risk. For example, after 9/11 many people drove instead of flying because of their fear. As a result, driving fatalities went up, because driving was, and still is, riskier than flying. By avoiding the risk that seemed the scariest, they failed to weigh the cost of switching to a different solution.
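The driving-versus-flying trade-off can be made concrete with some back-of-the-envelope arithmetic. The per-mile fatality rates below are illustrative assumptions chosen for rough order of magnitude, not exact statistics:

```python
# Back-of-the-envelope comparison of driving vs. flying risk for a
# single 1,000-mile trip. The rates are illustrative assumptions
# (roughly the right order of magnitude), not exact statistics.

TRIP_MILES = 1_000

# Assumed fatality rates per 100 million passenger-miles:
DRIVING_RATE = 1.0    # roughly the order of U.S. driving risk
FLYING_RATE = 0.01    # commercial aviation is far safer per mile

def trip_risk(rate_per_100m_miles: float, miles: float) -> float:
    """Expected fatality probability for a trip of the given length."""
    return rate_per_100m_miles * miles / 100_000_000

drive = trip_risk(DRIVING_RATE, TRIP_MILES)
fly = trip_risk(FLYING_RATE, TRIP_MILES)

print(f"Driving risk: {drive:.2e}")   # 1.00e-05
print(f"Flying risk:  {fly:.2e}")     # 1.00e-07
print(f"Switching to driving multiplies the risk ~{drive / fly:.0f}x")
```

Both risks are tiny in absolute terms, which is exactly why the substitution feels safe; but under these assumptions the switch still multiplies the per-trip risk by about a hundred.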
Something to keep in mind the next time we have the perfect solution to a big problem: intervening and doing something different isn’t without consequence.
Photo Credit: Now and Here cc