Technological fix


A technological fix, technical fix, technological shortcut or solutionism is an attempt to use engineering or technology to solve a problem. The term "technological fix" was coined in the mid-1960s and popularized by the American physicist Alvin Weinberg, who defined it as "a means for resolving a societal problem by adroit use of technology and with little or no alteration of social behavior."
Some authors regard technological fixes as inevitable in modern technology, since many technologies, although invented and developed to solve certain perceived problems, often create other problems in the process, known as externalities. In this view, a technological fix is an "attempt to repair the harm of a technology by modification of the system", which may involve modifying the basic hardware, modifying the techniques and procedures for operating and maintaining it, or both.
More broadly, the technological fix is the idea that all problems can be solved with new and better technologies. The term is now often used dismissively to describe cheap, quick fixes that rely on inappropriate technologies; such fixes frequently create more problems than they solve, or merely give people the sense that the problem has been solved.

Contemporary context

In the contemporary context, technological fix is sometimes used to refer to the idea of using data and intelligent algorithms to supplement and improve human decision making, in the hope of ameliorating the bigger problem. The critic Evgeny Morozov defines this as "Recasting all complex social situations either as neat problems with definite, computable solutions or as transparent and self-evident processes that can be easily optimized – if only the right algorithms are in place." Morozov describes this perspective as an ideology, especially prevalent in Silicon Valley, which he calls "solutionism". While some critics regard this approach to present-day issues as detrimental to efforts to truly solve them, others see merit in such technological improvements to society as complements to existing activist and policy efforts.
An example of the criticism is that policy makers may be tempted to think that installing smart energy monitors will help people conserve energy, and thereby mitigate global warming, rather than focusing on the arduous process of passing laws such as a carbon tax. Another example is reliance on technological tools alone to solve complex sociopolitical crises such as pandemics, or the belief that such crises can be resolved through technical fixes alone.

Algorithms

Oxford Languages defines an algorithm as "a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer." Algorithms are increasingly used as technological fixes in modern society to replace human tasks or decision-making, often in order to reduce labor costs, increase efficiency, or reduce human bias. Such solutions are promoted as a "quick and flawless way to solve complex real world problems… but technology isn't magic". Used as fixes, however, algorithms do not address the root causes of these problems. Instead, they more often serve as "band-aid" solutions that may provide temporary relief but do not resolve the issue for good. These fixes also tend to come with problems of their own, some of which are even more harmful than the original problem.
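As a rough illustration of the dictionary definition above, the following sketch shows an algorithm as a fixed set of rules applied by a computer without human judgement. The benefit-eligibility scenario, field names, and thresholds are invented for illustration and do not come from any real system.

```python
# Minimal sketch of an algorithm as a fixed set of rules
# (hypothetical benefit-eligibility check; all thresholds and
# field names are invented for illustration).

def eligible_for_benefit(applicant: dict) -> bool:
    """Apply hard-coded rules to an applicant record and return a decision."""
    # Rule 1: income must fall below a fixed cutoff.
    if applicant["annual_income"] > 30_000:
        return False
    # Rule 2: small households face a stricter cutoff.
    if applicant["annual_income"] > 25_000 and applicant["household_size"] < 3:
        return False
    # Rule 3: applications with missing documents are rejected outright.
    if not applicant["documents_complete"]:
        return False
    return True

print(eligible_for_benefit(
    {"annual_income": 24_000, "household_size": 2, "documents_complete": True}
))  # prints True: the record satisfies every rule
```

The rules are applied identically to every record, which is the source of both the claimed efficiency and the rigidity discussed above.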
One example of an algorithm used as a technological fix for public safety is face recognition software, which has been deployed by the San Diego County and Pittsburgh police departments, among other government security organizations. Face recognition is an algorithmic technology viewed as potentially having many benefits for its users, such as verifying a person's identity in security systems; it uses biometrics to quantify and map distinguishing facial features. As a technological fix for safety and security concerns, however, face recognition comes with issues of privacy and discrimination. In San Diego County, Black men were falsely accused of crimes after being misidentified by the software, and police used the software on African Americans up to twice as often as on other people. The discrimination perpetuated by the face recognition tool led to a three-year ban on its use starting in 2019. Instead of addressing systemic and historically embedded inequalities among racial groups, the technology was used to perpetuate discrimination and to support police in doing their jobs unfairly and inaccurately.
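A highly simplified sketch of the matching step behind such systems (not the software used by San Diego or Pittsburgh) is shown below: each face is reduced to a numeric feature vector, and a "match" is declared when the distance between two vectors falls under a chosen threshold. The vectors and threshold are invented; the point is that identification hinges on a tunable cutoff that can produce false matches.

```python
import math

# Toy face-matching sketch: faces are represented as invented 4-number
# "embeddings"; a match is declared when the Euclidean distance falls
# below a tuned threshold. Real systems use learned embeddings with
# hundreds of dimensions, but the thresholding logic is similar.

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

MATCH_THRESHOLD = 0.6  # invented cutoff; looser thresholds raise false matches

probe = [0.12, 0.80, 0.33, 0.45]           # face captured from camera footage
gallery = {
    "person_A": [0.10, 0.82, 0.35, 0.44],  # distance ~0.04 -> reported as match
    "person_B": [0.90, 0.10, 0.70, 0.20],  # distance ~1.1  -> no match
}

for name, vec in gallery.items():
    d = distance(probe, vec)
    print(name, round(d, 2), "MATCH" if d < MATCH_THRESHOLD else "no match")
```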
Another example of algorithms being used as a technological fix is automated decision-making tools, such as Oregon's Child Welfare Risk Tool and Pittsburgh's Allegheny County Family Screening Tool. In these cases, algorithms replace human decision makers in order to reduce the cost of employing staff to make child welfare case decisions and to eliminate human bias from the process. However, researchers at Carnegie Mellon University found that the tool discriminates against Black families, who are statistically underserved and have historically lived in lower-income areas. Because the historical data reflect these systemic disparities, the algorithm flags a greater percentage of children from Black families as high risk than children from White families. By building on historically biased data, the automated decisions further fuel racial disparities and accomplish the opposite of the intended outcome.
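The way a score trained on historical records can reproduce the patterns in those records can be sketched as follows. This is an invented toy model, not the Allegheny or Oregon tool: if prior referrals reflect historically unequal reporting, families with otherwise identical circumstances receive different scores.

```python
# Toy illustration (invented data and weights, not any real screening model):
# a risk score built as a weighted sum of features. "prior_referrals" encodes
# historical reporting patterns, so a family from a historically over-reported
# group inherits a higher score even when the other features are identical.

WEIGHTS = {"prior_referrals": 2.0, "income_support": 1.0, "missed_visits": 1.5}
HIGH_RISK_CUTOFF = 5.0  # invented threshold used to flag a case for review

def risk_score(case: dict) -> float:
    return sum(WEIGHTS[k] * case[k] for k in WEIGHTS)

family_a = {"prior_referrals": 0, "income_support": 1, "missed_visits": 1}
family_b = {"prior_referrals": 3, "income_support": 1, "missed_visits": 1}
# Same current circumstances; family_b differs only in referrals inherited
# from historical reporting patterns.

for label, case in [("family_a", family_a), ("family_b", family_b)]:
    score = risk_score(case)
    print(label, score, "FLAGGED" if score >= HIGH_RISK_CUTOFF else "not flagged")
```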

Climate change

The technological fix for climate change is an example of using technology to restore the environment, through strategies such as renewable energy and climate engineering.

Renewable energy

Climate engineering

Externalities

Externalities are the unforeseen or unintended consequences of technology. Anything new and innovative can have negative effects, especially in a new area of development: although technologies are invented and developed to solve certain perceived problems, they often create other problems in the process.

Algorithms

Evgeny Morozov, a writer and researcher on the social implications of technology, has observed that a new problem-solving infrastructure makes possible types of solutions that were not possible 15 years ago. The issue with using algorithms as technological fixes is that they should not be applied as one-size-fits-all solutions, because each problem comes with its own context and implications. While algorithms can offer solutions, they can also amplify discriminatory harms, especially for groups that are already marginalized. These externalities include racial bias, gender bias, and disability discrimination.
Oftentimes, algorithms are implemented into systems without a clear understanding of whether they are an appropriate solution to the problem. In "Understanding perception of algorithmic decisions: Fairness, trust, and emotion in response to algorithmic management", Min Kyung Lee writes, "...the problem is that industries often incorporate technology whose performance and effectiveness are not yet proven, without careful validation and reflection." Algorithms may offer immediate relief or an optimistic outlook on the issues at hand, but they can also create new problems that demand even more complex solutions. Sometimes the use of an algorithm as a technological fix leaves us asking, "Did anyone ask for this?" and wondering whether the benefits outweigh the harms. These tradeoffs should be rigorously assessed to determine whether an algorithm is truly the most appropriate solution.
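One concrete way to assess such tradeoffs, sketched below with invented records, is to compare an algorithm's error rates across the groups it affects; a persistent gap in false-positive rates is one signal that a technological fix is shifting harm onto a particular group.

```python
from collections import defaultdict

# Sketch of a simple fairness audit (invented records): compare the
# false-positive rate of an automated decision across groups.
# Each record is (group, predicted_positive, actually_positive).
records = [
    ("group_1", True, False), ("group_1", False, False),
    ("group_1", True, True),  ("group_1", False, False),
    ("group_2", True, False), ("group_2", True, False),
    ("group_2", False, False), ("group_2", True, True),
]

false_pos = defaultdict(int)
negatives = defaultdict(int)
for group, predicted, actual in records:
    if not actual:              # only true negatives can yield false positives
        negatives[group] += 1
        if predicted:
            false_pos[group] += 1

for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"{group}: false-positive rate {rate:.2f}")
# A persistent gap between groups suggests the "fix" shifts errors onto one group.
```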

DDT

DDT was initially used by the United States military during World War II to control a range of illnesses, from malaria to bubonic plague and body lice. Because of its effectiveness, DDT was soon adopted as a farm pesticide to help maximise crop yields and meet the rising food demands of the post-war population. The pesticide proved extremely effective at killing insects and other pests on crops and was often referred to as the "wonder-chemical". However, DDT has been banned for over forty years, after it was found to accumulate within the fatty cells of both humans and animals.
In humans, DDT was found to cause:
  • Breast and other cancers
  • Male infertility
  • Miscarriages and low birth weight
  • Developmental delay
  • Nervous system and liver damage
DDT is toxic to birds when eaten; it decreases the reproductive rate of birds by causing eggshell thinning and embryo deaths. DDT negatively affects various systems in aquatic animals, including the heart and brain. DDT is moderately toxic to amphibians like frogs, toads, and salamanders. Immature amphibians are more sensitive to the effects of DDT than adults.

Automobiles

Automobiles with internal combustion engines have revolutionised civilisation and technology. However, while the technology was new and innovative, connecting places through transport, it was not recognised at the time that burning fossil fuels, such as coal and oil, inside the engines would release pollutants. This is an explicit example of an externality caused by a technological fix, as the problems arising from the development of the technology were not recognised at the time.

Different types of technological fixes