Technological Quicksand: Why More Technology Keeps Making Things Worse
Our addiction to technological fixes creates a quicksand cycle of deeper problems and endless digital dependencies.

This post originally appeared in my Studio Notes newsletter on Substack. To get early access to future articles, subscribe to Studio Notes.
"Have you tried turning it off and on again?"
This universal troubleshooting advice contains more wisdom than we realize. When faced with technological problems, our first instinct is often to add another layer—another app, another update, another system—rarely considering that disconnection might be the solution we actually need.
Picture this: A doctor hunched over a laptop, frantically clicking through an electronic health record system while a patient waits. To solve the doctor's burnout, we develop AI scribes. To manage the AI's hallucinations, we create oversight systems. To streamline those systems, we implement automation tools. Each solution spawns new specialists, new training programs, new costs—a never-ending technological Russian nesting doll.
This isn't progress. It's a trap.
Electronic Health Record (EHR) systems were introduced with promises of streamlining documentation, reducing errors, and improving patient outcomes. Instead, these systems have created what physicians now call "digital burnout." A 2023 Mayo Clinic study found doctors spend nearly twice as long documenting care as they do providing it—with physicians logging an average of 4.5 hours daily on EHR tasks. The technological solution? AI scribes and documentation assistants that now require their own specialists, training programs, and integration teams.
What started as a tool to simplify has spawned an entire industry dedicated to mitigating its unintended consequences. Each solution adds another layer of complexity, cost, and potential points of failure—each nested fix containing the seeds of future problems.
This pattern isn't isolated. Consider the rise of "digital wellness" applications. The irony is remarkable: we've created apps to help us use other apps less. Companies that engineered interfaces specifically designed to maximize engagement and "time-on-device" now offer tools to help manage the very addictions they meticulously crafted.
Apple's Screen Time and Google's Digital Wellbeing don't represent enlightenment—they're symptomatic of our refusal to confront the root issue. Rather than questioning the fundamental design principles that created these addictive patterns, we've responded with yet more software—another layer of complexity that treats symptoms while the underlying condition persists.
This technological recursion reveals a troubling pattern: we've become trapped in a self-perpetuating cycle where each innovative solution creates cascading dependencies rather than genuine resolution. Technology critic Evgeny Morozov calls this "solutionism"—the belief that all difficulties have benign technological solutions, preferably ones that can be monetized and scaled.
This pattern has appeared throughout our technological history.
Consider early telecommunications. When telegraph lines first spread across continents in the 1800s, they created new problems of message congestion. The solution? More complex routing systems and codes, which required specialized operators. When telephone networks replaced telegraphs, they brought their own challenges—crossed lines and limited connections. Each solution to these problems added layers of switching systems, operators, and eventually computerized networks, each more complex than the last. Today's technological recursion simply accelerates this historical tendency, compressing what once took decades into mere months.
The psychological underpinnings of this pattern are worth examining. When faced with complex problems, we instinctively reach for tools that worked in the past. Having experienced remarkable technological progress, we've developed a cognitive bias that filters all problems through a technological lens. This narrow framing prevents us from seeing alternative approaches that might more effectively address the fundamental issues at hand.
This reflexive technological solutionism reflects a distinctly American pragmatism gone awry. The national ethos that celebrates Yankee ingenuity and believes "there's a fix for everything" has evolved into a cultural algorithm: when faced with any challenge, deploy technology first, ask questions later. Silicon Valley's "move fast and break things" mentality is merely the latest incarnation of this deep-seated cultural belief that practical action trumps deliberation, and that progress is measured by what we build rather than what we understand. This isn't merely a technical tendency but a manifestation of our broader cultural resistance to accepting limitations of any kind.
More concerning is how this cycle increases system fragility. Each additional technological layer introduces new interdependencies, creating what complexity theorists call "tight coupling"—where failure in one component can rapidly cascade through the entire system. The 2024 CrowdStrike update failure that grounded thousands of flights worldwide wasn't just a software glitch; it was a demonstration of how technology intended to protect systems becomes their most critical vulnerability.
This isn't an argument for technological regression. Rather, it's a call for technological discernment—developing the wisdom to distinguish between problems that genuinely require technological solutions and those that technology might exacerbate.
What might such discernment look like in practice? It begins with expanding our conception of innovation beyond the purely technological. Some of our most impactful innovations throughout history have involved new ways of organizing, communicating, or collaborating—innovations that often address root causes more effectively than new devices or algorithms.
We also need greater intellectual humility about technology's limits. Some problems—particularly those involving human behavior, values, and social cohesion—may be fundamentally unsuited to technological intervention. This requires a different kind of courage: the willingness to acknowledge when technology isn't the answer, even when building more of it seems like the path of least resistance.
For those creating tomorrow's innovations, this isn't about abandoning technology—it's about developing a more nuanced relationship with it. True technological mastery might not be measured by what we can build, but by our capacity to recognize when not to build; to see when a problem requires us to step back rather than layer on more complexity.
The most profound innovation challenge of our era isn't technical—it's philosophical. It asks us to examine not just what technology can do, but what it should do. It invites us to question whether addition is always progress, and whether the solution to technological problems must inevitably be more technology.
The next time you encounter a problem born from technology, pause before reflexively reaching for another technological fix. Ask yourself: Am I solving the actual problem, or merely addressing its symptoms? Am I innovating forward, or merely deeper into dependency?
Perhaps our most revolutionary act isn't the creation of the next breakthrough device, but a new way of thinking about technology itself—one that values simplicity over complexity, wisdom over capability, and restraint alongside creation. In a world obsessed with addition, the subtraction of unnecessary layers might be the most disruptive innovation of all.