Nerds, we did it. We have graduated, along with oil, real estate, insurance, and finance, to the big T. Trillions of dollars. Trillions! Get to that number any way you like: Sum up the market cap of the major tech companies, or just take Apple’s valuation on a good day. Measure the number of dollars pumped into the economy by digital productivity, whatever that is. Imagine the possible future earnings of Amazon.

The things we loved—the Commodore Amigas and AOL chat rooms, the Pac-Man machines and Tamagotchis, the Lisp machines and RFCs, the Ace paperback copies of Neuromancer in the pockets of our dusty jeans—these very specific things have come together into a postindustrial Voltron that keeps eating the world. We accelerated progress itself, at least the capitalist and dystopian parts. Sometimes I’m proud, although just as often I’m ashamed. I am proudshamed.



And yet I still love the big T, by which I mean either “technology” or “trillions of dollars.” Why wouldn’t I? I came to New York City at the age of 21, in the era of Java programming, when Yahoo! still deserved its exclamation point. I’d spent my childhood expecting nuclear holocaust and suddenly came out of college with a knowledge of HTML and deep beliefs about hypertext, copies of WIRED (hello) and Ray Gun bought at the near-campus Uni-Mart. The 1996 theme at Davos was “Sustaining Globalization”; the 1997 theme was “Building the Network Society.” One just naturally follows the other. I surfed the most violent tsunami of capital growth in the history of humankind. And what a good boy am I!

My deep and abiding love of software in all its forms has sent me—me—a humble suburban Pennsylvania son of a hardscrabble creative writing professor and a puppeteer, around the world. I lived in a mansion in Israel, where we tried to make artificial intelligence real (it didn’t work out), and I visited the Roosevelt Room of the White House to talk about digital strategy. I’ve keynoted conferences and camped in the backyard of O’Reilly & Associates, rising as the sun dappled through my tent and emerging into a field of nerds. I’ve been on TV in the morning, where the makeup people, who cannot have easy lives, spackled my fleshy Irish American face with pancake foundation and futilely sought to smash down the antennae-like bristle of my hair, until finally saying in despair, “I don’t know what else to do?” to which I say, “I understand.”

When I was a boy, if you’d come up behind me (in a nonthreatening way) and whispered that I could have a few thousand Cray supercomputers in my pocket, that everyone would have them, that we would carry the sum of human ingenuity next to our skin, jangling in concert with our coins, wallets, and keys? And that this Lilliputian mainframe would have eyes to see, a sense of touch, a voice to speak, a keen sense of direction, and an urgent desire to count my actual footsteps and everything I read and said as I traipsed through the noosphere? Well, I would have just burst, burst. I would have stood up and given the technobarbaric yawp of a child whose voice has yet to change. Who wants jet packs when you can have 256 friggabytes (because in 2019 we measure things in friggin’ gigabytes) resting upon your mind and body at all times? Billions of transistors, attached to green plastic, soldered by robots into a microscopic Kowloon Walled City of absolute technology that we call a phone, even though it is to the rotary phone as humans are to amoebas. It falls out of my hand at night as I drift to sleep, and when I wake up it is nestled into my back, alarm vibrating, small and warm like a twitching baby possum.

I still love software. It partially raised me and is such a patient teacher. Being tall, white, enthusiastic, and good at computers, I’ve ended up the CEO of a software services company, working for various large enterprises to build their digital dreams—which you’d figure would be like being a kid in a candy store for me, sculpting software experiences all day until they ship to the web or into app stores. Except it’s more like being the owner of a candy factory, concerned about the rise in cost of Yellow 5 food coloring and the lack of qualified operators for the gumball-forming machine. And of course I rarely get to build software anymore.

I would like to. Something about the interior life of a computer remains infinitely interesting to me; it’s not romantic, but it is a romance. You flip a bunch of microscopic switches really fast and culture pours out.

A few times a year I find myself walking past 195 Broadway, a New York City skyscraper that has great Roman columns inside. It was once the offices of the AT&T corporation. The fingernail-sized processor in my phone is a direct descendant of the transistor, which was invented in AT&T’s Bell Labs (out in New Jersey). I pat my pocket and think, “That’s where you come from, little friend!” When the building was constructed, the company planned to put in a golden sculpture of a winged god holding forked lightning, called Genius of Telegraphy.

But by the time the building was finished AT&T had sold off the telegraph division, so the company called it Spirit of Electricity. But that must have been too specific, because it was renamed Spirit of Communication. And then in 1984, the Bell System, after decades of argument about its monopoly status, broke up (with itself and with America).

Now the New York offices are rented out to, among other things, a wedding planning website and a few media companies. The statue has been relocated to Dallas. Today everyone calls it Golden Boy.

In the late 1990s I was terrified of mailing lists. For years the best way to learn a piece of software—especially some undocumented, open sourced thing you had to use to make websites—was to join its community and subscribe to its mailing lists, tracking the bugs and new releases. Everything was a work in progress. Books couldn’t help you. There was no GitHub or Stack Overflow.

I could only bring myself to lurk, never to contribute. I couldn’t even ask questions. I was a web person, and web people weren’t real programmers. If I piped up, I was convinced they’d yell, “Get off this mailing list! You have no place in the community of libxml2! Naïf!” The very few times I submitted bugs or asked questions were horrible exercises in rewriting and fear. Finally I’d hit Send and—

Silence, often. No reply at all. I’d feel awful, and a little outraged at being ignored. I was trying so hard! I’d read the FAQs!

Eventually I met some of those magical programmers. I’d sneak into conferences. (Just tell the people at the entry you left your badge in the hotel room.) They were a bunch of very normal technologists contributing, through their goodwill and with their spare time, to open source software tools.

“I use your code every day,” I’d say. They were pleased to be recognized. Surprised at my excitement. They weren’t godlike at all. They were, in many ways, the opposite of godlike. But I am still a little afraid to file bug reports, even at my own company. I know I’m going to be judged.


So much about building software—more than anyone wants to admit—is etiquette. Long before someone tweeted “That’s not OK!” there were netiquette guides and rule books, glossaries, and jargon guides, like The New Hacker’s Dictionary, available in text-only format for download, or Hitchhiker’s Guide to the Internet, first released in 1987. Bibles. There were the FAQs that would aid newcomers to the global decentralized discussion board Usenet. FAQs kept people from rehashing the same conversation. When college freshmen logged on in September—because that’s where the internet happened back in the 1980s and ’90s, at colleges and a few corporations—they would be gently shown the FAQs and told how to behave. But then in 1993, AOL gave its users Usenet access—and that became known as the Eternal September. The ivory tower was overrun. That was the day the real internet ended, 26 years ago. It was already over when I got here.

The rulemaking will never end. It’s rules all the way down. Coders care passionately about the position of their brackets and semicolons. User experience designers work to make things elegant and simple and accessible to all. They meet at conferences, on message boards, and today in private Slacks to hash out what is good and what is bad, which also means who is in, who is out.

I keep meeting people out in the world who want to get into this industry. Some have even gone to coding boot camp. They did all the exercises. They tell me about their React apps and their Rails APIs and their page design skills. They’ve spent their money and time to gain access to the global economy in short order, and often it hasn’t worked.

I offer my card, promise to answer their emails. It is my responsibility. We need to get more people into this industry.

But I also see them asking, with their eyes, “Why not me?”

And here I squirm and twist. Because—because we have judged you and found you wanting. Because you do not speak with a confident cadence, because you cannot show us how to balance a binary tree on a whiteboard, because you belabored the difference between UI and UX, because you do not light up in the way that we light up when hearing about some obscure bug, some bad button, the latest bit of outrageousness on Hacker News. Because the things you learned are already, six months later, not exactly what we need. Because the industry is still overlorded by people like me, who were lucky enough to have learned the etiquette early, to even know there was an etiquette.

I try to do better, and so does my company. How do you change an industry that will not stop, not even to catch its breath? We have no leaders, no elections. We never expected to take over the world! It was just a scene. You know how U2 was a little band in Ireland with some good albums, and over time grew into this huge, world-spanning band-as-brand, with stadium shows full of giant robotic structures, and Bono was hanging out with Paul Wolfowitz? Tech is like that, but it just kept going. Imagine if you were really into the group Swervedriver in the mid-’90s but by 2019 someone was on CNBC telling you that Swervedriver represented, I don’t know, 10 percent of global economic growth, outpacing returns in oil and lumber. That’s the tech industry.

No one loves tech for tech’s sake. All of this was about power—power over the way stories were told, the ability to say things on my own terms. The aesthetic of technology is an aesthetic of power—CPU speed, sure, but what do you think we’re talking about when we talk about “design”? That’s just a proxy for power; design is about control, about presenting the menu to others and saying, “These are the options you wanted. I’m sorry if you wanted a roast beef sandwich, but sir, this is not an Arby’s.” That is Apple’s secret: It commoditizes the power of a computer and sells it to you as design.

Technology is a whole world that looks nothing like the world it seeks to command. A white world, a male world, and—it breaks my heart to say it, for I’ve been to a lot of Meetups (now a WeWork company), and hosted some too—a lonely world. Maybe I’m just projecting some teenage metaphysics onto a lively and dynamic system, but I can’t fully back away from that sense of monolithic loneliness. We’re like a carpenter who spent so long perfecting his tools that he forgot to build the church.

But not always. One night in October 2014, I had a few drinks and set up a single Linux server in the cloud and called it tilde.club, then tweeted out that I’d give anyone an account who wanted one. I was supposed to be working on something else, of course.

Suddenly my email was full: Thousands of people were asking for logins. People of all kinds. So I made them accounts and watched in awe as they logged on to that server. You can put hundreds of people on one cheap cloud computer. It’s just plain text characters on a screen, like in the days of DOS, but it works. And they can use that to make hundreds of web pages, some beautiful, some dumb, exactly the way we made web pages in 1996. Hardly anyone knew what they were doing, but explaining how things worked was fun.

For a few weeks, it was pure frolic. People made so many web pages, formed committees, collaborated. Someone asked if I’d sell it. People made their own tilde servers. It became a thing, but an inclusive thing. Everyone was learning a little about the web. Some were teaching. It moved so fast I couldn’t keep up. And in the end, of course, people went back whence they came—Twitter, Facebook, and their jobs. We’d had a very good party.

The server is still up. Amazon sends a bill. I wish the party could have kept going.

But briefly I had made a tiny pirate kingdom, run at a small loss, where people were kind. It was the opposite of loneliness. And that is what I wish for the whole industry. Eternal September is not to be hated, but accepted as the natural order of success. We should invite everyone in. We should say, We’re all new here.

“Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind.” This was John Perry Barlow’s “A Declaration of the Independence of Cyberspace,” a document many people took seriously, although I always found it a little much. Barlow was a prophet of network communication, an avatar of this magazine. “On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.” It’s signed from Davos, 1996 (the year of “Sustaining Globalization”).

Exposure to the internet did not make us into a nation of yeoman mind-farmers (unless you count Minecraft). That people in the billions would self-assemble, and that these assemblies could operate in their own best interests, was … optimistic.

But maybe! Maybe it could work. There was the Arab Spring, starting in 2010. Twitter and Facebook were suddenly enabling protest, supporting democracy, changing the world for the better. This was the thing we’d been waiting for—

And then it wasn’t. Autocracy kept rearing its many heads, and people started getting killed. By 2014, Recep Tayyip Erdoğan was shutting off Twitter in Turkey to quell protests, and then it came home, first as Gamergate, wherein an online campaign of sexual harassment against women, somewhat related to videogames, metastasized into an army of enraged bots and threats. And as Gamergate went, so went the 2016 election. It was into this gloomy context that I made tilde.club that night—a blip of nostalgia and cheer fueled by a few Manhattans.

I’m watching the ideologies of our industry collapse. Our celebration of disruption of every other industry, our belief that digital platforms must always uphold free speech no matter how vile. Our transhumanist tendencies, that sci-fi faith in the singularity. Our general belief that software will eat the world and that the world is better for being eaten.

It’s been hard to accept, at least for me, that each of our techy ideologies, whatever its merits, doesn’t really add up to a worldview, because technology is not the world. It’s just another layer in the Big Crappy Human System along with religion, energy, government, sex, and, more than anything else, money.

I don’t know if I can point to any one thing and say “that’s tech” in 2019. (Well, maybe 3D graphics GPU card programming. That’s nerd central.) The cost of our success is that we are no longer unique. The secret club is no longer a gathering of misfits. We are the world. (We are the servers. We are the ones who gather faves and likes, so let’s start clicking. Sorry.)

I’ve made a mistake, a lifelong one, correlating advancements in technology with progress. Progress is the opening of doors and the leveling of opportunity, the augmentation of the whole human species and the protection of other species besides. Progress is cheerfully facing the truth, whether flooding coastlines or falling teen pregnancy rates, and thinking of ways to preserve the processes that work and mitigate the risks. Progress is seeing calmly, accepting, and thinking of others.

It’s not that technology doesn’t matter here. It does. We can enable humans to achieve progress. We make tools that humans use. But it might not be our place to lead.

I wish I could take my fellow CEOs by the hand (they’re not into having their hands held) and show them Twitter, Facebook, Tumblr, and any of the other places where people are angry. Listen, I’d say, you’re safe. No one is coming for your lake house, even if they tweet “I’m coming for your lake house.” These random angry people are merely asking us to keep our promises. We told them 20-some years ago that we’d try to abolish government and bring a world of plenty. We told them we’d make them powerful, that we’d open gates of knowledge and opportunity. We said, “We take your privacy and security seriously at Facebook.” We said we were listening. So listen! They are submitting a specification for a world in which fairness is a true currency, and then they’re trying to hold everyone to the spec (which is, very often, the law). As someone who spent a lot of time validating XML and HTML pages, I empathize. If bitcoin can be real money, then fairness can be a real goal.

We might have been them, if we’d been born later and read some different websites. And it’s only a matter of time before they become us.

Every morning I drop off my 7-year-old twins, a boy and a girl, at their public school, and they enter a building that was established a century ago and still functions well for the transmission of learning, a building filled with digital whiteboards but also old-fashioned chalkboards and good, worn books.

I think often of the things the building has seen. It was built in an age of penmanship and copybooks, shelves of hardbound books and Dick and Jane readers; it made its way through blue mimeographs with their gasoline smell. Milkmen delivered with horses when it was built, and now every parking space is filled with Toyotas and school buses. Teachers and principals come young and retire decades later. There are certain places where craft supplies are stored. The oldest living student just turned 100 years old, and some students walked to his home and sang him “Happy Birthday.” They announced it at the multicultural music event.

The school hasn’t moved in a century, but it is a white-hot place in time. Ten or twenty thousand little bodies have come through here on their way to what came next. While they are here, it’s their whole world. It feeds the children who need to be fed.

I watch my kids go through the front doors. (I call this my “cognitive receipt,” because unless I see them I worry that I somehow forgot to drop them off.) Then I walk to the bus stop. The bus comes, and off we go, across an elevated highway and through a tunnel. Then we take the FDR Drive and pass right under three bridges: the Brooklyn, the Manhattan, the Williamsburg. Each bridge has its own story, an artifact of its time, a product of various forms of hope, necessity, and civic corruption, each one an essay on the nature of gravity and the tensile strength of wire. Everyone on the bus looks at their phone or looks out the window, or sometimes they read a book.

Sometimes I think of the men who died making the Brooklyn Bridge; sometimes I play a game on my phone. This is as close as it gets to the sacred for me, to be on a public conveyance, in the arms of a transit authority, part of a system, to know that the infrastructure has been designed for my safety. In the winter, I can look down into the icy East River and fantasize about what it would take to push us into the river, because only a small, low concrete barrier keeps us from death. I think of how I’d escape and how I’d help others up. But the bus never hurtles into the water. They made sure of it.

I know that my privacy is being interfered with, that I’m being watched, monitored, tracked by giant companies, and that I’m on video. (I wish I’d known how often I’d be on video in 2019, how often I’d need to see my own animated face in the corner of the video call.) I know also that I have been anticipated by the mineralogists who study asphalt and that I am surrounded by tolerances and angles, simple and complex machines.

My children are safe in an old, too-warm building that has seen every system of belief and every kind of education, one that could easily last another 100 years, with glowing lichen on the wall in place of lights. Imagine how many light-emitting sneakers they’ll have by then.

Maybe I should have moved to the Bay Area to be closer to this industry I love, and just let myself fall backward into tech. I could never muster it, even though I studied maps of San Francisco and pushed my wife to come with me and visit the corporate campuses of Apple, Google, and the like, which meant visiting a lot of parking lots.

But I didn’t move. I stayed in New York, where on a recent Saturday I went to the library with my kids. It’s a little one-story library, right next to their school, and it’s as much a community center as repository of knowledge. I like quiet, so sometimes I get annoyed at all the computers and kids, the snacking moms and dads. But it’s 2019 and I live in a neighborhood where people need public libraries, and I live in a society.

When we visited one day in February, there was a man in a vest behind me setting up some devices with wire and speakers. He was trying to connect two little boxes to the devices and also to two screens, and calling gently to a passing librarian for a spare HDMI cable. Kids were coming up and looking. They were particularly interested in the cupcakes he’d brought with him.

“We’re having a birthday party,” he said, “for a little computer.”

By which he meant the Raspberry Pi. Originally designed in the UK, it’s smaller than a can of soda and runs Linux. It costs $35. It came into the world in February 2012, sold as a green circuit board filled with electronics, with no case, nothing, and became almost instantly popular. In that and subsequent versions, 25 million units have been sold. A new one is much faster but basically the same size, and still costs $35.

But for the terrible shyness that overcame me, I would have turned around right there and grasped that man’s hand. “Sir,” I would like to have said, “thank you for honoring this wonderful device.”

You get your Raspberry Pi and hook it up to a monitor and a keyboard and a mouse, then you log on to it and … it’s just a Linux system, like the tilde.club machine, and ready for work. A new computer is the blankest of canvases. You can fill it with files. You can make it into a web server. You can send and receive email, design a building, draw a picture, write 1,000 novels. You could have hundreds of users or one. It used to cost tens of thousands of dollars, and now it costs as much as a fancy bottle of wine.

    We are all children of Moore’s law. Everyone living has spent the majority of their existence in the shadow of automated computation. It has been a story of joy, of mostly men in California and Seattle inventing a future under the occasional influence of LSD, soldering and hot-tubbing, and underneath it all an extraordinary glut of the most important raw material imaginable—processor cycles, the result of a perfect natural order in which the transistors on the chips kept doubling, speeds in the kilo-, mega-, and eventually gigahertz, as if the camera had zoomed in on an old IBM industrial wall clock that sped up until its minute hand was a blur, and then the hour hand, and then the clock caught fire and melted to the ground, at which point money started shooting out of the hole in the wall.

    There is probably no remaining growth like what we’ve seen. Attempts to force a revolution don’t seem to work. Blockchain has yet to pan out. Quantum computing is a long and uncertain road. Apple, Google, and their peers are poised to get the greatest share of future growth. Meanwhile, Moore’s law is coming to its natural conclusion.

    I have no desire to retreat to the woods and hear the bark of the fox. I like selling, hustling, and making new digital things. I like ordering hard drives in the mail. But I also increasingly enjoy the regular old networks: school, PTA, the neighbors who gave us their kids’ old bikes. The bikes represent a global supply chain; when I touch them, I can feel the hum of enterprise resource planning software, millions of lines of logistics code executed on a global scale, bringing the handlebars together with the brakes and the saddle onto its post. Then two kids ride in circles in the supermarket parking lot, yawping in delight. I have no desire to disrupt these platforms. I owe my neighbors a nice bottle of wine for the bikes. My children don’t seem to love computers as I do, and I doubt they will in the same way, because computers are everywhere, and nearly free. They will ride on different waves. Software has eaten the world, and yet the world remains.

    We’re not done. There are many birthdays to come for the Raspberry Pi. I’m at the office on a Sunday as I write this. My monitor is the only light, and if you could see me I’d be blue.

    I’m not sure if I should be a CEO forever. I miss making things. I miss coding. I liked having power over machines. But power over humans is often awkward and sometimes painful to wield. I wish we’d built a better industry.

    I was exceptionally lucky to be born into this moment. I got to see what happened, to live as a child of acceleration. The mysteries of software caught my eye when I was a boy, and I still see it with the same wonder, even though I’m now an adult. Proudshamed, yes, but I still love it, the mess of it, the code and toolkits, down to the pixels and the processors, and up to the buses and bridges. I love the whole made world. But I can’t deny that the miracle is over, and that there is an unbelievable amount of work left for us to do.


Paul Ford (@ftrain) is a programmer and a National Magazine Award–winning essayist on technology. In 2015 he cofounded Postlight, a digital product studio in New York City.

This article appears in the June 2019 issue.


