This Is for Everyone
The Unfinished Story of the World Wide Web
By Tim Berners-Lee
Category: Technology & the Future | Reading Duration: 22 min | Rating: 4.5/5 (27 ratings)
About the Book
This Is for Everyone (2025) tells the inside story of how one man’s simple idea at CERN grew into the World Wide Web that now connects us all. From the first browser wars to the bigger debates over privacy, social media, and AI, it reveals how the web’s open spirit was both its greatest strength and its biggest vulnerability. It also looks to a better future – a web that lives up to its potential by empowering individuals and restoring trust.
Who Should Read This?
- Tech enthusiasts curious about how the web came to be
- Entrepreneurs interested in how bold ideas can grow into global movements
- Everyday web users who are eager for a better online experience
What’s in it for me? Find out what the inventor of the World Wide Web thinks about today’s internet.
Tim Berners-Lee created the World Wide Web with decentralization in mind. He wanted it to be a human-like creation, where ideas link to one another in non-linear but intuitive ways – just like how our brains work. But over the years, there’s been a creeping centralization of the web. A small number of tech giants are battling for control, claims of monopolization keep mounting, and the primary business model is simply capturing and holding your attention. All of this has turned the web into a place of polarization and addiction – not at all what Tim Berners-Lee had intended.

Rather than a tool for capturing attention, the author dreamed of the web as a place of intention: a tool for bringing ideas together to solve problems, designed to empower individuals instead of manipulating them. In this Blink we’ll chart that path. We’ll see how the web was created and how it might once again be used to serve humanity, not the other way around.
Chapter 1: A family of engineers
Tim Berners-Lee was born in London, England, in 1955 – an auspicious year for technology, it turned out, since it also brought Steve Jobs and Bill Gates into the world. It wasn’t just a coincidence: being born at this time put them in the perfect position to take a leading role as a new technological era dawned.

Berners-Lee had the additional advantage of being the child of two mathematicians who were also electronic engineers. As the eldest of four children, he grew up in a household where intellect, curiosity, and imagination were encouraged. His parents had worked at Ferranti, the British company that built the first commercial computer, so from the start the language of logic, puzzles, numbers, and circuits was part of the air they breathed. The world of computing was still so small in those days that his parents actually knew Alan Turing, the quiet genius who cracked the German Enigma code during World War II. Turing’s ideas on computation and logic lingered in the air at Ferranti, where he’d once tried to teach an early computer to play chess. His work – and his friendship – left an impression on the Berners-Lee family, and through them, on Tim himself.

At school, Tim gravitated toward mathematics and science fiction. He treated the works of Asimov and Heinlein as scripture, dreaming of far-flung civilizations built on logic and code. His early teachers nurtured his love of problem-solving and his appreciation of the simple beauty behind a perfect equation.

His education continued at home through his father’s demonstrations of logic gates made with water jets – a literal, liquid model of how computers think. His love of gadgetry began with model train switches and homemade intercoms, and it only expanded when he went to Oxford to study physics. There, he built his own computer terminal from a broken television set, a discarded adding machine, and a mess of homemade circuitry.
His enthusiasm was matched by the patience of the university engineers, who agreed to let him connect it to their minicomputer. It worked! By graduation, Berners-Lee had built a computer out of literal scrap parts and earned a first-class degree. As he set off into the world, the only thing he knew for certain was that more computers were in his future.
Chapter 2: A perfect environment for invention
Berners-Lee arrived in Geneva, Switzerland, in 1980 to work at CERN, whose name at the time stood for the European Council for Nuclear Research. He found himself stepping into a place that looked half science museum, half secret lair: concrete halls, underground rings, and machines that hurled particles around like tiny race cars. He was hired as part of a team tasked with updating the control system on the Proton Synchrotron Booster – a job that involved replacing massive banks of knobs and oscilloscopes with software and plain terminals. He enjoyed the challenge, even as the room’s glow dimmed into a quiet ring of consoles.

But the real lesson at CERN was not about hardware, but about humans. CERN ran on coffee and conversation. Berners-Lee was thrilled to be in this amazing mix of people from English, French, German, and Swiss backgrounds. The exchange of ideas and personalities was exciting, and it planted a seed: could the serendipity of the coffee bar be captured in code? Inspired, he wrote “Enquire,” a program that let users create linked notes about people, files, devices, and ideas – knowledge growing in any direction, in multiple languages, like a living map. It sketched a way to organize information that felt natural to the way people thought.

Soon, building a functional, helpful network at CERN became his passion project. And at the heart of this project was universality. If a system hoped to unite a place like CERN, it had to welcome every format, every machine, every language and mental model. The vision crystallized: hide the server machinery in a back layer and put a friendly client on top, speaking in hypertext.

Hypertext itself wasn’t new. Ted Nelson had coined the term in the 1960s, and a demonstration at a San Francisco computer conference in 1968 blew minds by showing how hypertext, and hyperlinks, could work. An idea or piece of content in one document could link to ideas within the same document or within other documents.
It was a simple, jump-here, jump-there philosophy that beautifully mirrored the non-linear way the human brain works. And Berners-Lee believed it was the perfect thing for his network: a link should leap to anything on any computer.
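The Enquire idea described above – notes that link to other notes in any direction, with structure emerging from use – can be captured in a few lines. This is a toy model, not Enquire itself; all names and note contents here are hypothetical.

```python
# A toy model of Enquire-style linked notes: every note can link to
# any other, and links work in both directions, so you can wander
# the knowledge map the way ideas connect in a conversation.
notes = {}

def add_note(name, text):
    notes[name] = {"text": text, "links": set()}

def link(a, b):
    # Links are bidirectional: following an idea works both ways.
    notes[a]["links"].add(b)
    notes[b]["links"].add(a)

add_note("Tim", "Engineer on the control-system upgrade")
add_note("Booster", "Proton Synchrotron Booster at CERN")
add_note("Console", "New software console replacing the knobs")
link("Tim", "Booster")
link("Booster", "Console")

# Jump from any note to its neighbours, like following hyperlinks.
print(sorted(notes["Booster"]["links"]))  # ['Console', 'Tim']
```

The key design choice mirrors the text: there is no hierarchy and no fixed schema – the shape of the graph is simply whatever the links make it.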
Chapter 3: The web is born as the 90s begin
Berners-Lee was lucky to have supportive bosses at CERN who let him follow his muse in creating his dream network – one that went far beyond the connected computers at CERN. One of those bosses was Mike Sendall, who hooked him up with the one tool that really sped up the process: the NeXTcube. Developed by Steve Jobs during his wilderness years outside of Apple, the NeXTcube came with a brilliant Interface Builder that practically allowed him to drag and drop an app together, quickly and easily, one piece at a time.

In 1990, as he was putting the finishing touches on the project, he named it the World Wide Web. It was all coming together. HTTP fetched resources swiftly. HTML marked them up with simple tags. The jewel was the URL: compact, open-ended, able to point to a paragraph in a file or to a server across the ocean. Networking lived to the left of the hash; hypertext lived to the right. That small character became the handshake where two worlds met. The first app, WorldWideWeb.app, read and wrote pages – a tidy tool that matched the spirit of the place that birthed it.

It’s worth noting that URL wasn’t the original terminology Berners-Lee had in mind. Nowadays, URL stands for Uniform Resource Locator, but back in 1990, the specifications for HTTP referred to the hyperlinks as “Universal Document Identifiers,” or UDIs. This may not sound like a big deal, but for Berners-Lee, “universal” was the whole idea – in more ways than one. He wanted the links to sprout in wild directions, for the structure of the web to emerge from how it was used. Really, there shouldn’t be anything uniform about it.

In 1991, Berners-Lee and his CERN colleague Robert Cailliau, who’d helped in the web’s creation, hauled a NeXT machine and a modem to the Hypertext convention in San Antonio.
After persuading the hotel to string a phone line into the conference room, their demo hopped from Texas to a server in Switzerland, setting the room abuzz. The server logs at Berners-Lee’s computer in Geneva ticked upward – slowly at first, but by the end of 1991 it was notching a hundred hits a day. By ’93, it was 10,000 per day.
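The anatomy described above – networking to the left of the hash, hypertext to the right – is easy to see by taking a URL apart. A minimal sketch using Python’s standard `urllib.parse`; the example URL is the address of an early CERN page, used here purely for illustration.

```python
from urllib.parse import urlparse

# Everything before the "#" tells the network where to go; the
# fragment after it tells the hypertext client where to jump
# within the retrieved document.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html#History"
parts = urlparse(url)

print(parts.scheme)    # http – the protocol to speak
print(parts.netloc)    # info.cern.ch – the server to contact
print(parts.path)      # /hypertext/WWW/TheProject.html – the resource
print(parts.fragment)  # History – the anchor inside the page
```

Note that the fragment never reaches the server: the browser fetches the page using the left-hand side, then scrolls to the anchor on its own.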
Chapter 4: Setting the standards to keep it free
For the web to catch on, that “universal” nature had to prove itself. Berners-Lee still shepherded HTTP, HTML, and URLs, but text-only clients on rare NeXT boxes wouldn’t carry the day. He encouraged an open ecosystem, and developers raced ahead – but at the same time, standards mattered.

By 1992, it was already clear that the US was taking the internet seriously. Senator Al Gore had drafted a bill, passed by Congress, approving a $600 million budget to pave the way for the “Information Superhighway.” So the author believed it wise to set up a second home in the US, and he landed on the Massachusetts Institute of Technology.

He also noticed that some of the early web adopters in the US were already looking for ways to monetize the web. But this could backfire. Some students at the University of Minnesota had created a web competitor called Gopher, but when the university floated the possibility of charging licensing fees, users revolted and Gopher was dead.

That’s when Berners-Lee knew what he wanted to do: he and Robert Cailliau decided that the web should be for everyone. So on April 30, 1993, CERN released the web’s software and protocols into the public domain. If everyone was going to build, everyone needed the keys.

Already, developers were out there helping it along, and browsers were key to broader adoption. One of the first serious contenders in the ongoing browser wars came from UC Berkeley student Pei-Yuan Wei, whose ViolaWWW arrived in 1992, complete with many of the bells and whistles that continue to this day: bookmarks, history, back-and-forward buttons, even applets and early style controls.

Then came Mosaic, which emerged from the University of Illinois at Urbana–Champaign. One bright feature was its “What’s New” homepage, which gave users a daily parade of fresh servers. By late 1993 Mosaic was king, and the success was going to the team’s head.
The team behind Mosaic wanted to take control, to set the standards of functionality – for people to refer to “the web” as Mosaic.

To create a set of neutral standards and a level playing field, Berners-Lee and Cailliau staged the First International Conference on the World-Wide Web (known as WWW1) at CERN in May 1994. Then, at MIT, Berners-Lee started the World Wide Web Consortium. The W3C, as it’s commonly known, runs on a membership model that aims to give scrappy nonprofits and giant corporations an equal voice.
Chapter 5: The web goes mainstream
The early web felt exuberant and hand-made. When America Online brought in a flood of newcomers, the event became known as the “Eternal September.” The web instantly became far less academic and, well, more fun. Early enthusiasts like Justin Hall stitched together personal sites that were unique and creative. Geocities handed out “homesteads”; Craigslist spread city to city with plain pages and real utility. By the mid-90s, Håkon Lie’s CSS, or Cascading Style Sheets, arrived and gave designers a way to transform sites from blocky and dull to intentional and eye-popping. CSS is also what later allowed the transition from laptops to mobile devices to be relatively seamless.

With CSS in place, by 2000 the browser wars really began to get bloody – and that’s when the internet began to take a turn for the worse. For a while it was Netscape versus Microsoft’s Internet Explorer. And behind the scenes of these browsers lurked something sinister: cookies. A cookie is a little block of data that a web server can store on your computer so that a website recognizes you and you don’t have to keep entering a password. The problem is that some cookies – known as third-party cookies – could also be used to track your movements, record your IP address, and basically invade your privacy, ostensibly in the pursuit of harvesting advertising data. Third-party cookies are what allowed political parties to launch targeted ads at specific people and groups – a power that, by many accounts, can stoke divisions and influence major events like Brexit and presidential elections.

But there were bright spots in the early 2000s as well. Most notably: Wikipedia. Ward Cunningham’s humble wiki engine met the author’s global volunteer spirit. An editable encyclopedia. It was the web’s promise made visible: intercreativity at planetary scale.
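The first-party/third-party distinction above comes down to which domain sets the cookie. A minimal sketch using Python’s standard `http.cookies` module; the domain names and identifiers are made up for illustration.

```python
from http.cookies import SimpleCookie

# First-party cookie: set by the site you actually visited,
# e.g. so you stay logged in between pages.
session = SimpleCookie()
session["session_id"] = "abc123"
session["session_id"]["domain"] = "shop.example"

# Third-party cookie: set by an ad server whose content is embedded
# in many unrelated sites. Because the same domain shows up on every
# page that carries its ads, it can recognize the same browser
# across the web – which is what enables cross-site tracking.
tracker = SimpleCookie()
tracker["visitor_id"] = "u-98f2"
tracker["visitor_id"]["domain"] = ".ads.example"

print(session.output())  # Set-Cookie: session_id=abc123; Domain=shop.example
print(tracker.output())  # Set-Cookie: visitor_id=u-98f2; Domain=.ads.example
```

The cookie mechanism itself is neutral; the privacy problem arises purely from who controls the domain that sets it and how many sites embed that domain.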
Chapter 6: From wonder to worry
When smartphones arrived, they brought a flood of new users to the web. Global counts leapt from one to two billion seemingly overnight. Berners-Lee wanted those numbers to include the unconnected. In 2009, with his wife Rosemary Leith, he launched the Web Foundation to treat web access as a basic right and to do the patient, policy-heavy work that makes access real. This included field visits to schools in Rwanda to set up satellite dishes, and witnessing the difference increased connectedness can make in Burkina Faso, where shared farming techniques rescued half a million hectares of land. This was the vision he always had in mind: connectedness amplifying ingenuity – bringing together two people on opposite ends of the planet, each of whom has one half of the solution to a game-changing problem.

The Web Foundation went on to draft a three-part Contract for the Web. The first part is aimed at governments: keep everyone connected, keep the network available, and respect privacy and data rights. The second part is for companies: make the internet affordable and accessible, respect privacy and build trust, and make tech that amplifies the best in humanity while challenging the worst. The last part is for citizens: be creators and collaborators, build respectful communities, and fight for the web.

All of this was a challenge as the web grew with mobile phones and social media. It was amazing to see the role social platforms played during the Arab Spring protests that toppled autocratic governments. But then again, the author also saw social media’s role in helping autocratic regimes install new leaders and undermine democracy.

The mood was tilting from wonder to worry. Smartphone convenience went hand-in-hand with industrial-scale tracking.
Nefarious firms like Cambridge Analytica could learn not only what brands you liked, but also whether you were pregnant, what medical ailments you suffered from, and what your politics were.

Berners-Lee kept chasing a more humane architecture for the web. To make this a reality, he began building Solid: a decentralized web platform that works with personal data pods on ordinary web servers. It has open specs so that apps can talk to each other and share bank transactions, photo details, health records, and messages – all under that person’s control, giving them both custody and portability. Mastercard backed early prototypes, and Solid has continued to move forward in promising ways, even as AI entered the picture.
Chapter 7: Restoring trust in the web
It was at a 25th-birthday party for the web that Tim Berners-Lee listened to Demis Hassabis describe the neural nets powering DeepMind’s artificial intelligence. Rather than relying on programmed logic, this method allowed the AI to practice and learn. It was a big leap forward – one that would eventually earn Hassabis a Nobel Prize for his work with AlphaFold, which uses AI to predict protein shapes with better-than-human accuracy.

But while AI has led to the creation of some extremely helpful tools, it has also raised a lot of questions and concerns. From copyright issues to increasingly realistic-looking deepfakes, many of these concerns are only serving to further erode trust and stoke divides. Some of the issues have relatively straightforward solutions, like giving media cryptographic “birth certificates” that anyone can check to verify that an image or video is real and not a deepfake. But a lot of it comes down to the familiar grounds of standards, trust, and the need to anchor any smart agent in a user-controlled data layer. Like everything else, AI should be a tool for us – not something used to exploit us, or to further track and target us with ads.

Ideally, you would trust an AI assistant with your data. In fact, to do as many high-quality jobs as it’s capable of, an AI program would need access to a wide range of your data. Currently, most data is siloed in individual apps that don’t communicate with one another, and we don’t trust AI with sensitive information like our bank details.

This brings us back to the work Berners-Lee is doing with his decentralized Solid platform, which has also taken on AI tools. Solid gives people tools to declare what they want, let vendors compete, and route it all through data stores that they control. With your consent, the AI tool, named “Charlie,” would be able to access all the data in your private pod in order to complete a task.
If you asked it to pick out a new pair of running shoes, it could reference your fitness logs and financial records to match your needs, the way a good assistant would. But at all times your data would be secure and enclosed.

There are signs that this is the way we’re headed. Other decentralized platforms like Mastodon, Matrix, and Bluesky have grown in their appeal to users who are tired of the insidious, outrage-baiting algorithms fueling Facebook and X.

Berners-Lee hopes this trend will unfold like the original web did: first came a few thousand geeks, then tens of thousands of curious users, then institutions, governments, and everyone. If the web has taught us anything, it’s that decentralized seeds, well-tended, can still grow forests.
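The pod-and-permissions idea described above can be sketched in a few lines. To be clear, this is a toy model of the concept, not the real Solid specification or API; the `Pod` class, the data keys, and the “charlie” agent name are all hypothetical.

```python
# Toy sketch of a personal data pod: the owner stores data, grants
# per-item access to specific apps, and everything else is denied
# by default. Custody stays with the owner at all times.
class Pod:
    def __init__(self):
        self._data = {}
        self._grants = {}  # app name -> set of keys it may read

    def store(self, key, value):
        self._data[key] = value

    def grant(self, app, key):
        # The owner explicitly allows one app to read one item.
        self._grants.setdefault(app, set()).add(key)

    def read(self, app, key):
        if key in self._grants.get(app, set()):
            return self._data[key]
        raise PermissionError(f"{app} may not read {key}")

pod = Pod()
pod.store("fitness_log", ["5k run", "10k run"])
pod.store("bank_txns", ["txn-001", "txn-002"])
pod.grant("charlie", "fitness_log")  # assistant may see fitness data only

print(pod.read("charlie", "fitness_log"))  # ['5k run', '10k run']
# pod.read("charlie", "bank_txns") would raise PermissionError
```

The design choice mirrors the text: the assistant can be genuinely useful with whatever it is granted, while everything ungranted stays enclosed in the pod.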
Final summary
In this Blink to This Is for Everyone by Tim Berners-Lee, you’ve learned that the author was a curious kid raised by inventive parents. At CERN he stitched together hypertext and networking into URLs, HTTP, and HTML on a NeXT cube, then pushed the code into the public domain so anyone could build. The web leaped into the mainstream from there, with a line of browsers that helped turn clunky message boards into appealingly designed personal online spaces. Along the way, the author continued to champion net neutrality, open data, and privacy rights. But as usage soared, so did the concerns. Third-party cookies tracked people across sites, exposed their data, and eroded our sense of trust and security on the web. These concerns have deepened with AI, but there is still hope for a more useful web, where users state what they want and control their own data. With Solid, the author hopes to do just that: a decentralized platform using data pods that let individuals set permissions and grant self-contained access to everything from health records to bank statements. This way, the web could still evolve toward something generous, trustworthy, and genuinely empowering.

Okay, that’s it for this Blink. We hope you enjoyed it. If you can, please take the time to leave us a rating – we always appreciate your feedback. See you in the next Blink.
About the Author
Tim Berners-Lee is the British computer scientist who invented the World Wide Web while working at CERN in 1989. He later founded the World Wide Web Consortium (W3C) at MIT, which has guided the web’s open standards for more than three decades. A lifelong advocate for online freedom and user privacy, he continues to push for a more ethical, decentralized web through projects like Solid and his company, Inrupt.