Traceroute

Equinix

Traceroute is a fascinating glimpse into the inner workings of our digital world. Join Technical Storyteller Grace Ewura-Esi and a team of brilliant hosts from Equinix as they illuminate the human element behind the hidden design and unseen infrastructure that shapes our digital lives. For more information, visit https://origins.dev/originals/traceroute

Episode 7: Compute
Mar 24 2022
The invisible bones holding up the internet are its hardware. One of the most prominent benefits we are reaping from hardware innovations is cloud services. And as you may have guessed, the cloud isn’t actually somewhere up in space: physical data centers are necessary to keep cloud services up and running. In this episode of Traceroute, we take a closer look at hardware and why its advancement is crucial to the development of the internet. We discuss the importance and benefits of optimizing hardware to suit the needs of software. Joined by our guests Amir Michael, Rose Schooler, and Ken Patchett, we explore the synergy of software and hardware in data center services and its effects on the connected world.

Episode Highlights

The Important Relationship Between Hardware and Software
Efficiency depends on understanding how software uses hardware and vice versa. Software consumes energy just like hardware, depending on the way it’s written. People want software and hardware “out of sight, out of mind,” but hardware is increasing in visibility due to data centers and the cloud. As internet use grows, so does the need for better hardware. Amir Michael: “There are thousands of people at large companies that are driving not only the design of the hardware, but the supply chains behind them as well. And if you just look at the financial reporting from these companies, they spend billions and billions of dollars on infrastructure.”

The Building Blocks of Getting Online
Intel started in 1968, specializing in bulky but efficient memory chips. Now they lay transistors on top of atoms.
Microprocessors are in every device now, from cell phones to servers to routers, making foundational microprocessor capability critical. The biggest breakthrough came when Intel was able to use its infrastructure to support networking, and could then scale up to data centers and cloud architecture. This began the transformation of networking, with storage moving from big fixed-function hardware to software-defined systems. More growth in hardware is on the horizon with things like artificial intelligence, 5G, and edge computing.

The Birth of the Cloud
The “Metal Rush” of the early 2000s saw companies like Google and Yahoo building their own data centers. For smaller companies, this infrastructure development didn’t make sense. Small businesses turned to companies like Amazon, which had server resources to spare, and the cloud was born. Data centers have scaled in size, but now the need is to optimize efficiency. More and more, hardware is tailored for specific software applications. Unlike software, developing hardware requires a longer production schedule and a more consistent supply chain, which can be difficult. The next step is density, where more computing power is packed into less space with greater efficiency. Amir Michael: “You know, no one really goes into a bank anymore. Everything's just done over the network, over these cloud resources today. It's how we've become accustomed to getting a lot of work done today. And so you need all that infrastructure to drive that. And I think it's just going to become more and more so in the future as well.”
The Nuts & Bolts of Data Centers
The cloud is simply a combination of data centers of various sizes across the globe, all connected through a network. The first data centers relied on redundancy and stability, so they were built like bomb shelters with backup systems. Data centers then started redesigning hardware to optimize it for different uses, depending on who’s renting the server space. Open compute is the next phase for data centers, where engineers figure out how to get bigger, better, faster, and more resilient with existing servers and components. Ken Patchett: “Data and the usage of data has become much like a microwave in a home, it is simply required, it is expected. Most people don't look for it, they don't need it,...
Episode 6: Sustainability
Mar 17 2022
Technology is a staple of our lives. Its continuous growth has improved the world in countless ways. But what most people don’t know is the environmental impact of something as mundane as streaming a video. In this episode, we discuss the impacts of data storage, technology, and the Internet on our world. Ali Fenn, David Mytton, and Jonathan Koomey share their insights on investing in sustainability and transitioning to more efficient energy sources. The key to global sustainability lies in the hands of the data storage and technology industries: they need to find greener, more sustainable alternatives. If you want to learn about the Internet’s environmental impacts and how you can contribute to investing in sustainability, then this episode of the Traceroute podcast is for you.

Episode Highlights

[01:23] Areas For Infrastructure Sustainability
The demand for data storage grows globally and daily. Data centers need more compact and more efficient transistors to decrease their harmful effects on the environment while still providing good service. Ali Fenn, the president of ITRenew, says we should focus on energy, materials, and the manufacturing process for infrastructure sustainability. The amount of waste spewed out on the back end is also alarming. It's vital to consider environmental sustainability for the future of the Internet infrastructure industry. Ali Fenn: “The manufacturing process has this huge carbon impact. So let’s think about a less wasteful, less linear stream, and let's at least maximize the value we can get out of all that stuff.”

[04:53] Investing in Sustainability by Reusing Materials
Ali didn’t think much about the environmental impact of technology infrastructure until she worked at ITRenew, which promotes the reuse of data center hardware. The demand for infrastructure is spurred by hyperscalers like Google and Facebook.
Open hardware is becoming the norm, maximizing the value and longevity of hardware through repurposing and reusing. Open hardware allows ITRenew to grow, buyers to get quality equipment, and hyperscalers to improve their sustainability. A circular economy is about deferring new manufacturing from a carbon perspective without sacrificing quality. Tune in to the full episode to hear Ali’s analogy comparing the reuse of materials to second-hand cars.

[10:23] Data Center Energy Consumption
Other concerns for investing in sustainability include electricity, materials, and water consumption. The primary resource for Internet usage is electricity. The rapid growth of technology and the Internet leads to colossal consumption of our natural resources and poses a significant threat to the environment. Estimates of total data center energy consumption range from 200 to 500 terawatt-hours. Data centers are more efficient now, and the world is transitioning to cloud computing.

[14:48] Three Steps for Greener Data Centers
While data centers have made impressive steps in reducing their carbon impact, there are three steps they can take to become greener. The first step is to offset all the carbon they emit through electricity generation. Next, match all electricity usage with 100% renewables; although this is a good step, it may not be sufficient, as data centers still rely on the local electricity grid. Lastly, use 100% clean energy through power-purchase agreements to gain renewable electricity sources. Governments can encourage companies to move in this direction.

[16:33] Switching to Efficient Infrastructures
David Mytton: “Improvements in their facilities mean that they are able to invest in efficiencies.” Many companies are moving in this direction to save money and commit to social and corporate responsibility. Scale still matters in this situation. With sustainability in mind, these companies benefit from their scale and can invest in new programs.
Investing in efficient infrastructure may not be affordable for smaller...
Episode 5: Open Source
Mar 10 2022
There is tension between the digital and the physical development spaces. As the world becomes more digital, the distance between software and hardware widens, and only a few people are attempting to bridge the gap. Unspoken competition, gatekeeping, differences in perspective — these reasons and more push experts in the software and hardware spaces apart. But open source is the key to furthering collaboration and innovation in technology development. In this episode of Traceroute, we look deeper into the digital space and how it intrinsically connects to physical hardware. Joining us today are open-source advocates Jon Masters and Brian Fox. They share their insights on hardware and software proprietary rights. They also provide context on open-source technology and how vital open source is for innovation and increasing opportunities. If you are someone looking to explore open-source technology, then this episode of the Traceroute podcast might be perfect for you!

Episode Highlights

[1:50] Behind The Scenes In The Digital Space
The utilities we use daily — like water and electric appliances — are built to meet exacting standards to ensure user-friendliness. Similarly, tech companies build digital infrastructures that most computer users can easily utilize. Jon Masters: “We build very boring, elaborate standards so that the average user, if they don't want to, doesn't have to understand every layer of what's going on.”

[03:30] How Open Source Ties Software And Hardware Together
Many people in the tech space tend to focus on either the physical or digital aspects of technology. Failing to grasp the hardware that supports software is a lost opportunity; knowing the hardware that goes with your software, and how the two intertwine, can open many doors. The software industry, especially the internet, requires a durable physical backbone. Likewise, hardware can only evolve with new software developments.
The reawakening to hardware development mirrors the early stages of the open-source software space. Jon Masters: “If you look at where the industry is going right now, hardware and software, they were always important counterparts to one another.”

[06:34] The Definition of Software
Software is a symbolic way of writing down ideas. Like the English language, it employs semantics to express the developer’s collection of ideas. Software technology aims to develop a space that allows computers to perform several tasks simultaneously; to achieve this on a single computing platform, processors rely on time slicing. An operating system manages the software that runs on a computer, as well as access to hardware devices. Essentially, it serves as the interface between humans and hardware.

[08:47] The Beginning Of The Open Source Movement
Back in the day, students and academics wrote a great deal of code and shared it in an effort to further the science. However, the rise of proprietary software ended the open collaboration of those early days. Not everyone was on board with proprietary software, giving birth to the idea of open and free software. Brian Fox: “I'm working on a vision detection system, and I want the other guy who was working on it to also be able to enhance it in the direction that he cares about or that she cares about. And it shouldn't stop me. That way, we can share and collaborate, and the entire science moves up.”

[09:37] Free Vs. Open Source Software
Both free software and open-source software advocate public access to code. However, the ideas behind these movements come from different places of understanding. Free software carries no license terms that prevent it from being shared among users. The open-source software movement is rooted in an ethical understanding that formulas should not be restricted. Anyone can join the open source...
Episode 4: Wireless
Mar 3 2022
The term 5G has been the talk of the town. Much of the hype is due to its faster internet speeds and its ability to handle many more devices than previous networks. Recently, some countries have started to roll out this technology. However, it's still in its early years, so we have yet to discover its full potential. In this episode, we'll hear insights from Ed Knapp, Sue Marek, and Sascha Segan on wireless network connections. We discuss the development of the wireless industry and how internet infrastructure spurred its growth. We also go through the generations of wireless networks, from 2G to 4G, and peer into how the development of 5G will unfold. If you want to know more about next-generation wireless networks and how technology develops to support them, then this episode is for you.

Episode Highlights

[01:07] The Beginning of Wireless Technology
Wireless technology was introduced during the 80s. It was then that Ed Knapp started to see the emergence of innovative technologies like the car phone. Demand for wireless services was limited because wireless devices and services were expensive; no one expected them to have more than a million subscribers in the US. The technology had tremendous value, even life-saving for some, and so Knapp wanted more people to have access to it. By the 90s, people were trying to connect analog modems to the cellular network so more people could get on the internet, but it was too difficult. One company couldn't overcome this challenge alone; more help was needed to create the massive infrastructure networks necessary to solve the problem.

[04:21] Diverging Paths: Internet and Wireless
The wireless industry developed at the same time as internet infrastructure. As they grew, demand for their services also increased. There was an insatiable appetite for wireless service, and engineers needed to figure out how to create networks that could support it.
Cell towers are needed to connect cell phones to networks, but they are expensive to build. Companies later decided to share the equipment instead of building their own.

[05:11] Opening the Wireless Network to an Independent Model
When the iPhone entered the market, traffic surged and operators needed to increase their capacity. The industry evolved to a point where telecom companies no longer needed to own all network infrastructure. Instead, independent companies started to manage the installed towers.

[06:53] Customer Complaints
Customers had an issue with how they were being billed. During this time, cell phone companies could get away with charging customers by the minute for their service by acting like they had limited capacity. The same problem arose when text messaging emerged: customers were billed by the number of characters. The internet changed the game by making it cheaper to send information. Suddenly, it didn’t make sense for people to be charged the same way again. Because of this technological advancement, businesses were pressured to change their services and how they charged their clients.

[09:29] The ‘G’
The G in 4G or 5G stands for “generation.” It refers to the phase of technology that is the industry standard. Sue Marek: “Every generation of cellular [technology] is about every 10 years. So 2020 is 5G, 2010 was 4G, 2000 was really the 3G. 3G was really when we used to talk about the mobile web or the wireless internet.” One of the technological hurdles of the 3G era was figuring out how to access the web through a phone.

[12:06] Cell Phone Digitalization
Technology took a huge leap when cell phones started to connect to wireless networks. The digitalization of cell phone systems started during the 1990s, allowing multiple people to use the same channel at the same time. Sascha Segan: "Once
Episode 3: Networks
Feb 24 2022
When we open web browsers and streaming services, we expect them to work seamlessly, without interruptions. Sounds basic enough, right? But have you considered how much data goes over your local network? Now imagine all the computers communicating worldwide! It took years for internet service providers to make the internet work the way it does today. Without the physical infrastructure underpinning our networks, connecting computers the way they are connected now would have been impossible. In this episode, Dave Temkin, Ingrid Burrington, Jack Waters, and Andrew Blum join us to discuss how the internet works. They detail the hidden infrastructure involved in connecting computers around the world. Contrary to what digital natives might think, your connection to the World Wide Web isn't 100% wireless. They also discuss the rise of Netflix and the need for an interconnected and open global network. If you want to understand the massive network of physical infrastructure required to connect computers worldwide, then this episode of the Traceroute podcast is for you.

Episode Highlights

[01:15] Netflix’s Goal and Challenge
Dave Temkin: “We always knew that streaming was going to be the future. It's not a coincidence that the company was called Netflix; the intention was always to deliver it over the network. We just needed to feel that the network was ready.” Netflix, the global streaming service that allows uninterrupted streaming, took years to build. The infrastructure needed to be scalable to a point where it could serve millions of users without breaking the internet. The key to solving this data transmission challenge is networks.

[3:12] What is a Network?
Networks are overlapping and interconnecting things, tied together virtually or physically. The networks that make the internet work require the support of physical infrastructure. Acknowledging this fact helps us understand that the internet is a public resource.
People don’t tend to see internet infrastructure as public works. Network infrastructure includes data centers, towers, and all the wires, cables, and fibers that connect them.

[5:47] How the Network Market Grew
After the government relaxed regulations in the 1990s, there was a big wave of infrastructure development. For example, Williams, an oil and gas company, built fiber networks through its non-operational oil and gas pipelines. Developers built many fiber networks far beyond the demand of the time, and many of these infrastructures are still in use today.

[6:58] Interconnection and Resiliency of Networks
Most people only think about their own network. In reality, a larger computer network of interconnected cables is the basis of how the internet works. Interconnectivity forms the basis of a stable internet connection: hundreds of interconnected cables ensure that computer networks are durable and resilient. Ingrid Burrington: “There is a resiliency built into the way that Internet networks function in that it's not just like one single cable that gets cut and everyone loses their internet access.”

[8:18] Level 3’s Legacy
Physical linkages are necessary to make the internet work, yet many people don’t think about this equipment. For Level 3, internet infrastructure needed to be built from scratch but still leave room for upgrades. The company built 16,500 miles of network in the United States and 3,500 miles in Europe in 30 months. Before this network was constructed, the internet ran largely on the legacy telephone network. The demand for the networks Level 3 built did not surface until the late 2000s; while they missed the timing, their legacy remains.

[14:38] How The Internet Has Changed
The emergence of smartphones helped dramatically change the internet’s landscape. We now favor the cloud, triggering the need for hybrid cloud providers. Jack Waters: “I do think it is probably...
Episode 2: Silicon
Feb 24 2022
There are a lot of components that make up a computer, and it’s amazing how the tiniest chips can make the whole thing work. Not many of us think about these today; we just expect our devices to work as they should. But did you know that just a few decades ago, the innovations we enjoy today were essentially unthinkable? The pursuit of something better brought the tech space to where it is today. In this episode, Renée James and Jon Gertner join us to talk about what silicon is used for in computer hardware. They break down the history of semiconductors and transistors. They also lay out the various experiments and breakthroughs that occurred before the conception of the industrial and consumer products we enjoy today. If you want to know why and how silicon metal runs everything in tech, this episode is for you.

Episode Highlights

[01:18] A Little Girl’s Journey to the Computer Industry
The CEO of semiconductor company Ampere Computing, Renée James, grew up alongside the computer industry. Her exposure to tech began with her father, who used to work at HP building computers and motherboards. Renée went on to a storied career at Intel; now, she leads her own semiconductor company. The material that has stayed constant throughout Renée’s career is silicon metal.

[03:10] What Silicon Metal Is
Silicon metal is the hard, brittle crystalline semiconductor that makes up transistors. These, in turn, make up chips, which make up computers. In essence, what silicon metal is used for is computers. Silicon metal production began well before the 70s and 80s, and it inspired the name Silicon Valley.

[03:28] Bell Labs and AT&T
The story of silicon metal starts with Bell Labs. The Bell Telephone Company, named after Alexander Graham Bell, grew into the American Telephone and Telegraph (AT&T) Company, which later monopolized telephone service in the US. In 1925, AT&T created an R&D laboratory called Bell Telephone Laboratories.
It started as a means to create a national phone system. AT&T's monopoly was critical to the lab's long-term growth and success, allowing it to plan for innovations around communications.

[05:24] Inventing Innovative Technologies
Bell Labs produced technology not so much because it had great ideas, but because it had problems to solve: it had to create a national communication system from scratch. Switching centers in the 1930s contained enormous banks of switches that connected people to each other. The idea of the transistor was to use a new material with no moving parts. The transistor is the building block of all electronic products: an amplifier and switch that replaced vacuum tubes and electromechanical relays. Jon Gertner: “It made everything smaller, it made it faster, and it made it better.” The material that would make transistors work is silicon metal.

[07:40] Semiconductors
The class of materials that would become critical for transistors is semiconductors, which act like conductors under certain circumstances. These became valuable for wireless radios. Silicon metal, alongside germanium, was also used as a semiconductor in radar sets.

[08:08] Experimentation on Transistors and Semiconductors
Some experts guessed that semiconductors could be useful in the phone system as early as the late 1930s. William Shockley experimented with turning semiconducting material into amplifiers in the 30s and 40s; it proved very difficult, and it took years of experimentation to get anywhere with silicon metal and transistors. Bell Labs clearly understood the need to manipulate materials for communication systems. Jon Gertner: “The backbone of electronics and the backbone of these vast interconnected communication systems, it's actually this sort of decades-long or almost century-long pursuit of understanding the kinds of materials we needed to create the system.” [09:36]...
Episode 1: Interconnection
Feb 24 2022
The invention of the internet can be traced to its origins in military and academic use. Since then, we've made huge leaps in communication and interconnectivity. Greater interconnectivity has changed the game for building networks between people, and the projects that began in 1966 have fundamentally altered communication practices all over the world. In the first episode of Traceroute, we go back to the start of the Cold War. What was the initial purpose of computer networking? How has it changed over time? We'll answer these questions with insights from Jay Adelson, Sharon Weinberger, John Morris, and Peter Van Camp. In this episode, we'll discover how the very nature of digital communication evolved and continues to evolve today. One major contribution to the interconnectivity we enjoy today is the neutral exchange framework spearheaded by Equinix.

Episode Highlights

[02:46] DARPA and Improving Interconnectivity
The Defense Advanced Research Projects Agency was created in response to the panic caused by the Soviet Union’s Sputnik, the first artificial satellite in the world. DARPA had a broad mandate to take on research projects as directed by the Secretary of Defense. It tried to create new technologies to keep the Pentagon and the military ahead of the Soviets. DARPA's priorities were space and defense research, but it also had to consider effective communication and improving interconnectivity.

[04:24] The Birth of ARPANET
One of the research projects funded by DARPA was ARPANET. The concept of a computer network was new, but it promised improved interconnectivity within the organization. In the early days of computers, DARPA hired J.C.R. Licklider, who became fundamental to inventing the internet. Sharon Weinberger: “He sort of looked ahead and said, the way that we work with computers is going to fundamentally change our society.” Their proposal became a prototype.
In 1969, two computers were connected for the first time, and the first message was delivered over ARPANET. It was a struggle to convince people of the benefits of greater interconnectivity, and the project's funding was almost cut due to lack of support.

[07:41] Interconnecting People
More people realized that interconnected systems had applications outside military use. The internet left DARPA's hands in the 90s, becoming commercially viable and consumer-friendly, but we can't overlook its military legacy. J.C.R. Licklider’s hand in inventing the internet also cannot be overstated. ARPANET is an example of a successful collaboration between the government and the private sector.

[09:36] Traffic in the Open Web
John Morris: “Back in the '80s, commercial communications were prohibited on the internet. The internet was only for government and academic communication.” The internet’s evolution into what we know today started when it was decentralized from government control. Connection points soon became congested, creating traffic in physical telecommunication networks. More importantly, opportunities online led to commercial growth and the need for regulation.

[13:07] The Telecommunications Act of 1996
The main focus of the legislation was to generate competition among phone companies. It also created an opportunity for CLECs (competitive local exchange carriers), which could deliver better connectivity and services to users through higher-speed internet. This development led to the birth of broadband internet. It also increased the need for physical connection points to maintain efficient interconnectivity between devices. The '96 Telecommunications Act enabled private organizations separate from phone companies to run exchange points, and the competition it created gave rise to the neutral exchange points that laid the groundwork for the internet today.
[16:06] A Faster, Decentralized Internet
Cable companies entering the competition for providing internet access opened the debate for open...