By Shelby Cheyenne Job
One hundred eighty-five years after Charles Darwin’s famous expedition on HMS Beagle, the Galápagos Islands are still bursting with discoveries, only the science looks a little different. For one thing, researchers like Craig Venter have traded in the Beagle for more comfortable research vessels. Venter is a biochemist who, among other things, helped complete the first sequencing of the human genome. He has spent his career leveraging 21st-century tools like big data for cutting-edge scientific research. But he is still drawn to the Galápagos. Aboard his personal yacht, Sorcerer II, Venter set sail in 2003 to circumnavigate the globe in a route inspired by Darwin’s voyage.
While anchored in the Galápagos, the Sorcerer II’s crew set out to collect hundreds of ocean water samples, as they had done all over the globe. The samples were then frozen, analyzed, and catalogued in massive computer databases. From there, “high-speed sequencers and supercomputers” employed the whole-genome shotgun technique, piecing overlapping fragments of DNA back together over and over again until whole stretches of genome were fully mapped. The result was the discovery of thousands of previously unknown bacterial species that have aided our understanding of marine microbiology. Venter’s work has not only put him at the forefront of ocean science; it has also exposed the bald fact that while big data is changing the world, it’s changing the oceans along with it.
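To get a feel for the reassembly step, here is a deliberately tiny sketch in Python. It is not Venter’s actual pipeline, and the DNA fragments are made up for illustration: it simply merges the pair of fragments with the longest overlap, again and again, the same basic idea that real shotgun assemblers apply to billions of reads while also coping with sequencing errors, repeated sequences, and sheer scale.

```python
# Toy illustration of shotgun-style reassembly (hypothetical fragments,
# not real sequencing data): repeatedly merge the two fragments that
# overlap the most until one sequence remains.

def overlap(a, b):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        best_n, best_i, best_j = 0, 0, 1
        for i in range(len(frags)):
            for j in range(len(frags)):
                if i == j:
                    continue
                n = overlap(frags[i], frags[j])
                if n > best_n:
                    best_n, best_i, best_j = n, i, j
        # Merge the best pair, dropping the duplicated overlap.
        merged = frags[best_i] + frags[best_j][best_n:]
        frags = [f for k, f in enumerate(frags) if k not in (best_i, best_j)]
        frags.append(merged)
    return frags[0]

reads = ["ATGCGT", "GCGTAC", "GTACGG", "ACGGTT"]
print(assemble(reads))  # prints ATGCGTACGGTT
```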
The grand irony here? Venter’s big data ocean research would not be possible without thousands of miles of deep-sea data cables that threaten the very marine ecosystems he studies.
* * *
In data-driven research such as Venter’s, we see how global data infrastructure has become the backbone of modern science, replacing the scrawled field notes and finch sketches of Darwin’s 1830s expedition. Science is no longer slow, thanks to our ever-expanding digital knowledge bank, which holds far more data than anyone could hope to analyze in a lifetime. One of the big buzzwords in tech, “big data” can seem a daunting concept. Brian Jefferson, an Associate Professor of Geography at the University of Illinois Urbana-Champaign, explains big data to his students through the three “Vs”: velocity, volume, and variety. Your regular, run-of-the-mill data can qualify for a promotion to “big data” once it’s big enough (volume), fast enough (velocity), and diverse enough (variety). An easier way to determine whether you are dealing with “regular” data or big data is to ask whether you could feasibly manage the dataset yourself. Would analyzing the data take years, decades, centuries, or even longer? If so, it’s likely big data.
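As a hypothetical back-of-the-envelope version of that test (every number below is invented for illustration, not drawn from any real dataset):

```python
# Hypothetical "could you manage it yourself?" check. All figures are
# made up for illustration.
records = 2_000_000_000                # records in an imagined dataset
seconds_per_record = 1                 # one person inspecting one record
work_seconds_per_year = 2_000 * 3_600  # roughly one full-time work year

years_of_work = records * seconds_per_record / work_seconds_per_year
print(f"{years_of_work:,.0f} years of full-time analysis")  # ~278 years: big data
```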
Using big data has largely become the responsibility of governments, companies, and research groups. These groups grapple with big data through innovative technologies for collecting, creating, analyzing, and storing data. Supercomputers, data centers, servers, algorithms, and internet cloud storage are a few examples of such innovations. By monopolizing digital technologies capable of manipulating big data, companies such as IBM, Facebook, Salesforce, and Google have built data empires worth billions. By generating consumer data, innovating new ways to use that data, and building new infrastructure to store it, tech companies enjoy seemingly infinite growth. In addition to topping the economic sector, big data, some observers predict, will provide “miraculous solutions to well-worn problems” such as climate change. In these terms, big data starts to seem like the modern messiah. Skeptics like Jefferson, however, warn us that data “… is only magical without problematizing it.”
* * *
The problems with big data become apparent when you realize that the internet and its cloud are not just digital abstractions, but a material network of concrete components: thousands of miles of fiberoptic cables, along with the raw materials required to build them, the energy-intensive data centers they connect to, and the billions of electronic devices they power. With millions of iPhones touting wireless connections to the cloud, the physical reality of a world bound with industrial cables and wires goes unseen. This is particularly true when that industrial footprint is buried under the floor of the deep sea. As one New York Times writer put it, “People think that data is in the cloud, but it’s not. It’s in the ocean.” So much data infrastructure has been installed in the ocean that, laid end to end, deep-sea fiberoptic cables could wrap around the globe 22 times. Imagine these cables crisscrossing the ocean floor, carrying endless tidbits of data, from mundane bank transactions to the President’s tweets to that often-Googled question: how does the internet work? In all, 99 percent of international data travels across the ocean floor. Each byte of data rubs elbows with marine life we aren’t even aware exists as it rides along a fiberoptic railway through the watery Wild West.
Strung across delicate and little-understood marine ecosystems, the massive undersea data industry is raising alarms. Environmental researchers such as Jean-Baptiste Jouffray, a graduate student at the Stockholm Resilience Centre, explain that “the extent, intensity, and diversity of today’s aspirations (in the ocean) are unprecedented.” Coining the term “blue acceleration” to describe the industrialization of the sea floor, Jouffray and his fellow researchers argue that the ocean has become the latest high-impact frontier of the Anthropocene. To what extent “blue acceleration” will damage marine ecosystems with noise, heat dissipation, and contamination is highly contentious, and minimal research has been done that might clarify the short- and long-term effects. In the absence of independent environmental research, cable companies have funded industry studies in support of their claim that cables are benign features of the underwater habitat.
These cable companies even go so far as to argue that it is the cable infrastructure itself that is under threat from marine life, specifically sharks. And indeed, there is evidence of sharks interfering with submarine cables. In the 1980s, cable companies reported that sharks were responsible for knocking out the same cable line on four separate occasions, making sharks the original hackers. In 2010, video footage showing sharks’ antagonistic relationship with fiberoptic cables went viral on YouTube. Viewers expressed their amusement in the comments section, making puns about the shark’s “megabyte” and blaming it for their subpar internet connection. The increased publicity seemed to reflect an uptick in shark attacks on cables, prompting Google to take action in 2014. The data giant announced that its new cables would don a Kevlar coating aimed at protecting the fiberoptic lines within from the jaws of, well, Jaws. Google’s statement mentioned little about the safety of the sharks, and it fell short of explaining why sharks are drawn to data-laden cables in the first place. Cable companies suggest that sharks, able to sense the electromagnetic fields given off by the cables, mistake them for prey. Alternatively, shark specialists say not to underestimate the animals’ curiosity. If you came across a cable in your backyard, you would probably check it out, too.
If sharks are a principal concern for cable companies, the ocean floor surrounding the Galápagos might not be the optimal place to sink a cable. What made the islands ideal for Darwin’s scientific research, the variety and abundance of flora and fauna, makes them a poor home for undersea data. Home to 32 species of cable predators, the area has “the highest abundance of sharks in the world.” Nevertheless, plans to sink the America Movil-Telxius West Coast Cable just south of Isla Isabela in the Galápagos were made in 2019. Set to be operable sometime in 2020, the cable will run down the Pacific coast of Latin America, from Puerto San José, Guatemala, to Valparaíso, Chile. The cable will represent a measly 0.01 percent of the cable wrapped around the Earth’s crust, but its impact is significant. Parts of South America previously bereft of data connection will now have access to 108 terabits of data per second. All of those 0s and 1s certainly outnumber their marine predators, but they’re still no match for a shark attack.
* * *
Should sharks cut the connection to South American homes, it won’t take long for cable companies to repair the damage. But will this same level of protection be extended to the marine life residing in some of the most biodiverse waters on the planet?
The sea floor is often imagined as a flat, watery desert, but ocean geographies and ecosystems are as diverse as those on land. Volcanoes, deep-sea geysers, and trenches are regular features. Natural disasters and “meteorological disturbances” sweep across the ocean like an indecisive artist’s brush, regularly reconfiguring the topography of the seabed. Thus, cables are not sunk into a vacuum-like void, but are forced to interact with a “dynamic and fluid external environment.” No thicker than 0.08 inches, the bare fiberoptic lines that transmit bank transactions, research data, and everything else would stand no chance against the hazards of life underwater on their own. To give the tiny fibers extra protection, manufacturers sheathe them in layers of hardy materials, and the number of layers varies according to the depth at which the cable will rest. Deep-sea cables have less girth than those closer to shore, which use added layers as protection from human interference. The movement of sediment in tidal zones creates additional turbulence for coastal-dwelling cables. To mitigate the issue, the cables are suspended between rock formations, causing them to “vibrate or strum” as they sway with the tide. These minuscule movements strain the suspension system over time, wearing on the integrity of the supporting rocks and potentially disrupting rock-dwelling life.
Cables aren’t always homewreckers, however. Their hard outer layer often makes a perfect underwater home on the otherwise soft, sedimentary ocean floor. Specifically, the cables attract “encrusting marine biota,” such as mollusks and anemones. While some of these adhesive creatures are lucky enough to find a forever home on cable lines, not all cable-dwellers are so fortunate. Should the line become damaged or inoperable, repair or replacement will disturb the colony of life it has attracted. And for the mollusks and anemones that do remain, another threat looms: The hard surface that the sea creatures prefer is often made of polyethylene, the most common type of plastic. Polyethylene has a controversial track record; researchers have long debated whether the plastic leaches harmful chemicals when submerged in water. Initial studies suggest that polyethylene leaching is undetectable in the short term; cables would need to be submerged for centuries before significant impacts from leaching take place. This, in turn, raises a key question: With life spans of approximately 25 years, how long will cables actually be in our oceans?
Cables and data infrastructure might be useful for only a quarter century, but it has become common practice to leave them in place after they have stopped working. Senay Boztas, a writer for The Guardian, revealed that an estimated “94 percent of unused cables and 72,000 repeaters are abandoned on the seabed.” Having reached the end of their life cycle, the cables should be stripped of their expensive raw materials. But most data giants are choosing to leave the cables for dead. Owing to the lack of regulation over international waters, there is no hope of enforcing cable clean-up: “It is like the Wild West in the middle of the ocean,” according to one of Boztas’ sources. While some companies are beginning to take the initiative and extract their outdated cables from the ocean floor, there is disagreement over whether cable “recycling” is cost-effective or beneficial for the environment. Because extraction is a difficult task, companies must decide whether the energy and labor are worth the few million dollars the recovered materials might fetch. Many cables become biologically cemented to suspension points or buried under feet of sediment, which complicates extraction and can disrupt marine life in the process. Leagues above the site of extraction, the cable ship chugs along, burning hefty amounts of diesel fuel. For these reasons, some marine biologists suggest that abandoning the cables in their underwater graves might be less damaging overall to marine life, especially the creatures that have made a home on the cable. However, there is still significant controversy, as researchers acknowledge that “little is known about the long-term impacts of leaving the cable abandoned.” Scientists are asking what marine ecosystems will look like after centuries of absorbing polyethylene leached from undersea cables. Granted, it may so happen that no one is around to find out. In the meantime, I wonder: can we do better?
* * *
There is so much we don’t know about undersea cables and data networks: What are the short- and long-term effects of cables on marine environments? How durable are fiberoptic cables in dynamic undersea environments? In spite of these uncertainties, data companies continue to move at full speed. In 2015, Microsoft launched Project Natick with the hope of building a “standard, manufacturable, (and) deployable” underwater data center. Phase II of the project is now underway in Scotland, where the team is testing the data center’s capability to run unmanned on the ocean floor for a span of five years. On the project’s website, a sustainability agenda is at the forefront, with the vision of a fully recyclable facility that can be re-outfitted every five years and sunk back into the sea. Bathed in the cool waters of the ocean, Natick forgoes the need for energy-intensive cooling systems. What little power the center does require is obtained from renewable energy sources, such as onshore and offshore wind farms. In all, the team is optimistic that Natick data centers can be “truly zero emission (with) no waste products … (from) power generation, computers, or human maintainers.” Microsoft goes so far as to say that its underwater data center operates as a “shelter for local wildlife.” Regardless of the merit of this claim, Natick is leagues ahead of land-based data centers, whose carbon emissions are on par with those of commercial airlines. In the mostly murky waters of the oceans’ future, Microsoft’s Natick is a glimmer of hope for sustainable data infrastructure.
Leagues below the ocean’s surface, beneath the algal blooms and gyres of trash, data is coursing through fiberoptic cables. Some of this data is vital (spatial data on Arctic fires, satellite images of Earth’s thinning ozone layer, statistics on the spread of COVID-19); some, not so much (pet videos, anyone?). Ungoverned and unsupervised, the data industry’s colonization of the seabed seems unstoppable, except, perhaps, by the sharks. While cable companies assure us in soothing tones that the “impact … is … minor,” it becomes increasingly apparent that big data is hugely disruptive to global ecosystems, and to our vulnerable oceans especially.
Scientific research has never been without cost. In Darwin’s day, natural history research meant killing specimens in order to observe them. Some of the Galápagos tortoises Darwin studied ended up in museum displays in London, and some even on Darwin’s dinner table. It remains to be seen whether big data will be our saving grace or our damnation. For now, it’s hard not to be struck by the irony of the big data ocean grab. In 10 years’ time, will ocean researchers like Craig Venter still be storing their data on marine life in the very waters from which they collected it?
* * *
EDITOR’S NOTE: Since the original reporting of this article, Microsoft proudly announced that it has pulled its data center out of the water.
About the Author …
Shelby Cheyenne Job is from Fayetteville, Ga. She graduated with a B.A. in English and B.S. in Earth, Society, and Environmental Sustainability. She currently works as an Events & Programming Assistant for the University of Illinois Research Park. This piece was researched and written for ESE 498, the capstone course for the Certificate in Environmental Writing, in Spring 2020.
WORKS CITED
https://www.jcvi.org/research/gos
https://www.pbs.org/wgbh/evolution/library/01/6/l_016_02.html
https://www.researchgate.net/publication/312597999_Can_big_data_tame_a_naughty_world
https://www.nytimes.com/interactive/2019/03/10/technology/internet-cables-oceans.html
https://phys.org/news/2020-01-blue-colossal-human-pressure-ocean.html
https://www.youtube.com/watch?v=1ex7uTQf4bQ
https://www.submarinecablemap.com/#/submarine-cable/america-movil-telxius-west-coast-cable
https://www.noaa.gov/education/resource-collections/ocean-coasts/ocean-floor-features
https://www.unep-wcmc.org/resources-and-data/submarine-cables-and-the-oceans–connecting-the-world
https://www.escaeu.org/download/?Id=329&source=documents
https://www.crn.com/news/data-center/microsoft-s-underwater-data-center-a-success-azure-ahead