Books Trending News – Guaripete | Online Store
- Gulls and Us
- The Known Known
- Resistance Means More Than Voting
- The Post-9/11 Generation
- Writers’ Cribs
- Cixin Liu, China, and the Future of Science Fiction
Posted: 11 Sep 2018 07:05 AM PDT

New York Harbor, August 2018

I tried to give up birdwatching for a time in my early twenties. The scene of my abandonment was a gull roost on a reservoir. From a hide on the banks of Chew Valley Lake, outside Bristol, England, I often watched gulls coming to land on the water to find a place to sleep. Many common gulls did this and, occasionally, one or more ring-billed gulls, vagrants from North America, got caught up on the wrong side of the Atlantic, and came to rest with the regular crowd. In the hide, if others were with me, there was a chance of picking out a ring-billed gull. But on my own, I never managed it. My desire to see a ring-billed gull prevented me from seeing one in the shakedown of thousands of birds arriving out of the dying light, and then the exquisite distribution of these white forms across the darkening water. I felt stupid to have misread the truth of the scene. Any ring-billed gull that was there was a barely significant accident in the great arrival of local birds that was going on. It was them that I should have attended to and not the putative vagrant in their midst. I put away my binoculars for much of the next three years.

Before I quit, I had seen most of the gulls then known in Britain. The gulls I grew up with were herring, lesser black-backed, great black-backed, black-headed, common, and kittiwake. The gulls I saw as a teenager, when I began to chase after rare birds, were little, glaucous, Iceland, Mediterranean, Sabine's, ring-billed (I was shown one eventually at Chew), Bonaparte's, Franklin's, laughing, Ross's, and ivory. Some are Arctic species; some are American.
Recent revisions to our taxonomic understanding of birds, and resulting changes to the record that include the splitting and lumping together of species—some birds previously thought subspecies have been redesignated as separate species; others believed to have been different have been shown to be less distinctive than once thought and have been taxonomically demoted, as it were, to subspecies—have released more gulls that might now be seen in Britain. (There aren't actually more gulls; they've simply been given new names and species status.) Most notable among the additions are yellow-legged and Caspian gulls. Previously, both these birds were believed to be races or subspecies of the herring gull. There are also smaller numbers of other split species, assorted subspecies and hybrids, including American herring, Thayer's, Baltic, Azorean, Viking, Nelson, and more with names still stuck in Latin. All are hard to spot and few are easy to tell apart. Yellow-legged and Caspian gulls are the species whose observed presence in Britain has prompted new interest among birdwatchers in the family as a whole.

The first rare bird I sighted was a Mediterranean gull at Oxwich on the Gower Peninsula in South Wales. I was twelve. My friend Richard was with me, and we wrote a description of what we saw and had our bird accepted by the county recorder, so it found its place in the county bird report for that year. After the Mediterranean gull, I got serious and "twitched" other rares: an ivory gull, polar-white, at Chesil Beach in January 1980; two Sabine's gulls at Sheringham on one wild autumn sea-watch; a laughing gull at Yarmouth (or was that a Franklin's gull?); a Bonaparte's somewhere, perhaps in Cornwall. I should check my notebooks, except that I wasn't a good diarist then, especially not on the chase, when everything in front of me was too exciting to be captured by standing to one side of it and writing it down.
Besides, I was intimidated by the records of my fellow birdwatchers: the beautiful lifelike sketches of Laurel Tucker, who was often in the front seat of the car we shared from Bristol to get to the target bird; or the meticulous counts and annotations of Antony Merritt, who wrote in pencil and then sprayed his pages with fixative to preserve his lists, and who now is sick and has been, for decades, cruelly confined indoors.

*

It was a gull that also got me back into birdwatching. I worked for a time in bird conservation after I left university. Birds were my day job for three years, and usually that left me wanting to do other things when I wasn't at work. But the lure of some species never dims. An adult Ross's gull—an unbelievably pink bird—arrived in Norfolk in May 1984 and my boss, Nigel Collar, was similarly surprised by his resurgent appetite; he drove us the hundred miles to collect the prize at Titchwell where it sat, like a melting raspberry-ripple ice cream, on a muddy island.

Lower Manhattan, August 2018

Not long after that rarity, I noticed how the herring and lesser black-backed gulls had set up shop in my hometown of Bristol. The city is only a few miles from the Severn Estuary, and the River Avon, a tidal finger, drives daily into its heart. Among Bristol's sound-signatures are the birds' marine yelps as they navigate the Avon and the Feeder and the Cut and the other channels that broker the meeting of fresh and salt water. But in the 1980s, when I left the city (to study, and before I returned, with a young family, to work), the gulls had started something new—breeding on rooftops across its center. In that time, well within the lifespan of individual birds, a profound change gripped both species: urban gulls came of age.

Near the center of Bristol, until a few years ago, there was an ice rink and music venue in a chunky building complex with a flat roof put up in the late Sixties. I skated there once as a teenager.
In 1978, next door at the Locarno, I watched Suicide open for The Clash, and enjoyed the electronic Americans more than the jangly Brits. There were no gulls at that time, as far as I recall. By the Nineties, you couldn't miss the gulls. From early spring until late summer, the roof was always busy with herring and lesser black-backeds. Standing next door in the multi-story car park, you could easily see fifty or more birds.

I began to watch the colony during my lunch breaks. Level G, the highest in the car park, was open to the sky and offered a panorama of the city center; in midsummer it was good for gulling. The first afternoon I went to the car park was hot and muggy; the felted roof grew sticky. A shrinking puddle of water attracted three young lesser black-backeds. They sipped at it. One still had down around its teenage face—"bum fluff," we used to call it in my Locarno days. I could see that almost every suitable flat roof across the city center had breeding birds on it. There was even a herring gull nesting between the hooves of one of the two golden unicorns prancing on top of the city's Council House. Someone had planted a two-foot-high plastic great horned owl on the far end of another roof to scare the birds off, and I was pleased to see another herring gull dozing on its nest at the base of the shit-spattered decoy.

The gulls seemed particularly drawn to this area, as if they divined its hidden waters, where the last reaches of the tidal river run culverted and capped below the streets. In fact the water is not so important to them. They had made a nutrient-rich sea out of the city's food waste and a marine archipelago out of its unlovely rooftops. At the ice rink, jackhammers were demolishing something with a noise like the bombard of a shingle beach at the foot of a cliff. The hum of the rink's cooling fans added to the mix. The roof, with its black-tarred felt, was like the lava plug of a volcanic island.
Dripping in the July sun at the back of the rink was a small mountain of dirty, cast-off ice, like a wayward iceberg fretting at a northern shore.

A herring gull foraging on Guernsey, August 2017

The nesting gulls preferred the edge of the roof above, where a shallow trench had grown a thin scab of mud and a drift of bottles. Earlier in the year, I had watched gulls fly over the city with mouthfuls of bright green moss or fresh grass. They were busy then, building or repairing nests. The rooftop assembly looked casual and messy, a shanty town built on a sewage works, but it was a formalized space: a place of territories, rituals, and hierarchies of age and of species. There was even a dance floor.

An adult lesser black-backed gull landed near its two young with a piece of grey chicken skin flapping in its beak. One youngster grabbed it and bolted it down. Other young birds bleated hungrily; some tried their wings and made circuits of the roof; others walked about like disconsolate children trailing home after a summer day on the beach. Flying ants crashed everywhere and loafing gulls picked at them. Between begging calls, the young birds made more practice launches, flapping their wings and jumping. Paired adults were re-forming their relationships; returning birds went in for bouts of head flicking and kissing. Neighbors were in dispute, caterwauling above the din. Moaning babies were held off by stabs to the head by their mothers or fathers, but they were tenacious and advanced again, pecking at their parent's red bill-spot. A young bird walked backwards like a hypnotist leading its zombie parent, and eventually a meal was sicked up. A loitering carrion crow flew in, but the adult gull, broken from its spell, saw it off.

In a way, it is fitting that the gulls have made Bristol their home. The city that brought half of the Atlantic to Britain—slaves, sugar, and tobacco—has drawn seabirds, as well, into its heart.
The gulls are canny opportunists and worthy embodiments of the spirit of the place. And people hate them for it.

The gulls' boom in the northern hemisphere has coincided with industrialization and urbanization. By moving onto the rooftops of our buildings and by sourcing our edible refuse at rubbish dumps, gulls—herring gulls more than any—have gained unique admission into our habitat. No other relatively large wild animal is more commonly encountered. Pigeons in cities and pheasants in rural areas compete as members of what has been called a "slum avifauna," but both those birds have surrendered to human terms and conditions (pigeons have abandoned the sea cliffs they once nested on; pheasants are reared in their millions like farm animals). Gulls are still wildlife; they are still called seagulls, and are associated with waves and saltwater by most people, but over the course of the last hundred years, especially around the North Atlantic, they have come ashore. In part, they elected to fly inland; in part, we made them do so. They have lived in our slipstream, following trawlers, ploughs, garbage trucks. And they continue to tell how something of the once-wild can share our present world. They survive as we do now, walking built-up zones and grabbing a bite where they can.

In 2016, a herring gull was dyed orange after it fell into a vat of curry in Newport in South Wales. In July 2018, herring gulls were reported drunk along the Devon and Dorset coasts. This novel behavior largely disturbs us: we have started to fear that gulls get along with us too well. Nowadays, gulls are increasingly thought of not as seabirds but as trash birds, the sub-natural inhabitants of what MIT professor of urban design Alan Berger called "drosscapes," déclassé and mongrelizing in their habits. We see them as scavengers, not as entrepreneurs—as aliens, not as refugees. They steal our chips and kill our Chihuahuas. They are too big for the world they have entered.
Some of this distaste is particular to the times, and some is a resurgent rivalrous antagonism that almost any other creature on Earth can trigger in our species—that dark loathing we can find in ourselves for any nonhuman life. Yet, even besmirched like this, the gulls keep us company. And they'll be with us, we feel, for the duration of this, our late hour.

Kelp and Hartlaub's gulls flocking to a trash heap, Cape Town, South Africa, December 2017

Landfill means more than just a tip for the end of things. It is also a description of how we have worked the living world, learned about it, named and catalogued it, and have thus occupied or planted our planet, filling the land. F. Scott Fitzgerald knew the early landfills of New York, and put one in The Great Gatsby:
Fresh Kills on Staten Island might have auditioned for Fitzgerald. It sounds like a perfect name for a dump, but "kill" is a Dutch word for stream. Once the largest manmade structure in the world, and briefly reopened to take the rubble of the 9/11 World Trade Center attacks, the landfill at Fresh Kills has now been plugged and capped; it is being metamorphosed over time into an urban park, where the water may once again be clean. The same has happened at the equally well-named (and one-time premier gull spot) Mucking in Essex, England.

At the point we identify anything as waste, even though up until then it has been ours, we don't want it and we don't like it. Anything can arbitrarily become waste or dirt in this way. Dirt, in Mary Douglas's memorable formulation (developing William James's thoughts) in Purity and Danger, is simply "matter out of place." And therefore places designated for dirt stir complicated emotions. "The waste remains," wrote William Empson in his 1937 poem "Missing Dates." The phrase is an infectious refrain that slides through the villanelle: "the waste remains and kills." But waste doesn't necessarily kill—it can nourish and sustain. Dumps where plastic junk lives on horribly, and toxins persist, are also strangely lively places. We are a waste-making species like no other, but we are also workers of waste: recyclers, ragpickers, archivists, librarians, archaeologists, historians, bricoleurs, and gullers.

For several winters, I went ringing gulls at Pitsea landfill in Essex, furling them under an arm, their beaks to my back, so I could fix rings on their legs to keep track of individuals. I spent other days counting urban breeding gulls on city rooftops across Britain. I watched summer roosts and winter roosts on reservoirs and gravel pits. I tracked gull flightlines from the top of double-decker buses. I read rings on any birds I saw anywhere. I pored over the mind-numbingly detailed field guides that pro gullers have memorized.
I stood close to them and their telescopes, trying to absorb something of their acuity and their passion. I spent days online looking at their field photography. I tested myself alone on muddy beaches and blasted sea-watches, in city parks and gutted factories. I watched what might have been Chekhov's seagull begging for scraps on the terrace of his little seaside cabin in the Crimea. I ate the soft-boiled eggs of a black-headed gull; they tasted like a tart marsh. And I did see a yellow-legged gull, or rather was shown one, then several. While they were in front of me I could tell them apart; now I am not sure. I have yet to clinch a Caspian.

Coney Island, Brooklyn, March 2012

Gulls in cities are doing well, but surviving coastal populations are not. And food for all the birds is looking meager. Populations boomed during the throwaway decades of the 1960s and 1970s. But along the coast, fish processing has all but finished, and ended a source of gull food. Waste barges with their headaches of hungry gulls no longer float down an open sewer from London to Essex. The dumps are being grassed over and converted into parks. Recycling or incineration of food waste is now commonplace, and edible trash in landfills is rare. The men who go to Pitsea landfill to trap and ring gulls must negotiate with the garbage men, holding back a truck with restaurant leftovers until they have readied their ringing nets.

We desire a cleaner world, for ourselves and for wildlife. Because we have gotten better at managing our trash, there are salmon and seahorses in the Thames near Pitsea, and that is surely good. But for species that have come to depend on our garbage, these efforts are another change to which they must adapt. There has been a gull moment, and it is coming to an end.
*

On December 1, 1963, just two weeks after the new island of Surtsey had first appeared from beneath the sea off southern Iceland, gulls were seen to land on its cooling volcanic waste. They were the first life form to set foot on the newest addition to the land surface of the Earth. Thereafter, they continued to visit the bad black tooth, and in time they have turned it, in part at least, green. I can't trace the specific identity of those first-footers, but five gull species have since bred on Surtsey: great black-backed gulls first reared young in 1974, kittiwakes in 1975, herring gulls in 1981, lesser black-backed gulls in 1983, and glaucous gulls in 1993. Most (apart from the kittiwakes) live in a busy mixed colony on the south side of the island. A count in 2003 noted 301 pairs of gulls, mostly lesser black-backeds.

All these gulls make nests, and on Surtsey they tore up the pioneer plant life to do so. But, as time went on, they also planted more. A study of soil development showed that, within their colony, the gulls transfer 45–50 kg of nitrogen per hectare from sea to land every year, while the barren areas surrounding the colony receive only 1–2 kg as atmospheric deposits. The birds fertilize the ground: they defecate, they regurgitate, and they drop food remains that they have found elsewhere, and their empty nests compost back into the deepening soil. They are landfilling.

I spent a day and a night on Surtsey in 2003 to mark the island's fortieth birthday. The raw material of the Earth, the coughed-up guts of our planet, was still much in evidence. But the landscape was already aging and shrinking, its foundation stone eroding. The sea was eating at the island's friable edges; when I saw it, the island was half the size it had been in 1967, when the eruption stopped. I tried to trek its length.
I felt as though I were walking an autopsy—Surtsey's grizzled lava has a grey just beneath its skin that looks like death, with a few patches of urinous yellow and rust. My boots were shredded by the rough climb; I cut my hand on a lava snag. On the bald summit the sea wind's harsh blow met foul air smoking from the hot cracks that riddle the rock. My head spun, my lips chapped. There were no birds.

I slid and skittered toward the south. There, in a sheltered bowl, wind-baffled and out of earshot of the sea, the gulls had planted an extraordinary green sanctuary. A dozen lifted from their nests and ugg-ed at me. Their downy young were hidden in the long tangle of meadow grasses and plants. The gulls' agitated grunts were the only sounds I could hear, and they stopped after I lay down on the soft bed the birds had made. I could smell the chicks, the warm dust from the cracking cases of their new feathers, the cooked greens that they shat, and the ozone-chlorine tang that their nearby parents gave off. The sky came blue above and the birds returning to their nests drifted silently overhead. And there, in the gulls' nest, I fell asleep.

This essay is an edited excerpt from Landfill, to be published in the UK in October by Little Toller Books.
The Known Citizen: A History of Privacy in Modern America by Sarah E. Igo. Harvard University Press, 569 pp., $35.00

Habeas Data: Privacy vs. the Rise of Surveillance Tech by Cyrus Farivar. Melville House, 281 pp., $27.99

Beyond Abortion: Roe v. Wade and the Battle for Privacy by Mary Ziegler. Harvard University Press, 383 pp., $45.00

Privacy's Blueprint: The Battle to Control the Design of New Technologies by Woodrow Hartzog. Harvard University Press, 366 pp., $35.00

An unmarked camera drone above Toronto, Canada, May 2018

1.

In 1999, when Scott McNealy, the founder and CEO of Sun Microsystems, declared, "You have zero privacy…get over it," most of us, still new to the World Wide Web, had no idea what he meant. Eleven years later, when Mark Zuckerberg said that "the social norms" of privacy had "evolved" because "people [had] really gotten comfortable not only sharing more information and different kinds, but more openly and with more people," his words expressed what was becoming a common Silicon Valley trope: privacy was obsolete. By then, Zuckerberg's invention, Facebook, had 500 million users, was growing 4.5 percent a month, and had recently surpassed its rival, MySpace. Twitter had overcome skepticism that people would be interested in a zippy parade of 140-character posts; at the end of 2010 it had 54 million active users. (It now has 336 million.) YouTube was in its fifth year, the micro-blogging platform Tumblr was into its third, and Instagram had just been created. Social media, which encouraged and relied on people to share their thoughts, passions, interests, and images, making them the Web's content providers, were ascendant. Users found it empowering to bypass, and even supersede, the traditional gatekeepers of information and culture. The social Web appeared to bring to fruition the early promise of the Internet: that it would democratize the creation and dissemination of knowledge.
If, in the process, individuals were uploading photos of drunken parties, and discussing their sexual fetishes, and pulling back the curtain on all sorts of previously hidden personal behaviors, wasn't that liberating, too? How could anyone argue that privacy had been invaded or compromised or effaced when these revelations were voluntary? The short answer is that they couldn't. And they didn't. Users, who in the early days of social media were predominantly young, were largely guileless and unconcerned about privacy. In a survey of sixty-four of her students at Rochester Institute of Technology in 2006, Susan Barnes found that they "wanted to keep information private, but did not seem to realize that Facebook is a public space." When a random sample of young people was asked in 2007 by researchers from the Pew Research Center if they had any concerns about publicly posted photos, "most…said they were not worried about risks to their privacy." (This was largely before Facebook and other tech companies began tracking and monetizing one's every move on- and offline.)

In retrospect, the tendencies toward disclosure and prurience online should not have been surprising. As Sarah Igo observes in The Known Citizen, her masterful study of privacy in the United States, the sharing and oversharing of intimacies predates the social Web; indeed, the social Web simply allowed these behaviors to proliferate on a more open and accessible platform. Igo cites the enormous popularity of An American Family, a documentary doled out in twelve installments on public television in 1973, as one of the earliest cultural watersheds in Americans' changing appreciation of privacy. Culled from the filmmakers' seven-month immersion in the day-to-day lives of an ordinary family, the Louds of California, the series suggested that nothing was off-limits on TV: the Louds' marriage fell apart; their son came out as gay; his father's infidelities were exposed.
Part of what made this so sensational was that, by making the private public, voyeurism and exhibitionism became mainstream entertainments. (Decades later, with webcams built into computers, peering into other people's homes and lives no longer seems all that unusual.) Igo also points to the influence of confessional talk shows, like Phil Donahue's in the 1970s and Oprah Winfrey's in the 1980s and beyond, where guests opened up about previously taboo subjects such as incest and spousal abuse. The public also had a voracious appetite for revelatory memoirs, a genre that grew exponentially as writers, famous or not, offered up increasingly startling, true—or possibly true—confessions of drug addiction, alcohol abuse, childhood traumas, sexual misadventures, and failures of every stripe. Igo writes:
When the social Web came along not long afterward, people were primed to participate. None of this meant that Americans—ordinary people and lawmakers alike—were unconcerned then, or at any time, about what they perceived to be actual and potential incursions into their private lives. Government overreach into the homes and lives of citizens animated the Framers, and their intent has been debated in the courts and Congress ever since. As Cyrus Farivar writes in Habeas Data: Privacy vs. the Rise of Surveillance Tech, a lively catalog of privacy-related court cases and laws that have arisen alongside new technologies, "Nowhere in the Bill of Rights, or in the Constitution, is the word 'privacy' mentioned. But scholars, lawyers, judges, and others have intuited, or extrapolated, something resembling a privacy right from both documents." "The right to be let alone is indeed the beginning of all freedom," Supreme Court Justice William O. Douglas wrote in 1952, echoing the words of Justice Louis Brandeis, who, nearly a quarter-century earlier, wrote in his dissenting opinion in Olmstead v. United States, "The right to be let alone [is] the most comprehensive of rights, and the right most valued by civilized men." Brandeis and his law partner, Samuel Warren, are credited with inserting modern ideas about that right into American jurisprudence when in 1890 they published a Harvard Law Review article titled "The Right to Privacy." Inspired by a new technology—the camera—and the widespread, unauthorized dissemination of photographs taken by prying tabloid journalists, Brandeis and Warren decried the
The law needed to respond as new technologies brought about new invasive practices, they argued, because "political, social, and economic changes entail the recognition of new rights." As a Supreme Court justice, Brandeis had the opportunity to pursue these ideas further a few decades later in Olmstead. In that case the new technology at issue was the telephone and the presumption of privacy it did—or did not—afford. The petitioner, a bootlegger under surveillance by law enforcement, argued that his constitutional rights had been violated when investigators listened to his calls without a warrant. The Court's majority did not see it that way. Since the government's wiretaps did not entail an actual physical breach of the man's private property, they found no injury. In his dissent, Brandeis chided his colleagues for failing to take into account the development of new technologies unimagined by the Framers:
But then he made a prophecy that perfectly anticipated our current, post-Snowden moment, when both the courts and the country grapple with government surveillance of e-mail, cell phones, and other electronic devices:
In Habeas Data, Farivar illustrates how this "someday" is now here, as he examines the privacy issues arising from such relatively new technologies as license plate readers, security cameras, drones, stingrays (devices that mimic cell phone towers in order to intercept calls), stingrays mounted on drones, Doppler radar, facial recognition, and persistent surveillance systems—cameras mounted on airplanes that can see and record what's happening on the ground. The stories he tells, often about how these technologies are used by the government to spy on its citizens, are all the more chastening because, for the most part, they are legal. The problem, as Farivar points out, is that "absent a department policy or state law specifically forbidding a particular practice or regulating a particular technology, law enforcement will always push the limits until they are told to stop." The deployment of new technologies frequently precedes their regulation, or even the public's knowledge that such technologies exist or are being used. All too often, by the time these practices are challenged in the courts or addressed legislatively, the specific technology at issue has been retired and replaced by something new and, very likely, more invasive. For instance, facial images of more than half the American population already reside in various government databases, collected from such benign activities as renewing a driver's license or reentering the country from a trip abroad, but now Axon, the nation's largest supplier of police body cams, has just announced that it is considering adding facial recognition to its cameras. Facial recognition software, which is powered by artificial intelligence, is notoriously unreliable in identifying people with dark skin. Even so, at least one company is adding an "ethnicity detection" algorithm to its facial recognition software. 
The Department of Homeland Security (DHS) is taking this further, according to the Electronic Frontier Foundation, by using
These will be part of a new, comprehensive DHS database that will include, in addition to facial images (supplied by, among others, airline companies), fingerprints, iris scans, DNA data, descriptions of physical anomalies (scars, tattoos), and maps of individuals' affiliations and relationships culled from social media. The DHS will share this database with local and state law enforcement departments, other federal agencies, and certain foreign governments.

2.

An argument could be made that a different once-novel technology—the birth control pill—was at the root of the other major privacy issue that bedevils both the courts and society: abortion. In 1965, five years after the Pill was approved by the FDA, a case was brought before the Supreme Court challenging a Connecticut law that banned the use of contraceptives by married couples, setting in motion developments that led to the Court's Roe v. Wade decision legalizing abortion eight years later. At issue in the 1965 case, Griswold v. Connecticut, was the most basic measure of privacy, the one identified by the authors of the Fourth Amendment: the right to be let alone, free from government interference in one's home. "Would we allow the police to search the sacred precincts of marital bedrooms for telltale signs of the use of contraceptives?" Supreme Court Justice William O. Douglas wrote, striking down the Connecticut law. "The very idea is repulsive to the notions of privacy surrounding the marital relationship." Marriage, Douglas said, compelled a "right of privacy that was older than the Bill of Rights—older than our political parties, older than our school system."
And while the privacy right the Court was asserting in Griswold pertained specifically to the legal union of a woman and a man, and would not be extended to unmarried couples for another seven years, other language in the decision—especially the vague yet encompassing notion that "the specific guarantees in the Bill of Rights have penumbras, formed by emanations from those guarantees that help give them life and substance"—set the course for future privacy claims.

The Supreme Court issued its decision in Roe v. Wade in 1973, the same year PBS broadcast An American Family (and only a year after the Court broadly legalized contraception outside of marriage). No other decision has proved more contentious or more socially and politically fracturing. This past spring alone, Mississippi, Indiana, Iowa, and Kentucky passed laws aimed in different ways at gutting it. Since the Reagan years, and even more so now under Donald Trump and Mike Pence, overturning Roe and outlawing abortion has been an organizing principle of the Republican Party. But in 1973, when the Court issued its ruling, only two justices dissented. Writing for the majority, Justice Blackmun noted:
The privacy right established by Roe was not absolute. In the words of the Court:
This, as we've witnessed in the intervening years, is the legal wedge that the anti-abortion movement has used to pick apart Roe. Yet when the decision was issued, even some feminists, including Ruth Bader Ginsburg, were dismayed that the case had been argued and decided on privacy interests. To them, relying on claims of privacy enabled the Court to bypass more enduring and inclusive arguments based on sexual equality and self-determination, as well as to ignore the structural reasons poor women, especially, would face difficulties in obtaining abortion services. This became even more apparent in 1976 when Congress passed the Hyde Amendment, blocking federal Medicaid funding of abortions (except in cases of rape or incest, or if the woman's life is endangered by the pregnancy).

Nonetheless, many feminists and feminist organizations like the National Organization for Women, Planned Parenthood (which had been the plaintiff in Griswold), and NARAL supported the privacy argument. In the estimation of the legal scholar Mary Ziegler, they then reinterpreted it, claiming that Roe conferred upon women a right "to choose" and a "right to control her own body," though these constructions do not appear in the decision itself. "To some extent, the connection between freedom of choice and privacy was implicit in the Roe decision," Ziegler writes in Beyond Abortion: Roe v. Wade and the Battle for Privacy:
It wasn't only abortion advocates who found Roe's privacy language both elastic and useful. Ziegler cites example after example of groups that adapted Roe for causes far removed from terminating a pregnancy. These included those advocating the right to die, the right to use unproven medications, and the right to refuse medical treatment, including mental health services. Underlying all of these was an interpretation of Roe that found that the right to privacy—a right that has been shown to be both pliable and capacious—included the right to control one's own body. These post-Roe movements occurred at a time of heightened public awareness—and wariness—of government intrusions into people's lives. COINTELPRO, the FBI's covert surveillance of domestic political figures and groups, had been exposed in 1971. The Watergate break-in happened the following year. Two years later, Congress passed the Privacy Act of 1974, which, despite its many exemptions, was intended to give people the right to know what information was contained in their government records and the ability to amend them when they were incorrect, and prohibited federal agencies from sharing their records without citizens' consent. According to a Justice Department statement explaining the provenance of the law, the Privacy Act came out of a Congress eager to curb
The Social Security number had been looked on with suspicion as soon as the Social Security Act was signed by Franklin Roosevelt in 1935. Here was the state collecting personal information (religious affiliation, marital status) on ordinary citizens: What might it do with that information, and what might others (like employers) do with it? Despite assurances from the Social Security Board that these records would be closely held, and despite the Board at first keeping them out of reach of the police, federal agents, and others who would find them of use to their investigations, its position weakened during the war, opening up a Pandora's box of information—not only for the military looking for deserters, but police searching for escaped convicts, the FBI tracking down suspects, the IRS pursuing tax cheats, and investigators hunting Nazi war criminals. (This, as we've seen, is a pattern during times of heightened national security—after September 11, the government's surveillance of citizens increased under the Patriot Act.)

['Webcam portraits,' from a series by Roman Drits, 2011]

The administrative state was just getting started. In the postwar period, computerization and centralized record-keeping became the norm, and with them came reawakened public recognition of the potential for and dangers of what the late Columbia professor Alan Westin, in his book Privacy and Freedom (1967), called "data surveillance." When the Social Science Research Council and the Bureau of the Budget proposed combining the records of the Internal Revenue Service, the Census Bureau, the Social Security Administration, the Federal Reserve, the Bureau of Labor Statistics, and a slew of other federal agencies, the proposal was met with fierce public resistance. There was a growing sense, as Sarah Igo suggests, that the government knew its citizens better than they knew themselves. But it wasn't just the government.
As advertisers turned to psychologically determined market research and employers relied on obscure personality tests, people became aware of the ways in which businesses had begun to encroach on something more intimate than even the bedroom: their minds. In Igo's estimation:
3.

Those threats have escalated and multiplied, especially now that cloud storage and artificial intelligence have enabled the collection and analysis of vast amounts of data. The DHS alone, for example, has fingerprint information on 220 million individuals and processes 350,000 fingerprint transactions a day. The NSA has been hoovering up data as well. In May, when the office of the director of national intelligence released its annual transparency report for 2017, it revealed that the number of Americans targeted for surveillance under section 702 of the Foreign Intelligence Surveillance Act of 1978, which allows the warrantless surveillance of digital communications, had tripled in the past year. Among other things, last year the NSA collected the metadata from 534 million phone calls and text messages. As we learned—yet again—from the Cambridge Analytica scandal, privacy is also imperiled by the companies that have built their businesses by gathering, trading, and selling personal data. In May, just two months after Facebook apologized for the unauthorized appropriation of at least 87 million user profiles by Cambridge Analytica and explained that its policies regarding data-sharing with third parties had changed in 2014 (so that users and their friends were now safe from that kind of violation), the company admitted that it had allowed at least two hundred other apps access to its users' data without their knowledge. Then it turned out that the company was also sharing users' personal information with Apple, Microsoft, Amazon, and nearly sixty other device makers, even when users had denied Facebook permission to share this information with third parties. One of these, the Chinese telecommunications company Huawei, is considered by US intelligence to be a security threat. Around the same time, Facebook transferred 1.5 billion user profiles from its international headquarters in Ireland to its American offices in California.
This was a few weeks before the European Union's General Data Protection Regulation (GDPR) was to go into effect, enabling the company to follow (in the words of Mark Zuckerberg) the "spirit" of the GDPR for those users, while sidestepping the strict mandates of what has been called "the most profound privacy law of our generation." It also allows the company to avoid, for the most part, the considerable fines levied on firms that do not comply with the GDPR. Facebook is not alone in trying to bypass strict privacy regulations. Recently, Google's parent company, Alphabet, mounted a campaign to neuter what is considered to be "the most rigorous consumer privacy statute in the country," the Illinois Biometric Information Privacy Act, which allows consumers to sue companies that use their biometric data without their consent. And Google itself, which in 2017 promised no longer to read users' e-mail, in fact still allows third parties to read users' e-mail. Add to all these incursions one more: it has recently been discovered that US cell phone carriers have been supplying customers' real-time location data to a company called LocationSmart that, in turn, has been selling it to other obscure and sometimes shady companies. One of them, Securus, enables its customers to track anyone who carries a mobile device, anywhere, at any time, and without a warrant. Money trumps privacy, not because consumers want that, but because in this new world of "surveillance capitalism" riches are to be had by luring people to use "free" and intentionally addictive products and then, with their consent or not, sucking up every possible bit of information about them. In Privacy's Blueprint, Woodrow Hartzog suggests that companies might be required to take "privacy values" into account and build them into the designs of their products. This, in fact, is one of the mandates of the GDPR, a law that does not extend to the United States or other non-EU countries. 
Considering the prevarications of Facebook, for example, as it continually diminished users' privacy over the years while claiming to protect it through obscure "privacy policies" that gave the company new ways to monetize people's personal information, one wonders how feasible this is. We cannot rely on lawmakers to adjudicate these matters. In the words of Senator Mark Warner, "If you leave us to do this on our own, we're gonna mess it up." Yet the House of Representatives voted once again not to fund the Office of Technology Assessment, the bipartisan agency that provides assistance and advice on technical matters. In 2016, the TRUSTe/National Cyber Security Alliance Consumer Privacy Index found that 92 percent of US Internet users were concerned about their online privacy, and that "worries over online privacy topped the loss of personal income by 11 percentage points." This suggests that Mark Zuckerberg's statement from 2010 may have been prescient in a way, though not in the way he would have liked: the social norms of privacy are changing as the known citizen becomes more knowing. A survey recently published in The Atlantic found that "78.8 percent of people said they were 'very' or 'somewhat' concerned about the privacy of their information on social media, and 82.2 percent said they self-censor on social media." This spring, Vermont passed legislation to regulate data brokers. In June, California passed the California Consumer Privacy Act, giving its residents the right to be informed about the kinds of personal information companies have collected about them, as well as the right to request that their personal information be deleted. Also in June, Colorado passed a tough data security law; as of September, Colorado-based companies will be required to, among other things, dispose of certain kinds of personal identifying information. Overall, two thirds of Americans are now eager to see stricter privacy laws. 
These may be—to borrow Justice Douglas's word—emanations of things to come. |
| Resistance Means More Than Voting Posted: 11 Sep 2018 07:05 AM PDT

[The purse of an attendee at a Democratic Congressional Campaign Committee rally addressed by President Barack Obama, Anaheim, California, September 8, 2018]

When former president Barack Obama called on the nation to oppose Donald Trump at the University of Illinois at Urbana–Champaign last week, he said there was only one way to do it: by voting. This was a criticism of the internal resistance supported by the anonymous op-ed writer in The New York Times. Obama said that people who "secretly aren't following the president's orders" are not defending democracy: "These people are not elected. They're not accountable." That is a plausible argument, one that has been used in the past to oppose every kind of civil disobedience, whether unions that launch strikes, Rosa Parks sitting in a forbidden place, or those who break trespass laws for sit-ins (Occupy Wall Street, Black Lives Matter), or Harriet Tubman smuggling blacks out of the South, or people hiding Jews from Nazis. People engaged in these activities have not been elected; they are not publicly "accountable." And yet, if people consider specific laws unjust, we are told, they should change the laws, not just break them. Argue against them. Vote against them. Use legal means. But what if the laws are not only unjust but framed and upheld by measures that baffle democratic correction? The classical justification for tyrannicide is that the tyrant has removed legitimacy from the laws, so there should not be unilateral observance from those who would be crippled by the law's observance. But the trouble with tyrannicide as a test of obedience to law is that it does not allow for any resistance short of killing the tyrant. There must be other ways to resist before that drastic extreme. If we wait until the tyrant kills six million Jews before tyrannicide is contemplated, we are actually facilitating tyranny.
Some people said that Martin Luther King Jr.'s form of civil disobedience was justified because he broke the law openly and took the punishment prescribed by law. But that was less a moral validation than good public strategy. He was trusting that support could be recruited for his position. Harriet Tubman could not have done that, nor Oskar Schindler, nor the people hiding Anne Frank. And even King's Southern Christian Leadership Conference, though it acted openly, planned in secret. Secret disobedience is not immoral because secret. But is it not premature or scare-mongering to think of Donald Trump as a tyrant? Yes, if Hitler's final Holocaust is the measure. But the signs of a creeping dictatorship are clear and daily displayed. What else is government by constant rallies—with crowd size exaggerated by lies? By insulting tweets spewed out in a constant stream? By the co-option of the justice system? By declaring all major press outlets except the propaganda arm of the administration to be "enemies of the people"? By branding Muslims as "animals" as Hitler did the Jews? By the personal rescinding, achieved or attempted, of major pacts reached by national diplomacy—from the Paris Accords, to NATO, to NAFTA, to TPPA? By declaring all agencies of government not originated by the dear leader as remnants of a "deep state"? By threatening to use nuclear "rockets"? By advocating that critics of the dear leader be roughed up? By calling some self-proclaimed Nazis "good people"? By calling any opposition to the leader matters of "national security"—whether Canada's managing of its own affairs, or The New York Times's publishing an anonymous op-ed? By using the machinery of the state for enrichment of the leader and his family? By multiple steps to limit the ability to vote? How long must the list grow before we hope that resistance be mounted openly, secretly, immediately, effectively? President Obama says we must wait until we can vote. 
After all, in a matter of weeks, we may vote in a House of Representatives that may impeach the president. But it is doubtful that we will elect a Senate that can convict the president. Vote, of course. But there is no reason to think that voting is the sole allowable form of resistance. Even those who thought Prohibition unjust were energetic and imaginative in opposition to it before it was legally abolished—in fact, the legal outcome was caused by the opposition. That was a minor matter compared to the dictatorial steps constantly being taken by Donald Trump—and he chafes that he is not able to take more of them. President Obama should reflect on Cicero's maxim: Salus populi suprema lex esto. "The highest law should be preservation of the people." An earlier version of this article misidentified the University of Illinois campus at which President Obama spoke; it was Urbana–Champaign, not Chicago. |
| The Post-9/11 Generation Posted: 11 Sep 2018 07:05 AM PDT I've been teaching college students for the past decade or so, and every year I pose the same question to my freshmen: Where were you on September 11, 2001? My first year of teaching a freshman writing seminar, the question led to a disarming conversation about how their 8th grade teachers handled the news. Students from Manhattan and Brooklyn had parents who'd worked at Cantor Fitzgerald. They recounted with visceral detail how it felt not to know until late that day if their parents were OK. The eternity of waiting. Five years later, the discussion was different. Now freshmen described a glimpse of memory of a first-grade teacher attempting to figure out how to talk as events unfolded. If pushed they had to admit they didn't know if they remembered watching news footage of the attacks that day, or if their memory was of seeing that footage over and over years later. One student was certain her first grade teacher made a point of not showing them the smoke pouring out of the World Trade Center towers. "I was sure I saw it that day," another student said, "but now that we're talking about it I honestly don't know." When I recounted my own story of watching the events unfold from Brooklyn and then Manhattan that day, as I found myself doing each fall, they listened more intently than past students had. I was recounting a history they didn't wholly remember. Last year when I asked the question, I found myself a bit in shock at the response. I hadn't prepared myself for the answer. Last year's freshmen were not yet a year old on September 11, 2001. They knew of it only as a number, or from reading about it. To them, it was history. This fall, the students sitting in freshman classrooms across the country will be more distant still. This fall's freshmen will be the first college students whose answer to the question, "Where were you on 9/11?" will be: I wasn't born yet.
This year's freshmen will also be our second class of post-millennials, the next generation that is now being commonly referred to as Gen Z. Last year's were the first. I'll confess I couldn't tell what they thought when I recounted my own experiences in New York on 9/11. Frankly, we were still kind of reeling from the 2016 election, and so it seemed somehow secondary. Probably I seemed old. Probably the story I was telling was more tinged with nostalgia than I can account for, telling it almost two decades later. I do know that they listened intently as I described riding the subway from Ft. Greene to Midtown, watching the smoke billowing out of Tower Two from a stopped D train on the Manhattan Bridge. I'm sure I embellished the story. But at this point I don't even know how. Probably it was as much a retelling of the version I'd told the year before, and the year before that, as it was the story of the thing itself. The first-year writing class I teach is called "Stranger Than Fiction." The syllabus focuses on texts that walk the wily line between fiction and nonfiction, books and essays by writers like Orwell, Isaac Babel, Lorrie Moore, W. G. Sebald. Around the second week of classes, when 9/11 rolls around, I often assign a set of readings related to the events. We read Tom Junod's iconic Esquire piece "The Falling Man," about a photograph that ran on the Times' front page of a man in free-fall against the backdrop of one of the Towers, a piece I helped usher to publication as an editor there. We'll look at parts of Ken Kalfus's A Disorder Peculiar to the Country, and reporting on the events from The New Yorker. For this year's freshmen those readings will smack of history. They'll have been published in legacy print magazines and books, not online. The PDFs I post for them to download are looking a little grainy. I've spent a lot of time this summer thinking about these new freshmen, who were born into a post-9/11 world.
The term used most frequently is "Gen Z," but I'm skeptical it'll last. Somehow it feels too reactionary. Post-Millennials sounds reactionary, too. The 9/12 Generation wouldn't be all bad. Or maybe we could spare them from having to have a generational moniker at all. I myself was born in 1978 and I don't belong to a generation as currently defined: I came of age after the Gen-Xers, and I'm too old to be a Millennial. My clearest topical memory was of my second grade teacher rolling a TV in on a cart so we could watch the Challenger launch, only to watch it explode. No college professor ever asked me where I was when Christa McAuliffe died. Maybe this is what it means to move decades, not months or years away from epochal events. Three months after 9/11, I wasn't asking college students about literary treatments of the event. At that point I was just a twentysomething magazine editor, and I kept asking, "Why are these fighter jets still flying over Brooklyn?" Or a year later: "When will I start sleeping through the night again?" Those questions eventually had answers. Last year will be the last time I'll have asked, "Where were you when the Towers fell?" It's time to find a new question to ask each fall. Maybe it'll be something like, Where were you when you heard the election results on November 8, 2016? And maybe one day, "Where were you the day our Democracy was saved?" A suddenly-kind-of-old professor can dream. Daniel Torday is the winner of the 2017 Sami Rohr Choice Award. His new novel, Boomer1, is out this fall from St Martin's Press. |
| Writers' Cribs Posted: 11 Sep 2018 07:04 AM PDT

Roald Dahl
When Roald Dahl and his family were living in Gipsy House in Great Missenden, Buckinghamshire, UK, he realized his kids were so noisy that he needed his own writing space. After seeing Dylan Thomas's shed in Wales, he built a shed of his own in his garden. Dahl wrote all his major works here, including Charlie and the Chocolate Factory and Matilda. Dahl collected lots of photos, objects, and memorabilia, including part of his own hip bone.

Jane Austen
For the last eight years of her life (during which four of her novels were published), Jane Austen, her mother, and her sister lived in a cottage in Hampshire, in the south of England. The cottage was gifted to them by her brother, Edward Knight. Knight had inherited the property (and others) from a childless couple who had adopted him. That cottage is now the Jane Austen House Museum (also known as Chawton Cottage), home to a library that includes an original manuscript handwritten by Austen, early editions of her novels, and works by other women who either inspired Austen or were inspired by her. The most sought and awe-worthy artifact in the museum might be Austen's tiny desk (if you can even call it that!). The walnut dodecagon table is just barely big enough for a few sheets of paper, a quill pen, and an inkwell. Other items at the museum include Austen's small gold ring with a turquoise stone and small framed silhouettes of Austen's parents.

James Baldwin
At age 46, feeling alienated and persecuted in the US, author and social critic James Baldwin left his country for Saint-Paul-de-Vence, a medieval village on France's Côte d'Azur. He spent the last 18 years of his life in a villa nestled among orchards, rosemary hedges, and fields of wild strawberries. There Baldwin wrote several works, including his famous "An Open Letter to My Sister, Angela Y. Davis." Tragically, Baldwin's former home most likely won't stand for much longer.
Shortly after his death, it was bought by a developer with plans to build apartment buildings. Baldwin's family and a group called His Place in Provence have made attempts to acquire and preserve it, but the wing he lived in has already been destroyed. Baldwin hosted Josephine Baker, Miles Davis, Nina Simone, Ella Fitzgerald, Beauford Delaney, Harry Belafonte, and Sidney Poitier, adding to the long list of famous artist visitors to Saint-Paul-de-Vence (Henri Matisse, Georges Braque, Pablo Picasso, Fernand Léger, Joan Miró, Alexander Calder, Jean Cocteau, and Marc Chagall had all spent time there). A large table in Baldwin's gardens is fondly remembered by guests as the place for lively eating, drinking, and conversing. Baldwin called it his "welcome table." One visitor recalls that it was under a towering grove of cedars, another under grape arbors. Baldwin's last work is a play called The Welcome Table.

Dylan Thomas
For the last four years of his short life, the poet Dylan Thomas lived with his family at the Boathouse. The house is located on a cliff overlooking the Taf estuary in Laugharne, Wales. He used a little shed down the road as his writing studio and created some of his most famous pieces there, including the poems "Do Not Go Gentle into That Good Night" and "Over Sir John's Hill" and the play Under Milk Wood. Thomas died at age 39 on a trip to New York City, from pneumonia and a lifetime of drinking too much. Thomas pinned photos, painting reproductions, lists, and magazine clippings to the walls.

Virginia Woolf
Virginia Woolf and her husband, Leonard, bought Monk's House in 1919 and visited frequently. They moved in full time in 1940, when their Bloomsbury, London, flat was damaged in an air raid. The Woolfs loved the property for its lush, informal grounds, including an Italian garden, a dew pond, a terrace, and an orchard.
The main house was decorated with the help of Woolf's sister, the painter and decorator Vanessa Bell, who lived six miles away. Many of Bell's paintings still hang on the walls. The couple hosted members of the Bloomsbury Group—influential English writers, philosophers, and artists—at their new home. E. M. Forster was photographed happily pruning trees alongside Leonard. Visitors also enjoyed rigorous games of lawn bowling. Woolf wrote parts of all her major works in a converted toolshed she called her "writing lodge," where she had views of Mount Caburn, one of the highest points in East Sussex. The shed was also where she wrote her farewell letter to Leonard on March 28, 1941, before heading to the River Ouse with pockets full of stones. In "A Room of One's Own," Woolf noted that women need money, and their own room, to have the freedom to write and create, and that often they had neither. Virginia and Leonard's ashes were scattered on the property, under two large elm trees that have sadly since been cut down.

George Bernard Shaw
George Bernard Shaw's 8′ x 8′ writing shed sits at the edge of the garden at Shaw's Corner. These 3.5 acres also hold Shaw's 1902 Edwardian Arts and Crafts–influenced home and are located in the small village of Ayot St. Lawrence, in Hertfordshire, England. From here, the Irish-born playwright wrote Pygmalion (1912) and Saint Joan (1923), for which he won the Nobel Prize in Literature. Shaw's shed sits on a large lazy Susan. Shaw could push against the building to turn it as the sun moved, allowing for passive solar heating and cooling—more light on cold, dark days, and less on hot, sunny days—plus a little exercise. Shaw nicknamed his writing space "London" so his wife could tell friends and visitors he was away at the capital and he could be left alone. (The Shaws did in fact keep a second home in Fitzroy Square.)
The Brontës
Charlotte, Emily, and Anne Brontë moved into the Haworth Parsonage in West Yorkshire when their father Patrick, a priest and poet, was appointed there in 1820. When Patrick Brontë died (he outlived all of his children) in 1861, the contents of the family home were auctioned off. Decades later, in 1893, at a librarian's insistence that the artifacts be collected and preserved, the Brontë Society was founded and began gathering Brontë treasures. Even though the Brontë sisters grew up seeing their name on book spines at home (Patrick was a published poet), they published their first work together, a collection of poems, under the (male) pseudonyms of Currer (Charlotte), Ellis (Emily), and Acton (Anne) Bell. Three copies of the book were sold. After being in private collections for more than a century, the mahogany desk Charlotte wrote from was acquired for £20,000 and donated to the museum, in 2011. In 2015 the large mahogany drop-leaf table all the sisters used was purchased by the museum with a grant of £580,000. In a diary entry from 1837, Emily sketched the table, showing Anne and herself working at it. Excerpt from Bibliophile: An Illustrated Miscellany by Jane Mount, published by Chronicle Books, 2018 |
| Cixin Liu, China, and the Future of Science Fiction Posted: 11 Sep 2018 07:04 AM PDT "I'm so tired of the future." It was late in the day at the Tsinghua University Art Museum and I was getting whiny. My boyfriend and an acquaintance thumbed through some catalogs near the exit and managed to ignore me. We had reached the end of an exhibit of architectural models from the firm Foster + Partners: London's Gherkin, a cruise ship terminal, sundry airports… I'm a Berliner and the most dizzying display was a table of alternate models for the Reichstag dome, a dozen potential realities in balsa and cardboard. In the final room, an animated video envisioned some sort of building project in space—on Mars maybe?—but I couldn't really muster the energy to watch it. It's been said that the past is a foreign country, and I've come to believe that the future is too; I'd just never been so immersed in it before. In Beijing this summer, I read about two thousand pages of work by Cixin Liu, possibly the world's most important living science-fiction author, and certainly among humanity's most imaginative prognosticators. (A recent LRB review called his Three Body trilogy, published in English in 2016, "one of the most ambitious works of science fiction ever written.") Like life in Beijing, the experience was magnificent and exhausting and thrilling and flawed. Science fiction might be the genre best suited to Chinese society today; the breakneck pace of change becomes a constant, and to live in the present is to anticipate what is to come. When we told our acquaintance that we'd like to return next summer, she responded as many of our Chinese friends did: "You might not recognize it here." Living at this pace requires flexibility and ingenuity; you are making up the story of the future as you go along. Everything, the first time we do it, is a fiction. The surety of truth comes only with repetition and belongs to things we know from the past.
But the past also becomes rapidly unfamiliar once we are not repeating its methods: another friend, when we told him that we didn't have Chinese bank accounts and therefore couldn't use our phones to pay for purchases or order takeout, looked at us with fascinated pity, murmuring "It's like you're living some sort of social experiment!" Reading Cixin Liu on the spotless, teeming subway (the world's busiest annually) could be described as therapeutic. I often felt I was caught in the eddies of a time warp in China, a society that's all but done away with paper currency but does not yet have potable tap water; one in which facial recognition software will prevent you from stealing toilet paper but the plumbing can't handle flushing it. Remember William Gibson's old saw "The future is already here—it's just not very evenly distributed"? In Beijing that's true on a meter-for-meter basis. Liu writes hard science fiction, and his plots are propelled by the human pursuit of relentless scientific advancement. To read them is to be gripped by the frisson of discovery, of the infinite and infinitesimal brought within our comprehension. They give the comforting sense that humanity is rushing toward something, rather than just thronging along the ring line staring at their phones. This August, Tor released a new English translation of Liu's Ball Lightning, an earlier novel much more limited in scope than the Three Body trilogy that nonetheless demonstrates the power of Liu's scientific imagination. Ball lightning is what it sounds like—an unexplained atmospheric phenomenon, usually associated with thunderstorms, in which a slow-moving ball of electricity floats near the earth's surface, then explodes. The protagonist, Chen, sees his parents incinerated by a freak occurrence of the phenomenon, and spends his life trying to understand it.
The book follows him through university and graduate school, onto civilian and military research projects, a quest that takes him to the limits of our understanding of physics. It's also the sort of bureaucratic Bildungsroman only communism could produce, full of meetings, endless hierarchies, and funding issues; ball lightning may fuel Chen's dreams, but officialdom determines their trajectory. Chen's dilemma is that, in order to pursue his research, he must allow his discoveries to be used to destructive ends. As always with Liu, human curiosity wins out, with large-scale geopolitical ramifications. Ball Lightning was originally published in 2004, which means that the society it emerged from is now antediluvian in Chinese terms, but the book's focus on grand scientific and ethical questions places it somewhat outside of the flow of time. The human cost of technical progress and development is always in the air in China—in the form of PM2.5 particles—and I would imagine that AI developers feel much the same as Chen when their work is in the service of surveillance. Liu may be so prodigiously creative that his alternate futures are timeless, but they are by no means perfect. At a sentence level, Liu's work is uniformly bad—lines on the order of "No one could have anticipated that my ominous prediction would be fulfilled that very night" abound. His plotlines tend toward the maudlin (Ball Lightning has a cringeworthy weaponized killer-bees subplot, for example), and his characters are wooden. Were we to put Liu's misogyny in terms of an air quality index, giving the first book in the Three Body trilogy a rating of unhealthy and the last two a rating of hazardous, Ball Lightning would be very unhealthy, somewhere in the middle. But to be honest, it doesn't matter; we are here for one thing and one thing only: Liu's ideas. (Actually, the misogyny matters. I can't enjoy Liu's work fully because of it, but that's another essay.)
His books are propelled by the fascination of scientific discovery, in which the mysteries of existence unfold around the reader. In Liu's hands, everyday reality reveals itself to be composed of marvels, and the results are nothing less than mind-blowing. For example, as one character explains in Ball Lightning: "In the briefest period after the Big Bang, all of space was flat. Later, as energy levels subsided, wrinkles appeared in space, which gave birth to all of the fundamental particles. What's been so mystifying for us is why the wrinkles should only appear at the microscopic level. Are there really no macroscopic wrinkles? Or, in other words, are there no macroscopic fundamental particles?" From this single, intriguing question, Liu extrapolates a series of ramifications that ripple through human existence, from defense and politics to the nature of the soul and the afterlife.

Liu's visions of the future are so vivid and near at hand because he presents them as extensions of the reality around us. Another dimension is available to us within the world we know, accessible through human ingenuity; just because something is invisible to us now doesn't mean that it won't soon materialize. To join Liu in this perspective is to recognize the most fantastic aspects of our own reality. This immanence and imminence of possibility felt true to the fabric of my experience of China, the not-quite-benign magic of the unexpected.

The only predictable aspect was my reaction when enchantment eventually gave way to exhaustion. On our final day in Beijing, I dumped Cixin Liu's books in a trash bin on our hutong; they were too bulky to carry back, and we didn't know anyone who would want to read them in English. I feel a bit cowardly admitting it: the future is nice to visit, but I'm not sure I would want to live there.

Amanda DeMarco is a translator living in Berlin.

The post Cixin Liu, China, and the Future of Science Fiction appeared first on GuaripeteMagazine.