Zero Hour for Generation X?

Normally, I say that the idea of generations in American life is a refuge for lazy journalists who want to trade on nostalgia or fear of the future. Think how often you’ve read the clickbait headline “Why Millennials Have Ruined [insert beloved concept here]” and you’ll know what I mean.

But when the concept of generations crops up as a way to help us grapple with the impact of new technology, I’m all for it.

Based on this review, I’m likely to pick up Zero Hour for Generation X, a new book by Michael Hennessey, which calls on the cohort born between the mid-1960s and the early 1980s (I’m at the tail end) to remind the world how sane things were before digital technology turned it into a voluntary panopticon, hostile to democracy, privacy, and enchantment. Americans are always fond of dismissing any skepticism about technology as merely a hatred of change, which, according to our national mythology, is always for the good. All forward motion is progress, and the faster forward the better.

Throughout my career, I’ve spent a lot of time trying to make Americans listen to the sensible alternative to this argument. There are bad forms of change and good forms, and new technology has the power to change us in ways we don’t always understand or control.

As thinkers from Thoreau to C. S. Lewis to George Orwell have reminded us, it is to direct experience and personal memory that we can always turn in order to stay sane. But what happens when our information technology renders both direct experience and personal memory doubtful? What then?

Gen X may be the last generation to have a sure answer. I’m curious to see what Hennessey comes up with.

The Latest Big Innovation In News? Human Editors

This post is a bit of a throwback to my days at The Schwartz Center at Fordham, where we spent a lot of our time investigating what was happening to the news business. We looked at the algorithmic gatekeepers that were replacing the human ones who once decided what was worthy of our attention, and we looked at the new business models that were feeding off that new locus of attention.

What seemed inevitable five or six years ago was that the future of news would be a grim collision between user-generated content and dodgy social media algorithms, with the craft of journalism left out of the picture. That’s true in the darker corners of the Web, but, as Tim Cook’s privacy speech this week showed, Apple may be charting a new, more human way forward.

Apple employs about thirty human beings with journalistic acumen to select the stories that show up in its news app, which reaches about 90 million people each week, according to this profile of the head of Apple News. You’ll never have the chance to meet the algorithm that selects the distractions in your Facebook feed; not so for Lauren Kern, Apple’s editor-in-chief.

I’d long ago deleted Apple News, preferring as I do to pay editors and writers to deliver the news to me in print each week. That’s still my preference, but for those of you who prefer thumbing through the news on an app for free, Ms. Kern’s operation may be the best option.

Dept. of Wonkery: Why is Apple so Strong on Cybersecurity?

This morning, Apple’s CEO Tim Cook spoke at the 40th International Conference of Data Protection and Privacy Commissioners in Brussels, and he came out strongly against the exploitation of data by Apple’s Silicon Valley peers. According to Fortune, Cook said that “people’s personal data is being ‘weaponized’ with ‘military efficiency,’” and that technology is being used to deepen divisions and “undermine our sense of what is true and what is false.” Cook even warned of a “data-industrial complex,” a reference to the “military-industrial complex,” a phrase coined by Eisenhower to describe a combination of interests he felt were hostile to democracy.

The salvo is in keeping with Cook’s previous statements on data privacy, most notably his justification of Apple’s refusal to let the FBI hack into the data on a phone owned by one of the shooters in the San Bernardino shooting of 2015.

Why is Apple so strong on protecting the data consumers store inside its devices?

One reason is that the position supports Apple’s business model. Unlike Google and Facebook, Apple does not sell your personal data to advertisers. It is a hardware company with ambitions to become a media brand à la Netflix and Amazon, albeit a more mainstream and family-oriented one. In addition to its associations with Pixar and Disney, Apple has also dipped its toe into the water of original content, having just licensed the rights to Isaac Asimov’s Foundation trilogy. There is a utopianism to Apple’s media ambitions that recalls the early days of the telephone and radio, when benevolent, self-regulating monopolies like AT&T and NBC owned both the hardware and the content it distributed, and preserved a standard of excellence in both.

It’s that utopianism that is the key to the deeper reason for Apple’s stance on cybersecurity. The data that resides on Apple devices in our pockets and on our wrists will someday reside on devices that are implanted inside our bodies, or will even be integrated with them. Some may shudder at the thought, but in countries like Sweden, implanted chips that allow the user to access their data, go through security, and pay for goods and services are already in demand. Consumers can’t wait to get them, and even mark their implantation with celebrations. Before smartphones, the idea of having a device with a camera and microphone with us at all times would’ve been horrifying, but in exchange for the conveniences they offer, such squeamishness was easily set aside.

Apple and Tim Cook are adamant about data privacy because they want to dominate the future market in voluntary cyborg implants. To violate the data privacy of our phones would set the precedent that we should someday violate the data privacy of our bodies. Whether in our brains or implanted devices, the memories under our skins should be inviolate.

When our devices don’t let us forget, we can’t move on.

As the record of our daily written communication is fused with our phones, something sinister is happening to our memories of other people, especially those who have died.

William Gibson has described information technology as a form of prosthetic memory, and I agree with him. Written language exists as much to store the past as it does to communicate the present. Accounting records may actually have been the first use of written language, which would make extending human memory writing’s primal function. Inscriptions to memorialize the glory of the powerful dead followed soon after.

Luke O’Neil recently recounted the strangeness of living with his late father’s last text messages. They record banalities, but also invite him to further communication with the dead every day, hovering just at the edge of his attention, like an advertisement. The messages he and his father tapped out without a thought have, by proximity alone, been asked to bear the weight of mourning, and they are doing a poor job of it.

We have always used written communication to remember the dead. We have had letters and books and tombstone inscriptions, but these are all deliberately chosen and set aside from daily life. Before digital communication, we could choose to remember our dead in a way that was true and relevant. In contrast, our accidental digital memorials are shoved at us every day by the grim mechanics of our technology. We live forever in the horror of Emily Dickinson’s vision of banality at the last moment: “I heard a fly buzz when I died.”

I have written elsewhere about the need to relieve ourselves from the burden of perfect digital memory, and how Marshall McLuhan got it right when he said:

“The future will be an electrically computerized dossier bank – that one big gossip column that is unforgiving, unforgetful and from which there is no redemption, no erasure of early ‘mistakes’.”

Our prosthetic memory is attached to us at too many nerve endings. When we can remember everything, memory is simply noise. We burn out on memory and, by extension, the present moment, which is no less than life itself.

Future of Media Beat: Cable is “cash-rich, but future-poor.”

Comcast has actually described itself as “cash-rich, but future-poor.”

Cable’s business model is indeed a machine for printing money, which leads me to believe that it will always be profitable despite the rise of internet video. The question is simply how profitable.

As Marshall McLuhan observed, new media never totally replace old media. They are simply added to the mix. For example: the market for ebooks has fallen a bit in the last few years while the market for print books has actually risen. The relationship between those two media seems to be stabilizing, despite predictions that one would eat the other. To look at a larger time scale, print has never recovered from the blow dealt to it by radio and live TV. But as the bustling opening of the Frankfurt Book Fair last week shows, print is not going anywhere, probably ever.

So for forms of media, the important question is never survival, but dominance.

We have this massive infrastructure of wires and satellites that delivers cable TV. We’re not going to stop using it. Comcast just forked over roughly $40 billion to purchase Sky, a satellite TV network, even though satellite TV is in decline, too. Why? Because cornering the market in a business worth tens of billions of dollars, even one in decline, is still obscenely profitable. But by purchasing Sky, Comcast does seem to be doubling down on the obsolescing media market rather than entering the fray for the future of internet video.

The only people who seem to have the stomach for that are the FANGs (Facebook, Amazon, Netflix, and Google), and the indomitable Disney, which always seems to be both cash-rich and future-rich.

There is no question that the dominant medium of the future will be internet video. In creating the iPod, Steve Jobs was actually trying to build a market for what he called “portable video” as early as 2005. Disney is just catching up to that realization over a decade later, but with a massive global audience of deeply committed fans and the best reservoir of content, its fortunes look good.

Whoever has their hand on the master switch of the dominant medium gets to make all the money in the future, and my bet is on the Mouse.

The HBO-Sesame Street deal is not a reason to privatize public television

The Nation

Ever since its creation in the late 1960s, public broadcasting in the US has been under attack. The attacks have come mostly from conservatives, but liberals have not been above them, either, and the debate usually flares up when there is a change in the funding for NPR and PBS, either in amount or source. The new partnership between HBO and Sesame Street is the latest such flare-up. Usually, Big Bird gets held up as a symbol for all of public broadcasting, but in this case, he and his pals are actually at the center of the debate.

In my latest for The Nation, my co-author and I argue that the Sesame Street deal, while good for the iconic children’s show, is a bad model for the rest of PBS programs. When it comes to providing educational resources for all Americans, privatization isn’t the way to go. Click on the bird for the full piece.