
For Your Consideration: Laws of Life, Art of (Cyber)war, The Minecraft Generation, and Self or Selfie

Time to reboot the newsletter. I’ve been meaning to start organizing it again for months but haven’t made the time. I’m not sure what interrupted the flow, especially since I had a lot of encouragement from friends and family. One friend in particular used to encourage me to do this: to publish (or re-publish) the things that interested me, and to write even when it felt like an echo into the void. Miss you dude…

1. Jeremy England, the Man Who May One-Up Darwin

In town to give a lecture, the Harvard grad and Rhodes scholar speaks quickly, his voice rising in pitch and his long-fingered hands making sudden jerks when he’s excited. He’s skinny, with a long face, scraggly beard and carelessly groomed mop of sandy brown hair — what you might expect from a theoretical physicist. But then there’s the street-style Adidas on his feet and the kippah atop his head. And the fact that this scientist also talks a lot about God.

The 101 version of his big idea is this: Under the right conditions, a random group of atoms will self-organize, unbidden, to more effectively use energy. Over time and with just the right amount of, say, sunlight, a cluster of atoms could come remarkably close to what we call life. In fact, here’s a thought: Some things we consider inanimate actually may already be “alive.” It all depends on how we define life, something England’s work might prompt us to reconsider. “People think of the origin of life as being a rare process,” says Vijay Pande, a Stanford chemistry professor. “Jeremy’s proposal makes life a consequence of physical laws, not something random.”

England’s idea may sound strange, even incredible, but it’s drawn the attention of an impressive posse of high-level academics. After all, while Darwinism may explain evolution and the complex world we live in today, it doesn’t account for the onset of intelligent beings. England’s insistence on probing for the step that preceded all of our current assumptions about life is what makes him stand out, says Carl Franck, a Cornell physics professor, who’s been following England’s work closely. “Every 30 years or so we experience these gigantic steps forward,” Franck says. “We’re due for one. And this might be it.”

And all from a modern Orthodox Jew with fancy sneakers.

2. The New Art Of War: How trolls, hackers and spies are rewriting the rules of conflict

While there is no international law that directly refers to the ultra-modern concept of cyber warfare, there is plenty that applies. So the NATO Cooperative Cyber Defence Centre of Excellence (CCDCOE) assembled a panel of international legal experts to go through this existing law and show how it applies to cyber warfare. This formed the basis of the Tallinn Manual and the 95 so-called ‘black letter rules’ it contains (so named because that’s how they appear in the text).

Through these rules the manual attempts to define some of the basics of cyber warfare. At the most fundamental level, the rules state that an online attack on a state can, in certain circumstances, be the equivalent of an armed attack. It also lays out that such an attack is against international law, and that a state attacked in such a way has the right to hit back.

Other rules the manual spells out: don’t target civilians or launch indiscriminate attacks that could cripple civilian infrastructure. While many of these sorts of rules are well understood when it comes to standard warfare, setting them out in the context of digital warfare was groundbreaking.

While the manual argues that a cyber attack can be considered the equivalent of an armed attack if it causes physical harm to people or property, other attacks can also be considered a use of force depending on their severity or impact. Breaking into a military system, for example, would be more likely to be seen as a use of force than hacking into a small business. In contrast, cyber attacks that generate “mere inconvenience or irritation” would never be considered a use of force.
The manual also delves into some of the trickier questions of cyber war: would Country A be justified in launching a pre-emptive military strike against Country B if it knew Country B planned to blow up Country A’s main oil pipeline by hacking the microcontrollers managing its pipeline pressure? (Answer: probably yes.)

The manual even considers the legality of some scenarios verging on the science-fictional.

If an army hacked into and took control of enemy drones, would those drones have to be grounded and marked with the capturer’s insignia before being allowed to carry out reconnaissance flights? (Answer: maybe.)

But what’s striking is that the Tallinn Manual sets the rules for a war that hasn’t been fought yet.

3. The Minecraft Generation

Minecraft is an incredibly complex game, but it’s also — at first — inscrutable. When you begin, no pop-ups explain what to do; there isn’t even a “help” section. You just have to figure things out yourself. (The exceptions are the Xbox and PlayStation versions, which in December added tutorials.) This unwelcoming air contrasts with most large games these days, which tend to come with elaborate training sessions on how to move, how to aim, how to shoot. In Minecraft, nothing explains that skeletons will kill you, or that if you dig deep enough you might hit lava (which will also kill you), or even that you can craft a pickax.

This “you’re on your own” ethos resulted from early financial limitations: Working alone, Persson had no budget to design tutorials. That omission turned out to be an inadvertent stroke of genius, however, because it engendered a significant feature of Minecraft culture, which is that new players have to learn how to play. Minecraft, as the novelist and technology writer Robin Sloan has observed, is “a game about secret knowledge.” So like many modern mysteries, it has inspired extensive information-sharing. Players excitedly pass along tips or strategies at school. They post their discoveries in forums and detail them on wikis. (The biggest one, hosted at the site Gamepedia, has nearly 5,000 articles; its entry on Minecraft’s “horses,” for instance, is about 3,600 words long.) Around 2011, publishers began issuing handbooks and strategy guides for the game, which became runaway best sellers; one book on redstone has outsold literary hits like “The Goldfinch,” by Donna Tartt.

“In Minecraft, knowledge becomes social currency,” says Michael Dezuanni, an associate professor of digital media at Queensland University of Technology in Australia. Dezuanni has studied how middle-school girls play the game, watching as they engaged in nuanced, Talmudic breakdowns of a particular creation. This is, he realized, a significant part of the game’s draw: It offers many opportunities to display expertise, when you uncover a new technique or strategy and share it with peers.

The single biggest tool for learning Minecraft lore is YouTube. The site now has more than 70 million Minecraft videos, many of which are explicitly tutorial. To make a video, players use “screencasting” software (some of which is free, some not) that records what’s happening on-screen while they play; they usually narrate their activity in voice-over. The problems and challenges you face in Minecraft are, as they tend to be in construction or architecture, visual and three-dimensional. This means, as many players told me, that video demonstrations have a particularly powerful explanatory force: It’s easiest to learn something by seeing someone else do it. In this sense, the game points to the increasing role of video as a rhetorical tool. (“Minecraft” is the second-most-searched-for term on YouTube, after “music.”)

4. Saving the Self in the Age of the Selfie

Consider Erica, a full-time college student. The first thing she does when she wakes up in the morning is reach for her smartphone. She checks texts that came in while she slept. Then she scans Facebook, Snapchat, Tumblr, Instagram, and Twitter to see “what everybody else is doing.” At breakfast, she opens her laptop and goes to Spotify and her various email accounts. Once she gets to campus, Erica confronts more screen time: PowerPoints and online assignments, academic content to which she dutifully attends (she’s an A student). Throughout the day, she checks in with social media roughly every 10 minutes, even during class. “It’s a little overwhelming,” she says, “but you don’t want to feel left out.”

We’ve been worried about this type of situation for thousands of years. Socrates, for one, fretted that the written word would compromise our ability to retell stories. Such a radical shift in communication, he argued in Phaedrus, would favor cheap symbols over actual memories, ease of conveyance over inner depth. Philosophers have pondered the effect of information technology on human identity ever since. But perhaps the most trenchant modern expression of Socrates’ nascent technophobia comes from the 20th-century German philosopher Martin Heidegger, whose essays on the subject—notably “The Question Concerning Technology” (1954)—established a framework for scrutinizing our present situation.

Heidegger’s take on technology was dire. He believed that it constricted our view of the world by reducing all experience to the raw material of its operation. To prevent “an oblivion of being,” Heidegger urged us to seek solace in nontechnological space. He never offered prescriptive examples of exactly how to do this, but as the scholar Howard Eiland explains, it required seeing the commonplace as alien, or finding “an essential strangeness in … familiarity.” Easier said than done. Hindering the effort in Heidegger’s time was the fact that technology was already, as the contemporary political philosopher Mark Blitz puts it, “an event to which we belong.” In this view, one that certainly befits today’s digital communication, technology infuses real-world experience the way water mixes with water, making it nearly impossible to separate the human and technological perspectives, to find weirdness in the familiar. Such a blending means that, according to Blitz, technology’s domination “makes us forget our understanding of ourselves.”

The only hope for preserving a non-technological haven—and it was and remains a distant hope—was to cultivate what Heidegger called “nearness.” Nearness is a mental island on which we can stand and affirm that the phenomena we experience both embody and transcend technology. Consider it a privileged ontological stance, a way of knowing the world through a special kind of wisdom or point of view. Heidegger’s implicit hope was that the human ability to draw a distinction between technological and nontechnological perception would release us from “the stultified compulsion to push on blindly with technology.”

If you were forwarded this newsletter and enjoyed it, please subscribe here: https://tinyletter.com/peopleinpassing

I hope that you’ll read these articles if they catch your eye and that you’ll learn as much as I did. Please email me questions or feedback, or raise issues for discussion. Better yet, if you know of something on a related topic, or of interest, please pass it along. And as always, if one of these links comes to mean something to you, recommend it to someone else.

