For Your Consideration : 4 Links : 12/16/2014

Subscribe Here: https://tinyletter.com/peopleinpassing 

1.  Atari Teenage Riot: The Inside Story Of Pong And The Video Game Industry’s Big Bang

That better game would be Pong, which was deceptively simple to pick up, but infuriatingly difficult to master (not least because a developmental hiccup meant that your paddle couldn’t defend all your territory in the original coin-operated version). Today it is considered one of the biggest arcade games in the world, responsible for the success of the video game industry, which is valued at $78.5 billion this year. Pong took video games out of windowless computer labs full of buttoned-up coders and brought them to the masses, and with them, Bushnell’s nascent company, Atari.

It would’ve been hard to imagine then, but games today are bigger than the global film industry, which had a 60-year head start. Pong is the reason that Call of Duty: Modern Warfare 3 can make more than three times as much in its first five days on sale as The Avengers can in its first five days in theaters. But while today’s blockbuster games are largely created by hundred-strong teams at bankrolled developers, the men who created and crafted Pong embodied the bootstrap start-up culture that typifies the most exciting edges of today’s tech landscape. They were knocked back by old men in drab suits who said games weren’t going to be big business. But games were going to be big business, even those started in unassuming surroundings. And nothing was going to stop them.

2. Saving Our Daughters From An Army Of Princesses

What was going on here? My fellow mothers, women who once swore they would never be dependent on a man, smiled indulgently at daughters who warbled “So This Is Love” or insisted on being addressed as Snow White. The supermarket checkout clerk invariably greeted Daisy with “Hi, Princess.” The waitress at our local breakfast joint, a hipster with a pierced tongue and a skull tattooed on her neck, called Daisy’s “funny-face pancakes” her “princess meal”; the nice lady at Longs Drugs offered us a free balloon, then said, “I bet I know your favorite color!” and handed Daisy a pink one rather than letting her choose for herself. Then, shortly after Daisy’s third birthday, our high-priced pediatric dentist — the one whose practice was tricked out with comic books, DVDs, and arcade games — pointed to the exam chair and asked, “Would you like to sit in my special princess throne so I can sparkle your teeth?”

“Oh, for God’s sake,” I snapped. “Do you have a princess drill, too?”

She looked at me as if I were the wicked stepmother.

But honestly: since when did every little girl become a princess? It wasn’t like this when I was a kid, and I was born back when feminism was still a mere twinkle in our mothers’ eyes. We did not dress head to toe in pink. We did not have our own miniature high heels. What’s more, I live in Berkeley, California: if princesses had infiltrated our little retro-hippie hamlet, imagine what was going on in places where women actually shaved their legs? As my little girl made her daily beeline for the dress-up corner of her preschool classroom, I fretted over what playing Little Mermaid, a character who actually gives up her voice to get a man, was teaching her.

[…]

Apparently, I had tapped into something larger than a few dime-store tiaras. Princesses are just a phase, after all. It’s not as though girls are still swanning about in their Sleeping Beauty gowns when they leave for college (at least most are not). But they did mark my daughter’s first foray into the mainstream culture, the first time the influences on her extended beyond the family. And what was the first thing that culture told her about being a girl? Not that she was competent, strong, creative, or smart but that every little girl wants — or should want — to be the Fairest of Them All.

[…]

Even as new educational and professional opportunities unfurl before my daughter and her peers, so does the path that encourages them to equate identity with image, self-expression with appearance, femininity with performance, pleasure with pleasing, and sexuality with sexualization. It feels both easier and harder to raise a girl in that new reality — and easier and harder to be one.

As with all of us, what I want for my daughter seems so simple: for her to grow up healthy, happy, and confident, with a clear sense of her own potential and the opportunity to fulfill it. Yet she lives in a world that tells her, whether she is three or thirty-three, that the surest way to get there is to look, well, like Cinderella.

3. As Robots Grow Smarter, American Workers Struggle to Keep Up

Clearly, many workers feel threatened by technology. In a recent New York Times/CBS News/Kaiser Family Foundation poll of Americans between the ages of 25 and 54 who were not working, 37 percent of those who said they wanted a job said technology was a reason they did not have one. Even more — 46 percent — cited “lack of education or skills necessary for the jobs available.”

Self-driving vehicles are an example of the crosscurrents. They could put truck and taxi drivers out of work — or they could enable drivers to be more productive during the time they used to spend driving, which could earn them more money. But for the happier outcome to happen, the drivers would need the skills to do new types of jobs.

The challenge is evident for white-collar jobs, too. Ad sales agents and pilots are two jobs that the Bureau of Labor Statistics projects will decline in number over the next decade. Flying a plane is largely automated today and will become more so. And at Google, the biggest seller of online ads, software does much of the selling and placing of search ads, meaning there is much less need for salespeople.

4. Study of poverty-ridden neighborhoods shows gentrification is not ruining enough of America

“Because the slow decline is more common and less visible, it is seldom remarked upon, while gentrification, when it happens – which is both unusual and dramatic – is far more evident change,” explains the report.

“There are more areas of poverty than areas undergoing gentrification, but that doesn’t mean that when communities do revitalize that people aren’t uprooted,” says Harold Simon, executive director of the National Housing Institute. “That kind of thing has happened all over the place.”

It’s not a matter of which is worse: gentrification or poverty. Americans should be concerned about both, says Simon.

Often the cities where gentrification occurs are also the cities where poverty slowly spreads across other neighborhoods. Take Brooklyn, for example. Over the last decade, Brooklyn went from having four of New York’s poorest neighborhoods to having five. At the same time, it went from having zero of New York’s richest neighborhoods to having two and was singled out as having the least affordable housing market.


First say to yourself what you would be; and then do what you have to do. – Epictetus


For Your Consideration : 4 Links and an Infographic : 12/12/2014

Subscribe Here: https://tinyletter.com/peopleinpassing 

1. Why James Cameron’s Aliens is the best movie about technology.
Some great film writing on philosophical messages in sci-fi movies we love.

There are not many films on any topic that pull off the trifecta of big ideas, great moviemaking, and deep human resonance, let alone manage to be about technology. For my purposes, there are three that matter: Blade Runner, 2001: A Space Odyssey, and Aliens.

[…]

Right now (if you’re still reading) you’re thinking, Tim, I love Aliens. But I don’t love it because it makes me think the thinky-thoughts. I love it because people blow shit up, get killed by aliens, then blow up more shit. There’s no way it carries a deep message about human beings and their relationship to technology. It’s not high art. It’s fun. And I say to you, it is both. You just haven’t noticed it until now.

[…]

That’s what technology is. It’s the world of things, some impossibly stupid, some smarter than we are, we have assembled around ourselves to cover over our fundamental weaknesses as a species. The strength we have, the advantage this gives us, is our ability to stand apart from the things we’ve made: to use them and set them aside; to make them prosthetic extensions of ourselves and to let them go.

2. The Ethical Dilemma Behind Reporting On The Data Released From The Sony Hack

From the beginning, Variety has not shied away from reporting on what has emerged from the data to date. That isn’t to say absolutely everything that pops up will be duly noted in our publication — personally identifiable information about execs, for instance, would be one no-no.

But my mounting misgivings have forced me to explain to myself what all this reporting is really about. While I found a lot to question about the rationales, ultimately I’ve arrived at an uneasy peace with why the leaks just can’t be ignored.

When ethical boundaries get murky, it’s only natural to grab for some sense of precedent. The one that comes to mind for me is a relatively recent example: the celebrity nude photo leak in October that besmirched the good names of everyone from Jennifer Lawrence to Ariana Grande.

These young women clearly had their privacy invaded. There was a lot of justifiable hand-wringing in the press about the plight of these women, but why is there none of that for the corporations? Their privacy has been invaded as well, albeit in a different way.

Nude photos weren’t hacked at Sony, but it’s interesting that while nudity is deservedly considered to be crossing the line, financial records aren’t accorded a measure of respect as well. Rest assured that SPE chairman Michael Lynton would probably rather you see his private parts than the company’s movie budgets.

The difference between nude celebrity photos and the leaked Sony data, respectable media outlets will argue, is only the latter is “newsworthy.” But what does that really mean?

3. Swarm Weapons And The Future Of Conflict

Swarming is a seemingly amorphous, but deliberately structured, coordinated, strategic way to perform military strikes from all directions. It employs a sustainable pulsing of force and/or fire that is directed from both close-in and stand-off positions. It will work best — perhaps it will only work — if it is designed mainly around the deployment of myriad, small, dispersed, networked maneuver units.

This calls for an organizational redesign — involving the creation of platoon-like pods joined in company-like clusters — that would keep but retool the most basic military unit structures. It is similar to the corporate redesign principle of flattening, which often removes or redesigns middle layers of management. This has proven successful in the ongoing revolution in business affairs and may prove equally useful in the military realm. From command and control of line units to logistics, profound shifts will have to occur to nurture this new way of war.

This study examines the benefits — and also the costs and risks — of engaging in such serious doctrinal change. The emergence of a military doctrine based on swarming pods and clusters requires that defense policymakers develop new approaches to connectivity and control and achieve a new balance between the two. Far more than traditional approaches to battle, swarming clearly depends upon robust information flows. Securing these flows, therefore, can be seen as a necessary condition for successful swarming.

4. How Fixed Are Personality Traits After Age 30?

When psychologists talk about personality, they are usually referring to what are called the Big Five traits: openness, conscientiousness, extraversion, agreeableness, and neuroticism. These are our core characteristics, which generally don’t fluctuate depending on the particular mood we’re in. Some newer research in the emerging field of personality neuroscience suggests that these traits are biogenic, stemming from our genes, which helps explain why so many studies have found personality to be relatively stable. Research on identical twins, for example, shows that these five traits are largely heritable, with about 40 to 50 percent of our personality coming from our genes.

Some aspects of our personalities start to show up when we’re just days old, as Little writes in his book:

Such features of personality can be detected in the neonatal ward. If you make a loud noise near the newborns, what will they do? Some will orient toward the noise, and others will turn away. Those who are attracted to the noise end up being extraverts later in development; those who turn away are more likely to end up being introverts.

As we grow older, our personalities do evolve, of course; throughout adolescence and early adulthood, we change rapidly. One review of 152 longitudinal studies found the biggest changes in personality traits occur from childhood through the 20s. In the 30s, 40s, and 50s, we can and do still change, but these changes come more slowly, and require more effort, said Paul T. Costa Jr., scientist emeritus at the laboratory of behavioral science at the National Institutes of Health.

Infographic – Highest Consumption of Selected Spirits (2012) : Unsurprisingly, the USA is pretty boozy.


Children have never been very good at listening to their elders, but they have never failed to imitate them. – James A. Baldwin


For Your Consideration : 4 Links : 12/9/2014

1. Free The Drones : Regulating Drones In US Airspace

The right way to balance safety and innovation is to create a set of rules for commercial drones that depend on their size, use and so on. That is what happens in some countries: Canada, for instance, exempts small drones from regulatory oversight. The rules should also vary according to location, since surveying the outside of a building in a city is more hazardous than flying over a field. Japan recognises this. And requiring drone pilots to have experience flying manned aircraft is daft. Far better to say, as Britain and Australia do, that drone pilots need to be certified as competent to fly a drone.

Like any disruptive technology, commercial drones will hurt existing businesses. Some pilots will lose their jobs as more farmers and logistics firms use drones instead of hiring a helicopter or aircraft. The incumbents’ opposition to the drone industry is understandable. The FAA’s is not. It should take a more objective view, and free commercial drones.

2. Do Artifacts Have Ethics?

When we do think about technology’s moral implications, we tend to think about what we do with a given technology. We might call this the “guns don’t kill people, people kill people” approach to the ethics of technology. What matters most about a technology, on this view, is the use to which it is put. This is, of course, a valid consideration. A hammer may indeed be used either to build a house or to bash someone’s head in. On this view, technology is morally neutral and the only morally relevant question is this: What will I do with this tool?

But is this really the only morally relevant question one could ask? For instance, pursuing the example of the hammer, might I not also ask how having the hammer in hand encourages me to perceive the world around me? Or, what feelings having a hammer in hand arouses?

6 of the 40 questions the author offers for the moral consideration of technology/objects/artifacts (all 40 are good):

  1. What sort of person will the use of this technology make of me?
  2. What habits will the use of this technology instill?
  3. How will the use of this technology affect my experience of time?
  4. How will the use of this technology affect my experience of place?
  5. How will the use of this technology affect how I relate to other people?
  6. How will the use of this technology affect how I relate to the world around me?

3. Seymour Papert’s Legacy: Thinking About Learning, and Learning About Thinking

Every so often you find a magic word that allows you to find the information you’re looking for. For me recently that word has been “Constructionism” and it has led me to the work of Seymour Papert.

Papert’s constructionism has, at its heart, a desire not to revise, but to invert the world of curriculum-driven instruction. If there is one keystone concept from Papert that will forever set the teeth of educational administrators on edge, it is probably this, from “Mindstorms”:

Many children are held back in their learning because they have a model of learning in which you have either ‘got it’ or ‘got it wrong.’ But when you program a computer you almost never get it right the first time. Learning to be a master programmer is learning to become highly skilled at isolating and correcting bugs … The question to ask about the program is not whether it is right or wrong, but if it is fixable. If this way of looking at intellectual products were generalized to how the larger culture thinks about knowledge and its acquisition we might all be less intimidated by our fears of ‘being wrong.’ – Seymour Papert
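Papert’s “fixable, not wrong” framing is easy to show in a few lines of code. Here is a minimal sketch (my example, not one from Mindstorms): a first attempt that harbors a bug, and the small act of isolating and correcting it.

```python
# A tiny illustration of Papert's "is it fixable?" framing -- my example, not his.
# First attempt: looks right, but a bug hides in the edge case.
def average(numbers):
    return sum(numbers) / len(numbers)   # average([]) raises ZeroDivisionError

# In the "got it or got it wrong" model, the program above is simply wrong.
# In Papert's model, it has an isolatable bug (division by zero) to correct:
def average_fixed(numbers):
    if not numbers:
        return 0.0   # a design choice: define the average of nothing as zero
    return sum(numbers) / len(numbers)

print(average_fixed([3, 4, 5]))  # 4.0
print(average_fixed([]))         # 0.0 instead of a crash
```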

Papert was perhaps the first interaction designer especially concerned with digital tools and children. His awareness that children effectively think differently than adults, and that their cognitive evolution requires designing rich toolkits and environments rather than force-feeding knowledge, has set the tone for decades of research. The combination of developmental psychology, AI, and technology proved to be powerful and generative, and created a new genre of educational technologies. Papert was an inspirational force that motivated an entire generation of researchers and practitioners to bring his vision to the world. But the work is far from done. For example, why is it that half a century after these ideas were formulated, still we do not have robust forms of assessment by which to evaluate this vision?

4. It’s official: America is now No. 2 (Economic Superpower)

Make no mistake: This is a geopolitical earthquake with a high reading on the Richter scale. Throughout history, political and military power have always depended on economic power. Britain was the workshop of the world before she ruled the waves. And it was Britain’s relative economic decline that preceded the collapse of her power.

And it was a similar story with previous hegemonic powers such as France and Spain.

This will not change anything tomorrow or next week, but it will change almost everything in the longer term. We have lived in a world dominated by the U.S. since at least 1945 and, in many ways, since the late 19th century. And we have lived for 200 years — since the Battle of Waterloo in 1815 — in a world dominated by two reasonably democratic, constitutional countries in Great Britain and the U.S.A. For all their flaws, the two countries have been in the vanguard worldwide in terms of civil liberties, democratic processes and constitutional rights.

“Those who hold a legal monopoly on violence should be held to the highest standards for its use, not the lowest.” – Ramez Naam


For Your Consideration : 4 Links : 12/5/2014

1. Stop 20th Century Thinking : Or Educating for the Late 1900’s

I like the author’s title but I think something is lost when he uses the phrase “20th Century”. It sounds decidedly too modern. If it were instead “Stop Late 1900’s thinking” I believe it would better carry the weight of the piece.

The model of education that most of us are products of was designed for a different time and for a different purpose. The system was created to benefit industry as much, if not more so, than it was to create a freethinking society.

Technology, contrary to science fiction writers’ predictions, will not replace teachers. It will, however, change the model of how we teach: from the teacher-controlled, teacher-directed learning of the 19th and 20th centuries to a 21st-century model of learner-directed learning. The teacher becomes more of a mentor and co-learner with students. When it comes to teaching students in the 21st century, I have come to believe that it is more important to teach kids how to learn than it is to teach them what to learn.

A very great disconnect in all of this occurs when we try to use the 21st century technology tools for learning and fit them into the 19th & 20th century model of teaching. I have witnessed English teachers having students do a composition assignment. They had students do a handwritten rough draft, revise it, do a final handwritten copy, and then put it on a word processor without accessing a spell check or grammar check. Those teachers learned that way, and taught that way, and added the technology to their 20th century model of teaching. The tech tool was not used for learning. In their future lives those students will certainly use word processors for any writing that they do. Is it not incumbent on their teachers to teach students how to do it correctly? (Yes, as an adult I effectively use a grammar check and a spell check on everything I write. Most people do, even the really smart ones.)

2. Terms Of Service – Our Role In A World Of Big Data

“The media’s coverage of Big Data is often dense, jargon-heavy and difficult to understand,” said Mark Coatney, Senior Vice President, Digital Media, Al Jazeera America. “Keller and Neufeld’s Terms of Service is a fun, graphic way to cut through the noise and boost the signal.”

Between social media profiles, browsing histories, discount programs and new tools controlling our energy use, there’s no escape.  As we put ourselves into our technology through text messages and photos, and use technology to record new information about ourselves such as FitBit data, what are the questions a smart consumer should be asking? What is the tradeoff between giving up personal data and how that data could be used against you? And what are the technologies that might seem invasive today that five years from now will seem quaint?  How do we as technology users keep up with the pace while not letting our data determine who we are?

3. DeepMind : Designing Intelligence For Google

Artificial intelligence researchers have been tinkering with reinforcement learning for decades. But until DeepMind’s Atari demo, no one had built a system capable of learning anything nearly as complex as how to play a computer game, says Hassabis. One reason it was possible was a trick borrowed from his favorite area of the brain. Part of the Atari-playing software’s learning process involved replaying its past experiences over and over to try and extract the most accurate hints on what it should do in the future. “That’s something that we know the brain does,” says Hassabis. “When you go to sleep your hippocampus replays the memory of the day back to your cortex.”

A year later, Russell and other researchers are still puzzling over exactly how that trick, and others used by DeepMind, led to such remarkable results, and what else they might be used for. Google didn’t take long to recognize the importance of the effort, announcing a month after the Tahoe demonstration that it had acquired DeepMind.

Interesting to note that they are still trying to figure out how their system accomplished its remarkable results. How do they improve something they don’t understand? And how long before the code of more sophisticated systems is beyond us and requires another AI to interpret it?
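The replay trick itself is simple enough to sketch in code. What follows is a toy illustration of the general idea of experience replay, not DeepMind’s actual implementation; the ReplayBuffer class, its capacity, and the dummy transitions are invented for the example.

```python
# Toy sketch of experience replay (the general idea, not DeepMind's code).
# An agent records past transitions, then learns from random minibatches of
# them, loosely like the hippocampus replaying the day back to the cortex.
import random
from collections import deque

class ReplayBuffer:
    def __init__(self, capacity=10_000):
        self.memory = deque(maxlen=capacity)   # oldest experiences fall away

    def push(self, state, action, reward, next_state):
        self.memory.append((state, action, reward, next_state))

    def sample(self, batch_size):
        # Sampling at random breaks the correlation between consecutive
        # frames, one reason replay stabilizes learning.
        return random.sample(self.memory, batch_size)

# Usage: record transitions while playing, then replay them during training.
buffer = ReplayBuffer()
for step in range(100):
    buffer.push(state=step, action=0, reward=1.0, next_state=step + 1)
minibatch = buffer.sample(32)   # re-learn from 32 random past experiences
```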

4. We Are All Confident Idiots – The Dunning-Kruger Effect

In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack.

What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

Because it’s so easy to judge the idiocy of others, it may be sorely tempting to think this doesn’t apply to you. But the problem of unrecognized ignorance is one that visits us all. And over the years, I’ve become convinced of one key, overarching fact about the ignorant mind. One should not think of it as uninformed. Rather, one should think of it as misinformed.

An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge. This clutter is an unfortunate by-product of one of our greatest strengths as a species. We are unbridled pattern recognizers and profligate theorizers. Often, our theories are good enough to get us through the day, or at least to an age when we can procreate. But our genius for creative storytelling, combined with our inability to detect our own ignorance, can sometimes lead to situations that are embarrassing, unfortunate, or downright dangerous—especially in a technologically advanced, complex democratic society that occasionally invests mistaken popular beliefs with immense destructive power (See: crisis, financial; war, Iraq). As the humorist Josh Billings once put it, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” (Ironically, one thing many people “know” about this quote is that it was first uttered by Mark Twain or Will Rogers—which just ain’t so.)

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” – Max Planck


For Your Consideration : 4 Links : 12/2/2014

1. Terahertz Scanners Are Now A Real Thing And Graphene Keeps Getting Weirder.

If you don’t know too much about graphene, it is formed by carbon atoms in a single layer only one atom thick, and when subjected to electromagnetic waves it behaves in a non-linear way: a “kind of frequency multiplier. If we make a wave of a particular frequency impinge on graphene, the graphene has the ability to emit another, higher frequency,” according to David Gómez and Nuria Campos from ITMA Materials Technology.

Until recently, the emission of frequencies in the terahertz band has been accomplished mainly in experimental settings.  In the terahertz band, frequencies are lower than infra-red but higher than those used by mobile phones and satellite communications.
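A rough way to see how a non-linear material can multiply frequency (a textbook sketch, not the ITMA team’s specific result): if the material’s response contains a term proportional to the cube of the driving field, a single input frequency generates a third harmonic, because

$$\cos^3(\omega t) = \tfrac{3}{4}\cos(\omega t) + \tfrac{1}{4}\cos(3\omega t).$$

So a wave at 0.3 THz impinging on such a material would be partly re-emitted at 0.9 THz, pushing into a band that ordinary electronics struggle to reach.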

2. The Coming Great Transition Or A View From The Second Half Of The Chessboard

The “automation of everything” has been discussed since the heyday of ’50s science fiction. Self-driving cars, fully automated factories, AI expert systems — the list of labor-removing innovations coming down the lane over the next few decades is long and distinguished. The simple rule of our future: anything that could be done by a computer (or robot) will be done by a computer. And that means almost everything that currently employs human beings.

Recently, the idea has leapt from the pages of Asimov into conventional awareness. Somewhat astoundingly, the absurdly mainstream international real estate consulting firm CBRE partnered with the China-based property developer Genesis to repeat the theme in their report Fast Forward 2030: The Future of Work and the Workplace. They conclude:

50% of occupations in corporations today will no longer exist by 2025.

Let that sink in. In a decade, 50% of occupations in corporations today will no longer exist. Yes, new jobs will be created to replace some of those lost. But 50% in ten years? No matter how you slice it, historically unprecedented unemployment is going to be a major part of our future.
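For anyone who hasn’t met the allusion in the title: it comes from the old story, popularized by Ray Kurzweil, of rice grains doubling on each square of a chessboard. The arithmetic is what gives the metaphor its bite:

$$\text{grains on square } n = 2^{\,n-1}, \qquad \sum_{n=1}^{32} 2^{\,n-1} = 2^{32} - 1 \approx 4.3 \times 10^{9}.$$

The entire first half of the board holds about four billion grains; square 64 alone holds $2^{63} \approx 9.2 \times 10^{18}$. On the second half of the chessboard, each doubling dwarfs everything that came before, which is the claim being made here about automation.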

3. Getting Better At Getting Better

[T]he biggest change in performance over the past few decades—it’s not so much that the best of the best are so much better as that so many people are so extraordinarily good. In fact, McClusky points out that in some sports, particularly in track and field, the performance curve at the top is flattening out (possibly because we’re nearing our biological limits). But the depth of excellence has never been greater. In baseball, a ninety-m.p.h. fastball used to be noteworthy. Today, there are throngs of major-league pitchers who throw that hard. Although a Wilt Chamberlain would still be a great N.B.A. player today, the over-all level of play in the N.B.A. is vastly superior to what it was forty years ago. There are exceptions to this rule—free-throw percentages, for instance, have basically plateaued in the past thirty-five years. But, as the sports columnist Mark Montieth wrote after reviewing a host of games from the nineteen-fifties and sixties, “The difference in skills and athleticism between eras is remarkable. Most players, even the stars, couldn’t dribble well with their off-hand. Compared to today’s athletes, they often appear to be enacting a slow-motion replay.”

What we’re seeing is, in part, the mainstreaming of excellent habits. In the late nineteen-fifties, Raymond Berry, the great wide receiver for the Baltimore Colts, was famous for his attention to detail and his obsessive approach to the game: he took copious notes, he ate well, he studied film of his opponents, he simulated entire games by himself, and so on. But, as the journalist Mark Bowden observed, Berry was considered an oddball. The golfer Ben Hogan, who was said to have “invented practice,” stood out at a time when most pro golfers practiced occasionally, if at all. Today, practicing six to eight hours a day is just the price of admission on the P.G.A. Tour. Everyone works hard. Everyone is really good.

[ BUT ]

In one area above all, the failure to improve is especially egregious: education. Schools are, on the whole, little better than they were three decades ago; test scores have barely budged since the famous “A Nation at Risk” report came out, in the early nineteen-eighties. This isn’t for lack of trying, exactly. We now spend far more per pupil than we once did. We’ve shrunk class sizes, implemented national standards, and amped up testing. We’ve increased competition by allowing charter schools. And some schools have made it a little easier to remove ineffective teachers. None of these changes have made much of a difference.

4. Of Course It’s Been Done Before (Quoted In Total)

John Koenig calls it vemödalen. The fear that you’re doing something that’s already been done before, that everything that can be done has been done.

Just about every successful initiative and project starts from a place of replication. The chances of being fundamentally out of the box over the top omg original are close to being zero.

A better question to ask is, “have you ever done this before?” Or perhaps, “are the people you are seeking to serve going to be bored by this?”

Originality is local. The internet destroys, at some level, the idea of local, so sure, if we look hard enough we’ll find that turn of a phrase or that unique concept or that app, somewhere else.

But no one is asking you to be original. We’re asking you to be generous and brave and to matter. We’re asking you to step up and take responsibility for the work you do, and to add more value than a mere cut and paste. Give credit, definitely, but reject vemödalen.

Sure, it’s been done before. But not by you. And not for us.

“Computer science is no more about computers than astronomy is about telescopes, biology about microscopes, or chemistry about beakers and test tubes. Science is not about tools. It is about how we use them, and what we find out when we do.” — Michael Fellows and Ian Parberry


For Your Consideration : 4 Links : 11/28/2014

A look at the names of classes taught at ITP is both intriguing and confusing. Some class names are puns, like “Cloud Commuting,” or “Drawing on Everything.” Some are poetic, like “Cabinets of Wonder,” “Sensitive Buildings,” and “Talking Fabrics.” And some are downright incomprehensible, like “Cooking with Sound,” and “Lean Launchpad.” One ITP class, called “Redial,” teaches students how to hack the phone system and requires them to sign a legal waiver before enrolling.

This playful, project-based approach to teaching tech literacy is how ITP itself operates. “I’ve always felt you’re going to get further with whimsy and hope, rather than fear,” O’Sullivan said. The challenge is to get the students to ignore their fear of failure and try as many new things as possible during their four semesters. “The key thing about play,” O’Sullivan said, “is that it makes failure look like a good thing.”

The trend among the students is to take not only what they’ve learned, but how they’ve learned it, package it as a gadget or an app, and release it to the world.

2. How Magic Leap Is Secretly Creating An Alternate Reality
With the Google Glass experiment winding down for an eventual retooling and Oculus Rift almost ready to make its appearance, the next generation is quietly toiling away to make something even cooler.

Want.

“It’s not holography, it’s not stereoscopic 3-D,” he says. “You don’t need a giant robot to hold it over your head, you don’t need to be at home to use it. It’s not made from off-the-shelf parts. It’s not a cellphone in a View-Master.”

The best description we have so far comes from the company’s press release: “Using our Dynamic Digitized Lightfield Signal™, imagine being able to generate images indistinguishable from real objects and then being able to place those images seamlessly into the real world.”

In an article that largely flew under the radar, John Markoff of The New York Times actually went to see the technology in person back in July. He wrote that he did indeed see a 3D creature floating in midair, through “an elaborate viewer that resembles something from an optometrist’s office.” It’s big, in other words. Markoff also confirmed that the device projects digital light fields onto the viewer’s retina.

3. Fitbit Takes The Stand As Expert Witness

The first known court case using Fitbit activity data is underway. A law firm in Canada is using a client’s Fitbit history in a personal injury claim. The plaintiff was injured four years ago when she was a personal trainer, and her lawyers now want to use her Fitbit data to show that her activity levels are still lower than the baseline for someone of her age and profession to show that she deserves compensation.

Medical research on the relationship between exercise, sleep, diet, and health is moving extremely rapidly. The decisions about what is “normal” and “healthy” that these companies come to depend on which research they’re using. Who is defining what constitutes the “average” healthy person? This contextual information isn’t generally visible. Analytics companies aren’t required to reveal which data sets they are using and how they are being analyzed.

The current lawsuit is an example of Fitbit data being used to support a plaintiff in an injury case, but wearables data could just as easily be used by insurers to deny disability claims, or by prosecutors seeking a rich source of self-incriminating evidence. As the CEO of Vivametrica, Dr. Rich Hu, told Forbes, insurers can’t force claimants to wear Fitbits. But they can request a court order from anyone who stores wearable data to release it. Will it change people’s relationship to their wearable device when they know that it can be an informant? These devices can give their own interpretation of your daily activity, sleep, and moods, and that analysis may be seen to carry more evidentiary weight than the owner’s experience.
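To make the baseline problem concrete, here is a toy sketch; the step counts and reference values are invented for illustration and have nothing to do with Vivametrica’s or Fitbit’s actual methods or data.

```python
# Toy sketch: the verdict on "normal" activity depends entirely on which
# baseline population the analytics provider chooses. Invented numbers only.
daily_steps = [6200, 5800, 7100, 6500, 5900]   # hypothetical claimant's week
avg_steps = sum(daily_steps) / len(daily_steps)

baselines = {                                   # assumed reference averages
    "sedentary adult": 5000,
    "average adult": 7000,
    "personal trainer": 12000,
}

for population, baseline in baselines.items():
    verdict = "above" if avg_steps >= baseline else "below"
    print(f"vs. {population}: {avg_steps:.0f} steps/day is {verdict} {baseline}")
```

Same data, three different verdicts: whoever picks the baseline picks the conclusion.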

4. Flowing With The Stresses Of Others

Imagine you’re rowing a boat on a foggy lake, and out of the fog comes another boat that crashes into you! At first you’re angry at the fool who crashed into you — what was he thinking! You just painted the boat. But then you notice the boat is empty, and the anger leaves … you’ll have to repaint the boat, that’s all, and you just row around the empty boat. But if there were a person steering the boat, we’d be angry!

Here’s the thing: the boat is always empty. Whenever we interact with other people who might “do something to us” (be rude, ignore us, be too demanding, break our favorite coffee cup, etc.), we’re bumping into an empty boat. We just think there’s some fool in that boat who should have known better, but really it’s just a boat bumping into us, no harm intended by the boat.

That’s a hard lesson to learn, because we tend to imbue the actions of others with a story of their intentions, and how they should have acted instead. We think they’re out to get us, or they should base their lives around being considerate to us and not offending us. But really they’re just doing their thing, without bad intent, and the boat just happens to bump into us.

“Judge a man by his questions rather than by his answers” – Voltaire 
