Constant connectivity and pervasive social media often feel like a massive distraction, vacuuming brain power and attention away from more tedious but necessary work and killing productivity. Easy access to Google, especially via mobile devices, can make us reluctant to think or flip through our own memory files, favoring the fast, easy answer. Low barriers to publishing content online mean we’re frequently writing dumbass things we’ll live to regret. Has the always-on Internet made us collectively stupider? (Thanks a pantload, Al Gore.)
A number of “pundits” and “luminaries”—I’m reluctant to use those terms without irony—think just the opposite: the Internet is making us smarter! (Thanks, Al Gore!) Echoing some of the ideas in Steven Johnson’s 2005 bestseller Everything Bad Is Good for You, which argues that popular entertainment like TV and video games is getting more complex and improving people’s critical thinking skills, Clay Shirky has authored a new book called Cognitive Surplus: Creativity and Generosity in a Connected Age. Yes, surplus, not deficit—we apparently have more brain power than we know what to do with! From the Amazon description of the book:
For decades, technology encouraged people to squander their time and intellect as passive consumers. Today, tech has finally caught up with human potential. In Cognitive Surplus, Internet guru Clay Shirky forecasts the thrilling changes we will all enjoy as new digital technology puts our untapped resources of talent and goodwill to use at last.
Since we Americans were suburbanized and educated by the postwar boom, we’ve had a surfeit of intellect, energy, and time (what Shirky calls a cognitive surplus). But this abundance had little impact on the common good because television consumed the lion’s share of it, and we consume TV passively, in isolation from one another. Now, for the first time, people are embracing new media that allow us to pool our efforts at vanishingly low cost. The results of this aggregated effort range from mind expanding—reference tools like Wikipedia—to lifesaving—such as Ushahidi.com, which has allowed Kenyans to sidestep government censorship and report on acts of violence in real time.
I like the ideas here, but I’m not sure tech has quite “caught up with human potential”—or the other way around. Shirky notes that “Wikipedia was built out of roughly 1 percent of the man-hours that Americans spend watching TV every year.” That’s awesome, but most people I know are still spending more time watching TV than writing Wikipedia articles, and they spend a lot more time checking Facebook than entrepreneuring or do-gooding. Certainly, the Internet empowers some people in thrilling new ways. But have we reached a tipping point yet?
Whether or not we’ve overcome passivity, it sounds like an interesting perspective on where we’re headed. In a brief review, Michael Gray recommends it for anyone who works in “old world media,” “electronic publishing” or “any aspect of social media”: “It offers insight and things to think about, plan for, and use to succeed in a world where more people are creating and competing for attention. With the rise in the amount of published material, especially from amateurs and prosumers, it’s important to understand that more people are producing and sharing than ever before. Make no mistake—there is a lot more competition for eyeballs, and it’s only going to increase.”
Graywolf’s review was the second thing I read this week about the effects new media is having on our lives and brains. The first thing I read was published in the Huffington Post, so you can guess right away that it’s ill-informed, sensationalist, and generally wrong-headed. (If new media is making us smarter, the HuffPo is doing its damnedest to counteract those effects.)
This article, penned by Bill George, a professor of management practice at Harvard Business School, is optimistic about today’s youth, for the most part. According to George, “Young adults today study harder and more often, engage in more community service, participate in greater numbers of extracurricular activities, and hold a more optimistic outlook on the future than any other generation in modern history.” Hmmm, kids these days study harder and more often? This directly contradicts the results of a study I read about a while back:
Using multiple datasets from different time periods, we document declines in academic time investment by full-time college students in the United States between 1961 and 2003. Full-time students allocated 40 hours per week toward class and studying in 1961, whereas by 2003 they were investing about 27 hours per week. Declines were extremely broad-based, and are not easily accounted for by framing effects, work or major choices, or compositional changes in students or schools. We conclude that there have been substantial changes over time in the quantity or manner of human capital production on college campuses.
This leads me to believe Bill George pulled those other “facts” out of a hat as well. (Most likely, he’s thinking of Harvard students specifically, who do have to participate in more extracurricular activities and do more community service in order to get in.) It doesn’t help his credibility that he says Millennials have “grown up on Twitter and Facebook”—as my friend Brian pointed out, Twitter is less than five years old—and describes them as “thirsty to impress.” Say what now? I can’t concentrate with this dry mouth.
Unlike Shirky, however, George thinks all this cognitive surplus might be a bad thing:
Despite their collective activity level and propensity for community engagement, this generation may be at risk of becoming too accustomed to constant exposure, of becoming too quick to say: “Got it – on to the next one.” In charging ahead, are Millennials failing to take time to focus and reflect? Are they so caught up in keeping up that they will ignore vital real-life lessons that are needed to gain the wisdom to stay pointed toward their True North?
And what real-life lessons does George advocate? Making time for stuff like yoga, meditation, and prayer, which is really not where I thought he was going to go with that. Call me crazy, but I expected something … business-related.
So what do you think? Are we getting smarter or stupider? I’ve been interested for a while in a phenomenon known as the Flynn Effect, whereby people do appear to be getting smarter over time, at least by IQ standards. Mean IQ scores rose steadily throughout the twentieth century, generation over generation, at a rate of roughly three points per decade.
But do TV and Twitter have anything to do with it? Or is social media actually holding back our progress? All I know is, I could have written this a lot faster without all those TweetDeck notifications.
Have a great weekend, everybody!