Impotence in the Age of Distraction

Ethan Gach

I write about comics, video games and American politics. I fear death above all things. Just below that is waking up in the morning to go to work. You can follow me on Twitter at @ethangach or at my blog, gamingvulture.tumblr.com. And though my opinions aren’t for hire, my virtue is.

129 Responses

  1. dhex says:

    that restlessness is actually a symptom of increasing freedom from family, cultural, and social bonds. people gotta learn how to ride a trike before they can ride a bike. (only gods ride a unicycle)

    i always liked barzun, but i never particularly agreed with him.

  2. NewDealer says:

    This is interestingly one area where I feel the far-left and far-right have a lot in common or at least a perceived common enemy. That enemy is technological advancement and the fear of big things. I have a lot of far-left friends and they probably have more in common with Rod Dreher’s crunchy conservatism than either would want to admit.

    The ultimate societies might look very different but the far-left and far-right both seem to want small communities in a pastoral setting.

    I’m a bit more sympathetic to Franzen’s argument than anything else, but the far-left arguments still baffle me.

    I have an acquaintance from college who posts stuff from communities with names like “Decolonize Your Diet.” I have no idea what that even means. I’m a third generation Jewish-American who grew up in the NYC suburbs. Do I need to eat like I am still in the Pale of Settlement? Can I eat Chinese food because they were always a big part of the New York population from my youth onwards, or do I get a generic European-American white diet?

    That being said, I am more familiar with the ongoing literary disputes between Jonathan Franzen and Jennifer Weiner, and here I generally come down on Franzen’s side. Weiner has two arguments. One is good and interesting and the other is self-serving and completely misses the point and purpose of criticism and the Sunday Book Review. Her good and interesting point is that the Sunday Book Review can cover more female and minority authors and employ them as reviewers.

    Her self-serving and completely wrong point is that the Sunday Book Review should cover more “commercial fiction.” Short version: They should review authors like Jennifer Weiner instead of authors like Jonathan Franzen and serious non-fiction. Weiner’s argument seems to be that people are not reading 600-page books about the 1940 Presidential Election from Yale University Press; people read books like her “Good in Bed” or Tom Clancy thrillers. The point of the book review is not to cover what is popular but to cover what is good and noteworthy and potentially game-changing and will leave a stamp for the ages. It is not a popularity or book sale contest. Also, I am old-school and believe that the primary role of a critic is to provide exposure to things that are not easily in the public consciousness.

    The above paragraph frequently gets me labeled as snobby, just like Franzen, so take it as you will.

    • Chris in reply to NewDealer says:

      I started to say that attempting to escape from technology leaves us almost as beholden to it as blindly, which is to say purely practically, following advancing technology, but then a little voice in my head said that wasn’t my idea (though I agree with it), but someone else’s. Then I remembered whose idea it was: he who on the OT shall not be named lest ye find your comment in the spam folder:

      “Everywhere we remain unfree and chained to technology, whether we passionately affirm or deny it. But we are delivered over to it in the worst possible way when we regard it as something neutral; for this conception of it, to which today we particularly like to do homage, makes us utterly blind to the essence of technology.”

      (Emphasis mine.)

      I admit that I have moments of strong neo-Luddism, out of which I am always shaken by my phone vibrating in my pocket. So I certainly agree that technology is distracting; that blind devotion to it alters our relationship to the world, including to each other and ourselves, for the worse. This, however, is not the fault of technology itself, but of our almost entirely unreflective relationship with technology, which means that we allow technology to drive our wants and needs as much as our wants and needs drive technological progress. Getting out of this cycle, vicious as it has become in our culture, doesn’t mean we have to return to a mule and a plow, though. It just means that we have to take a step back for a moment, now and then, and think (though admittedly, this is precisely what technology often makes so difficult).

    • Stillwater in reply to NewDealer says:

      But we are delivered over to it in the worst possible way when we regard it as something neutral

      This strikes me as entirely wrong, but it’s probably so laden with implied content that I don’t understand what he means. And might not agree with it anyway.

      I think it’s crucial to look at technology (well, innovation, more specifically) in the most neutral terms possible since in my mind it’s no different than any other basic human activity: art, love, creativity, curiosity, acting intentionally, etc. Or in other words, innovation (the outcome resulting from creativity applied to a purpose more or less) is just something people do, and so is (or certainly can be) viewed entirely neutrally.

      How individuals embedded in societies and economic systems view technology seems like a different topic. The more interesting one.

      • Stillwater in reply to Stillwater says:

        {{My greying analytic roots are showing, aren’t they?}}

      • BlaiseP in reply to Stillwater says:

        One of our own has just written a book about Faith and Doubt, furnishing three links to separate merchants selling that book. I hope the book does well. I hope Kyle finds a larger audience for his thoughtful musings. Innovation attempts to create new markets for things we didn’t know we wanted.

        What we do with these things, the meanings we assign to them, that’s what MH is trying to address when he says we give ourselves over to tech in the worst possible way. First a toy, then a tool, then a weapon, that’s the usual progress of any given technology. The Internet’s first truly profitable application was porn. Every monitor became a peep show. But photography had done the same thing, skeevy little men would peddle scandalous photographs of Nekkid Leddies — and Gennemens. In surprisingly inventive poses, too.

        MH did a fair bit of grovelling before the Gods of Pseudo-Science in his own time, in the worst possible way, wretched old Nazi that he was. Bit of an Alice in Wonderland, that one. Gave wonderful advice but he very rarely took it.

      • Glyph in reply to Stillwater says:

        This may not be at all what the comment means, but having re-read Blindsight recently, the “technology implies belligerence” meme is in my head.

        Even if true, of course it STILL doesn’t make technology non-neutral; but when people react negatively to technological advancement, I think at least some of it may be an inchoate reaction to what they perceive as technology’s motivation.

        It might seem almost too obvious a conclusion. What is Human history, if not an ongoing succession of greater technologies grinding lesser ones beneath their boots? But the subject wasn’t merely Human history, or the unfair advantage that tools gave to any given side; the oppressed snatch up advanced weaponry as readily as the oppressor, given half a chance. No, the real issue was how those tools got there in the first place. The real issue was what tools are for.

        To the Historians, tools existed for only one reason: to force the universe into unnatural shapes. They treated nature as an enemy, they were by definition a rebellion against the way things were. Technology is a stunted thing in benign environments, it never thrived in any culture gripped by belief in natural harmony. Why invent fusion reactors if your climate is comfortable, if your food is abundant? Why build fortresses if you have no enemies? Why force change upon a world which poses no threat?

        Full disclosure – I’m a techno-optimist, mostly. I generally have little patience for Luddism.

        And yet I have thus far remained off Facebook, Twitter, and smartphones (though the last will likely change when this phone finally dies)…

      • Chris in reply to Stillwater says:

        Still, no more Carnap or Quine for you!

        You’re right to suspect that neutral means something different than what you describe here. Mr. H means essentially what I was saying about us tending to take an unreflective approach to technology and technological advancement. That said, I think he would both agree and disagree with your characterization of how we should approach technology neutrally. In the essay from which I took the quote (which is pretty short and not at all impenetrable like some of his prose), he ends with a comparison of art and technology, and argues that our stance toward technology should be a little bit more like our stance to art, or there should be a little bit more of art in our technology (which doesn’t mean that we should buy Apple products because they look nicer).

        Put differently, the problem with technology, when approached “neutrally,” which is to say unreflectively, is that it ends up shaping our world in ways that we don’t really have any control over. The problem isn’t technology itself, but our understanding of its nature and its effects.

      • Chris in reply to Stillwater says:

        Glyph, that sentiment (and I remember that passage, for some reason) is not too far from H., actually.

        Now I’m just going to post a link to the friggin’ essay:

        http://simondon.ocular-witness.com/wp-content/uploads/2008/05/question_concerning_technology.pdf

      • Stillwater in reply to Stillwater says:

        This is from the H paper:

        Where and how does this revealing happen if it is no mere handiwork of man? We need not look far. We need only apprehend in an unbiased way That which has already claimed man and has done so, so decisively that he can only be man at any given time as the one so claimed. Wherever man opens his eyes and ears, unlocks his heart, and gives himself over to meditating and striving, shaping and working, entreating and thanking, he finds himself everywhere already brought into the unconcealed. The unconcealment of the unconcealed has already come to pass whenever it calls man forth into the modes of revealing allotted to him. When man, in his way, from within unconcealment reveals that which presences, he merely responds to the call of unconcealment even when he contradicts it. Thus when man, investigating, observing, ensnares nature as an area of his own conceiving, he has already been claimed by a way of revealing that challenges him to approach nature as an object of research, until even the object disappears into the objectlessness of standing-reserve.

        He concludes

        Modern technology as an ordering revealing is, then, no merely human doing.

        Of course! It’s all been revealed to me now!

      • Chris in reply to Stillwater says:

        Still, heh… now your greying analytic roots are showing.

        That’s actually a fairly important passage, in that it relates technology to our more primary relationship with Being, but I realize that if I say any more, you will become even greyer ;).

      • zic in reply to Stillwater says:

        How individuals embedded in societies and economic systems view technology seems like a different topic. The more interesting one.

        this.

        I design hand-knitting patterns; before technology gave me an internet platform to sell them, I would not be designing, something that I love. I’m dependent on technology to help seduce others into the ancient world of fiber arts and crafting.

        My husband produces music. He’s got like five albums of his original jazz you can go listen to for free. These amazing recordings could not have been made without the advances in technology that allowed him to digitally edit, mix, and master the recordings; let alone record them in the first place. What did Mozart sound like in Mozart’s time? We do not know.

        My husband also writes computer-music applications; his brother literally invented a new art form. Keyboard Magazine once called him the Mozart of computer music, and bands like Radiohead and U2 thank him on their album covers.

        So technology is what you make of it. You can watch TV or you can make a TV show. You can play video games, or write them. Perhaps this isn’t a problem of technology so much as a problem of not valuing creative pursuits properly; of seeing them as something you do when you’re not at your day job and not watching TV.

        But I already know I’m pretty much in a minority; that distraction is the prime thing for many; thankfully, some people opt to be distracted by the art people make with the software my brother-in-law and husband write, by reading Kyle’s book, by knitting my designs.

        See, it’s really just a matter of perspective. Your distraction is my life’s work.

  3. BlaiseP says:

    Oh Tempura, O Morels! Is there anything more twee and triste than a successful book author bemoaning These Modern Times?

    I’ve read Karl Kraus. Good satire is a treat. But like all treats, it doesn’t make for a satisfying meal. Kraus could afford to sit like a noisy crow atop his perch at Die Fackel, squirting ink and shit at all and sundry, supported by his father’s money, made, as irony would have it, in the papermaking business.

    Kraus’ German is exquisitely phrased cruelty in a language well-suited to that genre of writing. He needlessly wounded plenty of people and was among the first to back the ur-fascists. His fans have made much of his later condemnation of the Third Reich but he backed the fascist Engelbert Dollfuß. Kraus had few friends and fewer allies.

    I’m not surprised to see Jonathan Franzen cuddle up to Karl Kraus. There’s a note of mordant self-absorption and the Indignation which would call itself Righteous but is no such thing.

    What is Franzen’s problem? Isn’t the choice of Mac or PC or Linux as much a personal choice as shoes — or books? Anyone who would predicate his distaste for Apple on advertisements has rather missed the point. Franzen’s just being precious. Microsoft Windows has its problems, Lord knows. But it’s just an operating system. It runs software people find useful. Most of its problems are not created by Microsoft but rather by the drivers submitted to them by third party hardware manufacturers. Don’t turn up your nose at Mac People, Franzen. They know what they like. They’re people. Many of them buy your books.

    Franzen’s head has swollen up to the point where he may need some sort of prosthesis to hold it upright, like some HANS device worn by NASCAR drivers to keep their cervical vertebrae intact in a crash.

    Technology comes over the horizon, people welcome it into their lives. Some of it is crap, some of it is interesting. What was deluxe becomes debris. The Enlightenment was fascinated with gadgets. They’d have dinner parties where noted scientists would conduct experiments. Seems like a pretty good time to have lived, all things considered. The Enlightenment swept out tons of old crap. Science was fashionable. I’d love to have met Lavoisier and Franklin and those guys — and girls, women liked scientists back then.

    And what’s with this about Jeff Bezos and Amazon? When it was Borders and Barnes and Noble and all those apostles of mediocrity destroying the local bookstore, where was your outrage then, Franzen?

    Apocalypse, Franzen, is a Greek word. Means to take the lid off a pot or remove a covering. Karl Kraus’ bitter sermons didn’t sell very well in his day and won’t sell in ours, either. There never were any good old days. These days, fraught with insincerity and frothy with empty rage, are no different than those of good ol’ Karl Kraus.

    • Jim Heffman in reply to BlaiseP says:

      Franzen seems like one of those people who believes that nothing is real unless he touches it with his hand.

    • Jaybird in reply to BlaiseP says:

      Socrates complained about the written word. It would cause the memory to atrophy.

      More recently, a friend was over and we were discussing giving another friend a call but she, sadly, left her cell phone at the house. I grabbed the cordless phone and handed it over and she looked at it and said “what in the hell am I going to do with *THAT*?”

      Truly, our earth is degenerate in these latter days…

      • BlaiseP in reply to Jaybird says:

        It seems it isn’t just the Lord lookin’ at how great the wickedness of the human race has become on the earth, and how every inclination of the thoughts of the human heart was only evil all the time. Now we must contend with angst-ridden effetes looking around for the Smite Button like so many drunk party-goers looking under seat cushions for the DVR remote.

        What can we do, Jaybird? What could be done — by anyone — about these people?

        Bunch of us used to go to the grocery store at lunch. Great deli counter, good prices, we’d get sandwiches made up there. So Nate’s the first to approach the door. Proximity sensor must have been busted or something. He stands there, frozen in place, not sure what to do. Peter pushes the door open. “Nate, you’re overcivilised,” he said.

      • Burt Likko in reply to Jaybird says:

        Socrates was right, of course. The dissemination of writing eroded the powerful oral traditions of generations past such that we now have only fragments of Homer’s works — we know about the lead-up to one battle in the Trojan War and how one guy got seriously lost on the way home afterwards. But there’s a lot more that happened in that myth that we only know of obliquely because no one ever wrote it down because remembering it and retelling it beautifully in recitative was the point of having poets.

        This does not mean that moving to a written from an oral recordation of ideas was a bad thing. Similarly, adopting electronic media will not necessarily be a bad thing either. This is not what Franzen, or Kraus, or our esteemed author, is really talking about.

        Orwell warned, in the most important essay ever written, that people let language do their thinking for them, that language molds thought. When our language becomes compressed to 140-character zingers, our thought molds itself into that mode of expression. Thus our politics descends into “Obamacare will destroy us all!!1!” and “Repugnicans are wors than the mullahs in Iran you can negotiate with Iran LOL!” That’s all the deeper that our ultracompressed instant expressions allow us mental space to think.

        And it’s more than just politics, of course; politics is one aspect of our culture in which the displacement of contemplation by Instathought has eroded something very good that used to exist shared in the community of peoples’ minds and now feels much more uncommon than before.

        Way back in classical Greece, right around the time Socrates was griping that writing everything down would make people’s memories flaccid and leave all that great culture and thought behind, some scribe thought “Hey, if we use this new medium to write down all these great poems, then people can enjoy them forever!” They found a way to blend what was good about the new technology with what was good about the old traditions. It’s obvious to the point of triteness to say the same thing can happen, and is happening, with electronic media. It’s not so obvious, and perhaps schoolmarmish and thus unpleasant, to point out that there’s not nearly enough of this going on as could, or should.

      • Jim Heffman in reply to Jaybird says:

        ” there’s a lot more that happened in that myth that we only know of obliquely because no one ever wrote it down because remembering it and retelling it beautifully in recitative was the point of having poets.”

        And if the anticopyright crowd gets its way, we’ll be right back there, because the only way artists will make a living is through live performance or work-for-hire (which will quickly become “tell me a story about how awesome I am, like the time I fought a tiger, uh, while skydiving, no wait it was a ninja, actually hang on NINJA TIGERS okay that’s BADASS gimme a story about that!” “Anything you want, ma’am. So there was this one time that Hillary Clinton was skydiving, and suddenly…”)

      • Chris in reply to Jaybird says:

        Orwell warned, in the most important essay ever written, that people let language do their thinking for them, that language molds thought.

        As he who shall remain nameless put it, “Language is the house of Being.”

        (Noted simply to drive Stillwater even crazier.)

      • Burt Likko in reply to Jaybird says:

        @jim-heffman , regarding your fear of the vapidity of work-for-hire writing, do you disparage the painters of the Italian and Dutch Renaissance for flattering their financial patrons within their works? Despite the pressure to portray their paymasters as more attractive and wholesome than they truly were, these artists nevertheless managed to create sublime works of art.

      • NewDealer in reply to Jaybird says:

        @burt-likko

        People often note that it is impressive that I can and do write in full sentences and paragraphs.

        I’ve also had a lot of people tell me I seem formal for speaking and writing in a very slang-free kind of way and this is a bit intimidating.

      • Burt Likko in reply to Jaybird says:

        You just keep on keepin’ on, @newdealer . I like you just the way you are.

      • Stillwater in reply to Jaybird says:

        “Language is the house of Being.”

        Yaarrrgghhh!!

      • Jim Heffman in reply to Jaybird says:

        “Despite the pressure to portray their paymasters as more attractive and wholesome than they truly were, these artists nevertheless managed to create sublime works of art.”

        It’s interesting how you had to put a “despite” in there but you still think it’s a rebuttal to what I said.

      • Jaybird in reply to Jaybird says:

        “despite”

        Yeah, I was thinking about that too. I wonder if we’ll talk about Breaking Bad in 10 years. (Granted, we still talk about the Sopranos but that seems to be fading… we talk more about what the Sopranos meant and what it showed was possible. I mean, when *I* talk about the Sopranos, it’s to tell people to watch the first two seasons and then daydream about seasons 3-6.2.)

        The storytelling we do anymore seems to have left sublime behind. We’ve got craft mastery of plot, pacing, character development, swerves, and payoffs but… what’s the story laboring in service *TO*?

        I understand that our entertainments are better crafted than they used to be. I seem to notice that sublime doesn’t show up as often as it seems it used to.

      • Kim in reply to Jaybird says:

        Sublime is Arrested Development.

      • Mike Schilling in reply to Jaybird says:

        We still talk about The Wire. Not just blathering about what it means but about how Omar’s coming, and Bubbles finally got to go upstairs, and where the fuck is Wallace, String?

      • Glyph in reply to Jaybird says:

        @jaybird The storytelling we do anymore seems to have left sublime behind. We’ve got craft mastery of plot, pacing, character development, swerves, and payoffs but… what’s the story laboring in service *TO*?

        I understand that our entertainments are better crafted than they used to be. I seem to notice that sublime doesn’t show up as often as it seems it used to.

        I am dense, because I am not following you. Can you give an example of one you consider “sublime” from way back when, as a point of comparison?

      • Will Truman in reply to Jaybird says:

        I think Jim’s premise here is pretty flawed. Even if musicians make their money from shows, they’ll still have CDs. Even if authors make their money from live readings or also being an accountant, they’ll still release their works. Advertising.

        The only vulnerability I see is film. And even then, it would be a question of craftsmanship rather than release. As movies have a lower base-price, there’s not much reason to believe that people won’t keep releasing those, too. I do think something could definitely be lost with movies, though, in a completely copyright-free environment.

        As it stands, though, BitTorrent has been unbreakable and, contrary to what we might have expected, we have more distributable content than ever.

      • Cascadian in reply to Jaybird says:

        This thread is frustrating to me. It compels and repulses at the same time. To the extent that it is about technology, I think about my early distrust of GPS or electronic navigation. If you’re a sailor or mountaineer, relying on electrical instruments to keep you alive is a roll of the dice at best. Then I see the stuff they do now in the America’s cup, truly fascinating uses of this stuff.

        When we switch to authenticity and truth and why kids these days….. I just roll my eyes back into my head. Are we still having pissing matches about truth, meaning, and who is real? Really?

      • Chris in reply to Jaybird says:

        The idea of art as something other than a craft, like masonry or smithing, is a pretty recent one. The “despite” is wholly unnecessary, because it only makes sense within the fairly Romantic conception of art that we have today, and within our conception of work-for-hire art as wrapped up in our experience of art as a consumer product, both of which wouldn’t have made much sense to artists or the patrons of artists in the 15th or 16th century.

      • Michael Drew in reply to Jaybird says:

        @burt-likko

        Just because you can compose tweets that stupid doesn’t mean that Twitter causes everyone to think that stupidly. Go on Twitter – a lot of people say smart, interesting things. See @CK_MacLeod, for instance. Or @TimKowal or @EliasIsquith. And thoughts that stupid (and short) were regularly composed long before twitter, and expressed in similarly brief statements.

        What’s really going on is that Twitter is the first public communal personal newsfeed (quasi-blog) – you can go on and see what anyone who doesn’t limit access to their tweets says, all in one place. Was there something like it before that I’m not aware of? I actually think it’s a pretty significant innovation. What we see with communication on Twitter is really (potentially) everything all together in one place like we haven’t ever had before. Certainly more people are saying more stupid things than before, but that could just be a function of more people saying more things in general. It’s not clear to me that Twitter (or other forms of shortened communication) is actually causing thoughtful people to contemplate less, or to write less thoughtfully in longer formats. It could be happening, but I think the effect we’re seeing is just more communication, not less thought. The critique would be cacophony, not so much mindlessness or thought control (not that I deny that the medium helps to form the message, and then even the thought).

      • Jaybird in reply to Jaybird says:

        Quantum Leap! Or, I suppose, most art that felt that it ought to exist in service to showing people at their best, attempting to make the world better, and showing them overcoming adversity.

        The Wire, I suppose, is brilliant insofar as it shows the machine and what happens when the people in the machine (the game?) do nothing but make perfectly logical and reasonable decisions in their own best interest. I understand it’s bleak as heck, though.

        Breaking Bad just finished an amazing story arc about a nice guy with the best of intentions becoming evil, and we get to watch pretty much every step of the way.

        The Sopranos was about a sociopath who killed his best friends, his protégés, his girlfriends… but, hey, he was emotionally complex.

        A show devoted to people doing good deeds and trying to make the world better tends to get dismissed as treacle. I suppose that’s to be expected but… Drug dealers? Mafia bosses?

        Say what you will about superhero stories, at least the “well, you have to understand…” explanations are given on behalf of the folks trying to help rather than the ones making money because it’s all part of the game.

      • Stillwater in reply to Jaybird says:

        Can you give an example of one you consider “sublime” from way back when, as a point of comparison?

        I’m with ya there. Sublime is one of those words which seems univocal but ain’t. What constitutes a moving, transcendent, divine-approaching experience is different for everyone. It may even be context dependent.

      • Stillwater in reply to Jaybird says:

        Just because you can compose tweets that stupid doesn’t mean that Twitter causes everyone to think that stupidly.

        Yeah, that seems like a confusion to me. From one pov, individual people are just smart or stupid. Twitter doesn’t make them any smarter or dumber. From another, the idea that Twitter is corrupting long-form argument presumes that people on Twitter would have engaged in long-form argument but for the existence of Twitter. Is there any evidence of that?

      • Glyph in reply to Jaybird says:

        I think maybe you’re just focusing on the wrong genres if you’re looking for that in gangster (Sopranos) and/or noir-pulp (Breaking Bad).

        You want a Quantum Leap analogue, how about the OTHER Walter (Bishop), trying to right his wrongs in Fringe. You want superheroes making great sacrifices to save the world, you got your Buffy Summers and crew.

        If you don’t like fantastical elements at all, go with Friday Night Lights.

        I agree that these dark/bleak ones maybe take up a disproportionate amount of the cultural conversation (though for all the talk about it now, Wire NEVER got numbers while it was actually on, and when I wanted to talk about it with anyone, I had to personally screen it for them); but let’s face it, sci-fi and superheroes automatically exclude some people (or, they cause some people to automatically exclude them is a more accurate way to phrase it).

        Quantum Leap was never a big part of the cultural conversation as I recall. Star Trek eventually was, but it took a long time to get there (and famously pretty much failed initially).

        Game of Thrones still seems like an outlier as far as mainstream acceptance of drama in a fantastical setting (though it’s definitely dark/bleak).

        Everyone can talk about these other shows, even those who are embarrassed by wizards ‘n’ aliens ‘n’ s**t.

      • Burt Likko in reply to Jaybird says:

        @michael-drew if the proposition is “People were stupid all along, Twitter just lets us see this more clearly than we did before,” I suppose that’s not something I’d object to all that strenuously. But that’s not what I said.

        Twitter (among other media), not all at once but slowly, corrosively, is one of many factors encouraging shorter and less complex thought. Contrary to @stillwater, I think that Twitter (among other media) does contribute to an abbreviation and simplification of thought. Just as cloudy academic language contributes to cloudy thought.

        This is not quite the same thing as “twitter makes you stupid.” But twitter does make you brief. Twitter, among other things, reduces the default response to nuanced thought and careful composition to “TL/DR.”Report

      • Burt Likko in reply to Jaybird says:

        In addition: I made a point of indicating that new media can indeed be used in productive, important, and intelligent ways. Reference my praise of scribes who wrote down the Homeric poems that they did — combining what was good about the old culture and the new technology, and my indication that the proposition of new technologies actually being used in such a fashion was so obvious as to be stipulated. I ended my comment by implying, and here I explicate, that this is something that is happening — there’s just not enough of it to redeem the vast bulk of crap from which it rises.Report

      • BlaiseP in reply to Jaybird says:

        Seems to me allowing an idiot to get a Twitter account is rather like giving a six year old a bullhorn.Report

      • Mike Schilling in reply to Jaybird says:

        Twitter (among other media), not all at once but slowly, corrosively, is one of many factors encouraging shorter and less complex thought.

        +1Report

      • Mike Schilling in reply to Jaybird says:

        Twitter is a direct connection between the id and the Internet. It’s a slightly less social use of the fingers than masturbation.Report

      • Chris in reply to Jaybird says:

        Mike, have you ever participated in an internet chat room? A lot of Twitter is basically an internet chat room.

        Also, it works really well with certain cultural traditions that I think you would appreciate.

        Twitter can be a wasteland, but it’s people, and people create wastelands wherever they go. But it can also be fun, smart, informative, and more. If it’s your primary means of communicating with people, that might be a problem, but that’s true of just about any medium.Report

      • Mike Schilling in reply to Jaybird says:

        Never one that held my interest for more than five minutes, no.

        That’s an odd analogy, though. It would never occur to me that internet chat room conversations are of interest to anyone but the participants, or that they’re worth preserving. Twitter is one person broadcasting to anyone who cares to hear (in fact tweets are often cited by third parties), and it’s archived (AFAICT) forever.Report

      • Chris in reply to Jaybird says:

        Mike, it’s only just barely an analogy. It approaches literalness.

        It’s possible to think of Twitter as microblogging, in which case people really are posting things to be broadcast widely, and preserved. Many people use it this way. However, many people use it to have rapid-fire conversations with a group of participants, not intending to communicate to anyone outside of that group, and intended primarily, if not entirely, to be read in the moment. Of course, many people use it as a combination of the two, sometimes throwing things out meant, if not for the world, then at least for their followers and anyone who might happen by (via retweets), and sometimes having conversations with multiple participants for a limited period of time, often on a theme (like some sort of event).

        I’ve mentioned my girlfriend using it to have TV watching parties, and what happens when an episode of Scandal is mind-blowing. In those cases, it is definitely a lot like a chat room.Report

      • Stillwater in reply to Jaybird says:

        Twitter (among other media), not all at once but slowly, corrosively, is one of many factors encouraging shorter and less complex thought.

        +1

        Heh. It took some long-form thinking to figure out that one.Report

      • Patrick in reply to Jaybird says:

        Anybody who thinks Twitter is the next step in a devolution needs to go back and look at MySpace.

        Because, really, puking crap all over a page is probably worse for nuanced thought than pithy one-liners.Report

      • Mike Schilling in reply to Jaybird says:

        @stillwater

        Thanks for getting the joke.Report

      • Mike Schilling in reply to Jaybird says:

        @chris

        I see. I’ve used Skype for that, e.g. to watch a ballgame “with” people I’m not actually with. Technologies can generally be used for multiple purposes. Usenet, which was a mostly-text chat system largely used for conversations not unlike blog posts + comments, is used today almost entirely for sharing pirated binaries.Report

      • Michael Drew in reply to Jaybird says:

        @burt-likko Actually, what you said was something rather more sweeping even than ‘Twitter makes you stupid’:

        When our language becomes compressed to 140-character zingers, our thought molds itself into that mode of expression. Thus our politics descends into “Obamacare will destroy us all!!1!” and “Repugnicans are wors than the mullahs in Iran you can negotiate with Iran LOL!” That’s all the deeper that our ultracompressed instant expressions allow us mental space to think.

        And it’s more than just politics, of course; politics is one aspect of our culture in which the displacement of contemplation by Instathought has eroded something very good that used to exist shared in the community of peoples’ minds and now feels much more uncommon than before.

        But taking your more modest subsequent explication, “Twitter (among other media), not all at once but slowly, corrosively, is one of many factors encouraging shorter and less complex thought,” as your bottom line, I don’t think we disagree too much here, really. You’re more sure this process is happening, and of its causes, than I am, but I don’t deny that the volume of modern communications places a premium on brevity and directness of message, and therefore on refinement of thought to match those parameters. I don’t deny it, but at the same time, I’m not at all sure I know to what extent it’s really happening. There’s still lots of long-form journalism and other complex writing produced – I know because I link to and read it all the time via Twitter! I simply don’t know what is going on with our capacity and proclivity to entertain and produce complex thoughts, descriptively or causally. It’s just something I don’t know the evidence about at all.

        But I’d suggest that holding Twitter out as a causal factor apart from the broader technological context in any of this is extremely speculative. From where I sit, it seems to me that the form that electronic media have taken on the internet in general has radically changed the informational economics of communication. Suddenly, there is essentially an infinity of individual communications available to us – texts, emails, digital newspaper and magazine articles, Facebook updates, Tweets – while in many of those cases, there’s essentially no additional cost in the delivery of that content associated with increased message length. Ironically, the internet potentially could allow us to communicate at greater length at lower cost than we ever could before, but practically, it causes us to choose to be brief, largely because the increase in number of communications that it also enables is, on net, of more value to us than the potential increase in length of those communications given the enduring constraint, which is our time. Certainly in business – but by and large in personal or political communication as well. Keeping in touch with three friends at a third the length each, we have decided more than not, is at least somewhat more valuable (if maybe not three times) than staying in touch with one friend at three times the one-third-length. Three political viewpoints, if stated succinctly by people whose views are all worth having, can be more valuable than the three-times-less-succinct thoughts of one of them.

        This insight is not new to the digital age – people responsible for making decisions using lots of streams of information have always had to excel at the task of managing the demands on time of the various streams. Twitter isn’t really anything but the logical extension of that basic insight in informational economics. An intelligent person’s Twitter stream consists more than anything else of concise summary statements (just like the ones we offer for each of our articles at this website) of longer pieces of analysis or argument, accompanying a link to said article. From 6am-7pm on any given work day, that’s the bulk of what I see in my Twitter stream. In those hours it’s a utility to allow information analysts of every stripe to promote their products, and to help consumers thereof to choose how to direct their most precious resource. At other hours, and to some extent throughout the day (or maybe the day/night dynamic is less pronounced overall than I suggest here – this may all be more mashed together) Twitter becomes something else. Of course Twitter is always a running commentary on various business, cultural, political, and sporting events. But taking in any intelligent Twitter user’s stream on a given evening or topic, rather than just a single tweet, as a unit of analysis, something very different from the portrait that’s been offered in this thread emerges. There is certainly complexity, but there is also multilateral interaction and reconsideration. The approach to analysis is social and iterative. And frequently discussions become too involved and are taken to longer-form formats. Take that as an indictment of the format if you want, but in fact without the multilateral iterative process (uniquely to my knowledge) facilitated by Twitter, those longer-form offshoots, to the extent they ever existed, would start in a much less-developed conceptual place than they do after the iterative gestation that happens on Twitter.

        That’s been my experience of Twitter – as a source of links to longer-form arguments and informative writing, and as a site for iterative processing of various and sundry issues of public interest. 140 characters is an arbitrary and somewhat limiting unit of information, to be sure. But, as any time human beings converge in numbers on an adaptive tool for interaction, complexity emerges. There’s a lot of value that I see in that, no matter how much crap it springs out of. To be honest, I have a hard time seeing how the crap that gave rise to this complex value (or that just accompanies it) detracts from, much less outweighs said value for me at all. It’s all free (at least until I click through to try to read some of this allegedly complex prose), and I can choose: I know value from crap one way or another. At a minimum, it’s worth it.

        It’s possible that there is an effect going on in the logistical economics of modern digital communication that encourages brevity and discourages complexity; as I say I don’t deny this. But my contention is that that is more fundamental to the technological developments that have taken place than it is a function of particular utilities like Twitter. Twitter, it seems to me, is just one adaptive technology that has arisen to help people navigate the volume and complexity of the communication that is available on the internet. It allows us to structure the vast, undifferentiated information that cyberspace would otherwise be in a way that we can relate to as social animals, by structuring it in a way that resembles (or is in fact pegged to) our established social networks. Facebook does this too; Twitter, it seems to me, tends to expand what Facebook did in structuring information around our real social networks (friends, family, real acquaintances) to our more virtual networks – the constellation of thought leaders, personalities, media organizations, etc. that we have come to invest trust and value in. 140 characters is beside the point. Twitter is as complex as the social world we construct in it is, and may not reduce our capacity for thought any more than that world would have if it had not migrated onto the service. But perhaps it does; I don’t really know. I’m not sure how anyone else feels so comfortable thinking they do.Report

      • Burt Likko in reply to Jaybird says:

        Truly, @michael-drew I’m more interested in your refutation of Orwell’s thesis in “Politics and the English Language” than a defense of Twitter.

        After all, we both agree that a) Twitter can be and at least sometimes is used in an intelligent fashion; b) as a practical matter, Twitter is (often although not always) filled with a bunch of crap that at minimum requires sorting, and c) Twitter piggybacks on pre-existing social networks as a means of enhancing its value. So stipulated.

        But you seem to resist the idea that language molds thought, and therefore that constraints upon language necessarily impose constraints upon thought. My impression of popular media, in particular the massively-proliferated short-form media prevalent to a point approaching ubiquity in our culture, confirms this. Your impression seems different — and this seems to be based upon your use of short-form media as a way to conveniently locate long-form media. To what extent are usage patterns similar to yours practiced by large numbers of people?

        In 1946, Orwell observed that writers could get away with meaningless expressions of foggy thought, and that expressions of language, either inadvertently or cynically, could be used to render the abhorrent palatable and to apply a patina of intellectualism over vapidity. I doubt anyone here would disagree with the proposition that not much has changed in the three generations between Orwell’s day and ours.

        He points out examples of this from both academic and popular media; the academy is singled out for particular obloquy due to its role as the educator responsible for teaching people how to avoid that sort of thing.

        But he goes further to say that the relationship between foggy thought and spongy language is reciprocal, that there is a positive feedback phenomenon as one creates more of the other, back and forth. I believe this to be true. I fight it within myself; perhaps I am guilty of projecting this upon others who are not similarly guilty. Surely, though, I am not the only one to notice quite a lot of foggy thought and spongy language out there.

        The presence of several dozens of good writers and clear thinkers who stand athwart this trend does not refute the existence of the reciprocal relationship between poor language and poor thought. A society permeated with distraction can only enhance this phenomenon: the more blinking sidebar advertisements and sudden noises emanating from a website, the less mental space available for the reader of an opinion piece to expand the mind and absorb what is being argued or the reader of an informative piece to digest and incorporate the data made available — and the greater the effort needed to attain sufficient isolation to do these mental exercises, the fewer people will actually accomplish them.Report

      • BlaiseP in reply to Jaybird says:

        You’ve misunderstood Orwell, Burt. Orwell is attacking the worst aspects of hackneyed writing. Brevity may be the soul of wit, provided the writer has any wit — and most don’t.

        Language does not mould thought. Language is the protocol of thought. You read these words, thanks to any number of mutually-agreed upon standards. Twitter happens to have a 140-character limit. Curiously, Twitter in Japanese allows for wonderfully complete sentences. Several sentences. Each character can carry an entire word: never mind that Japanese requires two bytes for each character. Gets a bit interesting in Japanese, which requires kana modifiers to conjugate verbs. It’s still more efficient.
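
        That information-density point is easy to check: a per-character limit counts a kanji and a Latin letter the same, even though the kanji carries roughly a whole word and costs more bytes on the wire. A rough Python sketch (the example sentences are illustrative, not from this thread):

        ```python
        # Character count (what a per-character limit measures) versus
        # UTF-8 byte count, for a rough English/Japanese comparison.
        english = "The quick brown fox jumps over the lazy dog"
        japanese = "素早い茶色の狐がのろまな犬を飛び越える"  # roughly the same sentence

        chars_en, bytes_en = len(english), len(english.encode("utf-8"))
        chars_ja, bytes_ja = len(japanese), len(japanese.encode("utf-8"))

        # The Japanese version needs far fewer characters, so more meaning
        # fits under a character limit, even though each character costs
        # more bytes (3 bytes each here in UTF-8).
        assert chars_ja < chars_en
        assert bytes_ja > chars_ja
        print(f"English: {chars_en} chars, {bytes_en} bytes")
        print(f"Japanese: {chars_ja} chars, {bytes_ja} bytes")
        ```

        A limit denominated in characters, not bytes, is what makes the Japanese tweet so much roomier.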

        I don’t like Twitter. Don’t have a Twitter account. Wouldn’t use it if I had it. Don’t even read tweets if I can help it. But Twitter doesn’t encourage weak thinking. It’s become a podium for weak thinkers, bad writers and much catcalling. It’s a moron magnet. Digital graffiti. Bird shit on the sidewalk reads better than most of these idiotic tweets. Social media is a contradiction in terms. Either I’m writing for someone, as I am right now — for you and those who read this site regularly — or I’m writing bland spec at FK readability 10 for an indeterminate audience. Guess which one I prefer?

        Orwell’s prose is mighty fine. He also breaks a good many of his own rules set forth in his famous essay you’ve cited. Have a look through Animal Farm again, see how florid and sarcastic a good deal of it becomes, all those In Jokes about the early Communists, the Dickensian names. A child would understand Animal Farm at a superficial level, might get a bit confused about the rise of the pigs and their transmogrification into men. Language was the tool of the Communists, “Four legs good, two legs bad.” The pigs were never convinced by any of it. They used it to their own ends, leaving their well-meaning subject animals to sort out what it all meant.

        Clear and concise might be useful for writing spec. I highly recommend it. But it won’t do when trying to write about emotional or political issues. I like to write a precis for all my posts here. Often, I write them first, then wrangle with them as the post grows and sends out tendrils here and there. But a precis isn’t a post. Twitter isn’t modifying the language. If you really hate someone, give his child a drum. The child will love you, the parent will hate you — because he can’t take the drum away. That’s all Twitter is, a great host of idiots doing what idiots have always done, giving rise to the lies which get around the world six times before the truth can get out of bed and put its shoes on.Report

      • Burt Likko in reply to Jaybird says:

        I disagree. The way you express yourself does have an effect on the thoughts you express, reciprocal to the way in which your thoughts motivate your language. Do you express yourself in face-to-face conversation with peers at work or in social situations the way you express yourself in the combox with your peers here? I don’t. I tend to use longer sentences here. I tend to use more composed language here: I alliterate more, I pay greater attention to my subject-verb structure, I take longer pauses to consider what phrase to use next, I use longer paragraphs. Indeed, I’m a different writer here than I am in my legal writing, again because of constraints imposed on my use of the language when submitting a legal brief. I’m willing to bet that most people who write and comment here do so in ways different, perhaps subtly so but still different, than they do in other kinds of interactions.

        Is it fair to say that certain kinds of coding are better-suited for certain tasks than others? I don’t even pretend to know the particulars, but it seems a reasonable assumption to say that C++ might be better than Java for some things and Java might be better than Visual Basic for others. I’m willing to bet that a competent programmer can make any programming language do anything, but that certain languages are better than others for the job. Isn’t it easier to envision and plan the coding in one language than another? I’d expect that of all people, @blaisep, you are in a particularly good position to suggest why this kind of code is better than that kind of code if what you want to achieve is this result because it’s easier to think through how to get the desired result that way. The analogy seems obvious.
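
        The notation-shapes-planning point can be made concrete even within a single language. A toy Python sketch (the function names are purely illustrative): the same computation, written imperatively and declaratively, invites two different mental plans, one organized around mutation and sequencing, the other around the shape of the data.

        ```python
        # The same task, sum of the squares of the even numbers, in two idioms.
        # The imperative version foregrounds state and control flow; the
        # declarative version foregrounds the data transformation itself.

        def sum_even_squares_imperative(numbers):
            total = 0
            for n in numbers:
                if n % 2 == 0:
                    total += n * n
            return total

        def sum_even_squares_declarative(numbers):
            return sum(n * n for n in numbers if n % 2 == 0)

        data = [1, 2, 3, 4, 5, 6]
        assert sum_even_squares_imperative(data) == 56
        assert sum_even_squares_declarative(data) == 56
        ```

        Both compute the same value; which one you reach for first says something about how the notation has trained you to frame the problem.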

        So respectfully, I’ve read Orwell’s essay correctly: language does mold thought, and that’s why he railed so against lazy writing. His discussion of the language used to describe the remnants of imperialism staining Britain’s honor in the post-war years illustrates well not just lazy or hackneyed writing but the lazy and hackneyed thought that this sort of language created. Horrific reality is domesticated for pleasant consumption at the tea table:

        Things like the continuance of British rule in India, the Russian purges and deportations, the dropping of the atom bombs on Japan, can indeed be defended, but only by arguments which are too brutal for most people to face, and which do not square with the professed aims of the political parties. Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenseless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification. Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers. People are imprisoned for years without trial, or shot in the back of the neck or sent to die of scurvy in Arctic lumber camps: this is called elimination of unreliable elements. Such phraseology is needed if one wants to name things without calling up mental pictures of them.

        The ambiguity Orwell left to the reader’s supposition was whether this was the result of intent or negligence, although my read on his implication was that some political leader used that language intentionally and all the rest of the writers simply followed along, because it was easier and more pleasant than thinking clearly for themselves. Orwell calls for clarity of language so as to induce clarity of thought.Report

      • BlaiseP in reply to Jaybird says:

        Let’s stipulate to two things: we both write for our respective audiences and we are obliged to confine our utterances to these contexts. I say it’s the audience which determines how we ought to write. You say thought moulds itself into a mode of expression, thus reducing politics to shibboleths and simplistic phrase mongering. This was preceded by Socrates’ complaint in Phaedrus about writing, which I have always felt was tongue in cheek.

        Do you remember the context of that conversation in Phaedrus? It’s between Thoth, the god of writing and Thamos, the king, who says:

        you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

        The whole of Phaedrus is about how Lysias is so eloquent he can convince people that black is white and non-lovers are preferable to lovers — and how poor Phaedrus can’t remember just what Lysias said. This, despite Socrates’ cracks about how Lysias had repeated the whole thing several times and Phaedrus ended up reading the book, looking at the parts he most wanted to see.

        But the Egyptians put great stock in texts as magical spells. Words had power to them. Were we to ask an Egyptian of Thamos’ day, he’d agree with you: thought is moulded into expression. Egyptian art and typography didn’t change for a thousand years. They were the words of the gods, not to be trifled with. The language changed but the orthography didn’t. That’s why it didn’t survive. It fossilised.

        I’ve always joked that the C language is just a good macro assembler, that is to say, a short step up from machine instructions and data they operate on. Ultimately, no matter which toolchain and language is used, all code is reduced to instructions and data. Java routinely deprecates functionality, replaces it with other, sounder application programming interfaces. Java, VB, all these languages develop frameworks around them. Periodically, these frameworks are discarded and others replace them.

        The same cannot be said of human language. Humans are constantly screwing around with words and their meanings. This is why I hate Political Correctness with such intensity: I might agree with its goals but despise its sleazy trifling with meanings, Orwell’s phrasing: “euphemism, question-begging and sheer cloudy vagueness”. Yours is a craft of words with actual meanings.

        I won’t stand for misuse of language, any language. Words mean things. I might stipulate to language moulding thought — limiting “language” to the English language and, say, the Japanese language. We are confined to the toolkits our language has given us, toolkits we didn’t invent but we’re modifying and handing on to others, as surely as the storytellers passed on their craft. These do shape thought. But Twitter shaping thought? Twitter is a fine tool for simple people writing simple things for other simple people.Report

      • Michael Drew in reply to Jaybird says:

        @burt-likko

        I’m flat-out not in intent or effect resisting or refuting Orwell’s thesis. Language molds thought. Stipulated. I’m questioning whether you actually have a handle, at all, about exactly how that is happening on Twitter, based on what look to me like armchair observations and reliance on one very venerable and formidable but ultimately unscientific thesis offered by a journalist sixty years ago.Report

      • Stillwater in reply to Jaybird says:

        Blaise wrote: Language does not mould thought.

        Burt wrote: So respectfully, I’ve read Orwell’s essay correctly: language does mold thought…

        FWIW, I think Orwell’s message is somewhere in between: that the misuse of language coupled with deference to power and the powerful can corrupt thought, and in some interesting cases in fact does so.Report

      • Chris in reply to Jaybird says:

        Language definitely moulds thought, though it is not the only factor. This is an empirical fact (and there is a ton of research over the last 60 years to back me up). It does so in many ways.

        It’s also certainly the case that repeatedly thinking in a certain way (e.g., in order to express one’s thoughts in 140 characters or less) affects one’s thinking, but I’m not sure it does so in a way that limits other ways of thinking. That’s an interesting empirical question. Maybe I’ll do some research.Report

      • Stillwater in reply to Jaybird says:

        but I’m not sure it does so in a way that limits other ways of thinking.

        If I had a dog in this debate that’d be it. It might, but I’m not convinced that it does.

        (Unless we view the mere existence of those types of communication as some sort of corruption or deviation or “molding”.)Report

      • BlaiseP in reply to Jaybird says:

        In fairness to Brother Likko’s point, and Orwell’s, by proxy, let’s suppose we sent a particularly offensive racist to a re-education by labour camp, in the classic mode of the Chinese laojiao. It’s what they do with petty crooks, political dissidents and the like. Force feed them words, no trial, the incident which sent them there isn’t even treated as much of a crime.

        The laojiao works. People don’t say things to irritate their masters after a stint in one.

        Have we really cured the racist? Of course we haven’t, any more than the laojiao did anything but create a new form of crypto-Chinese, what the Irish call the cant, thieves’ slang. The racist would come up with new terms, as kids do in school when told not to say a word.

        Twitter is giving us new terms. But I don’t see how anyone’s yet demonstrated how it’s changing how people think.Report

      • Glyph in reply to Jaybird says:

        It’s also certainly the case that repeatedly thinking in a certain way (e.g., in order to express one’s thoughts in 140 characters or less) affects one’s thinking, but I’m not sure it does so in a way that limits other ways of thinking. That’s an interesting empirical question. Maybe I’ll do some research.

        Check and see if the Japanese suddenly forgot how to express ideas in longer or more nuanced ways once haiku became fashionable. 😉Report

      • Cascadian in reply to Jaybird says:

        Are people really expecting War and Peace in 140 characters? I use twitter exclusively to keep track of ski racers and their limited dialog. Sometimes the tweets refer to longer blog entries. Usually, it’s “check out the angles I got training in NZ.” I don’t think twitter is supposed to replace long form literature.Report

      • Mike Schilling in reply to Jaybird says:

        Are people really expecting War and Peace in 140 characters?

        Lots of Russians go to a party. Napoleon invades. Pierre grows up. #classicReport

      • Cascadian in reply to Jaybird says:

        MikeS Well done.Report

      • BlaiseP in reply to Jaybird says:

        “OMG Pierre B tied a cop to a bear!”Report

      • Burt Likko in reply to Jaybird says:

        Armchair observations and debates? That’s what a blog is, @michael-drew ! I’m not soliciting peer review for a quantified social science thesis which I’m publishing in a bid for tenure on a university’s faculty. I appreciate your saying “I have had experiences that differ from your observations.”

        If your wish is to elicit from me a concession that I have cited no academic studies as evidence, then you may consider that concession made, concurrent with a request for a similar concession back, for whatever that concession is worth. I see no academic studies cited or referenced on your side of the exchange, either. It’s no more or less scientifically illuminating in a (hopefully still friendly) exchange of disagreeing opinions for you to say that “These three guys are good thinkers and they have great twitter feeds” than it is for me to say “Most people are sloppy thinkers and their tweets are mostly crap.” Both can be true, and both likely are true.Report

      • Burt Likko in reply to Jaybird says:

        @mike-schilling Very impressive. Sure to be plagiarized the next time someone wants to rant about Twitter.Report

      • Michael Drew in reply to Jaybird says:

        I’m looking for more than concession that you haven’t published scientific research about this. I’m looking for you to concede that knowing what Orwell said about language and thought and being on Twitter for a while isn’t enough for you to have figured out how Twitter is affecting people’s thinking. And I’m looking for a clarification from you that having that said to you is not denial of or resistance to or refutation of Orwell’s basic claim that language molds thought. Orwell doesn’t resolve any questions about how Twitter might be affecting thought; he just observes that it must, given how it structures language.Report

      • Michael Drew in reply to Jaybird says:

        As to my not citing, my main point in this has always been, I don’t think you really know what’s going on – and I’m certain I don’t know. I hope I don’t have to cite for that. My objection is to your heavy reliance on authority to cram absolute claims that no one is really denying down my throat in an attempt to dispel my raising of doubt about your thesis. What you’ve been up to is really, “You wouldn’t cross ORWELL, would you? Well, ORWELL said language molds thought. Therefore Twitter must be doing to thought what I’m saying it does.” But Orwell didn’t, and couldn’t have, provided any real insight about that. He said what he said, and it didn’t speak to the specifics of how Twitter affects people’s thinking.Report

      • Michael Drew in reply to Jaybird says:

        And, yes, still friendly. Though that is how you are currently using Orwell.Report

      • Burt Likko in reply to Jaybird says:

        Orwell doesn’t resolve any questions about how Twitter might be affecting thought; he just observes that it must, given how it structures language.

        Ipse dixit: Twitter necessarily affects thought. If Orwell’s postulate about the reciprocal relationship between language and thought is valid, then use of Twitter must, in some way, affect the thought of its users, as you’ve just indicated. So we know that a generalized, incremental intellectual change is underway.

        Whether that change is net beneficial, net detrimental, or lateral is necessarily going to be a matter of opinion informed largely by personal experience and observation, at least until someone can point to some sort of quantification arrived at through a rigorous survey process, which to my knowledge has not yet been attempted by anyone anywhere — in part because the qualities we are discussing are not readily quantifiable. Absent quantifiable data, we’ve little else other than theory (as exemplified by Orwell) and experience upon which to form opinions.

        My experience and observations have been that Twitter erodes thought by abbreviating it, and I opine based thereon that it is a net detriment. Your experiences have apparently been different than mine, so you have formed a contrary opinion, or in the alternative have refrained from offering an opinion at all. An observation that we’ve had differing experiences and therefore have differing opinions seems to me a fine endpoint to the exchange.

        So I’ll not concede as you request, @michael-drew , though you may accuse me of making an insufficiently supported claim should you wish to do so (and I understand you to have made that accusation already). As for me, I pronounce this horse well and truly dead, and shall therefore cease beating upon it.Report

      • Cascadian in reply to Jaybird says:

        Burt, I guess the right answer is to say that Orwell was right in one instance but that he’s not really the authority on language games. Because his notion of language was too narrow, we can’t really trust his analysis any further than a handful of situations. Is brevity detrimental? It really depends on the game (note the comment introducing haiku somewhere in this thread).Report

      • Stillwater in reply to Jaybird says:

        Lots of Russians go to a party. Napoleon invades. Pierre grows up. #classic

        Awesome.Report

      • Michael Drew in reply to Jaybird says:

        @burt-likko
        So you will concede that you have only your own limited interaction with the service (I also consider mine limited, and I am fairly certain it is more extensive than yours) and the thoughts of Orwell to go on, and that conclusions and opinions going from there are speculative in the absence of systematic research into the question (I think you have conceded that), and you note that I have had considerably different experience with the service than you have, but you will not concede that you may not have figured out how Twitter is affecting people’s thinking (that is what I asked), that perhaps you have merely formed guesses and opinions about it. Glad we’re clear.Report

      • Michael Drew in reply to Jaybird says:

        Ipse dixit: Twitter necessarily affects thought. If Orwell’s postulate about the reciprocal relationship between language and thought is valid, then use of Twitter must, in some way, affect the thought of its users, as you’ve just indicated. So we know that a generalized, incremental intellectual change is underway.

        Even this is a leap. We really don’t know anything more from Orwell than that, because Twitter is a new format for language, it is likely changing thought in some way. It may not be generalized; it may be highly variable. It may not be incremental; it may be discontinuous. I get the feeling that you are using these words to mask the reality that you simply know no more about what Twitter is doing than what Orwell’s postulate would tell us about any change in the language or its use: that it’s likely changing thought to some degree, in some way.

        By “generalized” and “incremental” I think you just mean that, beyond knowing that Twitter must be changing thought at least a little because Orwell said so (and I agree with that), we (including you) don’t know a damn thing about that change. “Generalized” means that there’s nothing we can say about it qualitatively right now that we want to commit to, and “incremental” means we don’t want to commit to any claim that it’s a big enough change to be measurable right now, or about when it will become measurable.

        Twitter affects thought because all innovations in language must, by Orwell’s claim, affect thought. Ipse dixit. That gives us precisely zero reason to think we know anything specific about how Twitter is affecting thought as compared to the way any other change in language has affected thought over the years. Without further evidence, greater alarm over the effects of Twitter on thought, compared to the effects of previous changes in language on thought, based on Orwell’s observation and a non-systematic observation and analysis of Twitter, is just selective, and, beyond latitude for normal reasonable speculation and hypothesis-forming, baseless. You can’t say that you’re doing reasonable speculation and hypothesis-forming, and not concede that you haven’t necessarily figured out how the thing you’re speculating about works.Report

  4. Stillwater says:

    First off Ethan, this is top-notch prose. Really. You’re a helluva good writer.

    Next, you wrote

    Franzen accuses modernity of having submerged America in a sea of techno-capitalist inspired, existential “restlessness,” the only escape from which is distraction and forgetfulness.

    I’ll echo ND above in saying that the type of worry Franzen is expressing here can be broadly labeled “conservative” but is shared or felt by some liberals as well. I know in my own life I’ve often identified as a Luddite, not because I refuse to use or embrace or accept technological change but because I quite often do so only reluctantly, recognizing that while each innovation adds to productivity and efficiency it also destroys many of the cultural paradigms and economic-lifestyle choices people accept and identify with. This is one area Will Truman and I seem to widely agree about, even tho we both self-identify (loosely, I think, in each case) with different political parties.

    But it’s a worry without a resolution, it seems to me. Just something to note and maybe lament.Report

    • NewDealer in reply to Stillwater says:

      I think it is felt by a lot of liberals and is probably one of the firmer bases for the neo-liberal v. progressive split.

      Neo-liberals like Matt Y seem to be all about “markets yay” and “technological advancement yay.” They do not seem to put much thought into the negative externalities of technological advancement and the increasingly jobless society. Matt Y is simply too invested in technological advancement and geeky engineering to consider the human aspect. He also has a bit too much of Star Trek in him.

      I’m not opposed to technological progress but just think it should always be coupled with a real and serious discussion on how to mitigate the damage it does to people economically. There is a point in the fact that Kodak employed over 100,000 people at their height but instagram had less than 40, possibly less than 30 when they sold to facebook.

      We seem somewhat incapable of this though. It generally seems to be full-speed ahead and damn the torpedoes and creating a narrative that people who do this are brave and free. I loathe the word disruption.Report

      • Ethan Gach in reply to NewDealer says:

        Matt Y.’s answer to what modern markets have wrought is basically: de-regulate and subsidize housing, and let all the displaced workers build stuff.Report

      • greginak in reply to NewDealer says:

        I agree, although one reason I hold on to being a bit of a luddite is not just the negative effects of tech but also that quite often new fancy tech doesn’t actually do anything much new or much better. I’d love to see real and ubiquitous 3-D printing since that could really create many new possibilities. The newest computer processor or social network or highest-def screen is at most a very minor tweak to my life, or more likely wouldn’t even be noticed. I mistrust the ad people and marketers more than the tech.Report

      • dhex in reply to NewDealer says:

        “There is a point in the fact that Kodak employed over 100,000 people at their height but instagram had less than 40, possibly less than 30 when they sold to facebook.”

        only in the sense that back when every man wore a suit, even to work in a factory, there were a lot more employed tailors than there are now.

        i mean, i see this pop up a lot and i’m not sure what it’s saying other than “two different industries doing two wildly different things founded in different centuries have two different results when it comes to workforce size” which is perhaps less sage than the nugget intends to be.

        it’s bizarrely conservative, a sort of liberal “remember when women cooked and minorities called you ‘sir’?” thing. back when everyone had three jobs in every chicken and pot in every american made pair of high tops.Report

      • Jaybird in reply to NewDealer says:

        Conservatives want to live in the 1950’s. Liberals just want to work there.Report

      • BlaiseP in reply to NewDealer says:

        No, we don’t want to work in the 1950s. Neckties, Jaybird. Suits. Everyone had to wear a suit. The clothes were awful. The shoes were worse.Report

      • Mike Schilling in reply to NewDealer says:

        I would love to go back to the 20s and see some baseball, but not in a suit-and-tie in New York or Chicago during the summer (this was before night games.)Report

    • NewDealer in reply to Stillwater says:

      Though I would say that I don’t go as far as Rod Dreher or my far-left friends who seem to think of the ideal society as a commune.

      I’m not a ruralist or a pastoralist. I don’t fear big things and often love them. My farther-left friends seem to dislike big cities as much as Dreher.Report

      • Stillwater in reply to NewDealer says:

        No, it seems to me communes aren’t sustainable unless everyone is monetarily broke and has no aspirations to acquire material stuff. The only ones with any longevity I’m familiar with (anecdotally!) are groups of primarily young people, often students, living on land owned outright by a benefactor who permits (encourages, really) people to live there and garden and whatnot.

        That’s Boulder for ya.Report

      • greginak in reply to NewDealer says:

        Living a pastoral life often requires some sort of city in the distance where you can go to stock up at Sam’s Club/Costco, get spare parts for machinery, and have a large number of people around you can sell stuff to. People in rural areas who hate cities end up benefiting from the city and using it for the needs they can’t meet. I’ve known people who tried living in the Bush completely on their own. Very few do it. If they do, they either live a close-to-stone-age life for a few years until giving it up, or they work in cities or the oil patch to make lots of money to afford their natural life.Report

      • NewDealer in reply to NewDealer says:

        Stillwater,

        There are kibbutzes, but those are dying, and they were weird because they were also strongly involved with nation building and creating a new identity in ways that other communes are not.

        Otherwise I agree.Report

      • LeeEsq in reply to NewDealer says:

        ND, leave Israeli history to me. I know more about it. A lot of kibbutzes ventured into industrialization in the 1960s as a way to supplement their agricultural income. Not exactly pastoral. Even the agricultural kibbutzes weren’t pastoral places. Many of them were very enthusiastic about the latest techniques. During the 1980s and 1990s a lot of them privatized, meaning they became less communal and more like regular villages.

        The kibbutz isn’t even the main form of rural municipality in Israel. Since Independence, the main form was the moshav, the cooperative village, rather than the communal kibbutz.Report

      • Glyph in reply to NewDealer says:

        I’ll form a commune with you guys, but nobody better touch my stuff.Report

      • Jaybird in reply to NewDealer says:

        Lighten up, Francis.Report

      • BlaiseP in reply to NewDealer says:

        “Communism will never work. People want to own stuff.” – FZappaReport

    • BlaiseP in reply to Stillwater says:

      Which cultural paradigms has technology destroyed? Handcrafted this-‘n-that has been doing very well these days, aided and abetted by the likes of Etsy and eBay, even Amazon. It’s so trivial to set up an online store now, I could point anyone desirous of selling their artful goodness to a half-dozen good sites immediately. I know a gay couple who make a rather nice living, just selling antiques they find around St Louis area on eBay. Doin’ what they love, saving wonderfully fine things from oblivion.

      It’s done great things for the music business, removing many barriers to entry. It does mean artists have to tour a lot more, but the money’s pretty good there, too.

      Technology isn’t all about productivity and efficiency. It’s proven a mighty conduit to artistic and intellectual freedom. I don’t like Facebook, refuse to use it. Think it’s creepy how much they want to know about you. Got it to stay in touch with my kids, now it sends me false emails about how I’ve been tagged in this ‘n that picture, pokes I never got. Some tech is intrusive. It’s also ubiquitous. I’m ready to get rid of Facebook entirely.

      There is a tendency to stare into the Void of our viewscreens these days. But what are we looking at, precisely? How much has changed in terms of what we want from life? If tech has given us Akihabara, the geek’s paradise, it’s also given the New Luddites cause for celebration: handcrafted raku pottery at great prices.Report

      • NewDealer in reply to BlaiseP says:

        Decent employment and a mass middle class.Report

      • North in reply to BlaiseP says:

        @NewDealer: Dial up a new world war and a new anti-market ideology to devour a large portion of the world and we can get back to those employment and economic metrics you’re pining for here (so long as neither the war nor the ideology happen here mind). Otherwise we need to figure something else out.Report

    • Chris in reply to Stillwater says:

      I meant to second Still’s praise for the writing in the OP. But I forgot, because I’m an ass like that. So consider it seconded now.Report

  5. Jim Heffman says:

    Perhaps it’s actually a brilliant art piece, a modern rewriting of Catch-22. “Modern people are vapid and stupid! And one of the ways you know they’re vapid and stupid is how they get all offended when you call them vapid and stupid!”Report

  6. Jim Heffman says:

    ” Does Facebook become a tool for meeting up with people or a substitute for doing so?”

    Yes, of course it does. Instead of seeing some people once every couple of years, when I get the opportunity to make a cross-country trip at a time when they’ll be at home and available to hang out, I now hear from them every day.

    The issue is not that Facebook ruins human relationships; the problem is that people like Franzen can’t conceive of human relationships existing beyond the length of their arm.Report

    • Chris in reply to Jim Heffman says:

      But ask yourself how this changes the relationship. When I used to see my old friends from Tennessee once, maybe twice a year, our conversations sitting around a table all night, drinking beer and talking about our lives, were often quite intense, therapeutic even. Many of the troubles of our lives came flowing out: it’s not working with my wife or girlfriend, things have gotten difficult with my parents because of my choices, my job is going nowhere, and so on, along with the good stuff as well: I’m really in love, I can’t believe I’ve found someone like this, this new job is a God-send, had a great vacation, etc.

      Now almost all of our communication consists of Facebook updates about our vacations, or the silly things our kids say, or the crazy shit that happened in the grocery store checkout line. When we do see each other once or twice a year, all that superficial bullshit (and it is often, in fact, bullshit) gets in the way of real depth.

      I’m sure this isn’t true of everyone, and perhaps your relationships have gotten better, more intimate, than they were pre-Facebook and email and the like, but I’ve seen it enough to know that it’s not uncommon.Report

      • Kazzy in reply to Chris says:

        @chris

        The difficulty in comparing pre-Facebook relationships and post-Facebook relationships is that, for most of us, those two periods also represent vastly different periods in our lives. All of my pre-Facebook relationships occurred during high school and college. Post-Facebook involves college and a continued descent into adulthood. It is difficult for me to say how much of the shift in my relationships is due to growth and change and how much is due to Facebook.

        I suppose I could compare people whom I am connected with on Facebook and those who I am not, but even that won’t be perfect because the way we connect in real life still may or may not be impacted by FB.Report

      • Chris in reply to Chris says:

        Kazzy, that’s true, though for me, Facebook came about when I was well out of high school, and I and my friends were mostly relatively late adopters. On top of that, I haven’t seen this dynamic with my friends who don’t use Facebook at all, or, like me, don’t post very much on it (I mostly just look at my family’s pictures).Report

      • Jim Heffman in reply to Chris says:

        “ask yourself how this changes the relationship.”

        It allows it to exist.

        You imagine that relationships are some kind of deep soulful heartfelt staring-into-the-eyes communion between two long-distant souls. My experience is that if I only see someone once or twice a year, we’re basically strangers.

        “When we do see each other once or twice a year, all that superficial bullshit (and it is often, in fact, bullshit) gets in the way of real depth. ”

        See, that’s not a problem with Facebook, that’s a problem with you being unable to handle bullshit. If you saw these people every day there would be the same amount of bullshit.Report

      • Chris in reply to Chris says:

        Jim, as someone who’s spent most of his adult life around psychologists, I am an expert at handling bullshit.

        But if relationships just become that?

        Anyway, you didn’t really address what I said, just went snarky. I’ll formulate a response when you have something to respond to.Report

      • Will Truman in reply to Chris says:

        FWIW, @chris , I have found that Facebook actually helps get past the BS. I keep in touch with people on Facebook and then when I get home, I am already caught up on what’s been going on in their life and we can talk about more interesting stuff.

        What I mean is, I know my friend changed jobs from a big law firm to a small one. I can ask more specific questions (“Has there been a culture shock?”) or I can bypass it entirely since I already know that he still has a job, he seems to be liking it, and so on.

        (Note, I don’t really have time to follow threads generally today, so if you respond, would you mind doing the @ thing?)

        Or maybe that’s still in the realm of stuff you are complaining about? If so, I’m trying to think of what conversations you would consider “quality” ones and how Facebook detracts from it.Report

      • Jaybird in reply to Chris says:

        I have more facebook friends than friend friends. My assumption is that facebook is what extroverts are like.Report

      • Chris in reply to Chris says:

        @will-truman I’m mostly speaking of the people you don’t see very often, but you always make sure to spend some time with when they’re in town, or you’re back in town, or whatever.

        You’ve probably read about the research suggesting that people who post a lot on Facebook, and who almost always post positive stuff on Facebook, tend to misrepresent their lives, with the result being that everyone thinks that other people have more perfect lives than they do. I basically mean that dynamic, which isn’t unique to Facebook of course, but which is magnified by it, gets in the way of more personal relationships, because once you’ve made it look like everything’s hunky-dory, it’s difficult to have conversations about reality.

        Like I said, I don’t think this is true of everyone, but I’ve seen it in my own life and the lives of others enough to know that it’s a thing. Facebook encourages a fairly shallow level of engagement, perhaps beyond what we would have with many of the people we would otherwise not even keep in touch with without Facebook (they might not even know where I am now, much less that today it’s kind of hot in Austin), but below the level that we might have with some of them.Report

      • Chris in reply to Chris says:

        I can imagine this scenario happening more than occasionally across the universe:

        A bunch of people who have a bunch of Facebook friends, many of whom even live in their towns, spend their Friday evenings by themselves, for lack of anyone to hang out with, but post on Facebook that they had an awesome Friday night out. The reason they lack anyone to hang out with? They all think that the others are already having awesome times out on Friday night.Report

      • Kim in reply to Chris says:

        Am I the only person here not on facebook?
        Such a luddite!

        Categories of friends:
        1) People you wouldn’t screw over without a damn good reason.
        2) People you’d break the law for.
        3) People who you’d compromise your morals for.Report

      • zic in reply to Chris says:

        On Facebook:

        I’ve been doing an art project based on my facebook feed. I’m a dye artist, which means I hand dye yarn/fleece, etc. for other projects, and this includes several techniques called painting, where you combine colors to make interesting color-changing fibers.

        I started curating beautifully colored photos people were posting on a single page to use as the basis of dye projects. I made the page public, but I did not promote it in any way, other than to invite Facebook friends I actually work with when I dye to the page. I’d occasionally post a picture of the fiber that we dyed based on the original photo, alongside the photo.

        There are people from all over the world posting there now, sharing pictures that inspire them, searching through the pictures, and occasionally posting a photo of their own results.Report

      • Will Truman in reply to Chris says:

        @chris Thanks for the clarification. That’s not really my experience at all. Some of this I think is due to the fact that my Facebook cohort is out of the ordinary. I will say things like “The annoying thing about Facebook is how many people do nothing but talk about how crappy their lives are” and people look at me like they must have misheard what I’ve said. So I recognize that part’s unusual. Even so, I do see people coordinating events over Facebook. “I’m going to the hockey game Friday. Anyone want to come with?” I’m not sure how unusual that part is.Report

      • roger in reply to Chris says:

        Kim,

        I have considered writing a guest post on the amazing family missing the Facebook gene. My wife and I just fail to get it. That might be expected, as we are in our fifties. My wife even took a class on it to try to grok what the hype is about. Her finding is that the more she learns about it, the more she agrees it is a waste of time spent on where people ate or how cute their cat is.

        More unusually, our kids don’t get Facebook either. They both see it as the equivalent of the useless personal stuff we share with casual acquaintances before the real discussions start.

        “What’s new?”
        “How is the weather out there?”
        “How’s yer momma and them?”
        “How is Fluffy doing?”

        They must have gotten our bad genes.Report

  7. Jim Heffman says:

    “how many of them have devoted a #slatepitch or Wired cover story to what that homeless person thinks of making faster Internet speeds the cornerstone of the Great Society 2.0?”

    Well, I remember how all those people at Occupy Wall Street rallies had smartphones. And how people critical of the movement said “how bad off can you be if you can afford a smartphone?” And the response was, usually, “smartphones mean internet connectivity, and that’s just something that people have to have these days.”

    Meaning, yes, it does matter to a homeless person that faster internet speeds are part of The Great Society 2.0.Report

  8. Burt Likko says:

    What I’m having some issue with is the word “modernity.” By this, we appear to mean a world filled with highly advanced electronic gadgets. Which I suppose is a fine definition. But there are other, equally fine definitions.

    “Modernity” to me signifies not so much the advent of technology as the advent of a particular mode of thought. The technology at which the underlying essay looks so cockeyed is a result of thinking in a scientific way, which in turn is a result of thinking in an individualistic way, something that was hinted at during the Italian Renaissance, effloresced in the mid-to-late seventeenth century in Europe, and sort of spread about the entire world from there. The individual is capable of reason, and thus of understanding the world, and thus of manipulating it to be more to his liking. This is a break from a mode of thought in which the world, both as a physical and a social system, is a given, and largely immalleable. Modernity is dynamic where pre-modernity was static.

    So I rather like modernity that way. Modernity is change. Modernity is individualistic. Modernity is progress. To me, this does not necessarily imply a world analogous to the floor of a Las Vegas casino, filled with seductive sights, sudden sounds, and soothing soma.Report

  9. James K says:

    Having read Franzen’s article, I’m unimpressed. He seems to have written it with his nostalgia goggles on. The idea that there ever was a period when people weren’t constantly distracted while problems occurred unabated around them doesn’t hold up. The term “bread and circuses” was not coined to describe current events.

    The real reason why governments do nothing about climate change is that it would require voters to make significant sacrifices to their standard of living, at least in the short term, and any government that tries to make that happen will be voted out very quickly. And so governments take one of two paths to avoiding the problem: deny the problem exists, or pretend to do something about it but make your solution so watered down it doesn’t actually do anything.

    As for inequality, maybe the reason people aren’t that exercised about it is that most people actually don’t care that much? People compare themselves to their neighbours, not distant one-percenters.Report

    • Michael Drew in reply to James K says:

      …Not just sacrifices, but sacrifices for something that offers no immediate or medium-term tangible benefit: literally a non-thing. Status quo itself is the goal of climate-change mitigation. Voters can get that by just not thinking about the future.Report

    • Jim Heffman in reply to James K says:

      “The real reason why governments do nothing about climate change is because it would require voters to make significant sacrifices to their standard of living, at least in the short term”

      The real reason is that the things we’re supposed to do about climate change are the same things that people have been saying we should do since the Puritans ragequit England: Shun all pleasures, live a life of strict moderation, keep to yourself and don’t show off, maintain a sense of spirituality and the divine in all things, always put yourself last.

      When there was a problem with leaded gasoline, the solution was not to ban gasoline.Report

    • roger in reply to James K says:

      “The real reason why governments do nothing about climate change is because it would require voters to make significant sacrifices to their standard of living.”

      That and the collective action problems. Let’s not forget these.

      The other issue of course is that we are not really dealing with a magic wand. Every real world solution to real world problems as complex as this has costs and benefits. Pros and cons. And people with differing values and contexts legitimately disagree with each other on which course is best, and some even argue nothing is better than doing something which causes net harm (in their biased opinion).Report

      • Kim in reply to roger says:

        The disparity in costs is about as horrid as the disparity in benefits.
        We’re talking abandoning American cities (we’ve already stopped with housing insurance).

        Pick where you live very very carefully.Report

    • North in reply to James K says:

      Ditto the Kiwi here. I read Franzen and then read Ethan and, to my own mild shame, the first thing I thought was “quintessential first-world-problem article,” and then felt bad that that meme seemed to sum up the whole thing neatly.Report

  10. Shazbot8 says:

    My impotence is not caused by my being distracted.Report

  11. Kim says:

    Chris:
    Language molds thought.

    … if there were people whose thoughts weren’t molded by
    language, we’d probably consider them defective.Report