@TheAlexKnapp on Religion & Science

Will Truman

Will Truman is the Editor-in-Chief of Ordinary Times. He is also on Twitter.


87 Responses

  1. Kazzy says:

    THAT is how Twitter works? Ugh!

  2. Stillwater says:

    I think he’s responding to Ted Cruz saying this:

    “Today the global warming alarmists are the equivalent of the flat-earthers,” Cruz continued. “You know it used to be: ‘It is accepted scientific wisdom the Earth is flat.’ And this heretic named Galileo was branded a denier.”

    So he and all the other AGW deniers are a collective Galileo in their opposition to cultural mythology. No, that’s not right. Opposition to science-based consensus. No, that’s not right either. Hmmm. Confusing.

    Knapp is wrong, tho, when he says that the lone genius is a myth. It isn’t. Most big advances in scientific knowledge resulted from the lone genius offering an evidence-based theory with greater explanatory power than its predecessors until it’s accepted by a majority and becomes the new consensus.

    That Cruz thinks rejecting scientific consensus in the absence of a more powerful explanatory account amounts to an expression of genius is beyond my limited reasoning. But he’s the guy who went to Harvard, so who am I to argue with him?

    • trumwill in reply to Stillwater says:

      Alex wrote this before Cruz said that. I collected it and held off to see if he would say any more. Confident that he won’t soon, I went ahead and ran it.

    • Mike Schilling in reply to Stillwater says:

      Isaac Asimov explained this decades ago, though of course no one listened. Anyone who lives near the sea knows that the earth is round. The way that ships approaching from far away become visible, masts before hull, would make no sense on a plane — it’s true because the ship is coming towards you around the curve of the earth. Columbus wasn’t doubted because people thought he would fall off the edge, but because they thought he had wildly underestimated the length of the western route to Asia. Which, since he thought he’d gotten there when he was still 13,000 miles short of it, he had.
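[Ed. note: the masts-before-hull point can be sanity-checked with a back-of-the-envelope horizon calculation. A minimal Python sketch, assuming a spherical Earth and the standard small-height approximation d ≈ √(2Rh); the function name `horizon_distance_m` is just for illustration:]

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def horizon_distance_m(height_m: float) -> float:
    """Approximate distance to the horizon for an eye (or a masthead)
    height_m above sea level, using d = sqrt(2 * R * h)."""
    return math.sqrt(2 * EARTH_RADIUS_M * height_m)

# An observer 2 m above the water sees roughly 5 km to the horizon,
# while a 20 m masthead stays visible out to roughly 16 km: an
# approaching ship's masts clear the horizon well before its hull.
print(round(horizon_distance_m(2) / 1000, 1))   # 5.0
print(round(horizon_distance_m(20) / 1000, 1))  # 16.0
```

On a flat plane, height would make no difference to when the ship comes into view, which is exactly the observation the comment describes.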

      • Kim in reply to Mike Schilling says:

        Near the Atlantic, specifically, and that only works if you sail away from the coasts, which the Romans really didn’t.

      • Near the Atlantic, specifically, and that only works if you sail away from the coasts, which the Romans really didn’t.

        The Romans knew the Earth was round anyway, mostly from contact with the Greeks who’d already figured it out. If you read Roman authors it’s referred to pretty much as a matter of course, even among rhetoricians or other writers who didn’t do much work in natural philosophy.

      • Glyph in reply to Mike Schilling says:

        I may be misremembering from long-ago Sunday School, but isn’t there evidence that even some of the oldest references in the Bible describe the earth as spherical (first, because the Hebrew word of the day for “circle” can also mean “sphere”; and secondly, in Job, which is one of the older books, the author appears to describe the earth’s terminator, dividing day from night, as a circle – which only makes sense if you are dealing with a sphere, not a flat surface?)

      • Oscar Gordon in reply to Mike Schilling says:

        @mike-schilling

        Of everyone here, and despite your criticism of his views, I knew you’d get the reference first.

      • Mike Schilling in reply to Mike Schilling says:

        I actually kind of like the guy. I just get annoyed with people who worship him. (Though I’ve always found that book in particular pretty boring.)

      • Oscar Gordon in reply to Mike Schilling says:

        Not my favorite of his books, but it was the first one of his I ever read as a boy and the character stuck with me (something about all the scars I have…).

      • Pinky in reply to Mike Schilling says:

        In the first question of the first part of the Summa Theologica (written by Thomas Aquinas in the 1200s), addressing the question of what means can point us to truth, he says:

        “Sciences are differentiated according to the various means through which knowledge is obtained. For the astronomer and the physicist both may prove the same conclusion: that the earth, for instance, is round: the astronomer by means of mathematics (i.e. abstracting from matter), but the physicist by means of matter itself. Hence there is no reason why those things which may be learned from philosophical science, so far as they can be known by natural reason, may not also be taught us by another science so far as they fall within revelation. Hence theology included in sacred doctrine differs in kind from that theology which is part of philosophy.”

        In 1300-and-something, in the Divine Comedy, Dante travels downward to Hell, gets disoriented by the shift in the direction of gravity when he passes through the center of the earth, and emerges on the opposite side of the world.

        I haven’t read a lot of medieval texts, but these are two of the most basic, and they treat the roundness of the earth as a given.

    • Alex Knapp in reply to Stillwater says:

      Most big advances in scientific knowledge resulted from the lone genius offering an evidence-based theory with greater explanatory power than its predecessors until it’s accepted by a majority and becomes the new consensus.

      This really isn’t the case. Most “lone geniuses” worked in collaboration with partners who are frequently forgotten. Their ideas were often influenced by the ideas and concepts of peers and predecessors. And some of the ideas we attribute to the “lone genius” were really interpretations and commentary that came from their followers and admirers through the years. (Aristotle is a big example of this – a lot of “Aristotle’s” ideas are really developments of his followers, commentators and interpreters.)

      History has a way of fixating on big names with big personalities and attributing things to them they were only partially or sometimes not at all responsible for. History also has a way of forgetting people who developed inventions or concepts independently in favor of the person who insisted on the credit the most. (Like poor Galileo Ferraris, who independently developed a lot of stuff that Tesla gets credit for – and did it before Tesla.)

      Even worse, history has a way of making people famous for stuff they really weren’t that good at. Galileo is noted for heliocentrism (for obvious reasons), but he really didn’t do much to prove or advance heliocentrism at all. He did much more interesting and better work in other areas of physics & astronomy.

      The fact is that when it comes to scientific discovery, unique insights are fairly few and far between. Far more often, once the knowledge available at the time leads to a distinct conclusion, lots of people will end up there. If Copernicus hadn’t published on heliocentrism, there were other astronomers working at the time whose ideas were already leading to it. Somebody else would have gotten there. Once we had alternating current, the induction motor was just a matter of time.

      Or as Mark Twain put it better, “The kernel, the soul — let us go further and say the substance, the bulk, the actual and valuable material of all human utterances — is plagiarism. For substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources, and daily used by the garnerer with a pride and satisfaction born of the superstition that he originated them; whereas there is not a rag of originality about them anywhere except the little discoloration they get from his mental and moral calibre and his temperament, and which is revealed in characteristics of phrasing. When a great orator makes a great speech you are listening to ten centuries and ten thousand men — but we call it his speech, and really some exceedingly small portion of it is his. But not enough to signify. It is merely a Waterloo. It is Wellington’s battle, in some degree, and we call it his; but there are others that contributed. It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a telephone or any other important thing — and the last man gets the credit and we forget the others. He added his little mite — that is all he did. These object lessons should teach us that ninety-nine parts of all things that proceed from the intellect are plagiarisms, pure and simple; and the lesson ought to make us modest. But nothing can do that.”

    • Alex Knapp in reply to Stillwater says:

      Cruz and Galileo actually have a surprising amount in common, personality-wise. Especially a talent for egotistical bomb-throwing when they could have accomplished their ends more ably by showing a touch of humility and empathy.

  3. Chris says:

    The named genius narrative is part of our understanding of just about every human institution, as his 2nd through 7th tweets demonstrate quite nicely. And while he’s right that it’s not how science, or most human endeavors for that matter, really works, it also is how science and many human endeavors work. Aristotle’s influence on both science and theology is not exaggerated, but neither is Newton’s, or Einstein’s, or Mach’s, or Darwin’s.

    I do think he’s right that the “skeptics” tend to have somewhat impoverished representations of just about everything, including their beloved science, though. Getting the history of science wrong is par for the course for them (remember the first episode of the new Cosmos?).

    • Mad Rocket Scientist in reply to Chris says:

      Thing is, with only a few exceptions, the lone genius has an insight/epiphany and does some preliminary work & that gets the ball rolling for a bunch of other people to contribute work.

      The genius just gets all the credit.

      • Stillwater in reply to Mad Rocket Scientist says:

        The “few exceptions” are what make the lone genius theory true.

        Sure, they didn’t do all the work. They saw further by standing on the shoulders of giants and all that. (They didn’t build it!)

    • Mike Schilling in reply to Chris says:

      Newton invented calculus when no one else could have. Except for Leibniz, anyway.

      • Oscar Gordon in reply to Mike Schilling says:

        Newton is most definitely one of those exceptions.

      • I think Newton was a genius. But most of his personal feuds came from his belief that he was THE genius. The contributions of Leibniz, Hooke, and others to Newton’s ideas (and of course vice versa!) are plain to anyone who honestly studies his work. It doesn’t diminish Newton’s genius to note the influences on his ideas. At least, it shouldn’t. But for some people it does I suppose.

        But does anyone really think – HONESTLY – that if it weren’t for Newton we wouldn’t have figured out universal gravitation in the 17th century? Once Kepler developed the laws of planetary motion, I think it was pretty much a done deal.
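[Ed. note: the “pretty much a done deal” claim has a textbook back-of-the-envelope version, sketched here under the simplifying assumption of circular orbits. Kepler’s third law plus the centripetal-force formula already force an inverse-square attraction:]

```latex
% Kepler's third law (circular orbit of radius r, period T,
% same constant k for every planet):  T^2 = k r^3.
% Centripetal force on a planet of mass m moving at v = 2\pi r / T:
\begin{equation*}
F = \frac{m v^2}{r}
  = \frac{m}{r}\left(\frac{2\pi r}{T}\right)^2
  = \frac{4\pi^2 m r}{T^2}
  = \frac{4\pi^2 m r}{k\, r^3}
  = \frac{4\pi^2}{k}\cdot\frac{m}{r^2}.
\end{equation*}
```

That is an inverse-square force. The remaining steps (identifying the constant with the mass of the central body, extending the result to elliptical orbits) were substantial, but the direction was set.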

      • Chris in reply to Mike Schilling says:

        And there’s Kepler.

        As another genius said, “chance favors the prepared mind.”

        Science works both ways: big discoveries by teams and individuals that shake it free of some temporary fetters, and the more obscure toil of many who not only lay the groundwork, but also move science forward by tiny increments.

        And to be fair, given that science was generally done by well off men (and very occasionally, women), members of various societies, often competing with each other, for much of its post-medieval history, the “lone genius” narrative makes more sense describing science into the 19th century than it does now.

      • Mike Schilling in reply to Mike Schilling says:

        @oscar-gordon

        E. C. for you to say.

      • Oscar Gordon in reply to Mike Schilling says:

        @alex-knapp

        That is pretty much what I think you were trying to get at in your… whatever that was. There are geniuses who move the ball forward in great leaps, and they get all the recognition, but the reality is that no one does it alone. Hell, many “geniuses” simply enjoyed the luck of just being published first, or more widely.

        But, unless you are in the field, the likes of Hooke, Leibniz, even Kepler never enter your knowledge. In popular myth, Newton doesn’t even get credit for his greatest work (stupid apple story plays well because of the Eden myth, but Calculus…?).

      • Chris in reply to Mike Schilling says:

        Perhaps an even stronger example, then, is Descartes, who is necessary for Newton (and Leibniz: his beef with Newton was as much over Cartesian mechanics as it was over who did what first), but who didn’t discover anything about geometry that wouldn’t have been discovered eventually. It’s not so much a matter of the lone genius discovering the undiscoverable, but the lone genius discovering something just hidden that might, without that mind, have been hidden for significantly longer.

      • Oscar Gordon in reply to Mike Schilling says:

        @chris

        And to be fair, given that science was generally done by well off men (and very occasionally, women), members of various societies, often competing with each other, for much of its post-medieval history, the “lone genius” narrative makes more sense describing science into the 19th century than it does now.

        This is massively relevant to the lone genius narrative. Science is just not done that way anymore, so while there are certainly scientific geniuses out there still today, we don’t hear about Dr. So-and-So’s breakthrough, but rather the breakthrough from a team at CERN.

      • Chris in reply to Mike Schilling says:

        Right. Even the scientific Nobels are, at this point, something of an anachronism given how collaborative contemporary science is.

        For those of you who are interested, Kevin Dunbar (among others) has done some really interesting work on the social and cognitive processes that are involved in scientific research and discovery. See, e.g.:

        http://www.cc.gatech.edu/classes/AY2013/cs7601_spring/papers/Dunbar.pdf

        Also interesting:

        http://onlinelibrary.wiley.com/doi/10.1111/j.1756-8765.2009.01036.x/full

        http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2346274

        https://books.google.com/books?hl=en&lr=&id=8fwWL8WVi64C&oi=fnd&pg=PR4&ots=19m-VP882B&sig=rYjYSLVTWRjSSgut0S5v2PQSMXo#v=onepage&q&f=false

        http://www.lrdc.pitt.edu/BOV/documents/Schunn_Paletz.pdf

      • Kim in reply to Mike Schilling says:

        Chris,
        yeah. It’s applied math that gets the lone geniuses, these days.
        (Igor’s math makes my head hurt…)
        [And that’s mostly because applied math is CHEAP. And there’s a lot of work to be done]

      • rexknobus in reply to Mike Schilling says:

        @mike-schilling

        O.k. But don’t make hobbit of it.

  4. ScarletNumber says:

    Numbers are sometimes scarlet,
    Violets are sometimes blue.
    The previous version of this comment violated the comments policy
    And you’ve been warned about this before, too.

    (BL)

  5. Burt Likko says:

    “Lone Genius discovers truth that turns world inside-out” is a much more dramatic story than “dozens of people collaborate and compete resulting in incremental improvements.”

    We shouldn’t be too surprised if the more dramatic story gets more traction even if it isn’t nearly as accurate.

    • Mike Schilling in reply to Burt Likko says:

      It was a trope of pre-WWII SF that a lone genius would discover a brand-new, limitless source of energy on Monday, use it to power a spaceship by Wednesday, and have the galaxy mostly conquered by Friday.

    • Kim in reply to Burt Likko says:

      It sounds way better than “Lone Genius discovers Obvious backdoor and convinces everyone that it would be a GOOD idea to run with it anyway…” (despite all the academics saying “but, but, hackable! easily hackable!”)

  6. Damon says:

    Thanks to the flying spaghetti monster that I don’t twit, twitter, or pay attention to it.

    *wanders back to the wilderness.*

  7. Jaybird says:

    Twitter is perfect for a bon mot.

    The second you need to do more than set it up so you can knock it down, you should write an essay.

    Truly, our society is decadent.

    On the upside, the kids who do stuff like this for serious are the ones that I’ll be competing against, so there’s that. Quick! Legalize pot some more!

    • Michael Drew in reply to Jaybird says:

      Why fixate on the medium? It does affect thought, but OTOH, what’s the big deal if a train of thought gets punctuated in this way because it leaves the station on Twitter? What is it about that that you feel supports you in posing as so superior?

      Twitter isn’t ideal for advancing essay-length thoughts, but it is far more interactive. I wouldn’t mind if tweets were 280 characters, but I don’t see what’s so juvenile or without substance about the medium or the platform. It’s a powerful utility for communication. People who pose as superior to it come off as simply self-regarding.

      As I always think to myself whenever this comes up, the vast majority of a Mike Schilling’s comments here, for example, are not significantly more extensive in word count or content than a tweet.

      • Jaybird in reply to Michael Drew says:

        The medium *IS* the message, Michael Drew.

      • Chris in reply to Michael Drew says:

        It’s basically what telegrams would have been if they’d been free up to 140 characters.

        Also, what do psychics have to do with anything? 😉

      • Michael Drew in reply to Michael Drew says:

        The medium is not the message, though there is insight in that bit from McLuhan. I’m not saying it doesn’t affect thought, but then, why not some diversity in medium/thought/message? What’s so valuable about an 8.5×11″ white paper with one-inch margins as a unit of “thought”?

        Go on Twitter; it’s evident that exchanges of thought of real value are occurring by the millions daily. This in a way that is basically new for its accessibility and fluidity in the last ten years. As @chris says, it’s like having a public telegram feed in your home. There is certainly lots of tripe, but it’s not like it invades your feed. You decide whose tweets you will see.

        Whatever, I guess some simply see something in it that I don’t, or for whatever reason don’t see what’s plainly there.

      • Chris in reply to Michael Drew says:

        Language is the house of being, and Twitter is like one of those really extreme tiny houses that are all the rage right now, but even acknowledging that, the medium certainly isn’t the message, it’s simply where the message lives. That is, Twitter definitely impacts the message, shapes it, influences how it interacts with its readers, but it isn’t itself the message.

        The message is still there, within Twitter; Twitter just plays a role in determining which parts open themselves to us and which parts withdraw.

      • Mike Schilling in reply to Michael Drew says:

        not significantly more extensive in word count or content than a tweet.

        #drewisdeadtome

  8. zic says:

    The Lone Genius myth walks hand-in-hand with the natural talent myth.

    Both eschew collaboration, effort, and mastery.

    But there may be an individual effort worth recognizing here: a knack for self-promotion that disregards all the toes stepped on via lack of proper credit.

  9. Francis says:

    Alex should be a patent examiner. The truly revolutionary ideas are very thin on the ground. Virtually all of the 8 million plus patents granted have (what’s called in the trade) prior art.

  10. Guy says:

    Eh. I’m not impressed by a humblebrag any more than a regular brag.