Morning Ed: United States {2016.06.28.T}

Will Truman

Will Truman is the Editor-in-Chief of Ordinary Times. He is also on Twitter.


152 Responses

  1. j r says:

    I do not know who Alan Cabal is, but he certainly manages to mix the best/worst of conservative moralizing and progressive hand-wringing and all in a way in which I am sincerely uncertain whether he means any of it.

    • LTL FTC in reply to j r says:

      It reads like he’s doing an impression of a cynical college dropout of about 20.

      “Everbody’s all like, ‘I’ve got mine.’ All hypocrites, man. Hey, you ever heard of a band called the Dead Kennedys?”

    • LeeEsq in reply to j r says:

      Agreed. You said it much more succinctly than I could.

    • veronica d in reply to j r says:

      The article was — well, just clueless.

      For example, this:

      We are sex-obsessed. Americans invented furry sex and mainstreamed BDSM, we are kinkier than a snake’s armpit. Our advertising would easily pass for pornography in a truly civilized society.

      According to my understanding, our advertising is pretty tame compared to much of Europe, who kinda balk at our desire-mixed-with-shame approach to sexuality.

      That said, I kinda wanna know more about this:

      Ballard captured it perfectly in his 1973 novel Crash. “The veronicas of our perversions…”

      Um… did someone say my name?

      • Don Zeko in reply to veronica d says:

        Sounds like a rock band to me. I would definitely go see Veronica and the Perversions in concert.

        • LeeEsq in reply to Don Zeko says:

          An adult version of Josie and the Pussycats.

          • J_A in reply to LeeEsq says:

            You just destroyed a big bunch of sweet childhood memories of that show.

            Josey and the Pussy-cats

            My eyes hurt

            Now, I should be going. I have to go cover the legs of the tables in my house. It’s indecent how they look now.

        • veronica d in reply to Don Zeko says:

          Heck I’d sing for them. A little switchy tranny slut like me, all in leather. Now that’s a band!

          (Sadly I can’t sing for shit.)

          (Cuz if I could, dammit I’d love to be center stage.)

          There was this Aussie band The Veronicas, who I guess played around with the cute-maybe-lesbian thing. Which, whatever. They hardly seem the height of perversion when the Genitorturers exist.

          • veronica d in reply to veronica d says:

            Plus there was this band Veronica Lipgloss, who I guess are pretty obscure, but they have one album up on Spotify. It’s pretty good gothy-dancey kinda stuff. I like it. Anyhow, if you want a band with “veronica” in its name (and who doesn’t), they’re a good choice.

          • Fortytwo in reply to veronica d says:

            A former girlfriend dragged me to a Genitorturers show one time. I left halfway through. That is some perverted scheiss.

            • veronica d in reply to Fortytwo says:

              Heh. I recall one show — let’s just say I participated. It involved fluids.

              Anyway, fun band. Not for the squeamish.

              That said, they seem like something from a different age. It’s like, they arose from the same zeitgeist that gave us Manson and shit. I dunno. Maybe it’s cuz I’m older. Maybe it’s cuz I’ve been to enough sex parties since that time that public kink just means less. Whatever. I’d still go to a show, for old times’ sake, but it’d be like looking through a dusty window pane.

              I’m thinking of that one Onion article where Marilyn Manson is going door to door in the suburbs trying to shock people. No one is shocked.

              The point being, being shocking has a shelf life. To be entertainingly non-shocking is clearly the greater value.

              Their music was passably good, if you like that sort of thing.

      • LeeEsq in reply to veronica d says:

        America might be more tame than Europe but we are wilder than many non-Western countries. That might be why assimilation goes easier in the United States.

      • DensityDuck in reply to veronica d says:

        “Um… did someone say my name?”

        He’s talking about bullfighting.

        • veronica d in reply to DensityDuck says:

          Well that’s boring.

          The cool thing about Veronica is, it is thought to be a romanization of the Greek name Berenice, which becomes Bernice. It means “bringer of victory.” (Bere-Nike, get it?)

          So anyhow, one strives to live up to their name.

          I just wanna be a rich shallow bitch with a best friend Betty — and tons of sublimated lesbian tension. Cuz obvi.

      • Chip Daniels in reply to veronica d says:

        “Why do they come here?”

        We are sex-obsessed. Americans invented furry sex and mainstreamed BDSM, we are kinkier than a snake’s armpit. Our advertising would easily pass for pornography in a truly civilized society.

        Question asked, and answered.

    • DensityDuck in reply to j r says:

      This is one of the many, many times in life where going beyond the link-drop and one-line snark provides a beneficial context. For example, I don’t think a writer who puts out something like “If you’re going to commit war crimes, at least be thorough about it. If war is the world’s only hygiene, we’re like some back alley abortionist who doesn’t wash his hands between patients or a truck stop bathroom with no toilet paper.” intends that we consider his thoughts wisely, dispassionately, and with a will to execute them exactly as worded.

  2. notme says:

    The final majority report of the Benghazi Select Committee is set to be released later Tuesday morning.

  3. Nopenopenopenopenopenopenopenopenope says:

    Do the Clintons ever get tired of murdering people?

  4. Buddy Ryan passes away. I’d say rest in peace, but he’d hate that.

  5. Chip Daniels says:

    So about those STEM vs Medieval Poetry grads:

    Business Majors need Liberal Arts

    If you want to succeed in business, don’t get an M.B.A. Study philosophy instead

    Both articles touch on the point we have discussed here: that behind a lot of the criticism of college students studying “frivolous” liberal arts is the myth that they are unrelated to earning a living, and that, in fact, liberal arts are essential for business.

    • Will Truman in reply to Chip Daniels says:

      I am modestly skeptical of business degrees (BBA/MBA) as such. Especially when it comes to the best and brightest who go to good schools. But I don’t sign on to the notion that liberal arts degrees are better in any inherent way. I think it depends on the kid and the liberal arts subject.

      • veronica d in reply to Will Truman says:

        I recall one of my first jobs as a software engineer. We were in a “brainstorming” session, and I was pontificating about some techno-mumbo-jumbo — and look! I’m sure I was “correct,” at least correct enough. But one of the business guys stopped me and asked, “Before we get lost in the technical problem, what is the business problem?”

        I realized I was actually kinda fuzzy on the “business problem.” I knew we had a product idea. I knew we wanted folks to pay us money for it. But I had no fucking clue how the market actually worked. I’d never sold a damn thing in my life.

        In fact, I seldom really know much about the markets I work in. Take my current area, the airlines. I know a lot of the “leaves” in the big forest of the airline industry, but if some keen airline investor were to ask me a question, I couldn’t tell him a single damn useful thing.

        I don’t know shit about business. I don’t know the first fucking thing.

        I’m often the “smartest person in the room” (although less at my current employer), at least in terms of raw cognitive power, but at a certain point I learned that that didn’t mean shit.

        Lesson: make sure you understand the business problem damn well before you start getting lost in the technical problem. Often the non-techies know the business problem very well.

        Blah blah blah. This is obvious — unless you’re so smart you’re stupid.

        Anyway my point is, I’m not sure how much a lib-arts degree will help in business. I haven’t the first clue. But I’m pretty sure the stuff I know is useless.

        On the other hand, if you need an engineering gal who knows her shit and ain’t gonna fuck up, hire me.


        (As an aside, I have a theory: business types lack the insight to really “get” what makes a gifted engineering type. But that cuts both ways. Often success is the marriage of these things. But I suspect the pairings are as much good luck as keen insight. You cannot tell for sure, so you throw the dice.)

        (In engineering, we have the phrase “decisions made on the golf course,” which means engineering decisions that are completely wrong-headed and will fail, made by folks who read “CIO magazine” and buy shit from “big names.”)

        (I think a good example is the story of the Romney campaign’s software, built by “big consultants” who chose shit like Oracle and Microsoft, compared to Obama’s, built by real nerds. The software failures of the first were not surprising to me.)

        • J_A in reply to veronica d says:

          My experience is sort of the reverse. I’m an engineer with an M.Sc. in engineering. I have been doing business around the utilities most of my professional life, rising all the way to the C-suite (CFO, COO, CEO). I don’t have an MBA.

          Business professionals want to talk about the Business Problem because they have no fishing clue about how the technical aspects work. They come with brilliant (so they say) ideas that cannot be implemented.

          I can see that developing software for massive sales is a completely different animal, where marketing and other commercial aspects are very relevant (who is our customer? What does our customer need? What does he want (not the same)?). But in my particular world the order in which questions must be answered is:

          – What can physically be done?
          – What can physically be done at a reasonable cost in a reasonable time?
          – Of the above, what is the best business option?

          • veronica d in reply to J_A says:

            @j_a — Oh I’ve definitely met business types who didn’t know shit. I’ve particularly met those who do not listen to their engineers. Those types do poorly.

            I guess it depends. If they’re in a stable business with “well understood” problems, they can probably coast on bullshit. Like, I know a couple guys who work pretty senior IT positions at one of the big huge international banks. (Name three or four big huge international banks. You named where they work.) Anyway yeah, the stories you hear.

            But smaller outfits, especially those that are “new ideas” focused — I think success is gonna come when you have “folks who get the market deeply” paired up with “folks who know the math deeply.”

            Plus luck!

            I suppose it is possible for one person to do both. Sure. Why not. But still, don’t send me in to make a business deal. That would be an error.

            But yeah, everyone needs to know their role and trust that the others know theirs. If I start telling you about real limitations, well you either trust me or you do not. If you do not, well we’re in for a world of shit. “That won’t work” means THAT WON’T FUCKING WORK.

            Anyway yeah, I’ve seen that shitshow play out — in the case I’m thinking of, I wasn’t the senior engineer who was shouting “no no no,” but I agreed with him. Anyway, management was dumb-as-fuck and the company failed hard. So it goes.

            Every job is a learning experience. If I (by chance) get hired by clueless business folks who fuck up, I will learn some stuff from them, and then find another job when it’s time to go.

            • Kim in reply to veronica d says:

              Know your role:
              “Excuse me, Mr.Consultant, we need you to be able to access Bugzilla so that you can put your bug reports in.”
              “That won’t be necessary. In fact, I deleted your database. And your latest version of code. We’ll be rewriting from the ground up.”

              Do you know why Facebook became the premier social networking thingummy?

        • Kim in reply to veronica d says:

          Mitt Romney really shouldn’t have hired trolls.

      • Chip Daniels in reply to Will Truman says:


        As an architect, I can see the absurdity in the entire STEM v. Liberal Arts issue, since my field requires a lot of both. In college we studied philosophy and art theory, and structural beam design.

        However, what is lacking in the conversation today is that the technical skills are actually the ones which can most easily be taught on the job, while the deeper concepts of philosophy and history are more conducive to a college setting.

        Exceptions abound of course, but the tools of the trade (the software programs, the workflow processes and such) lend themselves to the intern/journeyman/practitioner sort of pathway.

        • Will Truman in reply to Chip Daniels says:

          The question is the value of philosophy or art for the student who is there primarily to enhance career prospects. Learning and studying the deeper concepts of Thucydides can be great! Reading Thucydides to basically do a book report to get a degree to get a job is considerably less so.

          I think a lot more people fall into the latter category than the former category.

          It’s not binary, of course. But even a lot of people who think “Majoring in literature sounds cool” actually fall more on the latter end of the spectrum. They may like literature, but if their interest in college is getting a degree, and the interest in literature is as a consumer, that will be reflected in how much they absorb the greater concepts.

          When it comes to more utilitarian degrees, though, even the low-level stuff you pick up is useful in a way that knowing who Alcibiades was isn’t. Career-wise.

          • Kim in reply to Will Truman says:

            If you’re editing a scifi mag, you want to know the science as best you can.
            Same thing goes for literary mags and literary references, I suppose

            [in other news: the new voltron references Xcom.]

          • Chip Daniels in reply to Will Truman says:

            What I see mostly is a moral lesson being preached in the guise of analysis.
            Liberal arts are frivolous grasshopper stuff, while technical skills are stern disciplined ant habits.

            It is a morality lesson put to the use of excusing the weak job market offered to college students, conferring legitimacy on the financialization of our economy, and the hollowing out of the manufacturing sector.

            So the benefits of an MBA are considered “earned”, while non-tech graduates are scorned as spoiled and entitled.

        • Oscar Gordon in reply to Chip Daniels says:

          You have a very shallow understanding of the technical knowledge most STEM education conveys.

        • DavidTC in reply to Chip Daniels says:

          However, what is lacking in the conversation today is that the technical skills are actually the ones which can most easily be taught on the job, while the deeper concepts of philosophy and history are more conducive to a college setting.

          As someone who does software, which I suspect people think is the ‘easiest’ to learn outside of college: Bullshit.

          Firstly, yes, a good portion of people with programming ability will discover it on their own. You see it in every introductory college class…some portion of the class already knows all this stuff, and ignores the teacher, and writes their stuff in five minutes, and another portion, the actual ‘programming capable’ portion of the population (I’ve estimated that at 25% before.) learns it instantly, and the rest struggle and hopefully eventually change majors.

          But the thing is, despite all that…self-taught programmers are often *crap*. They don’t understand *any* of the why.

          People, me included, often mock computer ‘science’ as being somewhat short on the actual science, but it is a science, it’s just a somewhat practical one like medicine. What is *actually* going on is sorta skimmed over, and the important part is ‘what to do’…but the fundamental actual truth ‘how people work’ *is* there, somewhere, and gets across.

          Non-formally taught programmers, meanwhile, are often…quacks. Sure, they sorta understand how to get the results they want, but they don’t understand the why, and thus don’t understand how to get *new* results.

          There’s a saying in programming: Premature optimization is the root of all evil. To try to avoid fully explaining that, the point is that *non-trained* programmers often focus on entirely the wrong thing, trying to make code ‘better’ that doesn’t need to be better, and often, they accomplish nothing but make things harder to read and edit. Why?

          *Because they do not understand what is actually going on*.

          They do not, for example, understand that almost all language control structures are *actually* just ‘if (blah) goto label’, no matter what language pretends it is. (Which is why Duff’s Device causes a total brain meltdown in most people who see it.)
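          (For anyone who hasn’t seen it, here is a sketch of Duff’s Device, adapted to a plain memory-to-memory copy. It only parses because switch and do/while both compile down to conditional gotos, so jumping into the middle of the loop is legal C. It assumes count > 0.)

```c
#include <stddef.h>

/* Duff's Device, adapted to a plain memory copy.  The switch jumps
 * straight into the middle of the do/while, which is only legal
 * because both constructs boil down to conditional gotos.
 * Assumes count > 0. */
void duff_copy(char *to, const char *from, size_t count)
{
    size_t n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to++ = *from++;
    case 7:      *to++ = *from++;
    case 6:      *to++ = *from++;
    case 5:      *to++ = *from++;
    case 4:      *to++ = *from++;
    case 3:      *to++ = *from++;
    case 2:      *to++ = *from++;
    case 1:      *to++ = *from++;
            } while (--n > 0);
    }
}
```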

          Or, for another common example, that objects/classes in most languages do not hold the contents of some sorts of the variables in them, just pointers to those variables, so unless the language explicitly is doing something weird, you will pass a *copy* of the object, but if you change a string in that copy, you change the string in the original. Meanwhile, you can change a numeric value in the copy, but not change the original. This is confusing as hell unless you understand what the language is *actually doing*.

          I.e., ‘computer science’. Some actual ‘theory’, although it’s not ‘theory’ as much as ‘this is what is really going on, despite the fact that you can ignore it 95% of the time’. Because, it turns out, that remaining 5% is pretty damn important.

          Otherwise you functionally have auto mechanics designing cars. Now, yes, they probably do that better than most of the population, just by piecing car parts together…but you sorta need some engineers there also.

          • Kim in reply to DavidTC says:

            People who don’t learn how to code in college (regardless of whether or not they wind up going to college) are the type who write self-modifying code. And then the self-modifying compiler that you need to debug the self-modifying code….

            So much of computer science is just simple strategic level thinking.

          • veronica d in reply to DavidTC says:

            Keep in mind, you guys are talking about me.

            • DavidTC in reply to veronica d says:

              Actually, I’m talking about me, too. I don’t have a degree in computer science either. I have about half of one. I couldn’t handle the math requirements.(1)

              Getting a degree in computer science *forces* people to learn some of that stuff. Actually, I’m not even sure that’s true…I’ve run across people with degrees who didn’t seem to have any knowledge or curiosity of the level below where they work.

              But, hell, I didn’t finish my degree, so for all I know, *no one* is actually teaching that stuff. Which explains why computer programmers keep reinventing the concept of a device that you could place on an axle and perhaps use to reduce the friction of moving a large cart…because none of them actually knows any computer *science*.

              Meanwhile, I’ve just…read a lot about it. It helps that part of my programming origin story was hacking some pretty old C program (A program called PennMUSH, in fact, a social-only version of MUDs.), which doesn’t hide low-level stuff anywhere near as much as other languages do. (Right now, all my work is PHP, which hides *everything*.)

              But, statistically, I think people are a lot more likely to get some sort of low-level understanding in a *formal* education environment, instead of just the sort of apprenticeship that Chip was proposing. Even assuming that the person who teaches them *does* know that stuff, that’s the sort of thing that it is very easy to just keep glossing past, to pass out particular rules, instead of sitting down and saying ‘This is how this *really* works’.

              1) And it pisses me the hell off because, having worked in this field for a decade, I *still* haven’t ever needed trig or above (Trig is needed for graphical rendering, but I don’t do anything like that.), much less the goddamn two years of calculus that every computer science degree requires.

              • Oscar Gordon in reply to DavidTC says:

                I’ll chip in that I don’t have the formal CS degree either. At best, I have a minor in software engineering, but there are huge gaps in my education that I try to fill as best I can by reading. That said, that handful of classes included some in-depth discussions about what is happening under the hood, and how to design software from a high level perspective, and I am a better developer for them.

              • Kim in reply to Oscar Gordon says:

                I’ve got a physics degree. I’ve done the work for a masters in graphics programming and analysis, but as an apprentice only, not a formal class.

              • veronica d in reply to DavidTC says:

                Well, I say keep teaching the trig and calc, but mostly cuz of the rise of machine learning. Basically, you’re gonna want some REALLY SOLID foundations in linear algebra and basic metric spaces stuff. Sure, not everyone is gonna be writing a non-linear optimizer from scratch, but just as you want to know how a compiler works, you’re gonna want to know how TensorFlow works.

                Taylor’s theorem works on higher dimensional manifolds. From this comes everything.

                If you say, “Not everyone is doing machine learning,” I say, “They will.”

                But yeah, the compsci foundations: automata theory, recursion, formal languages, a smattering of lambda calc and type theory, relational algebra and calculus, etc. We need more of that stuff. Tons more.

                The thing is, most of it is pretty easy. Which sure, Gödel’s stuff still bends my brain a bit, like I have to take deep breaths and remember which is referring to what at which level. (It’s easier just to translate it into Lisp.) But yeah, anything “formal” I do these days, I mostly try to translate it into some basic typed lambda calculi. Just, my brain is trained at doing that. Every well-typed Haskell program is a proof!

                Train your brain!

                We’re training people to wire up bad Javascript/CSS monstrosities backed by barely understood Ruby code. This is not the right way.

                Beneath every bit of software is a problem, which can be expressed in formal terms — cuz computers are formal all the way down. And any “formal” thing is going to have a language, a logic, an algebra — even if you never write this down. It exists, if only as an emergent property.

                After all, even fucking Perl had a semantics — just no one really quite knew what they were.

                So anyway, learn to clarify, to write down, to find the logic, the algebra. Know the language, its terms, its compositionality, its rules. Code to that.

                Every year software takes over more of everything. Every year it gets bigger, takes up more “space” in our lives and culture. This is good, but darnit we need to master this art form.

                It’s math.

              • DavidTC in reply to veronica d says:

                Beneath every bit of software is a problem, which can be expressed in formal terms — cuz computers are formal all the way down. And any “formal” thing is going to have a language, a logic, an algebra — even if you never write this down. It exists, if only as an emergent property.

                Um, I suspect you think about computer programming in a fundamentally different way than I do.

                Specifically, you’re acting like people need to learn the formal grammar, because *that* is actually how computers operate.

                Except…it’s *not* how computers actually operate.

                The things that parse computer languages are kludges on top of kludges, with all sorts of syntactic sugar that is designed to make it *look* like they work a certain way. You can stand there on top of a towering mountain of LISP, with perfect formal logic…and below that the LISP parser is still allocating memory for a string, loading it in, and then recursively tokenizing down it one byte at a time.

                You seem to think the truth is *up*. But the truth is not up. The truth is down. Computers do not operate at abstraction. They operate at machine code.

                Now, computers, at the lowest level, are math, sure, but they’re addition and multiplication, not calculus. (No CPU actually *understands* calculus.) They’re not even truly algebra, despite programming languages often vaguely resembling that.

                Computer programming is a list of instructions that get translated to even more basic instructions, not math. And what I am saying is that programmers need to know *what* they’re being translated to (Not literally the machine code or even assembly language, but just *generally* how it all works.), both in general, and any weird exceptions for specific languages. How do pointers work? What is a symbol table? How does memory alignment work? What is copy-on-write? Etc. Learn the concepts, and then figure out how it works in your specific language.

                All too often, self-taught programmers don’t know *any* of that. A person starts with Javascript, or Ruby, or, yes, LISP, they don’t know any of that. Even if the person was ‘mentored’ in a language like that, the mentor might not have mentioned it.

                The abstract math stuff, the formal logic stuff, is a whole different thing, which, yes, can be useful. And that also is computer *science*.

                But despite what you think, people aren’t really doing machine learning…people are writing code that puts things in databases, or pulls them out, or fills in a template on a web page, or validates some input, or communicates with some other program. That’s what 99% of actual programming is.

                As I said, I’ve been programming a decade, and the most ‘complicated’ math I’ve ever had to use was modulus and binary shifting/masking whatever. (Which are not actually ‘complicated’, modulus is exactly as complicated as division, and the binary stuff is easy if you know binary…which is, again, easy for someone to be shielded from, despite the fact all programmers should know it.)
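                (Concretely, the modulus and masking/shifting in question is stuff like wrapping a ring-buffer index and packing/unpacking small bit fields. A quick sketch, with a made-up field layout:)

```c
/* Everyday modulus and bit twiddling: wrap a ring-buffer index, and
 * pack/unpack two 4-bit fields into one byte.  The field layout here
 * is invented for illustration. */

#define RING_SIZE 8   /* power of two, so (i & 7) would also work */

unsigned ring_next(unsigned i)
{
    return (i + 1) % RING_SIZE;          /* 7 wraps around to 0 */
}

unsigned pack_nibbles(unsigned hi, unsigned lo)
{
    return ((hi & 0xF) << 4) | (lo & 0xF);
}

unsigned high_nibble(unsigned byte) { return (byte >> 4) & 0xF; }
unsigned low_nibble(unsigned byte)  { return byte & 0xF; }
```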

              • veronica d in reply to DavidTC says:

                It is a mistake to assume Lisp programmers don’t know how memory works. For example, the application I work on is about 35% C++, and we share extensive data between both C++ and Lisp. Trust me, we understand where each bit goes and how it gets there — cuz we gotta. We pay a lot of attention to cache performance. At the scale we work, making sure related things land in memory close to other related things makes a huge difference. Likewise a large share of our static data is held on disk in its “in-memory” binary format, which C++ and Lisp handle just fine — we’ve hacked the Lisp GC to do what we want. In any case, we mmap that stuff in, for better VM performance, etc. etc.

                Knowing abstract algebra and formal semantics does not preclude a programmer from also knowing how malloc/free works, or what a kernel trap is, or how virtual memory works, or how interrupt handlers work, or Unix signals, or any number of things. If you’re interested, you should see the crazy-as-fuck things Ed Kmett will do to get the Haskell to sing.

                You saw above where I mentioned you need to know how your compiler works — you need to know this cuz you might not like what it does and need to tweak it to do something else.

                You need to know how your linker works, in the sense of what it actually does, cuz you might need it to do something different.

                Like, you might be working on some massive distributed system that deploys code packages to live boxes, and loads and unloads pages of code in real time — cuz that is what you need to do to distribute your problem across 10,000+ nodes. So you write that. Maybe you hack up your own linker.

                (In our case, our Lisp does not produce standard ELF binaries, which creates trouble. Anyway, I personally haven’t dug into that part of the code, but someone has. If I ever needed to, I could. I know how linkers work.)

                But still, big-hard-new problems require big-hard-new thinking. Things are bigger now, at enormous scales. The problems are also more ambitious. The tools are more advanced.

                If you want to write something massively distributed, it really helps if you think in terms of associative algebras. If your problem does not have an associative decomposition, it’s going to be very hard to effectively distribute. (That whole Fortress thing that Guy Steele was working on was based on these ideas. It didn’t get anywhere. But still, the ideas were good and will show up again, cuz math is truth.)
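                (A toy version of the idea: when the combining operation is associative, chunks can be reduced independently, each on its own node in a real system, and the partials merged; the grouping never changes the answer. Summation is the trivial case:)

```c
#include <stddef.h>

/* Why associativity matters for distribution: because (a+b)+c == a+(b+c),
 * an array can be split into chunks of any size, each chunk reduced
 * independently (on its own node, in a real system), and the partial
 * results merged.  Every chunking gives the same total. */

long sum_range(const int *xs, size_t lo, size_t hi)
{
    long s = 0;
    for (size_t i = lo; i < hi; i++)
        s += xs[i];
    return s;
}

long sum_in_chunks(const int *xs, size_t n, size_t chunk)
{
    long total = 0;
    for (size_t i = 0; i < n; i += chunk) {
        size_t end = (i + chunk < n) ? i + chunk : n;
        total += sum_range(xs, i, end);  /* merge one partial result */
    }
    return total;
}
```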

                Humans can solve these things with our brains, for small problems. For big problems, we need machines to do it. Deciding how to split up a variable amount of work among a variable number of CPUs under varying time restraints is not a problem you want humans reasoning about. So your work elements must be expressed in a formal way, so machines can decide.

                Which is to say, something like Map/Reduce was a very early/simple example of this. But many problems don’t break out that way. Not many tricky graph algorithms are being done in Map/Reduce.

                (There are large distributed systems to handle graphs, but you gotta know your shit to use them.)

                Another example is rules-based event systems. When you are dealing with tricky timing, and 100k+ different semantic elements that need to be scheduled and sequenced in an unpredictable environment — well you don’t want humans figuring that out. A machine needs to do it. This is logic programming, but not the Prolog stuff. That was so primitive and too slow. But still, your rules need to be pre-processed and verified, for performance. In the end, you are writing a proof system. (You are also writing a combinatorial optimization algorithm. But the point is, your optimizer is processing formal structures that obey rules. Those rules need to be baked in. Perhaps those rules will change. If they do, can you afford to re-hand-code the optimizer? The point is, if you expect the rules to change, better to make it rules-based from the outset.)

                (The entry-point of these ideas is declarative programming.)

                (One way to think of this is, we are doing for rules and scheduling what SQL did for database access. Sure, you can deal with relational data without SQL. Perhaps in some high-performance environment you’ll need to. But darnit SQL makes it safer, cleaner, more predictable, and easier to reason about.)

                (And yes it helps a lot to know how indexes work and to be able to read the output of a query planner. Someone on your team needs those skills. Usually that person turns out to be me.)

                Anyway, I know some folks who do this sort of thing for real-time finance stuff. Semantic clarity is a big deal to them. They indeed think in terms of logic and algebras. For them, mistakes are expensive.

                Anyway, the idea that knowing the “high level” stuff hinders knowing the “low level” stuff is nonsense. Certainly we want experts in each. People are going to find their focus. It’s clear the direction mine has gone. But that said, if you dismiss me as a “head in the clouds” type, you don’t know me.

                Back in the day, the machine language folks thought Fortran was silly. I’m sure the database gurus thought SQL was a waste of time. But we march forward with better abstractions each year, and we solve bigger problems with less cognitive load.

                You don’t need map/reduce to distribute stuff. You don’t need TensorFlow either, when you want machine learning. But we have them now. Next year we’ll have more. And to keep everything doing what we expect, and to be able to communicate what we expect it to do, we will need formal languages to express these ideas, and we will need systems that understand those languages and that can process them correctly.

                Else there is chaos.

                It is math.Report

              • DavidTC in reply to veronica d says:

                It is a mistake to assume Lisp programmers don’t know how memory works.

                That is not what I was implying, although if you want an example of that you probably shouldn’t pick something that is 35% C++. *Any* multiple-language system is obviously going to have to deal with memory at a very detailed level.

                I have a feeling that LISP programmers *in general* understand memory about as well as any other programmer used to garbage-collected languages understands it. As in, it’s just something that happens, until memory usage is too high.

                Knowing abstract algebra and formal semantics does not preclude a programmer from also knowing how malloc/free works, or what a kernel trap is, or how virtual memory works, or how interrupt handlers work, or Unix signals, or any number of things

                Again, not where I was going.

                My point was, to be a good programmer, you have to understand low-level things, which is easy for people who teach themselves modern languages to miss.

                Meanwhile, unless you’re in a pretty specific environment, you don’t actually have to understand formal semantics. You don’t normally have to understand anything ‘above’ where you are, except a few basic concepts like structured programming and whatnot. (If that’s really ‘above’…not sure what metaphor makes sense there.)

                If you want to write something massively distributed, it really helps if you think in terms of associative algebras. If your problem does not have an associative decomposition, it’s going to be very hard to effectively distribute.

                Well, yes. *If* that’s what you’re doing. That is not 99% of programming, though.

                There are all sorts of programming problems that require specialized knowledge. Game physics. Massively distributed software. Real-time software. Rules processing. Stuff I literally cannot name because I am not in that field.

                But those are things you don’t need to know unless you use them. You can be a good programmer without knowing them.

                Even things like object orientation and databases are technically optional if what you’re doing doesn’t use them, although, in actuality, you’re almost never going to *not* use those, so should probably just learn them to start with.

                The low-level stuff isn’t like that. It’s easy to *assume* that can just be skimmed over, ‘Hey, I’m not writing assembly, why do I need to know this?’ and sure enough a lot of programmers have indeed skipped it.

                But it really is important, because it is what the computer is *actually doing*, and if you understand it, you will understand a lot of really weird behaviors of programming languages, and know exactly the sort of things that trip people up in a *general* way. You can walk into a programming language you’ve never seen before and know where the dangers are likely to be.Report

              • veronica d in reply to DavidTC says:

                @davidtc — A well-rounded programmer is going to know a lot of things. Which is to say, I’ve written interrupt handlers — a long time ago on a 16 bit machine. I assume I could do it today. There are plenty of books titled “Linux Device Drivers for Boneheads.” I could read one. I haven’t needed to.

                My point on semantics is, you’re doing this stuff anyway. You’re just not thinking about it.

                Trust me, if your software system is over 10,000 lines, you are doing this stuff, just — in a non-deliberate way.

                Okay, so the examples I gave were places where you must do it, because the complexity has outstripped our cognitive capacity. Fine. Not everyone writes software like that. But many will. It’s a changing world.

                Anyway, my bigger point is, if you do something important, but in a not-thought-about way, then thinking-about-it can be a huge boost.

                Look, actually-automated formal systems are still largely “research” — I know plenty of people who use them, but that’s “selection bias.” Those are the people I seek out. But still!

                (This might run long. I hope it’s worth your time.)

                (I promise I have a point.)


                You know the seven-plus-or-minus-two thing, right? This is how much we can “hold in direct consciousness.” It’s what we can focus on.

                What is the boundary of a concept? Well, that is complicated, but look at it this way: the more sophisticated your concepts, the richer the structures you can think about. If I can hold a deep set of ideas as one concept, I can perhaps combine it with other deep sets of ideas in ways that I could not if I had to “keep in mind” all the complexity of each idea.

                You do this already. Every time you name a subroutine or create a class, or really any abstraction. Each time you multiply “floating point numbers,” you are using a complex abstraction. You need to know how they work, but you don’t need all of that centered in your active consciousness every time you type something like

                x = 3*cos(phi) + alpha;

                We can abstract nouns. That is obvious. “EmployeeRecord” or “AtomicCounter.” These are things.

                We can also abstract actions, relations, logical structures, some of them quite complex.

                People like to make jokes about Haskell Monads and their complexity, but they do unify some deep patterns. Are they the right “sweet spot” of abstraction versus complexity?

                I don’t know. They seem hard for people. They are very abstract.

                I’m pretty sure the Java object model is too simplistic, which is why all those crazy-as-fuck “Java Enterprise” XML hellscapes got invented.

                Okay, so a fun question: what is equality?

                In Java/C/C++/etc., you have something called pointer/reference equality. But you also have (for simple types) value equality. But those are not enough. To write anything “real world,” you need to define more elaborate “structural equality” for compound objects. Each language handles these differently, but it is the same underlying idea. (In C++, you overload ==. In Java, you implement .equals(). In C, you cry.)

                Common Lisp has like 34080293904 different flavors of equality. It’s actually obnoxious and badly designed. (I like Lisp, but I don’t deny its flaws.)

                Haskell has one notion of equality. Sort of. That’s a long conversation. Let’s skip it.

                Consider this: what is the difference between identity and equality? How does that apply to an EmployeeRecord object, which might simultaneously exist in a database, in a memcache, directly in the memory of your front-end web server, as JavaScript objects in a user’s browser, and as a display object in the browser’s DOM?

                You don’t need to answer. The point is, it is essentially complex, and a fairly typical problem for “normal, ordinary” programmers.

                It’s really nice to build your software on top of solid, well-tested frameworks that have already solved this kind of complexity —

                — when they work! It’s tricky. We have a long way to go.

                (Whenever I hear “eventual consistency” I want to cry. I get why it exists. I don’t have a better idea for software at that scale. But still, I cry.)

                Formal methods.

                Here is the Javadoc for Java’s hashCode() method. Just below that is the equals() method. Look them over. Notice something?

                (I don’t know if you know Java. Perhaps you are already familiar with this.)

                There is a formal contract your code must maintain if you implement these methods. Look at how they are expressed. “Reflexive,” “transitive.” That is formal math. If you do this correctly, then your objects will behave correctly when put in a HashMap (and many other structures). If you get it wrong, they will behave unpredictably.

                Java does not check that you have done this correctly. The thing is, Java has no essential notion of equality for objects. It can check pointer/reference equality, but not any “higher level” of equality. You have to implement that yourself. You have to use your own brain to ensure that you got it right. (Plus unit tests.)
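                (A minimal sketch of honoring that contract — a hypothetical Point class, nothing from a real codebase. The contract: equals must be reflexive, symmetric, transitive, and consistent; equal objects must produce equal hash codes.)

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

// Hypothetical value class that keeps the equals/hashCode contract.
final class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        if (this == o) return true;                // reflexive
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;               // structural equality; symmetric & transitive by construction
    }

    @Override public int hashCode() {
        return Objects.hash(x, y);                 // equal objects -> equal hash codes
    }
}

public class ContractDemo {
    public static void main(String[] args) {
        Set<Point> set = new HashSet<>();
        set.add(new Point(1, 2));
        // Without both overrides this prints false: the set would be
        // comparing references, not values.
        System.out.println(set.contains(new Point(1, 2))); // true
    }
}
```

                (Break either override — or just one of them — and a HashSet of Points quietly stops finding your objects. Java will not warn you.)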

                You are doing formal reasoning already.

                I’ve written code that implemented (something like) set-union and -intersection semantics, but where “equality” was based around behavioral equality, and where certain objects could be unified in certain cases, but not others. It was complex. But clients of the code depended on its behaving a very specific way. We also had to optimize the heck out of the union/intersection stuff — which was really a kind of unification procedure — and we used different methods with different compromises depending on the size of the set. Tricky, tricky, tricky.

                We used this structure to optimize a wall-clock-time-limited heuristic search over an NP-complete search space. It worked well. According to our sales folks, our competition “didn’t understand how we could do it.”

                That’s a win!

                These “sets” had a very specific semantics, which existed in three places:

                1. Implicitly in the code,
                2. Implicitly in the unit tests,
                3. Explicitly in code comments.

                That is the best we could do in Java. We could have specified things better in a language such as Scala, but even then, without full dependent typing, at some point you have to punt the ball. You do the best you can with the tools you have.


                Okay, summary time. (Did you make it this far?)

                You already reason formally. You have to. Managing complexity requires it. That said, thinking about the formal models themselves, on their own terms, gives us sharper tools to build even better formal models. This means higher abstraction, which means more complex systems with less cognitive load. This means doing more with the same brainpower.

                Mostly this is conceptual. It’s “in your head” stuff.

                The next step is offloading the “translate the formalization into code” part. The more you can offload onto machine-checked and machine-generated systems, the better. This really is an example of the “make sure information exists in one place” principle. When you can do this, you can just look at the formal spec, which often fits in a few lines on your screen, and know fully what it will do. This is not to “dumb things down.” A skilled engineer still needs to understand the process from tip to tail. That might be one expert on the team who teaches the others, just as a team will often have a few “database gurus” working beside tons of folks who know basic SQL.

                (These formal approaches are precisely the same idea as knowing how your compiler works, but never having to hand-compile your code. It’s exactly that, just moved up a few levels of expressive power.)

                There are limits. In their full measure, computer proof systems are only semidecidable, and even when the result is “valid,” they are nowhere near polynomial time. We’ll always be threading the needle between the practical and the desired. We also must teach these ideas, make them accessible. That’s hard. I’m trying to do that literally right now, right here (to a degree).



                Is it worth it? Consider math. Right now “smart but not world class” math students can, after 4-8 years in university, understand math concepts that in a prior age could only be grasped by a small handful of humanity’s greatest geniuses. Think about that. This is the “shoulders of giants” thing.

                It is real. It is due to greater expressive power, greater unification of ideas, deeper abstractions.

                Take linear regression. You probably learned this in statistics class. It probably took a whole lecture, maybe multiple lectures. It was tricky with lots of moving parts.

                If you are good with matrix algebra, it can be written in one line.
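                (For reference, the one line — ordinary least squares in matrix form, with X the design matrix and y the response vector:)

```latex
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y
```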

                This is the power of abstraction. By looking at that one line, if you are good at juggling matrices, you can “just see” many, many interesting things.

                The point is, abstraction makes hard things obvious. I can read the Wikipedia article on factor analysis, and I “just see” efficient ways to implement it, even though I’ve never studied it deeply.

                (That said, if I had to implement it, I’d still go read some papers. But still, I have a foundation.)

                So it goes for software engineering. We are solving harder problems now. We need better tools.

                We still need people who know how interrupts work. But then, I understand them for single-processor models, cuz that is what existed “back then.” I assume it’s more complex these days in multi-CPU systems. I can kind of guess how they must work. If I need to know for sure, I’ll look it up.

                How do interrupts work in modern “supercomputers” with 34802938409328094328509390 CPUs connected to weird geometric arrangements of RAM?

                I dunno. If I ever work on a computer like that, I’ll want to understand it, at least at a basic level. Anyway, it sounds like a cool problem. I bet they use formal models.Report

              • Here is the Javadoc for Java’s hashCode() method

                What do you think this does?

                List list = new List();

              • veronica d in reply to Mike Schilling says:

                @mike-schilling — Tee hee. I’m guessing it heats up your CPU.Report

              • veronica d in reply to veronica d says:

                Except List is an interface, but whatever.

                Assuming your concrete class is built atop an AbstractList, you’re gonna convert available energy into heat:
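                Something like this, say (a sketch — I haven’t compiled it):

```java
import java.util.ArrayList;
import java.util.List;

public class Heat {
    public static void main(String[] args) {
        List<Object> list = new ArrayList<>();
        list.add(list);   // the list now contains itself
        // AbstractList.hashCode() is computed from the elements' hash codes,
        // so this call recurses into itself.
        list.hashCode();
    }
}
```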


                Never post code without compiling it. It’s like correcting someone’s spelling, which always includes a spelling error of your own. So, yeah, make the second one ArrayList, or any concrete class that honors the contract for List.hashCode().Report

              • Kim in reply to veronica d says:

                It’s probably a bad thing when your cpu can heat up your entire house.Report

              • It’s been a long time since I coded in Java — some sort of stack overflow exception? A memory fault of some sort, at least. Runaway recursion is an infinite loop only if there’s infinite memory.Report

              • Yup. I’ve never seen that documented.Report

              • veronica d in reply to Mike Schilling says:

                [insufferable know-it-all mode]

                Well there is a command line flag to set per-thread stack size: -Xssn

                Discussed here:

                I suspect the behavior is at least implicit in the JVM spec, as a combination of these sections:



                (I have no intention of reading that in detail.)

                [/insufferable know-it-all mode — if you hate me now I don’t blame you]Report

              • Shorter veronica: We don’t document things that are obvious from first principles.Report

              • veronica d in reply to Michael Cain says:

                Heh. Funny thing, I did read through the mess (cuz of course I did).

                It turns out an implementation is totally allowed to allocate “frames” on the Java heap, instead of a stack, which means that according to spec, it could keep looping and creating frames until you pop the heap.

                No one would implement the JVM that way. But still. Specifications are specifications.

                Java has always been real conservative on how it treats frames, which is why those anonymous nested object thingies can only reference final variables in the enclosing scope. It’s meant to stop you from creating Scheme-like closures, which would increase muchly the complexity of memory management.

                Anyway, I haven’t been paying much attention to Java lately. I know Java 8 was supposed to have closures, but maybe not, but maybe. Sorta. I don’t remember. Round and round the arguments went. At that point I got a Lisp job and kinda forgot most Java I knew.

                (I freed up those portions of my brain to think about pretty girls.)Report

              • Way too much information 🙂

                The problem is the List interface’s contract for hashCode() (i.e. that it’s calculated from its members’ hash codes).Report

              • veronica d in reply to Michael Cain says:

                Actually good point. It would surely stack overflow.

                Which, at least you generate less heat.Report

              • veronica d in reply to Michael Cain says:

                [Pedant mode]

                Well, there is tail recursion removal. There are two problems, of course:

                1. The JVM famously doesn’t support tail recursion optimizations [*].
                2. This function is not tail recursive.

                But anyway. Yeah.
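                (A tiny, hypothetical illustration of what “tail position” means — and the loop a TCO-capable compiler would turn it into. The JVM won’t actually do this rewrite for you, which is problem 1:)

```java
public class TailDemo {
    // The recursive call is the function's last act ("tail position"),
    // so tail-call elimination could reuse the current stack frame.
    static long sumTo(long n, long acc) {
        if (n == 0) return acc;
        return sumTo(n - 1, acc + n); // tail call: nothing left to do after it
    }

    // What tail-call elimination effectively produces: a loop, constant stack.
    static long sumToLoop(long n) {
        long acc = 0;
        for (; n > 0; n--) acc += n;
        return acc;
    }

    public static void main(String[] args) {
        System.out.println(sumTo(1000, 0));   // 500500
        System.out.println(sumToLoop(1000));  // 500500
    }
}
```

                (Contrast with hashCode() above: the recursive call there is buried inside the hash computation, not the last act, so no amount of TCO saves it.)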

                [*] This may not be true any longer on new JVMs. It’s a feature many non-Java language implementers have requested. Microsoft’s CLR does support tailcall stuff, so there is that.Report

              • DavidTC in reply to veronica d says:


                Well, there is tail recursion removal. There are two problems, of course:

                2. This function is not tail recursive.

                To paraphrase Red Dwarf: A superlative suggestion, ma’am. With just two minor flaws. One, this function is not tail recursive. And two, this function is not tail recursive. Now I realise that technically speaking that’s only one flaw but I thought that it was such a big one that it was worth mentioning twice.Report

              • Oscar Gordon in reply to Michael Cain says:

                It’ll probably be a stack overflow.Report

              • J_A in reply to veronica d says:


                On a complete sidetrack, I want to commend you on your ability to express these very complicated issues in language that is “simple” enough and high-level enough that I can understand what you are saying even when I have no clue what you are talking about.

                In case it’s not clear, this is me praising your communication skills: general and high-level enough for a person with basic math (1st year college, I would say) to follow, while not dumbing down. So thanks again.

                On a separate, separate issue: if I can think it, I can do it in FORTRAN. And I was a whiz kid programming the HP41. I was able to program complex-number (a+bi) matrix inversions, which was one of my proudest days in college.Report

              • veronica d in reply to J_A says:

                @j_a — Cool! Thanks!Report

              • Kim in reply to J_A says:

                Gotta love programmers publishing in mathematical journals.
                A friend of mine — “So I didn’t really understand the math, so I kind of faked it… but I know the idea is sound, I can measure that.”Report

              • DavidTC in reply to veronica d says:

                Here is the Javadoc for Java’s hashCode() method. Just below that is the equals() method. Look them over. Notice something?

                Yeah, I noticed why I hate Java. 😉 Good luck figuring out the point of the hashcode (which is only *usually* unique) instead of just actually using the memory location of the object as a unique identifier.

                There is a formal contract you code must maintain if you implement these methods. Look at how they are expressed. “Reflexive,” “transitive.” That is formal math.

                …erm, have you been using the term ‘formal math’ to mean a formal *specification*?

                Having the various parts of a program using well-defined inputs and outputs is just ‘structured programming’. Having that defined in advance is a ‘formal specification’, or ‘formal development’, and is essentially just planning crap in advance, usually so multiple people can work on it at once. And a ‘formal contract’ is part of the same idea, the idea that functions *must* do exactly what they are documented to do, and if they don’t they are Wrong and should be fixed.

                Using formal *math* and formal *semantics* in programming is something else entirely. That is a top-down approach where you start with *mathematical models*, or specific logical interpretations of a program, and turn those *into* programs.

                But the thing is, as I said…most programs are not math. They’re an iPhone app that goes and gets the weather data and formats it pretty. They’re a CMS. They’re a taskbar email-checking program.

                Even the stuff that mostly is math…people face a choice: They can write a program with the math in it, or they can try to express some pure perfect math somewhere and turn it *into* a program. The latter thing…almost never wins, because it essentially is three times as much work.

                And I have no idea what ‘formal reasoning’ is even supposed to be. You seem to be describing what most people call ‘thinking’. 😉

                Common Lisp has like 34080293904 different flavors of equality. It’s actually obnoxious and badly designed.

                As opposed to the rest of LISP, which is reasonably designed, but still incredibly obnoxious. 😉

                The next step is offloading the “translate the formalization into code” part. The more you can offload onto machine-checked and machine-generated systems, the better. This really is an example of the “make sure information exists in one place” principle. When you can do this, you can just look at the formal spec, which often fits on a few lines on your screen, and know fully what it will do.

                No. This is the promise that computer science keeps making, and yet refuses to happen for 99.99% of the code out there.

                There are exactly two ways to do what you describe, and one of them doesn’t work.

                The one that doesn’t work is some sort of hypothetical meta-compiler that is publicly available and can do such things. But the thing is…we have those things. They’re called *compilers* already. They generate the code they already can generate, and don’t generate other code, as a tautology. Compilers are already up to five levels at this point if you write in C…you have a config script to rewrite headers, headers and code to run through a pre-processor, that gets compiled to assembly, that gets compiled to machine code, and a linker to put it together. Adding *another* level is just…adding another level. It doesn’t change the process.

                Programmers have tried, for *decades*, to simplify the code they have to write, usually by making a less stupid language. (Or, in the case of COBOL, a more stupid language!) And they did it. They got rid of a *lot* of the housekeeping stuff. I write in PHP, so the closest I come to memory management is sometimes deliberately unset()ing large arrays once I don’t need them. And while PHP has file handles, I almost never use them vs. just reading or writing an entire file at once. I pull entire database results magically into arrays I can loop over. Looking at my code, there are all sorts of completely dumbass things I would have to do in C++ that I don’t do in PHP. (I couldn’t even write in C, my code is too object-y.)

                And, yes, that includes simplifying some program logic, too. Regular expressions, for a horrifying example. In a less ‘crazy people invented this’ example, foreach() was invented so we didn’t have to keep incrementing down an array and checking for null. Hell, some OO languages have loop constructs built into array objects, so you can trivially run a command on every item in the array. Clever.

                Programming languages are written by programmers. Programmers know what the shortcomings in languages are, and often explicitly invent things to fix those shortcomings…and also sometimes have clever ideas like OO that change entire paradigms. The languages in common use will always be at least one generation behind the problems that current programmers are running into, which means there will always be a new shiny thing to switch to that Solves All Our Problems, at least until we figure out new ones.

                The current problem is distributed code. Not *only* across multiple servers, but, hell, single-computer multitasking is not particularly great in most languages, and considering we’ve hit the limits on transistor size and the only way forward is multi-core, we really need to start taking that seriously in programming languages, as a built-in feature. We’ll probably have to invent a new set of structured programming concepts to deal with it. (A section saying: do this code, and while that code is happening you can also do this other code that has no interdependencies with the first code, but you can’t continue *past* that section until both parts are done.)

                But we’re never going to get to the point where we can program by just telling the computer to program for us.

                And we *do* have ‘compilers’ that are designed for math. There are plenty of them. Programmers…mysteriously don’t use them for normal programming, probably because they’d be vastly unsuited for that. Like I said…programs are not actually math. At all. Most of them don’t deal with math. Yes, *you* might, with your job, but in reality, most of programming is just moving data around and showing it to users, not doing advanced math stuff.

                The *other* rules-based programming that can happen is some sort of *internal* thing, where people, in a specific situation, code a specific parser or compiler for some rules. And I mean, that’s all well and good, but it’s not some sort of magical future of programming.

                You’re standing there talking about the wonders of how nanotech superstructures will allow engineering feats we’ve never seen before, entire buildings constructed out of thin air, and I’m pointing out that…most people are just building houses and an apartment building or two.Report

              • Kim in reply to DavidTC says:

                So, um, sometimes you have enough data that it doesn’t fit into memory. md5 makes a decent hash, and that way you can minimize identical duplicates (have a backup algo to make sure your two identical md5s are really the same image).

                “But we’re never going to get to the point where we can program by just telling the computer to program for us.”
                … too laaate. Okay, sometimes the program develops religion and needs to be fixed manually. But it mostly runs on its own, merrily developing new ideas and schemes and scams. Did I mention the sleeper agents? It has sleeper agents.Report

              • veronica d in reply to DavidTC says:

                @davidtc — You’re doing that thing.

                So yeah anyway.

                I write in PHP…

                Oh! Never mind. I understand now. Carry on.Report

              • DavidTC in reply to veronica d says:

                I have no idea what you mean by ‘that thing’.

                I don’t even understand what you’re trying to get across. My point was that a lack of formal education in programming results in people who do not understand low-level stuff they probably should know.

                You decided to make this about the exact other direction. And, yes, there *are* high-level concepts that programmers should learn formally. I tend to classify those just as the concepts of ‘structured programming’, whereas you seem to want to add a bunch of really abstract stuff there.

                And, yes, some of that stuff is cool. You want to claim that understanding it can also make people a better programmer, well, you can do that.

                I suspect the problem is, you can get people to *learn* to pass tests on it, but not that many programmers would actually *understand* any of that without a lot of practice in it. But I have no actual objection to it being added. Hell, I’d be happy to see a class Godel-Escher-Baching their way through inventing math and programming. They can even learn low-level stuff *and* formal math at the same time!

                My objections are to your additional claim that this is, essentially, the future of programming. No, it’s not.

                The fact we now have really big datasets means, yes, *some* people will be programming with those things, and libraries will be invented, maybe some new computer science concepts to deal with them, etc.

                And programming will continue to look exactly how it’s always looked, with the vast majority of people coding login pages and device drivers and something to fix the crappy XML their client generates before handing it to the parser.Report

              • veronica d in reply to DavidTC says:

                @davidtc — Look, you don’t get what I’m talking about. That’s fine. This stuff is manifestly difficult.

                I can provide analogies. For example, I can talk about how learning matrix algebra — for example, knowing all the major decompositions and understanding SVD and the spectral stuff and so on — allows you to deal with much more complex numerical problems than if you do not know that stuff. In other words, with this stuff you can “think at a higher level.” In turn, this lets you read more advanced literature, process complex techniques quickly, and see in your “working memory” connections you could not see if you were thinking at the “component” level, rather than the matrix/vector level.

                That’s an analogy.

                There was a time when matrix techniques were new. If you read math-stuff from 100 years ago, they didn’t always express obviously-matrix things in matrix form. They expressed them in a more burdensome form.

                We can think better now. We have better cognitive tools.

                It took time for these ideas to catch on. At first people were, “Why bother. I like the way I do it now.” Slowly, slowly, slowly, the new techniques caught on. Now they are ubiquitous. Is there another level coming? Can we see it yet?

                Of course, if you don’t know matrix stuff at that level, then you won’t get the analogy, I suppose.

                Take my word for it. Or don’t. I don’t know what to say.


                I cannot talk about this without sounding arrogant. Which look, that’s the way it is. I am a little arrogant. Fine.

                Godel Escher Bach is a cute book. But it is the “pop science” version of what I am talking about, and then not even. If that’s your “touch point” to understanding me, you don’t understand me. That’s okay.


                Let me put it this way, if we all had infinite time, then you would want to master all the things I am talking about, because if you have infinite time, why not learn everything?

                We don’t have infinite time. So you make trade-offs.

                Is it worth your time to learn Haskell Monads?

                I dunno. Up to you. They’re interesting, but they are also difficult. I can tell you they’re really cool and they expand the way you’ll think, and you can do with that information what you want.

                I can say, “Hey look, this is neat stuff that really helps me, and with the growing complexity of software, and how we are building larger systems, we’re going to need to get better at thinking in terms of ‘higher’ abstractions, and declarative techniques are a particularly great way to do that.”

                These are statements made from experience. I’ve done hard stuff and made it work. To do this well, I’ve adopted certain cognitive approaches. I’ve worked on teams with other brilliant engineers who thought similarly. These are not my weird-unique ideas. Some of my former coworkers are now working at finance shops in Manhattan, using these techniques to solve hard problems. Personally, I was more interested in staying at a “tech focused” company — I’m not the “finance type” — and I went a different direction.

                Again, you can take my word for it. Or not. Up to you. I can share my insight, and you can dislike what I say. Fine.

                If you are happy where you are, then be happy where you are.

                I’m kinda-sorta happy where I am. It’s a great company and a cool project. But I want more. I always want to push to the next level. I want to do harder things. I want to do the hardest things. If it sounds too hard for me, then I need to work harder. If someone is doing something groundbreaking and amazing, I should be working beside them.

                Do as thou wilt.Report

              • Mike Schilling in reply to DavidTC says:

                Good luck figuring out the point of the hashcode (Which is only *usually* unique) instead of just actually using the memory location of the object as a unique identifier.

                Garbage collection means that an object’s memory locations changes over time.Report

              • DavidTC in reply to Mike Schilling says:

                Garbage collection means that an object’s memory locations changes over time.

                I don’t actually know how Java’s GC works, but if it does move things, it means the documentation that hashcode() is ‘typically implemented by converting the internal address of the object into an integer’ is utterly wrong. It couldn’t work that way.

                I was just saying “Why not just have that actually be the rule, instead of just having it be ‘typical’?”

                Googling the actual implementation of hashCode(), it actually seems a lot of people *do* iterate over their class and make a hash, which is what I assumed until I had read that it was typically the memory location.Report

              • veronica d in reply to DavidTC says:

                The docs say:

                (This is typically implemented by converting the internal address of the object into an integer, but this implementation technique is not required by the Java™ programming language.)

                You can actually launch the standard JVM with different garbage collectors, which have various speed/concurrency/etc. tradeoffs. Some are “moving” collectors. Some are not.

                The standard library implements Object.hashCode() as a “native” method, which means it calls down into the underlying JVM code. In other words, it is “beneath the language” magic. The JVM will be required to use whatever is needed to ensure correctness. If it is using a “moving” collector, it will need to generate some kind of persistent object ID. There are many ways to do this.

                In any case, if I were working on the JDK, I would remove that from the documentation. It should be irrelevant what the JVM does, so far as it obeys the contract.

                Anyway, I mentioned Java’s .hashCode() and .equals() because I wanted a simple example of a formal notion that is not well-represented in its host language, but that you need to reason about yourself. I can give other examples. C++ is full of them. If you want to freak-the-fuck-out, read all the rules about how to handle exceptions in a complicated C++ constructor in a memory-leak-free way. It’s mind-bogglingly complex and specific.
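                To make the Java example concrete, here is a minimal sketch (the Point class is invented for illustration). The contract is: if a.equals(b), then a.hashCode() == b.hashCode(). The language will happily compile a version that violates it; keeping it is entirely on you.

                ```java
                // equals() and hashCode() must agree: equal objects must report equal
                // hash codes. Nothing in the language checks this for you.
                import java.util.Objects;

                final class Point {
                    final int x, y;

                    Point(int x, int y) { this.x = x; this.y = y; }

                    @Override
                    public boolean equals(Object o) {
                        if (this == o) return true;
                        if (!(o instanceof Point)) return false;
                        Point p = (Point) o;
                        return x == p.x && y == p.y;
                    }

                    @Override
                    public int hashCode() {
                        // Derived from the same fields equals() compares, so the contract holds.
                        return Objects.hash(x, y);
                    }

                    public static void main(String[] args) {
                        Point a = new Point(1, 2), b = new Point(1, 2);
                        System.out.println(a.equals(b) && a.hashCode() == b.hashCode()); // true
                    }
                }
                ```

                Forget the hashCode() override and two equal Points can land in different HashMap buckets. The compiler will not say a word.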

                Humans should not have to do this. But right now we do.

                There are better ways.Report

              • DavidTC in reply to veronica d says:

                I stay as far away from C++ as humanly possible. Especially C++ exceptions.

                I am old enough that, when I learned OO programming, we used C++ instead of Java. That…scared me away from OO stuff for a few years.

                You know what is possibly the dumbest thing about C++ as a teaching tool?

                The fact that the shift operators are overloaded to do input and output.

                And pretty much every toy program is just input and output, so you’ve got a bunch of cin and cout and angle brackets all over the place.

                And none of it really *means* anything. It isn’t, in any way, some natural feature of the language that you can extend to anything else. It’s just ‘Here is some weird, totally inexplicable crap that prints things on the screen; the syntax really doesn’t fit into how the rest of the language works, but was created as a random shortcut for programmers’.

                I mean, I’m sure that comes in *handy* when doing input and output, but wow, that is confusing *as hell* for people trying to learn the language.Report

              • Kim in reply to DavidTC says:

                Real-time programmers laugh at relational ACID databases. Too damn slow.Report

              • Kim in reply to DavidTC says:

                You can’t approximate an integral?
                I’ve got a physics degree, and I’ve seen all sorts of math (and tricksy math fun) in my coding and other people’s.Report

              • DavidTC in reply to Kim says:

                You can’t approximate an integral?

                I can approximate an integral. Or, at least, I’ve done it before, I had to look it up, and would have to do so again. (And it’s easier to just put it in a solver.)
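                (For what it’s worth, the thing I’d look up is short. A trapezoid-rule sketch, with an arbitrary function and interval chosen for illustration:)

                ```java
                import java.util.function.DoubleUnaryOperator;

                public class Trapezoid {
                    // Approximate the integral of f over [a, b] using n trapezoids.
                    static double integrate(DoubleUnaryOperator f, double a, double b, int n) {
                        double h = (b - a) / n;
                        // Endpoints count half; interior points count once.
                        double sum = (f.applyAsDouble(a) + f.applyAsDouble(b)) / 2.0;
                        for (int i = 1; i < n; i++) {
                            sum += f.applyAsDouble(a + i * h);
                        }
                        return sum * h;
                    }

                    public static void main(String[] args) {
                        // Integral of x^2 from 0 to 1 is exactly 1/3.
                        double approx = integrate(x -> x * x, 0.0, 1.0, 1000);
                        System.out.println(approx); // close to 0.3333
                    }
                }
                ```

                More trapezoids means a better approximation; a solver mostly just automates choosing the method and the step size.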

                The problems I had with math in school were because I have a rather intuitive grasp on a lot of math up to the trig level, so never really bothered learning *rules*. And then my intuitive grasp ran out, and I couldn’t get my act together.

                I’ve got a physics degree, and I’ve seen all sorts of math (and tricksy math fun) in my coding and other people’s.

                There is all sorts of tricksy math fun in programming…and almost all of it isn’t even at the level of algebra.

                …you know, that’s actually a bit self-proving. If solving ordinary differential equations was important inside a computer, computer languages would *have a way to do it*. Built in.

                The math languages do. Matlab does. R does.

                No one else does. Hell, the libraries aren’t even that common.Report

              • Michael Cain in reply to DavidTC says:

                I can approximate an integral. Or, at least, I’ve done it before, I had to look it up, and would have to do so again. (And it’s easier to just put it in a solver.)

                Last time I taught calculus, I spent some time talking about the dilemma contemporary software has created. On the one hand, Mathematica is far better at analytic solutions to derivatives and integrals than we can ever hope to be, and when there isn’t an analytic solution for an integral, can choose and apply good approximation techniques automatically. OTOH, the concept of operators on functions is immensely powerful, and the derivative and anti-derivative operators particularly useful, so we still go through a lot of drill work doing those by hand.Report

              • DensityDuck in reply to Michael Cain says:

                “the concept of operators on functions is immensely powerful, and the derivative and anti-derivative operators particularly useful, so we still go through a lot of drill work doing those by hand.”

                Which is the idea behind common core, aka “Weird Ways Of Doing Math That Meddling Government Busybodies Forced Upon Us”. It’s trying to bring in early the notion that mathematics is “operations performed upon a function”, rather than setting up a distinction between arithmetic and “higher math”.Report

              • DavidTC in reply to DensityDuck says:

                Every time I’ve looked at new math, I’ve noticed it’s how *people actually do math in their head*.

                I.e., add 37 to 14, right now, in your head.

                How did you do it?

                Well, you almost certainly realized that 37 plus 14 is the same as 40 plus 11, and that’s 51, and now you’re done.

                That is, apparently, ‘new math’. Doing it *that* way instead of nonsense with ‘carrying’. The way people *actually* do it: You steal from one side until the other side is a multiple of ten, and then, bam, add them.

                Meanwhile, new math multi-digit multiplication seems exactly like old math, except instead of having to carry each digit down the line, you just multiply every combination and write *all* of them down.

                I.e., if you multiply 37 times 14, the old way is to multiply 7 times 4, get 28, write the 8, carry the 2 back to the top, multiply 3 times 4, get 12, add the 2 back in, and the first line is 148…and at that point you’re *halfway* done. You then do it all again, offset by one space, for the ‘1’ in 14. This is, clearly, utter gibberish.

                New math is: Multiply 7 times 4. Multiply 30 times 4. Multiply 7 by 10. Multiply 30 by 10. Write them *all* down, no offsets, and add the damn thing up.

                This appears *infinitely* less error prone. And it has no ‘state’ during the multiply part, aka, no ‘carrying’. (You still have to carry during the addition part.)
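                The decomposition is just the distributive law written out, and it’s mechanical enough to sketch (two-digit numbers only, for illustration):

                ```java
                public class PartialProducts {
                    // Multiply two two-digit numbers the "new math" way:
                    // split each into tens and ones, form every cross product, then add.
                    static int multiply(int a, int b) {
                        int aTens = (a / 10) * 10, aOnes = a % 10;
                        int bTens = (b / 10) * 10, bOnes = b % 10;
                        // 37 * 14 -> 7*4 + 30*4 + 7*10 + 30*10 = 28 + 120 + 70 + 300 = 518
                        return aOnes * bOnes + aTens * bOnes + aOnes * bTens + aTens * bTens;
                    }

                    public static void main(String[] args) {
                        System.out.println(multiply(37, 14)); // 518
                    }
                }
                ```

                Every partial product is a single digit (or multiple of ten) times a single digit (or multiple of ten), so there is no carrying mid-stream; all the carrying waits for the final addition.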

                When you actually learn this, your brain will explode, and you will wonder why the hell they taught you to deconstruct the multiplication problem *halfway*, with complicated positional rules, instead of just deconstructing the *entire thing* to ‘a single digit (Or multiple of ten) times a single digit (Or multiple of ten)’ and then adding it back together…which you have to do anyway! I am actually having trouble *conceiving* of how stupid the method I learned was.Report

              • veronica d in reply to Mike Schilling says:


                So true story. One day I’m at a bar in Cambridge, and the cute bartender asks me what I’m reading. Okay, so it’s a book on non-linear optimization. I say that. Then I go into my normal stumbling, dumbed-down layperson explanation of optimization theory.

                She listens for a few seconds with a flat look and then says, “I go to Harvard. I’m working on my masters in evolutionary biology. I know what the optimization of a non-linear fitness function is.”

                I love working in Cambridge. Whatever game we were playing, she won.

                Months later I’m in a restaurant by my apartment, reading another book on non-linear optimization. (I have my favorite topics.) My cute gay waiter asks me what I’m reading. Same response. Same layperson’s bullshit explanation. He says, “Oh, like operations research.”

                I just nod and say, “Yep, like operations research.”

                He goes on, “I have a physics degree.”

                Anyway, I live in a world just filled to the brim with smart folks. It’s nice.Report

              • veronica d in reply to Michael Cain says:

                @michael-cain — +100

                You gotta know it. It’s like, you’re not going to remember every weird-ass trick in the “methods of integration” chapter, but you should remember that D[ln x] = 1/x. It’s just, you’ll be reading papers that expect you to know this. You don’t have time to go fire up Mathematica each line you read. And indeed, “operator algebras” (and so on) are a thing, and it’s nice to experience them “running direct in your wetware.”

                Expand your brain as much as you can, as far as you can. Then use machines.

                The machines are gonna be smarter than us someday, if we don’t destroy ourselves first. But we gotta get a lot smarter before we can make them smarter.Report

    • Jaybird in reply to Chip Daniels says:

      I suspect that 15 minutes after everybody learns that you need a philosophy degree to get ahead in business, we’ll find out that philosophy degrees have ceased to be good indicators of whether a person will get ahead in business.Report

    • Brandon Berg in reply to Chip Daniels says:

      The study in the first link suggests that you should study science. Actually, more than that, it suggests that in terms of general skill acquisition, at least of the type measured by the Collegiate Learning Assessment, it doesn’t really matter all that much. The estimated difference between the highest and lowest value-added fields of study was 0.16 standard deviations, which is roughly equivalent to the improvement you can expect from taking the SAT a second time. This is not exactly nothing, but it’s pretty weak evidence for the claims being made.Report

      • j r in reply to Brandon Berg says:

        No fair. You’re not supposed to actually read these studies. They’re only there to lend their truthiness to the #content.

        By the way, I don’t know what the sample looks like or how the coding was done in the study, but one thing to keep in mind is that many top-tier schools don’t offer degrees in “Business” or “Business Administration.” So, the population of business students could be heavily weighted towards lower-ranked schools, which would contribute to the lower scores of that group.Report

  6. Jaybird says:

    Do you like work stories about getting fired from a summer internship written by millennials?

    Sure. We all do.Report

    • Mike Schilling in reply to Jaybird says:

      I’m hosting an intern this year, and it’s been explained to me quite clearly that the purpose of the program is to identify talented students and get a head start on turning them into full-time employees. Firing a bunch of them because they dared to challenge me would be a huge triumph of personal ego over supporting company goals. The fact that the columnist said firing them was a good thing, and that almost all the commenters agree, makes me realize once again how many businesspeople have their heads way up their asses.Report

      • trizzlor in reply to Mike Schilling says:

        Wow. It’s weird to see pretty much everyone agree that the firing was over the top but still side with management because “they make the rules”. The faster this attitude of “interns are a guest in our corporate home” dies, the better for everyone. Oh, and seeing a hundred comments like these makes me want to start sniffing glue again:
        “In general, our interns are a net negative on the organization – far more work goes into them than goes out. But – if we can hire the good ones, it’s worth it.”Report

        • Kolohe in reply to trizzlor says:

          That’s not entirely wrong though? Talent scouting and training pipelines are resource intensive, and newbies rarely generate more ‘value’ than they cause resources to be expended. That was even in the discussion the other day about how does a newly minted lawyer at Big Law possibly ‘earn’ their neat 200k salary because they don’t really know jack squat yet.

          But if you keep in mind that you are planting seed corn for the future, you see it’s necessary.Report

          • trizzlor in reply to Kolohe says:

            Well, either it’s a net negative or it’s not. What gets under my skin is when someone in a mutually beneficial relationship acts like they’re performing a big favor so they can justify shitty treatment.Report

            • Kolohe in reply to trizzlor says:

              I really think it’s just that they’re inartfully using ‘net’; either that, or using it over the time frame of “this summer”Report

              • trizzlor in reply to Kolohe says:

                You’re probably right. But the tone I got from the whole thread was that management sees internship programs as this big charity giveaway that the company does to the community, and interns need to thank their lucky stars for every day they’re allowed to come in and work (fetch coffee for minimum wage). It gave me flashbacks to those bankster demand letters we used to see in the Times: “YOU need US to keep the economy from collapsing, so hand over the bailout and fuck off”. I’m guessing one’s take on this whole thing is going to come with a lot of baggage.Report

              • Oscar Gordon in reply to trizzlor says:

                As the post said, firing the whole cohort is extreme, so it is entirely possible this was merely the final straw. I mentioned this to my wife, and she noted that Millennials are raised to speak up, be heard, and question things, but they aren’t always taught how to do so ‘respectfully’ once you leave the protective embrace of school.Report

              • trizzlor in reply to Oscar Gordon says:

                I’m guessing there’s some tone misunderstanding going on too. Some people think “look I made a power point presentation with multiple bullet points on why I’m right” demonstrates professionalism and respect; others think it’s a direct challenge to their authority.Report

              • Oscar Gordon in reply to trizzlor says:

                Quite possibly. Especially if all the managers had already had this discussion with the individual employees and had already said no.

                The other thing is that the interns said that if they’d known about the vet with the missing leg, they would have factored that in, but the reality is that A) the vet’s dispensations, whatever they are, are none of their damn business, and B) the company could get in trouble for discussing those dispensations with people who don’t need to know about them.

                I still think that, if nothing else was going on, they could have used this as an educational opportunity for the interns. Lost opportunities and all that.Report

              • Kolohe in reply to trizzlor says:

                I’ve been on both sides of the ‘intern’ game (the Navy calls them ‘Midshipmen Cruises’); I’ve also been on one side or the other of them in the civilian government & government contracting world.

                The worst ones are obviously when someone has the bright idea to get a lot of free or greatly discounted priced labor.

                But almost equally as bad is when someone at the grand poobah level says “hey, let’s get some interns as an investment in the future” (or just does it because everyone else is doing it, or because they’ve always done it). But then the grand poobah level just dumps all the responsibility on the first-line leadership, giving that leadership a warm body and no guidance on how to best utilize that person. At best, you have the intern still doing pointless, & sometimes even necessary, scut work. More often than not, though, the intern is told to just be quiet and stay out of the way.

                I’m philosophically opposed to unpaid internships, but I am a fan (in both the blue collar and white collar worlds) of paying someone, maybe even just minimum wage, to get a genuine apprenticeship.Report

          • Mike Schilling in reply to Kolohe says:

            how does a newly minted lawyer at Big Law possibly ‘earn’ their neat 200k salary because they don’t really know jack squat yet.

            The one exception to the general rule that new employees aren’t worth their salaries at first is when their time is billed to a customer. I’d think this covers associates at Big Law.Report

      • Well, now wait a minute here, @mike-schilling and @trizzlor . Some companies have dress codes for reasons: meeting client expectations, fulfilling professional obligations. We don’t know what kind of company this was. If a law firm imposed a dress code on its interns and disciplined them for rebelling against it, would you take the law firm’s side or the interns’ side?

        Other companies don’t have a direct, rational reason to use a dress code but subscribe to the notion that professional dress buttresses professional attitudes and therefore promotes productivity. Which may be right, or may be wrong, but it’s within the range of decisions that companies are allowed to make for themselves without outsiders intervening and telling them what to do whether through the law or otherwise.

        And yes, some companies have stern and rigid hierarchies and managers with a strong hunger for affirmations of their self-importance, and casual versus formal dress isn’t going to do a whole lot about that.

        Going immediately to this last explanation and branding it the key to understanding these events is taking a few leaps of faith about both the company and the people involved. All we actually know about the company from TFA is that it is big enough to need multiple interns in an office setting, and that someone had implemented a dress code.

        As for the role of the interns themselves, internships are, to my knowledge, principally about identifying and recruiting good prospects for future employment. Who’s going to be a good fit for the company — in terms of talent and ability, yes, but also in terms of personality, work ethic, and congruence with the corporate culture. Here, you have interns organizing a petition about the dress code. This says that these people perceive themselves as a) more valuable than the dress code, and b) unwilling to state their cases for change within whatever structures are provided for within the prevailing hierarchy. Both of these are signals that these people are not going to be comfortable in the prevailing business culture there.

        Think what you will of the prevailing culture at this company — although IMO we lack information sufficient to be normative about that. Even if the prevailing culture is downright dysfunctional, these interns aren’t fitting in well with it.

        If these young people are really that smart and talented and professional and capable, they’ll bounce back from this and become the Young Turks of their industry and disrupt the ossified starched-collar attitudes of the stodgy dinosaur that they all interned at. Or, maybe, they’ll absorb the lesson that sometimes you have to suck it up and wear uncomfortable shoes to work, among myriad other things that you would prefer not to do.Report

        • If a law firm imposed a dress code on its interns and disciplined them for rebelling against it, would you take the law firm’s side or the interns’ side?

          The intern’s side, the minute the other side used the word “rebelling”.Report

          • Kolohe in reply to Mike Schilling says:

            In The Force Awakens, Finn had a problem with following the Nu Storm Trooper dress code (his helmet), got called out on it, and the next thing you know…Report

            • DavidTC in reply to Kolohe says:

              Finn was a bad explainer there. Surely Storm Troopers aren’t supposed to walk around with bloody or damaged helmets!

              If so, either he was supposed to not be helmeted until it was cleaned, or the quartermaster should have given him a replacement. Did he turn it in?

              The way he explained it, it came off like he just didn’t *want* to wear the helmet while it was messed up. Instead of ‘I couldn’t see out of it, it needed cleaning’.

              Here’s the weird thing: It seems entirely possible the Storm Troopers are supposed to wear their helmet all the time, and never take it off, so he really *wasn’t* supposed to be walking around without it…except that another Storm Trooper recognizes him later, so clearly they *do* walk around without them sometimes. (Maybe only in the showers or something?)Report

      • DavidTC in reply to Mike Schilling says:

        The fact that the columnist said firing them was a good thing, and almost all the commenters agree makes he realize once again how many businesspeople have their heads way up their asses.

        I know this was you saying it instead of them, but I’ve actually met people who called themselves ‘businessmen’ or ‘businesswomen’ and, in my book, anyone who thinks their job is ‘businessperson’ instead of what they *actually* do is automatically assumed to have their head up their ass.

        Hell, ‘manager’ without any qualifications, at a large company, is silly enough. I mean, if you’re a manager at a local Walmart or a fast food place, I understand your job, that’s a fine thing to call yourself. But if you’re a manager at IBM…look, you could be doing *anything*. How about trying ‘a project manager of a software development team’, just for fun? Or ‘a manager in charge of overseas marketing’?

        If you’re a manager, that means instead of trying to accomplish something, you’re trying to lead a team to accomplish something. Fine. How about telling me what that something *is*? I mean, that’s the entire point of telling others your job, so you can establish some sort of reference of *what you are accomplishing for the place you work*.

        But at least manager, if somewhat unspecific, is an actual thing that happens at a business. People can actually ‘manage’. And sometimes we can guess, and sometimes things are confusing enough internally that what someone is manager of would make little sense to me. Fair enough.

        What people cannot do at a business is ‘business’. You cannot go into your work and business. You cannot actually be a damn ‘businessman’.

        Either they *literally* aren’t doing anything at the company they work at, or they think that being ‘businessy’ is more important than that job.Report

    • j r in reply to Jaybird says:

      It’s possible that the management of this company has their head up their ass. It’s also possible that in this particular situation, the interns were expendable and not on any track to permanent hire. In any event, firing a whole intern class for not respecting the authoritah is extreme. And I can’t help but wonder if this is a troll job, when I read something like this:

      The worst part is that just before the meeting ended, one of the managers told us that the worker who was allowed to disobey the dress code was a former soldier who lost her leg and was therefore given permission to wear whatever kind of shoes she could walk in. You can’t even tell, and if we had known about this we would have factored it into our argument.

      All that aside, the behavior described is real stupid. If you are an intern or new to a job, you show up early, dressed well with shoes shined, and you finish all your assignments early and without error. If there are rules that you think are stupid, you shut up about it until you’ve had some time to figure out why. After a bit of time, once you’ve established yourself as a competent worker and someone that others want to have around, then you can start questioning the status quo and testing which rules bend and which don’t.

      True or not, this story hits all my millennial stereotype buttons. In particular, it points to the two things that I find most troubling: the belief that the rules are always there for a good reason and a sort of blind trust that those in authority are your buddies and are always looking out for you. Of course, I don’t blame them so much. They were drawn that way.Report

      • Maribou in reply to j r says:

        @jr Were I writing the blog, I would probably have said something like,

        “It’s terrible that your manager fired all of you for something like this – particularly if it’s the first time you’ve pissed them off – and if they’re actually interested in hiring and keeping creative people under 35 on their workforce, good luck to them if they keep operating like this. Philosophically, I would love to say that anyone who thinks this is a fire-able, rather than a work-this-shit-out-able, offense doesn’t belong in a position of power, and that you’re better off without them anyway. It’d be nice to tell you to keep looking until you find the right job for you, one that values your intentions and your initiative and wants to shape them, rather than having a knee-jerk reaction to them. But I’m going to assume that you are more interested in being able to feed and shelter yourself than in finding the ideal employer (or going into business as an entrepreneur), so here’s my advice:”

        And then said something nearly word for word what you said in the paragraph after the word stupid.

        (In real life, I also question the whole troll thing. Mostly because of that “factor it into our argument” thing. But then again, have I met those kids? Yes, not many of them, but yes, yes I have.)

        As an aside, I find it odd/amusing how often I feel like you and I have the same pragmatic conclusions about what to *do* in much of life but very different ways of getting there. Maybe that’s just how I see it though.Report

      • trizzlor in reply to j r says:

        In particular, it points to the two things that I find most troubling: the belief that the rules are always there for a good reason and a sort of blind trust that those in authority are your buddies and are always looking out for you.

        Is this wrong? I mean, as Maribou said, if your only goal as an intern is to put food on your family then, yeah, keep your head down and eat your meat (or else you won’t get any pudding). But if your goal is to find a sustainable work environment where there is respect up and down the corporate hierarchy, then these seem like reasonable features to look for.

        Here’s a more extreme example. One summer in college I worked as a deck-hand on one of these shitty geriatric cruise ships that run up and down the east coast. Halfway through the season a new first mate took over and, as part of initiating some new rules, asked everyone who they preferred to work with for crew jobs. He then said that we would specifically never be assigned to work with those people because it would encourage socializing. The guy was a real tight ass ex-navy type and clearly picked this up somewhere as a management technique to establish that he’s not your buddy and he’s not here to make friends. For us, however, the message was “by the book and nothing more”, which meant if the engine room’s on fire and it’s break time, I’m taking my damn break. Basically, respect has to be earned, and that holds for management just as much as it does for the help. This askamanager forum seems to be read mostly by managers, and it’s too bad such a basic point still seems lost on many of them.Report

      • Mike Schilling in reply to j r says:

        True or not, this story hits all my millennial stereotype buttons.

        That was a recurring theme in the comments, too. I had no idea millennials were so hated. I suppose I’m prejudiced in favor of them through having two of my own.

        the belief that the rules are always there for a good reason and a sort of blind trust that those in authority are your buddies and are always looking out for you.

        The opposite here. They did not think there was a good reason for the rule, and they felt empowered to challenge it. Honestly, good for them.Report

        • I told my son this story without editorializing, so I’d get his honest response. Which was “Yeah, entitled millennials”. He’s 21.Report

          • KenB in reply to Mike Schilling says:

            Interesting. While some of the stories about millennials do seem to me to reflect a distinctive amount of entitlement, this isn’t one of them — this just seems like ordinary youthful naivete. I could easily imagine something similar happening 40-50 years ago. I think for it to be millennial entitlement, they would have had to have assumed that the rules just didn’t apply to them, not that the rules were illogical.Report

      • j r in reply to j r says:


        I would say it this way: I am an idealist, but one who understands that it takes a healthy amount of pragmatism to enact your ideals. The former without the latter doesn’t make you more virtuous, it just makes you clueless.

        @trizzlor and @mike-schilling

        Is this wrong?

        Depends on what you mean by wrong. It is certainly wrong in an empirical sense. It is an inability to distinguish between what you think the world is and what you think the world ought to be. The inability to make that distinction is likely going to set you up for a lifetime of failures and not the good kind of failures that come with meaningful lessons. I’m talking about the kind of failing that leaves you confused, because you just know that you did everything right but still didn’t get what you want. The Gen X condition was angst, because uncertainty abounded. The millennial condition is anxiety, because uncertainty was replaced with a false certainty.

        Is it wrong in some kind of moral or ethical sense? That is a tougher question, but the short answer is yes. Yes, because the ability to make is/ought distinctions is part of any functioning moral or ethical framework. The alternative is solipsism, which is what leads people to think that they can come in and reorganize a workplace after a couple of weeks for no other reason than the fact that it makes more sense to them.

        By the way @trizzlor, the clueless manager in your example is more like the clueless intern in this story. There was already a working system in place, complete with sensible exceptions. The interns are the newcomers trying to change the rules to reflect their own sense of “by the book.”Report

        • Mike Schilling in reply to j r says:

          It’s wrong in the sense that the interns assumed that what they’ve been taught about reason and critical thinking applies to the real world. But no worries; they’ll have that beaten out of them soon enough.Report

          • j r in reply to Mike Schilling says:


            We agree. With the key difference being that you see the location of the injustice at the point where the real world intervenes, while I see it happening earlier.Report

            • Mike Schilling in reply to j r says:

              I (seriously) wonder how many of the commenters on the Ask A Manager post are parents, given that they appear to have no sympathy for young people, and think “Because I said so” is an unanswerable argument.Report

            • Jaybird in reply to j r says:

              When I was a kid, we had birth-order personality types. Oldest kids had this set of traits, the babies had that set of traits, middle kids had this other set of traits, and there was a chapter dedicated to The Only Child.

              We are becoming a country without middle kids.

              We’re going to have to figure out a way to more efficiently distribute positional goods.Report

      • Kazzy in reply to j r says:


        But that isn’t how disrupting works!

        That might be my new least favorite buzz word.Report

    • Kolohe in reply to Jaybird says:

      “Dear Intern Forum. I can’t believe this really happened to me….”Report

    • Kolohe in reply to Jaybird says:

      It also occurs to me that if they have enough interns for them to form a de facto union, they have too many interns for the intern program to provide enough value for the interns themselves.Report

    • David Parsons in reply to Jaybird says:

      Those interns dodged a bullet. “christ, what an asshole” applies perfectly to their employer (as well as the askamanager person.)Report

      • Marchmaine in reply to David Parsons says:

        Did they? There’s a key piece of info we don’t have: what company?

      I know when I read it, I assumed some semi-shitty white collar job at Acme behemoth company.

      What if it was Apple? (they would never do that) What about Oracle? (they might do that), or Goldman Sachs? (they’d probably do that).

        If you were a 20-yr-old who lost your internship at Goldman Sachs over comfy shoes… Goldman Sachs isn’t on the short end of that exchange. If on the other hand this was a management training internship for some giant retail mall company – well, as a 20-yr-old you’re still on the wrong end of the exchange, but you will probably recover from it.Report

    • DavidTC in reply to Jaybird says:

      The comments on that article are amazing.

      The sheer amount of quasi-fascist support is just…wow.

      Yes, what they did was somewhat stupid. If the boss says no, repeatedly, take the hint. Do not attempt to organize some sort of petition, especially not via *interns*. Learn to take no for an answer. Also learn that if you want to stage a revolt in a company, you have to start with the people who actually make the place function, not interns!

      But a few points:

      1) Not knowing about the guy’s medical condition does not actually change their argument. If that guy can do his job while wearing those shoes, so can everyone. People with disabilities have to be accommodated, but if ‘wearing nice shoes’ were actually integral to the job…he wouldn’t be able to do his job, and would have to be given another job. So the conclusion is: *Nice shoes are not actually integral to the job*.

      Hypothetical example: Say I am a cashier, and we have a detachable scanner that can scan bar codes of things in the cart, but we’re not allowed to detach that scanner because it looks ‘unprofessional’, and instead have to lift items onto the belt…and yet, a pregnant cashier *is* allowed to detach it and scan heavy items while they’re in the cart. Yes, she’s pregnant and the company *must* make accommodations for her…but, uh, that does raise all sorts of questions as to why they think it’s important for *everyone else* to behave differently. Clearly, what she is doing *works*, so why can’t everyone do it?

      2) Companies requiring non-front-facing employees to wear dress clothing is, in fact, utterly stupid, and has been so for *decades*.

      At least a few comments point out that requiring expensive dress for jobs that do not interact with customers is really a form of classism. (In fact, career fields that assume unpaid internships at the start are rather classist *by themselves*, and the clothing just extends the problem. The wealthy are *much* better able to live for a while without income, and buy expensive things during it, than anyone else. Obviously.)

      3) Employees organizing to change corporate policy is, in fact, protected by the NLRA. Whether this covers unpaid interns, I don’t know, but it’s amazing how many people seem to think that workers do not have a *right* to do this.

      If the *paid* staff of that office had written a petition like that, and presented it, it quite possibly would have been illegal to fire them for it, as that would likely count as ‘organizing’ under the NLRA.

      Of course, as interns (at least unpaid ones) are literally not supposed to be doing necessary work (That would be a violation of min wage), whether or not they can organize and ultimately strike is a rather idiotic question. Even if the company cannot fire them…the company can just let them organize and go on strike. The company doesn’t *need* their work, so who the hell cares if they’re on strike?!

      At least there *does* seem to be a lot of pushback about blaming this on ‘millennials’.Report

    • Jaybird in reply to Jaybird says:

      Huh. We’ve got a chunk of people saying “this is an obvious troll, intended to confirm the priors of the already institutionalized”, and we’ve got a chunk of people saying “man, this is evidence of how rotten our institutions are and look at all of these people taking the sides of the corporations!”

      My takeaway more involved that the snake person in question informally asked if the policy could be changed and was told no, watched several of his or her peers informally ask if the policy could be changed and be told no, and then started a petition as part of a formal request to change the policy. (A formal request that turned out to contain a lot of tone-deafness.)

      I guess this comes down to a disagreement of what interns are for.

      I’ve seen companies that use internships to see who is a good “fit” for the corporate culture. An eventual team member who can fit in and run with everybody in service to a common corporate goal.

      I’ve seen companies that use internships as de facto additional probationary periods. Maybe six months isn’t enough to know if someone should be let past the no-paperwork firing period into the part of the job that requires HR to be involved if you want to let someone go.

      I’ve seen companies use internships as some sort of thing that the CEO read about on a plane flying from this meeting to that one and he said “we should have an intern program” to a VP who then talked to a site manager who then talked to a program manager who then talked to a team lead about getting an intern whose job it was to… something. Not get spun up on the project because we can’t afford to lose bandwidth. Maybe we can have them sit at the table during peer review meetings and staple things?

      It does seem to me that these young former interns were very much done a disservice by many different parties… the corporation that fired them was only one of them.

      But, perhaps, as others pointed out above, these young kids dodged a bullet and some other enterprising corporation out there would be lucky to have them now.

      Assuming that the letter isn’t a troll, of course.Report

      • Jaybird in reply to Jaybird says:

        “The proposal was written professionally like examples I have learned about in school, and our arguments were thought out and well-reasoned.”

        This was my favorite part of the troll/cri de coeur, for what it’s worth.Report

    • Tod Kelly in reply to Jaybird says:

      Looking at these threads, and the threads from the article, it strikes me that another way the Internet is really weird is that people just assume any old thing someone has taken the time to bitch about in their work life is objectively true, and treat it thus.

      My first thought when reading that letter was, “I will bet you $20 that if that person really was fired, it wasn’t for pitching a dress code to her boss.” Most people have a skewed idea of why they’ve been terminated, and that idea gets skewed even more when they tell the world about it. If you’re really young? Multiply all of that by about 100.Report

      • Jaybird in reply to Tod Kelly says:

        It’s true that you always have to look at any “this happened to me” story as a story that came through a filter, but it’s the stuff that is relatively filter-proof that helps provide a skeleton and that skeleton is something that probably happened, assuming the person isn’t lying.

        “I got fired because my boss is a jerkface” tells you nothing at all about the boss. Maybe the boss is a jerkface. Maybe the person got fired for being a jerkface him- or herself (the old rule applies here: if you meet one or two jerkfaces, you’ve met one or two jerkfaces, but if everybody you meet is a jerkface, then you’re the jerkface).

        But the skeleton of that story is “I got fired”. We can probably assume that that part is true.

        Same with the story here. Was the person fired for the powerpoint presentation? We have no idea. But, assuming the person isn’t totally trolling us, I think that it is the case that this person and this person’s co-interns did get fired following a petition incident. The main question is whether we’re in “this person got uppity with their boss after being told no” territory or whether we’re in “this person did everything wrong, every day, and then, to top it all off, spent more time at work writing a powerpoint presentation about their dang footwear than they spent stapling the packets that their boss gave them”.

        We just don’t know. It could honestly be either or somewhere on the continuum in between.

        But if they got fired for gross incompetence and were given the impression that they were fired for a tone deaf powerpoint presentation, then, yet again, they were treated quite improperly by people who should have done a better job of training them for their future employment.Report

      • Dave in reply to Tod Kelly says:

        This, this, this, this, this and this.

        “Never believe everything you read on the Internet” – Abraham LincolnReport

      • Oscar Gordon in reply to Tod Kelly says:

        Even the “Manager” that responded was dubious:

        The fact that they did fire all of you for it makes me wonder if there were other issues too and this pushed them over the edge. Were you getting good feedback before this, or had you noticed your manager trying to rein you in on other things? If there were other issues, I can more easily understand them just throwing up their hands and being finished with the whole thing.


    • Mike Schilling in reply to Jaybird says:

      I added this comment:

      This is a valuable lesson for the interns. Some employers are petty dictators and should be avoided like the plague.

      It got censored, with a general explanation that “this piece has attracted some rude comments”.


  7. DavidTC says:

    American Muslims may actually be more likely to support gay rights than Evangelical Christians

    …has this particularly idiotic claim: It is most obviously true that even to the extent that Christian social conservatism has been hostile to acceptance of gays and lesbians, it has certainly not risen to the horrifying levels of Sunday’s attack by Omar Matteen, which he dedicated to the Islamic State.

    I feel it would be exceptionally depressing to attempt to tally this up, but I think the *assumption* that fewer than 49 homosexuals have been killed in hate crimes by ‘Christian social conservatives’ is a rather large assumption. Wikipedia has a few lists of LGBT murder victims of likely hate crimes, but does not actually list who did them or their religious beliefs. It’s *way* more people than 49. (And interesting how *trans* people are left out of ‘acceptance of gays and lesbians’.)

    And, I don’t know, but I feel that ‘shooting’ gay people is, perhaps, not the most horrifying thing to do to them, either. Gay men and trans women have a rather horrifying history of being beaten into a coma and later dying instead of just being quickly killed, and people also seem rather inclined to rape lesbians and trans men and trans women, sometimes killing them, sometimes not.Report

    • El Muneco in reply to DavidTC says:

      It’s also interesting that some people out there think “some people, possibly even one person, killing people for X” forgives literally everything short of killing people.Report