There is More to the Gun Debate than Gun Culture and Stats

Kazzy

One man. Two boys. Twelve kids.

45 Responses

  1. George Turner says:

    In one of the recent trivia threads there was a long article that showed Americans are the weirdest of the weird in their psychological makeup. Regarding death (or animals), they were the slowest to mature of any children in the world. Other kids apparently understand death, or that animals aren’t little furry people, at much younger ages than American kids. Apparently an intellectual diet of Sesame Street and Roadrunner and Coyote does not convey key points about the nature of existence.

    • LeeEsq in reply to George Turner says:

      George, they have Sesame Street in other countries too; it’s a global brand. South Korea and Japan don’t have Sesame Street, but they have some rather similar programming. Japan also has a lot of cute furry mascots in its children’s entertainment. Can you say Pokemon and Digimon? I know you can. Somehow I doubt that Sesame Street has that serious an effect on American children. If you are going to say things like this, then you need to provide evidence.

      There is probably evidence that kids growing up in rougher environments, where life is closer to the bone, have a better sense of death than children growing up in safer places. The former is nothing to be celebrated; it is to be fought against at all costs.

    • George Turner in reply to George Turner says:

      I don’t think being disconnected from reality is something to strive for. However, here is an article discussing the shocking finding that Americans are horribly unrepresentative of people worldwide, a finding that has thrown a lot of psychology and economics into turmoil by undermining many of academia’s basic assumptions.

      • Kazzy in reply to George Turner says:

        It is true that there are cultural and contextual elements to consider. But not all studies are based on contemporary American society. Piaget was Swiss, Vygotsky was Russian, and Bowlby was British; Erikson and Kohlberg published their seminal works more than 60 years ago. So while you’re right that the prevailing theories are not necessarily universal, they are also not based exclusively on a few generations of American kids raised on Sesame Street.

      • George Turner in reply to George Turner says:

        Of course not all kids learn the same lessons from Sesame Street.

        Frog – a pellet gun, .22 short, or a trusty gig.
        Pig – anything from .223 to .30-06
        Giant yellow bird – 12-gauge with 00 or a deer caliber
        Grouchy monster in your trash can – a .22 LR or a 410 with bird shot
        Monster in your cookie cupboard – rodent traps
        Numerically obsessed vampire – a .38 silver tip or a wooden stake
        Snuffleupagus – a .375 H&H Magnum to a .600 Action Express

  2. Brandon Berg says:

    I have distinct memories from before the age of five of being very distraught over the idea of dying, specifically because of its permanence.

    I’m not saying you’re wrong, necessarily, but I also find it hard to believe that I was that far ahead of the curve.

    • Stillwater in reply to Brandon Berg says:

      Me too, bro. I remember having a major Existential Crisis caused by the realization of death and impermanence when I was only four or five. From his perch in the afterlife, I’m sure Sartre was very impressed.

      • Kazzy in reply to Stillwater says:

        BB and Still,

        Whenever we are looking at developmental milestones, there is always going to be a range, and sometimes a large one. I’m not a strict stage theorist and believe that individual experiences can hasten or slow development within a certain range. It wouldn’t shock me if there were 4- or 5-year-olds who have some sense of the permanence of death. But it is not the norm. And there is lots of research to back this up.

        I’m tempted to guess (though can’t say definitively) that part of the reason you felt such intense feelings around this is precisely that you came to this understanding early, perhaps before other key understandings that typically develop in conjunction had formed, thus leaving you in a state of greater disequilibrium. There is also a difference between having an academic understanding that death is permanent and fully appreciating that a small movement of one’s finger can result in the irreversible death of someone across the room (or, hell, across a football field). Cause-and-effect, intentionality… these and other things go into how well equipped an individual is to properly and safely wield a deadly weapon. Much as most of us would agree that people suffering from mental illness, developmental disabilities, or other conditions that rob them of certain faculties should not have access to weapons, we must also consider that young children (for entirely different reasons… childhood ain’t a mental illness, as crazy as kids may seem) have limited faculties.

        The cutoff might not be 10. Maybe it’s 7. Or 8. Again, no matter what number you pick, you are going to find kids who surpass the expectation and those who fail to meet it. I’d trust developmental psychologists to use evidence-based studies to come up with an appropriate recommendation.

        • Brandon Berg in reply to Kazzy says:

          I wonder if religion is a factor. After all, the impermanence of death is a key tenet of the predominant religion in the US.

          • Kazzy in reply to Brandon Berg says:

            I saw an abstract of a study done in Israel that indicated as much, but it seemed to be saying that it was more about the conversations and interaction with death than the faith itself, though obviously there is a lot of interplay between the two.

          • George Turner in reply to Brandon Berg says:

            Some of it is cultural. For about a century, the French, Germans, English, and others who were taken with the Greek classics thought they could get Greece to field a super-effective army of steely-eyed death beasts like it did in the days of ancient Sparta. They gave up and said Greeks take death too seriously, lacking the stoicism to stand in a pile of their friends’ bodies and focus on the task at hand instead of starting a big elaborate funeral procession with people beating themselves in the head with boards and such. It was thought that being Orthodox might have something to do with it, but the Russians don’t seem to have an issue with throwing away lives like burnt matchsticks.

            There have been many cross-cultural comparisons made by military folks over the years, and some yield fruitful insights. One I recall involved problems getting some particular tribesmen to fight with European weapons in a line of battle, and the officer noted that while Western recruits had an irrational belief that everyone else in their ranks might get killed but they wouldn’t, the warriors from the indigenous group he was training all figured that everyone else might survive but were convinced that they themselves would be killed with the first shot.

    • George Turner in reply to Brandon Berg says:

      Yeah, but we weren’t part of the later generation that wears diapers to elementary school because they’re still awaiting potty training, another area where US children lead the world.

    • Caleb in reply to Brandon Berg says:

      Add me to the list of those who had a crystal-clear idea of the permanency of death from a very early age. I, too, am skeptical that I was at any point more than mediocre in my developmental progress.

      I am skeptical of the “research” on this subject because the vast majority of the data input from these studies seems to be the verbal expressions of the children surveyed. I’m not a developmental psychologist, but I think that the ability of children to accurately express their cognitive progress is probably as uncertain as (if not more so than) their understanding of basic biological fact. In other words, I don’t think those studies sufficiently control for the differential of expressive ability that children possess vis-a-vis their understanding of death. Academic understanding is not needed for the frailty of life to be appreciated.

      • Kazzy in reply to Caleb says:

        “I’m not a developmental psychologist…”

        Clearly.

        • Caleb in reply to Kazzy says:

          A well-argued rebuttal. I bow to your superior knowledge.

          • Kazzy in reply to Caleb says:

            At the risk of falling victim to one logical fallacy or another, I’m going to trust the thousands of experts and decades of research over a single anecdote.

            You’re welcome to your own opinions. You’re welcome to your own experiences. You’re not welcome to your own facts or science.

            The science isn’t perfect. I note that there is a range and that not all children fit neatly into it. But we shouldn’t ignore good, sound science because it disagrees with our politics.

            • Kazzy in reply to Kazzy says:

              I also find it interesting that deferring to superior knowledge is done sarcastically. Generally speaking, it seems logical to give a certain precedence to those who are more informed on the subject.

              • Caleb in reply to Kazzy says:

                Deference to superior knowledge does not extend to abandoning critical thought. You glibly dismissed my objections as the product of an inferior. You could easily have pointed me in the right direction if you know of good sources that address my concerns. Superior knowledge only requires deference if it can adequately address critical questions.

            • Caleb in reply to Kazzy says:

              It’s the science I question, not the political implications. (Although I could easily argue that point too.) The data measured (verbal expressions, behavior, etc.) are essentially proxies for what we actually want to know (internal cognition). I’m merely questioning the accuracy of the proxies in revealing what we actually want to know.

              With adults, we assume that behavior and verbal expressions are fairly accurate proxies for internal cognition. We also assume that children do not have a fully developed ability to understand the permanency of death. This is a rational assumption. But doesn’t it follow that we should question the accuracy of the proxies, since they are in development as well?

              • Kimsie in reply to Caleb says:

                With adults: “behavior and verbal expressions are fairly accurate proxies for internal cognition” — within certain parameters.

                We are able to tell, by how interested a young child is, whether they recognize that an object that was previously seen, then hidden, and then seen again is the same object they saw the first time.

                That may not be as instinctive as the “orient” response, but it’s pretty close.

                I can grok how “discussing death” is harder than “is that the same object,” but then we’re specifically discussing psychology’s ability to determine conceptual understanding based on “lower-level bits and pieces.”

              • Kazzy in reply to Caleb says:

                Your initial comment seemed rather glib, putting “research” in scare quotes. But let’s put that behind us and move forward. You are right that we ought to consider the way in which this information is gleaned from children. As an educator, I give my children ample opportunities and various modes of expression in order to assess understanding. They may not be able to verbalize, but maybe they can draw. Or maybe they can’t draw, but they can represent with manipulatives. Etc.

                I don’t know the specific methodologies, but I do know that these theories have arisen from the work of various people and have been held for several decades now. If they were invalid, I’d have expected there to be evidence of such. I have not seen that. Which doesn’t mean it doesn’t exist. Critical review is fair, but when something is well-established and accepted within the scientific community, I tend to accept it absent evidence to the contrary, which I haven’t seen beyond the anecdotal (which the theory allows for, mind you).

                Also, as I note down below, my views tend towards the pro-gun side of the aisle. So this isn’t part of a broader political ideology or an attempt to kill gun rights by a thousand tiny cuts. I think that people have a right to guns. But I think it important that they demonstrate the necessary faculties to use them safely and responsibly. I think that children, for a variety of reasons, lack this.

              • Caleb in reply to Kazzy says:

                I admit I am relatively skeptical of soft-science research as opposed to the hard sciences. The fact that people are the subjects means that there are innumerable uncontrollable variables at play, and that true repeatability is impossible. Not that I think the research is valueless. We have learned much from the soft sciences, and will continue to learn more. But that knowledge is qualified, and probably will be for the foreseeable future. What I object to is placing soft-science research on the same level as the hard sciences.

                I appreciate that the theory accounts for a range of understanding at a given age. I also acknowledge that developmental basics must be acquired before understanding of death develops. However, there are more factors in play other than developmental biology. From the research I’ve read, we have yet to tease out what influence those factors have at a given developmental stage. My objection to the research is that it tends to blend all potential factors together and output a single metric based solely on age. (With a rather broad allowance for differences.) At a certain point, the range becomes so broad that it is meaningless in application for any particular case. This goes doubly for areas such as psychology, where any given uncontrolled-for individual factor can easily throw the subject well outside 2-sigma.

              • Kazzy in reply to Caleb says:

                My answer to the “nature versus nurture” question is always “Yes.” Both matter. Which is why I’m not a strict adherent of stage theory. A child’s experiences can accelerate or decelerate development to a degree, sometimes a large one. Hell, if I didn’t believe this, it’d be hard to justify getting paid to do what I do. What value would a teacher be if a kid is just going to move through developmental stages independent of his environment?

                I might be wrong on exactly what the data says about children and guns. But there is data to be looked at. We must consider fine motor development and control, impulse control, agency, understanding of cause-and-effect, understanding of death, and many other factors, which do not develop congruently. A child might have an acute understanding of the permanence of death but not recognize that there can be outcomes for actions beyond what he desires. “I didn’t mean to kill him, so he can’t be dead.” Stuff like that.

                I shouldn’t be the one drawing up policy, but I do think we should involve developmental psychologists, among others, in the conversation. As the title says, gun culture and gun stats aren’t the end-all, be-all.

              • Kimsie in reply to Caleb says:

                Okay, a good deal of psychology is in the “reasonably hard” variety of science. I could start talking about Mexican hats here (Visual Cortex), or I could start talking about how we see faces (first with a low-pass filter).

                There is a LOT of basic understanding that we have pretty much down pat.

                That’s not to say that “understanding death” is in that category — but I really don’t want people to walk away with the idea that 90% of psychology’s findings are likely to be wrong. Basic research is basic research, and we’re pretty good at that.

  3. Jason M. says:

    Why restrict your concern to comprehending mortality? Those little prefrontal cortices (thanks Google) need more development in general.

  4. Mad Rocket Scientist says:

    No matter how careful or responsible the adults around him might be, no 5-year-old should be allowed to handle a deadly weapon absent close, careful adult supervision.

    Fixed that for ya!

    In all seriousness, the parents mentioned failed in so many ways. I can understand wanting to let their little boy have his rifle, but removing the bolt on a Cricket takes 10 seconds & leaves the gun inert & guarantees it is not loaded.

    Also, while I did not have the advanced understanding of mortality that my betters up-thread did, I could understand that a knife, or a bow, or a gun could cause big boo-boos and that I should exercise care with them. I never owned a gun as a child, but I did have a knife & a bow, and while the potential lethality of either was pretty low in my young hands, it was not zero.

    Still, my father drilled knife safety into my head before he let me keep my first Buck pocket knife, and while my bow & arrows were in my room, he kept the string in his.

    • Kazzy in reply to Mad Rocket Scientist says:

      The problem is, I’m not even sure close, careful adult supervision would be sufficient with a 5-year-old and a loaded, primed gun. If the weapon is inert, it is no longer a deadly weapon and the context changes. If it is a pellet gun or other non-lethal device, the calculus changes.

      And while a bow and a knife are both deadly weapons, there is a great difference between them and a gun. A gun requires minimal input but can achieve maximal output. A simple squeeze of the finger can kill someone. Not so with a knife or a bow.

      We shouldn’t allow young children, who have limited impulse control, an incomplete understanding of cause and effect, and a developing conceptualization of death, to handle weapons that can so easily kill or maim.

      • Mad Rocket Scientist in reply to Kazzy says:

        I think it really depends on the child, & the supervising adult. However, as a general guideline, I think you are correct: 5-year-olds should not be allowed to handle firearms. I had a BB gun at 6, and learned a lot with it, so that when I shot my first rifle at Scout camp when I was 10, I did very well. Yet I had friends who were shooting rifles at 5-6 years old. Their parents didn’t let them keep the rifles, but they could shoot them at the range, with close supervision, and no one was hurt.

        • Kazzy in reply to Mad Rocket Scientist says:

          There are certainly going to be kids who can handle a gun safely before whatever cutoff we might come up with, just as there are kids who can drive safely or drink responsibly or vote intelligently before those cutoffs. But, as a general rule, I think it wise to lean on developmental psychology when it comes to determining what is and is not appropriate for children. As the title of the post says, this isn’t just about what gun culture dictates nor what the stats say. We have decades of good, sound research on what children, in general, should be reasonably expected to do. Guns are not so unique that we should ignore this.

          We don’t teach 5-year-olds calculus equations, and with good reason. If the science says we shouldn’t give them guns, I think we ought to listen.

          • Kazzy in reply to Kazzy says:

            I should also make clear that I, personally, offer these recommendations apolitically. My feelings and views on guns are mixed. Generally speaking, I support gun rights and do not believe in limiting those for responsible users. I balance that with strong consequences for those who are unsafe and irresponsible. So my advocacy here is not based on some larger anti-gun or pro-gun control agenda.

            If the experts in the field looked at a broader range of child development and okayed gun use for young children, I would accept it. My personal understanding of young children (which comes from having a bachelor’s and master’s in early childhood education and 10+ years working in the field) and of the research tells me it should be advised against, but I’m not an expert.

            • Mad Rocket Scientist in reply to Kazzy says:

              Don’t worry, I don’t think this is political. 🙂

              I do agree that expert suggestions should be given considerable weight when deciding when a child is probably old enough to do X. I’m just making sure that we all remember they are suggestions, and parents should be able to make the determination on their own as to when a child is probably ready for X (while understanding that humans are imperfect, will get it wrong, and from time to time tragedy will happen).

              I’d hate for the 5-year-old math prodigy to be denied learning calculus because science says he can’t handle it yet.

              • Kazzy in reply to Mad Rocket Scientist says:

                Great point. And I agree.

                But guns are slightly different.

                Calculus hasn’t killed any 2-year-olds. Not yet, at least.

                There are actually a lot of things I give parents a wide range of latitude on. Hell, I’m sometimes criticized for giving them too much latitude! But when the latitude you give your child poses a real risk to my child, I think it is fair to ask for some enforceable lines.

              • Mad Rocket Scientist in reply to Kazzy says:

                Fair enough.

    • Mad Rocket Scientist in reply to Mad Rocket Scientist says:

      Thanks for fixing the tag.

  5. zic says:

    There are two developmental phases that are worth considering with any new right/responsibility given to a child/young adult, particularly when that right/responsibility is potentially lethal, including access to weapons and to vehicles:

    1) Abstract thinking, typically around age 11. Death, until you face it up close and personal, is an abstraction. My experience raising/working with children under age 11 is that they get the words, but not the meanings of things like ‘death’ etc. without first-hand experience of it. If you’d asked my children what it meant before they were 11 or so, they’d probably have told you it made me sad because I missed somebody.

    2) Impulse control. This is a broad one, but I’d peg reliable impulse control on development of the frontal cortex, somewhere around 21. Before that, there’s perhaps understanding of right/wrong and consequences, but not always the ability to restrain oneself even with that understanding, particularly under the influence of drugs, alcohol, or peers. That lack of impulse control may well be part of why people aged 18 to 21 are sought after as warriors.

    • Kimsie in reply to zic says:

      Article of the week:
      http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2396566/

      Adolescents seem to go through a phase of risk-taking behavior, but I don’t think that’s terribly connected with impulse control.

    • Kazzy in reply to zic says:

      I think part of the late development of impulse control is cultural. I studied under a professor in college who promoted the idea of a new developmental stage emerging, unique to Americans, called “emerging adulthood,” which was basically a prolonged adolescence extending into the mid-20s. I saw this as largely a cultural construct based in part on the way in which we delay adulthood through college. But that is really out of my range so, again, I’d defer to the experts in that field.

      But you touch on good points. All of that must be considered.

      • zic in reply to Kazzy says:

        Part may be cultural, but it’s also developmental; prefrontal cortex style.

        http://www.hhs.gov/opa/familylife/tech_assistance/etraining/adolescent_brain/Development/prefrontal_cortex/

        From the pages of this link, on abstract thinking:

        Developmental psychologists have measured and documented a jump in cognitive capabilities in early adolescence. Beginning around the age of 12, adolescents decrease their reliance on concrete, here-and-now thinking and begin to show the capacity for abstract thinking, visualization of potential outcomes, and a logistical understanding of cause and effect. Teens begin looking at situations and deciding whether it is safe, risky, or dangerous.1 These aspects of development correlate with the maturation of the frontal lobe, a shift from expanding neural connections to pruning and an increase in hormones released; all of which drive an adolescent’s mood and impulsive behavior. By age 15, studies show there is little difference in decision-making about hypothetical situations between adults and adolescents.2 Teens were found capable of reasoning about the possible harm or benefits of different courses of action. However, in the real world, adolescents still engaged in dangerous behaviors, despite understanding the risks involved. Both the role of emotions and the connection between feeling and thinking need to be considered when trying to understand the way teens make decisions.

        Researchers have termed this type of thinking “hot” cognition and “cold” cognition. “Hot” cognition is described as thinking under conditions of high arousal and intense emotion. Under these conditions, teens tend to make poorer decisions. Under “cold” cognition thinking, circumstances are less intense and teens can make better decisions. Then with the addition of all the complex feelings — such as fear of rejection, wanting to look “cool,” the excitement of the risk, or anxiety of being caught — make it even more difficult for teens to think through potential outcomes, understand consequences of their decisions, or even use common sense.3

        And on self-regulation (prefrontal cortex development continues to age 25! Wonder why this correlates with lower car insurance?):

        As adolescents progress on their journey toward adulthood, with a body that is almost mature, the self-regulatory parts of their brains are still evolving. An earlier onset of puberty increases the window of vulnerability for today’s teens, making them more susceptible to take risks that effect their health and development over a longer period of time.1

        Self-regulation is broadly described among psychologists as the management of emotions and motivation. It also involves directing and controlling behavior to meet the challenges of the environment and to work toward a conscious purpose. Self-regulation also encompasses affect regulation, which entails controlling the expression of intense emotions, impulse control, and delaying gratification.2

        Such behavioral control requires a higher level of cognitive and executive functions. These functions reside in the prefrontal cortex, which matures independent of puberty and is still evolving and developing well into an individual’s mid twenties. During this period of development, adolescents should not be over-protected, but allowed to make mistakes and learn from their experiences and practice self-regulation.

        • Kimsie in reply to zic says:

          It’s worth noting that the prefrontal cortex is used relatively sparingly in the geriatric population, isn’t it? I worry that we’re biasing our research by culturally imposed ideas of maturity…