Archive for category Brains and Behavior
Two weeks ago, shortly before the Roy Moore-Doug Jones face-off in Alabama, an exterminator came to my home to humanely dispatch from the premises a mother raccoon, which had taken up residence somewhere between the second-floor ceiling and the roof in early December. As I returned from a run, he and the homeowner were talking about the potential debacle of a Senate race that was underway, and the exterminator, who looked exactly like Bruce Campbell in his Evil Dead days only bigger, mentioned that he was married to someone who worked for the Denver Office of the District Attorney and had met Moore years ago, long before most people outside of Alabama and atheist blogs had heard of him. When this Denver lawyer, who was part of a group hosting Moore and others from Alabama at a conference, learned that Moore was not only a lawyer but a judge, she was apparently stunned, given his startling lack of knowledge of everything related to the law, or the bench, or reality.
Maybe it’s not a good idea to use Roy Moore as an example of anything other than a demented, theocratic shitbag. But he did at one time manage to get a law degree. That’s supposedly not the easiest thing in the world, even in the decerebrate South.
One of the fun paradoxes of getting a college education is discovering that it’s possible to earn a bachelor’s degree in a given scientific discipline while remaining largely ignorant of that discipline, even if you receive high grades at a reputable school. Read the rest of this entry »
Long ago, people believed mental illness was the result of demonic possession or other “supernatural” forces. Today, mental problems are typically described as resulting from imbalances in neurochemistry, even though there is no such thing as neurochemical balance.
I think it’s time to adopt a more progressive model, which includes exactly three psychiatric states (independent of drug use):
Read the rest of this entry »
Building on an observation I made yesterday: When people who are clearly mentally unbalanced are at least coherent enough to form political opinions, in any contest they observe between a candidate who goes about things comparatively quietly and one whose chief strategy is inexhaustible high-volume raving about Stuff That Needs Fixing, they invariably go for the shrieking demagogue. Read the rest of this entry »
I continue to be intrigued by people who self-identify as conservative in spite of having been supported by some combination of the government and other people’s charity for literally their entire lives. I ponder the underlying psychology and then conclude there’s actually nothing unusual about this seeming contradiction: If you can’t or won’t make your own way in life to an even marginal extent, it eases your internal conflict to symbolically align yourself with those who can and do.
I’m trying to come up with the liberal answer to this kind of person. Maybe a closet racist or closeted gay person who wishes to shed such biases because it’s “right” and who therefore superficially adopts anti-bigotry political stances?
You have to love it when someone burbles, “I’ve finally realized after all these years that I was doing X all wrong” immediately before making the very same bad life choice he or she just claimed to have put in the past. No, really, I do. The ability to simultaneously learn from a mistake and make that mistake repeatedly anyway is an exquisitely human thing. (I don’t like it when someone who happens to consider me a mortal adversary plays this game, but this is rare, even if it’s what inspired this post.)
In practical terms, wild animals with small brains alter their behavior in accordance with reward-punishment schemes much more readily than people do. If a lizard eats a plant that makes it sick, thanks to its limbic system and the intimate relationship between olfaction and memory, that lizard will never eat that type of plant again. A human, on the other hand, is inclined to engage its cerebrum, and concludes things like, “Well, maybe if I switch from vodka to beer, alcohol won’t be a problem” or “I’ll at least mix in some filtered cigarettes” or “As long as I limit myself to 10 tanning sessions a month, the gratuitous UVB rays really aren’t going to serve as a cancer risk.”
Ironically, we’re the only animals smart enough to be capable of completely pulling the wool over our own eyes. We readily conflate being aware of a problem with having solved it with no further effort.
Is this trait adaptive? It certainly helps reduce cognitive dissonance, which always provides psychological relief, but in general — no. It allows people to repeatedly engage in behaviors that cause them pain, and the fact that it exists to a more obvious degree in mentally unstable but otherwise fairly intelligent or even very intelligent people seems to suggest that it’s not a good approach to the world.
If you’ve spent any time reading blogs, you’ve seen at least a few of them that, rather than serving as a linear chronicle of events and ideas sharing a common theme in the writer’s life, center on a particular event, purpose or goal. In a typical runner’s blog, the writer describes his training and races and daily experiences, with the implicit idea that the blog will be regularly updated until the writer gets sick of it or quits running or no longer has the time. It’s essentially a journal, nothing more. But sometimes people start blogs with titles like “My Journey to the 2016 Olympic Trials,” or “Getting to the Bottom of Russian Doping” that have either a fixed temporal endpoint or a finite purpose or both.
With the latter type, you’ll often discover right away that the person behind the blog has a goal he or she stands no realistic chance of attaining, either because the goal is worthy but simply out of the blogger’s reach (for example, a 40-year-old man with a marathon best of 2:50 hoping to make the Olympic Trials standard of 2:19) or the entire scheme is founded on delusional thinking or ferociously corrupt logic (say, a runner who predicts that a Nepalese marathoner will set a world record this fall because of sherpas’ proven ability to perform yeoman physical feats in the presence of very little oxygen). In such cases, you have unearthed a genuine futility blog. Read the rest of this entry »
It’s often difficult to comprehend why people choose the mates they do. There’s the classic case of the attractive professional woman paired off with the layabout and perhaps abusive man, a situation that comes in a variety of flavors. There’s the quiet guy with the overbearing, endlessly carping wife. There are the women who seem determined to wind up with an active alcoholic or drug addict, and date not just one but a parade of such types. Why do people make the choices they do? I am not a psychology expert and have no interest in what those who are have to say, at least for purposes of this post. Instead I’ll do my best to explain my own patterns and how they have been both adaptive and maladaptive. Read the rest of this entry »
Today I was privy to a conversation between two men who appeared to be homeless (and if they weren’t, they dressed the part) in which each was dourly reassuring the other that the U.S. government was sitting on more oil reserves stateside than the rest of the world held combined, and better yet, that Uncle Sam’s grim scientists can actually manufacture sweet crude whenever they need to. The central idea here was that there is so much oil beneath our feet that if the government so desired, gas prices could drop to about a quarter a gallon and excess numbers of people could enjoy a much-improved standard of living — an egalitarian notion that the power brokers at the top of the heap simply could not abide. Bemused, I chalked this up from my position one Pearl Street bench over to, on balance, ignorance rather than paranoia. But then somber End Times talk took over (at which point one of these gentlemen may have been humoring the other) and I knew I had myself some conspiracy speculators. (Most conspiracy nuts don’t rise to the level of generating theories, so I use that term sparingly.)
A long time ago in a city far away, I volunteered for a spell at a facility servicing mostly homeless people with a well-honed taste for crack cocaine. At least half of them seemed to believe that President Clinton was withholding from the public a cure (not a vaccine) for AIDS because unleashing it would mean introducing more blacks into the American workforce, something that the power brokers at the top of the heap simply could not abide. At the time I chalked this idea up to drugs and understandable bitterness, but given the number of similar proposals I’ve heard since that time from perfectly sober street people, I’ve abandoned that stance.
I have to wonder, then: Is the high prevalence of conspiracy-based notions among street people one of the causes of homelessness, or is it more one of its consequences? Read the rest of this entry »
If you follow or tolerate me on Facebook, you’ve probably seen the titles of some of the articles I’ve written for a particular entity that clearly counts on SEO, not quality, for traffic. I don’t write the titles; I choose them from among a huge slag heap of computer-generated ones and then write stuff that more or less conforms to them. I was musing in an editor’s forum yesterday that I wouldn’t mind seeing the following titles: Read the rest of this entry »
When an august but certainly slipping publication like the New York Times can’t even keep some of the entries in its science blogs from looking like chaff from a bored bunch of A.P. biology students, there would seem to be little sense in hoping that some news outlets still favor dry fact over controversy.
Experienced science writers should know by now that when writing about experiments involving rat behavior or physiology with a proposed bearing on human function, the proper thing to do isn’t to draw — or pretend to draw — lurid and half-assed conclusions and form — or pretend to form — questionable theories just because they are counterintuitive and bound to rile people. In other words, don’t bury the most important element — the ubiquitous caveats about comparing rodent bodies and brains to those of people — under a bunch of bullshit, as has been done in this piece purporting to deny the well-established cause-and-effect relationship between physical exercise and mood. Read the rest of this entry »
Not exactly, but an article on Health.com from a year ago that was picked up by CNN highlights perfectly the media’s insistence on exaggerating the findings in medical studies beyond all reason when the potential to scare people or stir up controversy exists.
First, note the difference between the original headline and the one on CNN. The first is “Do Fatty Foods Act Like Cocaine in the Brain?” That’s technically a little foolish, since foods don’t penetrate the brain, but it’s close enough. CNN decided that headline wasn’t wowsy enough, so some creative clown there went with “Fatty foods may cause cocaine-like addiction,” which is not even close to the same thing. My mock post title is closer in meaning to CNN’s than CNN’s is to the one at Health.com.
Sometimes I wonder how conspiracy theorists got by in the pre-Internet age. This can be said of any group, large or small, without continual and untrammeled access to mass media (television or radio) 15 or 20 years ago and hankering to spread a unique or outlandish message. But conspiracy theorists are a special breed not only for the astounding reach of their bafflingly complex ideas, but also because of the sheer amount of material they produce. Their fingers are as tireless as their minds, and their writing often gives the impression that they can somehow operate on planet Earth (at least physically) while enjoying 40-hour days. And nights.
So it is fair to assume that before the Internet became a nexus for anyone in possession of a concept or belief to meet like-minded souls and propagate their claims free of charge, unfettered by limitations of snail-mail, the FCC, and prevailing sanity, cranks were able to distribute only a tiny fraction of their ideas to one another. I imagine that in the main they were forced to stick to theories that were already popular in the mainstream and either not particularly far-fetched (aspects of the JFK assassination) or so nutty as to have fallen into the dustbin of ridicule long ago (moon landings as hoaxes, itself an idea typically borne of its raving, bedwetting stepfather, flat-earthism). Read the rest of this entry »
Lize Brittin was a two-time Kinney (now Foot Locker) finalist at Fairview High School in Boulder in the mid-1980s, winning the Midwest Regional her senior year and winding up seventh at the national champs in San Diego two weeks later. Translated for the benefit of non-runners, that means that she was one of the best cross-country runners in the United States, and it makes her one of the few great runners associated with Boulder who’s actually from there; Melody Fairchild, who holds the U.S. record for the high-school two-mile and won Foot Locker Nationals twice, and Kelsey Lakowske, who was fifth at the Nationals in 2010, are the only other Boulder natives to make it to the extremely exclusive race in Balboa Park, which for many years featured just 32 athletes and now allows 40. She also set a women’s course record for the Pikes Peak Ascent and was ninth in the Bolder Boulder 10K, both world-class events — and both at age 16. But even before landing at Brigham Young University and then at the University of Colorado with a full scholarship, anorexia had begun to take its toll, and she wound up nearly dying from it in her twenties.
Lize is 44 now and thankfully as far back from the depths of her illness as anyone can be. She’s had some of her freelance writing published, but more importantly has also finished a memoir about her experiences, Training on Empty, that she’s looking to publish, and she has started a blog of that name. She’s visited area schools and been on local radio over the years in reaching out to younger runners about the topic, was recently on a Runners Round Table podcast, and has compiled (in my biased opinion) a memoir that goes well above and beyond the usual “I have issues” stuff that people have become almost hardened to. Importantly, she’s healthy today while acknowledging the toll her disease took on herself and her friends and family and balancing the running she’s able to do now with the loss of her best years to a life-threatening illness.
I recently chatted with Lize about these matters, exploring the questions below and others. Listen to the audio here (50 minutes, 48 MB download).
As a former competitive runner myself, a person who seeks therapeutic gain or release from the act of writing, and a voracious reader (and hence a critic), I have to ask — what in your view distinguishes your story, and in particular your memoir about your experiences, from others of its kind?
You raced at a very high level, did so without becoming a household name (as you would have been had the Internet existed in the 1980s), and have kept a low profile in the running world for a long time. You are an unassuming person, the diametric opposite of an ego-driven person or a name-dropper, but I imagine that there’s a part of you that shies away from reflecting a lot on your successes because what could have become a national- or perhaps world-class career in the sport disappeared before you could see it to fruition. Is there therefore a bittersweet flavor to the whole of your competitive memories?
In many ways you’re straight from a textbook: You grew up the youngest child in a household which, while characterized by heavy-duty alcohol abuse, was still very much a high-achievement environment; you were chubby as a youngster; and in general you were never comfortable with the idea that you were a valuable person who does things right. You articulate this in the text, but even if you didn’t, this mindset emanates from your words and from your present-day persona as well. On top of that you had a coach who would weigh you and tell you, surreal as it sounds, that you were a single pound overweight on the morning of a championship race. Since you’ve come to know a good many anorexics over the years thanks to treatment and being open about your travails in your recovery years, does it seem that there has to be a “perfect storm” of factors in order for most susceptible girls — or people — to actually develop anorexia?
Anorexia is different from other disorders that could be described as addictive in that there is, in athletes at least, often a “grace period” in which afflicted people actually perform better in spite of manifesting the disease behaviors before they start slipping. You’ve mentioned that Diane Israel and perhaps others had begun counseling you well before you bottomed out. When you were experiencing success in spite of clearly having gone down the road toward “full-blown” anorexia, did you ever have the idea that you might be costing yourself in the long term or was the success itself, combined with the power of the disease, simply too seductive to allow for any such thinking to make real inroads?
Can you describe reaching a point at which you understood that this was no longer about running, or feeling thin or fit or in control, or any of the other mental shells in which anorexics shield themselves, and was really about your own survival?
One way in which you can unquestionably serve as a valuable resource to untold numbers of runners is that you’ve continued doing it at a modest (for you) level and come to terms with how running shapes you and how you need to command its role in your life and not the reverse, and you make no bones about the fact that there are days on which you feel fat and that it’s very uncomfortable for you. A lot of anorexics seem to fade from the activity altogether for either physical reasons or because the idea of balance is not even in the equation, but you’re someone who’s occupied every conceivable position on the whole spectrum. What would you tell a young woman who clearly is nowhere near recovery but has accepted the problem and fears that she will never be able to run again?
When did you start writing Training on Empty? Did you see it as a book-length project from the very beginning?
Where are you in terms of publishing the book at this point?
File this one under “No shit, Sherlock”:
A new study of the relationship between alcohol intake and wheel-running in hamsters has found that exercise may provide an effective alternative for reducing alcohol intake in humans.
[S]aid Alan M. Rosenwasser, professor of psychology at the University of Maine, chronic alcohol abuse and circadian disruption become reciprocally destructive and result in negative effects on physical and emotional health.
“Dopamine is the primary chemical released within the brain in response to any type of reward, including exercise, drugs, food, and sex,” [study corresponding author J. David Glass] said. “For humans, exercise may be an effective, beneficial, and naturally rewarding substitute for any type of addiction.”
To be fair, all of this may actually be news to addiction researchers even though drinkers with a running problem themselves have been aware of these things for decades, and the research team did establish a common link–circadian rhythm regulation–offering insight into why, other than the variously triggered dopamine-reward system that was elucidated a long time ago, exercise helps keep people on the wagon. Still, the fact that scientists are just now catching on to the vital role exercise plays in managing chemical dependency underscores a greater clinical reality: All too often, exercise is never mentioned by psychiatrists treating all manner of substance-abuse problems and mood disorders, with the primary and typically sole intervention being a prescription drug with or without the suggestion to attend support groups.
That’s not only a pithy title but a dumb one; the quiz I’m going to link to is wildly unrevealing, little better than one you’d find on Facebook crafted by a 15-year-old and rife with misspellings, and of course the personality trait known as “sensitive”–as with all personality traits–originates in the brain.
Still, this post on CNN’s “Paging Dr. Gupta” blog is of interest from a neuroscience perspective. Research recently published in the medical journal Social Cognitive and Affective Neuroscience used functional magnetic resonance imaging (fMRI) to assess the brain activity of 18 subjects, and found that those with a tendency to be highly affected by events around them as well as drugs such as caffeine demonstrated higher levels of activity in regions of the brain associated with processing details pertaining to visual stimuli. (I’m betting that FBI Special Agent Dale Cooper of Twin Peaks fame would score off the charts here, as would the smarmy dude on the USA Network show Psych.)
The researchers claim that despite the ability of such individuals to take notice of minutiae that evade the less sensitive, this skill often fails to transfer into on-the-job productivity because these people often spend more time looking at details than cogitating about their immediate implications.
Do you have “sensory processing sensitivity” yourself? Take this quiz, and find out, or not.
This is not surprising. A Consumer Reports survey of over 1,500 Americans with clinical depression suggests that far more people embrace pills than embrace talk therapy, despite the fact that those who attended at least seven therapy sessions reported as much symptom relief as those who relied on drugs alone. Four in five respondents, in fact, replied that they would rather go the pharmacological route.
This is understandable, given that taking a pill is a lot less work and, in many cases, a lot cheaper than visiting a therapist. But this doesn’t take into account efficacy, and many people have spent years trying to find an SSRI or other drug that produces the desired effects.
Of course, this is a false dichotomy, since many people on medication are also in therapy. But it’s clear that people are hungering for a magical solution to a complex problem, and it’s unlikely that clinically depressed people will ever fully return to baseline using pharmacotherapy alone.
Of ancillary note: More and more people who seek help for mental-health problems report anxiety as one of their symptoms, and the type of therapist people employ (psychiatrist vs. psychologist vs. social worker, etc.) does not appear to have an effect on the efficacy of therapy.
Most of us can’t help wondering what might be going through the heads of man’s best friends, or whether dogs can “think” in any traditional sense at all. A new book, Inside of a Dog: What Dogs See, Smell, and Know, sheds some light on the issue of how canines’ brains function vis-a-vis those of humans. Among the highlights: Dogs “see faster” than people in that their eyes take “snapshots” at a faster rate; they can “smell time” in the sense that their tracking capabilities depend on the fact that footprints further toward the end of a trackee’s path–that is, those that are newest–exude a stronger odor, therefore drawing dogs along by dint of olfactory force; dogs’ hearing is actually inferior to that of people; and the concept of the “alpha dog” may in fact be a myth.
What I find interesting is how people who share some distinct trait, belief, or status–however rare–seem to gravitate toward one another without any conscious effort whatsoever. If agnostics and people who believe in some amorphous “higher power” are not included, the percentage of atheists in the U.S. apparently ranges from around 5% to around 15%, depending on the parameters of the survey. Yet well over half of the people I associate with are atheists by any measure. Similarly, this Web MD slide show cites the 2% figure I have seen elsewhere with respect to the fraction of the population believed to be afflicted with bipolar disorder; I’d have to say that a far greater fraction of my friends and associates have been diagnosed as bipolar.
Anyway, the slide show is a great overview. The curse of bipolar disorder is that people with it, especially in its less explosive forms, usually find the manic or hypomanic phases not only tolerable but enjoyable, and may often be more productive in some areas of their lives (or at least believe that this is the case). So when the depression hits, they find it easy to believe that their moods are under conscious control and that if they simply fight to reclaim the high of days and weeks past, it can happen. Since this is not how things work, people already experiencing “organic” depression excoriate themselves for their perceived weakness and incompetence, perpetuating a very nasty cycle within a population already apt to have alienated most everyone in their lives and thus operating largely in isolation.
Humans break so damned easily.
An article posted yesterday to Wired Science describes research by a team of German psychologists that strongly suggests that Facebook users provide accurate representations of themselves on their corners of the social networking giant.
“Online social networks are so popular and so likely to reveal people’s actual personalities because they allow for social interactions that feel real in many ways,” [team leader Mitja] Back says.
Back’s team administered personality inventories that evaluated 133 U.S. Facebook users and 103 Germans who used a comparable social-networking site. Inventories focused on the extent to which volunteers endorsed ratings of extraversion, agreeableness, conscientiousness, emotional instability and openness to new experiences.
The subjects — who ranged in age from 17 to 22 — took the inventory twice, first with instructions to describe their actual personalities and then to portray idealized versions of themselves.
Then, undergraduate research assistants — nine in the United States and 10 in Germany — rated volunteers’ personalities after looking at their online profiles. Those ratings matched volunteers’ actual personality descriptions better than their idealized ones, especially for extraversion and openness.
Facebook is so true to life, Back claims, that encountering a person there for the first time generally results in a more accurate personality appraisal than meeting face to face, going by the results of previous studies.
While this may all come as a surprise to people who liken Facebook to a dating-style site, it doesn’t surprise me at all, and Back et al. did not address what I believe is the reason: The majority of Facebook members use the site as a means of keeping in touch with people they already know, not meeting new ones. Since there’s little point in lying to people who already know better, there’s little incentive to embellish or exaggerate.
I don’t technically conform to the study’s findings myself. Anyone who’s a Facebook friend of mine is regularly subjected to a profile typically laden with “status updates” that are outright bullshit, and my personal information page is littered with arrant nonsense. But in treating Facebook like a cesspool of self-indulgent wackiness (hey, when in Rome…), it’s clear that I’m not actually trying to fool people, and anyone who thinks I’m being serious there with this stuff should consult a neurologist. I do keep visitors on their toes by posting something disarmingly sincere and even grave from time to time, but again, it’s not hard to separate these tidbits from the much greater volume of deliberate buncombe.
This is interesting and suggests that there may be a definitive answer to the long-standing question, “Is there an evolutionarily based explanation for homosexuality?”
Overly simplified, this “tipping-point” model (originally introduced by G. E. Hutchinson in 1959, and then later popularized by Jim McKnight in 1997 and Edward Miller in 2000) posits that genes associated with homosexuality confer fitness benefits in their heterosexual carriers. If only a few of these alleles are inherited, a male’s reproductive success is enhanced via the expression of attractive, albeit feminine traits, such as kindness, sensitivity, empathy, and tenderness. However, if many of these alleles are inherited, a “tipping point” is reached at which even mate preferences become “feminized,” meaning males are attracted to other males. In explaining this model, Miller asked readers to imagine a genetic system in which there are five different genes that place an individual along a masculine-feminine continuum. Each of the five genes has two alleles, one that pulls the individual to the masculine side and one that pulls to the feminine side. If a man inherited all of the feminine-pulling alleles (of which he has a 3.125% chance: 0.5^5), he will become homosexual. If he inherited fewer than all five of the feminine-pulling alleles, however, he would not be homosexual. Although originally proposed in simple form in 1959, this model was finally empirically tested in 2008 and 2009.
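Miller’s five-gene arithmetic can be sketched as a quick simulation. To be clear, the gene count, the independence of the loci, and the 50/50 odds per allele are all simplifying assumptions from his thought experiment, not established genetics:

```python
import random

N_GENES = 5        # Miller's hypothetical masculine/feminine loci
P_FEMININE = 0.5   # assumed 50/50 chance of the feminine-pulling allele at each locus

def feminine_allele_count(rng):
    """Count how many of the five loci carry the feminine-pulling allele."""
    return sum(rng.random() < P_FEMININE for _ in range(N_GENES))

rng = random.Random(42)
trials = 100_000
tipped = sum(feminine_allele_count(rng) == N_GENES for _ in range(trials))

# Analytic probability of inheriting all five: 0.5 ** 5 = 0.03125, i.e. about 3.1%
print(f"Simulated fraction past the tipping point: {tipped / trials:.4f}")
print(f"Analytic fraction: {0.5 ** N_GENES}")
```

The simulated fraction hovers around the analytic 3.125%, which is the whole point of the model: the “tipped” tail is rare, while the much larger group carrying one to four feminine-pulling alleles reaps the hypothesized fitness benefit.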
Behavioral geneticists at the Queensland Institute of Medical Research led by Brendan Zietsch (joined by sexual orientation expert Michael Bailey and evolutionary geneticist Matthew Keller) found that psychological femininity in heterosexual men elevated the number of opposite-sex sexual partners, suggesting that their femininity was often attractive to women (think Johnny Depp). In addition, these researchers and those at Åbo Akademi University in Finland (led by Pekka Santtila) independently predicted that if the “tipping point” model was correct, then heterosexual men with a homosexual twin should have more of the attractive feminine-pulling alleles and thus more opposite-sex sexual partners than members of heterosexual twin pairs. The Finnish group also measured the number of children and age at first intercourse between heterosexual men with a homosexual twin brother and heterosexual men with heterosexual twin brothers. While the findings did not reach statistical significance, data suggested that heterosexuals with a homosexual twin had slightly more opposite-sex sexual partners, slightly more children, and were a bit younger at the age of first intercourse than heterosexual twin pairs.
In other words, a certain amount of “femininity” makes straight men more appealing to women and increases the chances that they’ll pass along more of their genes. Too much and the men are simply gay, but the fact that they don’t reproduce is mitigated by the fact that their close cousins–comparatively “woman-like” straight men–are busy passing along a slew of the same genes that apparently contribute to homosexuality. (Obviously this description is fraught with hazards. “Woman-like” as I’m using it here implies not a queeny bearing, but greater tendencies toward kindness, empathy, and other positive traits that can be found in men who present as perfectly “masculine.”)
It’s an intriguing idea, anyway, not that the homobigots–usually people whose comprehension of simple genetics is zero–will either understand or accept it if it ever comes to prominence.