Ruminating on a Kentucky farmer

(Sorry about the pastoral pun. I hope you can stomach it. *ahem*)

Dan Kahan, whose research into how the public comes to form opinions on scientific issues reveals the huge importance of our cultural identity and network, has been doing a lot of thinking about a couple of enigmatic archetypes: the “Pakistani Doctor” and the “Kentucky Farmer”. (I won’t try to give the entire backstory here, but you can find it via this post.) The Pakistani Doctor, encountered through interviews by researcher Salman Hameed, professes to find evolution important and true in the context of medicine, but rejects human evolution as antithetical to his/her Muslim faith. The Kentucky Farmer, on the other hand, accepts that the climate is changing and alters his farming accordingly, even as he rejects the scientific conclusion that humans are responsible for climate change.

So are these people holding contradictory notions inside their head? How are we to understand their thinking? Dan’s point, as spelled out in the post linked above, boils down to this: these people see no contradiction, and they tell us that the difference between the “evolution” they believe and the “evolution” they disbelieve is context. It’s the cultural relevance. We may be tempted to say, “Yes, but the context is irrelevant— the thing is either true or it is not,” but perhaps we only say that because we successfully pretend that we don’t do the exact same thing. So, in Dan’s view, evolution as it applies to medicine is literally a different concept than evolution as it relates to religious beliefs, and that’s all we need to know to resolve this apparent contradiction. Again, read Dan’s post to flesh this out (and in case I’m poorly representing his argument).

Now, I largely find this insightful and interesting, but a bit of it has continued to stick in my craw, which I promised Dan I would elaborate on. I’m going to focus on the Kentucky Farmer. I agree that we should take this farmer at his word when he (implicitly) says that “climate change” with respect to his farm is simply different than “climate change” as it relates to the liberal/conservative point of contention. However, I think stopping there over-reduces what’s going on. The two ideas (of “climate change”), arrived at through the cultural network, come as packets with specific factual baggage. The ideas can differ in content as well as cultural relevance— and if you sit down with the Kentucky Farmer for a beer, I contend it’s highly likely he can tell you all about it. Dan has, of course, thought of this, but seems to have framed it more as a competing hypothesis that comes with rejection of what you might call the “cultural context hypothesis”. I want to push back on that, and argue that this point cannot be left out if you want to fully understand what’s going on.

I’ll rely on my personal experience with the Wisconsin farmer, as I think I’m justified in believing these attitudes to be broadly shared. The Wisconsin farmer has also noticed the unusual weather of recent years, and attests that the current climate differs from the one his father worked with. So yes, you’d better believe he’s willing to adjust his practices. And no, he’s not willing to pin this change on human activities— he finds that hard to believe, so who really knows? And besides, he knows the weather is an unpredictable beast prone to mood swings. Things go up, and things go down, and that’s just how it works. It’s been warmer and drier lately, and someday it will probably be cooler and wetter again. Everything goes in cycles, you know? He’s happy to plan for next summer based on the last ten, because what else can you do?

The “climate change” he rejects isn’t just the liberal variety (context), it’s also different (content). That climate change is the result of SUVs and coal power plants, and is supposedly going to get warmer and warmer and warmer until there’s some kind of doomsday, according to Al Gore. Here’s the key: ask him what he thinks the climate will be like in 60 years. Does he need to make changes to his farm to ensure his kids’ success? I can almost guarantee he’ll say, “Oh, who knows, maybe warmer, maybe cooler. I wouldn’t get too carried away.” If he truly accepted the reality of climate change as it relates to agriculture, he wouldn’t hesitate to bank on further warming, but that’s not what the “climate change” he accepts means to him. (Might he deny that “global warming” is a synonymous label for his “climate change”?) These things are factually distinct.

This will exhibit itself in his policy preferences. He may support policy that helps farmers adapt to (cyclical, unpredictable) climate change. He will not support policy that actually addresses the cause and mitigates it, because he doesn’t believe in that kind of climate change.

So while I think it’s really useful to consider the fact that we can think differently about a concept if it occurs in multiple contexts, I think we have to wrestle with the fact that in some cases these concepts will be sculpted into distinct things— either by our own reasoning or via our cultural network. Dan has written that it can be almost condescending to decree that the Kentucky Farmer is contradicting himself, and to demand a solution to his foolish paradox. However, I think it can also be condescending to stop him when he gives us one reason why he doesn’t feel he’s contradicting himself— he may not be done explaining things he has put a lot of thought into. I don’t think what I’ve outlined above superficially “explains away the paradox”, as Dan put it, any more than the “cultural context hypothesis” does so. It’s just digging far enough into the details to be sure we understand what the Kentucky Farmer is telling us.

I don’t think we can really celebrate the Kentucky Farmer’s willingness to accept “climate change” as it relates to his normal, everyday business, because he’s not accepting the same facts we’re concerned about. We may find it illustrative to discover how his “climate change” has been sculpted to become culturally palatable, though. Why is he willing to accept those facts/beliefs? We can come up with plenty of plausible reasons. (Perhaps human/divine control of nature, perhaps opposition to regulations on markets or government control, etc.)

However, we shouldn’t expect that he will necessarily be amenable to our version of “climate change”, given the right context. Context may not be the only thing that allowed his “climate change” to pass through the cultural filter.

Understanding strike and dip on geologic maps

In order to understand geology, we need to think about the rocks below our feet in three dimensions. Those spatial relationships— what’s on top of what, how rocks are faulted or folded— give us all kinds of interesting information. While maps are useful in all kinds of ways, they’re a little lacking in the 3-D department. So when we’ve measured the orientation of a rock layer, we need a way to represent that on a map. Here’s a basic primer on how that’s done.

The geological lingo for this is “strike and dip”. The words may be confusing at first, but it’s really quite simple. Let’s start with some nice, horizontal sedimentary rocks. Pretend the top of this block is the surface of the Earth. Picture a kangaroo hopping across it, if that helps or entertains you.


Let’s picture a single layer of that rock and tilt it downward to the right. The tilt of that layer is what we call “dip”. We describe this layer as dipping to the right, because that’s the direction it’s tilted downwards toward. We measure the dip as the angle between the layer and horizontal.


Now, let’s try to imagine looking at that same tilted layer from directly above it. Again, it’s dipping downward to the right. Draw a horizontal line across that layer— that’s what we call the “strike”. (Don’t worry about why we use that word.) In this example, the strike is north-south. That means this layer is dipping to the east.


Here’s a block made of tilted layers instead of horizontal ones. The top of the block is still the surface of the Earth, but I’ve sliced out a piece of the block to help us visualize this better. Imagine you’re looking at a cliff.


When you look at the side of the block, you can see the dip. When you look at the top of the block (the surface), you can see the strike. Click the image below to explore a real outcrop of sandstone and siltstone from the Oregon coast that looks a lot like this. Be sure to zoom in and get a closer look.


Here’s an annotated image of that outcrop, just to make sure you can see the strike and dip. You can click on this image to see more photos from this spot. If you click this link, you can check out the area with Google Earth.


Geologists carefully measure the strike and dip on an outcrop like that using a tool called a Brunton compass, pictured below.

Wikimedia Commons

Once you’ve made the measurements, you can display them on a geologic map— which was our goal at the start of this post. Strike is easy enough, since it’s a compass direction. If we measured the rock striking north-south, we would draw a short, vertical line on the map in the location of our measurement (assuming up is north). If the rock was dipping 45 degrees to the east, we would add a tick mark at the midpoint of our strike line, pointing east. Think of the tick mark as an arrow pointing in the direction a ball would roll if you could set it on that rock layer. Just below that we would write the measured dip: “45”. It would look something like the example below. The different colors on the map represent the different rock layers exposed at the surface, just like looking at one of the block diagrams from above. Without the strike and dip symbol, we would have no idea if those layers were dipping to the west, to the east, or were vertical. But by adding this simple symbol, we can understand something about the 3-D orientation of the rocks.
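As a quick aside, the strike-to-dip-direction bookkeeping above can be sketched in a few lines of code. This is just an illustration of my own (the function names are made up), and it assumes the common right-hand-rule convention, in which the dip direction lies 90 degrees clockwise of the strike azimuth— conventions do vary between geologists and software.

```python
def dip_direction(strike_azimuth: float) -> float:
    """Return the dip-direction azimuth (degrees, 0-360) for a strike
    azimuth, assuming the right-hand rule: the layer dips 90 degrees
    clockwise of the strike direction."""
    return (strike_azimuth + 90.0) % 360.0

def compass_point(azimuth: float) -> str:
    """Round an azimuth to the nearest of the 8 main compass points."""
    points = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return points[round(azimuth / 45.0) % 8]

# The example from this post: a north-south strike (azimuth 000) with
# the right-hand rule means the layer dips toward the east.
strike = 0.0
dip = 45.0
print(f"Strike {strike:03.0f}, dip {dip:.0f} toward {compass_point(dip_direction(strike))}")
# prints: Strike 000, dip 45 toward E
```

Note that some field notebooks record the dip direction directly instead of the strike, and quadrant notation (e.g. “N45E”) is also common— so always check which convention a dataset uses before plotting symbols.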


Hat tip to Lockwood DeWitt for giving me the idea for this post.

Two-channel communication and the sci comm broadcast

The following thoughts are hopefully instructively wrong.

The idea that science communication inherently occurs via two channels— meaning and content— strikes me as one of the most useful ways of thinking about it. The content entails the information, be it news of a fresh study or an explanation of some concept, and the meaning is determined by how that information strikes the receiver. If a climate skeptic and a climate advocate read the same article, they receive the same information but radically different meanings. If you have a homogeneous audience, you can attempt to control what you transmit on the meaning channel. For a mixed audience, however, your options are limited to trying to avoid inflaming a few known, hazardous meanings.

To twist this model around a bit, I can imagine the goals of science communication fitting into different channels, as well. In the spirit of one-upmanship, I prefer three channels. The first is the information channel, where the goal is simply to inform those with a desire for that information. Here, the “deficit model” can actually be quite a decent fit, just as self-motivated students can learn a lot from a very dry course that would be considered pedagogically poor. There are tons of science-hungry, curious people out there who love to learn about their topics of choice. There’s certainly plenty of space for you to do a better or worse job of engaging readers, but they’ll basically be content if you can communicate clearly. (I’ve blabbed on before about “preaching to the choir”.)

The second channel is the persuasive one. Here it will be more important to have a specific audience, as you’re attempting to win them over from a position of mild or strong opposition. They could be people skeptical of anthropogenic climate change, resistant to evolutionary science, or fearful of genetically modified food. The deficit model will be about as useful here as a kayak in the desert, considering the fine array of psychological weaponry we all have for fending off information inconsistent with our beliefs and cultural identity. This is usually the arena where we’re crapping on the deficit model. The public opinion needle on [insert topic here] is remarkably resistant to the dissemination of information, so this information-based science writing thing isn’t working. I hope I’m wrong about this, but what if that’s a bit like complaining about the number of square pegs that made it through round holes? I think most people writing about science do so with the hope that it can make a difference on contentious and important issues, but that may be asking for miracles.

The third channel doesn’t reach out to the curious or to the opposed, but to the non-curious. The apathetic. We’re trying to shift people into the curious camp, where they are reachable with other types of science communication that can get to deeper places. You run MythBusters marathons on channel 3 and see if you can get people to tune in to NOVA on channel 1 more often. (Going back to the content/meaning model of communication, this is an attempt to change the general meaning on science content from “not meant for me– boring” to “something I can dig”.)

Can we really work on more than one channel at once? We spend a lot of time talking about communication problems in “traditional” (for lack of a suitable term) science writing– how to reduce polarization and minimize the potential for readers to tune us out. There are certainly ways to be more effective, but overall, it seems pretty damn hard, and I’m starting to wonder. Many solutions for improving communication on channel 2 aren’t compatible with traditional science writing, because the solutions are to do other things– two-way conversations, appeals to cultural frames, pulls over pushes. Great, let’s do those things, but what about improving the science writing, too? Does the paucity of results there mean we haven’t looked hard enough (it’s not like there’s a lack of enthusiasm…) or does it mean you simply can’t do that much to make a screwdriver into a better hammer?

At the very least, maybe we need to be more specific when we talk about improving science communication. (Improve it on which channel?) And maybe the answer to the question “How can traditional science writing help with contentious topics?” is an unhappy one– do no harm and inform the receptive curious. If that’s the case, a lot less energy needs to be spent on making channel 1 do channel 2’s job.

If there are ways to, as Peter Broks put it, become co-creators of meaning in the traditional avenue of newsy science writing, then that’s what we need to talk about. The conversation seems to often lack the specificity needed to be useful to that avenue, which leads to frustration and wheel-spinning because science news isn’t formal education isn’t informal education isn’t guerrilla outreach.

Like a lazy high school student on Yahoo! Answers with math homework, I just want to know how to get this part right.

Science literacy and “The Fear”

“Science literacy”. How often do we hear about how badly students (and the general public) need some? Educators know it’s not really about teaching people facts to give them a working knowledge in many fields. Check that- every sane person knows you can’t be an expert in everything. Science literacy is better thought of as learning to think scientifically- how to think critically and rely on evidence. And we might go a step more abstract, wanting people to have a familiarity with the process of modern science- how studies are performed, what peer-reviewed journals are, why some ideas can be trusted more than others.

I think there’s still another piece of this that doesn’t get enough attention. I think of it as “The Fear”. The Fear of dull and difficult science classes, of complicated mathematical calculations, of memorizing unfamiliar terms. I think it’s The Fear that steers people away from science programs on TV or books about science. They’re not sure why it’s interesting, but they damn well know it’ll be tedious.

Maybe you can teach a student the basic concepts of biology, and maybe you can even get that student to have a reasonable understanding of what constitutes scientific information. But if that student won’t touch anything sciencey with a ten-foot pole for the rest of their life, how much have you really accomplished?

Rather than treating something attention-grabbing as merely a “hook” to get my students to pay attention to the important stuff, I’ve always seen that goal as worthy in and of itself. At first, it was just because I wanted them to like what we were doing- after all, I loved the stuff- but now I see science “comfort” as a cornerstone of science literacy. If my students forget virtually everything I taught them about Earth science but are more willing to have their curiosity piqued by some science down the road, I’m not so sure I shouldn’t declare victory. A science-comfortable student will be much more likely to acquire scientific information when they need it.

This is one of those things that I think come naturally for most educators, but might not get their time in the spotlight for acknowledgement. A lot of effort goes into designing activities to effectively teach concepts and skills. Maybe with a little justification that making things fun is serious business, spending time on designing science-comforting experiences won’t seem like an indulgence- or even just an investment. It’s not the cherry on top- it might just be the dish that holds the sundae.

The dichotomous Kochs: why “anti-science” is a useless label

Publicly controversial issues in science can be incredibly frustrating, whether you’re a scientist, communicator, nephew, or neighbor. We have an expectation of rational decision-making and discourse from others, so when people form opinions that contradict scientific understanding, we get mad. Some get patiently mad; others, less so.

As in other cultural disagreements, this frustration can lead to name-calling and labeling. Perhaps the least useful label out there is “anti-science”– often applied to those who loudly reject evolutionary biology or climate science. In doing so, they often cite ideological motivations– the science of evolution is wrong because the Bible says so, or climate change isn’t real because those scientists are all socialists. It’s unscientific thinking, and so the unscientific thinkers are dubbed “anti-science”. After all, if they were “pro-science”, they’d take the science seriously, right?

But exceedingly few of these people will actually disavow all support for scientific inquiry and the rewards thereof. Their rejection is compartmentalized. “Of course science works out all kinds of things,” they might say, “but the science of x is flawed!”

There is, perhaps, no clearer demonstration of this than David and Charles Koch. The fossil fuel industry billionaires are probably the leading boosters of climate misinformation. They help fund groups like the Heritage Foundation and the Heartland Institute (among other conservative think tanks) that have led the way in opposing climate science. Then there are other political groups, like the tea party group Americans for Prosperity, which has leaned hard on any Republicans who dared to accept the science of climate change (see: Bob Inglis). The Kochs are critical benefactors behind these efforts, which are among the most organized and effective sources of the public politicization of climate change.

But at the same time, they have donated absolutely astounding sums of money to support other fields of science. Cancer research, especially, benefits from the Kochs’ generosity, as do a number of universities. And NOVA, the wonderful science program on public television, is now “made possible by” Koch family money.

It can be a reasonably jarring sight, watching those credits. If the Kochs aren’t the kingpins of the “anti-science” movement on the issue of climate change, nobody is, and yet here they are, going out of their way to be extraordinarily pro-science. Should the brothers change their surnames to Jekyll and Hyde, or is the “anti-science” label showing its uselessness?

You can’t fit people into these two boxes: rational & scientific, or irrational & unscientific. We’re all a little bit of everything (though some end up with larger servings of one buffet item or the other). We compartmentalize our thinking. On some topics, we’re right there with the best science, and for the right reasons. On other topics, we might harbor doubts that money has infected the whole thing and those shills in lab coats can’t be trusted. The Kochs obviously think they’re doing a great service by enabling all this opposition to the scientific consensus on climate change. In reality, they’re doing a great disservice. We all make mistakes. Most of us just aren’t able to fund them quite so handsomely.

Now, so what? It comes as no surprise to most that a label used in a contentious public debate isn’t the most fair or accurate. What good does reflecting on this do?

Name-calling obliterates the value of a discussion pretty much instantly. It’s great for succinctly expressing your frustration with a group of people, but not so great for trying to talk to those people. Even if it’s not intended as a called-name, the “anti-science” label has the same effect. Generalized labels like that act as replacements for reasoned arguments and explanations. (We know those arguments aren’t likely to persuade, so we skip ’em and throw a satisfying punch instead.) Obviously, not all labels are fightin’ words- some are mild descriptors- but it’s wise to examine the ones you use so you don’t score too many own goals.

The nature of the public controversy over climate change isn’t scientific, it’s cultural. If you don’t understand where people who disagree with you are coming from, you will never leave square one. Calling them “anti-science” is a fantastic way to not understand them.

Of troublemakers, priorities, and forgotten battles worth fighting

Science communicators expend a lot of energy on account of the opposition to important (but publicly controversial) scientific topics like evolution and climate change. Misinformation is debunked. Grumbles about Fox News, the Daily Mail, or [insert politician here] build to a low roar of background frustration. The questions are obvious. “Why do these people think that??” “How can we get through to them?” “How do we keep them from turning our comment threads to toxic waste dumps?”

And so we study them. The research on the psychology that ties people to these ideas is extensive (and fascinating). Lots of effort goes into thinking about what to do about it- communication strategies that navigate the minefield of cultural filters. With such worthy “foes”, it’s easy to turn your back on allies. But there are problems in those ranks, as well. I’m not talking about arguments over whether the political left is as “anti-science” as the right, and I’m not advocating some “scientific relativism” about us all being biased brains in vats. (No, Neo.) I just keep thinking: “What about the people who are on the right side for the wrong reasons?”

We draw a line based on a position (like acceptance of anthropogenic climate change) and assume that being on one side of that line means a person has reasoned poorly while being on the other side means they’ve reasoned well. While it’s pretty difficult for someone to correctly reason their way to rejecting climate change, it’s quite easy for someone to incorrectly reason their way to accepting it. This is obvious- just as “conservative” (for lack of better descriptors) cultural connections can lead someone away from the best science, “liberal” cultural connections can lead them toward it. Those are stories about culture, not science.

So what? Isn’t this a bit like worrying that some of NPR’s donors might only have done it for the sweet, sweet tote bag? I think that dismissing this fact would be pretty short-sighted. Climate change is far from the only issue for which it’s important that people recognize what science has to say. The person who accepts climate change only because they hate everything that Republicans say is ill-prepared to judge the next topic that comes down the pike. (Or to take a smart position on genetically modified foods or the safety and importance of vaccines.)

It’s worse than failing to “correct” them. When they hear the other side labeled as “anti-science”, they implicitly pat themselves on the back for being “pro-science”. They’ll still claim that mantle when they make a decision on another issue, even if they unwittingly side against the science.

If people need critical thinking skills and the ability to ferret out scientific knowledge through all the noise, bad habits shouldn’t be reinforced. Creating a science-tuned society is no small task- we can afford to leave no stone unturned.

So despite complaints that it does little to sway doubters, preaching to the choir is an integral part of science communication. Sure, the choir listens to the preacher, but don’t forget that they learn something, too. If your goal is broader than “get people to join the choir”, you’ll see that’s not necessarily time wasted.

That broader goal is to educate and instill some “sci-sense”. Some strategies will be more effective than others, but discussion of this sometimes gets lost in the focus on more adversarial pursuits. (The squeaky wheels that grind our gears get all the grease?) The best solutions are probably just the things that everyone recognizes as “good science communication”, but it might still be worth thinking about more often- even if it’s just to assuage guilt from preaching to the choir. Public opinion polls of acceptance/rejection are not the only metric of progress.

I would love to hear thoughts on ways people think about this.