Coming home from vacation has to rank up there as one of the biggest letdowns of your life. Seriously. What’s fun about the end of vacation? But one thing still excites me to no end when I come home from a long vacation, and that is the mail. Oh, GOODY! I think to myself, like a kid at Christmas. What did I get? I’m well into my forties, and I still get that way about the mail. If the U.S. Postal Service ever collapses, I’ll probably go into deep mourning, because for me, it would be like the end of Santa Claus.
So what did Santa bring me this go-around? Lots of grocery store circulars, some random junk mail, a few bills here and there. (These are like the stocking stuffers – fun to sift through, but no big deal.) There was a big envelope with information about my daughter’s upcoming school year. (That was fun to open.) But the big “under the tree” gift, so to speak, was (drum roll please) . . . a big pile of psychology research journals!
Thank you, Santa! Thank you, thank you, thank you!!! (Yes, I am a big nerdy bookworm academic.)
When I read a psychology journal, I’m not much different from a die-hard sports fan watching The Big Game. I get really immersed in each article. If I come across a study that’s well-done and methodologically rigorous, I get super-excited. (Sadly, this is not a common occurrence.) More frequently, I read a study that has at least one major flaw in it. And then I get mad. Really mad. I’ve been known to not-quite yell at the article. Dammit! Why the hell didn’t you ask THAT question? Why didn’t you include THESE people in your study? Did you get your Ph.D. by drawing Tippy the Turtle? Then I have to restrain myself from throwing the poor journal across the room. Thank goodness I don’t read these journals while drinking beer and eating chips – then things could get REALLY ugly.
In all seriousness, I think there is good reason to get angry at the current state of psychological research. Back in the early 1990s, I distinctly remember my Introduction to Psychology professor saying that the course we had enrolled in really should be titled, “The Psychology of College Undergraduates” – because the vast majority of university-based psychological researchers use Psychology 101 students as their guinea pigs. In fact, as part of my grade for that class, I was required to participate in at least three studies that members of the Psychology department were conducting. This was more than twenty years ago – if the articles I read in my pile of published-in-2013-journals are any indication, not much has changed since then.
Here’s an example. In one study I read, researchers looked at whether sitting in a constrictive, closed posture (as opposed to an expansive, open posture) influenced women to feel more negatively about their bodies and, consequently, to restrict their eating. Given that my dissertation research focused on non-heterosexual women’s body dissatisfaction, this study was of particular interest to me. As I read through the study, I got to the “Method” section – the part where the authors describe in detail how the study was done – and it started off like this:
Following informed consent, 97 women (86.6% Caucasian, 4.1% Hispanic/Latina, 4.1% Asian, 3.1% African American, 2.1% unreported) with a mean age of 19.61 years (SD = 1.92, range = 18-29) participated (Allen, Gervais, & Smith, 2013, p. 329).
In other words, the sample included mostly young White women, with a grand total of 11 women of color. The researchers didn’t ask about mixed-race status – participants could only choose one category, even if their ethnicities spanned more than one group. The oldest person to participate was 29 years old, so age diversity wasn’t a consideration. And the participants weren’t even asked about their sexual orientation or gender status – or disability, or class status. At the end of the study, the authors issue this statement as a “limitation” of their research:
The current studies examined young women from primarily Caucasian backgrounds with average body size, which limits the generalizability of our results. A complete understanding of constrictive versus expansive postures should consider multiple identities including comparisons with men, various BMI levels (e.g., overweight or obese), different race (sic) and ethnicities, and diverse sexual identities (emphasis mine) (Allen, Gervais, & Smith, 2013, p. 333).
I see this over and over and over again. Instead of diversifying the sample (or clarifying from the beginning that the sample is not diverse, and perhaps isn’t meant to be), psychologists use the convenience of their university’s Psychology 101 classes for their research samples, conduct and publish their studies, then apologize for the lack of diversity later. And it’s not just this study. To give a sampling of other studies published in the three journals I received, there was a study of attitudes towards immigration policies, a study of gender differences in acceptance of casual sex offers, and a study of physical appearance comparisons among women. All of the participants in these studies were college undergraduates, the majority of whom were White. None of these studies involved any kind of meaningful analysis of sexual orientation. And all of them included an apologetic statement at the end, usually to the effect of “somebody needs to include these populations.”
Somebody. Meaning “somebody else.” Frankly, anyone who conducts social science research needs to be that “somebody.” Period.
Some researchers will say in earnest, I’ve tried to get a diverse sample, and it’s just too hard. Believe me – having done research and worked hard to get diverse samples (and to document that diversity accurately), I understand. But if we continue to use tools that were designed for privileged populations, we’re going to continue to get participants who come from a place of privilege. It’s time to get creative and radically diversify our methods.
Here’s an example: If you want to study day laborers (which was the focus of another study in one of my journals), you’re obviously going to get zilch in the way of participation if you merely hang up a flyer in the psychology department advertising your study, or if you create a web-based survey. But if you get up early in the morning, go down to the Home Depot parking lot, and ask the laborers directly (in a Spanish dialect they understand), then you might actually get somewhere. That’s exactly what Lizette Ojeda and Brandy Pina-Watson of Texas A&M University did – and they got 143 people to take part in their study. Granted, their sample wasn’t diverse (it wasn’t meant to be) – but it went far beyond the standard Psychology 101 college undergraduate base.
So listen up, all you social science researchers. Make an effort to diversify your samples – at the very least, by race and ethnicity, by age, by ability/disability status, by sexual orientation, by gender status, by class status. Be willing to use nontraditional methods in recruiting and studying your participants. Document your participants’ demographics accurately – don’t make any part of their identities invisible, just because you didn’t ask the right questions. And remember that each and every one of these participants is giving you a gift. When you publish a research article that respectfully documents their truth, you are giving them a gift too.