Research Methods, Statistics, and Video Games
A recent Iowa State University report claimed that one of its faculty members has “prove(n) conclusively that violent video game play makes more aggressive kids.” A colleague forwarded a link to this report to me, knowing that I have challenged claims like these in my books Connecting Social Problems and Popular Culture: Why Media is not the Answer and It's Not the Media: The Truth About Pop Culture's Effect on Children.
When researchers make powerful statements about their findings, it is very easy to be convinced, especially if we aren’t familiar with some of the technical terms in a report or if we don’t know how to think critically about research methods and statistics.
Let’s start by putting aside any preconceived beliefs you may have; most people have an opinion about this issue, but we are going to be using the claim to better understand how to deconstruct the meaning of reports like these. You can make up your own mind about what to believe after you become familiar with the following concepts: Meta-analysis, Correlation, and Predictive Validity.
1. Meta-analysis
If one study on an issue is good, lots of studies on the issue should be really good, right? That's the premise of conducting a meta-analysis, which involves finding studies that test a similar hypothesis and generating statistical results from the pooled group of previously conducted studies.
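The pooling arithmetic behind a meta-analysis can be sketched in a few lines. The example below is a minimal fixed-effect pooling of correlation coefficients using Fisher's z-transform; the study values are hypothetical and are not taken from the Iowa State or Texas A&M analyses:

```python
import math

def pooled_correlation(results):
    """Fixed-effect meta-analysis of correlations via Fisher's z-transform.

    `results` is a list of (r, n) pairs: each study's correlation
    coefficient and sample size. The values used here are hypothetical.
    """
    weighted_z = 0.0
    total_weight = 0.0
    for r, n in results:
        z = math.atanh(r)   # Fisher z-transform stabilizes the variance of r
        weight = n - 3      # inverse-variance weight: Var(z) = 1 / (n - 3)
        weighted_z += weight * z
        total_weight += weight
    return math.tanh(weighted_z / total_weight)  # back-transform the average z

# Three hypothetical studies: (correlation, sample size)
studies = [(0.20, 100), (0.10, 400), (0.15, 250)]
print(round(pooled_correlation(studies), 2))  # pooled r, close to 0.13
```

Notice that larger studies pull the pooled estimate toward their result, which is exactly why the choice of which studies to include matters so much.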
So far, this makes the Iowa State University report seem convincing, particularly for the average reader who might not follow this research closely. But if you’re not a big journal article reader, you might not realize that at least two other meta-analyses found just the opposite results in the past few years.
Another hidden factor: which studies were included in the meta-analysis, which were left out, and why?
Texas A&M researchers asked this question in a response to the Iowa State University researcher's claim. They questioned why unpublished studies that have not been peer-reviewed (including work presented at conferences or not accepted for publication) would be included in the meta-analysis. Since it is practically impossible to be sure that all, or even most, relevant unpublished studies were taken into account, cherry-picking specific studies could be the result.
Not that peer-reviewed journals are immune to this problem. One of the Texas A&M researchers published a meta-analysis in 2007 finding that journals are more likely to publish video game studies that claim to find a negative effect.
2. Correlation
One of the most common statistical measures used in studies of video games and violence is the correlation coefficient. This statistic, represented by "r", measures the degree to which two variables have a linear relationship: when one variable rises, the other rises (or falls) accordingly.
This measure is often calculated from surveys and looks at variables such as violent video game playing time and measures of aggression (like getting into fights, feeling angry, and so forth).
The correlation statistic is reasonably easy to interpret: the results fall between -1 and 1. A correlation of -1 implies a perfect inverse relationship: when one variable increases, the other decreases at exactly the same rate. A correlation of 1 means that as one variable increases, the other increases at the same rate, and a correlation of 0 means no relationship at all. The closer the number is to 1 or -1, the stronger the relationship; correlations closer to 0 mean the relationship is weak.
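To make the scale concrete, the coefficient can be computed from scratch. The helper and toy datasets below are purely illustrative (they come from no study discussed here) and show a perfect positive, a perfect inverse, and a zero correlation:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Perfect positive, perfect inverse, and no linear relationship:
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))   # 1.0
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))   # -1.0
print(pearson_r([1, 2, 3, 4], [5, 3, 3, 5]))   # 0.0
```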
Both the Iowa and Texas researchers agree that overall the correlation between violent video game playing and aggression (which does not necessarily mean violent behavior) is .15, a relatively weak, positive relationship. The Texas researchers measured other relationships with violent video games, and found several more powerful relationships: poverty and crime (.25), violent video game playing and improved hand-eye coordination (.36), and a very strong inverse correlation between video game sales and youth violence in the U.S. (-.95).
In other words, the strongest relationship suggests that as video game sales increased sharply, youth violence decreased sharply. The weakest finding the Texas team found was the relationship between violent video game playing and serious aggressive behavior (.04).
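One standard way to compare these numbers is to square them: r squared gives the share of variance in one variable accounted for by the other, so a correlation of .15 explains only about 2 percent of the variance, while -.95 explains about 90 percent. Using the correlations reported above:

```python
# Correlations reported in the text, squared to give r^2, the share of
# variance in one variable accounted for by the other:
reported = {
    "violent game play vs. aggression": 0.15,
    "poverty vs. crime": 0.25,
    "violent game play vs. hand-eye coordination": 0.36,
    "video game sales vs. youth violence": -0.95,
    "violent game play vs. serious aggression": 0.04,
}
for label, r in reported.items():
    print(f"{label}: r = {r:+.2f}, variance explained = {r * r:.1%}")
```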
Some people might wonder, if the strongest relationship found is the decrease in violence following an increase in video game sales, could it be that video games actually decrease violence?
Probably not; in any case, correlation alone couldn't tell us. If you have taken a statistics class, you probably recall that correlation does not imply causation. Wearing a heavy coat didn't cause the heat to turn on in your house: correlation measures relationships, but it cannot establish cause and effect.
3. Predictive Validity
Finally, we must look at any study's findings and ask whether its conclusions apply to actual outcomes. For example, we could test the predictive validity of the SAT by measuring students' college GPAs. If SAT scores and college GPA produced only weak correlations, we might wonder how good a tool the SAT really is for college admissions.
Youth violence has declined significantly: according to the National Crime Victimization Survey, twelve- to seventeen-year-olds committed serious violent acts at a rate of 52 per thousand in 1993. By 2007, that rate had fallen to just 11 per thousand, a 79 percent decline. Video game playing has become such a common pastime for young people (and not-so-young people) that video game play is not a useful predictor of violence.
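The 79 percent figure follows directly from the two rates reported:

```python
# Serious violent acts per 1,000 twelve- to seventeen-year-olds
# (the NCVS figures cited above):
rate_1993 = 52
rate_2007 = 11
decline = (rate_1993 - rate_2007) / rate_1993
print(f"{decline:.0%}")  # 79%
```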
Social scientists draw conclusions about phenomena after careful consideration of factors like these; those conclusions are not simply their opinion. Sometimes scholars come to different conclusions after reviewing the same data. After examining the results of studies of violence and video games, I must respectfully disagree with the conclusion that the Iowa researcher has "prove(n) conclusively that violent video game play makes more aggressive kids."
Whatever your feelings about video games, violence, or any other social phenomenon, it is vital that we test our assumptions empirically before drawing conclusions. What other commonly held assumptions do you think people often fail to test empirically?