Legal Update

Identifying the jurors who won't stay offline: Development of the Juror Internet Research Scale

An empirical test of the Juror Internet Research Scale.

By Alexis Knutson, MA, Edie Greene, PhD, and Robert Durham, PhD


These days, people are plugged in around the clock — researching nearby restaurants, scrolling through social media, checking when the next bus will arrive. Constant, instant access to information has become the expectation. Given our tech-heavy culture, it is difficult enough for some people to sit through a jury trial, removed from their smartphones and tablets all day. Add to that the fact that jurors are required to refrain from researching the case throughout trial, even in their free time, and violations of these rules become highly probable.

For that reason, knowing who will follow the judge's rules is becoming increasingly important. A recent survey of 494 district court judges found that in the past two years, almost 7 percent had caught jurors using the internet to do research during a trial (Dunn, 2014). Given smartphone technology and the sheer number of people who use the internet regularly, this may seem like a small number, but because these infractions are difficult to detect, the extent of juror misconduct is undoubtedly a more serious problem than this figure suggests.

So who are the jurors who simply cannot stay off the internet during trial? We suspect that some jurors are more likely than others to follow judicial instructions regarding internet use. To test this idea, we developed the Juror Internet Research Scale (JIRS) to distinguish between those who are likely to follow the rules and those who will not. Our hope is that trial consultants and attorneys can utilize this scale to identify those jurors who are most likely to engage in internet research — a valuable tool in cases that have received media attention, particularly if the information jurors dig up is inadmissible in court.


Juror Internet Research Scale development

We generated potential scale items through a review of the available literature on juror internet use during trial. Initially, we identified and tested 27 items.

Study 1: Testing a Student Sample

The scale was first tested using a sample of 221 undergraduate student participants. Participants completed the 27-item JIRS as well as six additional scales.

Two scales were used to establish convergent validity: a measure of self-control and a measure of perceived obligation to obey the law. Our hypothesis was that the JIRS would be inversely related to both, which would provide theoretical support that the JIRS measures the construct of juror rule breaking through internet use. Three scales were used to establish discriminant validity: measures of life contentment, religious faith and general happiness. Here, our hypothesis was that the JIRS would be unrelated to these measures altogether. Finally, to assess the tendency to answer questions in a socially desirable way, participants completed the Marlowe-Crowne Social Desirability (MCSD) scale.

We conducted an exploratory factor analysis to identify the best items and kept 10 of the original 27. A second factor analysis on the 10-item version showed that the scale as a whole measured a single construct, and that this single factor explained over 67 percent of the variability in answers. The factor loadings were all strong, indicating that each of the 10 individual items on the JIRS was measuring that single construct. The internal reliability of the JIRS was excellent (α = .95).
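For readers curious about the internal-reliability figure, the sketch below computes Cronbach's alpha (the α reported above) in plain Python. The function and the three-respondent, four-item sample are our own illustration, not part of the published scale materials.

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a list of respondents' item-score lists.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items. Values near 1 mean respondents
    answered the items consistently.
    """
    k = len(responses[0])                        # number of items
    items = list(zip(*responses))                # transpose to per-item columns
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical data: three respondents answering four items on a 1-6 scale.
# Each respondent answers all items similarly, so alpha should be high.
sample = [
    [6, 5, 6, 6],
    [2, 2, 1, 2],
    [4, 3, 4, 4],
]
print(round(cronbach_alpha(sample), 2))
```

With real survey data, the `responses` list would hold one row per participant and one column per scale item.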

We also found, as predicted, that the JIRS was negatively correlated with obedience to authority (r = -.23) and self-control (r = -.21), both p < .01. In support of discriminant validity, the JIRS was not correlated with life contentment (r = -.10), general happiness (r = -.03) or religious faith (r = -.11). A small positive correlation was found with the measure of social desirability (r = .18), p < .01.

Overall, we concluded that our scale could be useful in predicting who would be more and less likely to do outside research during a trial. Namely, we found support that those low in self-control and low in obedience to authority would be more likely to do internet research, while those high in self-control and high in obedience to authority would be less likely to do internet research.

Study 2: Testing a Community Sample

To check the student-sample results, we tested the JIRS with a community sample of 237 participants recruited through Amazon's Mechanical Turk. We conducted an exploratory factor analysis with the community data on the same 10-item, final version of the JIRS tested with the student sample.

The community results mirrored the student results. Again, the items loaded on one factor, which accounted for over 71 percent of the variability, with good-to-strong factor loadings, and the internal reliability of the JIRS was again excellent (α = .95). Replicating the student sample, the JIRS was significantly negatively correlated with obedience to authority (r = -.22) and self-control (r = -.23), both p < .01. In support of discriminant validity, the JIRS was not significantly correlated with life contentment (r = -.08), general happiness (r = -.10) or religious faith (r = -.01). A small negative correlation was found with the measure of social desirability (r = -.23), p < .01.

Thus, the student sample results were replicated with the community sample.

How to Use the JIRS

The measure is scored by summing a participant's answers on the 1-to-6 scale shown below. Only the text labels appeared on the measure given to participants; the numbers are shown in parentheses for explanatory purposes only. Scores range from 10 to 60, with higher scores indicating a higher likelihood of doing online research.

The JIRS appears below:

Listed below are a number of opinions. Read each item and decide whether you agree or disagree that this statement reflects your beliefs and to what extent.

Strongly Disagree (1), Disagree (2), Somewhat Disagree (3), Somewhat Agree (4), Agree (5), Strongly Agree (6)


1. I would look up the parties in the case online to try to find additional information about them.

2. Finding additional information about the case online would be more helpful than harmful.

3. If I don't understand something the attorneys have presented, I would look it up online.

4. I would try to find relevant and helpful information online that may be withheld during the trial.

5. If I can't ask questions during the trial, I would look up the information online.

6. I would use the internet to find out the forbidden information judges don't want me to find.

7. I would do extra research online because it would help me make the best decision in the case.

8. It would be wrong to do even a quick internet search for additional information during the trial. (Reverse scored)

9. I would do additional research online if I thought it would help me better understand the case, even if the judge asked me not to.

10. I would be curious to see what I could find online about the parties in the trial.
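The scoring rule described above — sum the ten 1-to-6 responses, reverse-scoring item 8 — can be sketched in a few lines. This is our illustrative code under those stated assumptions; the function and variable names are hypothetical, not part of the published materials.

```python
REVERSE_SCORED = {8}  # item 8 ("It would be wrong...") is reverse scored

def score_jirs(answers):
    """Sum a respondent's ten JIRS answers, reverse-scoring item 8.

    `answers` maps item number (1-10) to the numeric response (1-6).
    Returns a total between 10 and 60; higher scores indicate a higher
    likelihood of doing online research during trial.
    """
    if set(answers) != set(range(1, 11)):
        raise ValueError("expected answers for items 1 through 10")
    total = 0
    for item, value in answers.items():
        if not 1 <= value <= 6:
            raise ValueError(f"item {item}: response must be between 1 and 6")
        # Reverse scoring flips the scale: 1 becomes 6, 6 becomes 1, etc.
        total += (7 - value) if item in REVERSE_SCORED else value
    return total

# A respondent who strongly agrees with every item: item 8 reverses to 1,
# so the total is 9*6 + 1 = 55.
all_sixes = {i: 6 for i in range(1, 11)}
print(score_jirs(all_sixes))  # prints 55
```

The maximum score of 60 is reached by a respondent who strongly agrees with the nine directly scored items but strongly disagrees with item 8; the minimum of 10 is the mirror image.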

Does the JIRS work?

Results from our studies support using the JIRS to identify jurors who are prone to break the rules when it comes to internet research. The statistical analyses showed that the JIRS as a whole measures one construct and that its items reliably produce consistent responses. In other words, all 10 items were measuring the same thing: the likelihood of juror internet research.

These results suggest that the JIRS may be useful in distinguishing jurors who will follow judicial instructions to avoid internet research from those who are more likely to break the rules. It represents an important development in combating a thorny problem for parties to a lawsuit, attorneys and courts alike.

Our future research will seek to create a short version (three or four items) for use in voir dire when only a very limited number of questions are allowed. Measuring the correlation between the JIRS and two recently published scales of smartphone use and social media use would also provide further evidence that the JIRS measures a heightened likelihood of conducting internet research. Future research should likewise continue to validate the JIRS with different populations to increase generalizability.

We hope and believe that the JIRS has a bright future. Trial consultants and attorneys can use it as a tool during voir dire, particularly in high-profile cases or cases that have received extensive pretrial publicity, to identify jurors who are likely to conduct outside research during trial. If a juror scores high on willingness to conduct outside research and also shows other high-risk traits, there may be reason to remove that person from the jury panel. As courts, attorneys, trial consultants and academics continue to seek ways to combat this issue, the JIRS can protect parties from exposure to negative, misleading or incomplete information that jurors can obtain online.


An earlier version of this article was originally published in The Jury Expert.

Dunn, M. (2014). Jurors' and attorneys' use of social media during voir dire, trials, and deliberations: A report to the Judicial Conference Committee on Court Administration and Case Management. Federal Judicial Center. Retrieved from$file/jurors-attorneys-social-media-trial-dunn-fjc-2014.pdf