The Multistate Professional Responsibility Exam (MPRE) is Saturday. I heard the test wasn’t “particularly difficult or interesting,” so I didn’t start studying until this week. Bad idea. The first practice test was a harsh wake-up call.
When I asked my classmates how their preparation was going, I was surprised to discover how gender-dependent their responses were. The women shared my frustration. One woman in my Legal Ethics class vented that the “ethical answer” was never the correct choice. Another used her Facebook status to express her insecurity about whether she would pass. In contrast, the men were annoyingly confident. One joked about how “ridiculously easy” the questions were. Another said that he didn’t even plan to crack the book.
My informal survey made me wonder whether the MPRE’s pass rates reflect a gender bias. After conducting some research, I am still wondering. The National Conference of Bar Examiners (NCBE) does not publish gender-specific statistics. The best I could dig up was a decade-old report from the Gender Fairness Task Force, which studied gender bias in admission to the Oregon State Bar. The report concluded that “[a]t the national level, the NCBE has taken effective steps to eliminate gender bias in tests.” To read the full report, click here.
Maybe the task force is correct. After all, a slightly higher percentage of women than men passed the California Bar Exam last year. To check out last year’s statistics, click here. To peruse statistics from other years, click here. Moreover, on the original Stanford-Binet IQ tests, women scored 2-4% higher than men. But psychologists quickly revised the tests “so as to try to bring about equal averages for females and males.” To read more about how gender and race influenced the evolution of intelligence testing (and to see an example of a horrifyingly offensive test question), click here.
So does that mean today’s standardized tests are gender-neutral? The National Center for Fair and Open Testing (“FairTest”) says “No!” Citing a joint study by the Educational Testing Service and the College Board, FairTest determined that most aptitude tests favor males. Although FairTest did not focus on bar admission exams, its findings are informative. To read these findings (which I will discuss below) click here.
FairTest’s research found that “the multiple-choice format itself is biased against women.” While men tend to perform better on multiple-choice tests, women tend to perform better on “constructed-response tests” (like short-answer questions and essays). FairTest argues that “equity concerns would dictate a mix of the two types of assessment instruments.” Sorry, ladies. The MPRE is entirely multiple choice.
What may be more interesting is why multiple-choice tests favor males. Studies show that “males are more likely to take risks on the test and guess when they do not know the answer.” Women, on the other hand, “tend to answer the question only if they are sure they are correct.” Time pressure also favors men, because women prefer to “work a problem out completely, to consider more than one possible answer, and to check their answers.” In short, the multiple-choice format punishes test takers who are thorough and cautious.
The result? A test taker's capacity for thorough analysis and cautious decision-making could make her less likely to pass a test of her “professional responsibility.”