Standardized tests used to be, well, standard. Everyone taking the test would receive the same questions in a written exam and have a specified time to answer them as accurately as he or she could.
But in this digital age — where America Online has replaced the U.S. Postal Service and computers dominate households as well as industries — it shouldn't be too surprising that even standardized tests are being administered electronically.
What should come as a surprise, however, is that these exams are not merely electronic versions of the written tests. Some test makers, such as the Educational Testing Service, offer what they call "adaptive" exams, meaning that they are designed to adapt, on the fly, to the ability of the test taker.
Here's how it works: Students who take the Graduate Record Examination — which is often a requirement for graduate school — or the Graduate Management Admission Test — which is often a requirement for business school — receive only one question at a time. If they answer correctly, they then receive a more difficult question. If they answer incorrectly, the subsequent question is easier.
Each subsequent question thus depends entirely on whether the student answered the previous item correctly. Students who answer some questions right and others wrong take an erratic path to the finish line, bouncing between easier and harder problems.
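To make the mechanics concrete, here is a minimal sketch of how such an adaptive loop could work in principle. The 1-to-9 difficulty scale, the one-level step size and the question bank are illustrative assumptions made for the example; they are not ETS's actual item-selection algorithm.

```python
# Illustrative sketch of a computer-adaptive test loop.
# The 1-9 difficulty scale, one-level step size and question bank
# are invented for demonstration; this is not ETS's method.

def run_adaptive_test(questions_by_difficulty, num_questions, ask):
    """Serve questions one at a time, adjusting difficulty as we go.

    questions_by_difficulty: dict mapping a difficulty level (1-9)
        to a list of unused questions at that level.
    ask: callback that poses a question and returns True if the
        test taker answered it correctly.
    """
    difficulty = 5  # start at medium difficulty, as described above
    history = []
    for _ in range(num_questions):
        question = questions_by_difficulty[difficulty].pop()
        correct = ask(question)
        history.append((difficulty, correct))
        if correct:
            difficulty = min(9, difficulty + 1)  # harder next time
        else:
            difficulty = max(1, difficulty - 1)  # easier next time
    return history
```

In this toy version the difficulty simply steps up or down one level after each answer; a real exam would draw on a far larger item pool and a more sophisticated selection rule.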
But how does this new adaptive test affect Princeton students?
Each year, hundreds of University students take some sort of standardized test for graduate school. Because of Princeton's perennially tough academic admission standards and high SAT scores for incoming freshmen, it is safe to assume that University students tend to score better than average on graduate-level standardized exams.
The Educational Testing Service — which administers the GRE as well as the GMAT, and is located in Princeton — claims that adaptive tests benefit all test takers because of the increased "efficiency" of the adaptive exams.
"The tests adapt to your ability," said Kevin Gonzalez, a spokesman for ETS. "If you remember when you took the SAT, the first few questions were easy, the next were medium and the last ones were very hard.
"But if you breezed through the easy ones, then we think, 'Why did we waste your time?' We should have just given you more difficult questions," he added. "This test allows us to be more efficient, so we will need fewer questions to determine your ability."
Gonzalez explained that the first question on these exams usually has a medium level of difficulty. As the student works his way through the remaining questions, ETS can home in more accurately and efficiently on his ability as measured by the test, Gonzalez claimed.

"If there is an advantage to anyone [with this test], it would be to all test takers," he explained. "We are always looking for ways to make this a better test . . . It allows us to see more interaction between the test taker and the test, and it helps us make testing a more complete experience. It would give us the best estimation of their abilities."
But not everyone believes that the adaptive exams benefit all test takers.
John Katzman '81, chief executive officer of The Princeton Review — a test-preparation organization — said students who are more accustomed to using a computer would obviously be helped by the electronic tests.
"Certainly if you are good with a computer, if you're comfortable reading tests on a computer, scrolling for example, it's to your advantage," he said. "People who are better off socioeconomically are more likely to be using computers and so more often, they are benefiting."
Yet some Princetonians disagreed, arguing that the adaptive version of the tests would not necessarily lead to higher scores for University students.
Carolyn McKee '01, an ecology and evolutionary biology major who is planning to apply to veterinary school, took the general GRE this summer.
"I don't think it mattered all that much either way," she said. "I didn't think about it much — I just tried to keep it in the back of my mind."
McKee said that she took several practice tests on the computer as well as on paper, and her scores were almost identical.
Though she does not think she would have performed significantly better with a written test, McKee did say that the electronic test created "a nicer testing environment."
"It's more convenient, and we are so used to using computers," she added.
Indeed, though the electronic versions of these tests may not directly increase a Princeton student's score, the fact that he or she is at ease sitting in front of a computer is significant. Though not all students have computers in their rooms, everyone has access to computer clusters throughout campus and can easily surf the Internet to prepare for an upcoming standardized test.
While almost all colleges offer their students computers and Internet access, Princeton has particularly strong resources in technology.
"My hunch is that it's a net benefit for most people throughout Princeton," Katzman said. "They tend to be wealthier, better prepped and more comfortable with computers."
Katzman explained that the adaptive, electronic tests are much more "coachable" than written exams. "This is going to sound like an advertisement, but it's true," he said. "It helps to practice on the computer if you've been taking paper-and-pencil tests your whole life. The idioms and the layout are a bit different, so you should definitely practice."
And Katzman's claim that University students tend to be more prepared for standardized tests does seem to hold water.
Though representatives from several academic departments such as math and chemistry said that they don't offer any special programs to help students prepare, they also said that they urge their students to practice in some way.
"For students heading to graduate school, we tell them to be sure to take practice tests just to get into the swing of the process, for time management of the test and for exposure to sample questions," undergraduate administrator for the chemistry department Kirsten Erwin wrote in an e-mail.
But while most departments do not offer any specific courses or programs for graduate school standardized tests, all students who were interviewed said they had completed some sort of preparation, whether they simply searched the Internet or took practice exams.
Though she said she is comfortable using a computer, Michelle Buckley '01 — who plans to take the general GRE soon — said the electronic version may make her more nervous about the exam.
"I don't like the way it depends on the [previous] question," she said. "People are nervous at the beginning, and maybe they're not used to it or in the testing mode."
Buckley added that she did not like the fact that students are not able to go back and check their answers once they have completed the exam, even if time remains.
"Honestly, I don't think that going to Princeton is that much more helpful than someone who went to another school, just because of what they are testing you," she explained. "I don't think standardized tests in general can [measure one's intelligence]. Presumably, all the [students at top schools] will be scoring similarly and will go to the same level of [graduate] school anyway."
"I think giving a standard set of questions to everyone is the best way to do it — equal playing ground gets equal results," she added.
But Gonzalez believes the standardized adaptive tests provide just such a level playing field.
When asked if the electronic version of the exams gives computer-literate test takers an unfair advantage, the ETS spokesman said, "These are graduate level tests. One is presuming a fair amount of experience at college on the computer . . . And one is presuming a great deal of experience in the work place."
Gonzalez added that a computer tutorial at the beginning of each test teaches the basic computer skills that are required for the exam.
"It will give you a quick run down of how to use the mouse and how to use the computer," he said. "Why would we create a test that would give someone an advantage? If you have the slightest bit of doubt, it will erase it."
Gonzalez emphasized that students who take a more difficult test will not be penalized for answering more questions incorrectly. In fact, the scores are derived by assigning different values to different questions, with greater weight given to the more difficult items.
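As a rough illustration of that idea, the toy calculation below gives each correctly answered item a point value equal to its difficulty level. The point values and the formula are assumptions made for the example, not ETS's actual scoring model.

```python
# Toy weighted-scoring sketch: harder questions are worth more points.
# Scoring each item at its difficulty level is invented for
# illustration only; it is not ETS's actual model.

def weighted_score(history):
    """Sum the difficulty values of the correctly answered items.

    history: list of (difficulty, correct) pairs, such as the output
    of the adaptive loop sketched earlier in this article.
    """
    return sum(difficulty for difficulty, correct in history if correct)

# Two test takers each answer two of three questions correctly, but
# the one who climbed to harder items earns the higher score.
print(weighted_score([(5, True), (6, True), (7, False)]))  # 11
print(weighted_score([(5, False), (4, True), (5, True)]))  # 9
```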
Christina Brown '01, a chemistry major who took the general GRE this summer, said she believes less confident test takers might be more affected by the adaptive exams. If students start missing several questions, they will lose confidence, she explained. As students become more discouraged, they might be less likely to perform well later in the exam, even if the questions are easier.
In addition, Katzman believes the reasoning behind the test is flawed on a deeper level.
"ETS is right in saying that this is a more 'efficient' test," Katzman said. "But 'efficient' has a lot of different meanings. It tells you nothing about your ability to succeed in business and business school or graduate school or in life.
"But now it costs twice as much," he explained, citing the higher cost of registering for the test. "There's no reason to believe that [the computerized] tests perform better; the test is not shorter, no more predictive and twice as expensive."
Regardless of whether the adaptive tests favor wealthier, more computer-literate students, the computerized exams are here to stay. Though the SAT is several years away from being computerized — as high school students become more comfortable with the technology — many suspect the Law School Admission Test and the Medical College Admission Test eventually will be given electronically.
For now, those tests differ markedly from the GRE and the GMAT. But when they are finally offered exclusively through computers, standardized tests may take one step closer to again being standard.