Yes, IQ Really Matters

    The College Board—the standardized testing behemoth that develops and administers the SAT and other tests—has redesigned its flagship product again. Beginning in spring 2016, the writing section will be optional, the reading section will no longer test “obscure” vocabulary words, and the math section will put more emphasis on solving problems with real-world relevance. Overall, as the College Board explains on its website, “The redesigned SAT will more closely reflect the real work of college and career, where a flexible command of evidence—whether found in text or graphic [sic]—is more important than ever.”

    A number of pressures may be behind this redesign. Perhaps it’s competition from the ACT, or fear that unless the SAT is made to seem more relevant, more colleges will go the way of Wake Forest, Brandeis, and Sarah Lawrence and join the “test optional admissions movement,” which already boasts several hundred members. Or maybe it’s the wave of bad press that standardized testing, in general, has received over the past few years.

    Critics of standardized testing are grabbing this opportunity to take their best shot at the SAT. They make two main arguments. The first is simply that a person’s SAT score is essentially meaningless—that it says nothing about whether that person will go on to succeed in college. Leon Botstein, president of Bard College and longtime standardized testing critic, wrote in Time that the SAT “needs to be abandoned and replaced,” and added:

    The blunt fact is that the SAT has never been a good predictor of academic achievement in college. High school grades adjusted to account for the curriculum and academic programs in the high school from which a student graduates are. The essential mechanism of the SAT, the multiple choice test question, is a bizarre relic of long outdated 20th century social scientific assumptions and strategies.
    Calling use of SAT scores for college admissions a “national scandal,” Jennifer Finney Boylan, an English professor at Colby College, argued in the New York Times that:

    The only way to measure students’ potential is to look at the complex portrait of their lives: what their schools are like; how they’ve done in their courses; what they’ve chosen to study; what progress they’ve made over time; how they’ve reacted to adversity.
    Along the same lines, Elizabeth Kolbert wrote in The New Yorker that “the SAT measures those skills—and really only those skills—necessary for the SATs.”

    But this argument is wrong. The SAT does predict success in college—not perfectly, but relatively well, especially given that it takes just a few hours to administer. And, unlike a “complex portrait” of a student’s life, it can be scored in an objective way. (In a recent New York Times op-ed, the University of New Hampshire psychologist John D. Mayer aptly described the SAT’s validity as an “astonishing achievement.”) In a study published in Psychological Science, University of Minnesota researchers Paul Sackett, Nathan Kuncel, and their colleagues investigated the relationship between SAT scores and college grades in a very large sample: nearly 150,000 students from 110 colleges and universities. SAT scores predicted first-year college GPA about as well as high school grades did, and the best prediction was achieved by considering both factors. Botstein, Boylan, and Kolbert are either unaware of this directly relevant, easily accessible, and widely disseminated empirical evidence, or they have decided to ignore it and base their claims on intuition and anecdote—or perhaps on their beliefs about the way the world should be rather than the way it is.

    Furthermore, contrary to popular belief, it’s not just first-year college GPA that SAT scores predict. In a four-year study that started with nearly 3,000 college students, a team of Michigan State University researchers led by Neal Schmitt found that test score (SAT or ACT—whichever the student took) correlated strongly with cumulative GPA at the end of the fourth year. If the students were ranked on both their test scores and cumulative GPAs, those who had test scores in the top half (above the 50th percentile, or median) would have had a roughly two-thirds chance of having a cumulative GPA in the top half. By contrast, students with bottom-half test scores would have had only about a one-third chance of making it to the top half in GPA.
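    As a rough sketch (not from the article), the "two-thirds chance" figure is what you would expect if test scores and cumulative GPA were jointly normal with a correlation around .5: a standard quadrant-probability result gives P(top half on Y | top half on X) = 1/2 + arcsin(ρ)/π. The function name and the ρ = .5 value here are illustrative assumptions, not numbers reported by the study.

```python
import math

def top_half_given_top_half(rho):
    """P(Y above its median | X above its median) for a bivariate normal
    pair with correlation rho, via the classic quadrant-probability formula:
    P(X>0, Y>0) = 1/4 + arcsin(rho)/(2*pi), divided by P(X>0) = 1/2."""
    return 0.5 + math.asin(rho) / math.pi

# With rho = 0.5, the conditional probability works out to exactly 2/3.
print(round(top_half_given_top_half(0.5), 4))  # → 0.6667
```

    Note that a correlation of .5 (substantial by social-science standards) is enough to produce the two-thirds versus one-third split described above.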

    Test scores also predicted whether the students graduated: A student who scored in the 95th percentile on the SAT or ACT was about 60 percent more likely to graduate than a student who scored in the 50th percentile. Similarly impressive evidence supports the validity of the SAT’s graduate school counterparts: the Graduate Record Examinations, the Law School Admissions Test, and the Graduate Management Admission Test. A 2007 Science article summed up the evidence succinctly: “Standardized admissions tests have positive and useful relationships with subsequent student accomplishments.”

    SAT scores even predict success beyond the college years. For more than two decades, Vanderbilt University researchers David Lubinski, Camilla Benbow, and their colleagues have tracked the accomplishments of people who, as part of a youth talent search, scored in the top 1 percent on the SAT by age 13. Remarkably, even within this group of gifted students, higher scorers were not only more likely to earn advanced degrees but also more likely to succeed outside of academia. For example, compared with people who “only” scored in the top 1 percent, those who scored in the top one-tenth of 1 percent—the extremely gifted—were more than twice as likely as adults to have an annual income in the top 5 percent of Americans.

    The second popular anti-SAT argument is that, if the test measures anything at all, it’s not cognitive skill but socioeconomic status. In other words, some kids do better than others on the SAT not because they’re smarter, but because their parents are rich. Boylan argued in her Times article that the SAT “favors the rich, who can afford preparatory crash courses” like those offered by Kaplan and the Princeton Review. Leon Botstein claimed in his Time article that “the only persistent statistical result from the SAT is the correlation between high income and high test scores.” And according to a Washington Post Wonkblog infographic (which is really more of a disinfographic) “your SAT score says more about your parents than about you.”

    It’s true that economic background correlates with SAT scores. Kids from well-off families tend to do better on the SAT. However, the correlation is far from perfect. In the University of Minnesota study of nearly 150,000 students, the correlation between socioeconomic status, or SES, and SAT was not trivial but not huge. (A perfect correlation has a value of 1; this one was .25.) What this means is that there are plenty of low-income students who get good scores on the SAT; there are even likely to be low-income students among those who achieve a perfect score on the SAT.
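    A minimal Monte Carlo sketch (my illustration, not the study's analysis) shows why a correlation of .25 still leaves room for plenty of high-scoring low-income students. It assumes SES and SAT are jointly normal with r = .25, the figure quoted above; the two-standard-deviation cutoff for "high scorers" (roughly the top 2 percent) is an arbitrary illustrative choice.

```python
import math
import random

random.seed(0)
rho = 0.25          # SES-SAT correlation from the Minnesota study
n = 200_000

high_scorers = 0
below_median_ses = 0
for _ in range(n):
    ses = random.gauss(0, 1)
    # Build an SAT score correlated with SES at level rho.
    sat = rho * ses + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
    if sat > 2:                 # roughly the top 2% of SAT scores
        high_scorers += 1
        if ses < 0:             # below-median socioeconomic status
            below_median_ses += 1

# Under these assumptions, roughly 3 in 10 top scorers come from
# below-median-SES families.
print(round(below_median_ses / high_scorers, 2))
```

    Under these assumptions, about 30 percent of the simulated top scorers come from below-median-SES households, which is the point: a modest correlation constrains the averages, not the individuals.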
    More.

    Read the second page. Advocates of "experience-based" admissions criteria are short-changing their pet groups. Adjusting test scores and shoehorning in bonus points for having an imprisoned Dad or crack-addicted Mom just push kids into an academic reality they aren't prepared to meet. What good is it to get admitted, get however many loans, and then drop out in the second year?

    You have a boatload of debt and nothing for it. As an employer, I don't need your "some college" when I can get loads of applicants with degrees who will work for the same pay.

    The truth is that the SAT/ACT fairly accurately projects how you will perform in a classroom and in a cubicle. It doesn't predict how you will parlay 20 bucks and a garage into the next Microsoft. But hardly anybody in college will have a side business that makes millions.

    The average college graduate going into the average job has to do things like show up, read the handbook, volunteer for crap stuff, wade through pages of sloppily written dross and then summarize them in a PowerPoint presentation. Also, they should know what "dross" means. They have to negotiate with peers and supervisors without using knives. They need to make plausible presentations in front of peers who want their jobs. They have to develop expertise in narrow areas. They have to read a lot, write some, and defend it all in front of potentially adverse clients, stakeholders, audiences, etc.

    The SAT is a better predictor of those skills than a rap video or a selfie interview about drugs/sex/violence in the hood or on the farm.

    Slate
    "Alexa, slaughter the fatted calf."

  • #2
    Originally posted by Gingersnap View Post
    Academic research is getting way out of touch. These researchers believe the talking points they themselves created as a means to get grant money and social-engineering control. The reason SAT scores generally correlate with parents' income is the same reason that high-income parents generally raise kids who do well financially: the skills required to prosper are taught at home much more than at school.

    As a gifted child myself I was "invited" to participate in dozens of these research studies. I first took the ACT in 5th grade. I first took the SAT in 7th grade. I audited a graduate (MBA I think) Statistics course for 3 weeks one summer - got a 71 on the final. I was the only white kid in a Boy's Club where the leaders (who I now know were doing their PhD theses) tested various forms of "social intelligence." I was even evaluated by a study measuring language talent - one researcher said that she thought French was my actual first language since my Mom sang to me in French (her native language) when I was an infant. I think there's something to her theory. I've never considered my ability to do well as a linguist to be something that made me "smart." I know some good linguists that are idiots - not savants, not socially dysfunctional, just normal people who are great translators and are stupid in everything else, 85 IQ.

    Throughout all of those years of being a lab rat I realized a very important fact: most researchers in these social sciences are not "gifted" themselves and are not capable of understanding what makes us tick because of their biases. They go too far in either direction, either applying their versions of adult logic to us or viewing us as "special" just like the kids in the "Special Olympics."

    This is just more of that same failed methodology. I'm not sure if it's some social endgame or if it's just that a bunch of average people are evaluating a complicated dynamic and aren't up to the task. It's clear to me that a sample size of 1 isn't statistically valid but that didn't stop me from being the sole subject of multiple theses.
    "Faith is nothing but a firm assent of the mind : which, if it be regulated, as is our duty, cannot be afforded to anything but upon good reason, and so cannot be opposite to it."
    -John Locke

    "It's all been melded together into one giant, authoritarian, leftist scream."
    -Newman



    • #3
      This is interesting:

      Scoring well on the SAT may in fact be the only such opportunity for students who graduate from public high schools that are regarded by college admissions offices as academically weak.
      I think that's code for "success" from the perspective of University Presidents. The primary goal is to admit students that are going to graduate, prosper, and be donors. From that perspective, they don't care what the SAT really means. They only care about what it means to them and if higher SAT scores equate to a higher graduation and donor rate, that's what they are going with. Perhaps a high SAT score predicts involved parents that are going to make sure their angel performs and therefore the checks keep coming. Perhaps a high SAT score predicts the next Mark Cuban.
