The relationship between recruiting and success may be self-evident enough to fall under the heading of "duh studies," but we are approaching the peak not only of heavy breathing over collections of teenage talent, but also of relentless mocking of said breathing, and of the big-business attempts to sate ever-growing recruiting lust with very official, inscrutably reasoned rankings of that talent. Newspapers will anecdotally trash the system in one-sentence (and occasionally one-word) paragraphs on Signing Day, more or less mirroring the Wizard of Odds’ rhetorical smack before last year’s rankings were released: "snake oil salesmen have more credibility."
Recruiting is an inherently chicken-and-egg business, which leads to opinions like the following, cribbed from a run-of-the-mill debate on a UConn message board last week but representative of the same wider scorn for guru ratings (capitalization and lack thereof sic):
it's not groundbreaking to predict that the best players are being recruited by the schools that have, traditionally, been the best. they are playing the odds, they are grading on a curve. they figure the schools that have been the best, will stay the best. so they just rate the recruits of the bigtime programs higher, therefore the bigtime programs will always be ranked higher in the standings. so when LSU wins a national championship they can say "i told you so". But they certainly never predicted the success of Wake Forest, UConn, Boise State, or Hawaii.
- - -
I respect said poster, though, for actually bringing some numbers to the table, namely those of Miami and Florida State on one end and Wake Forest, Hawaii, Boise State and UConn on the other, about which the recruitniks have been so, so wrong the last two years (well, in UConn’s case just one – most of it, anyway). To wit, his trump cards, aggregating the success of classes since 2003:
Uconn: 9-4 (13-12 over the last 2 seasons) average class of 71
Cuse: 2-10 (6-18 over the last 2 seasons) average class of 53
FSU: 7-6 (14-12 over the last 2 seasons) average class of 7
Miami: 5-7 (12-13 over the last 2 seasons) average class of 8
WVU: 11-2 (22-4 over the last 2 seasons**) average class of 49
BSU: 10-3 (23-3 over the last 2 seasons**) average class of 74
Hawaii: 12-1 (23-4 over the last 2 seasons) average class of 87
**indicates program won a BCS championship within the last 2 years.
We have 3 programs (Uconn, Hawaii, BSU) who's classes are consistently in the bottom third of D1A who have all won conference championships within the past 2 years, and have 1 BCS championship between them. I know, I know, BSU and Hawaii play much softer schedules. But only once in four years did one of them lead their conference in recruiting according to Scout. And in the past 2 years between them they have a BCS championship, 2 conference championships, and a 46-7 record. Not bad.
we have 2 programs (WVU, Cuse) who's classes are consistently ranked right around the middle of D1A. One came within 1 loss of a national championship, but still has a BCS championship (actually 2, but that was 3 years ago) while the other has only managed 2 BCS conference wins in 2 years, and only has 6 wins in the last 2 years. based on recruiting rankings, cuse should be competing with WVU for championships, not UConn.
- - -
To the point (that is, the counterpoint): If you’re going to make a claim about the numbers – especially if the claim is that the numbers don’t matter – you have to use all the numbers. In our case, that means going back to 2002 (the last year for national rankings archives on Rivals.com) and auditing the site’s projections against actual performance over the same span of time. I have no particular reason to use Rivals over any other outlet, except that it has the numbers, its numbers are not wildly different from anyone else’s, and since I want to be able to speak in generalities rather than conduct a journal-ready academic study of an admittedly un-scientific process, it will suffice for that big picture.
To keep things manageable – and since they collectively lost 83 percent of the time last year against schools from BCS conferences, evidence of a gap wide enough to wreck fair comparisons – non-BCS conference teams are not included. Four teams that have made the move from non-BCS to BCS in that span (Louisville, Cincinnati, Connecticut and South Florida) are included, and the one team that has dropped from BCS to non-BCS (Temple) is not included.
Take a breath. Now, on-field winning percentage in relation to those rankings. A more careful study would divide winning percentages into rolling four- or five-year periods and weight the results of certain seasons to account for a class’s different contributions as freshmen and sophomores and as upperclassmen, etc., but again, this is not intended to be that specific or time-consuming. The years 2002-2007 encompass the complete results of three of the classes ranked above and at least half the contributions of two more. If you long for more specificity, the numbers are there to sate your curiosity.
The records below are against BCS conference teams only (that is a control, since non-BCS teams weren’t included in the rankings) from the same years represented above, 2002-2007. Overachieving teams – those whose winning-percentage rank was ten or more spots better than their average recruiting ranking – are shaded in blue; underachieving teams – those that finished ten or more positions below their average recruiting ranking – are shaded in red. If the reader believes the rankings should be held to a higher standard than "within ten positions," we have very different perceptions of the reasonable limits of scouting.
| Rank | Team | Record | Win % | Rivals Rank | Difference |
|------|------|--------|-------|-------------|------------|
| 2 | Ohio State | 54-11 | .831 | 12 | +10 |
| 10 | W. Virginia | 40-18 | .690 | 44 | +34 |
| 11 | Virginia Tech | 41-20 | .672 | 30 | +19 |
| 15 | Boston Coll. | 35-21 | .625 | 41 | +26 |
| 21 | Texas Tech | 34-24 | .586 | 46 | +25 |
| 22 | Oregon State | 32-24 | .571 | 47 | +25 |
| 23 | Georgia Tech | 37-29 | .561 | 54 | +31 |
| 34 | Wake Forest | 31-33 | .484 | 64 | +30 |
| 37 | Kansas State | 27-29 | .466 | 38 | +1 |
| 39 | Wash. State | 27-32 | .458 | 55 | +16 |
| 41 | South Florida | 17-21 | .447 | 58 | +17 |
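The over/underachiever classification described above is simple enough to sketch in a few lines of Python. The team data here is a small sample pulled from the table, and the ±10-position threshold is the one used in this piece; everything else is illustrative:

```python
# Classify teams by comparing on-field rank (winning percentage, 2002-2007)
# to average Rivals recruiting rank, using the +/-10-position threshold above.
# Sample rows taken from the table: (on-field rank, Rivals rank).
teams = {
    "Ohio State":   (2, 12),
    "W. Virginia":  (10, 44),
    "Kansas State": (37, 38),
}

def classify(onfield_rank, rivals_rank):
    # Positive difference = the team finished better than it recruited.
    diff = rivals_rank - onfield_rank
    if diff >= 10:
        return "overachiever"    # shaded blue in the table
    if diff <= -10:
        return "underachiever"   # shaded red in the table
    return "within rank"

for team, (onfield, rivals) in teams.items():
    print(team, classify(onfield, rivals))
```

By this rule Ohio State, at exactly +10, lands on the "overachiever" side of the line, which matches its placement in the table.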
The result is outstanding for the recruiting rankings at the top – five of the top seven in winning percentage were in Rivals’ top seven in the recruiting averages, all of the recruiting top ten were in the top 20 on the field, and Rivals was exactly right or within a scant three positions of the on-field results of Southern Cal, Oklahoma, Texas, Georgia, LSU, Auburn, Michigan, California, Virginia, Oregon, Arizona State, Penn State, Kansas State, Pittsburgh, Kansas, Rutgers, Iowa State, Vanderbilt and Indiana.
It was within nine positions, though, of only 33 teams, exactly half the sample; 16 teams had significantly better winning percentages than their recruiting rankings would predict, 17 teams had a significantly worse winning percentage. Based on that alone, a skeptic could still say, "The rankings have exactly the same chance of being wrong as they do of being right – they are unreliable."
On that point, you may have noticed some conference-specific imbalances in the number of teams appearing in red or blue – in the SEC, for example, half the conference "underperformed," and not one single team finished appreciably better than its recruiting rankings would suggest. Of the seven "overachieving" teams in the top 20 in winning percentage, on the other hand, all either are or were at some point in the sample members of the Big East (Louisville, West Virginia, Virginia Tech, Boston College) or Big Ten (Ohio State, Wisconsin, Iowa), two conferences loaded with overachievers:
| | ACC | Big East | Big Ten | Big 12 | Pac Ten | SEC | Total |
|---|-----|----------|---------|--------|---------|-----|-------|
| Within Rivals Rank | 6 | 3 | 6 | 7 | 5 | 6 | 33 |
| Worse Than Rank | 2 | - | 1 | 4 | 3 | 6 | 16 |
| Better Than Rank | 4 | 5 | 4 | 1 | 2 | - | 16 |
Either the SEC and Big 12 are massively overrated in the recruiting rankings and the Big Ten and Big East are massively underrated, or something else is going on.
[Table: each underachiever's overall record, record vs. higher/equal-ranked teams, and % of total losses against them]
For eleven of the 17 "underachievers," the record is at least partly a function of schedule: they were playing so many teams stocked with, according to Rivals, equal or better talent ("equal" in this case being defined as within 5-8 slots of the average ranking over six years) that their lack of success was predictable. Some of them – Tennessee, Arkansas, South Carolina and Alabama – even have outstanding records against teams ranked below them in the recruiting rankings; for all eleven, the allegedly more talented teams make up a huge majority of the losses. Even teams like Arizona, Illinois, Washington and Mississippi State, which have losing records against less-talented BCS teams since 2002, count the overwhelming majority of their defeats against teams with better or extremely comparable talent based on the rankings.
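The share-of-losses figure described above can be computed straightforwardly from a list of opponents. This is a hypothetical sketch, not the article's actual data: the "equal" band is set to 6 slots (a midpoint of the 5-8 slot definition above), and the ranks in the example are invented:

```python
# What fraction of a team's losses came against opponents Rivals ranked
# equal to or better than it? "Equal" is hedged as within EQUAL_BAND slots
# of the team's own six-year average ranking; lower rank number = better.
EQUAL_BAND = 6  # hypothetical midpoint of the article's 5-8 slot band

def loss_share_vs_higher_or_equal(team_rank, losses_to_ranks):
    """losses_to_ranks: Rivals ranks of the opponents the team lost to."""
    vs_higher_equal = [r for r in losses_to_ranks
                       if r <= team_rank + EQUAL_BAND]
    return len(vs_higher_equal) / len(losses_to_ranks)

# Hypothetical: a team ranked 15th by Rivals with five losses.
print(loss_share_vs_higher_or_equal(15, [3, 8, 12, 20, 45]))  # -> 0.8
```

Four of the five invented losses came against opponents ranked 21st or better, so 80 percent of the defeats are "explained" by superior or comparable talent – the shape of the argument being made for the eleven underachievers above.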
The inverse is somewhat true of teams that won significantly more than their recruiting ranking would suggest – "higher-ranked" in this case refers to teams with an average Rivals rank over the last six years at least six spots higher than the team in question; it does not include "equals," since there is no "overachieving" at play against equals:
[Table: each overachiever's overall record, record vs. higher-ranked teams, and % of total wins against them]
Not many teams were ranked appreciably higher than Ohio State, and the Buckeyes have played almost none of them. Behind OSU and Michigan, the rest of the Big Ten was ranked in the middle of the pack: Iowa, Illinois, Michigan State, Purdue and Wisconsin occupied spots 39, 40, 42, 43 and 45 in the aggregate rankings. The winners of that group (Wisconsin and Iowa) are therefore "overachievers" in the long haul while facing very few teams that were thought to have superior talent. The same thing occurs in the Big East: West Virginia and Louisville, despite good records against higher-ranked teams, have feasted overwhelmingly on supposedly less-talented teams, because that’s who occupies the rest of the conference. South Florida, UConn and Cincinnati have overachieved – it is not possible to underachieve from the very bottom of the rankings – but are still below .500 against BCS conference schools overall and well below that mark against those considered more talented in the Rivals rankings. All of them, without exception, are more successful against equally or less-talented teams.
Boston College, Georgia Tech, Texas Tech and Oregon State have a much stronger record as "overachievers" – they play more higher-ranked teams and are all at about .500 against that "superior" talent. Wake Forest is on pace to join that group after the last two seasons, especially if the ACC remains so friendly to conservative, defensive teams, but like its fellow recruiting bottom-dwellers in the Big East, its success is a result of beating teams in the middle of the pack, almost never against teams at the top (Wake’s two-year streak over Florida State, one of the real underachievers, is an exception).
So: Rivals was very, very good at picking the top teams – of the top 25 winningest teams of the last six years, all were either pegged at or very near their respective positions by the recruiting rankings or achieved them by winning against overwhelmingly lower-ranked opposition; of the top 25 teams according to the recruiting rankings, 18 are in the top 30 in winning percentage. This is to be expected when you spend most of your time distinguishing between a small number of high-profile, four- and five-star guys but can’t possibly make the same level of distinction among a much larger number of two- and three-star prospects with more variability among them than the star-based rankings are designed to show. If Rivals indicates a team’s talent is good, it’s probably right; if it indicates it’s just average, or below average, that team probably still has a shot – but only to an extent. You’d be wrong if you cast your lot with the gurus completely, and wronger if you ignored them.
To look at this list and conclude "Recruiting rankings are nonsense because Florida State and Wake Forest" is obtuse and shortsighted, as is the suggestion that the players Florida State brought in weren’t really that good, just artificially inflated by the fact they were being recruited by Florida State – "they just rate the recruits of the bigtime programs higher." In an isolated case, that could be true, but given the larger correlation of "on paper" rankings to "on field" success, it’s an argument that’s wrong on its face to the point of absurdity: it suggests that successful programs just are successful, regardless of the players they recruit. If the gurus followed the bigger schools’ assessments to a fault – if USC was being ranked at the top of the lists simply because it was USC, while signing players no better than UCLA’s – the Trojans’ results would obviously be much closer to UCLA’s. Substitute "USC" and "UCLA" in that sentence for an overwhelming majority of the possible comparisons between teams that are not close to one another in the rankings.

It's not groundbreaking because it's obviously true: good teams are good because they recruit good players, and the recruiting rankings are more accurate than their reputation suggests at predicting which players are likely to be good, and to what extent. They’re not perfect, but they’re not random. What the gurus get wrong pales in comparison to what they get right, and even much of what they get wrong has more to do with injuries, academics, boneheadedness, competition from other hotshots and coaching than with missing on the talent – see the immediate bounce from mediocrity to dominance at Florida under Urban Meyer, Georgia under Mark Richt, Texas under Mack Brown, USC under Pete Carroll, Oklahoma under Bob Stoops, Ohio State under Jim Tressel, Auburn under Tommy Tuberville and LSU under Nick Saban in the last decade for evidence that even with raw talent on hand, coaching matters. The rankings are a serviceable baseline for expectation.