Sign Me Up, Tear Me Down: Recruiting Rankings and On-Field Success

The relationship of recruiting and success may be self-evident enough to fall under the heading of "duh studies," but we are approaching the peak not only of heavy breathing over collections of teenage talent, but also of relentless mocking of said breathing, and of the big-business attempts to sate ever-growing recruiting lust with very official, inscrutably-reasoned rankings of that talent. Newspapers will anecdotally trash the system in one-sentence (and occasionally one-word) paragraphs on Signing Day, more or less mirroring the Wizard of Odds' rhetorical smack before last year’s rankings were released: "snake oil salesmen have more credibility."

Recruiting is an inherently chicken-and-egg business, which leads to opinions like this, cribbed from a run-of-the-mill debate on a UConn message board last week but representative of the same wider scorn for guru ratings (capitalization and lack thereof sic):

for every USC, OSU, LSU, UGA, and USC that are in the top 10, there schools like Tennessee, Michigan State, Penn State, FSU, Miami, & Alabama, that haven't come close to meeting expectations set by the recruiting rankings. 

it's not groundbreaking to predict that the best players are being recruited by the schools that have, traditionally, been the best.  they are playing the odds, they are grading on a curve.  they figure the schools that have been the best, will stay the best.  so they just rate the recruits of the bigtime programs higher, therefore the bigtime programs will always be ranked higher in the standings.  so when LSU wins a national championship they can say "i told you so".  But they certainly never predicted the success of Wake Forest, UConn, Boise State, or Hawaii.
- - -

Oh, the "success" of UConn, following back-to-back seasons of six and eight losses with stirring beatdowns of blue chip factories Duke, Maine, Temple, Pittsburgh, Akron, Louisville, South Florida, Rutgers and Syracuse (be sure to look for these powerhouses on the first chart below).

I respect said poster, though, for actually bringing some numbers to the table, namely those of Miami and Florida State on the one end and Wake Forest, Hawaii, Boise State and UConn on the other, about which the recruitniks have been so, so wrong the last two years (well, in UConn’s case, just one – most of it, anyway). To wit, his trump cards, aggregating the success of classes since 2003:

2007 results:
Uconn: 9-4 (13-12 over the last 2 seasons) average class of 71
Cuse: 2-10 (6-18 over the last 2 seasons) average class of 53
FSU: 7-6  (14-12 over the last 2 seasons) average class of 7
Miami: 5-7 (12-13 over the last 2 seasons) average class of 8
WVU: 11-2 (22-4 over the last 2 seasons**) average class of 49
BSU: 10-3 (23-3 over the last 2 seasons**) average class of 74
Hawaii: 12-1 (23-4 over the last 2 seasons) average class of 87

**indicates program won a BCS championship within the last 2 years.

We have 3 programs (Uconn, Hawaii, BSU) who's classes are consistently in the bottom third of D1A who have all won conference championships within the past 2 years, and have 1 BCS championship between them.  I know, I know, BSU and Hawaii play much softer schedules.  But only once in four years did one of them lead their conference in recruiting according to Scout.  And in the past 2 years between them they have a BCS championship, 2 conference championships, and a 46-7 record.  Not bad.

we have 2 programs (WVU, Cuse) who's classes are consistently ranked right around the middle of D1A.  One came within 1 loss of a national championship, but still has a BCS championship (actually 2, but that was 3 years ago) while the other has only managed 2 BCS conference wins in 2 years, and only has 6 wins in the last 2 years.  based on recruiting rankings, cuse should be competing with WVU for championships, not UConn.
- - -
[Emphasis original]

Maybe that’s why UConn’s coach makes so much money these days despite losing the de facto conference championship game by 45 points.

To the point (that is, the counterpoint): If you’re going to make a claim about the numbers – especially if the claim is that the numbers don’t matter – you have to use all the numbers. In our case, that means going back to 2002 (the last year for national rankings archives on Rivals.com) and auditing the site’s projections against actual performance over the same span of time. I have no particular reason to use Rivals over any other outlet, except that it has the numbers, those numbers are not wildly different from anyone else’s, and since I want to be able to speak in generalities rather than conduct a journal-ready academic study of an admittedly un-scientific process, it will suffice for that big picture.

To keep things manageable – and since they collectively lost 83 percent of the time last year against schools from BCS conferences, proving a gap that would likely wreck fair comparisons – non-BCS conference teams are not included. Four teams that have made the move from non-BCS to BCS in that span (Louisville, Cincinnati, Connecticut and South Florida) are included, and the one team that has dropped from BCS to non-BCS (Temple) is not included.

Those rankings:

Rank School 2002 2003 2004 2005 2006 2007 Avg.
1 Southern Cal 13 3 1 1 1 2 3.5
2 Georgia 3 6 6 10 4 9 6.3
3 Oklahoma 7 4 8 3 9 14 7.5
4 Florida 20 2 7 15 2 1 7.8
5 LSU 15 1 2 22 7 4 8.5
6 Florida State 4 21 3 2 3 21 9.0
7 Texas 1 15 10 20 5 5 9.3
8 Miami 8 5 4 7 14 19 9.5
9 Tennessee 2 18 11 4 23 3 10.2
10 Auburn 6 11 21 13 10 7 11.3
11 Michigan 16 17 5 6 13 12 11.5
12 Ohio State 5 41 9 12 12 15 15.7
13 South Carolina 11 8 35 23 24 6 17.8
14 Notre Dame 24 12 32 40 8 8 20.7
Texas A&M 23 10 13 8 27 43 20.7
16 Alabama 30 49 15 18 11 10 22.2
17 Nebraska 40 42 27 5 20 13 24.5
18 California 64 14 23 9 19 22 25.2
19 Virginia 12 20 40 19 39 25 25.8
20 Arkansas 26 29 22 24 26 31 26.3
21 UCLA 9 36 34 26 17 40 27.0
22 Maryland 35 32 17 16 29 35 27.3
23 Okla. State 27 15 37 42 22 30 28.8
24 Ole Miss 33 38 30 30 16 27 29.0
25 Oregon 49 26 12 28 49 11 29.3
Arizona State 18 22 31 32 28 45 29.3
27 No. Carolina 42 13 36 44 30 17 30.3
28 Penn State 21 93 14 25 6 24 30.5
29 Arizona 25 33 44 21 18 44 30.8
30 Virginia Tech 45 27 41 14 32 29 31.3
31 Clemson 22 67 53 17 15 16 31.7
32 Washington 19 23 19 66 35 36 33.0
33 N.C. State 34 7 28 27 54 49 33.2
34 Colorado 10 19 49 43 48 32 33.5
35 Miss. State 17 9 62 33 44 39 34.0
36 Missouri 29 28 29 39 47 33 34.2
37 Pittsburgh 39 34 48 38 21 26 34.3
38 Kansas State 14 60 18 36 41 38 34.5
39 Iowa 51 43 38 11 40 28 35.2
40 Illinois 41 30 50 51 30 20 37.0
41 Boston College 43 24 24 49 37 46 37.2
42 Mich. State 32 66 16 35 33 42 37.3
43 Purdue 38 31 20 29 50 59 37.8
44 West Virginia 37 46 47 31 52 23 39.3
45 Wisconsin 50 40 39 33 42 34 39.7
46 Texas Tech 48 44 33 37 25 52 39.8
47 Oregon State 52 51 26 47 43 47 44.3
48 Louisville 59 35 64 45 34 41 46.3
49 Stanford 54 25 57 41 53 51 46.8
50 Kansas 58 39 51 48 38 50 47.3
51 Rutgers 47 45 52 68 45 37 49.0
52 Iowa State 31 46 42 58 63 60 50.0
53 Syracuse 44 52 54 56 51 48 50.8
54 Georgia Tech 63 50 56 62 57 18 51.0
55 Wash. State 28 96 25 52 46 62 51.5
56 Minnesota 55 37 58 55 62 57 54.0
57 Kentucky 94 63 45 67 36 54 59.8
58 South Florida 90 61 43 50 59 58 60.2
59 Northwestern 65 69 78 52 81 53 66.3
60 Baylor 62 76 87 59 68 64 69.3
61 Duke 68 109 70 46 56 78 71.2
62 Vanderbilt 76 78 66 87 60 67 72.3
63 Indiana 81 59 59 72 84 97 75.3
64 Wake Forest 77 57 95 65 75 89 76.3
65 Connecticut 103 88 87 79 85 65 84.5
66 Cincinnati 100 92 80 94 102 89 92.8
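
The "Avg." column above is nothing fancier than an unweighted mean of each school's six yearly class ranks. A minimal sketch, with a few rows copied from the table (the function name is mine, not Rivals'):

```python
# Reproduce the "Avg." column: a simple unweighted mean of each school's
# six yearly Rivals class ranks (2002-2007), rounded to one decimal place.
# Yearly ranks below are copied from the table above.
rivals_ranks = {
    "Southern Cal": [13, 3, 1, 1, 1, 2],
    "Georgia":      [3, 6, 6, 10, 4, 9],
    "Cincinnati":   [100, 92, 80, 94, 102, 89],
}

def average_rank(yearly_ranks):
    """Unweighted mean of yearly class ranks, one decimal place."""
    return round(sum(yearly_ranks) / len(yearly_ranks), 1)

for school, ranks in rivals_ranks.items():
    print(school, average_rank(ranks))
# Southern Cal 3.5, Georgia 6.3, Cincinnati 92.8 -- matching the table
```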

Take a breath. Now, on-field winning percentage in relation to those rankings. A more careful study would divide winning percentages into rolling four- or five-year periods and weight the results of certain seasons to account for the differences in a class as freshmen and sophomores and as upperclassmen, etc., but again, this is not intended to be that specific or time-consuming. The years 2002-2007 encompass the entire results of three classes ranked above and at least half the contributions of two more. If you long for more specificity, the numbers are there to sate your curiosity.

The records below are against BCS conference teams only (that is a control, since non-BCS teams weren’t included in the rankings) from the same years represented above, 2002-2007. Overachieving teams – those that were ten spots or more better in winning percentage than their recruiting rankings – are shaded in blue; underachieving teams – those that finished ten positions below their average recruiting ranking – are shaded in red. If the reader believes the rankings should be held to a higher standard than "within ten positions," we have very different perceptions of the reasonable limits of scouting.
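
The classification rule above boils down to one subtraction. A hedged sketch of it (function name and threshold parameter are my labels; sample values are from the table below):

```python
# The over/underachiever test described above: subtract a team's rank by
# winning percentage from its average Rivals recruiting rank. A gap of ten
# spots or more in either direction flags the team.
def classify(win_pct_rank, rivals_rank, threshold=10):
    """Positive difference = on-field results better than recruiting rank."""
    diff = rivals_rank - win_pct_rank
    if diff >= threshold:
        return "overachiever"    # shaded blue in the original table
    if diff <= -threshold:
        return "underachiever"   # shaded red
    return "within range"

print(classify(8, 48))   # Louisville: +40, overachiever
print(classify(47, 14))  # Texas A&M: -33, underachiever
print(classify(4, 7))    # Texas: +3, within range
```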

Winning % vs. BCS Schools, 2002-2007
Rank Team Record Win % Rivals Rank Difference
1 Southern Cal 63-8 .887 1 -
2 Ohio State 54-11 .831 12 + 10
3 Oklahoma 54-13 .806 3 -
4 Texas 48-12 .800 7 + 3
5 Georgia 52-15 .776 2 -3
6 LSU 48-15 .762 5 -1
7 Auburn 46-17 .730 10 + 3
8 Louisville 31-13 .705 48 + 40
9 Michigan 45-19 .703 11 + 2
10 W. Virginia 40-18 .690 44 + 34
11 Virginia Tech 41-20 .672 30 + 19
12 Miami 40-22 .645 8 -4
13 Florida 40-23 .635 4 -9
14 Wisconsin 38-22 .633 45 + 31
15 Boston Coll. 35-21 .625 41 + 26
16 California 38-23 .623 18 + 2
17 Iowa 39-24 .619 39 + 21
18 Florida State 44-28 .611 6 -12
19 Tennessee 38-25 .603 9 -10
20 Virginia 37-25 .597 19 -1
21 Texas Tech 34-24 .586 46 + 25
22 Oregon State 32-24 .571 47 + 25
23 Georgia Tech 37-29 .561 54 + 31
24 Oregon 34-28 .548 25 + 1
25 Arizona State 35-29 .547 25 -
26 Clemson 35-29 .547 31 + 5
27 Notre Dame 33-28 .541 14 -13
28 Missouri 31-27 .535 36 + 8
29 Maryland 32-28 .533 22 -7
30 UCLA 33-29 .532 21 -9
31 Penn State 32-29 .525 28 -3
32 Nebraska 31-31 .500 17 -15
Purdue 32-32 .500 43 + 11
34 Wake Forest 31-33 .484 64 + 30
35 Arkansas 28-31 .475 20 -15
36 Alabama 27-30 .474 16 -20
37 Kansas State 27-29 .466 38 + 1
38 Pittsburgh 25-29 .463 37 + 1
39 Wash. State 27-32 .458 55 + 16
40 N.C. State 26-31 .456 33 -7
41 South Florida 17-21 .447 58 + 17
42 Oklahoma State 25-31 .446 23 -19
43 Colorado 29-37 .439 34 -9
44 Northwestern 24-33 .421 59 + 15
45 Cincinnati 15-21 .417 66 + 21
So. Carolina 25-35 .417 13 -32
47 Texas A&M 24-34 .414 14 -33
48 Minnesota 22-33 .400 56 + 8
49 Mich. State 24-38 .387 42 -7
50 UConn 17-28 .378 65 + 15
51 Kansas 20-33 .377 50 -1
52 Rutgers 18-31 .367 51 -1
53 Kentucky 21-39 .350 57 + 4
54 Ole Miss 18-37 .327 24 -30
55 Iowa State 18-38 .321 52 -3
56 North Carolina 18-45 .286 27 -29
57 Washington 16-43 .271 32 -24
58 Arizona 14-42 .259 29 -29
59 Stanford 13-44 .228 49 -10
60 Illinois 13-47 .217 40 -20
61 Syracuse 13-44 .228 53 -8
62 Vanderbilt 11-45 .196 62 -
63 Indiana 11-46 .193 63 -
64 Miss. State 9-43 .173 35 -29
65 Baylor 8-42 .160 60 -5
66 Duke 4-54 .069 61 -5

The result is outstanding for the recruiting rankings at the top – five of the top seven in winning percentage were in Rivals’ top seven in the recruiting averages, all of the recruiting top ten were in the top 20 on the field, and Rivals was exactly right or within a scant three positions of the on-field results of Southern Cal, Oklahoma, Texas, Georgia, LSU, Auburn, Michigan, California, Virginia, Oregon, Arizona State, Penn State, Kansas State, Pittsburgh, Kansas, Rutgers, Iowa State, Vanderbilt and Indiana.

It was within nine positions, though, of only 33 teams, exactly half the sample; 16 teams had significantly better winning percentages than their recruiting rankings would predict, and 17 had significantly worse. Based on that alone, a skeptic could still say, "The rankings have exactly the same chance of being wrong as they do of being right – they are unreliable."

On that point, you may have noticed some conference-specific imbalances in the number of teams appearing in red or blue – in the SEC, for example, half the conference "underperformed," and not one single team finished appreciably better than its recruiting rankings would suggest. Of the seven "overachieving" teams in the top 20 in winning percentage, on the other hand, all either are or were at some point in the sample members of the Big East (Louisville, West Virginia, Virginia Tech, Boston College) or Big Ten (Ohio State, Wisconsin, Iowa), two conferences loaded with overachievers:

Overachievers vs. Underachievers By Conference (Notre Dame, an independent, is not included)
ACC Big East Big Ten Big 12 Pac Ten SEC Total
Within Rivals Rank 6 3 6 7 5 6 33
Worse Than Rank 2 - 1 4 3 6 16
Better Than Rank 4 5 4 1 2 - 16

Either the SEC and Big 12 are massively overrated in the recruiting rankings and the Big Ten and Big East are massively underrated, or something else is going on.

Underachievers vs. Higher/Equal-Ranked Opponents
Overall vs. Higher/Equal-Ranked % of Total Losses
Florida State 44-28 4-10 35.7
Tennessee 38-25 16-18 72.0
Notre Dame 33-28 6-12 42.9
Nebraska 31-31 5-13 41.9
Arkansas 28-31 19-24 77.4
Alabama 27-30 9-25 83.3
Okla. State 25-31 8-20 64.5
So. Carolina 25-35 7-26 74.3
Texas A&M 24-34 4-15 44.1
Ole Miss 18-37 8-27 73.0
No. Carolina 18-45 7-19 42.2
Washington 16-43 6-34 79.1
Arizona 14-42 10-28 66.7
Stanford 13-44 13-44 100.0
Illinois 13-47 6-34 72.3
Miss. State 9-43 6-36 83.7

For eleven of the 17 "underachievers," the result is at least partially a product of the schedule: in some cases – like Tennessee, Arkansas, South Carolina and Alabama, which all have outstanding records against teams ranked below them in the recruiting rankings – they were playing so many teams stocked, according to Rivals, with equal or better talent ("equal" in this case being defined as within 5-8 slots of the average ranking over six years), and for all eleven, those allegedly more talented teams account for a huge majority of the losses. Their lack of success is predictable because of their schedule. Even teams like Arizona, Illinois, Washington and Mississippi State, which have losing records against less-talented BCS teams since 2002, count the overwhelming majority of their defeats against teams with better or extremely comparable talent based on the rankings.
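
The "% of Total Losses" column in the table above is just a share of the loss column. A minimal sketch (the function name is my label; records are copied from the table):

```python
# "% of Total Losses" from the underachievers table: losses to equal-or-
# higher-ranked opponents as a percentage of all losses, one decimal place.
def pct_of_losses(total_losses, losses_vs_higher):
    return round(100 * losses_vs_higher / total_losses, 1)

print(pct_of_losses(25, 18))  # Tennessee (38-25 overall, 16-18 vs higher/equal): 72.0
print(pct_of_losses(30, 25))  # Alabama (27-30 overall, 9-25 vs higher/equal): 83.3
print(pct_of_losses(44, 44))  # Stanford (13-44 both overall and vs higher/equal): 100.0
```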

The inverse is somewhat true in looking at teams that won significantly more than their recruiting ranking would suggest – "higher-ranked" in this case refers to teams with an average Rivals rank over the last six years at least six spots higher than the team in question; it does not include "equals," since there is no "overachieving" at play against equals:

Overachievers vs. Higher-Ranked Opponents
Overall vs. Higher-Ranked % of Total Wins
Ohio State 54-11 2-3 3.7
Louisville 31-13 8-3 25.8
W. Virginia 40-18 14-10 35.0
Virginia Tech 41-20 15-9 36.6
Wisconsin 38-22 14-10 36.8
Boston Coll. 35-21 20-11 57.1
Iowa 39-24 9-11 23.1
Texas Tech 34-24 22-23 64.7
Oregon State 32-24 22-21 68.8
Georgia Tech 37-29 24-27 64.9
Purdue 32-32 9-16 28.1
Wake Forest 31-33 21-32 67.7
Wash. State 27-32 19-27 70.4
So. Florida 17-21 11-16 64.7
N'western 24-33 18-27 60.0
Cincinnati 15-21 8-19 53.3
UConn 17-28 10-23 58.8

Not many teams were ranked appreciably higher than Ohio State, and the Buckeyes have played almost none of them. Behind OSU and Michigan, the rest of the Big Ten was ranked in the middle of the pack: Iowa, Illinois, Michigan State, Purdue and Wisconsin occupied spots 39, 40, 42, 43 and 45 in the aggregate rankings. The winners of that group (Wisconsin and Iowa) are therefore "overachievers" in the long haul while facing very few teams that were thought to have superior talent. The same thing occurs in the Big East: West Virginia and Louisville, despite good records against higher-ranked teams, have feasted overwhelmingly on supposedly less-talented teams, because that’s who occupies the rest of the conference. South Florida, UConn and Cincinnati have overachieved – it is not possible to underachieve from the very bottom of the rankings – but are still below .500 against BCS conference schools overall and well below that mark against those considered more talented in the Rivals rankings. All of them, without exception, are more successful against equally or less-talented teams.

Boston College, Georgia Tech, Texas Tech and Oregon State have a much stronger record as "overachievers" – they play more higher-ranked teams and are all at about .500 against that "superior" talent. Wake Forest is on pace to join that group after the last two seasons, especially if the ACC remains so friendly to conservative, defensive teams, but like their fellow recruiting bottom-dwellers in the Big East, their success is a result of beating teams in the middle of the pack, almost never (Wake’s two-year streak over Florida State, one of the real underachievers, is an exception) against teams at the top.

So: Rivals was very, very good at picking the top teams – of the top 25 winningest teams of the last six years, all were either pegged in or very near their respective positions by the recruiting rankings or achieved them by winning against overwhelmingly lower-ranked opposition; of the top 25 teams according to the recruiting rankings, 18 are in the top 30 in winning percentage. This is to be expected when you spend most of your time distinguishing between a small number of high-profile, four- and five-star guys but can’t possibly make the same level of distinction among a much larger number of two- and three-star prospects with more variability among them than the star-based rankings are designed to show. If Rivals indicates a team’s talent is good, it’s probably right; if it indicates it’s just average, or below average, that team probably still has a shot – but only to an extent. You’d be wrong if you cast your lot with the gurus completely, and wronger if you ignored them.

To look at this list and conclude "Recruiting rankings are nonsense because Florida State and Wake Forest" is obtuse and shortsighted, as is the suggestion that the players Florida State brought in weren’t really that good, just artificially inflated by the fact they were being recruited by Florida State – "they just rate the recruits of the bigtime programs higher." In one isolated case, this could be true, but given the larger correlation of "on paper" rankings to "on field" success, it’s an argument that’s wrong on its face to the point of absurdity: it suggests that successful programs just are successful, regardless of the players they recruit. If the gurus followed the larger schools’ assessments to a fault, if USC was being ranked at the top of the lists simply because it was USC, even if it was signing the same players as UCLA, the Trojans would obviously finish much closer to UCLA’s results. Substitute "USC" and "UCLA" in that sentence for an overwhelming majority of the possible comparisons between teams who are not close to one another in the rankings. It’s not groundbreaking because it’s obviously true: Good teams are good because they recruit good players, and recruiting rankings are more accurate than their reputation suggests at predicting which players are likely to be good, and to what extent. They’re not perfect, but they’re not random. What the gurus get wrong pales in comparison to what they get right, and even much of what they get wrong has more to do with injuries, academics, boneheadedness, competition from other hotshots and coaching – see the immediate bounce from mediocrity to dominance at Florida under Urban Meyer, Georgia under Mark Richt, Texas under Mack Brown, USC under Pete Carroll, Oklahoma under Bob Stoops, Ohio State under Jim Tressel, Auburn under Tommy Tuberville and LSU under Nick Saban in the last decade for evidence that even with raw talent on hand, coaching matters – than missing on the talent. The rankings are a serviceable baseline for expectation.