When it comes to measuring a college football program's success, I think there is no better increment of time than the decade. Fixating on year-by-year results is short-sighted, as there is simply too much natural variation in results, with significant effects from attrition, injuries, and plain luck. If we look at a full 10-year body of work, however, we have a powerful long-term indicator of each team's standing.
With that introduction, we look at the ten best teams for the decade ending 2009 (the "Aughts"). Congratulations to Boise State for putting together an amazing 112-17 record to take the top spot. There are many haters out there who will say Boise's schedule is a farce, and while I will grant you it is much weaker than most of the other teams' on this list, I think it is a hell of an accomplishment to put together that kind of elite winning consistency at a mid-level school, no matter what the competition. And Boise has had its share of big-name victories when the opportunity has arisen. Texas and Oklahoma check in at #2 and #3, as both have been the clear elite of the Big XII. The rest of the names probably surprise no one, although TCU has quietly become a dominant program with a 0.766 winning percentage.
Then there are the losers. This, too, is an unsurprising list, a who's who of cellar dwellers. Another useful feature of the decade time increment is that you can divide all of the wins and losses by 10 to get an average season performance, meaning Duke has not even averaged 2 wins per season. Vandy and Baylor are other perennial losers in power conferences, while the rest of the teams (ignoring independent Army) tend to be in weaker conferences. These teams are not only losing, they are losing consistently.
You can see the entire ranked list at the excellent stassen.com website.
As I looked at this data, particularly the bottom ten, I thought it would be useful to see how these teams have done against expectations. I have no doubt Baylor and Buffalo have been underperformers, but no one expects an Eastern Michigan to play well. I thought one objective way to gauge fan "expectations" would be stadium attendance. This captures not only the amount of fans willing to pay money to support the team, but it also embeds the existing infrastructure investment made in previous years for the football program (in order to have 100,000 fans see your game every week, you must have spent a lot of money to build a stadium to hold them). After gathering stadium attendance for every year available, I wanted to confirm there was a relationship. Below is a scatterplot, graphing attendance for all 115 schools against winning percentage.
As you can see, there is a definite relationship. Higher attendance is significantly correlated with higher winning percentages. The equation on the graph comes from a least-squares regression (indicated by the black line) that best "fits" the relationship between the two variables. I'll use this as my definition of "expectations." If a program is reasonably close to that line, it is meeting expectations. So, with that in mind, what programs are deviating most from the expectations line?
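For the curious, the regression line above can be reproduced with a few lines of code. This is only a sketch of the method: the `(attendance, winning pct)` pairs below are made-up placeholders, not the article's actual dataset of 115 schools.

```python
from statistics import mean

# Hypothetical (avg attendance, decade winning pct) pairs; the real
# dataset covers all 115 FBS programs for 2000-2009.
data = [
    (32_000, 0.742),   # e.g. a small school that wins a lot
    (45_000, 0.551),
    (88_000, 0.672),
    (101_000, 0.780),
    (60_000, 0.489),
]

xs = [a for a, _ in data]
ys = [w for _, w in data]
x_bar, y_bar = mean(xs), mean(ys)

# Ordinary least-squares slope and intercept for win_pct ~ attendance
slope = sum((x - x_bar) * (y - y_bar) for x, y in data) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

def expected_win_pct(avg_attendance):
    """Winning percentage implied by the regression ('expectations') line."""
    return slope * avg_attendance + intercept
```

One nice property of a least-squares line is that it always passes through the point of means, so a school with exactly average attendance is "expected" to have exactly the average winning percentage.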
The "Model" column indicates the expected winning percentage, using the attendance of the school as the x-variable in the regression equation above. We've sorted this list by how much the actual winning percentage exceeds the model. Three of the four outperformers make complete sense: Boise, TCU and Utah are getting good bang for their buck, with exceptional winning percentages relative to small attendance (a reflection of their smaller school size). Miami, on the other hand, reflects a relatively disinterested fan base (think of how many empty seats you have seen at Miami games), and its winning percentage was heavily influenced by the early part of the decade (including a huge winning streak from 2000 to 2002). Virginia Tech and Boston College also appear to be winning more than their expectations would indicate. Large successful programs include Texas, Oklahoma, and USC.
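The over/underperformer ranking described above boils down to sorting schools by the residual (actual minus model). A minimal sketch, with the caveat that both the regression coefficients and the team numbers below are hypothetical placeholders, not the article's fitted values:

```python
# Hypothetical fit for win_pct ~ attendance; placeholder values only.
SLOPE, INTERCEPT = 2.0e-6, 0.40

teams = {
    # team: (avg attendance, actual decade winning pct) -- illustrative
    "Boise State": (31_000, 0.868),
    "Miami (FL)":  (48_000, 0.703),
    "Duke":        (25_000, 0.169),
    "Texas":       (85_000, 0.831),
}

def model(attendance):
    """Expected winning percentage from the regression line."""
    return SLOPE * attendance + INTERCEPT

# Positive difference = beating expectations; negative = underperforming.
diffs = sorted(
    ((name, actual - model(att)) for name, (att, actual) in teams.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
```

With these placeholder numbers, a low-attendance winner like Boise lands at the top of the list and a low-attendance loser like Duke lands at the bottom, mirroring the tables in the post.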
The most fascinating list might be those programs that are failing to meet expectations. This is the same concept, but looks at the bottom twenty programs sorted by greatest negative difference from the "model." Some programs are so bad (e.g. Eastern Michigan and Buffalo) that they show up on this list despite very low attendance. But some major programs pop up on the list as well, including Mississippi State, Illinois, Michigan State, Texas A&M, Alabama and Arizona. The Crimson Tide are clearly on the right path, but fans of those other programs (in some cases averaging nearly 80,000 fans per game) should ask themselves whether a mere 0.500-type winning percentage is really getting the job done relative to how much support they are getting from the university and fans.
Friday, July 16, 2010
Tuesday, July 13, 2010
2010 Pre-season look at schedule and conference strength
As I did last year, I've analyzed the schedules of every college football team, building a hypothetical strength-of-schedule ("SOS") using the 2010 schedule for each team, but assuming their opponents post records identical to their 2009 win/loss records. For example, when looking at Texas' 12 opponents in 2010, we use UCLA's 7-6 record as one of the 12 pieces of Texas' SOS.
Obviously, there are mathematical imperfections and qualitative flaws in such an analysis (teams that were 12-1 last year may have lost a lot to graduation, etc., and they may end up being much worse in 2010), but I think it still gives a good pre-season feel for how each team's schedule stacks up.
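The SOS construction described above can be sketched in a few lines: pool each 2010 opponent's 2009 wins and losses into a single winning percentage. The opponent records below are examples for illustration, not any team's full 2010 slate.

```python
# 2009 records of a hypothetical team's 2010 opponents (partial, illustrative).
opponent_records_2009 = {
    "UCLA":     (7, 6),    # the example from the post
    "Rice":     (2, 10),
    "Oklahoma": (8, 5),
}

def strength_of_schedule(opponents):
    """Pool opponents' prior-year wins and losses into one winning pct."""
    wins = sum(w for w, _ in opponents.values())
    losses = sum(l for _, l in opponents.values())
    return wins / (wins + losses)
```

Pooling wins and losses (rather than averaging each opponent's percentage) weights opponents who played more games slightly more heavily; either convention is defensible for a rough pre-season gauge like this.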
As you can see, I've ranked the contenders' schedules from strongest to weakest in two clusters... the presumed top 10 (strong national title contenders) and the presumed 11-25 (potential teams that can make a move, not unlike OU in 2000). I've used the collegefootballpoll.com survey of available preseason magazine rankings to group the two clusters. A few observations on the above. Alabama, Oklahoma, and Virginia Tech will obviously be in prime position if they put together undefeated seasons. Florida, Texas and Ohio State rank in the middle of the pack. Nebraska's schedule is fairly poor, although the 73rd place ranking would be improved by a likely conference title game. TCU and Boise St, despite all of the pre-season hoopla, continue to be hamstrung by particularly weak schedules.
I also find the offseason a good time to "keep score" on how the eleven FBS conferences performed in out-of-conference ("OOC") games. The SEC continues to dominate, but we saw a lot of changes below the top dog. The Big 10 and Pac 10 saw improvements in their OOC winning percentage, while the Big XII, MWC and ACC all took significant steps back.
The OOC games can have huge impacts on BCS posturing, but much of the impact comes via strength of the conference. Even if Texas schedules creampuffs on its own OOC schedule, if the Big XII is winning 80% of its games against OOC opponents, Texas will stand to benefit tremendously by playing in a stronger conference. After all, you play twice as many conference games as you do OOC games, and it helps more if your conference is "strong" than it might if your OOC schedule is strong. Basically, this phenomenon happened for Texas in 2008... despite not really playing anyone all that great OOC, the Big XII had a banner year. To me, this is why the month of September is so fascinating and critical. It can help set the stage for so many important developments down the road.
Only 8 weeks until the madness starts again. Giddy up.