Saurian Sagacity discovered an interesting stat: teams that throw the ball for the most yardage are the most penalized, and teams that run for the most yardage are the least penalized. Sunday Morning Quarterback also weighed in on this fact. In both cases, people commented that offenses that throw the ball more tend to run more plays, which gives more opportunities for penalties.
That didn’t sound right to me. In my study of offensive efficiency in Week 1, I demonstrated that the number of plays you run has nothing to do with predicting whether you win. The number of plays you run depends on so many different variables that it’s impossible to point to one aspect of the game as determining the total. Field position, game situation, defense, and special teams all play a part.
The idea behind the comments is that if you throw the ball more, the clock will stop more often (it stops on incompletions) and you will have more game time to run plays. Keep in mind, though, that you gain more yards per play passing than running. In 2006, the average rushing play gained 4.02 yards, while the average passing play gained 7.09 yards (looking at yards per attempt, not yards per completion). Now imagine an 80-yard scoring drive after a touchback. If you ran every play, it would take you 20 plays to score; if you threw every play, it would take you 12.
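Those play counts fall straight out of the 2006 averages. Here’s a quick sketch, assuming (unrealistically) that every play gains exactly the league-average yardage:

```python
import math

DRIVE_YARDS = 80    # length of a scoring drive after a touchback
RUSH_YPP = 4.02     # 2006 average yards per rushing play
PASS_YPP = 7.09     # 2006 average yards per pass attempt

def plays_to_score(yards_per_play, drive_yards=DRIVE_YARDS):
    """Plays needed to cover the drive if every play gains the average."""
    return math.ceil(drive_yards / yards_per_play)

print(plays_to_score(RUSH_YPP))  # 20 plays running every down
print(plays_to_score(PASS_YPP))  # 12 plays passing every down
```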
See the difference? I would expect the extra time afforded by incompletions to be balanced out, over the course of a season, by the fact that you gain more yards throwing than running. So, breaking I-A football into quintiles as Saurian Sagacity did, here is what I found regarding passing and total plays run in 2006.
First, rather than look at passing yards, which can be deceiving depending on whether a team has deep threats, I looked at the percentage of total plays run that were passes. The true test of whether a team is a passing team or a running team is the play calling, not the yardage. I then looked for a relationship between the percentage of plays that were pass attempts and total plays run. The average team in 2006 ran 63.99 plays per game, so that’s the baseline for the deviations I quote.
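A quick sketch of that metric (the attempt counts below are hypothetical, for illustration only, not 2006 data):

```python
def pass_rate(pass_attempts, rush_attempts):
    """Fraction of a team's total plays that were pass attempts."""
    total_plays = pass_attempts + rush_attempts
    return pass_attempts / total_plays

# Hypothetical stat lines for illustration only
print(pass_rate(450, 350))  # 0.5625 -> a pass-heavy team
print(pass_rate(300, 500))  # 0.375  -> a run-heavy team
```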
Top Quintile (highest passes/plays): Average 65.47 plays/game, +1.48 deviation
2nd Quintile: 64.68 plays/game, +0.68 deviation
3rd Quintile: 63.20 plays/game, -0.80 deviation
4th Quintile: 63.05 plays/game, -0.94 deviation
Bottom quintile: 63.55 plays/game, -0.44 deviation
So the teams that do the most passing get an extra play and a half per game; in other words, they run 2.31% more plays than the average team. When you consider that teams in 2006 averaged a penalty every 11.01 plays, that bonus play and a half netted the top-quintile passing teams an extra 0.1344 flags per game.
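The arithmetic behind that estimate, using only the league-wide figures quoted above:

```python
LEAGUE_AVG_PLAYS = 63.99     # average plays per game, 2006
TOP_QUINTILE_PLAYS = 65.47   # top passing quintile, plays per game
PLAYS_PER_PENALTY = 11.01    # one penalty every 11.01 plays in 2006

extra_plays = TOP_QUINTILE_PLAYS - LEAGUE_AVG_PLAYS
pct_more = 100 * extra_plays / LEAGUE_AVG_PLAYS
extra_flags = extra_plays / PLAYS_PER_PENALTY

print(round(extra_plays, 2))  # 1.48 extra plays per game
print(round(pct_more, 2))     # 2.31 (% more plays than average)
print(round(extra_flags, 4))  # 0.1344 extra penalties per game
```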
What these numbers say is that only the top 20% in passing see more than a play per game of difference from the average team; passing adds or subtracts very little from your play total. As a sanity check, I ordered teams by plays per game and looked at how each quintile fared in penalties per game. The average team in 2006 committed 5.81 penalties per game, so that’s the baseline for these deviations.
Top Quintile (most plays/game): 5.80 penalties/game, -0.01 deviation
2nd Quintile: 5.85 penalties/game, +0.04 deviation
3rd Quintile: 5.59 penalties/game, -0.23 deviation
4th Quintile: 6.25 penalties/game, +0.44 deviation
Bottom Quintile: 5.59 penalties/game, -0.22 deviation
These numbers confirm the conclusion: there is no relationship between plays per game and penalties per game, so even if passing more yielded significantly more plays, it wouldn’t guarantee significantly more penalties. Myth busted.