Quality Win Index

I figure I should get to this BEFORE the season starts: our good ol' friend "Ned" from New England has generated a preseason gauge on the quality of everyone's schedule in DIII. Readers from last year may remember that Ned created the Quality Win Index (QWI), which assigns points for each win a team gets based on the Pablo Ranking of the opposition. It's another way to look at Strength of Schedule (SOS), but it focuses on the quality of the opponents as opposed to their win/loss records. Why is that important? Well, I've shown in the past that larger regions have a better chance at playing teams with better records, which in the end allows them to post better SOS numbers than smaller regions. We know that the NCAA Selection Committee treats the SOS selection criterion as a uniform number when crossing regions, so anything that shows that might not be the case is a good thing in my mind.

So, Ned tried something new this off-season by taking the Pablo Rankings from the end of the year and seeing what each DIII team could score if it won all of its matches. Basically, he is attempting to see what the maximum potential QWI score would be for each team. This in turn means he is gauging the quality of everyone's schedule. When I do my schedule analysis write-ups I rely more on what I know (or think I know) than on a formula. Ned now gives us that formula-based viewpoint.

If you follow my podcasts, then you know I looked at Ned's work on a national scale, but for the website we'll focus just on the West Region. With that said, I know I've got some Berry readers out there:

Berry College (Pablo Ranked 22) has 18 quality matches with 6 coming in conference for a total Potential QWI score of 63.75. This ranks 20th nationally.

And I’ve got some Marymount readers:

Marymount University (Pablo Ranked 64) has 6 quality matches with 0 coming in conference for a total Potential QWI score of 21.75. This ranks 169th nationally.

Just from those two examples you can see how a strong (deep) conference can help you in QWI. I should also point out that Ned has incorporated bonus points for away wins and for this exercise has included bonus points for wins outside of conference and outside of region. This is an attempt to reward the teams that are trying to play quality teams regardless of where they are located. The exact formula is:

QWI = Opponent Value + (Opponent Value * Away Factor) + (Opponent Value * Region Factor)

Where Opponent Value is 0–4 points based on your opponent's Pablo Ranking, Away Factor is 0.5 for an away win, and Region Factor is 0.25 for a non-conference win or 0.5 for an out-of-region win. Whew! I bet you didn't think you'd be dealing with math when you started reading this!
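For readers who like to tinker, the formula above can be sketched in a few lines of Python. One big caveat: the exact mapping from a Pablo rank to the 0–4 Opponent Value isn't published here, so the tiers below are a hypothetical placeholder, not Ned's actual table.

```python
def opponent_value(pablo_rank):
    """HYPOTHETICAL tiering from Pablo rank to 0-4 points.
    Ned's real cutoffs are not given in the post; these are
    illustrative only."""
    if pablo_rank <= 25:
        return 4
    elif pablo_rank <= 75:
        return 3
    elif pablo_rank <= 150:
        return 2
    elif pablo_rank <= 250:
        return 1
    return 0

def qwi_for_win(pablo_rank, away=False, non_conference=False,
                out_of_region=False):
    """QWI for a single win, per the formula in the post:
    QWI = OV + OV * Away Factor + OV * Region Factor."""
    ov = opponent_value(pablo_rank)
    away_factor = 0.5 if away else 0.0
    # Region Factor: 0.5 for out-of-region, 0.25 for non-conference
    if out_of_region:
        region_factor = 0.5
    elif non_conference:
        region_factor = 0.25
    else:
        region_factor = 0.0
    return ov + ov * away_factor + ov * region_factor

# An away, out-of-region win over a top-25 team maxes out a single
# match: 4 + (4 * 0.5) + (4 * 0.5) = 8 points.
print(qwi_for_win(10, away=True, out_of_region=True))
```

A team's Potential QWI is then just this value summed over every match on its schedule, assuming it wins them all.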

As it turns out, Illinois Wesleyan has the highest potential QWI score at 92.75, and 77 teams share the lowest potential QWI score of 0. Now, let's look at the West Region. Note that I have removed Whitman from the list because they made a last-minute change to their schedule that would greatly impact these numbers.

Here are the West Region schools with a QWI of 50 or higher:

School | Pablo | Quality Matches | Non-Conf Quality Matches | Conf Quality Matches | Potential QWI
Trinity University (TX) | 8 | 15 | 11 | 4 | 72.25
Southwestern | 48 | 15 | 11 | 4 | 63.50
Mary Hardin-Baylor | 26 | 15 | 10 | 5 | 62.25
CMS | 20 | 16 | 8 | 8 | 60.25
Cal Lutheran | 45 | 15 | 7 | 8 | 60.25
Hardin-Simmons | 136 | 14 | 7 | 7 | 55.50
La Verne | 47 | 15 | 7 | 8 | 55.25
Austin | 169 | 14 | 8 | 6 | 54.50
George Fox | 93 | 15 | 5 | 10 | 54.00
Linfield | 208 | 17 | 5 | 12 | 53.50
Puget Sound | 85 | 16 | 5 | 11 | 53.50
Pacific Lutheran | 14 | 18 | 7 | 11 | 52.75
Pomona-Pitzer Colleges | 132 | 14 | 4 | 10 | 52.50
Willamette | 133 | 16 | 4 | 12 | 52.25
Whittier | 70 | 13 | 5 | 8 | 52.25
Whitworth | 37 | 16 | 6 | 10 | 51.75

That's a good representation of our 4 West Region conferences. One thing that jumps out at me instantly is that Pablo basically believes the SCIAC and the NWC are our deepest conferences. The ASC follows these two, and the SCAC brings up the rear. You can glean this by looking at the conference quality matches column and seeing which schools from which conferences have the highest numbers. Two SCAC schools, however, lead the list, and that's because of the out-of-conference schedules they have put together. (Note: the school with the most quality non-conference matches was actually UC Santa Cruz with 12, and they finished with a QWI of 49.5.)

Some quick notes about the Pablo rankings: if you look closely, you'll see some rankings that are strange. UMHB was 26th last year even though they made it to the regional final, losing to the eventual national champion. CMS was highly ranked in the AVCA and actually beat Trinity away, but still finished 12 spots lower than the Tigers. I'm not a huge fan of the Pablo rankings, but I understand that something has to be used to gauge quality, and Pablo is an easy source. I think the biggest sin a ranking system can have is not having the national champion first, which Pablo didn't (Calvin, Wittenberg and then Emory were its first three teams).

Here I have the rest of the West Region teams listed in order:

School | Pablo | Quality Matches | Non-Conf Quality Matches | Conf Quality Matches | Potential QWI
UC Santa Cruz | 202 | 12 | 12 | 0 | 49.50
Texas Lutheran | 176 | 13 | 7 | 6 | 48.50
Texas-Dallas | 12 | 11 | 8 | 3 | 47.75
Dallas | 164 | 11 | 5 | 6 | 44.00
Chapman | 44 | 12 | 4 | 8 | 44.00
Centenary | 196 | 11 | 5 | 6 | 42.75
Occidental | 172 | 12 | 2 | 10 | 41.25
Redlands | 138 | 12 | 2 | 10 | 40.50
Colorado College | 10 | 11 | 7 | 4 | 39.50
Caltech | 297 | 10 | 0 | 10 | 37.50
Lewis and Clark | 88 | 11 | 1 | 10 | 36.00
Concordia Texas | 72 | 9 | 4 | 5 | 35.75
Schreiner | 308 | 10 | 4 | 6 | 35.00
East Texas Baptist | 110 | 8 | 3 | 5 | 33.75
LeTourneau | 218 | 8 | 3 | 5 | 31.75
Pacific University | 40 | 11 | 1 | 10 | 31.00
Johnson & Wales-CO | 279 | 7 | 1 | 6 | 30.50
Belhaven | 238 | 7 | 2 | 5 | 26.00
Howard Payne | 256 | 8 | 1 | 7 | 23.50
Sul Ross State | 251 | 7 | 0 | 7 | 21.50
Louisiana College | 290 | 5 | 0 | 5 | 17.00
McMurry | 68 | 5 | 0 | 5 | 16.50
Mills | 408 | 1 | 1 | 0 | 5.25

It's not surprising to see some of the teams trying to rebuild at the bottom of this list. It makes sense, and if you read my article on scheduling, then you know there is a pecking order teams have to work through before they can really schedule quality opponents. It is a bit surprising to see a team like McMurry here, and to a lesser extent Pacific. Pacific decided to hold a tournament in Hawaii, and the teams willing to go with them basically hampered their QWI score. I don't really understand McMurry's schedule, especially when you compare it to what Hardin-Simmons was able to do. (These schools are only 4 miles away from each other.)

I wanted to get this information out today because in two days we get our first glimpse at the real quality of the DIII schools based on their performances, and not on a ranking system built partly on players who have since graduated. Still, I think QWI gives us another insight into how schools schedule and which schools really try to play quality opponents. It also should be a guide to which schools should have good SOS numbers and which schools will struggle with that criterion. In the end, that's what's important to the selection committee.

Ned will be updating his numbers throughout the season and I’ll share them when he shares them with me. The main thing I like with QWI is comparing the scores to the Ranked Win selection criterion and seeing how certain schools with “good” Ranked Wins look when you factor in the QWI. This will all happen when the regional rankings come out in October.


5 thoughts on “Quality Win Index”

  1. I think the biggest sin a ranking system can have is not having the national champion as first, which Pablo didn’t…

    Really? That sort of puts the tail wagging the dog where one match weighs more than the body of work built over months. Should every win move the dial? If there’s an upset you should be able to say “Juniata beats St Anonymous nine times out of ten, but congrats to St A on last night’s win” and that should include the last game.


    1. Shouldn’t the best team in the nation win the most important match of the year? A match with unique pressures, normally played on a neutral court?

      If you are saying the ranking system should reflect overall quality from day 1 to day END then I see your point. I think the rankings should reflect the best teams at the time of the ranking. That’s why it’s best to rank without consideration of how you ranked prior. Redo the work with the new info.


      1. That’s fine if teams are evenly matched: the last game is a great criterion for separating them. But no system moves teams wholly on yesterday’s game results. A #11 losing to an unranked team doesn’t fall completely off the charts, nor does beating a ranked team immediately grant you their spot in the top ten like some medieval crown. Both teams may slide a little, but the work to date dampens the impact of any single match.

        If #22 gets hot and beats #1 in the championship match due to a good day intersecting with an injured setter, then a final ranking showing them as better but not top, say #9, reflects the unexpectedness of the win and the David-and-Goliath nature of the match. I think there’s more value in rankings that are deeper than strobe-flashed images of the last game.


      2. #22 didn’t just appear in the championship match. There was a roadmap of victories prior. Of course with DIII it’s possible that a lower ranked team or an unranked team gets an easy regional and then shocks two teams before winning the championship. With the re-ranking of the Elite 8 it probably means the #1 fell to them first and #2 in the championship.

        In that scenario I’m more inclined to believe the ranking didn’t reflect the quality of the school prior to the tournament. I can honestly say that at the most important time, that school was the best in the nation.

        Sounds like a good Twitter poll.

