My blog statistics show that an old post from 2013 on Engineering Failure Rates continues to be a popular one to visit. There is an updated one available too, from 2018. As those blogs note, the data is from Ontario’s CUDO website and their definition of “success” is rather broad. If you start in Engineering, and graduate within 7 years from the SAME university with ANY degree, that counts as success for degree completion. So, if you start in Engineering then switch and graduate with a degree in Music, that’s success. However, if you start in Engineering, then leave before graduation to complete a Veterinary degree at Guelph, that’s not a successful degree completion for their statistics. So if you look at those statistics, you need to be aware of what they actually mean (or don’t mean)!
Those statistics always bothered me, so I came up with an alternative measure of Engineering graduation rates, using the same CUDO data source. My hypothesis is that if we use the Engineering first year registration data for a certain year, and then compare that with the Engineering “degrees conferred” data four years later, then that will give us a rough estimate of “success”, specifically within Engineering programs as a whole.
So that’s what I did with downloads from the CUDO website, using the admission data from 2006 to 2012 and the degrees conferred data from 2010 to 2017. (I used a 5-year comparison for Waterloo, since our program takes 5 years to complete when you include the co-op work experience. All the other universities’ programs can be completed in 4 years, so I used a 4-year comparison for the rest.) Based on this approach, we can summarize the results in the graph below, showing average degree completion rates. The “error bars” show plus and minus one standard deviation of the average “success rates” for each university (a measure of how variable the results are).
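For those curious about the mechanics, the cohort-matching calculation can be sketched as below. The numbers here are made up purely for illustration; the real figures come from the CUDO downloads described above.

```python
from statistics import mean, stdev

# Hypothetical first-year Engineering registrations, keyed by admission year.
# (Illustrative values only -- not the actual CUDO data.)
admissions = {2006: 1000, 2007: 1050, 2008: 1100}

# Hypothetical Engineering degrees conferred, keyed by graduation year.
degrees = {2010: 780, 2011: 820, 2012: 900}

def apparent_success_rates(admissions, degrees, lag=4):
    """Pair each admission cohort with the degrees conferred `lag` years
    later (lag=5 for Waterloo's co-op program) and return the ratios."""
    rates = []
    for year, admitted in sorted(admissions.items()):
        conferred = degrees.get(year + lag)
        if conferred is not None:
            rates.append(conferred / admitted)
    return rates

rates = apparent_success_rates(admissions, degrees)
avg, sd = mean(rates), stdev(rates)   # stdev gives the "error bars"
print(f"average apparent success rate: {avg:.3f} +/- {sd:.3f}")
```

Note that nothing here tracks individual students: an admission cohort is simply divided by a later graduating count, which is exactly why transfers in and out can push the ratio above 100%.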

I call the graph “apparent success rates” because it still doesn’t use individual student data; it’s based on bulk numbers that can hide a lot of variables. Indeed, as we gaze at the graph we see some obviously puzzling results. The Engineering programs at Windsor and Lakehead are highly successful at graduating more engineering students than they admit!
Clearly there are some problems with this data analysis. First, it doesn’t take into account the fact that some students at other universities can do an optional co-op or internship that will delay their graduation by a year. Second, it is based on first year registration data for each engineering program. This means that students who transfer into Engineering from other programs within the university, or from other universities, are not counted in the admissions but do appear in the degrees conferred. That likely explains the cases where the graduation/success rates are over 100%, and may be a factor for those with rates approaching 100%.
I have no deep insights into the other universities, but for Waterloo I know from experience that we have extremely few transfers from other universities, and very few from other programs at Waterloo. Therefore the average success/graduation rate at Waterloo of around 78% is likely a reasonable ballpark estimate for the fraction of new admits who graduate in 5 years.
This all just illustrates once again that defining “success” is complicated, and getting meaningful data to measure “success” is even harder. We just have to make do with what we can get, and recognize the limitations of the data.