Actively managed mutual funds have experienced a steady decline in popularity. There is a large and growing crowd of finance professionals who confidently claim that these funds are better at enriching the fund companies than they are at benefiting the investor: they charge fees that are too high and do not provide sufficient investment performance in return. Investors, it is said, are better off putting their money in a low-cost index fund and calling it a day.
A recent article I came across on Twitter was yet another example of this growing chorus. The article, titled “Uncle Sam Loves Actively Managed Mutual Funds” by Ben Carlson, from the blog A Wealth of Common Sense, uses data from Morningstar’s Jeffrey Ptak to make this point. While it never states it explicitly, its conclusion is that actively managed mutual funds are a terrible choice for investors, and a complete disaster if used in taxable accounts. Ben’s article can be found here. The chart below is the lead graphic and paints an ugly picture for active management.
These numbers look awful for actively managed mutual funds. The chart shows a tiny sliver of outperforming funds (blue bars) against a massive number of underperforming funds (red bars).
Before I even clicked through to read the article, I already had a major issue with it. The implied conclusion of this story is that investing in actively managed mutual funds is bad, and that this is proven by data showing a large majority of funds fail to outperform their respective indexes after fees and taxes. That is an incorrect conclusion to draw from this data. Instead, the conclusion should be: “If you randomly select a mutual fund from the entire universe of available mutual funds, you have a high likelihood of choosing a fund which underperforms its category’s index fund.” Those are two very different arguments. Just because many funds lag their respective indexes, does that mean investing in any actively managed mutual fund is a bad idea? I could take this same data and write a story that says, “Data show that multiple actively managed mutual funds outperform their respective indexes after taxes and fees over the long term.” If I had the ability to identify the outperformers, would I care how many laggards there are?
I do not have any real attachment to actively managed mutual fund managers; in fact, I believe much of this research has been good for investors in terms of bringing down fees and educating them. However, I also believe the attack on these funds often goes too far. As a professional investment manager, I use actively managed mutual funds for my clients, so when I see studies like the one above, it makes me cringe, because it paints the entire industry with the same misguided brush. Here are my main objections to the study mentioned above:
1) I do not randomly select mutual funds.
2) Some funds are classified by Morningstar into categories that may not match the objective outlined by the fund’s managers, making the study’s benchmark inappropriate.
3) Funds may be held for reasons other than beating a generic index (desire for income, completion of a portfolio strategy, exposure to a specific risk factor, etc.).
4) The data assume that each mutual fund is purchased in its oldest surviving share class (which also happens, most often, to be the most expensive), held at the highest marginal tax rate with no regard to tax appropriateness, and then sold at the end of the period without regard to tax consequences. The study also assumes that any discontinued, or non-surviving, fund is automatically categorized as a laggard. I know that basic assumptions need to be made in order to have some kind of academic rigor; however, that does not make them reflective of the real-life experience.
To start, the assumptions of this study are flawed in many respects. This is a common problem in the investment world; just think of Modern Portfolio Theory, which rests on assumptions that there are no fees or taxes, that every investor is rational, and so on. For instance, this study uses the “oldest surviving share class”. Morningstar’s fund screener states that the oldest surviving share class “is often (but not always) the A share class”, which is almost always the most expensive share class available. That can make for a meaningful difference in fund performance. Most of the expense-ratio differences I found between A shares and the cheapest available option in the small cap growth category were around 0.25% per year.
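To put that 0.25% in perspective, here is a minimal sketch of how the gap compounds over a ten-year holding period. The 7% gross return, $100,000 starting balance, and the 1.40% versus 1.15% expense ratios are purely illustrative assumptions on my part, not figures from the study:

```python
# Rough illustration of how a 0.25% expense-ratio gap compounds over time.
# The return, starting balance, and expense ratios are assumed for
# illustration only.

def ending_balance(start, gross_return, expense_ratio, years):
    """Grow a balance at (gross_return - expense_ratio) per year."""
    balance = start
    for _ in range(years):
        balance *= 1 + gross_return - expense_ratio
    return balance

start, gross, years = 100_000, 0.07, 10
a_share = ending_balance(start, gross, 0.0140, years)      # pricier A-share class
cheap_share = ending_balance(start, gross, 0.0115, years)  # cheapest share class

print(f"A share:         ${a_share:,.0f}")
print(f"Cheapest class:  ${cheap_share:,.0f}")
print(f"Drag from the extra 0.25%: ${cheap_share - a_share:,.0f}")
```

The exact dollar figure depends on the assumed return, but the point is that even a quarter-point of extra expenses compounds into a noticeable gap over a decade.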
The taxation assumption is also overly punitive. If taxes are being applied at the highest marginal rate, then the investor in question has a very high level of assets, which would 1) encourage them to hire a financial advisor who can apply some discretion in choosing mutual funds, and 2) give them an asset base large enough to buy into the lower-expense share classes that may carry higher initial purchase requirements. Someone in that tax bracket should also be receiving some sort of tax counsel, which would help them avoid blindly absorbing large tax hits.
The issue with non-surviving funds is a difficult one. The problem of survivorship bias needs to be addressed, but automatically designating all closed funds as laggards does not strike me as the answer. The graph above shows 600 laggards in the small cap growth space, while my numbers (outlined below) show only 151 funds that have currently been around for 10 years. This suggests that almost 450 of the laggards are due to fund closings, though I would be happy to hear from Mr. Ptak if I am wrong in this assumption. Some of those funds may have closed for reasons other than underperformance: a fund company may have consolidated similar strategies into one fund, or the fund company may have been acquired and its funds assimilated. Either way, the investor experience in a closing fund is very hard to measure, which makes this assumption questionable at best.
Given all of the issues I have with this study and its findings, I did my own research using Morningstar’s premium fund screener. The limitations of my study are admittedly numerous. For instance, my results are backward looking since I am screening funds today, and due to other things I have going on in life (namely a full-time job and kids) I am using just the small cap growth segment as my sample. Even so, it shows a somewhat different picture than the one painted above.
If you run a screen for small cap growth funds that have been around for 10 years (inception date 11/30/2005 or older) and screen for “distinct portfolio only”, which narrows the results to one share class per fund, you get only 151 funds from which to choose. Notice again that the graphic above shows 600 small cap growth laggards. This suggests that the bulk of the laggards are funds that died, which, as I mentioned, is an inappropriate classification because some funds get absorbed when a fund company combines funds or gets bought out, not just because of underperformance. Either way, the investor experience in those cases is very hard to determine.
Using the Vanguard Small Cap Growth Index fund (VSGIX) as the benchmark, we can screen for how many of these funds have 10-year trailing returns greater than those of VSGIX. The first screen comes back with 32 results. 32 out of 151 is hardly as bleak as the Ptak study suggests, and this still includes the “oldest surviving share class” assumption. If I remove the distinct-portfolio requirement, rerun the performance screen, and then manually remove duplicated share classes, I come up with 35 funds. That is a success rate of over 21%, a marked improvement on the graphic above.
If I apply some simple common-sense screens, the results are even less bleak. Let’s screen for no-load funds with expense ratios less than 1.4% (roughly the industry average). Removing the duplicated share classes, I get a total of 109 funds from which to choose. Screening for how many of those have beaten VSGIX over ten years returns 28 funds, which improves the success rate to over 25%.
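For readers who want to sanity-check the arithmetic, here is a minimal sketch that reproduces the success rates from the screen counts quoted above; the counts themselves come straight from the Morningstar screener results described in the text:

```python
# Reproduce the success-rate arithmetic from the screen counts above.
# Each entry is (funds beating VSGIX over 10 years, funds in the screen).
screens = {
    "Distinct portfolios with a 10-year record": (32, 151),
    "No-load, expense ratio below 1.4%": (28, 109),
}

for label, (winners, universe) in screens.items():
    rate = winners / universe
    print(f"{label}: {winners}/{universe} = {rate:.1%} beat VSGIX over 10 years")
```

Running this gives roughly 21% and 26%, which is where the "over 21%" and "over 25%" figures come from.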
I purposefully did not apply any taxation to these results. While not including a tax cost is just as unrealistic as applying the highest marginal tax rate, my point is that everyone’s tax situation is different and the impact can be alleviated to some degree with prudent portfolio management. I think the only way to compare funds on an apples-to-apples basis is to do so on a pre-tax basis.
Conclusion
Some actively managed mutual funds do outperform the index after fees, and my work suggests the number is greater than the one headlined in Mr. Carlson’s article. Even though many funds do not beat their category’s index fund, that does not mean active management is all bad. If you apply some common-sense rigor to your fund selection process, you can improve your chances of success. It is also important to remember that funds are sometimes held for reasons other than outperforming a comparable index fund.
So while I fully agree with many of Mr. Carlson’s points (such as that there should be more movement toward the Exchange Traded Fund (ETF) structure), and I believe it is worthwhile to educate individual investors that beating the market is hard (even professionals have a hard time doing it), concluding that investing in actively managed mutual funds is a bad idea misses the point. For many retail investors, a good, low-cost index fund will serve them well. My goal in working with clients is to minimize the risk of not meeting their specific financial goals, and it just so happens that actively managed mutual funds often play a role in achieving that objective.
I would be happy to hear your thoughts on the matter.