Val de Vassal uses dozens of quantitative models to identify promising stocks—and avoid potential pitfalls—for his two Glenmede Large Cap funds.
Factor investing is getting a lot of attention in the exchange-traded-fund world. Barron’s has long called the strategy—which chooses stocks according to certain corporate fundamentals, valuation metrics, economic trends, or Wall Street estimates—a hybrid of active and passive management. A true active manager, however, can take the same data and isolate which factors are most applicable to which sectors, and when, while still maintaining the rigor of a rules-based process. That’s what Vladimir de Vassal and his team have done successfully for more than a decade. “We want to make analysis as objective as possible and leave emotion out of it,” he says.

The $1.5 billion Glenmede Large Cap Core (ticker: GTLOX) and $1.7 billion Glenmede Large Cap Growth (GTLLX) funds, both launched in 2004, are the products of decades of testing and dozens of quantitative models designed to identify stocks with winning traits—and, just as important, spot early signs of trouble.
“A lot of our outperformance comes not just from identifying stocks likely to outperform, but from criteria that focuses on what not to own,” says de Vassal, who works with co-manager Paul Sullivan and three analysts.

The Large Cap Core fund has edged out the Standard & Poor’s 500 in eight of the past 10 calendar years; its returns nearly matched the index’s in the other years. Its average annual return of 8.8% over the past decade beats 96% of its large-blend peers.

A highly rated chess player in high school, de Vassal (who goes by “Val”) received a degree in finance and accounting from Drexel University and began his career in 1982 in the asset/liability management department of what was then Philadelphia National Bank. He modeled the impact of different interest-rate scenarios and assisted economists in their macroeconomic forecasts. That experience led to a position as director of quantitative analysis in the bank’s investment-management division, where he helped portfolio managers use data to assess their investment decisions.

In back-testing performance data, he saw that while there were relationships between stocks’ fundamentals and returns, they were nuanced. Criteria indicative of outperformance for one sector weren’t as meaningful for another, for example, and models that identified promising stocks weren’t as effective at spotting weakness.

In the late 1990s, de Vassal left his longtime employer to help start the quantitative-research group at Glenmede Investment Management, a Philadelphia-based asset-management firm founded by the Pew family. There, along with Sullivan, he began putting these models into practice. It was initially a tough go. “This was the peak of the Internet bubble,” recalls de Vassal, now 55.
“We had to defend the idea that valuations do matter.” Their models held up—enough so that they were given the green light to launch two strategies in 2002; the mutual funds came two years later.

Recently, the core fund was overweight technology and consumer-discretionary stocks. It was underweight capital goods and transportation—all vulnerable to slow or declining growth in Europe and Asia.

New positions start at no more than 1% of the portfolio, and no stock takes up more than 2% of assets. “Our goal is to diversify idiosyncratic risk,” says de Vassal.

At the heart of the strategy are the models de Vassal started developing early in his career, although they are always evolving—as is the quality of data and the speed at which it can be processed. “We can calculate in seconds what used to take hours,” says de Vassal, who credits analyst and mathematician Alex Atanasiu with helping to improve the models.

Glenmede Large Cap Core is based on 40 different models, including buy and sell screens for each sector. Free cash flow, for example, is a great measure for many sectors, “but financials don’t have free-cash-flow yield,” he says. Conversely, dividend yield helps to spot promising financials, but isn’t as meaningful when analyzing technology stocks. “Fundamental analysts make these distinctions,” says de Vassal.

It’s not enough to screen for positive factors. The models look for weak spots by analyzing consistency of earnings, idiosyncratic risk, company debt levels, insider selling, stock liquidity, and volatility. Another hurdle is what de Vassal calls the “whisper signal” related to analysts’ earnings estimates. While no investment decision hinges on sell-side analyst recommendations, “their behavior can offer early signals,” he says. “We don’t think earnings surprises are completely random.”...MORE
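To make the idea concrete, here is a minimal sketch of a sector-aware factor screen—buy criteria that differ by sector (free-cash-flow yield for tech, dividend yield for financials) plus sector-agnostic "what not to own" checks. This is purely illustrative: the thresholds, field names, and tickers are hypothetical, and this is not Glenmede's actual model.

```python
# Sector-aware factor screen (illustrative only; all thresholds and
# field names are hypothetical, not Glenmede's actual criteria).

# Each sector gets its own buy factor, mirroring the point that
# free-cash-flow yield works for tech while dividend yield is the
# better signal for financials.
BUY_FACTOR = {
    "technology": lambda s: s["fcf_yield"] > 0.05,
    "financials": lambda s: s["dividend_yield"] > 0.03,
}

def red_flags(stock):
    """Sector-agnostic 'what not to own' checks."""
    flags = []
    if stock["debt_to_equity"] > 2.0:
        flags.append("high leverage")
    if stock["insider_selling"]:
        flags.append("insider selling")
    return flags

def screen(stocks):
    """Tickers that pass their sector's buy factor and raise no red flags."""
    picks = []
    for s in stocks:
        factor = BUY_FACTOR.get(s["sector"])
        if factor and factor(s) and not red_flags(s):
            picks.append(s["ticker"])
    return picks

universe = [
    {"ticker": "TECH1", "sector": "technology", "fcf_yield": 0.07,
     "debt_to_equity": 0.5, "insider_selling": False},
    {"ticker": "BANK1", "sector": "financials", "dividend_yield": 0.04,
     "debt_to_equity": 1.2, "insider_selling": False},
    {"ticker": "TECH2", "sector": "technology", "fcf_yield": 0.07,
     "debt_to_equity": 3.0, "insider_selling": True},  # screened out
]
print(screen(universe))  # ['TECH1', 'BANK1']
```

Note that the negative screen runs on every name regardless of sector—echoing de Vassal's point that a lot of the value comes from deciding what not to own.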
Recently in the NVDA fanzine:
We've mentioned, usually in the context of the Top500 list of the fastest supercomputers, that: "Silicon Valley Is a Big Fat Lie"
Longtime readers know we have a serious interest in screaming-fast computers and try to get to the Top500 list a couple of times a year. Here is a computer that was at the top of that list, the fastest computer in the world just four years ago. And it's being shut down. That was from a 2013 post.
Technology changes pretty fast.
Among the fastest processors in the business are the ones originally developed for video games, known as graphics processing units, or GPUs. Since Nvidia released its Tesla hardware in 2008, hobbyists (and others) have used GPUs to build personal supercomputers.
Here's Nvidia's Build Your Own page.
Or have your tech guy build one for you.
In addition, Nvidia has very fast interconnects it calls NVLink.
Using a hybrid combination of IBM central processing units (CPUs) and Nvidia GPUs, all hooked together with NVLink, Oak Ridge National Laboratory is building what will be the world's fastest supercomputer when it debuts in 2018.
As your kid plays Grand Theft Auto....
It's ridiculous, I know, but it's amazing how often it works out, and by working out allows you to use all that power in your Baby Crays or NVIDIA Tesla supercomputers to watch cat videos: