Exploring the Factor Zoo with a Machine-Learning Portfolio

3 August 2023

The latest paper by Sak, Chang, and Huang delves into the world of financial anomalies, exploring the rise and fall of characteristics in what researchers refer to as the “factor zoo.” While significant research effort is devoted to discovering new anomalies, the study highlights the lack of attention given to how these characteristics evolve over time. Leveraging machine learning (ML) techniques, the paper conducts a comprehensive out-of-sample factor zoo analysis, seeking to uncover the underlying factors driving stock returns. The researchers train ML models on a vast database of firm and trading characteristics, generating a diverse range of linear and non-linear factor structures. The ML portfolio formed on these predictions outperforms entrenched factor models, presenting a novel approach to understanding financial anomalies. Notably, the paper identifies two subsets of dominant characteristics, one related to investor-level arbitrage constraints and the other to firm-level financial constraints, which alternately play a significant role in generating the ML portfolio return.
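The general recipe behind such an ML portfolio can be illustrated with a minimal sketch: fit a non-linear model on firm and trading characteristics in one half of a cross-section, then form a long-short decile portfolio from its out-of-sample return predictions. Note this is a toy illustration with synthetic data, and a random forest stands in for the paper's actual models and characteristics database:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic cross-section: 1000 stocks, 20 firm/trading characteristics each
# (hypothetical stand-in for a real characteristics database).
n_stocks, n_chars = 1000, 20
X = rng.standard_normal((n_stocks, n_chars))
beta = 0.01 * rng.standard_normal(n_chars)  # linear part of the signal
returns = X @ beta + 0.02 * np.sin(X[:, 0]) + 0.05 * rng.standard_normal(n_stocks)

# Train on one half of the cross-section, predict out-of-sample on the other.
X_train, X_test = X[:500], X[500:]
r_train, r_test = returns[:500], returns[500:]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, r_train)
pred = model.predict(X_test)

# Long-short decile portfolio on the out-of-sample predictions:
# long the top predicted decile, short the bottom one, equal-weighted.
order = np.argsort(pred)
k = len(pred) // 10
weights = np.zeros(len(pred))
weights[order[-k:]] = 1.0 / k
weights[order[:k]] = -1.0 / k
print(f"out-of-sample long-short return: {weights @ r_test:.4f}")
```

The long and short legs offset each other, so the portfolio is dollar-neutral by construction; the paper's analysis additionally repeats this kind of sort period by period to study which characteristics dominate over time.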

Continue reading

Top Models for Natural Language Understanding (NLU) Usage

27 July 2023

In recent years, the Transformer architecture has experienced extensive adoption in the fields of Natural Language Processing (NLP) and Natural Language Understanding (NLU). Google AI Research’s introduction of Bidirectional Encoder Representations from Transformers (BERT) in 2018 set remarkable new standards in NLP. Since then, BERT has paved the way for even more advanced and improved models.

We discussed the BERT model in our previous article. Here we list alternatives for readers who are considering running a project with a large language model (as we are 😀), would like to avoid ChatGPT, and want to see the alternatives in one place. So, presented here is a compilation of the most notable alternatives to the widely recognized language model BERT, specifically suited to Natural Language Understanding (NLU) projects.

Continue reading
