We encourage all CQA members to share their academic contributions. If you have a working or published paper you would like to feature, we offer a platform to showcase your research and insights.
This study examines local-thinking bias among sell-side analysts, who overreact to news within their coverage portfolios while underreacting to news from economically linked firms outside them. The research shows that these biases lead to predictable return reversals, as market prices reflect the analysts’ overreactions. A trading strategy based on this effect yields significant abnormal returns, with implications for cognitive psychology, behavioral asset pricing, and analyst behavior.
This paper introduces a 170-year-long measure of economic sentiment, developed using text from 200 million pages of 13,000 U.S. local newspapers and machine learning methods. The measure predicts GDP, consumption, and employment growth, even outperforming expert forecasts and influencing monetary policy decisions. Notably, the analysis reveals that news coverage has become more negative across all states over the past 50 years. This paper will be presented at the upcoming ASSA meeting in January.
This paper explores how investors underreact to economic narratives, using a large dataset of digital media sources since 2012 to quantify narrative intensities and construct narrative-mimicking portfolios. Portfolios tied to rising narrative intensities outperform those linked to declining intensities by about 8% annually, revealing the significance of narrative momentum beyond standard risk factors. A related paper, “Crash Narratives,” examines how “crash narratives” in the financial press shape investor behavior and market volatility. Additionally, Robert Shiller’s book, Narrative Economics: How Stories Go Viral and Drive Major Economic Events, delves into how popular stories influence economic events and investor decisions.
This article describes “The Crystal Ball Challenge,” an experiment where 118 finance-trained participants were given one day’s advance access to the Wall Street Journal front page, excluding stock and bond prices, and asked to trade based on this information. Despite their advantage, nearly half lost money, and one in six went bust due to poor market predictions and excessive leverage. In contrast, five experienced macro traders outperformed significantly, demonstrating that while predicting market direction is important, sensible trade-sizing is key to success.
These researchers explore the concept of nonstandard errors (NSEs), which represent additional uncertainty in research results due to variation across researchers. By analyzing the outcomes from 164 teams testing the same hypotheses on identical data, the study finds that NSEs are significant, but smaller in more reproducible or peer-reviewed research. The authors also conclude that researchers often underestimate the level of this uncertainty.
The article discusses how presidential elections influence market behavior and factor performance in quantitative investing. Historically, factors like Earnings Yield and Size have performed well under Republican administrations, while Momentum and Growth have thrived during Democratic leadership. The piece highlights that investors should pay attention to factors like Momentum and industries such as Biotechnology in the current election cycle due to expected volatility.
This paper shows that a deep neural network trained to maximize the Sharpe ratio of the Stochastic Discount Factor is equivalent to a large factor model. It also demonstrates that deeper networks improve out-of-sample performance, with optimal results at around 100 hidden layers.
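For readers curious about the mechanics, here is a minimal sketch (not the authors’ architecture) of the core idea: a feed-forward network maps firm characteristics into portfolio weights, and training maximizes the in-sample Sharpe ratio of the resulting portfolio. The simulated data, layer widths, and depth below are placeholder assumptions.

```python
# Minimal sketch: train a feed-forward net whose loss is the negative Sharpe
# ratio of the portfolio it implies. Synthetic data; not the paper's setup.
import torch

torch.manual_seed(0)
n_months, n_stocks, n_chars = 240, 500, 20
X = torch.randn(n_months, n_stocks, n_chars)                    # firm characteristics
R = 0.01 * X[..., 0] + 0.05 * torch.randn(n_months, n_stocks)   # excess returns

net = torch.nn.Sequential(
    torch.nn.Linear(n_chars, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(500):
    w = net(X).squeeze(-1)                       # raw weight per stock, per month
    w = w / w.abs().sum(dim=1, keepdim=True)     # normalize to unit gross exposure
    port = (w * R).sum(dim=1)                    # portfolio (SDF-proxy) return per month
    sharpe = port.mean() / port.std()
    loss = -sharpe                               # maximize Sharpe = minimize its negative
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"in-sample Sharpe at final iteration: {sharpe.item():.2f}")
```

Scaling up the depth, as the paper does, simply amounts to stacking more Linear/ReLU pairs in the network.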
These researchers propose the Collaborative Multi-Agent, Multi-Reasoning-Path (CoMM) framework to enhance the problem-solving abilities of Large Language Models. CoMM uses multiple agents that collaborate by playing different expert roles, enabling better performance in complex tasks like college-level physics and moral reasoning. The framework is tested in both zero-shot and few-shot scenarios, showing improvements over traditional single-agent and chain-of-thought methods.
This paper introduces a new “quality” factor based on an earnings measure that treats intangible and physical investments equally. Compared to the RMW factor from Fama and French, this new factor exhibits less downside risk and performs better in bear markets. It also generates a significant alpha (2.9%) relative to many existing multi-factor models, including Fama-French, largely due to effective stock selection on the long side and market timing on the short side.
This team highlights common pitfalls and biases in backtesting systematic investment strategies, which can lead to false conclusions and poor out-of-sample performance. They review three main types of backtesting methods—walk-forward testing, the resampling method, and Monte Carlo simulations—each with its own challenges and benefits. They also suggest techniques to improve simulation quality and offer approaches to Sharpe ratio calculations that reduce the impact of multiple trials.
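As a rough illustration of the first of those methods, the sketch below runs a simple walk-forward test: a rule is fit on a rolling in-sample window and evaluated only on the window that follows. The toy return series, window lengths, and the long-or-flat rule are illustrative assumptions, not the authors’ procedure.

```python
# Minimal walk-forward sketch: fit on a rolling window, evaluate out of sample.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0003, 0.01, size=2500)   # toy daily return series
train_len, test_len = 500, 125                  # ~2y train, ~6m test

oos = []
for start in range(0, len(returns) - train_len - test_len + 1, test_len):
    train = returns[start:start + train_len]
    test = returns[start + train_len:start + train_len + test_len]
    # "Fit": go long or stay flat based on the training-window mean (toy rule).
    position = 1.0 if train.mean() > 0 else 0.0
    oos.append(position * test)                 # apply the frozen rule out of sample

oos = np.concatenate(oos)
print("walk-forward Sharpe (annualized):",
      round(oos.mean() / oos.std() * np.sqrt(252), 2))
```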
This study examines trading costs across different regions and asset classes using data from over 2,000 portfolio transitions between 2016 and 2023, representing $3.1 trillion in trade value. The authors find that actual trading costs align with pretrade estimates, even for large and complex trades, which can be executed at relatively low cost. Using count-data models, they determine that the optimal trade horizon increases with trade risk, value, and complexity, and is similar across equities and fixed income.
This study reveals that small noninstitutional investors, despite holding a smaller share of anomaly portfolios, account for the majority of the variation in anomaly returns. It challenges flow-based explanations by showing that both stock fundamentals and non-fundamental demand equally drive anomalies across different investor types.
This previously featured study employs natural language processing on earnings conference call transcripts from U.S. corporations, uncovering how managers selectively adjust targets to meet predetermined goals. If you are interested in seeing this paper presented, consider attending the Jacobs Levy conference at UPenn.
Kuntara Pukthuanthong, who presented at last year’s Fall Conference, investigates how financial news is distorted as it spreads across different outlets, using ChatGPT-4 to quantify changes in the narrative. She finds that retold articles tend to become more opinionated and negative, leading to a pessimistic public perception, which negatively predicts abnormal returns and increases disagreement among traders.
These researchers introduce a strategy using ChatGPT to identify and analyze unusual aspects of financial communication during earnings calls of S&P 500 firms. The study finds that a significant portion of these calls display unusual communication, which correlates with firm characteristics and business cycles, and leads to negative stock market reactions with increased trading activity, especially when multiple unusual dimensions are detected.
This paper quantifies investor disagreement surrounding corporate news announcements using intraday volume-volatility elasticity, finding that higher disagreement negatively predicts cross-sectional stock returns. The study provides empirical evidence that this disagreement-stock return relationship is particularly strong when optimism outweighs uncertainty.
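The paper’s construction is more involved, but the sketch below conveys the basic idea of a volume-volatility elasticity: regress log trading volume on log volatility over intraday bins around an announcement and read the slope as the elasticity. The data and bin definitions here are invented for illustration.

```python
# Minimal sketch: estimate a volume-volatility elasticity as the slope of a
# log-log regression over intraday bins around an announcement (toy data).
import numpy as np

rng = np.random.default_rng(1)
n_bins = 78                                              # e.g., 5-minute bins in a day
volatility = np.exp(rng.normal(-4.0, 0.5, n_bins))       # toy realized vol per bin
volume = np.exp(1.5 * np.log(volatility) + rng.normal(8.0, 0.3, n_bins))

X = np.column_stack([np.ones(n_bins), np.log(volatility)])
beta, *_ = np.linalg.lstsq(X, np.log(volume), rcond=None)
print("estimated volume-volatility elasticity:", round(beta[1], 2))
```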
This study finds that stock-level crowding by S&P 500-benchmarked mutual funds contributes to the low-volatility anomaly in stock prices. The informational networks among these mutual funds lead to high alpha and low volatility in commonly held stocks, driven by increased demand and information sharing within the network.
The paper examines how capital misallocation between superstar and non-superstar firms affects asset prices, finding that these misallocation shocks are linked to lower economic growth and negatively impact stock returns.
This paper examines the growing integration of Large Language Models in finance, highlighting their use in tasks such as automating reports, market forecasting, sentiment analysis, and providing personalized advice. The study demonstrates that LLMs effectively follow natural language prompts in various financial tasks, offering valuable insights and enhancing operational efficiency and decision-making in the financial sector.
This study, which will be presented at next month’s EFA Conference, challenges the notion that only four or five factor principal components (PCs) are sufficient to explain stock returns, demonstrating that out-of-sample Sharpe ratios improve significantly with up to forty PCs due to non-redundant, time-varying factors. These findings suggest that economic conditions and firm diversity significantly influence the number of relevant factors, highlighting substantial barriers to cross-factor arbitrage and the roles of economic complexity and investor learning in asset pricing.
These researchers explore why volatility targeting strategies outperform buy-and-hold positions, attributing the outperformance to trend following induced by the negative correlation between return direction and volatility, known as the “leverage effect.” They show that this trend-based alpha is significant for equities but not for commodities, fixed income, or currency futures, and highlight the connection between volatility targeting and the trend-following literature.
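To see the mechanism in miniature, the sketch below sizes positions as a volatility target divided by trailing realized volatility in a simulated series with a leverage effect; because volatility rises after losses, exposure falls after negative returns, which is the implicit trend position the paper describes. All parameters are stylized assumptions.

```python
# Minimal sketch: volatility targeting on a series with a "leverage effect"
# (volatility rises after negative returns), so exposure drops after losses.
import numpy as np

rng = np.random.default_rng(2)
n = 2500
vol = np.full(n, 0.01)
ret = np.zeros(n)
for t in range(1, n):
    # leverage effect: higher vol after a down day, lower after an up day
    vol[t] = 0.01 * (1.3 if ret[t - 1] < 0 else 0.8)
    ret[t] = rng.normal(0, vol[t])

target = 0.01
trailing_vol = np.array([ret[max(0, t - 20):t].std() if t > 1 else target
                         for t in range(n)])
exposure = np.clip(target / np.where(trailing_vol > 0, trailing_vol, target), 0, 3)

# Exposure co-moves with trailing returns, i.e., an implicit trend position.
trailing_ret = np.array([ret[max(0, t - 20):t].sum() for t in range(n)])
print("corr(exposure, trailing 20-day return):",
      round(np.corrcoef(exposure[21:], trailing_ret[21:])[0, 1], 2))
```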
This article from Global Financial Data discusses how market concentration has fluctuated over the past 200 years. It highlights significant historical shifts, including the dominance of railroads in the 19th century and the rise of tech giants in recent decades, illustrating how technological advancements and economic changes have influenced market concentration trends.
The essay series “Situational Awareness: The Decade Ahead” discusses the rapid advancements expected in artificial general intelligence (AGI) by 2027, highlighting economic and security challenges. It emphasizes the need for robust security measures and international cooperation to manage AGI risks. Additionally, it underscores the strategic importance of U.S. government involvement in AGI research to maintain global leadership. Similarly, in this video, Yann LeCun discusses developing AI systems that can learn, remember, reason, and plan, highlighting the limitations of current technologies and the need for a new framework. LeCun also emphasizes the strategic necessity of addressing the complexities and risks of advanced AI to ensure beneficial outcomes for society.
A new study from CQA Member Dan diBartolomeo integrates traditional finance theory with behavioral finance to enhance asset allocation by constructing efficient portfolios using Modern Portfolio Theory (MPT) and assessing risk through a Risk Assessment Score (RAS). The RAS, derived from an additive questionnaire scoring system, facilitates the selection of an optimal portfolio aligned with an investor’s risk tolerance, demonstrating flexibility across various investor types and market assumptions.
This New York Times article discusses the growing concern over the inadequacy of 401(k) plans for retirement savings, highlighting that a significant portion of the population, especially lower-income groups, faces a bleak retirement outlook, with inadequate savings and an over-reliance on these accounts for both retirement and emergencies. The article underscores the need for systemic changes to address the disparities and inadequacies in retirement savings.
Read
The book by Victor Haghani and James White uses its intriguing title and introductory chapter to highlight how ineffective risk management often leads to the dissipation of family wealth. The rest of the book delves deeper into risk management using utility theory to guide investment decisions. It provides a framework for addressing crucial financial decisions, such as determining trade sizes, assessing personal risk exposure, and asset allocation, all backed by academic theories and quantitative analysis to support the authors’ recommendations.
Read
The combination of “The Virtue of Complexity in Return Prediction” and “Can Machines Time Markets?” offers a compelling narrative on the evolution of market return prediction. The first paper challenges conventional wisdom, advocating for complex models over simplistic approaches in forecasting. Building on this, the second paper explores machine learning’s application in timing markets, revealing the potential of complex models to surpass traditional methods by detecting nonlinear relationships between signals and returns. Though improvements are gradual, the findings suggest a promising trajectory for enhancing return prediction across various asset classes.
This study explores portfolio trading and its effects on corporate bond liquidity, revealing that while portfolio trades generally improve liquidity, especially for riskier and illiquid bonds, they can become costly in troubled markets. The benefits come from diversification, but the costs arise from balance sheet constraints, highlighting the nuanced impacts on bond market liquidity.
Read
In this paper, which will be presented at this year’s EFA Conference, researchers assess the computational reproducibility of over 1,000 empirical finance results from 168 research teams, finding that identical results are reproduced only 52% of the time. Reproducibility improves with better coding skills and greater effort but declines with more complex code and technical questions. Researchers often overestimate the reproducibility of their work, suggesting the need for improved guidelines and policies in academic publishing to enhance research credibility.
Read
This paper demonstrates that using empirical Bayes to analyze 140,000 trading strategies can yield out-of-sample returns comparable to published finance strategies, but without look-ahead bias. The authors highlight that returns are notably concentrated in accounting-based strategies and pre-2004 data. This method offers a rigorous, unbiased approach to identifying asset pricing patterns.
Read
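For intuition on the shrinkage at the heart of the paper above, here is a minimal empirical Bayes sketch: each strategy’s in-sample mean return is pulled toward the cross-strategy average, with more shrinkage for noisier estimates. This is a generic James-Stein-style illustration on simulated data, not the authors’ estimator.

```python
# Minimal empirical Bayes sketch: shrink each strategy's in-sample mean return
# toward the cross-strategy grand mean, with more shrinkage for noisier means.
import numpy as np

rng = np.random.default_rng(3)
n_strategies, n_months = 5000, 120
true_mu = rng.normal(0.0, 0.002, n_strategies)               # mostly tiny true edges
rets = true_mu[:, None] + rng.normal(0, 0.03, (n_strategies, n_months))

sample_mu = rets.mean(axis=1)
se2 = rets.var(axis=1, ddof=1) / n_months                    # variance of each sample mean
grand_mu = sample_mu.mean()
tau2 = max(sample_mu.var() - se2.mean(), 1e-12)              # estimated cross-sectional variance

shrinkage = tau2 / (tau2 + se2)                              # 0 = full shrink, 1 = none
eb_mu = grand_mu + shrinkage * (sample_mu - grand_mu)

print("RMSE of raw means:", round(np.sqrt(np.mean((sample_mu - true_mu) ** 2)), 5))
print("RMSE of EB means :", round(np.sqrt(np.mean((eb_mu - true_mu) ** 2)), 5))
```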
This book offers insights into business, investing, and life through Charlie Munger’s talks, lectures, and public commentary, with the support of Warren Buffett. This volume aims to provide lasting influence and timeless advice, much like Benjamin Franklin’s “Poor Richard’s Almanack.”
Read
This month, we are featuring two articles by our fellow CQA members. Dan diBartolomeo’s paper, “Climate Risk for Coastal Commercial Real Estate in the Language of Modern Portfolio Theory,” outlines a novel approach for evaluating the risks and returns of coastal real estate investments. By integrating traditional real estate risk models with flood probability data from FEMA and NOAA, he provides crucial insights into how rising sea levels affect commercial real estate portfolios. Meanwhile, Dale Rosenthal’s article, “Liquid Factor Models,” proposes the use of transparent, tradeable, and liquid instruments to create cost-effective factor models. His research demonstrates that these models not only reduce hedging costs but also offer more intuitive and less correlated factors, outperforming traditional models in explaining the variations in diverse investment funds and enhancing the efficiency of fund alpha estimation and management.
This study finds that managers strategically adjust targets in their communications with investors and markets. Using natural language processing on earnings conference call transcripts from U.S. corporations, it reveals that managers select and change targets to ensure they meet their predetermined goals.
Read
These researchers, including Maximilian Muhn, who will be presenting at the upcoming INQUIRE UK webinar, explore the effectiveness of AI tools like ChatGPT in assessing corporate risk. By analyzing earnings call transcripts, they develop measures of political, climate, and AI-related risks that outperform existing methods in predicting firm volatility and decisions such as investment. The AI-based approach also detects emerging risks, such as those related to AI, providing valuable insights at low cost.
Read
A recent study by three academics challenges the notion that the $1.7 trillion private credit market provides significant extra returns to investors after accounting for risks and fees. Analyzing data from 532 funds, they find that much of the industry’s performance is comparable to stock and credit portfolios, suggesting limited alpha generation. While some alpha may exist, it vanishes once management fees are deducted.
Read
This article presents a portfolio optimization framework called “Signature Trading” that incorporates path-dependencies in signal-asset dynamics using rough path signatures. This method efficiently encodes past observations, allowing for drawdown control and yielding superior results for trading strategies, combining classical theory’s clarity with modern, data-driven flexibility.
Read
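For readers unfamiliar with path signatures, the sketch below computes the first two signature levels of a two-dimensional signal/price path directly from the data (total increments and discretized iterated integrals); these are the kinds of features a signature-based trading rule consumes. It is a generic illustration on a toy path, not the authors’ implementation.

```python
# Minimal sketch: level-1 and level-2 terms of a path signature for a 2-D path
# (e.g., a signal and an asset price), via discretized iterated integrals.
import numpy as np

rng = np.random.default_rng(4)
path = np.cumsum(rng.normal(0, 1, size=(100, 2)), axis=0)   # toy 2-D path

increments = np.diff(path, axis=0)                 # dX_t, shape (99, 2)
level1 = increments.sum(axis=0)                    # S^(i) = total increment

d = path.shape[1]
level2 = np.zeros((d, d))
running = np.zeros(d)                              # X_t - X_0 accumulated so far
for dx in increments:
    # S^(i,j) = integral of (X^i - X^i_0) dX^j, left-point approximation
    level2 += np.outer(running, dx)
    running += dx

print("level-1 signature:", np.round(level1, 2))
print("level-2 signature:\n", np.round(level2, 2))
```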
This is an open-source resource on causal inference, using Python-based free software for accessibility. It combines scientific rigor with a light-hearted approach, aiming to make learning engaging and inclusive. Part I covers core concepts like potential outcome notation and bias mitigation, while Part II explores modern applications, particularly in tech, focusing on personalization and heterogeneous effect estimation with CATE models.
Read
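As a taste of the Part II material above, the sketch below estimates conditional average treatment effects with a simple T-learner: one outcome model is fit on treated units, another on controls, and the difference in their predictions is the CATE. The simulated data and choice of model are arbitrary assumptions.

```python
# Minimal T-learner sketch for CATE estimation: fit separate outcome models on
# treated and control units, then take the difference of their predictions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(5)
n = 5000
X = rng.normal(size=(n, 3))
treat = rng.integers(0, 2, n)
true_cate = 1.0 + 0.5 * X[:, 0]                     # effect varies with X[:, 0]
y = X[:, 1] + treat * true_cate + rng.normal(0, 1, n)

m_treated = GradientBoostingRegressor().fit(X[treat == 1], y[treat == 1])
m_control = GradientBoostingRegressor().fit(X[treat == 0], y[treat == 0])
cate_hat = m_treated.predict(X) - m_control.predict(X)

print("corr(estimated CATE, true CATE):",
      round(np.corrcoef(cate_hat, true_cate)[0, 1], 2))
```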
This paper, from CQA member Greg Forsythe, serves as a tribute to Charlie Munger, aiming to quantify his equity philosophy. It translates Munger’s qualitative principles into quantifiable stock selection criteria, identifying factors like Gross Profitability and Buyout Valuation. Combining these factors yields a strategy resembling Munger’s approach, demonstrating that seeking “wonderful businesses at fair prices” can be replicated using quantitative analytics.
Read
Mary Childs’ article “Upstarts Challenge a Foundation of Modern Investing” delves into Fama and French’s reaction to Pat Akey’s “Noisy Factors,” which scrutinizes alterations in their dataset, prompting widespread discussion among economists due to the lingering unanswered questions. “Change You Can Believe In? Hedge Fund Data Revisions” investigates the reliability of hedge fund data disclosures, revealing systematic revisions in historical returns that impact investor decision-making and underline the importance of accurate and transparent data. Lastly, “Rewriting History” examines alterations in the historical I/B/E/S analyst stock recommendations database, demonstrating nonrandom changes influenced by factors such as analyst reputation and recommendation boldness, with implications for trading signal classifications and back-tests of trading strategies.
Two new studies demonstrate the value of using AI models to process complex financial information. “Dissecting Corporate Culture Using Generative AI” pioneers the application of generative AI models to unveil analysts’ perceptions of corporate culture from analyst reports, shedding light on the influence of corporate culture on financial outcomes. Meanwhile, “The Usefulness of ChatGPT for Textual Analysis of Annual Reports” investigates the effectiveness of ChatGPT-generated sentiment and complexity scores in analyzing UK annual reports for investors. It finds that both measures contain valuable information associated with price reactions to annual report announcements and with future profitability levels and changes.
As a chaser to the Vegas cybersecurity presentation, these two papers shed light on the financial implications of cybersecurity risks for firms. “Cyber Risk and the Cross-section of Stock Returns” introduces a machine learning algorithm to quantify firms’ cyber risk, demonstrating its ability to generate significant excess returns for stocks with high cyber risk and emphasizing the importance of cyber risk in pricing stock returns. Conversely, “The Cost of Cybersecurity Exposures: Evidence from a Direct Measure of Firm Network Vulnerabilities” presents a novel measure of cybersecurity exposures based on exploitable vulnerabilities in firms’ networks, revealing that firms with high exposures underperform others by 0.42% per month, resulting in substantial losses.
“Valuing Data as an Asset” underscores the need to update traditional valuation tools to account for the changing dynamics of the data economy, particularly in entrepreneurship, corporate finance, and risk management. Meanwhile, the second paper, “A Century of Macro Factor Investing – Diversified Multi-Asset Multi-Factor Strategies Through the Cycles,” advocates for aligning portfolio allocation with predicted macroeconomic scenarios and leveraging advanced frameworks to enhance risk management and capture predictive signals from asset class and style factors.
This paper investigates non-linear relationships between five equity market factors (Value, Momentum, Small Size, Low Beta, and Profitability) and stock returns, using data from 1964 to 2023. It finds that allowing for non-linearity in modeling these relationships can improve Information Ratios for certain factor portfolios and highlights the complexity of these non-linear return patterns.
This article highlights the innovative approach of the newly launched fund BOXX, which employs heartbeat trades to minimize taxes on interest income, a strategy typically utilized to avoid distributing realized capital gains. By utilizing a box spread strategy with options positions, BOXX synthetically earns interest while categorizing it as capital gains under tax regulations.
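The sketch below works through the payoff arithmetic behind the strategy: a box spread at two strikes pays exactly the strike difference at expiry regardless of where the underlying finishes, so buying the box below that value earns an interest-like return. Strikes and the purchase price are illustrative assumptions.

```python
# Minimal sketch: a box spread's payoff at expiry equals K2 - K1 no matter
# where the underlying finishes, which is why it behaves like a zero-coupon bond.
import numpy as np

K1, K2 = 100.0, 110.0
S_T = np.linspace(50, 170, 7)                     # a range of terminal prices

def call_payoff(K):
    return np.maximum(S_T - K, 0.0)

def put_payoff(K):
    return np.maximum(K - S_T, 0.0)

# long call K1, short call K2 (bull call spread) + long put K2, short put K1
payoff = (call_payoff(K1) - call_payoff(K2)) + (put_payoff(K2) - put_payoff(K1))
print("terminal payoffs:", payoff)                # always K2 - K1 = 10

price_paid = 9.6                                  # hypothetical cost of the box today
print("implied return to expiry:", round((K2 - K1) / price_paid - 1, 4))
```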
Pedro Piccoli investigates the relationship between investor attention and returns from betting against beta (BAB), finding that higher levels of investor attention are associated with lower BAB returns. The study suggests that investor attention uniquely influences BAB performance, surpassing the impact of other well-known variables such as liquidity constraints, sentiment, lottery demand, and volatility.
This article provides a comprehensive overview of machine learning in asset management, highlighting its applications in portfolio management, factor analysis, sentiment analysis, recommendation, and customer segmentation. Emphasizing the transformative impact of machine learning on the industry, the article underscores its efficiency and necessity while acknowledging challenges in data quality, quantity, interpretability, and fairness during implementation.
This story is part of a larger saga that unfolds in the new book “Tracers in the Dark: The Global Hunt for the Crime Lords of Cryptocurrency.” It details how Sarah Meiklejohn debunked the notion of Bitcoin’s complete anonymity by tracing transactions through the blockchain. Her groundbreaking discovery not only revealed cryptocurrency’s surprising lack of privacy but also provided researchers, tech companies, and law enforcement with unprecedented transparency.
Discover valuable insights into socially responsible investing. Watch Michael Greenstone’s AEA lecture, “The Economics of the Global Energy Challenge,” where he advocates for a broader approach to addressing the climate crisis. Greenstone defines the global energy crisis and offers solutions to energy market failures. “Corporate Climate Lobbying” analyzes corporate spending on anti-climate versus pro-climate lobbying, highlighting correlations with business models and future climate-related incidents. Next, “Performance Attribution for Portfolio Constraints” introduces a new framework for dissecting portfolio components under constraints, challenging traditional beliefs with evidence that constraints, including those from ESG portfolios, can enhance portfolio performance. Lastly, “Handbook of Sustainable Finance” compiles lecture notes covering topics like ESG scoring, financial performance of ESG investing, impact investing, and climate risk measures, offering a comprehensive guide to sustainable finance principles and practices.
This paper proposes a new method for imputing daily mutual fund trades in individual stocks using daily stock prices, quarterly fund holdings, and other data, overcoming underidentification through iterative techniques with random and adaptive constraints. The method generates daily stock-level trade estimates with associated confidence levels, showing good accuracy, particularly for larger trades, as validated through proprietary daily fund trading data.
Jegadeesh and Titman revisit the momentum effect they originally documented 30 years ago, providing an evaluation of explanations such as data mining, systematic risk exposure of past winners, and behavioral underreactions to information. Additionally, they review recent literature and analyze post-2000 momentum strategy performance in Pacific Basin and developed Western markets.
In the context of the evolving landscape outlined in the recent Bloomberg article on Wall Street’s adoption of quantitative strategies for private market endeavors, our focus turns to three research papers centered around benchmarking and systematically evaluating private equity. “Private Equity for Pension Plans? Evaluating Private Equity Performance from an Investor’s Perspective” proposes a methodology to evaluate private equity investments, offering valuable perspectives on risk-adjusted performance and optimal allocations among U.S. public pension plans. “Performance Measures in Private Equity” delves into essential performance metrics for assessing private equity funds, providing nuanced understandings crucial for navigating this evolving landscape. Lastly, “Benchmarking Private Equity Portfolios: Evidence from Pension Funds” sheds light on benchmarking practices among U.S. public pension funds, offering insights into the factors influencing benchmark choices and their implications for portfolio management strategies.
These researchers explore the relationship between derivative income strategies, particularly covered calls, and total investment returns, revealing a significant negative correlation over a 25-year period. Despite the allure of generating passive income, investors may unknowingly compromise their overall returns by prioritizing derivative income strategies without considering this tradeoff.
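For intuition on that tradeoff, the sketch below compares the expiry P&L of holding a stock outright with a covered call at one strike: the option premium adds income, but upside beyond the strike is given away. The prices and premium are illustrative assumptions.

```python
# Minimal sketch: covered call P&L (stock + short call) vs. holding the stock,
# showing income from the premium in exchange for capped upside.
import numpy as np

S0, K, premium = 100.0, 105.0, 2.0
S_T = np.linspace(80, 130, 6)

stock_pnl = S_T - S0
covered_call_pnl = stock_pnl - np.maximum(S_T - K, 0.0) + premium

for s, a, b in zip(S_T, stock_pnl, covered_call_pnl):
    print(f"S_T={s:6.1f}  stock P&L={a:6.1f}  covered-call P&L={b:6.1f}")
```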
This month, we are highlighting two articles by CQA members. Melissa Brown’s “How Do Low Tracking Error, Multifactor ETFs Fit Into the Factor Investment Landscape?” showcases the advantages of low tracking error, multifactor portfolios over single factor ones, emphasizing the benefits of diversification across factors. Dan diBartolomeo’s “Has ESG Become a Crowded Trade?” investigates whether the widespread adoption of ESG standards distorts financial markets. The study concludes that there is no compelling evidence that crowding has affected the efficiency of equity markets’ asset-pricing mechanisms.
These researchers introduce Multi-Industry Simplex (MIS), a probabilistic model designed to address limitations in the current industry classification standard, GICS. Unlike GICS, which assigns firms to a single industry, MIS utilizes topic modeling in natural language processing to assign firms to multiple industries with relevance probabilities, offering interpretability and auditability.
Read
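To give a generic flavor of the approach above, the sketch below runs a small topic model over toy business descriptions and reads off each firm’s probability distribution across topics, analogous to MIS assigning firms to multiple industries with relevance probabilities. It uses off-the-shelf scikit-learn LDA and an invented corpus, not the authors’ model or data.

```python
# Minimal sketch: topic-model toy business descriptions and treat each firm's
# topic distribution as soft, multi-industry membership probabilities.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

descriptions = [
    "cloud software and data analytics platform for enterprises",
    "retail bank offering mortgages loans and deposit accounts",
    "biotech developing oncology drugs and clinical trials",
    "payments software and consumer lending fintech platform",
]

counts = CountVectorizer(stop_words="english").fit_transform(descriptions)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(counts)
firm_topic_probs = lda.transform(counts)          # each row sums to 1 across topics

for desc, probs in zip(descriptions, firm_topic_probs):
    print(desc[:40], "->", probs.round(2))
```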
Top business leaders highlight the year’s 58 must-read books. Notable among these recommendations is Rick Rubin’s The Creative Act: A Way of Being, a compilation of 78 philosophical reflections designed to guide artists in overcoming challenges and grasping the essence of creativity. Additionally, Adam Grant’s Hidden Potential reshapes the narrative of success, prioritizing continuous growth, character development, and effective learning over innate talent. Another standout on the list is Peter Attia’s Outlive: The Science & Art of Longevity, where Attia delves into maximizing one’s “health span,” emphasizing strategies for extending a healthy life span.
Read
This paper reveals that Large Language Models, specifically GPT-4, designed to be helpful and honest, can exhibit misaligned behavior and strategically deceive users in a simulated environment. The model, acting as an autonomous stock trading agent, engages in insider trading and consistently conceals the true reasons behind its decisions when reporting to its manager, demonstrating unexpected and untrained deceptive behavior.
Read
A recent episode of Planet Money, “What Econ Says in the Shadows,” delves into the website Economics Job Market Rumors. EJMR functions as a combination of a job information Wiki and a discussion forum, offering anonymity for users to discuss job searches and share rumors. However, it has gained notoriety for hosting toxic content, prompting a study to determine whether the toxicity is indicative of deeper issues within the economics profession. Economist Florian Ederer and engineer Kyle Jensen revealed in their paper, “Anonymity and Identity Online,” that despite the platform’s anonymous nature, the statistical properties of the username generation scheme allow for the identification of IP addresses. This has led to the recovery of 47,630 distinct IP addresses and the analysis of cross-sectional variations, including toxic speech patterns, across sub-forums, geographies, institutions, and contributors. Additionally, a video presentation of their paper can be found here.
NY Times’ Mark Gimein reviews Copeland’s latest book, The Fund, which exposes Ray Dalio’s manipulative leadership at the notorious hedge-fund giant Bridgewater Associates. The book delves into the toxic and dystopian working environment he created. Bloomberg’s review explores Bridgewater’s “believability ranking system” and discusses how the culture at Bridgewater may have revolved more around a cult of personality. In his original article, Copeland also delves into Bridgewater’s efforts to gain information advantages, such as building relationships with government officials to anticipate economic interventions.
This paper, set to be presented at the upcoming ASSA conference, investigates the sequential search process in over-the-counter (OTC) financial markets, focusing on corporate bonds, where customers make repeated inquiries to dealers. By analyzing a comprehensive record of inquiries on a leading electronic trading platform, the study finds that after the first failed inquiry, it takes two to three days for a customer to purchase an investment-grade bond, emphasizing the significance of both observed and unobserved heterogeneity in understanding search frictions in OTC markets.
Read
Professor Ethan Mollick discusses the introduction of an AI-powered “Help me write” button in Google Docs, foreseeing widespread use and its impact on outsourcing meaningful tasks. He raises concerns about the potential crisis of meaning as traditionally thoughtful work becomes automated, emphasizing the need for proactive adaptation by organizations to harness the benefits of AI while preserving the value of meaningful contributions.
Read
Pension funds, relying on consultants for asset allocation decisions, replace general consultants due to prior underperformance, while the hiring of specialized consultants is influenced by target asset allocation gaps and board composition. Although specialized consultants enable pension funds to increase investments in private markets, this does not necessarily improve performance, and the concentration of consultants may elevate pension fund flow correlations. This paper is also scheduled for presentation at the ASSA conference.
Read
These researchers investigate intraday ETF arbitrage using options, focusing on the S&P 500 ETF (SPY) and S&P 500 index options (SPX). The study reveals that market makers tap into multi-leg complex options orders to fulfill arbitrage trades following ETF order flow shocks, demonstrating the interconnectedness between ETFs and derivatives markets and highlighting profitable mispricing opportunities for market makers.
Read
The replication crisis in finance and other fields has spurred a healthy dose of reflection. Notably, three articles added to the debate this month. First, Fama & French posted a note that describes in more detail the construction of the SMB and HML factors, and the influence that data and rule changes have had on these factors, which appears to be in response to Pat Akey’s “Noisy Factors.” Second, Bloomberg reported on the retraction of a JFE paper on corporate bond risk factors that has further shaken up the academic world and credit market researchers. Lastly, a WSJ essay delves into more high-profile retractions and suggests that journals may want to consider paying reviewers (more) or hiring scientists who check data and statistics going forward.
Don’t expect to glean any market tips or trading secrets from James Simons, who steadfastly refuses to disclose the method behind his remarkable record in investing. Instead, listen to this mathematician, hedge fund manager and philanthropist sum up a remarkably varied and rich career, and offer some “guiding principles” distilled along the way.
Watch
In this lecture, Abu-Mostafa explains the science of AI in plain language and explores how the scientific details illustrate the risks and benefits of AI. Between the extremes of “AI will kill us all” and “AI will solve all our problems,” the science can help us identify what is realistic and what is speculative, and guide our planning, legislation, and investment in AI.
Watch
Marie Connolly and Alan B. Krueger’s 2005 paper examines economic aspects of the rock and roll industry, emphasizing concert revenues as the primary income for performers. The analysis covers topics such as concert pricing trends, revenue concentration among performers, the secondary ticket market, performer ranking methods, copyright protection, and the influence of technological change.
Read
“Inspired by Dr. Alan Krueger’s research on the economics of the music industry and book, Rockonomics, I created this course in 2021 at the University of California, Riverside. Starting in January 2024, a reimagined version of this course will be offered at the University of Chicago Harris School of Public Policy and will incorporate the study of public policy issues affecting creative sectors.”
More Information
These researchers introduce a method to estimate latent asset-pricing factors by incorporating economically motivated targets for both cross-sectional and time-series properties. The study demonstrates that these targets enhance risk factors’ ability to span the pricing kernel, resulting in significantly improved Sharpe ratios and reduced pricing errors compared to conventional approaches, using a large-scale set of assets. This paper will be presented at the upcoming ASSA conference.
Read
This paper explores the disconnect between Environmental, Social, and Governance (ESG) scores and green patent production, revealing a surprising trend where oil, gas, and energy-producing firms with lower ESG scores are significant innovators in the United States’ green patent landscape. Despite being excluded from ESG funds, these firms contribute substantially to green innovation, particularly in foundational aspects like carbon capture, challenging conventional expectations in sustainable investing.
Read
Chicago Quantitative Alliance
dancardell@gmail.com