We develop structured machine learning regressions for nowcasting with panel data consisting of series sampled at different frequencies. Motivated by the problem of predicting corporate earnings for a large cross-section of firms with macroeconomic, financial, and news time series sampled at different frequencies, we focus on the sparse-group LASSO regularization, which can exploit mixed-frequency time series panel data structures.
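To make the regularizer concrete, here is a minimal sketch of sparse-group LASSO estimation via proximal gradient descent. The grouping scheme, tuning parameters, and solver below are illustrative assumptions for exposition, not the paper's implementation.

```python
# Sparse-group LASSO via proximal gradient descent -- an illustrative sketch.
import numpy as np

def sg_lasso_prox(beta, groups, lam, alpha, step):
    """Proximal operator of alpha*lam*||b||_1 + (1-alpha)*lam*sum_g ||b_g||_2."""
    # Element-wise soft-thresholding (the LASSO part).
    b = np.sign(beta) * np.maximum(np.abs(beta) - step * alpha * lam, 0.0)
    # Group-wise shrinkage (the group-LASSO part).
    for g in groups:
        norm = np.linalg.norm(b[g])
        if norm > 0.0:
            b[g] = b[g] * max(0.0, 1.0 - step * (1.0 - alpha) * lam / norm)
    return b

def sg_lasso(X, y, groups, lam=0.1, alpha=0.5, n_iter=1000):
    """Minimize 0.5/n * ||y - X b||^2 + sparse-group LASSO penalty."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ beta) / n
        beta = sg_lasso_prox(beta - step * grad, groups, lam, alpha, step)
    return beta

# Hypothetical MIDAS-style grouping: all 12 high-frequency lags of covariate g
# form one group, so the penalty selects whole covariates and sparsifies within.
groups = [np.arange(12 * g, 12 * (g + 1)) for g in range(5)]
```

In a mixed-frequency setting, grouping all high-frequency lags of one covariate together is what lets the group part of the penalty select entire series while the L1 part sparsifies within each lag polynomial.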

Quantum computers are not yet up to the task of providing computational advantages for the practical stochastic diffusion models commonly used by financial analysts. In this paper, we introduce a class of stochastic processes that are both realistic in mimicking financial market risks and more amenable to potential quantum computational advantages.

This paper surveys recent advances in machine learning methods for economic forecasting. The survey covers the following topics: nowcasting, textual data, panel and tensor data, high-dimensional Granger causality tests, time series cross-validation, and classification with economic losses.

When policymakers implement a disinflation program directed at high inflation, the real dollar value of their country’s stock market index experiences a cumulative abnormal 12-month return of 48 percent in anticipation of the event. In contrast, the average cumulative abnormal 12-month return associated with disinflations directed at moderate inflation is negative 18 percent. The 66-percentage-point difference between cumulative abnormal returns, along with descriptive evidence and case studies, suggests that, unlike the swift eradication of past high inflations documented by Sargent (1982), the US will not experience a quick, low-cost transition from moderate inflation to the Fed’s two-percent target.
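For reference, with cumulative abnormal returns defined in the usual event-study way (this construction is our gloss, not a formula quoted from the paper), the headline gap is just the difference of the two figures:

$$
\mathrm{CAR}_{[1,12]} \;=\; \sum_{t=1}^{12}\bigl(R_t - \mathbb{E}[R_t]\bigr),
\qquad
48\% - (-18\%) \;=\; 66 \text{ percentage points.}
$$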

Factor analysis is a widely used tool for summarizing high-dimensional panel data via a small set of latent factors. Applications, particularly in finance, often focus on observable factors with an economic interpretation. The objective of this paper is to provide a formal test of whether the factor spaces spanned by latent and observable (economic) factors are equal.
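As a heuristic illustration of the question being tested, one can compare the span of PCA-estimated latent factors with observed economic factors via canonical correlations. This is only a diagnostic sketch under simulated data, not the formal test statistic developed in the paper.

```python
# Compare latent (PCA) and observable factor spaces via canonical correlations.
import numpy as np

def canonical_correlations(F_hat, G):
    """Canonical correlations between column spaces of F_hat (T x k) and G (T x m)."""
    Qf, _ = np.linalg.qr(F_hat - F_hat.mean(0))
    Qg, _ = np.linalg.qr(G - G.mean(0))
    # All correlations near 1 <=> the two factor spaces (nearly) coincide.
    return np.linalg.svd(Qf.T @ Qg, compute_uv=False)

# Simulated example: X is a T x N panel driven by k "observable" factors G.
T, N, k = 200, 100, 3
rng = np.random.default_rng(0)
G = rng.standard_normal((T, k))
X = G @ rng.standard_normal((k, N)) + 0.1 * rng.standard_normal((T, N))
_, _, Vt = np.linalg.svd(X, full_matrices=False)
F_hat = X @ Vt[:k].T                      # PCA factor estimates (score matrix)
print(canonical_correlations(F_hat, G))   # ~ [1, 1, 1] in this low-noise setup
```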

In this paper, we develop new methods for analyzing high-dimensional tensor datasets. A tensor factor model describes a high-dimensional dataset as the sum of a low-rank component and idiosyncratic noise, generalizing traditional factor models for panel data. We propose an estimation algorithm, called tensor principal component analysis (tensor PCA), which generalizes traditional PCA for panel data.
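One simple way to generalize PCA to a tensor is to apply it to each mode's unfolding, HOSVD-style, and reconstruct the low-rank component from the estimated subspaces. The sketch below illustrates that idea on a 3-way tensor; the paper's tensor-PCA estimator may differ in its details.

```python
# HOSVD-style tensor PCA sketch: low-rank component of a 3-way tensor.
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tensor_pca(T, ranks):
    """Estimate the low-rank component via mode-wise principal subspaces."""
    U = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        U.append(u[:, :r])  # top-r left singular vectors of each unfolding
    # Core tensor: project onto the estimated subspaces, then reconstruct.
    core = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
    low_rank = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])
    return low_rank, U

# Demo: rank-2 signal plus idiosyncratic noise.
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((d, 2)) for d in (10, 12, 14))
signal = np.einsum('ia,ja,ka->ijk', A, B, C)
noisy = signal + 0.1 * rng.standard_normal(signal.shape)
est, _ = tensor_pca(noisy, (2, 2, 2))
print(np.linalg.norm(est - signal) / np.linalg.norm(signal))  # small
```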

Dr. Gerald Cohen brings nearly 30 years of high-profile private- and public-sector experience to the institute, where he is taking a leading role in advancing the Kenan Institute’s mission and translational research efforts.

We model the joint dynamics of intraday liquidity, volume, and volatility in the U.S. Treasury market, especially through the 2007–09 financial crisis and around important economic announcements. Using various specifications based on the Log-ACD(1,1) model of Bauwens and Giot (2000), we find that liquidity, volume, and volatility are highly persistent, with volatility having lower short-term persistence than the other two. Market liquidity and volume are important for explaining volatility dynamics, but not vice versa. In addition, market dynamics change during the financial crisis, with all variables exhibiting increased responsiveness to their most recent realizations.
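For intuition about the Log-ACD(1,1) class, here is a minimal simulation in the spirit of Bauwens and Giot (2000). The parameterization (Log-ACD2 form, unit-mean exponential innovations) and parameter values are illustrative assumptions, not estimates from the paper.

```python
# Minimal Log-ACD(1,1) simulation; parameters are illustrative only.
import numpy as np

def simulate_log_acd(n, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """x_i = psi_i * eps_i,  log psi_i = omega + alpha*eps_{i-1} + beta*log psi_{i-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.exponential(1.0, n)                # unit-mean positive innovations
    log_psi = np.empty(n)
    log_psi[0] = (omega + alpha) / (1.0 - beta)  # unconditional mean of log psi
    for i in range(1, n):
        log_psi[i] = omega + alpha * eps[i - 1] + beta * log_psi[i - 1]
    return np.exp(log_psi) * eps                 # simulated positive-valued series

x = simulate_log_acd(5000)
print(x.mean(), x.std())  # persistence is governed by beta
```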

What is the impact of higher technological volatility on asset prices and macroeconomic aggregates? I find that the answer hinges on the volatility’s sectoral origin. Volatility originating in the consumption (investment) sector lowers (raises) macroeconomic growth rates and stock prices.

We study multi-period sales-force incentive contracting where salespeople can engage in effort gaming, a phenomenon that has extensive empirical support. Focusing on a repeated moral hazard scenario with two independent periods and a risk-neutral agent with limited liability, we conduct a theoretical investigation to understand which effort profiles the firm can expect under the optimal contract. We show that various effort profiles that may give the appearance of being sub-optimal, such as postponing effort exertion (“hockey stick”) and not exerting effort after a bad or a good initial demand outcome (“giving up” and “resting on laurels,” respectively) may indeed be induced optimally by the firm.

Networks of serial entrepreneurs, investors, and their affiliated companies play a critical role in driving entrepreneurial behavior, investor focus, and innovation hot spots within specific industry sectors, and are essential for shaping the character of robust regional economies.

A core idea in competitive strategy is that a firm’s ability to capture value depends on the creation of value: maximizing the gap between willingness to pay and cost. This value can depend on or be enhanced through complementarity, where the willingness to pay for an offering is increased in the presence of another offering. Substitutability is assumed to have the opposite effect.
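In symbols, under the standard value-based strategy formalization (our gloss, not the paper's notation): value created is the gap between willingness to pay and cost, and complementarity raises willingness to pay for an offering when the other offering is present.

$$
V \;=\; \mathrm{WTP} - C,
\qquad
\text{complementarity: } \mathrm{WTP}(A \mid B) > \mathrm{WTP}(A \mid \varnothing),
\qquad
\text{substitutability: } \mathrm{WTP}(A \mid B) < \mathrm{WTP}(A \mid \varnothing).
$$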