Breaking the Cycle: How the News and Markets Created a Negative Feedback Loop in COVID-19
New research from CBS Professor Harry Mamaysky reveals how negativity in the news and markets can escalate a financial crisis.
Adapted from “Global Value Chains in Developing Countries: A Relational Perspective from Coffee and Garments,” by Laura Boudreau of Columbia Business School, Julia Cajal Grossi of the Geneva Graduate Institute, and Rocco Macchiavello of the London School of Economics.
Adapted from “Online Advertising as Passive Search,” by Raluca M. Ursu of New York University Stern School of Business, Andrey Simonov of Columbia Business School, and Eunkyung An of New York University Stern School of Business.
This paper from Columbia Business School, “Meaning of Manual Labor Impedes Consumer Adoption of Autonomous Products,” explores marketing solutions to some consumers’ resistance to autonomous products. The study was co-authored by Emanuel de Bellis of the University of St. Gallen, Gita Johar of Columbia Business School, and Nicola Poletti of Cada.
“The Macroeconomics of Stakeholder Equilibria,” co-authored by John B. Donaldson of Columbia Business School, proposes a model for a purely private, mutually beneficial financial agreement between worker and firm that keeps decision-making in the hands of stockholders while improving the employment contract for employees.
At Columbia Business School, our faculty members are at the forefront of research in their respective fields, offering innovative ideas that directly impact the practice of business today. A quick glance at our publication on faculty research, CBS Insights, will give you a sense of the breadth and immediacy of the insight our professors provide.
As a student at the School, you benefit directly from this research. In Columbia classrooms, you are at the cutting edge of industry, studying the practices that others will later adopt and teach. As any business leader will tell you, in a competitive environment, being first puts you at a distinct advantage over your peers. Learn economic development from Ray Fisman, the Lambert Family Professor of Social Enterprise and a rising star in the field, or real estate from Chris Mayer, the Paul Milstein Professor of Real Estate, a renowned expert and frequent commentator on complex housing issues. This way, when you complete your degree, you'll be set up to succeed.
Columbia Business School, in conjunction with the Office of the Dean, provides its faculty, PhD students, and other research staff with resources and cutting-edge tools and technology to help push the boundaries of business research.
Specifically, our goal is to help faculty seamlessly set up and execute their research programs. This includes, but is not limited to:
All these activities help to facilitate and streamline faculty research, and that of the doctoral students working with them.
This article provides confirmatory evidence of the value-relevance of book values of oil and gas properties. Harris and Ohlson (1987) find that the book values correlate significantly with the inferred market values of oil and gas properties. Reserve recognition accounting requires the simultaneous publication of alternative measures that are often assumed to be more relevant values of the oil and gas properties.
A meta-analysis relates environmental, strategic, and organizational factors to financial performance across 320 published empirical studies, which appeared between 1921 and 1987. Findings from the most frequently studied relationships include: (1) industry concentration, addressed in almost 100 studies comprising over 1,100 tests, shows a clear directional effect; (2) growth, analyzed in 88 studies, is consistently related to higher financial performance; and (3) market share is positively associated with financial performance.
A number of issues that relate to the desirability and implications of new venture financing are examined within a principal-agent framework that captures the essence of the relationship between entrepreneurs and venture capitalists. The model suggests: (1) As long as the skill levels of entrepreneurs are common knowledge, all will choose to involve venture capital investors, since the risk sharing provided by outside participation dominates the agency relationship that is created.
In estimating functions of continuous-time Markov chains via simulation, one may reduce variance and computation by simulating only the embedded discrete-time chain. To estimate derivatives (with respect to transition probabilities) of functions of discrete-time Markov chains, we propose embedding them in continuous-time processes. To eliminate the additional variance and computation thereby introduced, we convert back to discrete time. For a restricted class of chains, we may embed in a continuous-time Markov chain and apply perturbation analysis estimation.
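The embedding-and-weighting idea can be illustrated on the forward problem. The sketch below is not the paper's estimator: it estimates a stationary expectation of a continuous-time Markov chain by simulating only the embedded discrete-time jump chain, weighting each visit to state i by the mean holding time 1/q_i so that no exponential holding times need to be drawn. The generator Q and target function are made-up examples.

```python
import random

def stationary_expectation(Q, f, n_steps=200_000, seed=0):
    """Estimate E_pi[f] for a CTMC with generator Q by simulating only
    the embedded discrete-time jump chain.  Each visit to state i is
    weighted by the mean holding time 1/q_i (a ratio estimator)."""
    rng = random.Random(seed)
    n = len(Q)
    state, num, den = 0, 0.0, 0.0
    for _ in range(n_steps):
        q_i = -Q[state][state]
        w = 1.0 / q_i                 # expected sojourn time in `state`
        num += w * f(state)
        den += w
        # embedded-chain jump probabilities: P[i][j] = Q[i][j] / q_i
        u, acc, nxt = rng.random(), 0.0, state
        for j in range(n):
            if j == state:
                continue
            acc += Q[state][j] / q_i
            nxt = j
            if u < acc:
                break
        state = nxt
    return num / den

# A made-up 3-state generator (rows sum to zero) for illustration.
Q = [[-2.0, 1.0, 1.0],
     [3.0, -4.0, 1.0],
     [1.0, 1.0, -2.0]]
est = stationary_expectation(Q, lambda s: float(s))
```

For a two-state chain with rates a (0 to 1) and b (1 to 0), the true stationary probability of state 1 is a/(a+b), which the estimator reproduces.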
A new high-performance algorithmic switched-current memory cell with greatly improved charge injection performance is described. The new cell uses algorithmic means to achieve an improvement in charge injection of two orders of magnitude and does not rely on matching.
We argue that although decentralization has advantages in finding low-cost solutions, these advantages are accompanied by coordination problems, which lead to delay or duplication of effort or both. Consequently, decentralization is desirable when there is little urgency or a great deal of private information, but it is strictly undesirable in urgent problems when private information is less important. We also examine the effect of large numbers and find that coordination problems disappear in the limit if distributions are common knowledge.
Similarity scaling often requires subjects to produce such a large number of judgments that fatigue may become a problem. Yet it remains unclear just how respondent fatigue affects similarity perceptions and resulting judgments. The present study uses a categorization perspective to examine the effects of fatigue on similarity judgments. The results suggest that subjects rely increasingly on category membership as they progress through a similarity judgment task.
Meta-analysis has become a popular approach for studying systematic variation in parameter estimates across studies. This article discusses the use of meta-analysis results as prior information in a new study. Although hierarchical prior distributions in a traditional Bayesian framework are characterized by complete exchangeability, meta-analysis priors explicitly incorporate heterogeneity in prior vectors.
A shopping mall, new office towers, a convention center, an atrium hotel, a restored historic neighborhood. These are the civic agenda for downtown development in the last third of the twentieth century, a trophy collection that mayors want. Add a domed stadium, aquarium, or cleaned-up waterfront to suit the circumstances, and you have the essential equipment for a first-class American city.
This article presents an exploratory investigation into longitudinal patterns of influence in group decision-making. In particular, we focus on how the outcomes of past decisions affect group members' relative influence in future joint decisions. Results suggest that past outcomes play an important role in the resolution of disagreements when group member preferences are equally intense. Losers in prior decisions are likely to win in the future (and vice versa) due to what appears to be promotion of equity in the group.
In most vehicle routing problems, a given set of customers is to be partitioned into a collection of regions each of which is assigned to a single vehicle starting at a depot and returning there after visiting all of the region's customers exactly once in a route. In this paper we consider problem settings where the cost of a route may depend on its length ϑ as well as m, the number of points on the route, according to some general function f(ϑ,m), assumed to be nondecreasing and concave in ϑ.
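To make the cost structure concrete, here is a toy illustration. The particular f below is an arbitrary choice that satisfies the stated assumptions (nondecreasing and concave in the route length ϑ), not one taken from the paper; the depot and stop coordinates are made up.

```python
import math

def route_length(depot, stops):
    """Euclidean length of the tour depot -> stops... -> depot."""
    tour = [depot] + list(stops) + [depot]
    return sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))

def route_cost(length, m, alpha=5.0, beta=2.0):
    """Illustrative f(length, m): nondecreasing and concave in the
    route length, plus a per-stop charge for the m customers."""
    return alpha * math.sqrt(length) + beta * m

depot = (0.0, 0.0)
stops = [(3.0, 0.0), (3.0, 4.0)]
length = route_length(depot, stops)      # 3 + 4 + 5 = 12
cost = route_cost(length, len(stops))
```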
A central assumption of meta-analysis is that the sample of studies fairly represents all work done in the field, published and unpublished. However, if studies with "poor" results are less likely to be published, a potential publication bias is present. The authors propose a maximum likelihood approach to estimating publication bias for the situation in which censorship based on effect size may occur. An explicit hypothesis test is provided for testing whether or not censorship is present.
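The authors' likelihood is not reproduced here, but the flavor of a censorship-corrected MLE can be sketched under stylized assumptions: suppose (purely for illustration) that effect estimates are normal and a study is published only when its estimate exceeds a cutoff. Maximizing the truncation-corrected likelihood then recovers the underlying mean that a naive average of published effects overstates.

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def truncated_loglik(mu, data, sigma, cutoff):
    """Log-likelihood of effects observed only when they exceed
    `cutoff` (a stylized publication filter): each term is the normal
    log-density minus log P(X > cutoff)."""
    log_tail = math.log(1.0 - norm_cdf((cutoff - mu) / sigma))
    ll = 0.0
    for x in data:
        z = (x - mu) / sigma
        ll += -0.5 * z * z - math.log(sigma) \
              - 0.5 * math.log(2.0 * math.pi) - log_tail
    return ll

def mle_mu(data, sigma, cutoff, grid):
    """Grid-search MLE for the underlying mean effect."""
    return max(grid, key=lambda mu: truncated_loglik(mu, data, sigma, cutoff))
```

On simulated data with true mean 0.2 censored at 0, the corrected MLE lands near 0.2 while the naive mean of the surviving studies is far higher.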
By committing to terminate funding if a firm's performance is poor, investors can mitigate managerial incentive problems. These optimal financial constraints, however, encourage rivals to ensure that a firm's performance is poor; this raises the chance that the financial constraints become binding and induce exit. We analyze the optimal financial contract in light of this predatory threat. The optimal contract balances the benefits of deterring predation by relaxing financial constraints against the cost of exacerbating incentive problems. (JEL 610)
The ultimate success of new product R&D depends as much on customer acceptance as on technological breakthroughs. In this article, Susan Holak and Donald Lehmann focus on customer acceptance by exploring the manner in which the general attributes of Rogers (relative advantage, compatibility, complexity, divisibility and communicability) plus perceived risk combine to form the intention to buy an innovation. Results demonstrate a causal structure among these attributes and lead to various implications for R&D guidelines and product design.
PASTA (Poisson Arrivals See Time Averages) is a term coined by R. Wolff in his well known 1982 paper. In keeping with Wolff's terminology, we use the term anti-PASTA to refer to the following converse of PASTA. Given that arrivals do indeed see time averages, when must the arrival process necessarily be Poisson? We show that anti-PASTA is satisfied in a pure-jump Markov process, provided that the arrival process corresponds to a subset of the Markov process jumps.
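PASTA itself is easy to check numerically. The sketch below (illustrative parameters, not from the paper) simulates an M/M/1 queue and compares the time-average number in system with the average number found by arriving customers; for Poisson arrivals the two agree.

```python
import random

def mm1_pasta(lam=0.7, mu=1.0, horizon=200_000.0, seed=42):
    """Simulate an M/M/1 queue; return (time-average number in system,
    average number seen by arrivals).  PASTA says these coincide."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    next_arr = rng.expovariate(lam)
    next_dep = float('inf')
    area = 0.0                      # integral of n(t) dt
    seen_sum, arrivals = 0.0, 0
    while t < horizon:
        t_next = min(next_arr, next_dep)
        area += n * (t_next - t)
        t = t_next
        if next_arr <= next_dep:    # arrival event
            seen_sum += n           # state just before the arrival
            arrivals += 1
            n += 1
            next_arr = t + rng.expovariate(lam)
            if n == 1:              # server was idle: start service
                next_dep = t + rng.expovariate(mu)
        else:                       # departure event
            n -= 1
            next_dep = t + rng.expovariate(mu) if n > 0 else float('inf')
    return area / t, seen_sum / arrivals
```

At these parameters the common limit is rho/(1 - rho) = 7/3, and both estimates converge to it.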
Research on object concepts has identified one level of abstraction as "basic" in cognition and communication. We investigated whether concepts for routine social events have a basic level by replicating the converging operations used to investigate object concepts. In Experiment I, subjects were presented with event names from a taxonomy and were asked to list the actions comprising the event.
I construct an intertemporal model in which investors trade shares of a firm. All trading is done through competitive market makers. After the initial period and before the end of the planning horizon, information is asymmetrically distributed among traders, and the prices for investors who buy shares are higher than for those who sell shares. The presence of this deviation from the Walrasian paradigm notwithstanding, dividend policy does not affect the initial period's share price or shareholders' welfare. This result is robust to various extensions of the model.
Federal cutbacks in urban aid in the 1970s forced cities to finance redevelopment projects with their own resources. Freed from federal rules and regulations, cities responded with invention, devising new financial strategies that proved to be powerful alternatives to direct federal aid. The process that fostered the solutions—public-private dealmaking—transformed the nature of city development practice, raising with it troublesome issues of accountability. This article describes these financial strategies and the nature of public subsidies in the deals.
Literature concerning the quality of individual and face-to-face group judgments has generally concluded that both groups and statistically pooled individuals outperform randomly chosen or average individuals. This paper extends previous research by comparing statistically pooled individual judgments of both individuals and face-to-face groups in a stock selection task. In general, decisions that would have resulted from statistically pooled judgments were better (as assessed by future stock value) than those that would have resulted from individual or face-to-face group judgments.
We consider distribution systems with a depot and many geographically dispersed retailers each of which faces external demands occurring at constant, deterministic but retailer specific rates. All stock enters the system through the depot from where it is distributed to the retailers by a fleet of capacitated vehicles combining deliveries into efficient routes. Inventories are kept at the retailers but not at the depot.
We consider a single-server queueing system with Poisson arrivals and general service times. While the server is up, it is subject to breakdowns according to a Poisson process. When the server breaks down, we need to repair the server immediately by initiating one of two available repair operations. The operating costs of the system include customer holding costs, repair costs and running costs. The objective is to find a corrective maintenance policy that minimizes the long-run average operating costs of the system. The problem is formulated as a semi-Markov decision process.
This study reports on the ex-post performance of survivor REITs and RECs over a 14.5-year period covering several business cycles. The results show that the systematic risk and risk-adjusted returns of REITs and RECs are quite different, especially during periods of low growth in real GNP. Relative to the overall stock market, survivor REITs, in particular, equity REITs, exhibited less volatility and higher returns than previous studies revealed.
This article treats the dynamic lot size model with quantity discount in purchase price. We study the problem with two different cost structures: the all-units-discount cost structure and the incremental-discount cost structure. We solve the problem under both discount cost structures by dynamic programming algorithms of complexity O(T^3) and O(T^2), respectively, with T the number of periods in the planning horizon.
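The discount recursions themselves are not reproduced here, but the underlying O(T^2) dynamic lot-size recursion (the classic no-discount Wagner-Whitin case) can be sketched as follows; the demand, setup, and holding-cost figures are made up.

```python
def dynamic_lot_size(demand, setup, hold):
    """O(T^2) DP for the no-discount dynamic lot-size model:
    C[t] = min over j <= t of C[j-1] + setup + cost of holding the
    demand of periods j..t from an order placed in period j."""
    T = len(demand)
    C = [0.0] + [float('inf')] * T   # C[t]: min cost to cover periods 1..t
    for j in range(1, T + 1):        # j: period of the last order
        carry = 0.0
        for t in range(j, T + 1):
            carry += hold * (t - j) * demand[t - 1]
            C[t] = min(C[t], C[j - 1] + setup + carry)
    return C[T]

# e.g. dynamic_lot_size([20, 30, 40], setup=100, hold=1.0) -> 210.0
# (a single order of 90 units in period 1 beats any split ordering)
```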
Infinitesimal perturbation analysis is a method of obtaining estimates of performance sensitivity through simulation of a stochastic system. Expressions are derived for the limiting value of a broad class of such estimators associated with queueing networks, in terms of the unique solution to a set of linear equations. The approach used is to augment the underlying queueing process with information about which servers have been "perturbed" and by how much.
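As background, the basic single-queue IPA estimator (not the augmented network construction of the paper) can be sketched. With service times S_n = theta * X_n, the derivative of each customer's system time with respect to theta accumulates within a busy period and resets when the queue empties; all names and data below are illustrative.

```python
def avg_system_time(theta, inter, base_serv):
    """Lindley recursion for customer system times, with service times
    S_n = theta * X_n (inter: interarrival times, base_serv: X_n)."""
    T_prev, total = 0.0, 0.0
    for a, x in zip(inter, base_serv):
        T_prev = max(T_prev - a, 0.0) + theta * x
        total += T_prev
    return total / len(inter)

def ipa_derivative(theta, inter, base_serv):
    """IPA estimate of d(avg system time)/d(theta) along one sample
    path: within a busy period dT_n accumulates; it resets to x when
    customer n finds the system empty."""
    T_prev, dT_prev, dtotal = 0.0, 0.0, 0.0
    for a, x in zip(inter, base_serv):
        if T_prev - a > 0.0:        # n joins an ongoing busy period
            dT_prev += x
        else:                       # n starts a fresh busy period
            dT_prev = x
        T_prev = max(T_prev - a, 0.0) + theta * x
        dtotal += dT_prev
    return dtotal / len(inter)
```

Because the sample performance is piecewise linear in theta, on a path with no event-order change the IPA estimate matches a finite-difference check exactly.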
The issue of providing segment disclosures has renewed significance because the Securities & Exchange Commission (SEC) has been considering the extension of segment disclosures, both line-of-business (LOB) and geographically segmented (GEOG), to all interim financial statements. To determine whether GEOG data provide incremental information about the earnings process, the specific contribution of sales and income GEOG data was evaluated by estimating their predictive ability. Two sets of GEOG predictions were used in the predictive accuracy tests.
In this study we consider managerial earnings forecasts as voluntary information releases and compare their properties with predictions from a screening or signaling scenario.
Two methods are presented for estimating performance derivatives from simulation of multi-class queueing networks for sensitivity analysis. The methods use approximate subnetwork aggregation to reduce the problem to a single-class derivative estimation problem with which a modified infinitesimal perturbation analysis algorithm is used. The modified algorithm treats a subnetwork as though it had been aggregated, but is actually applied to the original (non-aggregated) network.
Knowledge of the one-month interest rate is useful in forecasting the sign as well as the variance of the excess return on stocks. The services of a portfolio manager who makes use of the forecasting model to shift funds between bills and stocks would be worth an annual management fee of 2 percent of the value of the assets managed. During 1954:4 to 1986:12, the variance of monthly returns on the managed portfolio was about 60 percent of the variance of the returns on the value weighted index, whereas the average return was two basis points higher.
This paper performs a financial statement analysis that combines a large set of financial statement items into one summary measure which indicates the direction of one-year-ahead earnings changes. Positions taken in stocks on the basis of this measure during the period 1973–1983 involve canceling long and short positions with zero net investment. The two-year holding-period return to the long and short positions is on the order of 12.5%. After adjustment for "size effects" the return is about 7.0%. These returns cannot be explained by nominated firm risk characteristics.
This paper examines the properties of a market solution to the output uncertainty problem faced by unincorporated primary producers. An insurance contract is formulated and shown to provide income insurance to producers without exposing insurers to moral hazard. The equilibrium relative to this contract is shown to be equivalent to that effected by a stock market available to all producers.
We examine two measures of monthly manufacturing production. The first is the index of industrial production; the second is constructed from the accounting identity that output equals sales plus the change in inventories. We show that the means, variances, and serial correlation coefficients of the log growth rates differ substantially between the two series, and the cross-correlations are in most cases less than 0.4.
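The accounting identity behind the second series is mechanical: output_t = sales_t + (inventories_t − inventories_{t−1}). A minimal sketch with made-up figures:

```python
def production_from_identity(sales, inventories, inv_initial):
    """Construct output from the accounting identity
    production_t = sales_t + change in inventories in period t."""
    prod, prev = [], inv_initial
    for s, inv in zip(sales, inventories):
        prod.append(s + (inv - prev))
        prev = inv
    return prod

# e.g. sales [100, 110], end-of-period inventories [55, 50],
# starting inventory 50 -> production [105, 105]
```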
This paper utilizes the concept of aggregative consistency defined in Rubinstein and Fishburn [1986], and the FASB's concept of representational faithfulness to evaluate foreign currency translation and accounting for changing prices as embodied in SFAS 70. The paper shows that SFAS 70 produces measurement errors and creates a foreign currency translation adjustment which does not reflect the effects of exchange rate changes. The conditions defined in the paper also facilitate an evaluation of the relative merits of restate/translate and translate/restate.
We consider a single-server queueing system with Poisson arrivals and general service times. While the server is up, it is subject to breakdowns according to a Poisson process. When the server breaks down, we may either repair the server immediately or postpone the repair until some future point in time. The operating costs to the system include customer holding costs, repair costs and running costs. The objective is to find a corrective maintenance policy which minimizes the long-run average operating costs of the system. The problem is formulated as a semi-Markov decision process.
No one has derived closed-form solutions for consumption with stochastic labor income and constant relative risk aversion utility. A numerical technique is used here to give an accurate approximation to the solution. The resulting consumption function is often dramatically different from the certainty-equivalence solution typically used, in which consumption is proportional to the sum of financial wealth and the present value of expected future income.
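For contrast, the certainty-equivalence benchmark mentioned above is easy to state in code. The sketch below uses one common textbook convention (infinite horizon, beta*(1+r) = 1, income discounted from next period onward), which may differ from the paper's setup; the numbers are illustrative.

```python
def ceq_consumption(wealth, expected_income, r):
    """Certainty-equivalence consumption: the annuity value r/(1+r)
    of financial wealth plus the present value of expected future
    income (income at horizon t discounted by (1+r)^(t+1))."""
    pv_income = sum(y / (1.0 + r) ** (t + 1)
                    for t, y in enumerate(expected_income))
    return (r / (1.0 + r)) * (wealth + pv_income)

# e.g. wealth 100, a single expected income of 105 next period, r = 5%:
# PV of income is 100, so consumption is (0.05/1.05) * 200
```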
This paper is motivated by two facts: the failure of log-linear empirical exchange rate models of the 1970s and the observed variability of risk premiums in the forward market. Rational maximizing models predict that changes in the conditional variances of monetary policy, government spending, and income growth affect risk premiums and induce conditional volatility of exchange rates.
Several recent studies have suggested that empirical rejections of the permanent income/life cycle model might be due to the existence of liquidity constraints. This paper tests the permanent income hypothesis against the alternative hypothesis that consumers optimize subject to a well-specified sequence of borrowing constraints. Implications for consumption in the presence of borrowing constraints are derived and then tested using time-series/cross-section data on families from the Panel Study of Income Dynamics.
Trading on private information creates inefficiencies because there is less than optimal risk sharing. This occurs because the response of market makers to the existence of traders with private information is to reduce the liquidity of the market. The institution of the monopolist specialist may ease this inefficiency somewhat by increasing the liquidity of the market. While competing market makers will expect a zero profit on every trade, the monopolist will average his profits across trades. This implies a more liquid market when there is extensive trading on private information.