Breaking the Cycle: How the News and Markets Created a Negative Feedback Loop in COVID-19
New research from CBS Professor Harry Mamaysky reveals how negativity in the news and markets can escalate a financial crisis.
Adapted from “Global Value Chains in Developing Countries: A Relational Perspective from Coffee and Garments,” by Laura Boudreau of Columbia Business School, Julia Cajal Grossi of the Geneva Graduate Institute, and Rocco Macchiavello of the London School of Economics.
Adapted from “Online Advertising as Passive Search,” by Raluca M. Ursu of New York University Stern School of Business, Andrey Simonov of Columbia Business School, and Eunkyung An of New York University Stern School of Business.
This paper from Columbia Business School, “Meaning of Manual Labor Impedes Consumer Adoption of Autonomous Products,” explores marketing solutions to some consumers’ resistance to autonomous products. The study was co-authored by Emanuel de Bellis of the University of St. Gallen, Gita Johar of Columbia Business School, and Nicola Poletti of Cada.
“The Macroeconomics of Stakeholder Equilibria,” co-authored by John B. Donaldson of Columbia Business School, proposes a model for a purely private, mutually beneficial financial agreement between worker and firm that keeps decision-making in the hands of stockholders while improving the employment contract for employees.
At Columbia Business School, our faculty members are at the forefront of research in their respective fields, offering innovative ideas that directly impact the practice of business today. A quick glance at our publication on faculty research, CBS Insights, will give you a sense of the breadth and immediacy of the insight our professors provide.
As a student at the School, you will benefit directly from this research. In Columbia classrooms, you are at the cutting edge of industry, studying the practices that others will later adopt and teach. As any business leader will tell you, in a competitive environment, being first puts you at a distinct advantage over your peers. Learn economic development from Ray Fisman, the Lambert Family Professor of Social Enterprise and a rising star in the field, or real estate from Chris Mayer, the Paul Milstein Professor of Real Estate, a renowned expert and frequent commentator on complex housing issues. This way, when you complete your degree, you'll be set up to succeed.
Columbia Business School, in conjunction with the Office of the Dean, provides its faculty, PhD students, and other research staff with resources and cutting-edge tools and technology to help push the boundaries of business research.
Specifically, our goal is to help faculty set up and execute their research programs seamlessly, through a broad range of support activities.
All these activities help to facilitate and streamline faculty research, and that of the doctoral students working with them.
Marketers are making increasing use of very brief messages that mention just a brand name or a brand name with a short headline, as in event sponsorship and program endorsements. There has been debate over the effectiveness of these "advertising fragments." This paper introduces an approach for controlled testing of the effects of advertising fragments. Using a reaction-time based procedure, we show that a key effect of advertising fragments is to revive established brand associations, even though these associations are not explicitly communicated.
Effective communication requires that consumers attribute the message content to its intended source. The proposed framework distinguishes four types of source identification processes (cued retrieval, memory-trace refreshment, schematic inferencing, and pure guessing) and delineates their contingencies. Two experiments examine portions of the framework, and experiment 2 introduces a new methodology for decomposing multiple processes. Findings suggest that when cued retrieval fails, consumers try to refresh the original memory trace for the learning episode, a process that is effortful.
US cities capture public benefits from private developers under several bargaining frameworks: exactions, incentive zoning, and public-private developments. These frameworks lie along a continuum of policy-intervention strategies, from passive regulation to active development, and from a quid pro quo posture to an incentive posture to an investment posture. Each strategy defines a public position, a structure and process for negotiation, and parameters for the bargaining process.
This paper develops a model of the breakup or unification of nations. In each nation the decision to separate is taken by majority voting. A basic trade-off between the efficiency gains of unification and the costs in terms of loss of control on political decisions is highlighted. The model emphasizes political conflicts over redistribution policies.
A guiding principle in the efficient estimation of rare-event probabilities by Monte Carlo is that importance sampling based on the change of measure suggested by a large deviations analysis can reduce variance by many orders of magnitude. In a variety of settings, this approach has led to estimators that are optimal in an asymptotic sense. We give examples, however, in which importance sampling estimators based on a large deviations change of measure have provably poor performance.
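A minimal sketch of the kind of estimator at issue, for the textbook case of a Gaussian tail probability (this is a generic illustration of a large-deviations change of measure, with parameters invented here; it is not one of the paper's counterexamples): sampling is shifted so the rare event becomes typical, and each hit is reweighted by the likelihood ratio.

```python
import math
import random

def tilted_tail_estimate(a, n, rng):
    """Importance-sampling estimate of P(Z > a), Z ~ N(0, 1):
    sample from the shifted law N(a, 1), the change of measure suggested
    by a large deviations analysis, and reweight by the likelihood ratio."""
    total = 0.0
    for _ in range(n):
        z = rng.gauss(a, 1.0)                       # draw from N(a, 1)
        if z > a:
            # likelihood ratio dN(0,1)/dN(a,1) = exp(-a*z + a^2/2)
            total += math.exp(-a * z + 0.5 * a * a)
    return total / n

rng = random.Random(0)
a = 4.0
est = tilted_tail_estimate(a, 200_000, rng)
exact = 0.5 * math.erfc(a / math.sqrt(2.0))         # true tail probability
print(est, exact)
```

A naive estimator with the same sample size would see almost no exceedances of `a = 4`, while the tilted estimator achieves a small relative error.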
We present a simple adverse selection model in which a firm finds it advantageous to insure against bad outcomes and thereby improve its credit quality and reduce its cost of capital.
We define sources of heterogeneity in consumer utility functions related to individual differences in response tendencies, drivers of utility, form of the consumer utility function, perceptions of attributes, state dependencies, and stochasticity. A variety of alternative modeling approaches are reviewed that accommodate subsets of these various sources including clusterwise regression, latent structure models, compound distributions, random coefficients models, etc. We conclude by defining a number of promising research areas in this field.
Global brands represent enormous cash-producing assets. To build them requires consistency over time and across country borders. The key for developing consistent strategy across country borders is identifying the global segment and the global position. The key for implementing that strategy is often the global marketing team.
Evidence from the Boston condominium market of the early 1990s reveals that an owner's equity position determines his experience as a seller. An owner of a property with a high loan-to-value ratio sets a higher asking price, has a higher expected time on the market and, if he sells, receives a higher price than an owner with proportionately less debt. The down payment requirement for purchasers, but not incumbent owners, provides a simple explanation for this phenomenon among owner-occupants.
We document extreme bias and dispersion in the small-sample distributions of four standard regression-based tests of the expectations hypothesis of the term structure of interest rates. The biases arise because of the extreme persistence in short interest rates. We derive approximate analytic expressions for the biases under a simple first-order autoregressive data generating process for the short rate.
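The flavor of this small-sample bias can be reproduced in a few lines. The sketch below is a generic illustration with made-up parameters, not the paper's data-generating process or its analytic approximations: when the regressor is a persistent AR(1), the OLS estimate of the autoregressive coefficient is biased downward, with a bias of order 1/T.

```python
import random

def ar1_ols_average(phi, n_obs, n_reps, rng):
    """Simulate a zero-mean AR(1), regress x_t on x_{t-1} by OLS
    (no intercept, for simplicity), and average the estimates."""
    est_sum = 0.0
    for _ in range(n_reps):
        x, xs = 0.0, []
        for _ in range(n_obs):
            x = phi * x + rng.gauss(0.0, 1.0)
            xs.append(x)
        num = sum(xs[t] * xs[t - 1] for t in range(1, n_obs))
        den = sum(xs[t - 1] ** 2 for t in range(1, n_obs))
        est_sum += num / den
    return est_sum / n_reps

rng = random.Random(3)
avg = ar1_ols_average(0.95, 50, 2000, rng)
print(avg)   # noticeably below the true value 0.95
```

With only 50 observations the average estimate falls visibly short of the true coefficient, which is the mechanism the abstract attributes to persistent short rates.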
It is argued that, with respect to efficiency gains, the distinction between reform toward a broad-based income tax and reform toward a broad-based consumption tax is relatively minor. This is not to say that there are not important efficiency and distributional consequences of moving from the current tax system to a broad-based consumption tax. Most such consequences can be traced to reform of the income tax.
A study examines the long-term effects of promotion and advertising on consumers' brand choice behavior. Some 8 1/4 years of panel data for frequently purchased packaged goods are used to address 2 questions: 1. Do consumers' responses to marketing mix variables, such as price, change over a long period of time? 2. If yes, are these changes associated with changes in manufacturers' advertising and retailers' promotional policies? Using these results, implications for manufacturers' pricing, advertising and promotion policies are drawn.
This paper develops a model of growth and income inequalities in the presence of imperfect capital markets, and it analyses the trickle-down effect of capital accumulation. Moral hazard with limited wealth constraints on the part of the borrowers is the source of both capital market imperfections and the emergence of persistent income inequalities. Three main conclusions are obtained from this model. First, when the rate of capital accumulation is sufficiently high, the economy converges to a unique invariant wealth distribution.
We analyze a multistage inventory system with limited production capacity facing stochastic demands. Each node follows a periodic-review base-stock policy for echelon inventory: in each period, each node attempts to produce enough material to restore cumulative downstream inventory to a fixed target level. We develop approximations to the key measures of interest (average inventories, average backorders, and service levels) by simultaneously letting the mean demand approach the system's bottleneck capacity and letting the base-stock level for finished goods increase without bound.
We propose a procedure for representing a time series as the sum of a smoothly varying trend component and a cyclical component. We document the nature of the comovements of the cyclical components of a variety of macroeconomic time series. We find that these comovements are very different from the corresponding comovements of the slowly varying trend components.
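One widely used trend-cycle decomposition of this general kind is a penalized least-squares smoother. The sketch below is an illustration of that idea with an invented series and smoothing parameter; it is not necessarily the exact procedure proposed in the paper.

```python
import numpy as np

def smooth_trend(y, lam):
    """Penalized least-squares trend: minimize
    sum (y_t - tau_t)^2 + lam * sum (second differences of tau)^2.
    Closed form: tau = (I + lam * D'D)^{-1} y, with D the second-difference matrix."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i : i + 3] = (1.0, -2.0, 1.0)
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

t = np.arange(200)
series = 0.01 * t + np.sin(2 * np.pi * t / 20)   # made-up trend plus cycle
trend = smooth_trend(series, lam=1600.0)
cycle = series - trend                            # cyclical component
```

The residual `cycle` sums to zero by construction, and the extracted trend is much smoother than the original series; comovements would then be studied on the `cycle` components of several series.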
The payoff of a barrier option depends on whether or not a specified asset price, index, or rate reaches a specified level during the life of the option. Most models for pricing barrier options assume continuous monitoring of the barrier; under this assumption, the option can often be priced in closed form. Many (if not most) real contracts with barrier provisions specify discrete monitoring instants; there are essentially no formulas for pricing these options, and even numerical pricing is difficult.
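When no formula is available, direct simulation is a natural fallback. A minimal sketch, assuming geometric Brownian motion and illustrative parameters chosen here (not taken from the paper), prices a down-and-out call that is checked against the barrier only at discrete monitoring dates:

```python
import math
import random

def down_and_out_call_mc(s0, k, barrier, r, sigma, T, m, n_paths, rng):
    """Monte Carlo price of a discretely monitored down-and-out call:
    the option is knocked out if the asset is at or below `barrier`
    at any of m equally spaced monitoring dates."""
    dt = T / m
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        s, alive = s0, True
        for _ in range(m):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            if s <= barrier:                 # knocked out at a monitoring date
                alive = False
                break
        if alive:
            total += max(s - k, 0.0)
    return math.exp(-r * T) * total / n_paths

rng = random.Random(42)
price = down_and_out_call_mc(100, 100, 90, 0.05, 0.2, 1.0, 12, 50_000, rng)
print(price)
```

Because knockout can occur, the estimate sits strictly below the corresponding vanilla call value; monitoring only between dates is exactly what makes the discrete contract worth more than its continuously monitored counterpart.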
A firm has inventories of a set of components that are used to produce a set of products. There is a finite horizon over which the firm can sell its products. Demand for each product is a stochastic point process with an intensity that is a function of the vector of prices for the products and the time at which these prices are offered. The problem is to price the finished products so as to maximize total expected revenue over the finite sales horizon. An upper bound on the optimal expected revenue is established by analyzing a deterministic version of the problem.
Using a two-period equilibrium model, this paper studies how manager clients respond to the opinions auditors issue on the basis of their qualifications. It discusses the two-period equilibrium model, offers propositions on auditors' reporting, and relates the analysis to prior studies of audit opinions and market reaction.
We develop bounds and approximations for setting base-stock levels in production-inventory systems with limited production capacity. Our approximations become exact as inventories become critical, meaning either that the target service level is very high or the backorder penalty is very large. Our bounds apply even without this requirement. We consider both single-stage and multi-stage systems.
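For intuition, the single-stage system can be simulated directly. The sketch below is an illustration with invented parameters (exponential demand, capacity 1.2, mean demand 1.0), not the paper's bounds or approximations: a capacity-limited base-stock policy, with the long-run service level estimated from a long run of periods.

```python
import random

def service_level(base_stock, capacity, mean_demand, periods, rng):
    """Simulate a single-stage capacitated base-stock system; return the
    fraction of periods ending with no backorders (a service level)."""
    inv = base_stock                    # net inventory; may go negative
    ok = 0
    for _ in range(periods):
        # produce toward the base-stock target, limited by capacity
        inv += min(capacity, max(base_stock - inv, 0.0))
        inv -= rng.expovariate(1.0 / mean_demand)   # random demand
        if inv >= 0:
            ok += 1
    return ok / periods

rng = random.Random(1)
low = service_level(5.0, 1.2, 1.0, 50_000, rng)
high = service_level(20.0, 1.2, 1.0, 50_000, rng)
print(low, high)
```

Raising the base-stock level raises the service level, and the regime the abstract describes as "inventories become critical" corresponds to pushing the target service level toward one.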
We consider the problem of estimating a density function from a sequence of independent and identically distributed observations x_i taking values in R^d. The estimation procedure constructs a convex mixture of "basis" densities and estimates the parameters using the maximum likelihood method.
As a tax base, 'consumption' is sometimes argued to be less fair than 'income' because the benefits of not taxing capital income accrue to high-income households. We argue that, despite the common perception that consumption taxation eliminates all taxes on capital income, consumption and income taxes actually treat much of what is commonly called capital income similarly. Indeed, relative to an income tax, a consumption tax exempts only the opportunity cost of capital from tax. In contrast to a pure income tax, a consumption tax replaces capital depreciation with capital expensing.
The article develops a theoretical framework that explains firms' reactions to accounting standards developed by the U.S. Financial Accounting Standards Board under its extended adoption policy. The proposed theory highlights the differences between recognized and disclosed accounting information and provides a link between a firm's choice of whether to recognize or disclose information under new accounting standards, and stock price behavior around the adoption announcement.
Understanding volatility in emerging capital markets is important for determining the cost of capital and for evaluating direct investment and asset allocation decisions. We provide an approach that allows the relative importance of world and local information to change through time in both the expected returns and conditional variance processes. Our time-series and cross-sectional models analyze the reasons that volatility is different across emerging markets, particularly with respect to the timing of capital market reforms.
A methodology to price American options with finitely many exercise opportunities simulates the evolution of underlying assets via random trees that branch at each of the possible early exercise dates. From these trees, two consistent price estimates are obtained, one biased high and one biased low. These two estimates can be combined to provide a valid, though conservative confidence interval for the option price.
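A compact sketch of the random-tree idea, for a Bermudan put under geometric Brownian motion. The parameters, the branching factor, and the specific leave-one-out rule in the low estimator below are illustrative assumptions for this sketch, not the authors' exact construction; exercise is allowed at every date, including time 0.

```python
import math
import random

def node_estimates(s, drift, vol, disc, strike, steps_left, b, rng):
    """Return (high, low) price estimates at a random-tree node for a
    Bermudan put, in the spirit of the two biased estimators above."""
    h = max(strike - s, 0.0)                    # immediate exercise value
    if steps_left == 0:
        return h, h
    kids = [node_estimates(s * math.exp(drift + vol * rng.gauss(0.0, 1.0)),
                           drift, vol, disc, strike, steps_left - 1, b, rng)
            for _ in range(b)]
    # high estimator: max of exercise and average continuation (biased high)
    high = max(h, disc * sum(k[0] for k in kids) / b)
    # low estimator: decide exercise-vs-continue for branch i using the
    # other b-1 branches, then value that decision with branch i itself
    low = 0.0
    for i in range(b):
        others = disc * sum(kids[j][0] for j in range(b) if j != i) / (b - 1)
        low += h if h >= others else disc * kids[i][1]
    return high, low / b

# illustrative setup: 3 exercise dates, branching factor 8
r, sigma, T, dates, b, strike, s0 = 0.05, 0.2, 1.0, 3, 8, 110.0, 100.0
dt = T / dates
drift, vol = (r - 0.5 * sigma ** 2) * dt, sigma * math.sqrt(dt)
disc = math.exp(-r * dt)
rng = random.Random(7)
pairs = [node_estimates(s0, drift, vol, disc, strike, dates, b, rng)
         for _ in range(100)]                   # average over 100 trees
high_avg = sum(p[0] for p in pairs) / len(pairs)
low_avg = sum(p[1] for p in pairs) / len(pairs)
print(low_avg, high_avg)
```

Averaged over independent trees, the two estimates bracket the true price from above and below, which is what lets them be combined into a conservative confidence interval.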
This paper presents an applied methodology to assist managers in strategically setting prices and allocating resources over the product, brand, or adoption (diffusion) life cycle. While substantial theoretical work has been done in this area in the management science and operations research disciplines, approaches that can be implemented as managerial tools are generally lacking.
A report examines whether gender differences in cognition during depressed moods exist even when males and females are not depressed. Results reveal that females' thoughts are more internally focused than males'.
We consider the problem of scheduling N jobs on M parallel machines so as to minimize the maximum earliness or tardiness cost incurred for each of the jobs. Earliness and tardiness costs are given by general (but job-independent) functions of the amount of time a job is completed prior to or after a common due date. We show that in problems with a nonrestrictive due date, the problem decomposes into two parts. Each of the M longest jobs is assigned to a different machine, and all other jobs are assigned to the machines so as to minimize their makespan.
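The decomposition described above can be sketched as follows. The use of the longest-processing-time (LPT) rule for the makespan subproblem is an illustrative choice made here, not necessarily the paper's exact procedure:

```python
import heapq

def schedule_common_due_date(durations, m):
    """Give each of the m longest jobs its own machine, then spread the
    remaining jobs across machines via the LPT makespan heuristic."""
    jobs = sorted(durations, reverse=True)
    machines = [[j] for j in jobs[:m]]          # one longest job per machine
    loads = [(0.0, i) for i in range(m)]        # load of the *other* jobs only
    heapq.heapify(loads)
    for j in jobs[m:]:
        load, i = heapq.heappop(loads)          # least-loaded machine first
        machines[i].append(j)
        heapq.heappush(loads, (load + j, i))
    return machines

schedule = schedule_common_due_date([9, 8, 7, 5, 4, 3, 2, 1], 3)
print(schedule)
```

Every job appears exactly once, and each machine's first job is one of the M longest, mirroring the two-part structure the abstract describes.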
As the only practical way to deal with most path-dependent instruments, Monte Carlo estimation is now one of the workhorses of modern derivatives valuation. It has the advantage of being relatively easy to implement in its basic form, and, given enough computer resources, it will converge asymptotically to the correct answer. Yet, once these general principles are acknowledged, one faces the fact that many problems have such high dimension that the basic Monte Carlo technique can require an enormous number of simulations before convergence to a reasonably accurate answer is achieved.
Selling information that is later used in decision making constitutes an increasingly important business in modern economies (Jensen 1991). Information is sold under a large variety of forms: industry reports, consulting services, database access, and/or professional opinions given by medical, engineering, accounting/financial, and legal professionals, among others.
The Monte Carlo approach has proved to be a valuable and flexible computational tool in modern finance. This paper discusses some of the recent applications of the Monte Carlo method to security pricing problems, with emphasis on improvements in efficiency. We first review some variance reduction methods that have proved useful in finance. Then we describe the use of deterministic low-discrepancy sequences, also known as quasi-Monte Carlo methods, for the valuation of complex derivative securities.
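As a concrete instance of one of the simplest variance reduction techniques in this family, the sketch below prices a European call with antithetic variates and compares the estimate to the Black-Scholes value. Parameters are illustrative, chosen here rather than drawn from the paper.

```python
import math
import random

def bs_call(s0, k, r, sigma, T):
    """Black-Scholes European call value, for reference."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * math.erfc(-x / math.sqrt(2.0))  # standard normal CDF
    return s0 * N(d1) - k * math.exp(-r * T) * N(d2)

def mc_call_antithetic(s0, k, r, sigma, T, n, rng):
    """Monte Carlo with antithetic variates: each normal draw z is paired
    with -z, which reduces variance for monotone payoffs."""
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        p1 = max(s0 * math.exp(drift + vol * z) - k, 0.0)
        p2 = max(s0 * math.exp(drift - vol * z) - k, 0.0)   # antithetic pair
        total += 0.5 * (p1 + p2)
    return math.exp(-r * T) * total / n

rng = random.Random(0)
est = mc_call_antithetic(100, 100, 0.05, 0.2, 1.0, 100_000, rng)
print(est, bs_call(100, 100, 0.05, 0.2, 1.0))
```

The paired draws cost almost nothing extra yet noticeably tighten the estimate around the closed-form value, which is the kind of efficiency gain the survey is about.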
We consider a general class of queueing systems with multiple job types and a flexible service facility. The arrival times and sizes of incoming jobs are random, and correlations among the sizes of arriving job types are allowed. By choosing among a finite set of configurations, the facility can dynamically control the rates at which it serves the various job types. We define system work at any given time as the minimum time required to process all jobs currently in the backlog.
In this article I begin by discussing the rationale for mandatory convertibles from the point of view of issuers as well as investors. In general, convertible securities reduce the costs of "information asymmetry" that can make equity offerings especially expensive for some smaller, high-growth companies (or any firm with little additional debt capacity where management is convinced its shares are undervalued).
We develop a simulation algorithm for estimating the prices of American-style securities, i.e., securities with opportunities for early exercise. Our algorithm provides both point estimates and error bounds for the true security price. It generates two estimates, one biased high and one biased low, both asymptotically unbiased and converging to the true price. Combining the two estimators yields a confidence interval for the true price.
We propose and analyze a heuristic that uses region partitioning and an aggregation scheme for customer attributes (load size, time windows, etc.) to create a finite number of customer types. A math program is solved based on these aggregated customer types to generate a feasible solution to the original problem. The problem class we address is quite general and defined by a number of general consistency properties.
We give a unified probabilistic analysis for a general class of bin packing problems by directly analyzing corresponding mathematical programs. In this general class of packing problems, objects are described by a given number of attribute values. (Some attributes may be discrete; others may be continuous.) Bins are sets of objects, and the collection of feasible bins is merely required to satisfy some general consistency properties.
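For the classical one-dimensional special case of this problem class, a standard heuristic gives a feel for what "feasible bins" means in practice. The first-fit-decreasing routine below is shown as an illustration of that special case, not the paper's mathematical-programming analysis:

```python
def first_fit_decreasing(sizes, capacity=1.0):
    """Classical 1-D bin packing by first-fit decreasing: sort items
    largest-first, place each into the first bin with enough room,
    opening a new bin when none fits."""
    bins = []                                  # each bin: [remaining, items]
    for s in sorted(sizes, reverse=True):
        for b in bins:
            if b[0] >= s:                      # first bin with enough room
                b[0] -= s
                b[1].append(s)
                break
        else:
            bins.append([capacity - s, [s]])   # open a new bin
    return [items for _, items in bins]

packing = first_fit_decreasing([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5])
print(len(packing), packing)
```

Each resulting bin respects the capacity constraint, the simplest example of the "general consistency properties" that define feasible bins in the broader class.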