
Endogenous Expectations

ABSTRACT: I selectively survey the use of expectations in accounting research. While expectations are central to modeling work and essential in empirical documentation, we tend to rely on largely exogenous expectations, as opposed to closing the analysis with aggressive identification of information sources and an explicit equilibrium argument.

I. INTRODUCTION

Expectations are the centerpiece of accrual accounting, and expectations about these accruals and their use are the centerpiece of accounting research. Yet in our teaching and in our research we typically employ reduced-form specifications of the accounting process, coupled with transparent, largely exogenous expectation structures. The ways in which we estimate "abnormal accruals" or make use of analysts' forecasts are cases in point, as are value relevance, audit judgment, compensation, and earnings response studies, and FASB deliberations.

This reliance on largely exogenous expectation structures, I think, needlessly limits the depth and boundaries of our teaching and research. My purpose here is to document this claim, and to argue for a more inclusive approach to our scholarship, one that emphasizes "micro foundations" and an equilibrium view of behavior.

Accrual accounting, of course, is a formalized anticipatory statement of stocks and flows. For example, the noncash asset balances and the liability balances on a balance sheet, the various stocks of accruals, represent stocks of anticipated benefits and payments, just as the noncash components of income, the flows of various accruals, represent flows of anticipated benefits and resource consumptions.

These accruals, however, are surely not happenstance. Rather, they are estimates--estimates that can be interpreted as expectations as well as expectations that stem from choices: choice of accounting method as well as the decision to engage in the underlying transactions themselves. Consider, for example, the use of "mark to model" techniques to estimate the fair value of an unusual financial instrument. The firm chooses to employ the instrument, and likewise enjoys considerable discretion in its temporal measurement. Relatedly, the firm is free to adjust its transactions in anticipation of a regulatory event, such as expanded fair value reporting of financial instruments or the demise of pooling. In parallel fashion, the manner in which accounting measures are used is itself a choice: a choice to buy or sell, a choice to go forward with the new product proposal, or a choice to allow management's self-report to stand.

This suggests our focus on understanding the nature and use of accounting measures should, ideally, be based on understanding how these choices are made, including the fact that coordinating forces, such as organizational architecture, market clearing, regulation, and education, are at work. (1) Yet, as noted, we tend to rely on reduced form, somewhat ad hoc techniques instead of more directly on how the choices themselves are made. In an audit judgment experiment, for example, we typically rely on psychology, to the exclusion of organizational and client interaction issues; in an earnings response experiment we typically rely on the difference between reported and analysts' forecasted earnings in identifying the substance of the information release, to the exclusion of other information or the organizational context in which the forecasts are produced and archived.

In emphasizing the underlying choices, my instinct is to rely on the context, forces, and behaviors involved, to emphasize the micro foundations or calculus of the choice setting, so to speak. Coupled with an equilibrium argument, we then have a picture of endogenous expectations and choices. (2)

I begin with a selective survey of the use of expectations in accounting practice, followed by a brief illustration of the expectations theme. From there I examine selected examples of accounting scholarship that highlight various expectations: earnings management studies, use of analyst forecasts, regulation assessment studies, audit judgment studies, compensation studies, cost measurement studies, and governance studies. The recurring theme is a consistent pattern of reliance on exogenous expectations, of underinvestment in understanding and exploiting the underlying choices themselves; this leads to the conjecture that if we broaden our analysis, broaden it to the point that the underlying expectations are endogenous, we will then shed new light on these various phenomena. I emphasize that my point is not a criticism of where we are, but an attempt to capitalize on what we have accomplished and move to a broadened, more integrated view of our subject matter, a task that depends on theoretical work, empirical work, and their interaction.

II. REFLECTIONS ON CURRENT PRACTICE

As a prelude to my main theme, it is important to lay out several patterns in the practice of accounting. To be sure, expectations are present, indeed widespread, in current practice. Consider the going concern concept, the auditor's reliance on judgment-aided sampling, asset and income adjustments in the typical EVA® implementation, the Bureau of Economic Analysis's (BEA) income estimates, Standard & Poor's core earnings project, and the ratio of cash to noncash assets for your favorite firm.

Here we see, understandably, an imprecise notion of "looking forward" or "anticipating," as with the classic definition of an asset in terms of service potential. For example, the FASB's (1985) Concepts Statement No. 6, paragraph 28 states:
   The common characteristic possessed by all assets (economic
   resources) is "service potential" or "future economic benefit," the
   scarce capacity to provide services or benefits to the entities that
   use them. In a business enterprise, that service potential or future
   economic benefit eventually results in net cash inflows to the
   enterprise. In a not-for-profit organization, that service potential
   or future economic benefit is used to provide desired or needed
   goods or services to beneficiaries or other constituents, which
   may or may not directly result in net cash inflows to the
   organization ... The relationship between service potential or
   future economic benefit of its assets and net cash inflows
   to an entity is often indirect in both business enterprises and
   not-for-profit organizations.


Nonetheless, despite the predilection for expectations as we in the academy think of them, there are distinct patterns of truncation, conservatism, and subjectivity.

In particular, accounting is an information channel that uses financial measures to convey information. Some underlying event structure or partition is to be reported (this is the recognition side of the exercise), and whatever is reported is conveyed via the accrual process (the scaling or measurement side of the exercise). To a theorist, the substantive issue is the recognition side, not the scaling side. For example, lower of cost or market ignores revaluation events unless they are bad news, while fair value recognizes both good and bad news revaluation events. Likewise, the so-called mixed attribute model embraced by the FASB is, to a theorist, a recognition apparatus that is more aggressive than historical cost and less aggressive than full-blown fair value.

Truncation, now, refers to the deliberate exclusion of particular types of events. The going-concern concept calls for estimation conditional on the assumption, the "event," that the entity will not fail in the foreseeable future. SFAS No. 5 calls for recognition of probable liabilities for which a reasonable estimate can be developed. Unlikely or difficult-to-estimate items are simply excluded.

This exclusion, or truncation, phenomenon is in fact commonplace. SFAS No. 2 (FASB 1980) excludes the benefit side of an entire class of investments, just as pooling excludes another class. An implicit disposal obligation is not accrued, absent a so-called "obligating event." (3) For that matter, a bond issued at a discount is accrued with an interest calculation that ignores interest rate movements. SPEs are designed to achieve off-balance-sheet status, another form of truncation; then there are the larger issues of historical cost per se and recognizing growth prospects.
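
The bond point can be made concrete. The following is a minimal sketch (my own illustration, with hypothetical figures: a two-period bond, face value 1,000, a 4 percent coupon, issued to yield 6 percent) of effective-interest accrual; the interest calculation stays locked to the yield at issuance, so subsequent market-rate movements are truncated from the measurement:

```python
# Hypothetical illustration (figures are my own, not from the text):
# effective-interest accrual for a bond issued at a discount. Interest
# expense is computed at the yield prevailing at issuance; later
# market-rate movements are simply ignored (truncated).
face, coupon_rate, issue_yield, periods = 1000.0, 0.04, 0.06, 2
coupon = face * coupon_rate

# Issue price: present value of coupons and face at the issue yield.
price = sum(coupon / (1 + issue_yield) ** t for t in range(1, periods + 1))
price += face / (1 + issue_yield) ** periods

# Amortization: each period's interest uses the locked-in issue yield.
carrying = price
for t in range(1, periods + 1):
    interest = carrying * issue_yield
    carrying += interest - coupon
    print(f"period {t}: interest {interest:.2f}, carrying value {carrying:.2f}")
```

Whatever happens to market rates after issuance, the schedule never revisits the 6 percent yield; that locked-in expectation is the truncation.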

Not to be outdone, in the managerial sphere we are familiar with the lack of opportunity cost measurement, another form of exclusion or truncation. Likewise, an ABC implementation, with its inherent linearity, is based on an expectation of constant returns, to the exclusion of, say, scale or scope effects.

For that matter, the audit risk model focuses on components of an audit, and forces a particular expectation structure onto the calculation. Portfolio effects are absent, be they explicit dependence inherent in the double entry system or, say, covariance with the audit firm's stock of engagements. Moreover, a central feature of the attestation function, as with any control device, is the threat, the expectation; off-equilibrium controls are, by definition, truncated from the potential set of observations.

Fair value is, I suspect, widely viewed as a less truncated approach to use of expectations in the accrual process, in that it forces the notion of arm's length, mutually acceptable trade, into the event recognition process and thus brings events pertinent to temporal valuation into the reporting domain. Yet highly active versus thin markets are not part of the calculus, private seller information, as in the classic winner's curse, is hardly part of the calculus, and strategic issues of a deeper nature are not part of the rhetoric. (4)

Conservatism, meaning an inherently downward-biased estimator, is also at work. (The typical market-to-book ratio being well above unity is illustrative.) Though truncation surely produces conservatism in such cases as delayed recognition of a new product or growth in general, conservatism is not simply a manifestation of truncation. For example, other things equal, the going-concern concept implies an upward-biased estimate of the firm's net asset stock. Similarly, an audit policy that more aggressively challenges good than bad news will provide a conservative appearance. (5)

Moreover, many, if not most, of the accrual expectations are inherently subjective. Subjectivity per se is not an issue (Savage 1954). Rather, subjectivity implies an element of reporting discretion. Regulations, in turn, promote various truncations as well as patches of conservatism and, in the process, facilitate the audit function.

The larger point is accruals, viewed as expectations, are central to the practice of accounting, and they are far from "mechanical." Truncation is widespread, as is conservatism. Subjectivity is an essential part of the landscape as well. And the recorded accruals reflect endogenous transactions, subjective assessments and choice, and, in a larger sense, endogenous regulations themselves.

From a social science standpoint, we observe the accruals, try to divine the underlying choices, and, in the process, learn something about institutions, such as exchange markets, regulations, and managerial behavior. Yet in studying these expectations, we ourselves rely on expectations; it is the nature of our expectations that concerns me. In particular, I think we excessively treat expectations as exogenous when we try to sift through observables. Practice exhibits choices, raw material for our work. But in understanding this raw material I think an approach more grounded in the underlying choices, in the micro foundations so to speak, is appealing. After all, it would allow us to treat the choices that generate these raw data as flowing from endogenous expectations. (6)

III. MOTIVATING STRUCTURE

To put some structure on the issue, I begin with a notation-laden description of a generic organization or process. Let the random variable $\tilde{x}_t$ denote an output or performance variable of interest, such as return on a security, reported income for some firm, an earnings forecast, or performance on an audit task. In turn, think of this variable as depending on (1) a history of current and prior actions, such as production choices or audit team assignments, denoted recursively by $\bar{a}_t = (a_t, \bar{a}_{t-1})$; (2) a history, or concatenation, of common knowledge information reports and other observables, denoted by information set $\Omega_{t-1}$ and including such things as macro variables, market statistics, and financial reports; (3) a current stock of information that is privately known by the actor or organization, denoted $\Omega^p_{t-1}$; and (4) the ubiquitous random shock, $\tilde{\varepsilon}_t$. Using function $F_t$ to tie it all together, we have:

(1) $\tilde{x}_t = F_t(\bar{a}_t, \Omega_{t-1}, \Omega^p_{t-1}, \tilde{\varepsilon}_t)$.

Next we focus on the stochastic nature of performance variable $\tilde{x}_t$. From an outsider's perspective, we know, in principle, the public information history, $\Omega_{t-1}$, but observe neither the current stock of private information nor the actions per se. So, integrating out the unobservables, we envision this organization or process as producing a random performance variable with a probability distribution conditioned on $\Omega_{t-1}$, say $G(\tilde{x}_t \mid \Omega_{t-1})$, and with mean denoted $E[\tilde{x}_t \mid \Omega_{t-1}]$.

Now, in the tradition of rational expectations (here I follow Mishkin [1983] and Sheffrin [1996]), suppose external participants, say market traders, regulators, or organization participants, subjectively assess the stochastic nature of this same performance variable (say, via a distribution function denoted $G^m(\tilde{x}_t \mid \Omega_{t-1})$). Also let $E^m[\tilde{x}_t \mid \Omega_{t-1}]$ denote the mean implied by this subjective assessment. This assessment is said to be consistent or rational if it agrees with the underlying reality, at least in terms of the first moment:

(2) $E^m[\tilde{x}_t \mid \Omega_{t-1}] = E[\tilde{x}_t \mid \Omega_{t-1}]$.

For example, does the auditor's behavior comport with the client's anticipation of that behavior; does the firm's governance structure work as anticipated; do market participants, on average, treat the investment opportunities as efficiently priced; and are the data consistent with this conjecture?

Stylization

In turn, giving this some operational content, we resort to an underlying model of the organization or process, such as an organizational equilibrium model (e.g., an array of agents operating under incentive arrangements) or a market equilibrium model (e.g., a factor-pricing model or oligopolistic competition). Two steps are involved. First we limit the information to some contraction of the public information, say $\hat{\Omega}_{t-1}$. Second, we invoke the aforementioned model, denoted $M$, and write:

(3) $E^m[\tilde{x}_t \mid \Omega_{t-1}] = M(\hat{\Omega}_{t-1})$.

Importantly, now, model $M(\cdot)$ ideally captures the choice behavior that governs the process in question. This is where micro foundations enter: the process model reflects the endogenous nature of the process when we build upon optimizing (or satisficing) behavior by the essential actors. For example, in a rational expectations setting, this model would reflect consistent optimizing behavior, that is, consistent consumption and investment choices by the individuals, coupled with market clearing. Conversely, in an organization model we might view the actors as consistently optimizing, but within the organizational architecture and incentive structures, implicit and explicit, imposed by an optimizing designer. In a financial reporting context, the model would reflect transaction and accrual choices that, presumably, aggregate to exhibit the noted truncation and conservatism themes as well as the subjectivity theme. (7)

In each instance we view the model, $M(\hat{\Omega}_{t-1})$, as reflecting the underlying choices, and doing so by being based on micro foundations of the choices in question. From a researcher's perspective, I think of going down this road, of carefully crafting the expectation construction in Equation (3) and basing it on micro foundations, as more endogenous than many of the constructions with which we work. (8) In addition, we often approximate $M(\cdot)$ with a parametric expression and then statistically estimate the parameters. More about this later.

Illustration

To illustrate, and tie this more closely to our world of accruals, consider a setting where a firm is simply a fair game, iid dividend machine, "paying out" dividend $\tilde{d}_t$ at time t (so we abstract from productive choices). The dividend is a zero mean normally distributed random variable, with variance $\sigma^2$. This provides an admittedly simplistic setting where the underlying stochastic process is stationary, and past dividends are uninformative about future dividends.

The firm's dividend policy is also convenient: it maintains a zero cash balance and thus "pays out" its net cash flow each and every period.

Information is also present. In addition to the time t dividend, a noisy signal of the time t+1 dividend is observed privately by the firm at time t: $\tilde{y}^p_t = \tilde{d}_{t+1} + \tilde{\varepsilon}_t$, where $\tilde{\varepsilon}_t$ is a zero mean normal random variable, again with variance $\sigma^2$, and (for analytic convenience) independent of any period's dividend and of its counterpart in any other period. The stark diagonal covariance matrix and consistent use of the single parameter, $\sigma^2$, simplify the presentation. For example, this implies $E[\tilde{d}_{t+1} \mid y^p_t] = .5 y^p_t$.
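
The stated conditional expectation is easily checked by simulation. The following is my own minimal sketch (not from the text): with dividend and noise both iid normal with variance $\sigma^2$, regressing the next-period dividend on the signal recovers a slope near .5:

```python
# Monte Carlo check (my own sketch) of E[d_{t+1} | y^p_t] = 0.5 * y^p_t
# when the dividend and the signal noise are iid N(0, sigma^2).
import random

random.seed(0)
sigma, n = 1.0, 200_000
d_next = [random.gauss(0.0, sigma) for _ in range(n)]   # next-period dividend
noise = [random.gauss(0.0, sigma) for _ in range(n)]    # signal noise
y_p = [d + e for d, e in zip(d_next, noise)]            # private signal

# OLS slope through the origin: cov(d, y) / var(y), both means are zero.
slope = sum(d * y for d, y in zip(d_next, y_p)) / sum(y * y for y in y_p)
print(round(slope, 3))   # near 0.5
```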

The claim to the dividend stream is valued in a competitive market, under conditions of risk neutrality and a zero interest rate. (Market clearing, under conditions of risk neutrality, is thus assumed.) This pricing is informed by information supplied by the firm, but no other information source is present. (9) In addition to a dividend announcement and subsequent delivery, the firm announces some compilation of what it knows, what it has observed, as of time t. Denote its report at time t, which is conveyed just prior to the dividend payment, by the ordered pair $(z_t, d_t)$, where report $z_t$ is given by:

(4) $z_t = f_t(y^p_t, \bar{y}^p_{t-1}, d_t, \bar{d}_{t-1})$.

$\bar{y}^p_{t-1}$ ($\bar{d}_{t-1}$) denotes the history of private information (dividend) realizations, and the sequence of functions, $\{f_t\}$, is common knowledge.

The idea, then, is the firm conveys a financial report consisting, basically, of an accrual and a cash flow measure; any information beyond that conveyed by cash flow itself is conveyed by the accrual. The time t financial report is received just before the time t price is determined. The model is heavily streamlined so current period cash flow, $d_t$, tells us nothing about future cash flows, while the accrual, $z_t$, has the potential to reveal some or all of what the firm knows privately, to tell us something about next period's cash flow, though nothing beyond that.

Let $P_t$ denote the ex-dividend market price at time t, following observation of report $(z_t, d_t)$. (So the associated cum-dividend price would be $P_t + d_t$.) The public information at this point is the current report, the history of past reports, and the history of past prices:

$\Omega_t = (z_t, d_t, \bar{z}_{t-1}, \bar{d}_{t-1}, \bar{P}_{t-1})$.

Our pricing assumption is that $P_t = E[\tilde{d}_{t+1} \mid \Omega_t] = M(\Omega_t)$. (Remember, we have crafted the story so $E[\tilde{d}_{t+n} \mid \Omega_t] = 0$ for $n > 1$, and for any feasible information history.)

Nevertheless, the data that would be observable by a social scientist depend critically on what information is made available to the market. For example, if the firm never revealed any of its private information, if truncation were rampant, then the accruals would be pure noise, and we would have $P_t = 0$. Conversely, if the firm reveals fully its private information at each time t, if truncation were absent, then the ex-dividend price at time t would be $P_t = .5 y^p_t$, as the available information now speaks nontrivially to the one-step-ahead dividend. (10)

Yet, even with truncation at bay, statistically relating the price and reporting processes rests on our model of that reporting process. To explore this, suppose the firm reveals fully its private information by simply accruing its best estimate of next period's cash flow. This means the time t accrual on the firm's balance sheet will be $E[\tilde{d}_{t+1} \mid \Omega_t] = .5 y^p_t = z_t$ and its income for period t will be reported as $I_t = d_t + .5 y^p_t - z_{t-1} = d_t + .5(y^p_t - y^p_{t-1})$. Connecting the market's valuation with the accounting report, we have the ex-dividend price equal to the stock of reported accruals: $P_t = z_t$. (A fair value ideologue would be happy!) It also follows that the "accrual response coefficient" is unity: $\partial P_t / \partial z_t = 1$. (11)

Alternatively, suppose the firm smoothes its accruals and reports $z_t = .5(.5 y^p_t + .5 y^p_{t-1})$, in effect reporting period t income of $I_t = d_t + .25(y^p_t + y^p_{t-1}) - .25(y^p_{t-1} + y^p_{t-2}) = d_t + .25(y^p_t - y^p_{t-2})$. This is informationally equivalent to reporting $z_t = .5 y^p_t$ via the initial accrual process, as the accrual is readily decoded to highlight the essential $y^p_t$ signal and corresponding value estimate of $.5 y^p_t$. But the information now arrives in a differently scaled format and, as a result, the ex-dividend price, expressed as a function of the public observables, is $P_t = E[\tilde{d}_{t+1} \mid \Omega_t] = 2 z_t - P_{t-1}$. Moreover, with the information essential to estimating next period's dividend now arriving in such scaled fashion, we have an "accrual response coefficient" of $\partial P_t / \partial z_t = 2$. (12)

Thus, the smoothed accruals story is designed so the accrual stock is autoregressive but can be econometrically identified via a lagged price variable. (For that matter, regressing ex-dividend price on the accrual stock in the smoothed story results in a biased estimate because the noted lagged price is omitted from the estimation.)
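
Both reporting policies, and the omitted-variable point, can be illustrated with a small simulation (my own sketch, with $\sigma = 1$): under full revelation the accrual response coefficient is 1; under smoothing, a univariate regression of price on the accrual is misspecified (here it lands well below 2), while adding back the lagged price recovers the coefficient of 2:

```python
# My own simulation sketch of the two reporting policies. Under full
# revelation z_t = 0.5*y^p_t and the price equals the accrual. Under
# smoothing z_t = 0.25*(y^p_t + y^p_{t-1}), and the price satisfies
# P_t = 2*z_t - P_{t-1}, so regressing P_t on z_t alone is misspecified.
import random

random.seed(1)
T = 100_000
# Private signals y^p_t (dividend plus noise, each N(0, 1)).
y = [random.gauss(0.0, 1.0) + random.gauss(0.0, 1.0) for _ in range(T)]

P = [0.5 * y_t for y_t in y]                    # price reveals 0.5*y^p_t
z_full = [0.5 * y_t for y_t in y]               # full-revelation accrual
z_smooth = [0.25 * (y[t] + y[t - 1]) for t in range(1, T)]

def slope(dep, reg):
    """OLS slope through the origin (all variables are mean zero)."""
    return sum(a * b for a, b in zip(dep, reg)) / sum(b * b for b in reg)

s_full = slope(P, z_full)              # response coefficient of 1
s_smooth = slope(P[1:], z_smooth)      # misspecified: lagged price omitted
lagged = [P[t] + P[t - 1] for t in range(1, T)]  # P_t + P_{t-1} = 2*z_t
s_lag = slope(lagged, z_smooth)        # recovers the coefficient of 2
print(round(s_full, 2), round(s_smooth, 2), round(s_lag, 2))
```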

Other than reminding us that what we look for in and how we interpret the data depend on our conception of the underlying process, the illustration is largely vacuous, simply because it is essentially an exogenous story. Moving beyond the apocryphal requires we introduce the underlying choices and a model of those choices. This becomes transparent as we turn to several variations on this theme of identifying underlying choices, all connected to our research activities.

IV. VARIATIONS ON A THEME

To be sure, our research, as well as our teaching, is necessarily focused and structured. Yet when we focus narrowly we replace choices with exogenous specifications and controls. Expanding the focus brings some of these choices into play, so to speak, and calls for explicit, endogenous accommodation. Striking an appropriate balance on this score is an ever-present tension, a tension I fear we have been avoiding. (13)

Earnings Management

Earnings management is a case in point--one, in fact, that cuts across financial, managerial, audit, tax, and regulatory concerns (e.g., Hribar and Collins 2002; Lev 2003; Nelson et al. 2002; Burgstahler and Dichev 1997). Here the subjectivity theme is emphasized, and is coupled with the concern that it opens the door to opportunism in the reporting process. A common experimental technique is to estimate accruals as a linear function of change in revenue and property, plant, and equipment, and then use the residual as an estimate of abnormal or discretionary accruals. These discretionary accruals, then, are tracked as we pass through a suspicious event such as a proxy contest, auditor change, period of poor performance, or management change. Alternatively, we examine the overall distribution and search for unusual patterns, say, around zero growth in earnings.
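
The residual technique just described can be sketched on synthetic data (my own illustration; the variable names and coefficients are hypothetical, and this is a stylized stand-in for, not the exact, Jones-type specification):

```python
# Stylized sketch (synthetic data, hypothetical names and coefficients)
# of the residual approach: regress total accruals on the change in
# revenue and on PP&E, and treat the residual as "discretionary" accruals.
import random

random.seed(5)
n = 5_000
d_rev = [random.gauss(0.0, 1.0) for _ in range(n)]   # change in revenue
ppe = [random.gauss(0.0, 1.0) for _ in range(n)]     # PP&E (scaled)
disc = [random.gauss(0.0, 0.1) for _ in range(n)]    # "discretionary" part
accr = [0.3 * r - 0.2 * p + e for r, p, e in zip(d_rev, ppe, disc)]

# Two-regressor OLS via the normal equations (variables are mean zero).
sxx = sum(r * r for r in d_rev)
syy = sum(p * p for p in ppe)
sxy = sum(r * p for r, p in zip(d_rev, ppe))
sxa = sum(r * a for r, a in zip(d_rev, accr))
sya = sum(p * a for p, a in zip(ppe, accr))
det = sxx * syy - sxy * sxy
b1 = (syy * sxa - sxy * sya) / det      # loading on change in revenue
b2 = (sxx * sya - sxy * sxa) / det      # loading on PP&E
resid = [a - b1 * r - b2 * p for a, r, p in zip(accr, d_rev, ppe)]
print(round(b1, 2), round(b2, 2))       # near the planted 0.3 and -0.2
```

Note the sketch builds in exactly what is questioned below: the "discretionary" component is, by construction, independent of the regressors and of any conditioning event.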

Now impose the conceptual structure in Equations (1) and (3), with its emphasis on underlying choices. Think of $\tilde{x}_t$ as time t accruals and the process model in Equation (1) as specifying the accrual reporting process. Reported accruals naturally depend on the available information and on management's choices. They also depend on the auditor's choices, in a larger sense. Earnings management, in turn, is some type of "intervention" in or garbling of this accrual reporting process, a garbling that is either explicitly motivated or implicitly tolerated as annoying opportunism associated with designed decision rights and incentives.

To structure this, we append some model of equilibrium behavior. This is the role of Equation (3), where we model expected accruals as some equilibrium manifestation of observables, $M(\hat{\Omega}_{t-1})$. For example, suppose we assume that the accrual reporting process is driven by efficient contracting between management and the shareholders. If earnings management, in the sense of garbling what is known inside the firm, is equilibrium behavior and a first-order concern, it then follows that the Revelation Principle does not hold and that the underlying garbling is both orchestrated by the contracting arrangement and far from transparent (Arya et al. 1998).

This implies detecting earnings management is unlikely to be straightforward, even if it is feasible. This also suggests that distribution tests, with their more modest use of imposed expectations, are better adapted to the phenomena in question than reliance on explicit estimation of discretionary accruals. (Even so, identifying the appropriate benchmark distribution is no simple task as we are dealing with a variety of--endogenous--accruals, e.g., Beaver et al. 2003.) A second implication is that managing the reporting discretion is a nontrivial task, that the auditor is an important player in the exercise, and, by implication, that access to the auditors' travails is an important window (as in Nelson et al. 2002). (14)

Conversely, we might assume inefficient contracting in the labor market, to the point reported accruals are largely unaffected by management's actions, except when it behaves opportunistically. This implies that a nearly mechanical description of the accrual reporting process, such as the modified Jones model, might be adequate for experimental purposes, and that the thus-estimated discretionary accruals are, in fact, identifiable and indicative of management's intervention in the accrual reporting process. But micro foundation concerns remain, as we want the mechanical accrual process to remain unaffected by the conditioning event in the study, not to mention a consistent story linking the various behaviors.

Next suppose we further append equity valuation of the firm in question along with some model of equilibrium behavior in the financial market. We now have a pair of processes, and this imposes additional consistency requirements on the exercise. The method for estimating accruals should be consistent with the presumed equity market pricing structure, just as the search for unusual patterns should be consistent with the presumed contracting behavior in the managerial market and the ability (or lack thereof) of the pricing structure to mimic the search for unusual patterns. (15)

To illustrate, return to the earlier iid dividend setting where the asset is reported at fair value. Now suppose each such firm during any given period is owned and managed by an entrepreneur who must, for intergenerational reasons, liquidate his holding at the end of the period. The entrepreneur is able to misrepresent the fair value estimate, e.g., via a well-crafted mark-to-model estimate, by reporting $z_t = .5 y^p_t + \theta$ for some $\theta \geq 0$. In turn, an auditor is unable to detect any overstatement of the accrual below a materiality threshold of $.5\Delta$. From here, traders anticipate the firm will report an accrual stock in period t of $z_t = .5 y^p_t + .5\Delta$ (i.e., set $\theta = .5\Delta$), and thus the market price will be $P_t = z_t - E[\theta] = z_t - .5\Delta$. In turn, given this anticipated behavior is impounded in the pricing, the firm, the liquidating entrepreneur, can do no better than report as conjectured.

Now turn to the data thrown off by such a story. There would be no abnormal returns, on average, and price itself would plot as a zero mean normal random variable. But the accrual stock would plot as a normal random variable with mean $.5\Delta$.
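
A quick simulation (my own sketch, with $\sigma = 1$ and $\Delta = .4$) reproduces these plots in miniature: the accrual stock is biased upward by $.5\Delta$, while price and average pricing errors are unbiased:

```python
# My simulation sketch of the deterministic-misreporting equilibrium:
# the entrepreneur always adds the maximal undetectable overstatement
# 0.5*delta, and the market subtracts it back out, so only the accrual
# stock carries the bias.
import random

random.seed(2)
sigma, delta, n = 1.0, 0.4, 100_000

def mean(xs):
    return sum(xs) / len(xs)

d_next = [random.gauss(0.0, sigma) for _ in range(n)]    # dividends
y_p = [d + random.gauss(0.0, sigma) for d in d_next]     # private signals

z = [0.5 * y + 0.5 * delta for y in y_p]   # overstated accrual stock
P = [z_t - 0.5 * delta for z_t in z]       # price strips the known bias

mean_z = mean(z)
mean_P = mean(P)
mean_err = mean([p - d for p, d in zip(P, d_next)])
print(round(mean_z, 3))      # near 0.5*delta: biased accrual stock
print(round(mean_P, 3))      # near 0: no bias left in price
print(round(mean_err, 3))    # near 0: no abnormal returns on average
```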

More interesting is the case where the entrepreneur can misreport in the presumed manner, but only with probability $\alpha$. It now turns out the market again anticipates maximal opportunism, and the equilibrium price is given by:

$P_t = z_t - \Pr(\theta = .5\Delta \mid z_t)(.5\Delta)$,

where the posterior probability of overstatement follows from Bayes' rule applied to the report's two possible (normal) distributions; and given this, the entrepreneur misreports the maximum amount whenever doing so is possible, provided $\Delta$ is not too large. (16) In addition, the accrual stock would now exhibit a mean of $\alpha(.5\Delta)$, and price would not be a linear function of the reported fair value estimate.
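
Under a Bayesian reading of this equilibrium (my own construction: price equals the report less the posterior expected overstatement, with the posterior computed from the two normal report densities), a simulation exhibits both features, the $\alpha(.5\Delta)$ accrual bias and unbiased pricing:

```python
# My sketch of the probabilistic case: with probability alpha the report
# is overstated by 0.5*delta; the price subtracts the posterior expected
# overstatement, which makes price nonlinear in the reported accrual.
import math
import random

random.seed(3)
sigma, delta, alpha, n = 1.0, 0.4, 0.3, 200_000
var_z = 0.5 * sigma ** 2   # var of 0.5*y^p, since var(y^p) = 2*sigma^2

def phi(x, mu):
    """Normal density with mean mu and variance var_z."""
    return math.exp(-(x - mu) ** 2 / (2 * var_z)) / math.sqrt(2 * math.pi * var_z)

def price(z):
    """Report less the posterior expected overstatement (Bayes' rule)."""
    w = alpha * phi(z, 0.5 * delta)
    post = w / (w + (1 - alpha) * phi(z, 0.0))   # Pr(overstated | z)
    return z - post * 0.5 * delta

reports, errors = [], []
for _ in range(n):
    d = random.gauss(0.0, sigma)                  # next-period dividend
    y = d + random.gauss(0.0, sigma)              # private signal
    theta = 0.5 * delta if random.random() < alpha else 0.0
    z = 0.5 * y + theta                           # reported accrual stock
    reports.append(z)
    errors.append(price(z) - d)

mean_report = sum(reports) / n
mean_err = sum(errors) / n
print(round(mean_report, 3))   # near alpha*0.5*delta
print(round(mean_err, 3))      # pricing is unbiased on average
```

The nonlinearity arises because the posterior weight on overstatement varies with the report itself.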

The point is consistency. Experimentally attacking earnings management rests on what we, as social scientists, know about the firm, management's behavior, and the market's behavior. Building these behaviors into the exercise, from a micro foundation level, moves our expectations toward the endogenous category and imposes consistency checks on those expectations. For example, in the stochastic misreporting illustration, firm and market behavior are equally important components of the story. We expect to see the noted relationship between the fair value accrual and the market price, not because the market is using additional information but because it has consistent expectations about the reporting process. We also expect to see a book-to-market ratio above unity, a systematically overstated stock of accruals. Though hardly descriptive, this illustrates my theme of building from micro foundations, closing the argument with an equilibrium construction, and using the resulting (endogenous) structure to shed light on the behaviors in question.

Analysts' Forecasts

While earnings management places management's behavior at the center of the analysis, use of analysts' forecasts places analysts' behavior at the center of the analysis. In a typical information content study we examine the association between change in a firm's market valuation and the "surprise" in its earnings report. The idea, especially in a short-window test, is that the only uncontrolled variable affecting the equity valuation is arrival of the earnings release; surprise, defined as actual less expected, is the new information conveyed by that release. Of course, among other things, this places a premium on identifying that expectation. (It also treats concatenation with older information as a second-order effect.)

In turn, since the market's expectation is unobservable, we often employ analysts' forecasts, say a consensus forecast constructed from the I/B/E/S data set, as an estimate of the market's expectation. Importantly, now, on the information side of the association test we substitute the analysts' consensus forecast for joint specification of the earnings process and market processing of available information, for specification of Equation (3) on the information side of the experiment, just as we substitute an autoregressive identification of the accruals process for a more endogenous specification of the earnings process.

Yet these forecasts are choices themselves, and we are sidestepping the importance of an equilibrium model in the analyst sector of the analysis (just as we sidestep the fact that the vector of forecasts is often more informative than some statistic based thereon). We know, for example, that analysts' forecasts are, on average, upward biased, that their recommendations are skewed toward strong buy and buy recommendations (e.g., Lin and McNichols 1998), and that they tend to censor poorly performing firms (e.g., McNichols and O'Brien 1997). There is also a strong connection between analyst following and investment banking relationships and, to no surprise, reputation, career concerns, and continued access to management are part of the fabric (e.g., the Wall Street Journal and the annual Institutional Investor poll [Li 2002]). Analysts are also proactive in customizing the measure of earnings they forecast, relative to GAAP earnings (e.g., Bradshaw and Sloan 2002).

Now think of the forecast activity as a version of the process description in Equation (1). Further suppose these forecasts carry private information and insight to the equity market. Connecting the pieces, then, we envision a market equilibrium process (one component of Equation (3)) mixed with an analyst community equilibrium process (a second component of Equation (3)). Yet, when we work directly with, say, I/B/E/S data in an earnings response exercise, we are sidestepping the analyst part of the larger equilibrium, including the censoring and biased reporting tendencies, just as we sidestep the institutional fabric when we experimentally explore analyst behavior (e.g., Tan et al. 2002). (17)

Returning to our illustration, suppose the firm reports nothing (extreme truncation, again), while an analyst has access to the firm's private information but for some unmodeled reason reports a fair value estimate of [[omega].sub.t] = .5[y.sup.p.sub.t] + [[??].sub.t], where [[??].sub.t] is a normal random variable with mean [DELTA] > 0. Now we would find a market price that removes the common knowledge bias effect: [P.sub.t] = [[omega].sub.t] - E[[[??].sub.t]] = [[omega].sub.t] - [DELTA].

Conversely, suppose the analyst reports without bias or noise, that is, reports [[omega].sub.t] = .5[y.sup.p.sub.t], but only if [y.sup.p.sub.t] > 0. This censoring, reporting in effect only good news, in and of itself implies an upward-biased forecast error.
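The censoring effect is easy to verify with a small Monte Carlo; the standard normal assumption on [y.sup.p.sub.t] is mine, made purely for illustration:

```python
import random

random.seed(1)

N = 200_000
reported = []
for _ in range(N):
    y = random.gauss(0.0, 1.0)       # firm's private signal y^p_t
    if y > 0:                        # analyst censors: reports only good news
        reported.append(0.5 * y)     # unbiased *given reporting*: omega_t = .5 y^p_t

mean_reported = sum(reported) / len(reported)
# mean of a half-normal: E[.5y | y > 0] = .5 * sqrt(2/pi), roughly .399,
# well above the unconditional mean of zero
print(mean_reported)
```

Each individual forecast is faithful, yet the population of issued forecasts is upward biased, purely because of the selection into reporting.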

Regardless, with considerable patience we could stretch this into a story where the analyst reports ahead of the firm, the firm reports its private information, and we connect the "earnings surprise" to the change in price at the time of the firm's report. Indeed, Easton and Zmijewski (1989) link cross-sectional variation in estimated earnings response coefficients to analysts' revisions of earnings forecasts, a step toward integrating the two components of the exercise. Again, however, consistency is the issue, here consistently treating the behavior of management, of analysts, and of market participants, and coordinating that behavior with an equilibrium mechanism. (18)

Regulation Assessment Studies

Documenting the effect of various reporting and auditing regulations provides another illustration of the role micro foundations can play in deepening our understanding. Lease reporting is a case in point. Imhoff and Thomas (1988) document substitution from capital to operating leases as SFAS No. 13 took hold. Interpretation of this substitution, however, requires some understanding of firm and market behavior, that is, some specification of equilibrium behavior at the valuation and at the firm levels. Analyst behavior may also be an important part of the picture. Pensions and OPEB (e.g., Barth et al. 1992; Mittelstaedt et al. 1995), purchase versus pooling (e.g., Aboody et al. 2000), and segment reporting (e.g., Berger and Hann 2003) provide additional examples, not to mention employee options (e.g., Hall and Murphy 2003).

Another illustration is the debate over less or more detailed standards. Here, the interacting roles of equilibrium behavior in the valuation market, the managerial market, the audit market, regulators, and the legal industry come to bear--no small task indeed. Nelson (2003), for example, stresses findings from audit judgment studies and how detailed rules may increase or decrease task complexity for the auditor, may alter the negotiation encounter between the auditor and client, and may alter the supply of earnings management.

The Lucas (1983) critique of econometric assessment also enters the story. Reflect back on the "model" in Equation (1) where we acknowledge that the organization or process makes choices that combine with (the choices of others and) the historical context to produce the observable variable of interest. This is, of course, far too abstract to be of any use, so we attempt to parametrically identify the process by positing some parametric form (e.g., an OLS equation of some sort or a compendium of experimental results), with parameter vector [theta], and then proceed to estimate the parameter vector.

But these assessments, these estimates of vector [theta], are based on the in-place regime. Reflecting on Nelson's (2003) survey, then, the various experimental and survey findings flow from training and experience in the current rule-based regime (in terms of both subjects and experimental instruments) and thus beg the question of what training and experience in a less rule-based regime would exhibit. Our estimates, that is, reflect decisions made by various actors; altering the environment may well alter those decisions, and thus our estimates are inherently suspect when projected onto the alternative environment.

Zimbelman's (1997) experimental investigation of SAS No. 82's requirement that auditors separately assess risk of fraud provides another illustration. After reading about an audit client, audit professionals who were instructed to separately assess fraud risk engaged in different behavior (spending more time on fraud cues, though not modifying the nature of their audit plans). Notice, however, that the respondents are working from their experience, based on the existing environment, and client behavior is not part of the experimental apparatus. (19)

The FASB itself illustrates this concern. It has, at least in my limited experience, a penchant for focusing on a type of transaction and then determining the proper accounting treatment of that transaction. It does not, again in my experience, overly concern itself with the supply of transactions if it proscribes some particular accounting treatment. Indeed, it claims a position of neutrality, defined in Concepts Statement No. 2 as "absence in reported information of bias intended to attain a predetermined result or to induce a particular mode of behavior." (20) Notice we teach in a similar manner: exogenous transaction followed by application of the appropriate rule. For that matter, the FASB's Conceptual Framework strikes me as Keynesian in philosophy, being built upon a foundation that sidesteps micro foundations of the underlying choices, and largely inadequate for scholarly purposes. (21)

In sharp contrast, Dye (2002) simultaneously addresses pricing and reporting behavior in a model where a "shadow standard" emerges as an equilibrium response to a reporting standard. (22) Reporting firms are able to exploit the in-place standard, pricing is rational given the exploitation, and the exploitation is rational given the pricing. From there the regulator's behavior is moved into the endogenous category.

Yet another variation on our running illustration exhibits this theme. Suppose, in the earnings management version, that a reporting standard caps the possible fair value overstatement at [DELTA], but the reporting firm is able to structure transactions so as to report a fair value of [z.sub.t] = .5[y.sup.p.sub.t] + [DELTA] + [omega] by incurring a structuring cost of k[([omega] - [DELTA]).sup.2]. In turn, with all firms behaving in this fashion, the resulting market price will be [P.sub.t] = [z.sub.t] - E[[DELTA] + [omega]] = [z.sub.t] - [DELTA] - [omega]*. From here, with [delta][P.sub.t]/[delta][z.sub.t] = 1, the firm's misreporting is governed by:

[max.sub.[omega]] [omega] - k[([omega] - [DELTA]).sup.2]

and we thus have [omega]* = [DELTA] + 1/2k. In short, the market anticipates that the firm will burn resources to advantageously structure its transactions and, in response, the firm can do no better than burn resources to structure its transactions, as anticipated.

Now suppose the regulator decides to tighten the regulation, to lower [DELTA]. On the surface, this is a good idea, but sorting through the details it turns out to be of no social consequence, as the market adjusts for the misreporting and the social cost of the transaction structuring to support that misreporting does not depend on [DELTA]. But this assumes the regulation and the penalty coefficient, k, are not linked. Suppose k = [[DELTA].sup.2]. The idea is that a tightly controlling standard is such a bright line that it is easy for the firm to structure transactions to circumvent the regulation, whereas with a broadly applicable though less specific standard, it takes considerable effort to circumvent the regulation. This linkage now implies [omega]* = [DELTA] + 1/(2[[DELTA].sup.2]). Tightening the regulation (lowering [DELTA]) lowers k and turns out to increase the amount of misreporting. In fact, a very large [DELTA] is desirable here, because this lowers the social cost of the transaction structuring. (23)
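These comparative statics are easy to evaluate numerically. The sketch below simply codes the closed forms derived above, [omega]* = [DELTA] + 1/2k from the first-order condition and an equilibrium structuring cost of 1/4k, under the assumed linkage k = [[DELTA].sup.2]; the particular values of [DELTA] are illustrative:

```python
def misreport(delta):
    """Optimal overstatement when the penalty coefficient is tied to the cap, k = delta**2.
    From the first-order condition of max_w  w - k*(w - delta)**2:  w* = delta + 1/(2k)."""
    k = delta ** 2
    return delta + 1.0 / (2.0 * k)

def structuring_cost(delta):
    """Equilibrium resources burned: k*(w* - delta)**2 = 1/(4k) = 1/(4 delta**2)."""
    return 1.0 / (4.0 * delta ** 2)

for d in (1.0, 0.5, 0.25):
    print(d, misreport(d), structuring_cost(d))
# tightening the cap (a smaller delta) raises both the misreporting
# and the social cost of the supporting transaction structuring
```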

The point, though, is regulations, by their very nature, are designed to affect behavior. (Imagine teaching students accounting regulations are inherently neutral.) Assessing a regulation's impact rests on our ability to detect, or forecast, the attendant behavior changes. Adaptation is the likely norm, and identifying this adaptation returns us to the dual themes of micro foundations of the underlying choices and closure with an equilibrium argument, a perspective vastly more rich than exogenous transactions.

Value-Relevance Studies

Value-relevance studies provide a similar picture. One variation focuses on the truncation theme, as in Lev and Sougiannis (1996). Here, public data are used to construct an estimate of the stock of intangible assets, and this estimated stock, in turn, loads in statistically significant fashion in a pricing equation. This opens the door to speculation on whether the truncation is a regulatory error, an indication of regulatory lag as the nature of the economy has changed, or an indication of careful design and maintenance of the financial reporting system. There are, after all, compelling reasons for truncation per se, but headway rests on deepening our understanding of the financial reporting process and its comparative advantage among various information sources. Another variation focuses on the subjectivity (or reliability) theme, as in Barth et al.'s (1996) study of fair value disclosures by banks.

It is important, however, to specify the price process, the financial reporting process, and the "other information" reporting process if the pricing coefficients or asset and income adjustments in these studies are to be interpreted in meaningful fashion. For example, return to our running illustration and the case where the firm reports a smoothed asset value. In period t, then, it reports an asset value of [z.sub.t] = .5(.5[y.sup.p.sub.t] + .5[y.sup.p.sub.t-1]) and an income of [I.sub.t] = [d.sub.t] + .25([y.sup.p.sub.t] + [y.sup.p.sub.t-1]) - .25([y.sup.p.sub.t-1] + [y.sup.p.sub.t-2]) = [d.sub.t] + .25([y.sup.p.sub.t] - [y.sup.p.sub.t-2]). Now, inspired by recent work relating market value to accounting stock and flow measures, suppose we link cum-dividend price to accounting value ([z.sub.t]) and accounting income ([I.sub.t]). It turns out that:

[P.sub.t] + [d.sub.t] = [z.sub.t] + [I.sub.t] + .5([P.sub.t-2] - [P.sub.t-1]).

That is, lagged prices are value relevant. This occurs, of course, because the accounting measures carry the valuation information in a particular scaled format; the lagged prices are used to decode, so to speak, the sum of accounting stocks and flows. (24) Introducing an additional, nonaccounting, information source would further cloud the analysis.
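The identity can be verified mechanically. The sketch below assumes, consistent with the running illustration, an ex-dividend price of [P.sub.t] = .5[y.sup.p.sub.t]; the signal and dividend distributions are arbitrary, since the relation is purely algebraic:

```python
import random

random.seed(2)

T = 50
y = [random.gauss(0.0, 1.0) for _ in range(T)]   # private signals y^p_t
d = [random.gauss(1.0, 0.3) for _ in range(T)]   # dividends d_t (arbitrary)
P = [0.5 * yy for yy in y]                       # assumed ex-dividend price: P_t = .5 y^p_t

for t in range(2, T):
    z = 0.5 * (0.5 * y[t] + 0.5 * y[t - 1])      # smoothed asset value z_t
    I = d[t] + 0.25 * (y[t] - y[t - 2])          # reported income I_t
    lhs = P[t] + d[t]
    rhs = z + I + 0.5 * (P[t - 2] - P[t - 1])    # lagged prices decode the smoothing
    assert abs(lhs - rhs) < 1e-9

print("identity holds")
```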

The value-relevance exercise, in other words, places a large premium on careful specification of the accounting process, the other information process, and the price process. For example, Aboody et al. (2002) press market inefficiency in a value-relevance exercise, while Chen and Schoderbek (2000) provide evidence consistent with analyst and investor functional fixation in the face of complex tax rules. But the larger, endogenous imbedding remains unaccomplished. The issue of accounting's role as a verification of earlier, alternate sources remains, as does the applicability in the policy arena once we recognize the Lucas (1983) critique.

That said, a market efficiency test in this setting is rather forgiving on this point (aside from power and measurement issues). To see this, the period t + 1 return associated with our firm, [[??].sub.t+1] = ([[??].sub.t+1] + [[??].sub.t+1] - [P.sub.t])/[P.sub.t], is a zero mean random variable. Under informationally efficient pricing, these returns are also uncorrelated with any statistics derived from the underlying temporal information set. Continuing, let [[omega].sub.t] be some variable constructed from or simply contained in [[OMEGA].sub.t], such as the accrual component of the time t income measure. Then [R.sub.t+1] and [[omega].sub.t] should be uncorrelated. Further suppose we forecast [[omega].sub.t] in autoregressive format via [[omega].sub.t] = [alpha] + [beta][[omega].sub.t-1] + [[??].sub.1t] and simultaneously estimate [R.sub.t+1] = [??] + [??]([[omega].sub.t] - [alpha]* - [beta]*[[omega].sub.t-1]) + [[??].sub.2t]. Then we should find our estimates of [beta] and [beta]* agree (Abel and Mishkin 1983; Mishkin 1983). (25) Of course, this is perfunctory in our setting, because we know, by assumption, the equilibrium model. With real data, as in Sloan's (1996) examination of accruals, the statistical test is a joint test of efficiency and the market equilibrium specification. Also notice the Mishkin test places no premium on proper specification of the forecasting equation or even on cleverly selecting the variables. Rather, the test rests on there being no exploitable remaining correlation. Going further, identifying specific "mispricing" rests on confidence in the equilibrium specification as well as on identifying and forecasting the variables--no small matter (e.g., Xie 2001).
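A stripped-down simulation of the Mishkin-style test, with illustrative parameters of my own choosing: earnings load on a persistent "accrual" variable, efficient returns react only to the earnings surprise, and the market's implied forecasting coefficient is recovered from the return regression. Under efficiency the two estimates agree:

```python
import random

random.seed(3)

BETA, GAMMA, T = 0.7, 2.0, 20_000    # illustrative parameter values

# persistent accrual variable omega_t, an AR(1) series
acc = [random.gauss(0.0, 1.0)]
for _ in range(T - 1):
    acc.append(0.5 * acc[-1] + random.gauss(0.0, 1.0))

x, R = [], []                        # next-period earnings and returns
for t in range(T):
    u = random.gauss(0.0, 1.0)       # earnings innovation: the surprise
    x.append(BETA * acc[t] + u)      # x_{t+1} = BETA * acc_t + u_{t+1}
    R.append(GAMMA * u + random.gauss(0.0, 0.1))   # efficient return loads on surprise only

# forecasting equation: OLS of x_{t+1} on acc_t (zero-mean series, no intercept)
beta_hat = sum(a * b for a, b in zip(x, acc)) / sum(a * a for a in acc)

# return equation: R_{t+1} = gamma * x_{t+1} - gamma * beta_star * acc_t + noise,
# estimated by two-regressor OLS via the normal equations
s11 = sum(a * a for a in x); s22 = sum(b * b for b in acc)
s12 = sum(a * b for a, b in zip(x, acc))
s1y = sum(a * r for a, r in zip(x, R)); s2y = sum(b * r for b, r in zip(acc, R))
det = s11 * s22 - s12 * s12
c1 = (s22 * s1y - s12 * s2y) / det   # estimate of gamma
c2 = (s11 * s2y - s12 * s1y) / det   # estimate of -gamma * beta_star
beta_star_hat = -c2 / c1

print(beta_hat, beta_star_hat)       # the two estimates agree under efficiency
```

With real data the same agreement is a joint test of efficiency and the assumed market equilibrium; here it is guaranteed because the pricing rule is built into the simulation.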

Regardless, it is the connection among expectations in the illustration that is important. The pricing structure and the accounting structure are equal parts of the story, parts that we connect with an explicit equilibrium argument. (26)

Audit Judgment Studies

A similar picture emerges in audit judgment studies, where we focus on documenting and improving auditor judgments and decisions (e.g., Ashton and Ashton 1995; Trotman 1998). Biases in information processing, such as use of representativeness or anchoring heuristics, information search processes, productivity of decision aids, expertise, memory, and hierarchical groups, among other issues, have been studied. In a typical study, (slightly) experienced auditors are used as subjects in a controlled experiment where variations on a "rich" case study are used to administer the treatments and collect the responses.

Returning to the earlier conceptualization, audit judgment is now the variable of interest, and the various experiments are aimed at better documenting the process model, Equation (1). This, of course, is not happenstance, but is typically guided by findings in psychology. In this sense, then, findings and models in psychology are the basis for the equilibrium model, Equation (3).

Auditors, however, work in elaborate organizations, are subject to extensive supervision, deal with multiple clients in repeated environments, and face significant career concerns. This leads to such questions as whether their experience and training improve their judgment performance, whether the hierarchical reviews of their work improve their performance, the role of career concerns, and the role played by client interaction. This, in turn, suggests the equilibrium model is not necessarily a close variation to what we might infer from psychology. It also leads to concern whether the typical case study provides sufficiently rich immersion into the fabric of the audit firm and its client. Trotman (1998, 134), for example, states "we need realistic cases that match the audit environment as closely as possible." Moreover, building on micro foundations, would we expect the organizational arrangement to reflect unchecked cognitive inconsistencies?

Oddly, it seems client interaction is regarded as a second-order effect in the majority of these experiments. Bazerman et al. (2002) take the opposite tack and stress it as central to understanding auditor behavior, though again they stress psychological foundations absent an organizational context, portions of which King (2002) redresses. Gibbins, Salterio, and Webb (2001) explore auditor-client negotiation with a structured questionnaire and conclude these negotiations cover complex issues, are context dependent, and are far from bilateral encounters. In larger part, though, we continue to struggle with an under-identified model of the organizationally imbedded behavior in question.

Returning yet again to our running illustration, suppose the firm's interaction with its auditor results in a reported fair value estimate of [z.sub.t] = .5[y.sup.p.sub.t] + [[??].sub.t], where [[??].sub.t] is a normal random variable with mean F([y.sup.p.sub.t],[y.sup.p.sub.t-1]), reflecting client and auditor interactions (e.g., Antle and Nalebuff 1991) along with cognitive and organizational influences. Further suppose the market mechanism has rational expectations. We then find an ex-dividend price of [P.sub.t] = [z.sub.t] - E[[[??].sub.t]] = [z.sub.t] - F([y.sup.p.sub.t],[y.sup.p.sub.t-1]).

Moreover, if the resulting error in the fair value estimate is a second-order effect, [P.sub.t] should exhibit considerable volatility, just as it should exhibit little volatility were the error a first-order effect; a lag structure may well be part of the estimation exercise. Again, though, consistent, balanced treatment of the endogenous nature of all major components in the story is the issue.

Compensation Studies

Compensation studies provide an additional illustration of the importance of specifying the choices, the micro foundations, Equation (3). Here we typically relate readily observable compensation, such as salary, bonus, and options, to various readily observable performance variables. Yet compensation is multifaceted, including elaborate and subtle timing issues, and is (at least in an agency model) linked, via a compensating wage differential, to the agent's induced unbalanced portfolio and resulting risk profile (e.g., Abowd and Kaplan 1999; Hall and Liebman 1998; Hall and Murphy 2002). Moreover, making the connection to accounting variables is hampered by measurement error, the presence of other information, the multiperiod structure of compensation, the ever-present importance of finer, idiosyncratic details in crafting these sorts of arrangements, the fact that our modeling work is heavy on likelihood ratios but light on accruals and their endogenous character (e.g., Lambert 2001; Bushman and Smith 2001), and the fact that incentive structures are but part of the larger governance mechanism (e.g., Bebchuk and Fried's 2003 emphasis on managerial power in the governance structure).

This is well illustrated by the continuing struggle with expensing employee stock options. Surely the options are valued by the employees, and surely their "value" to the employees differs from their "cost" to the existing shareholders. Yet absent an intertemporal, equilibrium linkage in which options are part and parcel of the compensation package, we are left with intuition and ideology to gauge the debate. (27)

Cost Measurement Studies

Cost measurement studies focus on understanding product cost in a multiproduct setting. One approach is based on the classical formulation of an exogenous technology coupled with rational factor choice, based on factor prices and, of course, required quantities (e.g., Hayes and Millar 1990; Mocan 1997). Here cost structure or its dual, technology, is the issue, and rational expenditure minimization is the equilibrium model, Equation (3), that guides the analysis. Importantly, cost reflects factor choices, cost components are not treated in separate fashion absent separability in the underlying technology, and marginal cost is the only well-defined measure at the individual product level. (28)

A second approach, exemplified by ABC studies, is eclectic in nature. Here cost driver identification is an important task (though a classical cost function, given factor prices, depends only on volume). Less aggregation coupled with staged allocations based on well-conceived "cost drivers" is a central, though not unchallenged, theme (e.g., Datar and Gupta 1994; Hwang et al. 1993). (29) More broadly, assessment of an ABC implementation remains a vexing task. Complexity of the resulting cost specification is an issue (e.g., Anderson et al. 2002), thus suggesting a larger "cost-benefit" issue. Perceptual measures (e.g., Foster and Swenson 1997) and statistical association with financial and process statistics (Ittner et al. 2002) have been pursued. Yet the underlying model, Equation (3) again, remains unclear, especially when we remember any such costing measure is but an estimate that will be tempered with other available information, that learning to work with such measures is not a straightforward task, that the implementation process is intrusive, and, most important I suspect, that the firm chose the intrusion at this particular time. This underinvestment in the setting and process hinders our understanding of when, why, and how firms choose to change their internal measurement systems, a theme reflected in Dopuch (1993) and, more broadly, in Zimmerman's (2001) critique of empirical work in managerial accounting and the resulting symposium in the 2002 The European Accounting Review.

Torturing our example a little further, suppose initially the firm privately observes and reports faithfully the realization of [[??].sup.p.sub.t] = [[??].sub.t+1] + [[??].sub.t] at time t, but now [[??].sub.t] is a zero mean normal random variable with variance considerably greater than [[sigma].sup.2]. This implies [y.sup.p.sub.t] is not very informative and that [P.sub.t] will not be very volatile. In turn, suppose at time [tau] the firm implements a new costing system, one designed to improve its internal information. Further suppose following implementation it will take the firm a number of trials to become adept at exploiting the new system. This means, of course, that its fair value disclosures become systematically more informative following implementation, and thus imply an increase in volatility of [P.sub.t] beyond time [tau]. Of course this begs the question of why the firm chose to switch its costing system at time [tau], why the resulting information has no effect on the underlying fundamentals, as well as how the learning process is managed and whether competitors are responding to similar forces and opportunities. But tracing through consistent expectations is the point.
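The volatility claim can be sketched numerically by assuming, beyond what the text specifies, that price equals the conditional expectation of the next dividend given the signal; lowering the noise in [y.sup.p.sub.t] after the costing system is implemented then raises the variance of [P.sub.t]:

```python
import random

random.seed(4)

SIG_D = 1.0   # dividend standard deviation (illustrative)

def price_series(sig_eps, n=100_000):
    """Assumed pricing: P_t = E[d_{t+1} | y^p_t] where y^p_t = d_{t+1} + eps_t."""
    w = SIG_D**2 / (SIG_D**2 + sig_eps**2)   # Bayesian weight on the signal
    out = []
    for _ in range(n):
        y = random.gauss(0.0, SIG_D) + random.gauss(0.0, sig_eps)
        out.append(w * y)
    return out

def var(s):
    m = sum(s) / len(s)
    return sum((a - m) ** 2 for a in s) / len(s)

before = var(price_series(3.0))   # noisy costing system: large signal noise
after  = var(price_series(0.5))   # post-implementation: more informative disclosures
print(before, after)              # price volatility rises after time tau
```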

Governance Studies

Governance covers a large set of players and interactions: management, boards, auditors, regulators, legislators, investment bankers, analysts (buy and sell side), consultants, attorneys, and even academics. Here, equilibrium specification, Equation (3), strikes me as particularly important because of its role in delineating which interactions are potentially important, a point stressed in Hermalin and Weisbach's (2003) review of governance issues associated with boards of directors. As they emphasize, the evidence is difficult to interpret because board structure is endogenous and interdependent. To illustrate closer to home, McDaniel et al. (2002) experimentally simulate audit committee members' task performance, and patterns therein, with an eye toward the role of expertise. Yet the simulation removes board interactions from the setting. Similarly, Frankel et al. (2002) find that unusual patterns in the reported financial numbers themselves appear to be associated with unusually large purchases of nonaudit services. In contrast, Antle et al. (2003) simultaneously estimate audit fees, nonaudit fees, and unusual patterns in the reported financial numbers, and find no such problematic connection. (They do, however, document a connection between audit fees themselves and unusual patterns in the reported numbers.)

The equilibrium specification is also important here because, interpreted in equilibrium terms, a well-functioning governance system will exhibit few failures and it is on the failure side where we would observe the strength, the off-equilibrium component, of the system. For example, suppose in our illustration that the dividend process is as modeled, provided the manager is well behaved, and that an internal reporting system will, with positive probability, report any opportunistic behavior. A simple penalty contract will now ensure the manager is dutifully well behaved and, in equilibrium, we will never see the control system functioning. More broadly, though, we are back again at the central issue of coequal treatment of the endogenous nature of all major players in the exercise.
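A back-of-the-envelope version of the penalty-contract argument, with illustrative numbers and a risk-neutral manager assumed: as long as the expected penalty exceeds the private gain, the manager complies, and the control system is never observed in action:

```python
# illustrative numbers; none of these come from the text
GAIN = 1.0       # manager's private gain from opportunistic behavior
P_DETECT = 0.2   # probability the internal reporting system flags it

penalty = GAIN / P_DETECT + 0.01   # any K with P_DETECT * K > GAIN deters
expected_cost = P_DETECT * penalty

# deviation is unprofitable, so in equilibrium the penalty is never paid
# and the control system appears, misleadingly, to do nothing
print(expected_cost > GAIN)
```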

V. CONCLUSIONS

Our scholarship uses the practice of accounting as a laboratory for deepening our understanding of accounting institutions, and for social science purposes more broadly. Growth and efficiency in this process rest on investment in the micro foundations of our subject matter. I have sketched how this theme cuts across a variety of our studies, and find a consistent pattern of underinvestment. We tend, it seems, to underinvest in underlying choices, the micro foundations of our subject area: the richness of accruals, the institutional fabric of analysts, the organizational subtleties of audit firms and their on-going client interactions, and the micro structure of governance arrangements and their dynamics.

Moreover, though my ramblings and illustrations are static in nature, our subject area and our aspirations have a substantive learning component as well. Consider the learning associated with a new class of fair value disclosures--learning on the production and on the consumption side. Similarly, an ABC implementation is likely to entail significant, extensive learning both in terms of the production process itself and exploiting the resulting newly estimated product cost statistics.

The larger issues, though, are whether this expanded view of our subject is likely to be productive, and whether we are prepared to accept such a challenge. Lucas (2003, 12) concluded his presidential address to the AEA by answering the first question:
   The macroeconomic research I have discussed today makes essential
   use of value theory in this modern sense: formulating explicit
   models, computing solutions, comparing their behavior quantitatively
   to observed time series and other data sets. As a result, we are
   able to form a much sharper quantitative view of the potential of
   changes in policy to improve peoples' lives than was possible a
   generation ago.


It is my hope that one day someone will stand before the American Accounting Association and identify a body of scholarly work that has deepened our understanding, deepened it to the point we are able to offer informed and reliable quantitative assessments of changes in accounting policies, be they economy-wide or intra-organizational in scope. I think we have made considerable progress, but I also think the key to significant additional progress, in our research and in our teaching, is to broaden our focus to accounting in a more endogenously determined environment, to expand our reliance on micro foundations, to examine closely the economics and social psychology of the context, to be attentive to macro conditions such as the business cycle, in short to embrace and exploit endogenous expectations.

Editorial Data

The following table contains information about turnaround time for manuscripts (including revisions) on which editorial decisions were made in the 12-month period ended August 31, 2003. Turnaround time is the number of days between the date that the manuscript was received and the date of the editor's letter to the author(s):
                                 Number of             Cumulative   Cumulative
                                Manuscripts   Percent    Number      Percent

  0 ≤ Days ≤ 30                      57        14.54        57        14.54
 31 ≤ Days ≤ 60                     153        39.03       210        53.57
 61 ≤ Days ≤ 90                     139        35.46       349        89.03
 91 ≤ Days ≤ 120                     36         9.18       385        98.21
121 ≤ Days                            7         1.79       392       100.00

Total                               392       100.00

The mean review time was 57.09 days; the median review time was 57 days.

New Submissions by calendar year (January 1 to December 31)

                                   1992            241
                                   1993            234
                                   1994            231
                                   1995            195
                                   1996            230
                                   1997            195
                                   1998            196
                                   1999            239
                                   2000            260
                                   2001            260 *
                                   2002            324
                                   2003 (8 mths)   217

* 2001--in addition to the 260 regular submissions, 68 MSs were
submitted to the TAR Quality of Earnings Conference.

The acceptance rate (defined as the number of MSs accepted divided by the number of new submissions) over the past five years is between 10 and 13 percent per year.



Prepared for Presidential Lecture, American Accounting Association, August 5, 2003. Helpful comments by John Christensen, George Drymiotes, John Fellingham, Hans Frimor, David Larcker, David Sappington, Katherine Schipper, Mary Stone, and seminar participants at Carnegie Mellon University and the University of Southern Denmark are gratefully acknowledged.

Editor's note: The Executive Committee of the American Accounting Association has recommended the commission of a series of Presidential Research Lectures, to be delivered at annual meetings of the Association. To encourage broad dissemination, the Committee has requested that The Accounting Review publish this lecture given at the 2001 American Accounting Association Annual Meeting in Atlanta, GA.

(1) Watts and Zimmerman (1986) are particularly eloquent in stressing choice in the accounting domain. Further note the importance of dynamics. We know change is ever present, and we are also fairly comfortable with the claim that decisions change as the environment changes. This suggests a narrow focus on estimating decision rules or reduced-form structures is likely to be self-defeating (e.g., Hansen and Sargent 1980).

(2) These concerns are not new, e.g., Holthausen and Watts (2001) and Hribar and Collins (2002). But I hope to document their breadth. Moreover, and by analogy, the market clearing or rational expectations school in macroeconomics stresses micro foundations, in lieu of the more traditional Keynesian approach, and has led to new understanding of, say, monetary policy and business cycles as well as the limitations in empirically assessing policy choices (e.g., Lucas 2003).

(3) SFAS No. 146 requires costs associated with, say, lease terminations or employee severance that stem from a plant closing or restructuring be recognized when incurred, not when there is "commitment" to the restructuring or closing.

(4) Stocken and Reis (2003) provide a setting in which estimating fair value requires intimate understanding of a multistage game between competitors in the product market, where estimating fair value in effect necessitates estimating a market clearing price.

(5) Christensen and Demski (2002) and Watts (2003) stress an endogenous view of conservatism, and its connection to verifiability.

(6) The consistent pattern in theoretical work over the last few decades is that finer details matter, that context is a first-order effect. We see this emerging in, for example, Arya et al. (2000), Beatty and Weber (2003), Gleason and Lee (2003) and Gibbins, Salterio, and Webb (2001). Better integrating theoretical and empirical work is the key to the next round of progress, and that is why I stress micro foundations.

(7) Dye (2001, 230-231) is particularly eloquent: "[Accounting is distinguished by] the emphasis on standards, the standard-setting process, on the accrual concept, and on how investors and other financial statement users process accounting information."

(8) Contrast an autoregressive model of accruals with a structural model that reflects firm choices, as in Lanen and Thompson (1988), or a cognitively challenged auditor with one embedded in an organization and personally dealing with both a supervisor and a client.

(9) Notice the price, and for that matter the dividend, may be negative. This is the price we pay for transparency in what follows.

(10) Notice our assumptions imply current price is a sufficient statistic for estimating next period's dividend and price: [MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII.] It is important to acknowledge this is a partial equilibrium story. With an economy-wide model, bubbles and cycles would be part of the story, and presumably would carry valuation implications for any particular signal path. For example, the marginal propensity to consume might become an important, interlinked variable in the story (e.g., Barro 2000).

(11) Comparing "price" and earnings surprise at time t + 1, we have:

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII.]

as I_{t+1} = d_{t+1} + z_{t+1} - z_{t} = d_{t+1} + P_{t+1} - P_{t}.

(12) Conservatism is readily illustrated as well. Suppose the firm reports an accrual balance of z_{t} = .5y^p_{t} whenever y^p_{t} < 0, but z_{t} = .25y^p_{t} otherwise. Again this is informationally equivalent to reporting z_{t} = y^p_{t}, but it dramatically affects the statistical connection between the price and reporting processes.
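The conservative rule in footnote 12 can be checked numerically. The sketch below is mine, not the lecture's: it draws the private observation y^p_t from a standard normal (an assumed distribution) and confirms that the piecewise rule is exactly invertible, i.e., informationally equivalent to truthful reporting, even though the mapping from report to fundamental differs across the two regimes.

```python
import random

def report(y):
    # conservative rule from footnote 12: damp losses by .5, gains by .25
    return 0.5 * y if y < 0 else 0.25 * y

def decode(z):
    # the rule is invertible: the sign of the report reveals the regime
    return z / 0.5 if z < 0 else z / 0.25

random.seed(0)
ys = [random.gauss(0, 1) for _ in range(10_000)]  # assumed fundamentals
zs = [report(y) for y in ys]

# informational equivalence: market participants recover y^p_t exactly
assert all(decode(z) == y for y, z in zip(ys, zs))
# yet the statistical link changes: a unit of reported accrual maps to
# 2 units of fundamentals in the loss region and 4 units otherwise
```

The decoding step is exact here because the damping factors are powers of two; the substantive point is only that invertible reporting leaves information content intact while reshaping the price-report statistics.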

(13) As hinted in the illustration, market-based studies, such as earnings response measures and value relevance tests, are a common type of exercise in which we encounter the importance of market-based trading behavior, choices. Public information arriving in a more or less well-functioning trading market is the underlying paradigm. Yet the richness of the institutional setting, multiple sources of information, a larger, economy-wide view of the setting, and the very endogenous nature of reported accruals remain outside our modeling attempts. (Examples are provided by Feltham and Ohlson's 1995 book value plus error formulation or Verrecchia's 2001 survey of the disclosure literature. Also see Dye 2001 and Christensen and Feltham 2003.) This void spills over into our empirical work.

(14) Return to the earlier illustration, but now suppose the firm smoothes when the absolute value of the private observation is large, but reports in straightforward fashion otherwise. Further suppose the market participants are able to decode such reporting to yet again ascertain the fundamentals. Now connecting price and the reported accrual will display the transcendental pattern documented in Freeman and Tse (1992).
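Footnote 14's decoding argument can likewise be sketched. The smoothing rule below is hypothetical (the footnote does not specify one), as is the threshold K: report straightforwardly for small realizations, damp the excess at slope 0.5 beyond K. The rule is invertible, so decoding recovers the fundamentals, yet the decoded price-report relation is kinked rather than linear, in the spirit of the nonlinear pattern in Freeman and Tse (1992).

```python
import math
import random

K = 1.0  # hypothetical smoothing threshold, not from the lecture

def smooth_report(y):
    # report truthfully when |y| <= K; damp the excess at slope 0.5
    if abs(y) <= K:
        return y
    return math.copysign(K + 0.5 * (abs(y) - K), y)

def decode(z):
    # market participants undo the damping to recover the fundamental
    if abs(z) <= K:
        return z
    return math.copysign(K + 2.0 * (abs(z) - K), z)

random.seed(1)
ys = [random.gauss(0, 2) for _ in range(5_000)]  # assumed fundamentals
zs = [smooth_report(y) for y in ys]

# decoding recovers fundamentals (up to floating-point rounding)
assert all(abs(decode(z) - y) < 1e-9 for y, z in zip(ys, zs))
# price (proportional to y) against the report z has slope 1 near zero
# and slope 2 in the tails: a nonlinear, kinked response
```

Nothing here pins down the exact shape Freeman and Tse estimate; the sketch only shows how equilibrium decoding of smoothed reports mechanically produces a nonlinear price-report connection.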

(15) Introducing insider trading adds another dimension, and another equilibrium specification exercise.

(16) From here we see the buyer is not disadvantaged, as the equilibrium price reflects the equilibrium reporting behavior; but ex post the seller who is unable to misreport is disadvantaged just as his misreporting counterpart is advantaged.

(17) We are also sidestepping the self-selection, or censorship, that underlies the I/B/E/S data set, not to mention the management policies, e.g., dealing with splits, applied to that data set. In addition, we typically assume earnings surprise, i.e., reported earnings less the corresponding "consensus" forecast, is a sufficient statistic for the new information conveyed by the accounting report. Basu and Markov (2003) build on the micro foundations theme by assuming the institutional fabric invites minimization of the absolute mean forecast error and then proceed to test this in rational expectations fashion (akin to the Mishkin test). Similarly, Clement and Tse (2003) examine the role played by analyst characteristics, related to their forecast accuracy, in a forecast revision setting.

(18) Notice the parallel to research submissions and referee reports, where, arguably, the referee reports tend to be downward-biased and the behaviors are viewed in terms of various incentives, career concerns, and equilibrium forces.

(19) Similarly, Hronsky and Houghton (2001) focus experimentally on how an accounting concept is defined.

(20) To expand, "Neutrality means that either in formulating or implementing standards, the primary concern should be the relevance and reliability of the information that results, not the effect that the new rule may have on a particular interest" (Concepts Statement No. 2, FASB 1980, para. 99). Also, on the supply side, the Board does, on occasion, recognize the transaction supply issue, as in SFAS No. 150 (FASB 2003, para. 8): "The objective of this Statement is to require issuers to classify as liabilities ... three classes of freestanding financial instruments that embody obligations for the issuer. In applying this Statement, that objective shall not be circumvented by nonsubstantive or minimal features included in instruments."

(21) "The Keynesian theory ... assumes that prices on some markets do not adjust perfectly to ensure continual balance ... [that] some markets do not always clear" (Barro 2000, 757).

(22) Shadow speed limits, e.g., within 10 miles per hour of the posted limit, are an apt analogy.

(23) From here it is a short step to reintroduce the idea that a given firm can so misreport with probability a, which would necessarily complicate the social welfare analysis. We also might pursue cognitive issues, as in, for example, Hopkins (1996), and we might extend the story to include an SPE option. But the message of consistency remains.

(24) Earlier, recall, we saw that a simple regression of price on the accrual stock would identify correctly the connection between the processes in the fair value setting but not in the smoothed setting. Conversely, Ryan (1995) demonstrates, in a model of exogenous investment and conservative reporting, the importance of lagged price variables in explaining the book-to-market ratio.

(25) To illustrate, let ω_{t} be the accrual itself. In the fair value version we would then find α = α* = β = β* = 0, while in the smoothed case we would find α = α* = 0 and β = β* = .5, despite the misspecification of the accrual process in the smoothed case.

(26) The observational structure is another part of the story (Ait-Sahalia and Mykland 2003); this surely interacts with the manner in which we test competing models of behavior in institutionally rich environments.

(27) The intertemporal issues cloud the FASB and IASB deliberations on when to expense an employee option.

(28) In addition, cost-based pricing may well create a connection on the demand side, because using, say, direct labor to allocate overhead costs calls on the direct labor factor to serve both a production role and a parametric role in the cost reimbursement calculation; this added parametric role is likely to distort the direct labor choice (Rogerson 1992). As a result, cost studies in settings of this nature, such as defense procurement or health care (e.g., Butler 1995), stretch the credulity of the classical cost function specification.

(29) The factor choices themselves are treated as exogenous in the cost driver analysis; separability issues may well influence these choices, though separability per se is largely ignored (e.g., Noreen and Soderstrom 1994). In addition, the resulting cost expression is inherently linear, thus presuming at least locally constant returns.

REFERENCES

Abel, A. B., and F. S. Mishkin. 1983. On the econometric testing of rationality-market efficiency. Review of Economics and Statistics 65 (2): 318-323.

Aboody, D., R. Kasznik, and M. Williams. 2000. Purchase versus pooling in stock-for-stock acquisitions: Why do firms care? Journal of Accounting & Economics 29 (3): 261-286.

--, J. Hughes, and J. Liu. 2002. Measuring value relevance in a (possibly) inefficient market. Journal of Accounting Research 40 (4): 965-976.

Abowd, J. M., and D. S. Kaplan. 1999. Executive compensation: Six questions that need answering. Journal of Economic Perspectives 13 (4): 145-168.

Ait-Sahalia, Y., and P. Mykland. 2003. The effects of random and discrete sampling when estimating continuous-time diffusions. Econometrica 71 (2): 483-549.

Anderson, S. W., J. W. Hesford, and S. M. Young. 2002. Factors influencing the performance of activity based costing teams: A field study of ABC model development in the automobile industry. Accounting, Organizations and Society 27 (3): 195-211.

Antle, R., and B. Nalebuff. 1991. Conservatism and auditor-client negotiations. Journal of Accounting Research 29 (Supplement): 31-54.

--, E. A. Gordon, G. Narayanamoorthy, and L. Zhou. 2003. The joint determination of audit fees, non-audit fees, and abnormal accruals. Working paper, Yale University, New Haven, CT.

Arya, A., J. C. Glover, and S. Sunder. 1998. Earnings management and the revelation principle. Review of Accounting Studies 3 (1/2): 7-34.

--, J. C. Fellingham, J. C. Glover, D. A. Schroeder, and G. Strang. 2000. Inferring transactions and financial statements. Contemporary Accounting Research 17 (3): 365-385.

Ashton, A., and R. Ashton. 1995. Judgment and Decision Making Research in Accounting and Auditing. Cambridge U.K.: Cambridge University Press.

Barro, R. J. 2000. Macroeconomics. Cambridge, MA: MIT Press.

Barth, M. E., W. H. Beaver, and W. R. Landsman. 1992. The market valuation implications of net periodic pension cost components. Journal of Accounting & Economics 15 (1): 27-62.

--, --, and --. 1996. Value-relevance of banks' fair value disclosures under SFAS No. 107. The Accounting Review 71 (4): 513-537.

Basu, S., and S. Markov. 2003. Loss function assumptions in rational expectations tests on financial analysts' earnings forecasts. Working paper, Emory University, Atlanta, GA.

Bazerman, M. H., G. Loewenstein, and D. A. Moore. 2002. Why good accountants do bad audits. Harvard Business Review 80 (11): 96-102.

Beatty, A., and J. Weber. 2003. The effects of debt contracting on voluntary accounting method changes. The Accounting Review 78 (1): 119-142.

Beaver, W. H., M. F. McNichols, and K. K. Nelson. 2003. An alternative interpretation of the discontinuity in earnings distributions. Working paper, Stanford University, Stanford, CA.

Bebchuk, L. A., and J. M. Fried. 2003. Executive compensation as an agency problem. Journal of Economic Perspectives 17 (3).

Berger, P. G., and R. Hann. 2003. The impact of SFAS No. 131 on information and monitoring. Journal of Accounting Research 41 (2): 163-223.

Bradshaw, M. T., and R. G. Sloan. 2002. GAAP versus the street: An empirical assessment of two alternative definitions of earnings. Journal of Accounting Research 40 (1): 41-66.

Burgstahler, D. C., and I. D. Dichev. 1997. Earnings management to avoid earnings decreases and losses. Journal of Accounting & Economics 24 (1): 99-126.

Bushman, R. M., and A. J. Smith. 2001. Financial accounting information and corporate governance. Journal of Accounting & Economics 32 (1-3): 237-333.

Butler, J. 1995. Hospital Cost Analysis. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Chen, K. C. W., and M. P. Schoderbek. 2000. The 1993 tax rate increase and deferred tax adjustments: A test of functional fixation. Journal of Accounting Research 38 (1): 23-44.

Christensen, J., and J. S. Demski. 2002. Accounting Theory: An Information Content Perspective. New York, NY: McGraw/Hill-Irwin.

Christensen, P. O., and G. A. Feltham. 2003. Economics of Accounting: Volume I--Information in Markets. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Clement, M. B., and S. Y. Tse. 2003. Do investors respond to analysts' forecast revisions as if forecast accuracy is all that matters? The Accounting Review 78 (1): 227-249.

Datar, S. M., and M. Gupta. 1994. Aggregation, specification and measurement errors in product costing. The Accounting Review 69 (4): 567-591.

Dopuch, N. 1993. A perspective on cost drivers. The Accounting Review 68 (3): 615-621.

Dye, R. A. 2001. An evaluation of "Essays on Disclosure" and the disclosure literature in accounting. Journal of Accounting & Economics 32 (1-3): 181-235.

--. 2002. Classifications manipulation and Nash accounting standards. Journal of Accounting Research 40 (4): 1125-1162.

Easton, P. D., and M. E. Zmijewski. 1989. Cross-sectional variation in the stock market response to accounting earnings announcements. Journal of Accounting & Economics 11 (2/3): 117-141.

Feltham, G. A., and J. A. Ohlson. 1995. Valuation and clean surplus accounting for operating and financial activities. Contemporary Accounting Research 11 (1): 689-731.

Financial Accounting Standards Board (FASB). 1980. Qualitative Characteristics of Accounting Information. Concepts Statement No. 2. Norwalk, CT: FASB.

--. 1985. Elements of Financial Statements. Concepts Statement No. 6. Norwalk, CT: FASB.

--. 2003. Accounting for Certain Instruments with Characteristics of Both Liabilities and Equity. Statement of Financial Accounting Standard No. 150. Norwalk, CT: FASB.

Foster, G., and D. W. Swenson. 1997. Measuring the success of activity-based cost management and its determinants. Journal of Management Accounting Research 9 (1): 109-141.

Frankel, R. M., M. F. Johnson, and K. K. Nelson. 2002. The relation between auditors' fees for non-audit services and earnings quality. The Accounting Review 77 (Supplement): 71-105.

Freeman, R. N., and S. Y. Tse. 1992. A nonlinear model of security price responses to unexpected earnings. Journal of Accounting Research 30 (2): 185-209.

Gibbins, M., S. Salterio, and A. Webb. 2001. Evidence about auditor-client management negotiation concerning client's financial reporting. Journal of Accounting Research 39 (3): 535-563.

Gleason, C. A., and C. M. C. Lee. 2003. Analyst forecast revisions and market price discovery. The Accounting Review 78 (1): 193-225.

Hall, B. J., and J. B. Liebman. 1998. Are CEOs really paid like bureaucrats? Quarterly Journal of Economics 113 (3): 653-691.

--, and K. J. Murphy. 2002. Stock options for undiversified executives. Journal of Accounting & Economics 33 (1): 3-42.

--, and --. 2003. The trouble with stock options. Working paper, Harvard University, Boston, MA.

Hansen, L. P., and T. J. Sargent. 1980. Formulating and estimating dynamic linear rational expectations models. Journal of Economic Dynamics and Control 2 (1): 7-46.

Hayes, R. D., and J. A. Millar. 1990. Measuring production efficiency in a not-for-profit setting. The Accounting Review 65 (3): 505-519.

Hermalin, B. E., and M. S. Weisbach. 2003. Boards of directors as an endogenously determined institution: A survey of the economic literature. Federal Reserve Bank of New York Economic Policy Review 9 (1): 7-26.

Holthausen, R. W., and R. L. Watts. 2001. The relevance of the value-relevance literature for financial accounting standard setting. Journal of Accounting & Economics 31 (1-3): 3-75.

Hopkins, P. E. 1996. The effect of financial statement classification of hybrid financial instruments on financial analysts' stock price judgments. Journal of Accounting Research 34 (Supplement): 33-50.

Hribar, P., and D. W. Collins. 2002. Errors in estimating accruals: Implications for empirical research. Journal of Accounting Research 40 (1): 105-134.

Hronsky, J. J. F., and K. A. Houghton. 2001. The meaning of a defined accounting concept: Regulatory changes and the effect on auditor decision making. Accounting, Organizations and Society 26 (2): 123-139.

Hwang, Y., J. Evans III, and V. Hegde. 1993. Product cost bias and selection of an allocation base. Journal of Management Accounting Research 5 (1): 213-242.

Imhoff, E. A., and J. K. Thomas. 1988. Economic consequences of accounting standards: The lease disclosure rule change. Journal of Accounting & Economics 10 (4): 277-310.

Ittner, C. D., W. N. Lanen, and D. F. Larcker. 2002. The association between activity-based costing and manufacturing performance. Journal of Accounting Research 40 (3): 711-726.

King, R. R. 2002. An experimental investigation of self-serving biases in an auditing trust game: The effect of group affiliation. The Accounting Review 77 (2): 265-284.

Lambert, R. A. 2001. Contracting theory and accounting. Journal of Accounting & Economics 32 (1-3): 3-87.

Lanen, W. N., and R. Thompson. 1988. Stock price reactions as surrogates for the net cash flow effects of corporate policy decisions. Journal of Accounting & Economics 10 (4): 311-334.

Lev, B., and T. Sougiannis. 1996. The capitalization, amortization, and value-relevance of R&D. Journal of Accounting & Economics 21 (1): 107-138.

Lev, B. 2003. Corporate earnings: Facts and fiction. Journal of Economic Perspectives 17 (2): 27-50.

Li, X. 2002. Career concerns of equity analysts: compensation, termination, and performance. Working paper, University of Miami, Coral Gables, FL.

Lin, H., and M. F. McNichols. 1998. Underwriting relationships, analysts' earnings forecasts and investment recommendations. Journal of Accounting & Economics 25 (1): 101-127.

Lucas, R. E. Jr. 1983. Studies in Business-Cycle Theory. Cambridge, MA: MIT Press.

--. 2003. Macroeconomic priorities. American Economic Review 93 (1): 1-14.

McDaniel, L., R. D. Martin, and L. A. Maines. 2002. Evaluating financial reporting quality: The effects of financial expertise vs. financial literacy. The Accounting Review 77 (Supplement): 139-167.

McNichols, M. F., and P. C. O'Brien. 1997. Self-selection and analyst coverage. Journal of Accounting Research 35 (Supplement): 167-199.

Mishkin, F. 1983. A Rational Expectations Approach to Macroeconomics. Chicago, IL: University of Chicago Press.

Mittelstaedt, H. F., W. D. Nichols, and P. R. Regier. 1995. SFAS No. 106 and benefit reductions in employer-sponsored retiree health care plans. The Accounting Review 70 (4): 535-556.

Mocan, H. N. 1997. Cost functions, efficiency, and quality in day care centers. Journal of Human Resources 32 (4): 861-891.

Nelson, M. W., J. A. Elliott, and R. L. Tarpley. 2002. Evidence from auditors about managers' and auditors' earnings management decisions. The Accounting Review 77 (Supplement): 175-202.

--. 2003. Behavioral evidence on the effects of principles- and rules-based standards. Accounting Horizons 17 (1): 91-104.

Noreen, E., and N. Soderstrom. 1994. Are overhead costs strictly proportional to activity? Evidence from hospital service departments. Journal of Accounting & Economics 17 (1/2): 255-278.

Rogerson, W. P. 1992. Overhead allocation and incentives for cost minimization in defense procurement. The Accounting Review 67 (4): 671-690.

Ryan, S. G. 1995. A model of accrual measurement with implications for the evolution of the book-to-market ratio. Journal of Accounting Research 33 (1): 95-112.

Savage, L. 1954. The Foundations of Statistics. New York, NY: John Wiley & Sons, Inc.

Sheffrin, S. 1996. Rational Expectations. Cambridge, U.K.: Cambridge University Press.

Sloan, R. G. 1996. Do stock prices fully reflect information in accruals and cash flows about future earnings? The Accounting Review 71 (3): 289-315.

Stocken, P., and R. Reis. 2003. Information content of asset valuation rules. Working paper, University of Pennsylvania, Philadelphia, PA.

Tan, H., R. Libby, and J. E. Hunton. 2002. Analysts' reactions to earnings preannouncement strategies. Journal of Accounting Research 40 (1): 223-246.

Trotman, K. T. 1998. Audit judgment research--Issues addressed, research methods and future directions. Accounting and Finance 38 (2): 115-156.

Verrecchia, R. E. 2001. Essays on disclosure. Journal of Accounting & Economics 32 (1-3): 98-180.

Watts, R. L., and J. L. Zimmerman. 1986. Positive Accounting Theory. Englewood Cliffs, NJ: Prentice Hall.

Watts, R. L. 2003. Conservatism in accounting, Part I: Explanations and implications. Accounting Horizons 17 (3): 207-221.

Xie, H. 2001. The mispricing of abnormal accruals. The Accounting Review 76 (3): 357-373.

Zimbelman, M. F. 1997. The effects of SAS No. 82 on auditors' attention to fraud risk factors and audit planning decisions. Journal of Accounting Research 35 (Supplement): 75-97.

Zimmerman, J. L. 2001. Conjectures regarding empirical managerial accounting research. Journal of Accounting & Economics 32 (1-3): 411-427.

Joel S. Demski

University of Florida
COPYRIGHT 2004 American Accounting Association
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author: Demski, Joel S.
Publication: Accounting Review
Date: Apr 1, 2004
Words: 13,275