THE EFFECT OF SCHOOL FINANCE REFORMS ON THE DISTRIBUTION OF SPENDING, ACADEMIC ACHIEVEMENT, AND ADULT OUTCOMES
C. Kirabo Jackson, Rucker Johnson, ...
To obtain a measure of district-specific changes in spending caused by court-mandated reforms, we regress the natural log of district per-pupil spending on district fixed effects, year fixed effects, and an indicator variable denoting a post-reform year interacted with each school district. To isolate spending changes associated with the court-ordered reforms, we also include controls for a variety of potentially confounding policies. The period under study overlaps other important policy changes (Johnson, 2013; Chay, Guryan, & Mazumder, 2009; Hoynes, Schanzenbach, & Almond, 2012). To account for the effects of these policy changes, we include county-by-year measures of school desegregation, hospital desegregation, community health centers, and state funding for kindergarten, in addition to per capita expenditures on Head Start, Title I school funding, and average childhood spending on food stamps, Aid to Families with Dependent Children (AFDC), Medicaid, and unemployment insurance.18 The coefficients on the district indicators interacted with the post-reform indicator provide the regression estimate of the change in per-pupil spending associated with the passage of a court-mandated reform for each district (net of the effects of a myriad of other policies). For each district, we take the coefficient on the interaction of the post-reform indicator with that district as our time-invariant, district-specific, court-mandated reform effect on spending ("SPEND_c").
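The extraction step just described can be sketched in a few lines. The following is a minimal, hypothetical numpy illustration on simulated data (the function name and arrays are illustrative, and the paper's county-by-year policy controls are omitted): it regresses log per-pupil spending on district and year fixed effects plus district-specific post-reform interactions, whose coefficients play the role of SPEND_c.

```python
import numpy as np

def reform_spending_effects(log_ppe, district, year, post_reform):
    """Recover district-specific reform effects on log per-pupil spending.

    Regresses log spending on district fixed effects, year fixed effects,
    and a post-reform indicator interacted with each district; the
    coefficients on the interactions are the SPEND_c measures.
    Simplified sketch: the paper's policy controls are omitted here.
    """
    districts = np.unique(district)
    years = np.unique(year)
    X_d = (district[:, None] == districts[None, :]).astype(float)  # district FE
    X_y = (year[:, None] == years[None, 1:]).astype(float)         # year FE (drop one)
    X_i = X_d * post_reform[:, None]                               # post x district
    X = np.hstack([X_d, X_y, X_i])
    beta, *_ = np.linalg.lstsq(X, log_ppe, rcond=None)
    k = len(districts) + len(years) - 1
    return dict(zip(districts, beta[k:]))  # district -> SPEND_c

# Simulated check: district B gets a 0.2 log-point post-reform increase,
# district A never experiences a reform.
district = np.array(['A'] * 6 + ['B'] * 6)
year = np.tile(np.arange(6), 2)
post = ((district == 'B') & (year >= 3)).astype(float)
log_ppe = np.where(district == 'A', 9.0, 9.1) + 0.01 * year + 0.2 * post
effects = reform_spending_effects(log_ppe, district, year, post)
```

On the simulated data the interaction coefficient for the reform district recovers the 0.2 log-point increase, net of district and year effects.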
18 The data sources used to compile these measures are detailed in Johnson (2013).
19 Consistent with the flexible district-specific effects picking up much of the variability associated with the district income quartiles, SPEND_c is much more positive for lower-income districts. However, the district income quartile category variables alone explain only four percent of the variability across districts in SPEND_c. Using all the observable variables describing reforms from Part One, interacted with the income quartile, one can explain about one-third of the variability in SPEND_c. 20 This part of the research design is similar in setup to a recent study by Johnson (2011) on the long-run impacts of court-ordered school desegregation.
Y_icb = Σ_{t−T = −20}^{20} π_{t−T} · 1{t_icb − T_c* = t − T} · SPEND_c + θ_c + κ_rb + δ_rg · b + X_icb β + Z_cb γ + W_1960,c · b + ε_icb,

where i indexes the individual, c the school district, b the year of birth, g the region of birth (defined by nine census division categories), and r the racial group. The term SPEND_c represents the court-mandated reform-induced change in per-pupil spending in district c. The flexible timing indicators, 1{t_icb − T_c* = t − T}, equal 1 if the year the individual from school district c turned age 17 (t_icb) minus the year of the initial SFR court order in school district c (T_c*) equals a value between -20 and 20. For example, values of t_icb − T_c* between -20 and -2 represent pre-treatment years; a value of -1 represents an individual who was 18 when court-mandated SFR was first enacted and thus was not exposed, which serves as the reference category; values between 0 and 12 represent school-age years of SFR exposure; and values greater than 12 represent years beyond school-age exposure.
The event-study year (t - T) is 0 when the year in which an individual was age 17 (typically, a high-school senior) equals the initial year of court-mandated SFR for the school district in which the person grew up.
Estimation of this equation provides a flexible description of subsequent adult attainment outcomes in relation to the cohort- and district-specific timing of reform-induced changes in school spending. This allows us to test for the patterns described in Figure 11. The estimates of the post-reform year indicator variables interacted with the reform-induced increase in spending, π_{t−T}, map out differences in outcomes across cohorts that experienced larger vs. smaller changes in per-pupil spending after the passage of reforms.
These estimates provide precise pictures of the exact timing of any changes in attainment outcomes in relation to the number of school-age years of exposure to SFR and its resultant changes in spending. Because the validity of our empirical design depends critically on the assumption that districts that experienced increases in school spending due to reforms were not already on a differential trajectory of improving outcomes, we also present the flexible time indicators interacted with the increase in spending for years prior to reforms. A plot of the
estimates of the pre-reform time dummies interacted with the reform-induced increase in spending, π_{t−T}, provides a visual portrait of whether there were systematic time trends in outcomes preceding enactment of court-ordered SFR in districts that would go on to experience increases or decreases in school spending after reforms. The former uses the specific timing and intensity of changes to test for causal effects of school spending; the latter provides a test of endogeneity in the timing and scope of the initial court orders. Note that in addition to testing for trending in the pre-reform cohorts, estimated effects beyond the maximum 12 school-age years of exposure (π_{t−T} for event-study years (t − T) > 12) provide an additional specification test: these should not exhibit significant trends in outcomes because the additional years do not represent any change in school-age exposure.21

This model can be viewed as a triple-difference strategy that compares the difference in outcomes between cohorts within the same district exposed to reforms for different amounts of time (variation in exposure) across districts with larger or smaller changes in school spending due to reforms (variation in intensity). Because the intensity variable SPEND_c is invariant within a district and all models include district fixed effects, the validity of the research design relies upon the exogeneity of the timing of passage of court-mandated SFRs, which is addressed and supported by the model specification in several ways. First, the model includes school district fixed effects (θ_c), race-specific birth-year fixed effects (κ_rb), and race-by-region-of-birth cohort trends (δ_rg · b), and it controls for an extensive set of child and childhood family characteristics (X_icb: parental education and occupational status, parental income, mother's marital status at birth, birth weight, child health insurance coverage, and gender).
To account for the effects of the other policies discussed above when predicting effects on outcomes, we include county-by-birth-year measures of school desegregation, hospital desegregation, community health centers, and state funding for kindergarten, in addition to per capita expenditures on Head Start (at age 4), Title I school funding (average during ages 5-17), and average childhood spending on food stamps, Aid to Families with Dependent Children (AFDC), Medicaid, and unemployment insurance (Z_cb).22 Few studies simultaneously account for so comprehensive a set of policies.
21 Only in the case in which SFR plans became more effective with time would we expect a significant relationship between outcomes and event-study years beyond 12, which we explore.
22 The data sources used to compile these measures are detailed in Johnson (2013).
Models that analyze the economic outcomes use all available person-year observations for ages 25-45 and control for a cubic in age to avoid confounding life-cycle and birth-cohort effects. To control for trends in factors hypothesized to influence the timing of SFR, the estimating equation also includes interactions between 1960 characteristics of the county of birth and linear trends in the year of birth (W_1960,c · b: 1960 county poverty rate, percent black, average education level, percent urban, and population size). Standard errors are clustered at the school district level.
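The core of the triple-difference specification is the block of event-time dummies interacted with the intensity variable SPEND_c. The sketch below (hypothetical names, numpy only) builds just that interaction block; the fixed effects and controls described above would be appended as additional columns in the full design matrix:

```python
import numpy as np

def event_study_design(event_year, spend_c, window=(-20, 20), ref=-1):
    """Build the flexible event-time dummies interacted with the
    district-specific reform-induced spending change SPEND_c (the intensity
    variable), omitting the reference year -1 (age 18 at reform, unexposed).
    The full model would append district fixed effects, race-specific cohort
    effects, trends, and controls; only the interaction block is built here."""
    lo, hi = window
    t = np.clip(np.asarray(event_year), lo, hi)
    cols = [k for k in range(lo, hi + 1) if k != ref]       # 40 event years
    D = np.stack([(t == k).astype(float) * np.asarray(spend_c) for k in cols],
                 axis=1)
    return cols, D

# Three individuals: reference cohort, first exposed cohort, fully exposed.
cols, D = event_study_design(np.array([-1, 0, 12]), np.array([0.1, 0.2, 0.3]))
```

Each individual's row is zero everywhere except at their own event year, where it equals their district's SPEND_c, so the coefficient on each column is the per-log-point effect at that exposure length.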
One potential parental response to the presence of school quality differences across public schools is to move to a different city or enroll children in a private school.23 Because we did not want to include endogenous residential moves, we identified the neighborhood and school of upbringing based only on the earliest childhood address (in most cases, 1968).24 As such, one can interpret our results as providing "intent to treat" estimates of the impacts of school spending. Because residential mobility across counties and private school attendance are more common among children from affluent families than among those from low-income families (attenuating the intent-to-treat estimates more for the former), one might expect larger effects among children from low-income families.25 Furthermore, prior research has shown that children from low-income families may be more sensitive to changes in school quality and school-related interventions (e.g., the Tennessee STAR class size experiment) than children from more advantaged family backgrounds. For these reasons, we conduct all analyses separately by childhood poverty status. A child is defined as poor if parental family income falls below two times the poverty line for any year during childhood. This measure thus captures both the near poor and the persistently poor during childhood. Results are similar when the sample is restricted to individuals who lived in their childhood residence prior to the initial court orders. The latter part of Section VI provides more discussion of the falsification and specification tests performed.
We present graphical plots, based on the estimating equation, that show the response of outcomes to
23 After SFRs in California, the share of students attending private schools rose about 50 percent (Downes & Schoeman, 1998), and educational foundations grew tremendously (Brunner & Sonstelie, 2003). Privatization grew disproportionately in districts constrained by the SFR formula to spend less than they traditionally had.
24 Among original sample children in the PSID, the average proportion of childhood spent growing up in the 1968 neighborhood was roughly two-thirds.
25 Prior research has demonstrated that while residential instability is significantly greater for poor families, and they experience intra-county moves more frequently, they most often move to neighborhoods of similar observable quality (Johnson, 2009; Kunz, Solon et al., 2008; Mare et al., 2008). Poor families are far less mobile, as measured by upward residential mobility patterns, and are less responsive to policy changes due to the greater residential location constraints they face.
reform-induced changes in per-pupil spending. This allows us to test whether outcomes improve with years of exposure and with the size of the resulting spending change. To present both the time variation and the intensity variation on the same graph, we present the estimated event-study effects for a 10 percent school spending increase, a 20 percent increase, and a 25 percent increase. If there truly is a causal effect of increased spending on adult outcomes, the event-study plot should follow the patterns in Figure 11. The aim of the event-study analysis is to clearly illustrate the patterns in the data. To provide point estimates and statistical inference tests, we complement this evidence with a 2SLS/IV regression analysis.
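Plotting the profiles for a 10, 20, and 25 percent increase is just a rescaling of the estimated coefficients. A minimal sketch (hypothetical name), assuming the paper's convention that 0.01 log point of SPEND_c corresponds to a 1 percent spending change:

```python
import numpy as np

def scaled_event_profile(pi, pct_increase):
    """Scale event-study coefficients pi_{t-T} (estimated on event-time
    dummies x SPEND_c, with SPEND_c in log points) into the predicted
    outcome path implied by a given percent spending increase, using the
    convention that 0.20 log point ~ a 20 percent increase."""
    return np.asarray(pi, dtype=float) * (pct_increase / 100.0)
```

The three plotted lines are then `scaled_event_profile(pi, 10)`, `scaled_event_profile(pi, 20)`, and `scaled_event_profile(pi, 25)` over the event years.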
c. 2SLS/IV Estimates of the Effect of School Spending on Adult Outcomes

In addition to presenting the visual evidence on the causal effects of increased school spending on outcomes using the event-study analysis, we present 2SLS/IV regression estimates, based on the same sources of exogenous variation, to quantify these relationships for all children and separately by childhood poverty status. The basic empirical approach to identifying the effect of school spending on longer-run outcomes is to compare outcomes of individuals who were exposed to different levels of school spending during childhood. Our measure of exposure to school spending is PPE5-17, the average per-pupil spending in an individual's birth district during the years when that individual was ages 5 through 17 (school-age years). A doubling of this average can be interpreted as a doubling of per-pupil spending for all 12 years of an individual's school career. Because such large increases are very rare, to give a marginal increase in this variable a more realistic interpretation, we take the natural log of this average and divide it by 0.2 (i.e., we use ln(PPE5-17)/0.2, where a one-unit change represents a 20 percent change), so that the coefficient on [ln(PPE5-17)/0.2] in a regression can be interpreted as the effect of a 20 percent increase in per-pupil spending for all 12 of one's school-age years. The standard deviation of the district-specific spending change is 0.15, so a 20 percent increase is somewhat larger than the typical increase. However, this change is well within the range of the data: one-quarter of districts in reform states experienced reform-induced spending increases of at least this amount.
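The transformation of the exposure measure described above can be written directly. A short sketch (hypothetical name) of the ln(PPE5-17)/0.2 regressor, assuming one spending value per age 5 through 17:

```python
import numpy as np

def spending_exposure(ppe_by_age):
    """PPE5-17 regressor: average per-pupil spending over ages 5-17,
    transformed as ln(mean)/0.2 so that a one-unit change in the regressor
    corresponds to a 20 percent spending increase sustained over all
    school-age years."""
    ppe = np.asarray(ppe_by_age, dtype=float)
    assert ppe.shape[-1] == 13, "expect one value per age 5..17"
    return np.log(ppe.mean(axis=-1)) / 0.2
```

Under this scaling, doubling spending at every school-age year moves the regressor by ln(2)/0.2, roughly 3.5 units, which is why a one-unit (20 percent) change is the more realistic margin for interpretation.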