  • Multiple treatments

    Hi
    I have a panel of roughly 1,000 firms over 10 years (2004-2013) in 2 US states (CA and PA). These firms adopt a technology in different years during the panel. However, firms in CA start getting audited in 2006, and firms in PA in 2009. Firms can adopt the technology either before or after the audit.

    In our analysis, we are trying to study whether the technology and the audit have any impact on output. The model that we are running is as follows:

    reghdfe output 1.technology##1.audit , absorb(firm_id year)

    We are interested in the effect of technology on output both before and after the audit (and posit that the technology may be used in different ways under and outside the audit, hence the interaction term). So one way to estimate this effect is to examine the interaction term in the above model. Can we also run the following code:

    reghdfe output 1.technology if audit == 0, absorb(firm_id year)
    reghdfe output 1.technology if audit == 1, absorb(firm_id year)

    This will also give us the effect of technology both before and under the audit. However, what would be the difference in the interpretation of the coefficients between the two approaches?

  • #2
    In the stratified analysis (i.e., 2 separate regressions), the coefficient of technology will be an estimate of the expected difference in output associated with adopting the technology, conditional on not being audited (in the first regression) or conditional on being audited (in the second). The effect of being audited itself cannot be estimated with this approach, and there is really no good way to contrast the technology effects across the two audit conditions.

    In the interaction analysis, because of the absorption of firm_id and year, and because it is likely that 1.technology and 1.audit themselves will be collinear with the firm and year effects, there probably will not be coefficients for them. But you will get a coefficient for the interaction term 1.technology#1.audit, and this will give you the contrast between the technology effects across the two audit conditions (or, equivalently, the difference between the audit effect when the technology is adopted and when it isn't).

    I also hasten to add that if you had additional covariates in the model, that would lead to other differences between the two approaches. But with just this simple model, there are no other issues.
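
    For concreteness, here is a minimal sketch (assuming technology and audit are 0/1 indicators) of how the simple effects could be pulled out of the interaction model with lincom. If the main effect of 1.technology is dropped as collinear with the fixed effects, only the last lincom (the interaction contrast) is meaningful.

    * sketch only: fit the interaction model, then form linear combinations
    reghdfe output 1.technology##1.audit, absorb(firm_id year)

    * technology effect when not audited (audit == 0), if the main effect is identified
    lincom 1.technology

    * technology effect when audited (audit == 1)
    lincom 1.technology + 1.technology#1.audit

    * difference between the two technology effects (the interaction term itself)
    lincom 1.technology#1.audit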

    • #3
      Thanks Clyde -

      Had a follow-up question: how would you test the parallel trends assumption (for the effect of the technology) in such a situation? The implementation of the audit will confound the identification of the effect.

      Thanks!

      • #4
        Yes, well, the classical parallel trends assumption isn't directly applicable here because you have a second effect (the audit) in play as well.

        I think I would look at mean outcomes at each time period prior to the adoption of the technology, separately in the presence and absence of the audit. Then I would graph those and see whether the trends look parallel within the audit condition and within the non-audit condition.

        I generally advise assessing the parallel trends assumption graphically rather than testing it formally. While formal testing is possible in principle, many data sets used for this kind of study, if a prospective sample size analysis has been done at all, are collected with just enough data to adequately power the test of the effect of interest itself. That necessarily means they are underpowered for testing parallel trends (because that uses a subset of the data, typically half of it or less), and likely grossly underpowered for testing parallel trends separately in two subsets of that! So unless you have a very large data set, a formal test of parallel trends is likely to conclude that the trends are parallel whether, in reality, they are or not.
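
        One rough sketch of the graphical check described above, with the caveat that the variable adopt_year (each firm's technology adoption year) and the early/late adopter split are assumptions for illustration, not something given in this thread:

        * keep pre-adoption firm-years, average output by year, audit status,
        * and an illustrative early/late adopter grouping, then plot the trends
        preserve
        keep if year < adopt_year
        gen byte early = adopt_year <= 2008    // illustrative split; adjust as needed
        collapse (mean) mean_output = output, by(year audit early)
        twoway (line mean_output year if early == 1, sort) ///
               (line mean_output year if early == 0, sort), ///
            by(audit, note("")) ///
            legend(order(1 "Early adopters" 2 "Late adopters")) ///
            ytitle("Mean pre-adoption output") xtitle("Year")
        restore

        If the early- and late-adopter lines move roughly in parallel within each audit panel, that is informal support for the assumption in that condition.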
