  • #16
    Dear Clyde, thank you very much for your help!
    The "event_id" identifier will substitute for the "co_id" identifier in order to account for several mergers by the same company.
    And I agree with you: even if I take a [-180; +180] event window, it is extremely rare for a company to get involved in another merger during that period.
    I understand that missing values can be a problem for the code you have written. Would deleting companies with missing values be a valid solution (at least for selected event windows)?
    Thank you in advance.



    • #17
      The treatment of missing values requires thinking about it from a finance point of view. The code, as written, omits observations of abnormal_return that have missing values from the calculation of total_abnl_return. (Equivalently, you can think of those observations as being treated as if the abnormal_return were zero.) If you drop those observations from the data set altogether, the result will be exactly the same.
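
      To make the equivalence concrete, here is a minimal sketch using the thread's variable names (event_id, abnormal_return, total_abnl_return) and ignoring the event-window restriction for brevity; -egen, total()- treats missing values as zero in the sum, so dropping them first changes nothing:

      Code:
      * sum of abnormal returns by event, missing observations left in
      bysort event_id: egen total_abnl_return = total(abnormal_return)
      * dropping the missing observations first gives exactly the same totals
      drop if missing(abnormal_return)
      bysort event_id: egen total_check = total(abnormal_return)
      assert total_abnl_return == total_check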

      Dealing with missing values requires understanding why they are missing in the first place, and what, if anything, can be said about the values that would be observed if they were not missing. There is no generic solution to this problem, and it depends far more on the underlying science than on statistical fixes.

      There are a number of people involved in finance on this forum. Perhaps one of them can advise you on the best treatment of missing values for this kind of data.



      • #18
        Dear Clyde!

        Thank you for the previous advice you provided me with.
        I have one more general issue to address. In my previous example I was struggling to calculate the sum of abnormal returns for event windows around the merger date.
        Can you please help me with the following task: I am struggling to write code that generates a new variable which simply displays the returns (or the market index) for all dates in the selected event window around the merger date. In other words, whereas your previous code calculated the sum of abnormal returns in the event window directly, I now need code that simply displays the market returns and the company returns accordingly.

        Would it be correct to transform your code in the following way?

        Code:
        by event_id (ann_date), sort: assert ann_date[1] == ann_date[_N]
        gen byte in_window = inrange(date, merger_date-180, merger_date+180)
        gen displayed_return = company_return*in_window
        Thank you in advance.

        P.S. I should also mention that your code counts calendar days, while my case would be better handled using trading days (as my "date" variable shows trading days). How can this best be addressed?
        Last edited by Misha_Iasinskyi; 05 Dec 2014, 19:41.



        • #19
          With regard to the code you wrote, displayed_return will be zero for days outside the window. Is that what you want? Or do you want it to be missing outside the window? If the latter, change the last line in your code to
          Code:
          gen displayed_return = cond(in_window, company_return, .)
          If all you're going to do is sum over the window, then it doesn't matter whether the out-of-window values are zero or missing; but if the individual values are of interest, then do whichever is appropriate for your problem.

          As for trading days, Stata has business calendars that deal with that problem. I never need to use them and know nothing about them. But you can check the on-line help and the manuals. They are designed for precisely the problem you have described.
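
          For what it's worth, a minimal sketch of the business-calendar route, assuming your -date- variable holds the trading days; the calendar name mycal and the window width are only placeholders:

          Code:
          * build a business calendar from the trading dates present in the data (Stata 13+)
          bcal create mycal, from(date) purpose(trading days) generate(bdate) replace
          format bdate %tbmycal
          * a window measured in *trading* days around the announcement date
          gen bann = bofd("mycal", ann_date)    // missing if ann_date is not itself a trading day
          gen byte in_window_td = inrange(bdate, bann - 180, bann + 180) if !missing(bann)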



          • #20
            Dear Misha_Iasinskyi,

            Recently, I have also been doing an event study, on top management turnover. I followed the code on this page http://dss.princeton.edu/online_help...ventstudy.html , which says that abnormal return = ret - predicted return. I used this method and my result is negative. So I wonder what your result is, and whether you used the estimation window to predict the normal return. Besides, if you could describe your event in detail, you could send it to my e-mail: [email protected]. Thanks.
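
            For reference, the calculation that page describes amounts to something like the following market-model sketch, one firm at a time (the names ret, market_return, est_window, and evt_window are illustrative, not the Princeton code verbatim):

            Code:
            * estimate the normal-return model over the estimation window
            regress ret market_return if est_window
            * predicted (normal) return, then abnormal return over the event window
            predict normal_return
            gen abnormal_return = ret - normal_return if evt_window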

            Regards!



            • #21
              I might be able to help you with problem 1. Despite all the effort you put into preparing your question, however, you didn't provide the information I would actually need to help you. The situation seems to be that the data sets are not matching in the way you expect them to. So you need to post excerpts of Stata datasets 1, 2, and 3. Use the -dataex- command to do that. And be absolutely sure to post the examples in such a way that the observations you show from the data sets properly match up with each other. Without that, I can't discover what is going wrong with the data management.

              If you are running version 15.1 or a fully updated version 14.2, -dataex- is already part of your official Stata installation. If not, run -ssc install dataex- to get it. Either way, run -help dataex- to read the simple instructions for using it. -dataex- will save you time; it is easier and quicker than typing out tables. It includes complete information about aspects of the data that are often critical to answering your question but cannot be seen from tabular displays or screenshots. It also makes it possible for those who want to help you to create a faithful representation of your example to try out their code, which in turn makes it more likely that their answer will actually work in your data.
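
              For example (using variable names that have come up earlier in the thread; adjust the list to whatever each dataset actually contains):

              Code:
              * only needed if -dataex- is not already installed:
              * ssc install dataex
              * then post an excerpt of the observations that should match across the datasets, e.g.
              dataex event_id co_id ann_date date closing_price in 1/20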

              After we have figured out how to get your data merged properly (assuming it is possible at all), then I may be able to help you further with the event study. Actually there is a good chance that once the data are properly merged, the rest of the code will run without difficulties.

              By the way, NEVER use -merge m:m-. It is a trap for the unwary that produces data salad. There is a circumstance in which it is useful, but that is sufficiently rare that most people will never encounter it in a lifetime. Even then, there are other ways of doing what it does.
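
              For what it's worth, the usual alternatives look like this (a sketch only; the keys and file names here are placeholders, not your actual files):

              Code:
              * match many event-level observations to one firm-level observation
              merge m:1 co_id using firm_level_data
              * or, to form all pairwise combinations of observations sharing a key
              joinby co_id using daily_returns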

              By the way, as a complete aside, you should not use the formula -gen co_ret = ln(closing_price/l.closing_price)-. It is, at best, an approximation. And it is only a decent approximation if the return is small. And it is enormously computationally inefficient. Use -gen co_ret = closing_price/l.closing_price - 1-. That one is exactly correct and runs hundreds of times faster.
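
              In code, a minimal illustration of the two formulas (assuming the data have been -tsset- or -xtset- so that the lag operator L. works):

              Code:
              * exact simple return
              gen co_ret = closing_price/L.closing_price - 1
              * log return: only an approximation to the simple return, and only a decent one when returns are small
              gen co_ret_log = ln(closing_price/L.closing_price)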



              • #22
                Hello,

                Code:
                * Example generated by -dataex-. To install: ssc install dataex
                clear
                input long location float lconsum byte(randomgrp heatscore) float(calday calmonth daymntemp tp treatalt tr_alt_heat treat_numb_of_month tr_alt_heat_month)
                500001  4.332319 0 8 20820 684        -12 0 0 0 -397 0
                500001  4.396276 0 8 20821 684     -17.85 0 0 0 -396 0
                500001 4.4473995 0 8 20822 684 -22.141666 0 0 0 -395 0
                500001 4.4349075 0 8 20823 684   -15.2875 0 0 0 -394 0
                500001 4.3301753 0 8 20824 684    -10.625 0 0 0 -393 0
                500001  3.985716 0 8 20825 684        -14 0 0 0 -392 0
                500001 4.2141517 0 8 20826 684 -18.729166 0 0 0 -391 0
                500001 4.4054755 0 8 20827 684   -18.2625 0 0 0 -390 0
                500001 4.2378565 0 8 20828 684  -19.46667 0 0 0 -389 0
                500001 4.3244976 0 8 20829 684    -18.925 0 0 0 -388 0
                end
                format %td calday
                format %tm calmonth

                I have panel data for 2017-2018 from an RCT. The treatment (which started on February 2, 2018) is a specific type of bill sent to a household, which includes a comparison between the household's energy use and that of its neighbors, as well as a photograph of the household's home showing its energy use. It is expected that the treatment will reduce the energy use of treated households.

                My model specification is a diff-in-diff estimator that models energy use conditional on a treatment-group indicator, a post-treatment indicator, a heatscore indicator, month-of-year indicators, a linear time trend, a non-linear control for temperature (via the 'mkspline' command), and household fixed effects. This is estimated by OLS using the standard fixed-effects estimator, with robust standard errors clustered by household.

                My variables:

                location; household’s location id
                lconsum; log of energy consumption
                tp; post-treatment variable; gen tp = (calday >= td(02feb2018))
                randomgrp; treatment-group indicator (0 = control; 1, 2, 3 = the three treatment groups)
                heatscore; score of a household based on its photograph (can be 1, 2, …, 10)
                calday; daily date, e.g. 01jan2017
                calmonth; monthly date, e.g. 2017m1
                daymntemp; temperature
                treatalt; gen treatalt = (calday >= td(02feb2018) & randomgrp>0)
                tr_alt_heat; treatment indicator; gen tr_alt_heat = treatalt*heatscore

                I have the following code for a diff-in-diff model of my data (as discussed at https://www.statalist.org/forums/for...erences-model/ , the estimation could be done differently – for example, with xtreg and/or a full factorial – but I was asked to use this code):
                Code:
                mkspline daymntemp1 17.222221 daymntemp2 21.111111 daymntemp3 23.888889 daymntemp4 = daymntemp
                areg lconsum tr_alt_heat tp daymntemp1-daymntemp4 i.month c.calmonth, absorb(location) vce(cluster location)

                Now, I need to do an event study.

                First, I generate a variable showing the number of days before/after the date when the treatment starts:
                Code:
                gen treat_numb_of_month = calday - td(02feb2018)

                Then, I guess, I should generate the interaction between my treatment variable, tr_alt_heat, and the one I have just created:
                Code:
                gen tr_alt_heat_month = tr_alt_heat*treat_numb_of_month

                Now, the code for the event-study regression:
                Code:
                areg lconsum tr_alt_heat_month tp daymntemp1-daymntemp4 i.month c.calmonth, absorb(location) vce(cluster location)

                1. I am not sure about the code above.
                2. In addition, I need to draw an event-study figure; if I am not mistaken, it should have treat_numb_of_month on the x-axis and the coefficients on the interaction variable, tr_alt_heat_month, on the y-axis. I do not know how to plot this figure. I tried coefplot, but it only plots a single coefficient value, without treat_numb_of_month on the x-axis; I also tried scatter, but I cannot plot the coefficients that way. (A hypothetical sketch of one common approach appears below.)
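
                A hypothetical sketch, not the specification you were asked to use: a conventional event-study figure estimates one coefficient per period relative to treatment and plots those coefficients. With the variable names above (and the spline terms from the mkspline line), that might look like:

                Code:
                * -coefplot- is a user-written command: ssc install coefplot
                gen byte anytreat = randomgrp > 0
                gen rel_month = calmonth - tm(2018m2) + 13       // shift so factor-variable levels are nonnegative; 12 = 2018m1
                areg lconsum ib12.rel_month#1.anytreat i.rel_month daymntemp1-daymntemp4, ///
                    absorb(location) vce(cluster location)
                coefplot, keep(*.rel_month#1.anytreat) vertical yline(0) ///
                    xtitle("Months relative to treatment (13 = first treated month)")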

                Thank you in advance!
                Last edited by Katherine Adams; 30 Jan 2019, 13:38.



                • #23
                  Hello Stata users,
                  I want to use an event-study approach to see the impact of the Minimum Unit Pricing (MUP) legislation for alcohol, introduced in Scotland in 2018, on alcohol-related hospital admissions.
                  The formula below is my statistical model. The hypothesis I want to test is: after the MUP policy is imposed, in other words after alcohol prices increase, does that have an impact on the number of patients admitted to hospital because of alcohol? My question is how to estimate this model. (Whenever I search Google for the event-study approach, it only shows me tutorials for finance.)
                  Thanks
                  Averkos

                  $Y_{it} = \alpha_i + \lambda_t + \sum_{\kappa=-\infty}^{\infty} \theta_\kappa \, 1\{K_{it} = \kappa\} + \varepsilon_{it}$

                  avgconsunitsoff "AverageAlcoholConsumptionUnitsOff-Trade"
                  avgconsltroff "AverageAlcoholConsumptionLtrOff-Trade"
                  totalvlmoff "TotalAlcoholVolumeOff-Trade"
                  avgalcpoff "AverageAlcoholPriceOff-Trade"
                  avgalcpt "AverageAlcoholPriceOn&Off-Trade"
                  totalvlmt "TotalAlcoholVolumeOn&Off-Trade"
                  avgconsltrst "AverageAlcoholConsumptionLtrOn&Off-Trade"
                  avgconsunitst "AverageAlcoholConsumptionUnitsOn&Off-Trade"
                  alchospadm "Alcohol-related hospital admissions"
                  acuteint "Acute intoxication due to alcohol intake"
                  population "Entire population estimate"
                  proportiond "Estimated proportion of adult drinkers"
                  "Estimated adult drinker population"
                  Code:
                  input int year str8 country float(avgconsunitsoff avgconsltroff totalvlmoff avgalcpoff avgconsunitst avgconsltrst totalvlmt avgalcpt) long population float proportiond double drinkingpop long alchospadm int acuteint float(YearD CountryD Diff _diff)
                  2000 "EngWales" 514.79065 5.147907 213991.3 .3893808 9.868109 986.8109 410203.5 .6307347 41568600 90 37411740 . . 0 0 0 0
                  2001 "EngWales" 543.3936 5.433936 227490.4 .3859698 10.14759 1014.759 424826.3 .6407194 41864752 88 36840981.76 . . 0 0 0 0
                  2002 "EngWales" 561.6189 5.616189 236735.14 .3856243 10.30378 1030.3779 434327.75 .6481868 42152278 88 37094004.64 . . 0 0 0 0
                  2003 "EngWales" 578.21014 5.782102 245374.47 .3901539 10.398236 1039.8236 441268.9 .6594344 42436900 89 37768841 . . 0 0 0 0
                  2004 "EngWales" 589.84705 5.898471 252217.1 .3924679 10.423694 1042.3694 445714.5 .669508 42759747 88 37628577.36 236770 9760 0 0 0 0
                  2005 "EngWales" 609.3009 6.093009 263212.2 .39673 10.483427 1048.3427 452874.1 .6790578 43199048 86.99779 37582215.58 253370 12240 0 0 0 0
                  2006 "EngWales" 611.5901 6.115901 266551.03 .40017945 10.398604 1039.8605 453205.3 .6949785 43583285 86.5972 37741903.29 277060 15360 0 0 0 0
                  2007 "EngWales" 629.9857 6.299857 277261.63 .4044064 10.294744 1029.4745 453079.8 .6976805 44010784 87.20243 38378474.31 287450 15860 0 0 0 0
                  2008 "EngWales" 625.3168 6.253168 277827.88 .4189067 9.868193 986.8192 438443.2 .7166591 44429942 85.88465 38158500.86 293520 15780 0 0 0 0
                  2009 "EngWales" 631.8096 6.318097 282888.9 .4361536 9.752556 975.2556 436664.7 .7286623 44774391 85.10242 38104092.82 300930 15390 0 0 0 0
                  2010 "EngWales" 630.8068 6.308067 284977.28 .4522443 9.567835 956.7835 432242.7 .7481729 45176640 84.73016 38278240.49 315870 17540 0 0 0 0
                  2011 "EngWales" 623.0678 6.230678 284025.2 .48290434 9.356197 935.6196 426501.8 .7985248 45584956 83 37835513.48 325910 17770 0 0 0 0
                  2012 "EngWales" 613.7817 6.137817 281609.28 .5021546 9.171845 917.1845 420813.6 .8349898 45881018 81.698 37483874.09 327190 17610 0 0 0 0
                  2013 "EngWales" 610.0798 6.100798 281758.2 .52371943 9.008068 900.8069 416027.1 .8573536 46183826 82.344 38029609.68 321660 15900 0 0 0 0
                  2014 "EngWales" 615.7837 6.157837 286648.9 .5307747 9.001105 900.1106 419003.8 .8714793 46550257 81.69073 38027248.54 329970 16750 0 0 0 0
                  2015 "EngWales" 622.3548 6.223548 292040.03 .5322524 9.007735 900.7735 422688.1 .8882245 46925010 82.8 38853908.28 330010 16070 0 0 0 0
                  2016 "EngWales" 624.3541 6.24354 295288.47 .5345036 8.981342 898.1342 424772.9 .8973002 47295038 80.8 38214390.7 339280 16320 0 0 0 0
                  2017 "EngWales" 630.0804 6.300804 299586.63 .55171394 8.974834 897.4834 426729.7 .9188067 47547364 81.5 38751102 337110 14160 0 0 0 0
                  2018 "EngWales" 648.8198 6.488198 310166 .5633827 9.127992 912.7992 436360.4 .9362261 47804642 81.9 39152002 337870 13700 1 0 0 0
                  2019 "EngWales" 654.9763 6.549763 315060 .5736653 9.146619 914.6619 439975.25 .9545288 48102501 81.9 39395948 357660 15170 1 0 0 0
                  2000 "Sct" 649.6078 6.496078 26492.156 .3867758 10.897624 1089.7625 44442.44 .6140769 4078177 89.72 3658940.404 . . 0 1 0 0
                  2001 "Sct" 663.8988 6.638988 27178.86 .3867254 11.084907 1108.4906 45379.68 .6274047 4093826 89.54 3665611.8 25782 8220 0 1 0 0
                  2002 "Sct" 678.5484 6.785484 27876.01 .3857644 10.986223 1098.6223 45133.42 .6102332 4108183 89.36 3671072.329 27072 8745 0 1 0 0
                  2003 "Sct" 701.8754 7.018754 28934.06 .3885694 11.27755 1127.7549 46490.49 .6175265 4122393 88.9367 3666320.295 27960 8346 0 1 0 0
                  2004 "Sct" 715.2648 7.152648 29658.84 .38729465 11.50004 1150.004 47685.54 .6247438 4146554 89 3690433.06 27687 7497 0 1 0 0
                  2005 "Sct" 732.8439 7.328439 30619.27 .3947784 11.66955 1166.955 48757.05 .6413528 4178143 88.8 3710190.984 29094 7053 0 1 0 0
                  2006 "Sct" 739.8911 7.398911 31126.217 .3988899 11.641888 1164.1888 48975.85 .6580011 4206865 88.6 3727282.39 28758 6579 0 1 0 0
                  2007 "Sct" 755.9628 7.559628 32101.02 .40030175 11.691287 1169.1287 49645.59 .6732001 4246375 88.4 3753795.5 29661 7116 0 1 0 0
                  2008 "Sct" 766.1778 7.661778 32802.04 .4130346 11.6125 1161.2501 49716.1 .684679 4281257 88.2836 3779647.805 31005 7866 0 1 0 0
                  2009 "Sct" 786.3604 7.863605 33905.15 .4304926 11.58546 1158.546 49952.51 .7017193 4311655 87.1363 3757016.636 30483 7491 0 1 0 0
                  2010 "Sct" 790.5214 7.905214 34073.844 .4498936 11.468416 1146.8416 49553.82 .7297109 4344402 85.3275 3706969.617 28644 6852 0 1 0 0
                  2011 "Sct" 782.8397 7.828397 34318.1 .4769129 11.053094 1105.3094 48454.52 .7699904 4383797 85.5175 3748913.599 28026 7176 0 1 0 0
                  2012 "Sct" 740.6321 7.406321 32580.21 .4973112 10.591862 1059.1863 46593.33 .8081049 4398974 85.0143 3739756.953 27690 9063 0 1 0 0
                  2013 "Sct" 734.7968 7.347968 32449.514 .52005833 10.333633 1033.3632 45634.57 .8241854 4416121 83.7044 3696487.586 25983 8370 0 1 0 0
                  2014 "Sct" 745.8414 7.458414 33089.695 .52344036 10.394287 1039.4287 46114.87 .8458828 4436559 84.0854 3730498.381 26100 8025 0 1 0 0
                  2015 "Sct" 745.514 7.455141 33255.43 .52493715 10.349268 1034.9268 46165.37 .8677589 4460738 84.2505 3758194.069 25254 7455 0 1 0 0
                  2016 "Sct" 735.9532 7.359533 33035.344 .5279879 10.24759 1024.759 45999.21 .8827349 4488783 83.8 3761600.154 25197 7347 0 1 0 0
                  2017 "Sct" 740.387 7.40387 33371.89 .5469399 10.256257 1025.6257 46228.62 .8962643 4507358 83.1 1270320 25503 7584 0 1 0 0
                  2018 "Sct" 720.0552 7.200552 32536.4 .5943602 9.939983 993.9983 44914.79 .9504272 4518598 83.6 1193197.5 24906 7425 1 1 1 1
                  2019 "Sct" 721.7499 7.217499 32716.43 .6208439 9.940753 994.0753 45060.76 .9875119 4532932

