
  • #16
    Thank you very much for your help. It was very useful to me.



    • #17
      Dear Professor Clyde,

      I applied the method and got seasonally adjusted values of GDP for each year.

      What I am unsure about is whether I can apply the following method:

      expand 4
      bysort year : gen qdate = yq(year, _n)
      by year: replace name = . if _n > 1
      list, sepby(year)
      gen logu = log(name)
      ipolate name qdate, gen(linear) epolate
      ipolate logu qdate, gen(log) epolate
      replace log = exp(log)
      format linear log %4.1f
      list, sepby(year)

      to convert the annual seasonally adjusted values to quarterly values.

      My goal is to find the log values for each quarter.

      I would like to know if this is econometrically correct, or should I apply some other technique?



      • #18
        There are two reasons I cannot comment on this.

        1. You have introduced a new variable, name, that has not been previously contemplated in this thread, and I have no idea what it is.

        2. You are asking what is econometrically correct--I am not an economist, econometrician, or financier. I do not know what is the accepted approach in those disciplines.



        • #19
          Thank you for your answer. I understood what you said. In fact, the variable "name" was only an illustrative example. With my data, the commands are as follows:


          clear
          * assumption: waemu holds the seasonally adjusted series that the commands below interpolate
          input int year str14 pib_real_waemu byte seasonality double(seasonally_adjusted_pib waemu)
          1994 "10.471.478" 0 12035.7 12035.7
          1995 "14.920.825" 1 13356.6 13356.6
          1996 "17.921.403" 0 19485.63 19485.63
          1997 "19.774.119" 1 18209.9 18209.9
          1998 "21.581.415" 0 23145.64 23145.64
          1999 "23.308.649" 1 21744.43 21744.43
          2000 "23.778.534" 0 25342.76 25342.76
          2001 "25.244.328" 1 23680.11 23680.11
          2002 "26.784.469" 0 28348.69 28348.69
          2003 "28.936.719" 1 27372.5 27372.5
          2004 "30.210.493" 0 31774.72 31774.72
          2005 "31.326.834" 1 29762.61 29762.61
          2006 "33.363.106" 0 34927.33 34927.33
          2007 "35.031.681" 1 33467.46 33467.46
          2008 "37.934.371" 0 39498.59 39498.59
          2009 "42162.29" 1 40598.07 40598.07
          2010 "44.690.855" 0 46255.08 46255.08
          2011 "46.633.489" 1 45069.27 45069.27
          2012 "52.228.508" 0 53792.73 53792.73
          2013 "57.326.338" 1 55762.11 55762.11
          2014 "61.859.435" 0 63423.66 63423.66
          2015 "66.668.672" 1 65104.45 65104.45
          2016 "72.160.762" 0 73724.98 73724.98
          2017 "76.872.971" 1 75308.75 75308.75
          2018 "81.763.177" 0 83327.4 83327.4
          2019 "87.210.897" 1 85646.67 85646.67
          end

          expand 4
          bysort year: gen qdate = yq(year, _n)
          by year: replace waemu =. if _n > 1
          list, sepby(year)
          gen logu = log(waemu)
          ipolate waemu qdate, gen(linear) epolate
          ipolate logu qdate, gen(log) epolate
          replace log = exp(log)
          format linear log %4.1f
          list, sepby(year)



          • #20
            Well, thank you for explaining. I still can't tell you what is considered acceptable in your field. What I can tell you is that your code interpolates (and extrapolates) seasonally adjusted GDP assuming piecewise constant linear and exponential growth, respectively. If that's the convention in your discipline, then you're good to go.
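
            In symbols: linearly interpolating logu between consecutive annual values y0 and y1 and then exponentiating gives quarterly values y_q = y0*(y1/y0)^(q/4), q = 0, 1, 2, 3; that is, a constant within-year growth factor of (y1/y0)^(1/4) per quarter.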

            One quibble with the code. In the -gen()- option of the second -ipolate- command, and in the two lines after it, you name the new variable just log. While that is legal in Stata, it's not a good idea. Whatever you think of terse variable names, log is also the name of a built-in mathematical function (one you even use earlier in your code). It makes the code potentially confusing for others to read. So I would give that variable a more distinctive name in those places.
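
            For instance (a minimal sketch; logu_x is a made-up name), the last four lines could read:

            ipolate logu qdate, gen(logu_x) epolate
            replace logu_x = exp(logu_x)
            format linear logu_x %4.1f
            list, sepby(year)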



            • #21
              Thank you very much, Dear Clyde.



              • #22
                Hi,

                I'm trying to use the Hodrick-Prescott filter with quarterly data (the smoothing coefficient is 1600).

                Stata provides the tsset command, which manages the time-series settings of a dataset.

                The tsset command does not accept decimal values or repeated values of the time variable.

                I converted my data to integer values using the functions float(), ceil(), and round().

                After doing so, some values of my time variable were repeated.

                This made it difficult to use tsset and the Hodrick-Prescott filter.

                I wanted to know if there is another way to apply the Hodrick-Prescott filter with Stata.

                My dataset (.dta) is attached.

                Thank you very much.
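
                A minimal sketch of one way to build a valid time variable (year and quarter here are placeholder names for numeric variables):

                gen qdate = yq(year, quarter)   // integer %tq value, one per quarter
                format qdate %tq
                tsset qdate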




                • #23
                  While it is easy to think of these threads as a dialog between a questioner and a responder, that is not what they are. This is a public forum. Others follow along. And still others come later from time to time searching for pre-existing answers to questions. For that reason, it is very important to keep threads on topic, so others do not waste their time on extraneous material, or fail to find something that has been hidden in a thread with a different title.

                  The question in #22 is completely off topic. Please re-post as a new thread, with an appropriate title. When you do that, please also note the preference, expressed in the FAQ that all forum participants are requested to read before posting, that attachments not be used. Rather, the preferred way to show example data is with the -dataex- command. If you are running version 16 or a fully updated version 15.1 or 14.2, -dataex- is already part of your official Stata installation. If not, run -ssc install dataex- to get it. Either way, run -help dataex- to read the simple instructions for using it. -dataex- will save you time; it is easier and quicker than typing out tables. It includes complete information about aspects of the data that are often critical to answering your question but cannot be seen from tabular displays or screenshots. It also makes it possible for those who want to help you to create a faithful representation of your example to try out their code, which in turn makes it more likely that their answer will actually work in your data.

                  If you re-post in this way, a respondent who is familiar with Stata and the ways the Hodrick-Prescott filter is implemented will be able to see your question and respond.
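
                  For reference, the workflow in miniature (gdp and qdate are placeholder variable names):

                  ssc install dataex     // only needed if -dataex- is not already installed
                  help dataex
                  dataex gdp qdate in 1/10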



                  • #24
                    I understand, Dear Clyde. Thank you very much.



                    • #25
                      Hi,

                      Please help me apply the Hodrick-Prescott filter to the following quarterly data.

                      * Example generated by -dataex-. To install: ssc install dataex
                      clear
                      input byte qtr double(gdp lgdp dlgdp)
                      2 12695.5 4.103061024 .
                      3 13025.8 4.114367418 .011306393
                      4 13356 4.12567641 .011308993
                      5 14888.3 4.166681677 .041005267
                      6 16420.5 4.207688213 .041006536
                      7 17952.8 4.248694038 .041005825
                      8 19485 4.28970041 .041006372
                      9 19166 4.282347902 -.007352508
                      10 18847 4.274993293 -.007354609
                      11 18528 4.267638637 -.007354656
                      12 18209 4.260286096 -.007352541
                      13 19443 4.286328453 .026042357
                      14 20677 4.312372026 .026043573
                      15 21911 4.338414656 .02604263
                      16 23145 4.364457185 .026042529
                      17 22794.8 4.357677623 -.006779562
                      18 22444.5 4.350898972 -.006778651
                      19 22094.3 4.344119034 -.006779938
                      20 21744 4.337339439 -.006779595
                      21 22643.5 4.353964291 .016624852
                      22 23543 4.3705908 .016626509
                      23 24442.5 4.387215362 .016624561
                      24 25342 4.403840887 .016625525
                      25 24926.5 4.396476579 -.007364307
                      26 24511 4.389111129 -.00736545
                      27 24095.5 4.381746651 -.007364479
                      28 23680 4.374381698 -.007364953
                      29 24847 4.39391724 .019535542
                      30 26014 4.413452327 .019535087
                      31 27181 4.432986919 .019534591
                      32 28348 4.452522424 .019535505
                      33 28104 4.448718684 -.00380374
                      34 27860 4.444914077 -.003804607
                      35 27616 4.441110447 -.003803629
                      36 27372 4.437306531 -.003803916
                      37 28472.5 4.45349722 .016190689
                      38 29573 4.469689499 .016192279
                      39 30673.5 4.485880355 .016190856
                      40 31774 4.502071891 .016191536
                      41 31271 4.494969523 -.007102368
                      42 30768 4.487866305 -.007103218
                      43 30265 4.480764141 -.007102164
                      44 29762 4.473662112 -.007102028
                      45 31053.3 4.491036552 .017374439
                      46 32344.5 4.508411198 .017374646
                      47 33635.8 4.52578701 .017375813
                      48 34927 4.543161284 .017374274
                      49 34562 4.538524722 -.004636562
                      50 34197 4.533888939 -.004635783
                      51 33832 4.529253212 -.004635726
                      52 33467 4.524616784 -.004636429
                      53 34974.8 4.542606358 .017989574
                      54 36482.5 4.560596402 .017990044
                      55 37990.3 4.57858535 .017988948
                      56 39498 4.596575105 .017989756
                      57 39773 4.599557775 .00298267
                      58 40048 4.602539622 .002981847
                      59 40323 4.605522678 .002983056
                      60 40598 4.608504639 .002981962
                      61 42012.3 4.622667774 .014163135
                      62 43426.5 4.636831786 .014164012
                      63 44840.8 4.650994855 .014163069
                      64 46255 4.665158685 .014163831
                      65 45958.5 4.662338441 -.002820245
                      66 45662 4.659518786 -.002819655
                      67 45365.5 4.656697939 -.002820848
                      68 45069 4.653877922 -.002820017
                      69 47249.8 4.673088213 .019210291
                      70 49430.5 4.692297758 .019209545
                      71 51611.3 4.711507758 .01921
                      72 53792 4.730717692 .019209934
                      73 54284.5 4.734623037 .003905345
                      74 54777 4.738528468 .003905431
                      75 55269.5 4.742433671 .003905204
                      76 55762 4.746338342 .003904671
                      77 57677.3 4.760315405 .013977062
                      78 59592.5 4.774292827 .013977422
                      79 61507.8 4.788269506 .01397668
                      80 63423 4.802246781 .013977275
                      81 63843.3 4.805086756 .002839975
                      82 64263.5 4.807927204 .002840448
                      83 64683.8 4.810767325 .002840121
                      84 65104 4.813607673 .002840347
                      85 67259 4.827108101 .013500428
                      86 69414 4.840608506 .013500405
                      87 71569 4.85410859 .013500084
                      88 73724 4.86760889 .0135003
                      89 74120 4.869917246 .002308356
                      90 74516 4.872225055 .002307809
                      91 74912 4.87453284 .002307785
                      92 75308 4.876841114 .002308274
                      93 77312.8 4.887827084 .01098597
                      94 79317.5 4.898813457 .010986374
                      95 81322.3 4.90979985 .010986393
                      96 83327 4.920785746 .010985896
                      97 83906.8 4.923766102 .002980356
                      98 84486.5 4.926746708 .002980606
                      99 85066.3 4.92972691 .002980202
                      100 85646 4.932707085 .002980174
                      end

                      Thank you very much.
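
                      A minimal sketch, assuming the goal is simply to detrend lgdp: because qtr is already a set of consecutive integers, -tsset- accepts it directly, and -tsfilter- then applies the Hodrick-Prescott filter (lgdp_cycle and lgdp_trend are made-up names):

                      tsset qtr
                      tsfilter hp lgdp_cycle = lgdp, smooth(1600) trend(lgdp_trend)

                      A calendar date could be used instead, e.g. gen qdate = tq(1994q1) + qtr - 2 (the base year here is hypothetical), followed by format qdate %tq and tsset qdate.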





                        • #27
                          Originally posted by Clyde Schechter:
                          Try this:
                          Code:
                          * Example generated by -dataex-. To install: ssc install dataex
                          clear
                          input int year long(pci_boston pci_newyork)
                          2001 64362 61332
                          2002 64223 61063
                          2003 65655 61044
                          2004 67421 62733
                          2005 68926 65390
                          2006 70115 67357
                          2007 71990 68173
                          2008 71445 65854
                          2009 69411 65627
                          2010 71696 67563
                          2011 72643 67187
                          2012 73815 68723
                          2013 72518 68421
                          2014 73595 69105
                          2015 74545 69971
                          end
                          
                          // CREATE AN OBSERVATION FOR 2016
                          isid year
                          expand 2 if year == 2015
                          by year, sort: replace year = 2016 if _n == 2
                          replace pci_boston = . if year == 2016
                          replace pci_newyork = . if year == 2016
                          
                          // NOW EXPAND DATA TO QUARTERLY
                          expand 4
                          by year, sort: gen month = 3*_n
                          gen quarterly_date = qofd(mdy(month, 1, year))
                          format quarterly_date %tq
                          
                          // REPLACE VALUES FOR QUARTERS OTHER THAN 2 WITH
                          // MISSING
                          replace pci_boston = . if inlist(month, 3, 9, 12)
                          replace pci_newyork = . if inlist(month, 3, 9, 12)
                          
                          // INTERPOLATE/EXTRAPOLATE
                          ipolate pci_boston quarterly_date, epolate generate(pci_boston_ep)
                          ipolate pci_newyork quarterly_date, epolate generate(pci_newyork_ep)

                          I used the above do-file for my data, which contain decimal places, but the output I get has no decimal places. How do I ensure that the decimal places are kept? Thank you.
                          Below is my do-file.

                          * Example generated using TS_Seminar data
                          clear
                          input year long(GE FDI INF INT BMD GDP)
                          1986 16.32 0.07507504 34.16 20 3.628 5.199160679
                          1987 13.85 0.092613941 26.57 25.5 3.154 4.794898878
                          1988 13.57 0.096193786 30.46 25.58333333 3.139 5.628168765
                          1989 13.41 0.285618303 35.9 28 5.8445 5.0858727
                          1990 11.85 0.251308552 10.26 30 4.928 3.328817883
                          1991 11.34 0.303188963 13.33 20 3.657 5.28182629
                          1992 14.44 0.350800517 27.66 30 4.941 3.879419253
                          1993 19.79 2.095116345 34.18 35 4.838 4.850000564
                          1994 21.71 4.27950048 70.82 33 5.25 3.299999591
                          1995 24.22 1.647296722 26.12 45 5.07 4.11241904
                          1996 22.53 1.730357096 22.14 45 6.005 4.602461045
                          1997 19.2 1.187002423 15.75 45 8.195 4.196357576
                          1998 18.33 2.237678076 13.79 37 9.36 4.700390779
                          1999 16.81 3.156999582 40.54 27 12.56 4.39999686
                          2000 20.1 3.329303379 21.29 27 13.97 3.700000115
                          2001 23.08 1.680555283 15.17 27 11.85 4
                          2002 17.11 0.955673775 23.56 24.5 12.05 4.499999699
                          2003 20.37 1.791715353 11.78 21.5 11.84 5.199999984
                          2004 20.54 1.568114186 14.84 18.5 13.08 5.59999999
                          2005 19.53 1.350866032 10.92 15.5 15.54 5.900003953
                          2006 21.77 3.116281896 12.75 12.5 17.78 6.399912419
                          2007 23.09 5.586606897 18.13 13.5 18.9 4.346819153
                          2008 24.51 9.517042508 15.97 17 20 9.149799094
                          2009 22.35 9.132935172 8.579 18 22 4.845756132
                          2010 23.92 7.855067085 8.579 13.5 11 7.899740293
                          2011 24.43 8.207966429 9.751 12.5 13.52 14.04600263
                          2012 25.08 7.855367882 9.01 15 15.6 9.292511869
                          2013 23.66 6.75033011 8.51 16 17.5 7.312525021
                          2014 24.65 8.70971294 8.01 21 11.8 3.985865624
                          2015 24.49 8.503022714 7.51 26 11.95 3.915895644
                          end

                          // CREATE AN OBSERVATION FOR 2016
                          isid year
                          expand 2 if year == 2015
                          by year, sort: replace year = 2016 if _n == 2
                          replace GDP = . if year == 2016
                          replace GE = . if year == 2016
                          replace FDI = . if year == 2016
                          replace INF = . if year == 2016
                          replace INT = . if year == 2016
                          replace BMD = . if year == 2016

                          // NOW EXPAND DATA TO QUARTERLY
                          expand 4
                          by year, sort: gen month = 3*_n
                          gen quarterly_date = qofd(mdy(month, 1, year))
                          format quarterly_date %tq

                          // REPLACE VALUES FOR QUARTERS OTHER THAN 2 WITH
                          // MISSING
                          replace GDP = . if inlist(month, 3, 9, 12)
                          replace GE = . if inlist(month, 3, 9, 12)
                          replace FDI = . if inlist(month, 3, 9, 12)
                          replace INF = . if inlist(month, 3, 9, 12)
                          replace INT = . if inlist(month, 3, 9, 12)
                          replace BMD = . if inlist(month, 3, 9, 12)


                          // INTERPOLATE/EXTRAPOLATE
                          ipolate GDP quarterly_date, epolate generate(GDP_ep)
                          ipolate GE quarterly_date, epolate generate(GE_ep)
                          ipolate FDI quarterly_date, epolate generate(FDI_ep)
                          ipolate INF quarterly_date, epolate generate(INF_ep)
                          ipolate INT quarterly_date, epolate generate(INT_ep)
                          ipolate BMD quarterly_date, epolate generate(BMD_ep)



                          • #28
                            It is important to read and understand code before adapting it to new data. Code is necessarily written to work with specific data sets, and different kinds of data may break it. That is what happened here.

                            If you look at the original example data for which the code you quoted in #27 was written, all of the variables are integer-valued. This is reflected in the -input- command near the top where Stata is told to input the data as -long- (which means long integer data type). It appears that you simply pasted your data, where the values are not integers, over the data in that -dataex- command. So now Stata has been told to input these numbers as long integers. Well, it may not make sense to do it, but Stata does what you told it to do: it strips the decimal places away because you specified that you wanted integer variables.

                            The solution is simple: change the word -long- to -float- or -double-, and that problem goes away.
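
                            Concretely, the first line of the -input- block in #27 would then read:

                            input year double(GE FDI INF INT BMD GDP)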

                            But more generally, when you adapt code to different data, you need to understand exactly what it is doing first: simply substituting new data can cause problems, as here. There is another aspect of this that I find odd. It is clear that you did not create this code by -use-ing your actual data set and running -dataex-. If you had, -dataex- would not have specified -long-; it would have specified -float- or -double-. Instead, you just pasted your data into the original code, without changing anything else. Why did you do that? The purpose of -dataex- is not to serve as a way to load data into Stata for production purposes. It is specifically intended to create code that allows people using Statalist to exchange short examples of data without using attachments or providing inadequate information (as one would get from -list- output or HTML tables or the like). But in regular use, -dataex- would generally not be used to create data sets: Stata has other regular commands, as well as the Data Editor, for that purpose.



                            • #29
                              Dear Clyde Schechter,
                              Thank you for providing the code for expanding annual data into quarterly data. My question is: any hints on expanding an annual PANEL into a quarterly panel? The data I am working on look like this. (The data are long, so I sampled 3 countries and 6 years; normally I have 33 countries and 28 years, and my goal is to expand this to 3,696 quarterly observations.)

                              Country CountryID Year Density Penetration
                              Australia 1 2015 2926 5.8
                              Australia 1 2016 2660 5.3
                              Australia 1 2017 2540 4.7
                              Australia 1 2018 2383 4.3
                              Australia 1 2019 2012 3.8
                              Australia 1 2020 1835 3.5
                              Austria 2 2015 2227 5
                              Austria 2 2016 2143 4.7
                              Austria 2 2017 2176 4.6
                              Austria 2 2018 2284 4.5
                              Austria 2 2019 2219 4.4
                              Austria 2 2020 2285 4.7
                              Belgium 3 2015 2653 6.4
                              Belgium 3 2016 2613 6.2
                              Belgium 3 2017 2653 6
                              Belgium 3 2018 2903 6.1
                              Belgium 3 2019 2853 6.1
                              Belgium 3 2020 2744 6.1

                              Thank you so much everyone!



                              • #30
                                #29 is a new question but most of the likely answers have already been given in this thread.

                                Backing up, why do you want to do this? If you have other data that are quarterly, a merge that is 1:m or m:1 will automatically expand your yearly dataset here when merging with quarterly data.

                                Otherwise, if you want 4 quarterly values to be different reflecting trends and/or seasonality, the question comes back to you as what interpolation or smoothing makes sense for your data and goals, and what other information you have.
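
                                For instance, a minimal sketch of the merge route (quarterly.dta, annual_panel.dta, and their contents are hypothetical):

                                use quarterly, clear                                       // one row per CountryID and quarter, with a Year variable
                                merge m:1 CountryID Year using annual_panel, keep(match) nogenerate

                                Each annual record is then repeated across its four matching quarters, after which any within-year interpolation like that shown earlier in this thread can be applied.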

