
  • AR(1) simulations

    Hi,

    I'm new to stata programming and need some help on a simulation exercise.

    I need to simulate an AR(1) process defined by:

    y_t = B * y_{t-1} + e_t

    e_t ~ N(0, 1)

    y_0 = 0

    For this exercise, B takes three values: (i) 1.0, (ii) 0.9, and (iii) 0.5. I need to simulate 1,000 series of 100 observations for each of these three values of B.

    Then I'm asked to estimate each series by OLS, so I can compare the results and comment on the properties of the estimators.

    Can anyone help me implement a routine for this? I'm currently using Stata 14.

    Thanks in advance.

  • #2
    No idea if this is efficient but:

    Code:
    clear all
    version 14.2
    set seed 123
    set matsize 1000
    set more off

    * one row per value of B, one column per replication
    mat def coefs = J(3, 1000, .)
    loc bs 1 .9 .5
    loc bcount = 1

    foreach b in `bs' {

        forvalues i = 1/1000 {
            qui {
                clear
                set obs 100

                * build one AR(1) series: y = 0 in obs 1, then iterate forward
                gen e = rnormal()
                gen y = 0 in 1

                forvalues j = 2/100 {
                    replace y = `b' * y[_n-1] + e in `j'
                }
                gen y_1 = y[_n-1]

                * keep the OLS slope estimate for this replication
                reg y y_1
                mat reg = r(table)
                mat coefs[`bcount', `i'] = reg[1,1]
            }
        }
        loc ++bcount

    }
    * transpose so that each B's 1,000 estimates become one variable
    clear
    mat coefs = coefs'
    svmat coefs
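
    To compare the estimators, as asked in #1, a possible next step (a minimal sketch, assuming the transpose above so that coefs1, coefs2 and coefs3 hold the 1,000 estimates for B = 1.0, 0.9 and 0.5 respectively):

    Code:
    * compare the sampling distributions of the three slope estimators
    summarize coefs1 coefs2 coefs3, detail

    * histograms show the asymmetry that builds up as B approaches 1
    histogram coefs1, name(b10, replace) title("B = 1.0")
    histogram coefs2, name(b09, replace) title("B = 0.9")
    histogram coefs3, name(b05, replace) title("B = 0.5")
    graph combine b10 b09 b05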

    • #3
      Hi Julian Duggan

      The code worked perfectly. I needed to show that the distribution of the estimates grows asymmetric as B tends to 1.

      Thank you very much.

      • #4
        This thread has attracted attention outside Statalist, so the following may be of interest. Focusing only on the first part of #1, as quoted below, I offer this shorter code:

        I need to simulate an AR(1) process defined by:

        y_t = B * y_{t-1} + e_t

        e_t ~ N(0, 1)

        y_0 = 0

        For this exercise, B takes three values: (i) 1.0, (ii) 0.9, and (iii) 0.5. I need to simulate 1,000 series of 100 observations for each of these three values of B.

        Code:
        clear all
        set seed 123
        set obs 100000

        * 1,000 series of 100 observations each, held as a panel dataset
        egen i = seq(), block(100)
        bysort i : gen t = _n
        xtset i t

        gen e = .
        tokenize "1.0 0.9 0.5"

        * replace works from the top down, so L.y`j' is already updated
        forval j = 1/3 {
            qui replace e = rnormal()
            gen y`j' = 0 if t == 1
            replace y`j' = ``j'' * L.y`j' + e if t > 1
        }

        * quick look at the first few series; c(L) breaks lines between panels
        line y1 t if _n < 1000, c(L) scheme(s1color)
        The graph is propaganda, to show that all looks well. The main coding point is that you don't need so many loops. Your multiple series can be a panel dataset and you don't need to loop over observations, at least for this part of the problem.

        The regressions in #1 are better done with rangestat (SSC).
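
        For instance, here is a minimal sketch (untested, shown for y1 only; y2 and y3 are analogous). It assumes rangestat's (reg) statistic with interval(t . .) to use all observations within each panel, and rangestat's convention of storing the slope in b_varname; rangestat installs with ssc install rangestat.

        Code:
        * rangestat does not accept time-series operators, so lag explicitly
        gen y1lag = L.y1

        * one OLS regression of y1 on its lag per series i
        rangestat (reg) y1 y1lag, interval(t . .) by(i)

        * the slope is repeated within each series, so keep one row per series
        tabstat b_y1lag if t == 1, statistics(mean sd p5 p50 p95)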
