
  • gaps in panel

    I have panel data from a two-year survey, as in the example below:

    * Example generated by -dataex-. To install: ssc install dataex
    clear
    input int year double map byte(age mentalindex emptype tenure se status metfam weight smoking pipe alc drug)
    2014 592014 17 1 0 0 0 2 1 2 1 1 1 2
    2009 1302009 28 0 0 2 0 0 1 0 0 2 1 1
    2009 1892009 25 0 3 0 0 0 1 0 0 1 1 2
    2009 3182009 24 0 2 0 1 2 1 1 1 1 1 2
    2009 3312009 17 0 2 0 1 2 1 0 1 1 1 2
    2009 3682009 22 0 3 0 0 2 1 0 1 1 1 2
    2014 3682014 28 0 0 0 0 2 1 0 1 1 1 2
    2009 3782009 23 0 2 2 1 2 1 0 1 1 1 2
    2009 3872009 25 0 0 0 0 2 1 1 0 1 2 2
    2014 3872014 28 0 0 0 0 0 1 1 0 2 1 2
    2014 4212014 24 0 0 2 0 0 1 0 0 1 1 2
    2014 4542014 19 0 3 2 0 2 1 0 1 1 2 2
    2009 4682009 19 1 3 0 0 2 1 0 1 1 1 2
    2014 4682014 25 0 2 0 0 2 1 0 0 2 1 2
    2009 4702009 18 1 0 0 0 2 1 0 0 2 2 0
    2014 4702014 25 0 0 0 0 2 1 0 0 1 1 2
    2014 4882014 25 0 3 0 0 2 1 0 0 1 1 2
    2014 6782014 20 0 0 2 1 2 1 0 1 1 1 2
    2014 11092014 30 0 3 0 0 2 1 1 0 2 1 2
    2009 11102009 23 0 0 0 0 0 1 1 1 1 1 2
    2009 11262009 25 1 0 0 0 0 1 0 1 1 1 2
    2009 12122009 25 0 0 0 0 0 1 0 0 2 1 2
    2014 12122014 29 0 2 0 0 0 1 0 0 1 1 2
    2009 12182009 24 0 0 2 0 0 1 0 0 1 1 2
    2009 12742009 28 1 2 0 0 0 1 0 0 2 1 2
    2009 12922009 27 0 0 2 0 0 1 0 1 0 1 2
    2014 12922014 32 0 2 2 0 0 1 0 0 1 1 2
    2009 12952009 29 0 0 0 0 0 1 0 1 1 1 2
    2014 12952014 34 0 2 0 0 0 1 0 0 1 1 2
    2009 12972009 26 0 3 0 0 0 1 0 0 1 1 2
    2014 12972014 31 0 3 0 0 0 1 0 0 1 1 2
    2014 13052014 35 0 0 0 0 0 1 0 0 1 1 2
    2014 15192014 33 0 0 0 0 0 1 1 0 1 1 2
    2009 15612009 22 0 3 2 0 0 1 0 1 1 1 2
    2009 15622009 25 0 0 2 0 0 1 0 1 0 1 2
    2009 15712009 27 0 0 2 0 0 1 0 0 1 1 2
    2014 15712014 31 0 0 2 0 0 1 0 0 1 1 2
    2009 16112009 25 1 0 0 0 0 1 0 1 1 1 2
    2014 16112014 29 0 1 2 0 0 1 1 4 1 1 1
    2009 17722009 26 0 1 0 0 2 1 0 1 1 1 2
    2009 17772009 27 0 0 0 0 0 1 0 0 2 1 2
    2014 17772014 32 0 0 0 0 0 1 0 0 2 1 0
    2009 18252009 22 0 0 0 0 0 1 0 1 1 1 2
    2009 18282009 27 0 0 0 0 0 1 0 0 2 1 2
    2014 18282014 32 0 0 0 0 0 1 0 0 2 1 0
    2009 18412009 27 0 2 0 0 1 1 0 1 1 1 2
    2009 18522009 22 0 3 0 0 0 1 0 1 1 1 2
    2014 18522014 27 0 0 0 0 0 1 0 0 2 1 2
    2009 18572009 23 0 0 2 0 2 1 0 1 1 1 2
    2009 19242009 28 0 0 2 0 0 1 0 0 1 1 2
    2014 19242014 32 0 2 2 0 0 1 0 0 2 1 2
    2009 19252009 27 0 0 0 0 0 1 0 0 1 1 2
    2014 19252014 31 0 3 2 0 0 1 0 1 0 1 2
    2009 19322009 22 0 0 2 0 0 1 0 0 1 1 2
    2014 19322014 26 0 0 0 0 1 1 1 1 2 1 2
    2009 19352009 24 1 0 2 0 0 0 0 1 1 1 2
    2014 21052014 29 0 0 0 0 0 1 0 4 1 1 2
    2009 21402009 22 1 1 2 1 2 1 0 1 1 1 2
    2014 21402014 25 0 0 2 1 0 1 0 1 1 1 2
    2014 21812014 33 1 0 0 1 2 1 0 1 1 1 2
    2009 22042009 25 0 0 0 0 2 1 0 1 1 1 2
    2014 22042014 29 0 0 0 0 0 1 1 1 1 1 2
    2014 22052014 30 0 2 0 0 0 1 0 1 1 1 2
    2009 22622009 28 0 1 0 1 0 1 0 1 0 1 2
    2009 23632009 22 0 0 0 0 2 1 1 0 1 1 2
    2014 23632014 30 0 3 2 0 2 1 1 0 1 1 2
    2009 23922009 29 1 1 2 1 0 1 1 1 1 1 2
    2014 23922014 34 0 0 2 1 0 1 1 1 1 1 2
    2009 24342009 29 0 0 2 1 2 1 0 1 1 1 2
    2014 24342014 35 0 0 2 1 1 1 0 1 1 1 2
    2009 24512009 28 0 0 2 0 2 1 0 0 2 1 2
    2014 24512014 32 0 0 2 0 2 1 1 1 1 1 2
    2014 24562014 28 0 2 2 0 2 1 0 0 2 1 2
    2009 24642009 26 0 1 2 1 2 1 0 1 1 1 2
    2014 24642014 30 1 0 2 1 2 1 1 1 1 1 2
    2009 24902009 28 1 0 2 1 0 1 0 1 1 1 2
    2014 24902014 31 1 0 2 1 0 1 1 1 1 1 2
    2014 25622014 29 0 0 2 1 0 1 0 0 1 1 2
    2009 25712009 25 0 0 2 1 0 1 1 1 1 1 2
    2014 25712014 30 0 0 2 1 0 1 1 1 1 1 2
    2014 25832014 23 0 3 0 0 0 1 0 0 1 1 2
    2009 27212009 27 1 2 0 1 2 0 1 1 1 1 2
    2014 27212014 32 0 4 0 1 2 1 0 1 1 1 2
    2009 27542009 28 0 0 2 1 2 1 0 1 1 1 2
    2014 27542014 27 0 0 2 1 2 1 0 1 1 1 2
    2014 27562014 20 0 4 2 1 2 1 0 1 1 1 2
    2009 27752009 28 0 0 2 0 2 1 0 1 1 1 2
    2014 27752014 32 0 0 2 0 2 1 1 1 1 1 2
    2014 27912014 24 0 0 0 0 2 1 0 1 1 1 2
    2009 28422009 26 0 0 0 0 2 1 1 1 1 1 2
    2014 28422014 28 0 2 0 0 2 1 0 0 2 1 2
    2014 28532014 25 1 0 2 0 2 1 0 0 2 1 2
    2009 28802009 22 0 0 0 0 2 1 1 4 1 1 2
    2014 28802014 28 0 0 2 0 2 1 1 0 1 1 0
    2009 29242009 28 1 0 2 1 0 1 0 1 1 1 2
    2014 29242014 32 1 2 2 1 0 1 0 1 1 1 2
    2014 29262014 32 1 2 0 1 0 1 1 1 1 1 2
    2014 31112014 23 0 0 0 0 2 1 0 2 1 1 2
    2014 31402014 25 1 0 2 1 2 1 0 1 1 1 2
    2014 31952014 24 0 0 0 0 2 1 0 1 1 1 2
    end
    ------------------ copy up to and including the previous line ------------------

    Listed 100 out of 4230 observations

    Then


    . xtset map year, delta(5)
    panel variable: map (weakly balanced)
    time variable: year, 2009 to 2014
    delta: 5 units
    but I can't run any regression, as it gives me a "no observations" error. With -tsreport- I found I have gaps across almost my entire dataset:

    tsreport

    Panel variable: map
    Time variable: year
    --------------------------
    Starting period = 2009
    Ending period = 2014
    Observations = 4,230
    Number of gaps = 4,229
    (Gap count includes panel changes)

    I know the ordering of year should be 2009 then 2014, but I can't figure out how to manage that:

    . list year in 1/10

    +------+
    | year |
    |------|
    1. | 2014 |
    2. | 2009 |
    3. | 2009 |
    4. | 2009 |
    5. | 2009 |
    |------|
    6. | 2009 |
    7. | 2014 |
    8. | 2009 |
    9. | 2009 |
    10. | 2014 |
    +------+

  • #2
    Fatma:
    welcome to this forum.
    Two comments about your query:
    - you do not provide the code of the regression you ran, so I cannot comment on your example;
    - if you have panel data with a continuous regressand, your first choice should be -xtreg-, not -regress-.
    Kind regards,
    Carlo
    (Stata 18.0 SE)
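
    As a minimal sketch of the -xtreg- syntax suggested above (the outcome and covariates here are purely illustrative, and the panel must first be -xtset- on a true person identifier, as worked out in #3 below):

    Code:
    * sketch only: mentalindex as outcome and these covariates are placeholders
    * assumes id uniquely identifies persons (see #3 for how to build it from map)
    xtset id year, delta(5)
    xtreg mentalindex age i.emptype smoking, fe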



    • #3
      Thanks for the data example. Your identifier includes the year, so it is never repeated: each value of map marks a single person-year, which is why -xtset- sees one-observation panels everywhere.

      Code:
      gen id = floor(map/10000)
      xtset id year, delta(5)
      
      list id map year in 1/20, sepby(id)
      
           +------------------------+
           |   id        map   year |
           |------------------------|
        1. |   59     592014   2014 |
           |------------------------|
        2. |  130    1302009   2009 |
           |------------------------|
        3. |  189    1892009   2009 |
           |------------------------|
        4. |  318    3182009   2009 |
           |------------------------|
        5. |  331    3312009   2009 |
           |------------------------|
        6. |  368    3682009   2009 |
        7. |  368    3682014   2014 |
           |------------------------|
        8. |  378    3782009   2009 |
           |------------------------|
        9. |  387    3872009   2009 |
       10. |  387    3872014   2014 |
           |------------------------|
       11. |  421    4212014   2014 |
           |------------------------|
       12. |  454    4542014   2014 |
           |------------------------|
       13. |  468    4682009   2009 |
       14. |  468    4682014   2014 |
           |------------------------|
       15. |  470    4702009   2009 |
       16. |  470    4702014   2014 |
           |------------------------|
       17. |  488    4882014   2014 |
           |------------------------|
       18. |  678    6782014   2014 |
           |------------------------|
       19. | 1109   11092014   2014 |
           |------------------------|
       20. | 1110   11102009   2009 |
           +------------------------+
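
      A quick follow-up sketch (assuming the id variable generated above) to verify the new identifier and re-inspect the panel structure:

      Code:
      * sketch: confirm id-year uniquely identifies rows, then re-check the panel
      isid id year                  // errors out if any id-year pair is duplicated
      xtset id year, delta(5)
      xtdescribe                    // participation pattern: one wave vs. both waves
      tsreport                      // re-check gaps with the corrected panel variable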



      • #4
        Thanks Carlo and Nick.

        Carlo, you are correct; what I actually want to use is -pvar-, but when I run

        var mentalindex emptype smoking, lags(1)
        no observations
        r(2000);

        and for
        . pvar mentalindex emptype
        no non-missing values returned for equation 1 at initial values
        r(498);

        Nick, I did what you did and I still have gaps, though fewer. Could this be the reason I can't go on with the -pvar- model?



        • #5
          I have never used -pvar-, so I can't advise authoritatively, but panels that short do constrain what will work.
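
          One rough check that may help (just a sketch, assuming the corrected id from #3): count how many observations would survive once a one-period lag is taken, since with only two waves every panel loses its first wave to the lag.

          Code:
          * sketch: observations usable in a one-lag specification
          xtset id year, delta(5)
          count if !missing(mentalindex, emptype, L.mentalindex, L.emptype)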
