Hello!
I'm trying to run a Monte Carlo simulation with time series, and I can't figure out what I'm doing wrong, given my limited knowledge of Stata. With my ARMA model I'm able to generate a time series with 300 observations. My idea is to repeat this process 1,000 times and, in each replication, calculate the mean and variance of the 300 observations and save them in a matrix.
This is my ARMA(2,2) model, where e_t is a standard normal shock:

y_t = 0.7*y_(t-1) - 0.1*y_(t-2) + e_t - 0.6*e_(t-1) + 0.08*e_(t-2)
This is my code:
Code:
clear all
set more off
set matsize 1000
matrix simulaciones = J(1000,2,0)   // To save every simulation of every time series generated
matrix serie = J(300,3,0)           // To save each time series
set obs 300                         // For the 300 observations in every time series
gen t = _n
tsset t
g y1 = 0
forvalues j = 1(1)1000 {
    * Creating a time series
    forvalues i = 1(1)300 {
        gen e = rnormal(0,1)
        replace y1 = 0 if t == 1
        replace y1 = 0.7*L1.y1 + e - 0.6*L1.e if t == 2
        replace y1 = 0.7*L1.y1 - 0.1*L2.y1 + e - 0.6*L1.e + 0.08*L2.e if t > 2
        matrix serie[`i',3] = y1
        drop e y1
    }
    svmat serie
    matrix simulaciones[`j',1] = mean(y1)
    matrix simulaciones[`j',2] = var(y1)
}
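In case it helps to see what I'm aiming for, here is a minimal, untested sketch of the logic I want. I'm assuming that summarize saves its results in r(mean) and r(Var), and the seed value is arbitrary, just for reproducibility:

Code:
clear all
set more off
set matsize 1000                      // needed in older Stata for a 1000-row matrix
set seed 12345                        // arbitrary seed, only for reproducibility
matrix simulaciones = J(1000,2,0)     // col 1: mean, col 2: variance
set obs 300
gen t = _n
tsset t
forvalues j = 1/1000 {
    capture drop e y1                 // start each replication from scratch
    gen e = rnormal(0,1)              // draw all 300 shocks at once
    gen y1 = 0                        // y1 = 0 at t = 1
    replace y1 = 0.7*L1.y1 + e - 0.6*L1.e if t == 2
    replace y1 = 0.7*L1.y1 - 0.1*L2.y1 + e - 0.6*L1.e + 0.08*L2.e if t > 2
    quietly summarize y1
    matrix simulaciones[`j',1] = r(mean)
    matrix simulaciones[`j',2] = r(Var)
}
matrix list simulaciones              // or svmat simulaciones to turn the results into variables

My understanding is that replace processes observations in order, so the replace statements already run the recursion over t in a single pass, and the inner loop over observations may be unnecessary.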
I'm very new to Stata and I'm having trouble even searching for answers online.
If you could point out what I'm doing wrong, that would be awesome.
For now, I can generate a single time series with 300 observations, so the inner forvalues loop works, but the outer one doesn't.
Thank you for your time and help!
