optimize_init_technique(S, "nr")

    Hi,

    I have a methodological question about how optimize works with the Newton-Raphson algorithm.

    At each iteration optimize updates the parameters according to the following formula:

    new parameters = old parameters - lambda*(Hessian^-1)*gradient

    At the beginning the scalar lambda is set equal to some constant lambda0, but if the objective function fails to improve, lambda is reduced.

    However, if this does not work, then after a while the initial lambda is increased and the search starts over.

    I was wondering how the increase of lambda works. Is there a maximum number of attempts after which, if decreasing lambda does not improve the objective function, the initial lambda is increased? Or is this handled iteratively, alternating between increasing and decreasing lambda until the objective function is maximized? Any help would be greatly appreciated.
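
    To make the question concrete, here is a purely illustrative Mata sketch of the kind of scheme I have in mind. The toy objective, the halving limit maxhalf, and the tenfold increase of lambda0 are all invented for illustration; none of this is taken from optimize()'s actual source.

    mata:
    // Toy concave objective, maximized at (1, -2)
    real scalar f(real colvector b) return(-(b[1]-1)^2 - (b[2]+2)^2)
    real colvector grad(real colvector b) return((-2*(b[1]-1) \ -2*(b[2]+2)))
    real matrix hess(real colvector b) return((-2, 0 \ 0, -2))

    // Hypothetical damped Newton-Raphson: halve lambda on failure, and
    // if maxhalf halvings never improve f, enlarge lambda0 and retry.
    real colvector damped_nr(real colvector b0)
    {
        real colvector b, d
        real scalar lambda, lambda0, f0, k, it, maxhalf

        b = b0
        lambda0 = 1                              // initial step size
        maxhalf = 25                             // invented halving limit
        for (it = 1; it <= 100; it++) {
            if (norm(grad(b)) < 1e-8) break      // converged
            d = lusolve(hess(b), grad(b))        // Newton direction Hessian^-1 * gradient
            f0 = f(b)
            lambda = lambda0
            for (k = 1; k <= maxhalf; k++) {
                if (f(b - lambda*d) > f0) break  // improvement found
                lambda = lambda/2                // step-halving
            }
            if (k > maxhalf) {
                lambda0 = 10*lambda0             // guessed escalation: start over bigger
                continue
            }
            b = b - lambda*d                     // accept the damped step
        }
        return(b)
    }

    damped_nr((0 \ 0))
    end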

    Thank you very much.

    Simone