This is a variant of empirical gradient descent that estimates the gradient from a minimal number of samples. It is more efficient than empirical gradient descent, but it works well only if the optimization surface is approximately linear in the local neighborhood of the current vector.
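A hedged usage sketch follows. The header paths, the parameter values, and the interpretation of iterate()'s return value as the current error heuristic are assumptions; the critic is a caller-supplied GTargetFunction subclass whose interface is not part of this listing.

    #include <cstddef>
    #include <GClasses/GOptimizer.h> // header paths are assumptions
    #include <GClasses/GRand.h>
    using namespace GClasses;

    // pCritic is a caller-supplied target function to optimize.
    void climb(GTargetFunction* pCritic)
    {
        GRand rand(0);
        GSampleClimber climber(pCritic, &rand);
        climber.setStepSize(0.1); // illustrative values, not documented defaults
        climber.setAlpha(0.01);

        // Call iterate() in a loop until acceptable results are found.
        // (This sketch assumes the returned double is the error heuristic.)
        for(size_t i = 0; i < 1000; i++)
        {
            double err = climber.iterate();
            if(err < 1e-6) // hypothetical acceptance threshold
                break;
        }

        const GVec& best = climber.currentVector(); // best vector yet found
        (void)best; // use the result here; storage is owned by the climber
    }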
GSampleClimber (GTargetFunction *pCritic, GRand *pRand)

virtual ~GSampleClimber ()

virtual const GVec & currentVector ()
    Returns the best vector yet found.

virtual double iterate ()
    Performs a little more optimization. (Call this in a loop until acceptable results are found.)

void setAlpha (double d)
    Sets the alpha value. It should be small (for example, 0.01). A very small value updates the gradient estimate slowly but precisely. A larger value updates the estimate quickly, but never converges very close to the precise gradient.
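The wording above is consistent with an exponential-moving-average update of the gradient estimate. The sketch below is an assumption about that form, not the library's documented internals; it only illustrates the tradeoff that alpha controls.

    #include <cstddef>

    // Assumed (undocumented) update form: blend each new sample-based
    // gradient estimate into the running estimate. A small alpha tracks
    // slowly but precisely; a large alpha tracks quickly but stays noisy.
    void blendGradient(double* pGradient, const double* pSampleEstimate,
                       size_t dims, double alpha)
    {
        for(size_t i = 0; i < dims; i++)
            pGradient[i] = (1.0 - alpha) * pGradient[i]
                         + alpha * pSampleEstimate[i];
    }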
void setStepSize (double d)
    Sets the current step size.

Inherited from GOptimizer:

GOptimizer (GTargetFunction *pCritic)

virtual ~GOptimizer ()

void basicTest (double minAccuracy, double warnRange=0.001)
    This is a helper method used by the unit tests of several model learners.

double searchUntil (size_t nBurnInIterations, size_t nIterations, double dImprovement)
    First calls iterate() nBurnInIterations times, then repeatedly calls iterate() in blocks of nIterations. If the error heuristic fails to improve by the ratio dImprovement over a block of iterations, the search stops. (For example, if the error before a block of iterations was 50 and the error after is 49, the improvement ratio is (50 - 49) / 50 = 0.02, so training stops if dImprovement > 0.02.) If the error heuristic is unstable, nIterations should be large.
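For concreteness, a hedged usage sketch (the header paths and the chosen numbers are illustrative assumptions):

    #include <GClasses/GOptimizer.h> // header paths are assumptions
    #include <GClasses/GRand.h>
    using namespace GClasses;

    double search(GTargetFunction* pCritic)
    {
        GRand rand(0);
        GSampleClimber climber(pCritic, &rand);
        // Burn in for 100 iterations, then iterate in blocks of 50, stopping
        // when a block improves the error by a ratio of less than 0.002.
        // With the numbers from the description: error 50 before a block and
        // 49 after gives (50 - 49) / 50 = 0.02 > 0.002, so the search continues.
        return climber.searchUntil(100, 50, 0.002);
    }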