
Using Brent to find minima

Jan 17, 2014 at 1:14 AM
Hi,

I'm trying to use the Brent algorithm to find the minimum of a given function. It converges very quickly, but it doesn't "stop" once the result is close enough; I need it to stop at the nearest 1/100. I believe the accuracy parameter is the one I need to use, but the values I've tested (1, 0.01) don't give me the result I need.
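
For context, here's roughly how I'm calling it. The function below is just a stand-in for my real one, and I'm passing the accuracy as the fourth argument:

using System;
using MathNet.Numerics.RootFinding;

// Stand-in function; my real one is more complicated.
Func<double, double> f = x => x*x - 2.0;

// accuracy = 0.01: I expect the result to be within 1/100 of the true root.
double root = Brent.FindRoot(f, 0.0, 2.0, 0.01);
Console.WriteLine(root);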

From looking at the code, it seems I should hit "return true;" once my required accuracy is reached...
// convergence check
double xAcc1 = 2.0*Precision.DoublePrecision*Math.Abs(root) + 0.5*accuracy;
double xMidOld = xMid;
xMid = (upperBound - root)/2.0;

if (Math.Abs(xMid) <= xAcc1 && froot.AlmostEqualNormRelative(0, froot, accuracy))
{
    return true;
}
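
If I'm reading that first condition right, here's a quick sanity check with my numbers (the root value is just assumed for illustration):

using System;
using MathNet.Numerics;

double accuracy = 0.01;
double root = 1.41; // assumed current estimate, for illustration only

// Precision.DoublePrecision is ~1.1e-16, so the first term is negligible here.
double xAcc1 = 2.0*Precision.DoublePrecision*Math.Abs(root) + 0.5*accuracy;
Console.WriteLine(xAcc1); // ~0.005: the bracket half-width must shrink below this

So with accuracy = 0.01, the interval has to shrink to a half-width of about 0.005, which is what I'd expect for "nearest 1/100".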
Any suggestions?

Thanks,

Dominic
Jan 17, 2014 at 1:30 AM
Actually, stepping through it again with the accuracy set to 0.01, I end up hitting return false; instead:

if (xMid == xMidOld)
{
    // accuracy not sufficient, but cannot be improved further
    return false;
}
The problem is that the next thing that happens is that an exception is thrown:

throw new NonConvergenceException(Resources.RootFindingFailed);
My assumption about the way accuracy works is that once the result varies by less than the accuracy (0.01 in my case), the result cannot be improved further, or at least, in my case, is close enough for my purpose.

Is there a way I can use the library to get back the "close enough" value without changing the source code, so I can easily update to the latest library version in the future?
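
If I'm reading the 3.x API right, there is also a Brent.TryFindRoot overload that returns false instead of throwing and hands back its best estimate through an out parameter, so maybe something like this would do (sketch only, untested):

using System;
using MathNet.Numerics.RootFinding;

Func<double, double> f = x => x*x - 2.0; // stand-in function again

double root;
if (Brent.TryFindRoot(f, 0.0, 2.0, 0.01, 100, out root))
{
    Console.WriteLine("Converged: " + root);
}
else
{
    // No exception here; root should hold the last estimate,
    // which may well be "close enough" for my purpose.
    Console.WriteLine("Did not converge, best estimate: " + root);
}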

Thanks again!

Dominic

P.S. I believe the quoted source code is from the 3.x version.