Post by Sarah on Jul 31, 2017 15:43:40 GMT 8
This is one of the great advantages of ProFit: it doesn't just use a single fixed algorithm for the optimisation, but lets you choose between lots of options. In fact, there are so many options that I found it rather overwhelming at the start - so maybe people can add some tips and tricks here to help beginners.
Overview: profitSetupData lets you choose between 4 options (the algo.func parameter): 'optim', 'CMA', 'LA' (LaplaceApproximation), 'LD' (LaplacesDemon). Each of these in turn has its own sub-options, ranging from 6 different methods in optim to ~40 different MCMC algorithms available in LaplacesDemon.
optim is a simple downhill-gradient optimiser: usually fast, but it may get stuck in local minima. It can be useful for fitting simple objects with good initial guesses (e.g. stars), or for obtaining reasonable initial guesses to feed into LD. method='BFGS' is probably a good starting point here; use 'L-BFGS-B' if you need limits on the parameters. Note that you need to set the fnscale control parameter to something negative (i.e. when calling optim, pass control=list(fnscale=-1)), otherwise optim performs minimisation rather than maximisation of the likelihood.
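To show the fnscale trick concretely, here's a minimal base-R sketch (a toy Gaussian log-likelihood, not a ProFit call - the same control argument applies when ProFit hands your likelihood to optim):

```r
# Maximise a toy Gaussian log-likelihood with optim.
set.seed(1)
x <- rnorm(100, mean = 2, sd = 0.5)

loglike <- function(par) {
  # par[1] = mean, par[2] = log(sd); log-sd keeps sd positive for BFGS
  sum(dnorm(x, mean = par[1], sd = exp(par[2]), log = TRUE))
}

fit <- optim(par = c(0, 0), fn = loglike, method = "BFGS",
             control = list(fnscale = -1))  # fnscale < 0 => maximisation
fit$par[1]       # should end up close to 2
exp(fit$par[2])  # should end up close to 0.5
```

Without fnscale=-1 this would quietly minimise the log-likelihood instead and wander off to nonsense parameters.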
CMA I have never used, maybe other people can comment on that.
LA I haven't used much either, but as far as I can tell it's similar in speed to optim while using a different method (which presumably has its own advantages and disadvantages). It can also be useful for finding initial guesses. method='BFGS' is probably a sensible choice here as well if you don't really know what you're doing.
LD offers lots of MCMC algorithms. These are much slower, but also much more robust: better at finding global rather than local maxima, and at sampling the entire likelihood space. For an overview of the different algorithms (including a table indicating whether they're suitable for beginners), I found this website very helpful: web.archive.org/web/20150206014000/http://www.bayesian-inference.com/mcmc (it's linked from the documentation). Useful starting points are probably 'CHARM' or 'HARM', although there are lots more that other people may have more experience with.
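To give a feel for what these samplers are doing, here's a generic random-walk Metropolis sketch in plain R (this is the basic accept/reject idea that algorithms like CHARM and HARM build on, not the actual LaplacesDemon implementations, and the toy likelihood is my own invention):

```r
# Toy random-walk Metropolis sampler for the mean of a 1-D Gaussian.
set.seed(42)
x <- rnorm(50, mean = 1)                 # toy data
loglike <- function(mu) sum(dnorm(x, mean = mu, log = TRUE))

n_steps <- 5000
chain <- numeric(n_steps)
mu <- 0                                  # deliberately poor starting guess
ll <- loglike(mu)
for (i in seq_len(n_steps)) {
  prop <- mu + rnorm(1, sd = 0.3)        # random-walk proposal
  ll_prop <- loglike(prop)
  if (log(runif(1)) < ll_prop - ll) {    # Metropolis accept/reject step
    mu <- prop
    ll <- ll_prop
  }
  chain[i] <- mu                         # record current state
}
mean(chain[-(1:1000)])                   # posterior mean after burn-in
```

The point is that the chain samples the whole likelihood surface rather than just sliding downhill, which is why MCMC is slower but less prone to getting trapped in a local maximum.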
In general, reading the (rather extensive) documentation usually helps.