regression
Various regression models to fit the best line to your data. All written to be understandable.
Vocabulary:

- Predictors - the independent values (usually denoted x) from which we want an equation to get the:
- outcomes - the dependent variables, usually y or f(x).
- model - an equation created to optimally fit the data (can be optimized for different priorities).
The *Coefficients structs implement Predictive, which calculates the predicted outcomes using the model and their determination; and Display, which can be used to show the equations.
Linear regressions are often used by other regression methods. All linear regressions therefore implement the LinearEstimator trait. You can use the *Linear structs to choose which method to use.
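As a rough sketch of what a linear estimator computes, here is a self-contained ordinary least squares fit of y = m·x + b. The Line struct and fit_ols function are invented for this example and are not part of this crate's API:

```rust
// Minimal ordinary least squares for y = slope * x + intercept.
// Illustrative only; not this crate's API.
struct Line {
    slope: f64,
    intercept: f64,
}

fn fit_ols(predictors: &[f64], outcomes: &[f64]) -> Line {
    assert_eq!(predictors.len(), outcomes.len());
    let n = predictors.len() as f64;
    let mean_x = predictors.iter().sum::<f64>() / n;
    let mean_y = outcomes.iter().sum::<f64>() / n;
    // slope = Σ (x - x̄)(y - ȳ) / Σ (x - x̄)²
    let mut num = 0.0;
    let mut den = 0.0;
    for (&x, &y) in predictors.iter().zip(outcomes) {
        num += (x - mean_x) * (y - mean_y);
        den += (x - mean_x) * (x - mean_x);
    }
    let slope = num / den;
    Line { slope, intercept: mean_y - slope * mean_x }
}

fn main() {
    let x = [1.0, 2.0, 3.0, 4.0];
    let y = [2.1, 3.9, 6.2, 7.8];
    let line = fit_ols(&x, &y);
    // Predict an outcome for a new predictor, as a fitted model is used for prediction.
    let predicted = line.slope * 5.0 + line.intercept;
    println!("y = {}x + {}, f(5) = {}", line.slope, line.intercept, predicted);
}
```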
Info on implementation
Details and comments on implementation can be found as docs under each item.
Power & exponent
See derived
for the implementations.
I reverse the exponentiation to get a linear model, solve it using the method linked above, and then transform the returned variables to fit the target model.
This is not ideal, as taking the logarithm shrinks the errors of large values relative to small ones. I have plans to address this bias in the future. The current behaviour is, however, probably still the desired one, as small values are often relatively more important than large ones.
Many programs (including LibreOffice Calc) simply discard negative & zero values. I chose to go the explicit route and add additional terms to satisfy the requirements. This is naturally a fallback, and should be treated as a warning sign that your data is bad.
The docs under these methods detail the calculations used and how the data is handled.
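As a concrete sketch of the transformation described above, the following fits an exponential model y = b·e^(a·x) by linearising with the natural logarithm and solving with ordinary least squares. The function names (ols, exponential_fit) are illustrative, not this crate's API, and the sketch assumes strictly positive outcomes (no fallback terms for zero or negative values):

```rust
// Returns (slope, intercept) of the least-squares line through (xs, ys).
fn ols(xs: &[f64], ys: &[f64]) -> (f64, f64) {
    let n = xs.len() as f64;
    let mx = xs.iter().sum::<f64>() / n;
    let my = ys.iter().sum::<f64>() / n;
    let num: f64 = xs.iter().zip(ys).map(|(&x, &y)| (x - mx) * (y - my)).sum();
    let den: f64 = xs.iter().map(|&x| (x - mx).powi(2)).sum();
    let slope = num / den;
    (slope, my - slope * mx)
}

// Illustrative log-transform fit of y = b * e^(a*x); not this crate's API.
fn exponential_fit(xs: &[f64], ys: &[f64]) -> (f64, f64) {
    // ln(y) = ln(b) + a*x, so fit ln(y) against x linearly...
    let ln_ys: Vec<f64> = ys.iter().map(|&y| y.ln()).collect();
    let (a, ln_b) = ols(xs, &ln_ys);
    // ...and transform the intercept back: b = e^(ln b).
    (a, ln_b.exp())
}

fn main() {
    // Data roughly following y = 2 * e^(0.5 * x). All outcomes must be positive
    // for the plain logarithm; no fallback terms for zero or negative values here.
    let xs = [1.0, 2.0, 3.0, 4.0, 5.0];
    let ys = [3.3, 5.4, 9.0, 14.8, 24.4];
    let (a, b) = exponential_fit(&xs, &ys);
    println!("y ≈ {b:.3} * e^({a:.3} * x)");
}
```

Because the line is fitted to ln(y), the least-squares errors are measured in log space, which is exactly the bias towards small values mentioned above.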
Re-exports
pub use binary_search::Options as BinarySearchOptions;
pub use derived::exponential_ols; (available on crate feature ols only)
pub use derived::power_ols; (available on crate feature ols only)
pub use gradient_descent::ParallelOptions as GradientDescentParallelOptions;
pub use gradient_descent::SimultaneousOptions as GradientDescentSimultaneousOptions;