Machine Learning in Ocient
Whether you are training a model or using machine learning functionality to do scoring or prediction, Ocient enables machine learning directly in SQL syntax.
You can use the CREATE MLMODEL command to create models. This step performs model training and is referred to as model creation. You can then apply the model to new input data to generate predictions by executing the scalar function that has the same name as the model. This action is typically referred to as executing the model.
Machine learning in Ocient depends on linear algebra functionality that is built into the system. OcientML provides machine learning capabilities that you can invoke in SQL statements alongside your application logic.
Matrices are a first-class data type in the Ocient system. Create a matrix using a simple SQL SELECT statement.
You can also use shorthand notations to create row vectors (_r) or column vectors (_c). For example, _r{1,2,3} creates a row vector with the values 1.0, 2.0, and 3.0.
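For example, this statement builds a row vector and a column vector in one SELECT (a minimal sketch, assuming the column shorthand _c mirrors the documented row form):

    SELECT _r{1,2,3} AS row_vec,
           _c{4,5,6} AS col_vec;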
You can also apply functions to the values in vectors and matrices. The following query does some matrix math, finds the inverse of a matrix, and then returns the two eigenvalues and eigenvectors of the inverse.
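A sketch of such a query follows. The matrix construction and the function names inverse, eigenvalues, and eigenvectors are illustrative placeholders, not confirmed OcientML syntax; consult the OcientML function reference for the actual names and signatures.

    -- Sketch only: matrix_expression stands in for a 2x2 matrix value,
    -- and the three function names here are hypothetical.
    SELECT eigenvalues(inverse(m)) AS eigen_values,
           eigenvectors(inverse(m)) AS eigen_vectors
    FROM (SELECT matrix_expression AS m) AS t;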
Regression is the process of finding a best fit function for a set of data. The simplest case is simple linear regression, which finds a best fit linear relationship of the form y = m*x + b between one variable and another.
Suppose you have data generated from a linear function, with a bit of noise added.
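The original data set is not reproduced here; a minimal sketch follows, assuming a hypothetical table slr_data, a simplified table definition, and a generating function of roughly y = 3*x + 7 (the actual values behind the example are not shown in this section):

    -- Hypothetical training data: y = 3*x + 7, plus a little noise
    CREATE TABLE slr_data (x DOUBLE, y DOUBLE);
    INSERT INTO slr_data VALUES
        (1.0, 10.1), (2.0, 12.9), (3.0, 16.2),
        (4.0, 18.8), (5.0, 22.1), (6.0, 25.0);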
Create a simple linear regression model against this data.
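A minimal sketch of the model creation, assuming the type name SIMPLE_LINEAR_REGRESSION and the hypothetical slr_data table above (check the CREATE MLMODEL reference for the exact type keyword):

    CREATE MLMODEL slr
    TYPE SIMPLE_LINEAR_REGRESSION
    ON (SELECT x, y FROM slr_data);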
Explore the created model by viewing the system catalog.
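For example (the catalog table name used here is an assumption; use whichever system catalog view your Ocient version documents for machine learning models):

    SELECT * FROM sys.machine_learning_models;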
You can see the calculated slope and y-intercept of the model, and these values are very close to the values used to generate the data. While the input to the model creation is very simple in this case, the SQL statement in the ON clause of the model creation can be arbitrarily complex, as long as it meets the input requirements of the specific model type being created.
Execute the created model by calling the SQL scalar function that has the same name as the model you created. The input arguments of the function depend on the specific model type. For regression, the arguments are the independent variables, and the function returns the predicted dependent variable.
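Continuing the sketch, you can compare predictions to the training data by calling the slr function (table and column names are the hypothetical ones from above):

    SELECT x, y AS actual, slr(x) AS predicted
    FROM slr_data;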
The model makes predictions that are very close to the actual training data. You can then use the model to generate predictions for inputs whose outputs are unknown.
You can provide options to most model types by pairing the string name of each option with its corresponding value. For example, simple linear regression models have a metrics option that asks the database to calculate quality-of-fit metrics. Create the same model, but enable metrics collection, which is off by default.
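A sketch of the same model with metrics enabled; the OPTIONS clause shown here is an assumed rendering of the documented option/value pairing, not confirmed syntax:

    CREATE MLMODEL slr_metrics
    TYPE SIMPLE_LINEAR_REGRESSION
    ON (SELECT x, y FROM slr_data)
    OPTIONS('metrics' -> 'true');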
Examine the system catalog tables again to see the additional fields.
The database also calculates the coefficient of determination (frequently called r^2), the adjusted r^2 value, and the root mean squared error (RMSE). All of these values indicate that this model has an extremely good fit.
In addition, you can force a specific y-intercept, such that the value of f(0) is fixed, by using the yIntercept option. Create a model on the same data, but force the y-intercept to be zero. The metrics show that this model is not a good fit.
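A sketch, again with the assumed option syntax:

    CREATE MLMODEL slr_zero
    TYPE SIMPLE_LINEAR_REGRESSION
    ON (SELECT x, y FROM slr_data)
    OPTIONS('metrics' -> 'true', 'yIntercept' -> '0');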
In multiple linear regression, the model still predicts a single output value, but instead of handling one input variable, the model can handle an arbitrary number of input variables. Multiple linear regression finds the best fit model of the form:

y = a1*x1 + a2*x2 + ... + an*xn + b
Simple linear regression and multiple linear regression (as well as polynomial regression and linear combination regression) all use a least-squares loss function. Least-squares best fit means that the model minimizes the sum, over all training data rows, of the square of the model error. Some of the more advanced models allow you to define best fit yourself if the default definition is not satisfactory.
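In symbols, for training rows (x_i, y_i), the fitted model f minimizes:

sum over all rows i of (y_i - f(x_i))^2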
Create a training data set with some noise and show that the model can reconstruct the data. Multiple linear regression models also support the metrics option (the metrics are the same as for simple linear regression).
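A sketch of the model creation step; the MULTIPLE_LINEAR_REGRESSION type name, the mlr_data table, and its columns are illustrative:

    CREATE MLMODEL mlr
    TYPE MULTIPLE_LINEAR_REGRESSION
    ON (SELECT x1, x2, y FROM mlr_data)
    OPTIONS('metrics' -> 'true');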
With polynomial regression, the database finds a least-squares best fit polynomial of whatever degree you specify. As with multiple linear regression, you determine the number of independent variables.
The training data has a 1/(x1+x2) term whose form the model cannot match, but that term quickly tends toward zero, so a quadratic polynomial of x1 and x2 should still provide a good fit. Some of the coefficients match the data and some do not, because the model compensates for being unable to fit the exact form of the data. However, the metrics indicate that this model is still a great fit.
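The quadratic model in this example can be sketched as follows; the POLYNOMIAL_REGRESSION type name, the 'order' option name, and the pr_data table are assumptions based on the surrounding text:

    CREATE MLMODEL pr
    TYPE POLYNOMIAL_REGRESSION
    ON (SELECT x1, x2, y FROM pr_data)
    OPTIONS('order' -> '2', 'metrics' -> 'true');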
Polynomial regression models have many available options, including everything that is available for multiple linear regression. The negativePowers option enables you to fit Laurent polynomials, which are polynomials that also include terms with independent variables raised to negative powers. If you set this option to true, the model generates all possible terms such that the sum of the absolute values of the powers in each term is less than or equal to the order.
For example, with two independent variables and the order set to 2, the model is:
y = a1*x1^2 + a2*x1^-2 + a3*x2^2 + a4*x2^-2 + a5*x1*x2 + a6*x1^-1*x2 + a7*x1*x2^-1 + a8*x1^-1*x2^-1 + a9*x1 + a10*x1^-1 + a11*x2 + a12*x2^-1 + b
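A sketch of creating such a model. The name pr2 matches the model executed below; the rest of the statement follows the same assumed syntax used above:

    CREATE MLMODEL pr2
    TYPE POLYNOMIAL_REGRESSION
    ON (SELECT x1, x2, y FROM pr_data)
    OPTIONS('order' -> '2', 'negativePowers' -> 'true', 'metrics' -> 'true');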
You can see that the model now recovers the true coefficients almost perfectly, with the remaining coefficients nearly zero.
Execute the pr2 model.
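Following the same pattern as the earlier models (column and table names illustrative):

    SELECT x1, x2, y AS actual, pr2(x1, x2) AS predicted
    FROM pr_data;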
The most complex type of linear regression that Ocient supports is linear combination regression, which is a generalization of polynomial regression. Polynomial regression finds the best linear combination of polynomial terms (x1, x2, x1^2, x2^2, x1*x2, and so on). Linear combination regression finds the best fit linear combination of terms that you specify yourself.
Assume this training data.
Fit this data with some kind of higher degree polynomial model.
The coefficient of determination shows that the model is not a bad fit, but the fit is not great because of the structure of the data. However, because the data is still a linear combination of functions, linear combination regression can recover the fit if you specify the functions to use.
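How the terms are specified is not shown in this section; the sketch below assumes, purely for illustration, a 'terms'-style option that lists each function of the input:

    -- Illustrative only: the real mechanism for declaring the terms may differ.
    CREATE MLMODEL lcr
    TYPE LINEAR_COMBINATION_REGRESSION
    ON (SELECT x, y FROM lcr_data)
    OPTIONS('terms' -> 'sin(x), ln(x), 1/x', 'metrics' -> 'true');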
You can see that the model finds the coefficients perfectly and has a coefficient of determination of 1.0.
The coefficient of determination represents the amount of variance in the data that is explained by the model. A value of 1.0 indicates that all the variance is explained and the model is a perfect fit; a value of 0.0 indicates that none of the variance is explained. A negative coefficient of determination is possible, but you do not see one in linear regression models, because these models always find a fit at least as good as the constant function (the average of all the training data), which has a coefficient of determination of 0.0.
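In symbols, using the standard definition over training rows (x_i, y_i) with mean output y_bar:

r^2 = 1 - ( sum_i (y_i - f(x_i))^2 ) / ( sum_i (y_i - y_bar)^2 )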