Machine Learning in Ocient

Regression Analysis

Regression is the process of finding a best-fit function for a data set.

Simple Linear Regression

The simplest case is simple linear regression, which finds a best-fit linear relationship between one variable and another.

Suppose you have data that follows a linear relationship with a bit of noise added. Use the CREATE TABLE AS SELECT SQL statement to create the mldemo.slr table. Generate 100 rows of random data.

SQL
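-- Sketch only: the generating slope (2.0), intercept (0.5), the RANDOM()
-- noise function, and the row source below are illustrative assumptions.
CREATE TABLE mldemo.slr AS
SELECT x,
       (2.0 * x) + 0.5 + (RANDOM() - 0.5) AS y   -- linear signal plus uniform noise
FROM (
    SELECT CAST(n AS DOUBLE) AS x
    FROM mldemo.numbers_1_to_100                 -- hypothetical 100-row source
) AS src;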


Create a simple linear regression model using this data.

SQL
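-- Sketch only: the CREATE MLMODEL statement form and type keyword are assumed.
CREATE MLMODEL slr_model
TYPE SIMPLE_LINEAR_REGRESSION
ON (SELECT x, y FROM mldemo.slr);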


Explore the created model by viewing the data in the machine_learning_models and simple_linear_regression_models system catalog tables.

SQL
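-- The catalog table names come from the text; the sys schema prefix is an assumption.
SELECT * FROM sys.machine_learning_models;
SELECT * FROM sys.simple_linear_regression_models;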


You can see the calculated slope and y-intercept of the model, and that these values are very close to the values used to generate the data. While the input to the model creation is very simple in this case, the SQL statement in the ON clause of the model creation can be arbitrarily complex, as long as it meets the input requirements of the specific model type being created.

Execute the created model by calling the SQL scalar function that has the same name as the model. The input arguments of the function depend on the specific model type. For regression, the arguments are the independent variables, and the function returns the predicted dependent variable.

SQL
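-- Execute the model as a scalar function named after the model.
SELECT x, y, slr_model(x) AS predicted_y
FROM mldemo.slr
ORDER BY x;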


The model makes predictions that are very close to the actual training data. You can then use the model to generate predictions for new, unseen data.

Machine Learning Model Options

You can provide options for most model types. Options are pairs of an option name string and its corresponding value. For example, simple linear regression models have a metrics option that asks the database to calculate quality-of-fit metrics. Re-create the same model with metrics collection enabled, which is off by default.

SQL
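-- Sketch only: the OPTIONS clause and its key/value pairing syntax are assumed.
CREATE MLMODEL slr_model_metrics
TYPE SIMPLE_LINEAR_REGRESSION
ON (SELECT x, y FROM mldemo.slr)
OPTIONS('metrics' -> 'true');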


Examine the machine_learning_models and simple_linear_regression_models system catalog tables for more fields.

SQL
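-- The metric column names below are assumptions.
SELECT name, r_squared, adjusted_r_squared, rmse
FROM sys.simple_linear_regression_models;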


The database also calculates the coefficient of determination (frequently written r²), the adjusted r² value, and the root mean squared error (RMSE). All of these values indicate that this model is an extremely good fit.

In addition, you can force a specific y-intercept, so that the value of f(0) is fixed, using the yIntercept option. Create a model on the same data but force the y-intercept to be zero. The metrics show that this model is not as good a fit as when the intercept is free to vary.

SQL
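-- Sketch only: forces f(0) = 0 through the yIntercept option; syntax assumed.
CREATE MLMODEL slr_model_zero
TYPE SIMPLE_LINEAR_REGRESSION
ON (SELECT x, y FROM mldemo.slr)
OPTIONS('metrics' -> 'true', 'yIntercept' -> '0');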


Multiple Linear Regression

In multiple linear regression, the model still predicts a single output value, but instead of handling one input variable, it can handle an arbitrary number of input variables. Multiple linear regression finds the best-fit model of the form:

y = a1*x1 + a2*x2 + a3*x3 + … + an*xn + a0

Simple linear regression and multiple linear regression (as well as polynomial regression and linear combination regression) all use a least-squares loss function: the model minimizes the sum, over all training data rows, of the square of the model error. Some of the more advanced models allow you to define best fit yourself if the default definition is not satisfactory.

Create a training data set with some noise and show that the model can reconstruct the data. Multiple linear regression models also support a metrics option (the metrics are the same as simple linear regression).

SQL
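-- Sketch only: the generating coefficients, noise function, and input source
-- are assumptions.
CREATE TABLE mldemo.mlr AS
SELECT x1, x2, x3,
       (2.0 * x1) - (3.0 * x2) + (0.5 * x3) + 7.0 + (RANDOM() - 0.5) AS y
FROM mldemo.mlr_inputs;                          -- hypothetical table of random inputs

CREATE MLMODEL mlr_model
TYPE MULTIPLE_LINEAR_REGRESSION
ON (SELECT x1, x2, x3, y FROM mldemo.mlr)
OPTIONS('metrics' -> 'true');

-- Compare the model output against the training data.
SELECT x1, x2, x3, y, mlr_model(x1, x2, x3) AS predicted_y
FROM mldemo.mlr;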


Polynomial Regression

With polynomial regression, the database finds a least-squares best-fit polynomial of whatever degree you specify. As with multiple linear regression, you choose the number of independent variables.

The training data has a 1/(x1 * x2) term that the model cannot match, but that term quickly tends toward zero, so a quadratic polynomial of x1 and x2 should still provide a good fit. Some of the coefficients match the data but some do not, because the model compensates for being unable to fit the form of the data exactly. However, the metrics indicate that this model is still a great fit.

SQL
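-- Sketch only: the generating function and syntax are assumptions; note the
-- 1/(x1*x2) term that a quadratic polynomial cannot represent.
CREATE TABLE mldemo.pr AS
SELECT x1, x2,
       (3.0 * x1 * x1) + (2.0 * x1 * x2) - x2 + 5.0 + (1.0 / (x1 * x2)) AS y
FROM mldemo.pr_inputs;                           -- hypothetical table of random inputs

CREATE MLMODEL pr1
TYPE POLYNOMIAL_REGRESSION
ON (SELECT x1, x2, y FROM mldemo.pr)
OPTIONS('order' -> '2', 'metrics' -> 'true');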


Polynomial regression models have many available options, including everything available for multiple linear regression. The negativePowers option enables you to fit Laurent polynomials. If you set this option to true, the model also includes independent variables raised to negative powers; sums of such terms are called Laurent polynomials. The model generates all possible terms such that the sum of the absolute values of the powers in each product is less than or equal to the order.

For example, with two independent variables and the order set to 2, the model is:

y = a1*x1^2 + a2*x1^-2 + a3*x2^2 + a4*x2^-2 + a5*x1*x2 + a6*x1^-1*x2 + a7*x1*x2^-1 + a8*x1^-1*x2^-1 + a9*x1 + a10*x1^-1 + a11*x2 + a12*x2^-1 + b

SQL
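-- Sketch only: enables Laurent polynomial terms with negativePowers; syntax assumed.
CREATE MLMODEL pr2
TYPE POLYNOMIAL_REGRESSION
ON (SELECT x1, x2, y FROM mldemo.pr)
OPTIONS('order' -> '2', 'negativePowers' -> 'true', 'metrics' -> 'true');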


You can see that the model now recovers the generating coefficients almost perfectly, with the remaining coefficients nearly zero.

Execute the pr2 model.

SQL
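-- Compare the pr2 model output against the training data.
SELECT x1, x2, y, pr2(x1, x2) AS predicted_y
FROM mldemo.pr;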


Linear Combination Regression

The most complex type of linear regression that Ocient supports is linear combination regression, a generalization of polynomial regression. Polynomial regression finds the best linear combination of polynomial terms (x1, x2, x1^2, x2^2, x1*x2, and so on). Linear combination regression finds the best-fit linear combination of terms that you specify.

Assume this training data. Use the CREATE TABLE AS SELECT SQL statement to create the mldemo.lcr table. Generate 1000 rows of data.

SQL
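-- Sketch only: the generating functions (sin and natural log) are assumptions
-- chosen to match the linear-combination fit shown later.
CREATE TABLE mldemo.lcr AS
SELECT x,
       (2.0 * SIN(x)) + (3.0 * LN(x)) + (RANDOM() - 0.5) AS y
FROM mldemo.lcr_inputs;                          -- hypothetical 1000-row source of positive x values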


Fit this data using a fifth-degree polynomial model.

SQL
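-- Sketch only: a fifth-degree polynomial fit on the same data; syntax assumed.
CREATE MLMODEL lcr_poly
TYPE POLYNOMIAL_REGRESSION
ON (SELECT x, y FROM mldemo.lcr)
OPTIONS('order' -> '5', 'metrics' -> 'true');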


The coefficient of determination shows that the model is not a bad fit, but the fit is not great due to the structure of the data. However, because the data is still a linear combination of functions, linear combination regression can recover the fit if you specify the functions to use.

SQL
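-- Sketch only: the mechanism for specifying the terms ('functions' here) and
-- the variable naming convention are assumptions; the point is that you supply
-- the sin and ln terms explicitly.
CREATE MLMODEL lcr_model
TYPE LINEAR_COMBINATION_REGRESSION
ON (SELECT x, y FROM mldemo.lcr)
OPTIONS('functions' -> 'sin(x0), ln(x0)', 'metrics' -> 'true');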


You can see that the model finds the coefficients perfectly and has a coefficient of determination of 1.0.

The coefficient of determination represents the amount of variance in the data that is explained by the model. A coefficient of determination of 1.0 indicates that all the variance is explained and that the model is a perfect fit. A coefficient of determination of 0.0 indicates that none of the variance is explained. A negative coefficient of determination is possible, but it does not occur with linear regression models, because these models always find a fit at least as good as the constant function (the average of all the training data), which has a coefficient of determination of 0.0.

Non-Linear Regression

The non-linear regression model is similar to linear combination regression in that you define the form of the model. However, non-linear regression can be more than a linear combination of terms.

Assume this training data. Use the CREATE TABLE AS SELECT SQL statement to create the mldemo.nlo_input1 table. Generate 100 rows of data. Then, create a non-linear regression model using five parameters.

SQL
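-- Sketch only: the amplitude (2.0) comes from the text; the other generating
-- values, the parameter naming, and the option names are assumptions.
CREATE TABLE mldemo.nlo_input1 AS
SELECT x,
       (2.0 * SIN((3.0 * x) + 1.0)) + (0.5 * x) + 4.0 + (RANDOM() - 0.5) AS y
FROM mldemo.nlo_inputs;                          -- hypothetical 100-row source

-- One function with five free parameters (p0..p4) covers the whole model.
CREATE MLMODEL nlr1
TYPE NONLINEAR_REGRESSION
ON (SELECT x, y FROM mldemo.nlo_input1)
OPTIONS('function' -> 'p0 * sin(p1 * x0 + p2) + p3 * x0 + p4', 'metrics' -> 'true');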


In this case, specify only one function that covers the whole model.

Use the machine_learning_models and nonlinear_regression_models system catalog tables to see the metrics and the values that the model chose.

SQL
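-- The sys schema prefix is an assumption.
SELECT * FROM sys.machine_learning_models;
SELECT * FROM sys.nonlinear_regression_models;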


If you look at the coefficients, you can see that the amplitude (2.0 in the generating data) is negative in the model, but that is offset by subtracting pi from the phase. The model still almost exactly matches how the data was generated; it just found a slightly different, but equivalent, way of expressing it.

Non-linear regression models (and all of the other models that use the same infrastructure) have many options that control how the search works. These options affect how long it takes to train the model, but they also affect the likelihood of getting stuck in a local minimum instead of finding the global one. The default values should be sufficient for a wide range of use cases.

With non-linear regression models, you can specify a loss function other than least-squares. In this example, specify the abs(f-y) loss function.

SQL
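-- Sketch only: the lossFunction option name comes from the text; the rest of
-- the syntax is assumed.
CREATE MLMODEL nlr2
TYPE NONLINEAR_REGRESSION
ON (SELECT x, y FROM mldemo.nlo_input1)
OPTIONS('function' -> 'p0 * sin(p1 * x0 + p2) + p3 * x0 + p4',
        'lossFunction' -> 'abs(f - y)',
        'metrics' -> 'true');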


In this case, the model finds the best fit that minimizes the sum of the absolute values of the model errors instead of the sum of the squares of the model errors. The variable f represents the value of the model, and the variable y represents the actual value in the training data.

The catalog indicates that the model is nearly the same as the previous model. In other words, changing the loss function here did not substantially change anything. But there are cases where other loss functions are appropriate and can substantially change the model.

Non-Linear Regression with Neural Networks

In all of the non-linear cases, you must make a good guess at the form of the target function. With neural networks, however, you can leave it to the network to figure out that form. The downside is that aspects of the model are a black box.

This example uses the same data as the non-linear regression example, but instead of specifying the form of the function, it relies on what the neural network can do with little information. The size and number of the network layers greatly influence how good a fit a neural network model achieves. This example uses a simple network.

SQL
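-- Sketch only: the layer-configuration option name and value format are assumptions.
CREATE MLMODEL nn1
TYPE FEEDFORWARD_NETWORK
ON (SELECT x, y FROM mldemo.nlo_input1)
OPTIONS('hiddenLayerSizes' -> '8,8');            -- two small hidden layers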


In the case of neural network models, the metric used is an average loss, because RMSE and the coefficient of determination are generally only appropriate for least-squares models, and neural networks might use other loss functions.

You can see the average loss by querying the machine_learning_models and feedforward_network_models system catalog tables.

SQL
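-- The sys schema prefix is an assumption.
SELECT * FROM sys.machine_learning_models;
SELECT * FROM sys.feedforward_network_models;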


Compute the coefficient of determination manually.

SQL
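-- r^2 = 1 - SS_res / SS_tot, computed against the training data.
SELECT 1.0 - (SUM((y - nn1(x)) * (y - nn1(x)))
            / SUM((y - stats.mean_y) * (y - stats.mean_y))) AS r_squared
FROM mldemo.nlo_input1,
     (SELECT AVG(y) AS mean_y FROM mldemo.nlo_input1) AS stats;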


This model fits the data very well.

Vector-Valued Regression

Vector-valued regression is the case where the dependent variable has multiple components. You can use any type of model to learn each component separately, or you can use a feedforward network to learn all components in one model.

The example creates 100 rows of sample data.

SQL
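-- Sketch only: each output component is a linear function of x1 plus noise;
-- the generating coefficients and input source are assumptions.
CREATE TABLE mldemo.vec AS
SELECT x1,
       (2.0 * x1) + 1.0 + (RANDOM() - 0.5) AS y1,
       (-3.0 * x1) + 4.0 + (RANDOM() - 0.5) AS y2
FROM mldemo.vec_inputs;                          -- hypothetical 100-row source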


The dependent variable is vector-valued and depends on x1. You can fit a linear model to each component, and together these models predict vector-valued results.

SQL
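-- One simple linear regression model per output component; syntax assumed.
CREATE MLMODEL vec_y1 TYPE SIMPLE_LINEAR_REGRESSION ON (SELECT x1, y1 FROM mldemo.vec);
CREATE MLMODEL vec_y2 TYPE SIMPLE_LINEAR_REGRESSION ON (SELECT x1, y2 FROM mldemo.vec);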


This query verifies that the models produce reasonable results.

SQL
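-- Predict both components by calling the two models side by side.
SELECT x1,
       y1, vec_y1(x1) AS predicted_y1,
       y2, vec_y2(x1) AS predicted_y2
FROM mldemo.vec;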


This example shows that a single model can do the same using the feedforward network model type.

SQL
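-- Sketch only: the outputs and lossFunction option names come from the text;
-- the pairing syntax is assumed.
CREATE MLMODEL vec_nn
TYPE FEEDFORWARD_NETWORK
ON (SELECT x1, y1, y2 FROM mldemo.vec)
OPTIONS('outputs' -> '2', 'lossFunction' -> 'vector_squared_error');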


The outputs option is set to the size of the output vector, and the lossFunction option is set to vector_squared_error.

This example shows how well the model can predict the output.

SQL
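-- Sketch only: the shape of a vector-valued result is assumed here.
SELECT x1, y1, y2, vec_nn(x1) AS prediction
FROM mldemo.vec;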


Autoregression

Autoregression is linear regression where the independent variables are lags of the dependent variable. In other words, it is a series where the next value depends linearly on the previous N values.

These examples do not use a new model type. Instead, they use multiple linear regression models.

An important aspect of autoregression is that some variable indicates the order of the data. This variable can be a time variable, but it does not need to be. The variable is necessary to set up the model correctly, but the model does not use it directly. Instead, the variable helps build the correct lags.

This example builds a model to predict the Fibonacci sequence. This statement creates a table with the first several Fibonacci terms.

SQL
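-- First several Fibonacci terms, with n as the ordering column.
CREATE TABLE mldemo.fib (n INT, y INT);
INSERT INTO mldemo.fib VALUES
    (1, 1), (2, 1), (3, 2), (4, 3), (5, 5),
    (6, 8), (7, 13), (8, 21), (9, 34);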


This example builds a model where the dependent variable depends on two lags.

SQL
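-- The lags are built with window functions ordered by n; n itself is not a
-- model input. The CREATE MLMODEL form is assumed.
CREATE MLMODEL fib_model
TYPE MULTIPLE_LINEAR_REGRESSION
ON (SELECT lag2, lag1, y
    FROM (SELECT LAG(y, 2) OVER (ORDER BY n) AS lag2,
                 LAG(y, 1) OVER (ORDER BY n) AS lag1,
                 y
          FROM mldemo.fib) AS lagged
    WHERE lag2 IS NOT NULL);                     -- drop rows without both lags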


When you execute the model, providing it with the two most recent values as lags, it predicts the next value of the sequence, 55, which is not in the data.

SQL
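-- Feed the model the last two known terms (21 and 34) to predict the next one.
SELECT fib_model(21, 34) AS next_value;          -- expected result: 55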


Vector Autoregression

Vector autoregression is a more complex version of autoregression. This model uses multiple time series, where the next value for each series depends linearly on the lags of all of the series.

This statement sets up some data where the next value for two time series depends on the previous two values of both time series.

SQL
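-- Sketch only: a hand-built example; the recurrence coefficients are assumptions.
-- Each next value depends linearly on the two previous values of both series:
--   y1(t) = 0.5*y1(t-1) + 0.2*y1(t-2) + 0.3*y2(t-1) - 0.1*y2(t-2)
--   y2(t) = 0.1*y1(t-1) - 0.2*y1(t-2) + 0.4*y2(t-1) + 0.3*y2(t-2)
CREATE TABLE mldemo.var_data (t INT, y1 DOUBLE, y2 DOUBLE);
INSERT INTO mldemo.var_data VALUES
    (1, 1.0, 2.0),
    (2, 2.0, 1.0),
    (3, 1.3, 1.0);                               -- remaining rows follow the recurrence above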


This query inspects the data.

SQL
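SELECT t, y1, y2
FROM mldemo.var_data
ORDER BY t;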


This example fits the vector autoregression model to the data.

SQL
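-- Sketch only: the model type keyword and the lag option name are assumptions.
CREATE MLMODEL var_model
TYPE VECTOR_AUTOREGRESSION
ON (SELECT t, y1, y2 FROM mldemo.var_data)
OPTIONS('lags' -> '2');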


See the details of the model in the machine_learning_models and vector_autoregression_models system catalog tables.

SQL
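-- The sys schema prefix is an assumption.
SELECT * FROM sys.machine_learning_models;
SELECT * FROM sys.vector_autoregression_models;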


The model recovered the expected coefficients. The coefficient of determination is NULL here because metrics collection was not enabled in the model options.

This example checks that the model matches the data reasonably well.

SQL
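-- Sketch only: the execution signature (lags of both series as arguments) is
-- assumed. Rows with missing lags produce NULL predictions.
SELECT t, y1, y2,
       var_model(LAG(y1, 1) OVER (ORDER BY t), LAG(y1, 2) OVER (ORDER BY t),
                 LAG(y2, 1) OVER (ORDER BY t), LAG(y2, 2) OVER (ORDER BY t)) AS prediction
FROM mldemo.var_data
ORDER BY t;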


The first two values have no predictions because the lags necessary to drive the predictions do not exist.

For more information, see Vector Autoregression.
