Linear Regression Fast AF

Swift Fast AF
Dec 20, 2022


Linear regression is a statistical method used to model the linear relationship between a dependent variable and one or more independent variables. In this tutorial, we will learn how to implement linear regression in Swift and understand how it works.
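Before reaching for TensorFlow, it helps to see exactly what the model computes: a prediction ŷ = w·x + b, scored by the mean squared error over the data. A minimal plain-Swift sketch (the names predict and meanSquaredError here are our own, not a library API):

```swift
import Foundation

// The model: a line parameterized by slope w and intercept b.
func predict(_ x: Float, w: Float, b: Float) -> Float {
    return w * x + b
}

// The loss: mean squared error between predictions and targets.
func meanSquaredError(_ predictions: [Float], _ targets: [Float]) -> Float {
    let squaredErrors = zip(predictions, targets).map { ($0 - $1) * ($0 - $1) }
    return squaredErrors.reduce(0, +) / Float(squaredErrors.count)
}

let xs: [Float] = [65, 70, 75, 80, 85]
let ys: [Float] = [160, 170, 180, 190, 200]
let preds = xs.map { predict($0, w: 2.0, b: 30.0) }
print(meanSquaredError(preds, ys)) // 0.0 for a perfect fit
```

Training is just the search for the w and b that make this loss as small as possible.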

To begin, we need to import the following libraries (the TensorFlow module requires the Swift for TensorFlow toolchain):

import Foundation
import TensorFlow

Next, let’s define our data: a simple dataset of (weight, height) pairs for five individuals.

let data: [(Float, Float)] = [
(65.0, 160.0),
(70.0, 170.0),
(75.0, 180.0),
(80.0, 190.0),
(85.0, 200.0)
]
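Because the dataset is tiny, we can sanity-check any trained model against the closed-form least-squares solution, w = Σ(x−x̄)(y−ȳ) / Σ(x−x̄)² and b = ȳ − w·x̄, in plain Swift with no TensorFlow needed:

```swift
import Foundation

let samples: [(Float, Float)] = [
    (65.0, 160.0), (70.0, 170.0), (75.0, 180.0), (80.0, 190.0), (85.0, 200.0)
]

let xs = samples.map { $0.0 }
let ys = samples.map { $0.1 }
let xMean = xs.reduce(0, +) / Float(xs.count)
let yMean = ys.reduce(0, +) / Float(ys.count)

// w = covariance(x, y) / variance(x); b = yMean - w * xMean
let sxy = zip(xs, ys).map { ($0 - xMean) * ($1 - yMean) }.reduce(0, +)
let sxx = xs.map { ($0 - xMean) * ($0 - xMean) }.reduce(0, +)
let w = sxy / sxx
let b = yMean - w * xMean
print("w = \(w), b = \(b)") // w = 2.0, b = 30.0
```

So this data lies exactly on the line height = 2 × weight + 30, which gives us a known answer to compare against later.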

Now, we need to split the data into input and output variables. The input variable will be the weight and the output variable will be the height.

let input = Tensor<Float>(shape: [data.count, 1], scalars: data.map { $0.0 })
let output = Tensor<Float>(shape: [data.count, 1], scalars: data.map { $0.1 })

Next, we need to define the model. In this case, we will use a single neuron with a single weight and bias.

struct LinearModel: Layer {
    var weight: Tensor<Float>
    var bias: Tensor<Float>

    init(weight: Tensor<Float>, bias: Tensor<Float>) {
        self.weight = weight
        self.bias = bias
    }

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        return input * weight + bias
    }
}

Now, we need to initialize the model with a random weight and bias. Note that it must be declared with var, since the optimizer will mutate it during training.

var model = LinearModel(weight: Tensor<Float>(randomNormal: [1, 1]),
                        bias: Tensor<Float>(randomNormal: [1]))

Next, we need to define the loss and the optimizer. We will use TensorFlow’s built-in meanSquaredError(predicted:expected:) function as the loss and Adam as the optimizer.

let optimizer = Adam(for: model)

Now, we are ready to start the training process. On each epoch we compute the loss and its gradient over the full dataset, then let the optimizer update the weight and bias.

for epoch in 1...1000 {
    let (loss, 𝛁model) = valueWithGradient(at: model) { model -> Tensor<Float> in
        let ŷ = model(input)
        return meanSquaredError(predicted: ŷ, expected: output)
    }
    if epoch % 100 == 0 {
        print("Epoch \(epoch), loss: \(loss)")
    }
    optimizer.update(&model, along: 𝛁model)
}
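Under the hood, each update follows the gradient of the MSE: ∂L/∂w = (2/n)·Σ(ŷᵢ − yᵢ)xᵢ and ∂L/∂b = (2/n)·Σ(ŷᵢ − yᵢ). Here is a plain-Swift sketch of a single vanilla gradient-descent step on our data (Adam additionally keeps running moment estimates of these gradients, which we omit):

```swift
import Foundation

let xs: [Float] = [65, 70, 75, 80, 85]
let ys: [Float] = [160, 170, 180, 190, 200]

var w: Float = 0
var b: Float = 0
let learningRate: Float = 1e-5  // small, since the inputs are unnormalized

// Mean squared error for a given w and b.
func mse(_ w: Float, _ b: Float) -> Float {
    let errors = zip(xs, ys).map { w * $0 + b - $1 }
    return errors.map { $0 * $0 }.reduce(0, +) / Float(xs.count)
}

let lossBefore = mse(w, b)

// Gradients of the MSE with respect to w and b.
let n = Float(xs.count)
let errors = zip(xs, ys).map { w * $0 + b - $1 }
let dw = 2 / n * zip(errors, xs).map { $0 * $1 }.reduce(0, +)
let db = 2 / n * errors.reduce(0, +)

// One gradient-descent step: move against the gradient.
w -= learningRate * dw
b -= learningRate * db

let lossAfter = mse(w, b)
print("loss: \(lossBefore) -> \(lossAfter)")
```

A single step already lowers the loss; the loop above simply repeats this a thousand times with Adam’s adaptive step sizes.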

Finally, let’s print the weights and biases of the trained model.

print("Weight: \(model.weight)")
print("Bias: \(model.bias)")

Since our data satisfies height = 2 × weight + 30 exactly, a fully converged model should approach a weight of 2.0 and a bias of 30.0. The exact numbers depend on the random initialization, and with unnormalized inputs 1,000 Adam steps may stop well short of that, as in this sample run:

Weight: [[1.0047128]]
Bias: [0.43992753]
