This blog post implements the simple linear regression algorithm that was derived in the previous post.
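As a quick recap of that derivation (restated here in standard least-squares notation, which may differ slightly from the symbols used in the earlier post): given training samples $(x^{(i)}, y^{(i)})$, $i = 1, \dots, m$, the fitted line $\hat{y} = a x + b$ minimizes the squared error, and its parameters have the closed-form solution

$$a = \frac{\sum_{i=1}^{m}\left(x^{(i)} - \bar{x}\right)\left(y^{(i)} - \bar{y}\right)}{\sum_{i=1}^{m}\left(x^{(i)} - \bar{x}\right)^{2}}, \qquad b = \bar{y} - a\,\bar{x},$$

where $\bar{x}$ and $\bar{y}$ are the means of the training inputs and targets. The class below computes exactly these two quantities.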
Below, we wrap this procedure into a class:
import numpy as np


class SimpleLinearRegression1:

    def __init__(self):
        """Initialize the Simple Linear Regression model."""
        self.a_ = None
        self.b_ = None

    def fit(self, x_train, y_train):
        """Train the model on the training sets x_train and y_train."""
        assert x_train.ndim == 1, \
            "Simple linear regression can only solve single feature training data"
        assert len(x_train) == len(y_train), \
            "the size of x_train must be equal to the size of y_train"

        x_mean = np.mean(x_train)
        y_mean = np.mean(y_train)

        # Closed-form least-squares solution for the slope a_ and intercept b_.
        num = 0.0
        d = 0.0
        for x, y in zip(x_train, y_train):
            num += (x - x_mean) * (y - y_mean)
            d += (x - x_mean) ** 2

        self.a_ = num / d
        self.b_ = y_mean - self.a_ * x_mean

        return self

    def predict(self, x_predict):
        """Given a dataset x_predict (a vector), return a vector of predictions."""
        assert x_predict.ndim == 1, \
            "Simple linear regression can only solve single feature training data"
        assert self.a_ is not None and self.b_ is not None, \
            "must fit before predict!"

        return np.array([self._predict(x) for x in x_predict])

    def _predict(self, x_single):
        """Given a single input x_single (a scalar), return its predicted value."""
        return self.a_ * x_single + self.b_

    def __repr__(self):
        return "SimpleLinearRegression1()"
Next, let's start using our own SimpleLinearRegression1.
The full code can be found in 12-簡單線性回歸的實現.
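For reference, here is a minimal usage sketch of the class defined above; the toy data is made up for illustration and is not taken from that notebook:

import numpy as np

# Hypothetical toy data: five points that lie roughly on a line.
x = np.array([1., 2., 3., 4., 5.])
y = np.array([1., 3., 2., 3., 5.])

reg1 = SimpleLinearRegression1()
reg1.fit(x, y)

print(reg1.a_)   # fitted slope: 0.8 for this data
print(reg1.b_)   # fitted intercept: 0.4 for this data

reg1.predict(np.array([6.]))   # array([5.2])

Because fit loops over the samples in Python, it is easy to read but not the fastest option; replacing the loop with vectorized dot products over (x_train - x_mean) and (y_train - y_mean) gives the same result and scales better to large datasets.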