import numpy as np
import random


def gendata(pointcont, bias, variance):
    """Generate 2-D points scattered around the line y = x + b, where b is the bias.

    variance is the base offset of y; the total offset is the base offset
    plus a random offset in [0, 1).

    :param pointcont: number of points to generate
    :param bias: bias added to every y value
    :param variance: base offset added to every y value
    :return: x, an array of points in the plane; y, the label of each point
    """
    x = np.zeros(shape=(pointcont, 2))
    y = np.zeros(shape=pointcont)
    for i in range(0, pointcont):
        x[i][0] = 1  # intercept term
        x[i][1] = i
        y[i] = (i + bias) + random.uniform(0, 1) + variance
    return x, y
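As a quick shape check, the snippet below re-creates gendata (a hypothetical, minimal copy of the function above, so the snippet runs on its own) and inspects what it returns: an (n, 2) design matrix whose first column is the all-ones intercept term, and an (n,) label vector.

```python
import numpy as np
import random

# Minimal re-creation of gendata (mirrors the function above) so this
# snippet is self-contained.
def gendata(pointcont, bias, variance):
    x = np.zeros(shape=(pointcont, 2))
    y = np.zeros(shape=pointcont)
    for i in range(pointcont):
        x[i][0] = 1          # intercept column
        x[i][1] = i
        y[i] = (i + bias) + random.uniform(0, 1) + variance
    return x, y

x, y = gendata(5, 25, 10)
print(x.shape, y.shape)  # (5, 2) (5,)
print(x[:, 0])           # all-ones intercept column
```

Each y[i] lands in [i + bias + variance, i + bias + variance + 1), i.e. on the line plus the combined offsets.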
def gradientdescent(x, y, theta, alpha, itemscont, iters):
    """Minimize the cost by gradient descent.

    cost     = sum(loss**2) / 2m
             = sum((h - y)**2) / 2m
             = sum((x*theta - y)**2) / 2m
    gradient = d(cost)/d(theta)
             = sum(2*(x*theta - y) * x) / 2m
             = sum(2*loss * x) / 2m
             = sum(loss * x) / m

    :param x: input points
    :param y: labels
    :param theta: initial weight parameters
    :param alpha: learning rate
    :param itemscont: size of the data set
    :param iters: number of iterations
    :return: the updated weights
    """
    xtran = np.transpose(x)
    for i in range(iters):
        hypothesis = np.dot(x, theta)  # predicted values
        loss = hypothesis - y          # deviation
        # The loss function can be chosen freely; this is the simplest one.
        cost = np.sum(loss**2) / (2 * itemscont)
        gradient = np.dot(xtran, loss) / itemscont
        theta = theta - alpha * gradient
    return theta
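To see the update rule concretely, here is a single hand-checkable step on a tiny dataset (my own illustration, not from the original post): three points on the line y = 2 + x, starting from theta = 0.

```python
import numpy as np

# Three points on y = 2 + x, with an intercept column, starting at theta = 0.
x = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([2.0, 3.0, 4.0])
theta = np.zeros(2)
m = len(y)

loss = x @ theta - y              # [-2, -3, -4]
cost = np.sum(loss**2) / (2 * m)  # (4 + 9 + 16) / 6
gradient = x.T @ loss / m         # [-3, -11/3]
theta = theta - 0.1 * gradient    # one step with alpha = 0.1
print(cost, gradient, theta)
```

With theta = 0 every prediction is 0, so loss is just -y; the gradient pushes both the intercept and the slope upward, as expected.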
x, y = gendata(100, 25, 10)
print(x, y)
theta = np.ones(2)
theta = gradientdescent(x, y, theta, 0.0005, 100, 10000)
print(theta)
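As a sanity check (my own addition, not part of the original post), the sketch below rebuilds equivalent data with a seeded NumPy generator, runs the same gradient-descent update, and compares the result against NumPy's closed-form least-squares solution. It uses more iterations than the post's 10000, because with alpha = 0.0005 the intercept direction converges slowly.

```python
import numpy as np

# Rebuild the same kind of data (line y = x + 25, plus uniform noise in
# [0, 1) and a base offset of 10) with a seeded generator for repeatability.
rng = np.random.default_rng(0)
n = 100
x = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
y = np.arange(n) + 25 + rng.uniform(0.0, 1.0, size=n) + 10

# The post's gradient-descent update, run long enough that the slow
# (intercept) direction has fully converged before we compare.
theta = np.ones(2)
alpha = 0.0005
for _ in range(200000):
    loss = x @ theta - y
    theta = theta - alpha * (x.T @ loss) / n

# Closed-form least squares for comparison.
theta_ls, *_ = np.linalg.lstsq(x, y, rcond=None)
print(theta, theta_ls)  # the two solutions should agree closely
```

The recovered slope should sit near 1 and the intercept near 35.5 (bias 25 + offset 10 + mean 0.5 of the uniform noise), matching what gradient descent converges to.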