Idea: pick a line y = wx + b, randomly generate some data points near that line (as plotted below), then have TensorFlow build a regression model and learn which values of w and b best fit those points.
1) Randomly generate 1000 data points around y = 0.1x + 0.3, i.e. with w = 0.1 and b = 0.3; later we can check whether the trained model recovers these values of w and b.
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
num_points=1000
vectors_set = []
for i in range(num_points):
    x1 = np.random.normal(0.0, 0.55)                    # x coordinate: Gaussian with mean 0, std 0.55
    y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)   # y coordinate: fluctuates slightly around y1 = x1*0.1 + 0.3
    vectors_set.append([x1, y1])                        # collect the (x, y) pairs
x_data=[v[0] for v in vectors_set]
y_data=[v[1] for v in vectors_set]
plt.scatter(x_data,y_data,c='r')
plt.show()
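As a sanity check (not part of the original notes), the same slope and intercept can be recovered in closed form with ordinary least squares; `np.polyfit` with degree 1 fits a line directly, so the fitted values should land very close to the true w = 0.1 and b = 0.3:

```python
import numpy as np

# Regenerate the same kind of noisy data (assumed setup: w = 0.1, b = 0.3).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.55, 1000)
y = x * 0.1 + 0.3 + rng.normal(0.0, 0.03, 1000)

# A degree-1 polynomial fit is exactly ordinary least squares for a line.
w_hat, b_hat = np.polyfit(x, y, 1)
print("w_hat =", w_hat, "b_hat =", b_hat)  # both close to 0.1 and 0.3
```

This gives a reference answer to compare the gradient-descent result against.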
2) Build a linear regression model and learn which w and b fit the data plotted above.
w = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='w')  # 1-D w, initialized uniformly in [-1, 1]
b = tf.Variable(tf.zeros([1]), name='b')                      # 1-D b, initialized to 0
y = w * x_data + b                                            # computed prediction y
loss = tf.reduce_mean(tf.square(y - y_data), name='loss')     # loss: mean squared error between prediction y and actual y_data
optimizer = tf.train.GradientDescentOptimizer(0.5)            # optimize the parameters by gradient descent, learning rate 0.5
train = optimizer.minimize(loss, name='train')                # training is just minimizing this loss
sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)
print("w =", sess.run(w), "b =", sess.run(b), "loss =", sess.run(loss))      # initial w and b
for step in range(20):  # run 20 training steps
    sess.run(train)
    print("w =", sess.run(w), "b =", sess.run(b), "loss =", sess.run(loss))  # w and b after each step

Printing after every step gives the output below. As the iterations proceed, the trained w and b get closer and closer to 0.1 and 0.3, which shows the regression model really did learn the rule behind the generated data. The loss starts out large and then gradually shrinks, meaning the model fits better and better as training goes on.
w = [-0.9676645] b = [0.] loss = 0.45196822
w = [-0.6281831] b = [0.29385352] loss = 0.17074569
w = [-0.39535886] b = [0.29584622] loss = 0.07962803
w = [-0.23685378] b = [0.2972129] loss = 0.03739688
w = [-0.12894464] b = [0.2981433] loss = 0.017823622
w = [-0.05548081] b = [0.29877672] loss = 0.008751821
w = [-0.00546716] b = [0.29920793] loss = 0.0045472304
w = [0.02858179] b = [0.2995015] loss = 0.0025984894
w = [0.05176209] b = [0.29970136] loss = 0.0016952885
w = [0.06754307] b = [0.29983744] loss = 0.0012766734
w = [0.07828666] b = [0.29993007] loss = 0.001082654
w = [0.08560082] b = [0.29999313] loss = 0.0009927301
w = [0.09058025] b = [0.30003607] loss = 0.0009510521
w = [0.09397022] b = [0.30006528] loss = 0.00093173544
w = [0.09627808] b = [0.3000852] loss = 0.00092278246
w = [0.09784925] b = [0.30009875] loss = 0.000918633
w = [0.09891889] b = [0.30010796] loss = 0.00091670983
w = [0.0996471] b = [0.30011424] loss = 0.0009158184
w = [0.10014286] b = [0.3001185] loss = 0.00091540517
w = [0.10048037] b = [0.30012143] loss = 0.0009152137
w = [0.10071015] b = [0.3001234] loss = 0.0009151251
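The `tf.Session` API above is TensorFlow 1.x. For reference, here is a hand-rolled NumPy sketch of the same training loop, with the gradients of the mean-squared-error loss written out by hand (a hypothetical equivalent for illustration, not code from the course):

```python
import numpy as np

# Same data-generation scheme as above.
rng = np.random.default_rng(42)
x_data = rng.normal(0.0, 0.55, 1000)
y_data = x_data * 0.1 + 0.3 + rng.normal(0.0, 0.03, 1000)

w = rng.uniform(-1.0, 1.0)  # same init scheme as the TF code: uniform in [-1, 1]
b = 0.0                     # same init as tf.zeros
lr = 0.5                    # learning rate 0.5, as above

for step in range(20):
    err = w * x_data + b - y_data
    # Gradients of mean((w*x + b - y)^2) with respect to w and b.
    grad_w = 2.0 * np.mean(err * x_data)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print("w =", w, "b =", b)  # approaches 0.1 and 0.3, matching the TF run
```

The convergence pattern matches the printed output above: b snaps to roughly 0.3 almost immediately (its gradient depends only on the mean error), while w closes in on 0.1 geometrically over the 20 steps.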
Note: the above are the notes I took while studying Tang Yudi's TensorFlow course.