This post makes a small modification to the official example and fits a curve with TensorFlow; it should be helpful to TensorFlow beginners.
The network is trained for 500 steps; the loss is printed every 50 steps, and the fitted curve is drawn live with matplotlib (plt).
(Figures omitted: the plot at the start of training, after 200 training steps, and after training completes.)
The complete code is listed below:
#coding: utf-8
#author: 吳晶
#wechat: 18007148050
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
def add_layer(inputs, in_size, out_size, activation_function=None):
    # Fully connected layer: weights drawn from a normal distribution,
    # biases initialized to a small positive constant.
    weights = tf.Variable(tf.random_normal([in_size, out_size]))
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)
    wx_plus_b = tf.matmul(inputs, weights) + biases
    if activation_function is None:
        outputs = wx_plus_b
    else:
        outputs = activation_function(wx_plus_b)
    return outputs
x_data = np.linspace(-1,1,300)[:,np.newaxis]
noise = np.random.normal(0,0.05,x_data.shape)
y_data = np.square(x_data) - 0.5 + noise
xs = tf.placeholder(tf.float32, [None, 1])
ys = tf.placeholder(tf.float32, [None, 1])
l1 = add_layer(xs,1,10,activation_function=tf.nn.relu)
prediction = add_layer(l1, 10, 1, activation_function=None)
loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction),reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    # Scatter the noisy training data once; the red prediction curve is redrawn each step.
    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1)
    ax.scatter(x_data, y_data)
    plt.show(block=False)
    sess.run(init)
    for train in range(500):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)
        plt.pause(0.1)
        # Remove the curve just drawn so the next iteration can replace it.
        try:
            ax.lines.remove(lines[0])
        except Exception:
            pass
        if train % 50 == 0:
            print(train, sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
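
The listing above uses the TensorFlow 1.x graph API (tf.placeholder, tf.Session, tf.train.GradientDescentOptimizer), so it runs as-is only on a 1.x installation. A minimal sketch for running it under TensorFlow 2.x, assuming the tensorflow.compat.v1 module that ships with 2.x, is to swap the import at the top of the script:

# Sketch: run the 1.x-style code above under TensorFlow 2.x (assumes TF 2.x is installed).
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()  # disable eager execution so placeholders and sessions work
import numpy as np
import matplotlib.pyplot as plt
# ... the rest of the script is unchanged ...

Everything after the import block stays the same, since compat.v1 exposes the old names (placeholder, Session, random_normal, GradientDescentOptimizer) unchanged.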