Update 2020-12-03: isn't PyTorch just nicer…
Contents: 1. losses; 2. metrics

Variable definitions: y_true is the ground-truth label, y_pred is the model's prediction.

Loss functions covered here fall into two categories: probabilistic losses and regression losses.

Probabilistic losses
BinaryCrossentropy class

# example inputs
# y_true = [[0., 1.], [0., 0.]]
# y_pred = [[0.6, 0.4], [0.4, 0.6]]
tf.keras.losses.BinaryCrossentropy(
    from_logits=False,
    label_smoothing=0,
    reduction="auto",
    name="binary_crossentropy",
)
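To make the formula concrete, here is a minimal pure-Python sketch of what binary cross-entropy computes for the example inputs above. This is not the Keras implementation (which also handles logits, label smoothing, and configurable reduction); the epsilon clipping here is my own assumption to avoid log(0), similar in spirit to Keras's internal epsilon.

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy over a batch.

    Per element: -(y*log(p) + (1-y)*log(1-p)); averaged per example,
    then averaged over the batch.
    """
    per_example = []
    for yt_row, yp_row in zip(y_true, y_pred):
        terms = []
        for yt, yp in zip(yt_row, yp_row):
            yp = min(max(yp, eps), 1.0 - eps)  # clip probabilities away from 0 and 1
            terms.append(-(yt * math.log(yp) + (1.0 - yt) * math.log(1.0 - yp)))
        per_example.append(sum(terms) / len(terms))
    return sum(per_example) / len(per_example)

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
print(round(binary_crossentropy(y_true, y_pred), 4))  # ≈ 0.8149
```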
CategoricalCrossentropy class

# example inputs
# y_true = [[0, 1, 0], [0, 0, 1]]
# y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]]
tf.keras.losses.CategoricalCrossentropy(
    from_logits=False,
    label_smoothing=0,
    reduction="auto",
    name="categorical_crossentropy",
)
SparseCategoricalCrossentropy class

# example inputs
# y_true = [1, 2]
# y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]]
tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=False,
    reduction="auto",
    name="sparse_categorical_crossentropy",
)
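The only difference between the two classes above is the label format: one-hot rows versus integer class indices. A small pure-Python sketch (not the Keras code, which clips probabilities and supports label smoothing) shows that both compute the same loss on the example inputs:

```python
import math

def categorical_crossentropy(y_true_onehot, y_pred):
    # -sum(y * log(p)) per example, averaged over the batch;
    # terms with y == 0 contribute nothing, so p == 0 there is harmless
    losses = []
    for yt, yp in zip(y_true_onehot, y_pred):
        losses.append(-sum(t * math.log(p) for t, p in zip(yt, yp) if t > 0))
    return sum(losses) / len(losses)

def sparse_categorical_crossentropy(y_true_idx, y_pred):
    # same loss, but each label is a class index instead of a one-hot row
    losses = [-math.log(yp[idx]) for idx, yp in zip(y_true_idx, y_pred)]
    return sum(losses) / len(losses)

y_pred = [[0.05, 0.95, 0.0], [0.1, 0.8, 0.1]]
dense = categorical_crossentropy([[0, 1, 0], [0, 0, 1]], y_pred)
sparse = sparse_categorical_crossentropy([1, 2], y_pred)
print(round(dense, 4), round(sparse, 4))  # both ≈ 1.1769
```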
Poisson class

Poisson loss.
Reference: a detailed introduction to Poisson regression.
Definition:
loss = y_pred - y_true * log(y_pred)
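A direct pure-Python reading of the definition, with made-up example values (the real Keras version adds a small epsilon inside the log; assuming y_pred > 0 here instead):

```python
import math

def poisson_loss(y_true, y_pred):
    # mean of (y_pred - y_true * log(y_pred)); assumes every y_pred > 0
    terms = [yp - yt * math.log(yp) for yt, yp in zip(y_true, y_pred)]
    return sum(terms) / len(terms)

print(round(poisson_loss([1., 2.], [1., 2.]), 4))  # ≈ 0.8069
```

Note the loss is not zero even for a perfect prediction; only its gradient vanishes there, which is what matters for training.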
KLDivergence class

KL divergence.
Definition: loss = y_true * log(y_true / y_pred)
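Summed over the classes of a probability distribution, the definition gives the usual KL divergence. A minimal sketch with made-up distributions (the Keras version additionally clips both inputs to avoid division by zero; here terms with y_true == 0 are simply skipped, which matches the 0·log 0 = 0 convention):

```python
import math

def kl_divergence(y_true, y_pred):
    # sum of y_true * log(y_true / y_pred) over the distribution's entries
    return sum(yt * math.log(yt / yp) for yt, yp in zip(y_true, y_pred) if yt > 0)

print(round(kl_divergence([0.5, 0.5], [0.25, 0.75]), 4))  # ≈ 0.1438
```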
Regression losses
MeanSquaredError class

Mean squared error:
loss = square(y_true - y_pred)
MeanAbsoluteError class

Mean absolute error:
loss = abs(y_true - y_pred)
MeanAbsolutePercentageError class

Mean absolute percentage error:
loss = 100 * abs(y_true - y_pred) / y_true
MeanSquaredLogarithmicError class

Mean squared logarithmic error (hard to find a good translation for this one):
loss = square(log(y_true + 1.) - log(y_pred + 1.))
CosineSimilarity class

Cosine similarity:
loss = -sum(l2_norm(y_true) * l2_norm(y_pred))
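In other words: L2-normalize both vectors, take their dot product, and negate it, so perfectly aligned predictions give a loss of -1. A pure-Python sketch of that formula (the Keras class additionally supports an `axis` argument and batched inputs, which are omitted here):

```python
import math

def cosine_similarity_loss(y_true, y_pred):
    # loss = -sum(l2_norm(y_true) * l2_norm(y_pred)), i.e. minus the cosine
    # of the angle between the two vectors; -1 means same direction
    nt = math.sqrt(sum(v * v for v in y_true))
    npred = math.sqrt(sum(v * v for v in y_pred))
    return -sum((a / nt) * (b / npred) for a, b in zip(y_true, y_pred))

print(round(cosine_similarity_loss([1., 2., 3.], [2., 4., 6.]), 6))  # -1.0 (parallel vectors)
```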
Huber class

Reduces the influence of outliers on the loss:
loss = 0.5 * x^2                    if |x| <= d
loss = 0.5 * d^2 + d * (|x| - d)    if |x| > d
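The piecewise definition above is quadratic for small errors and linear for large ones, and the two branches meet at |x| = d so the loss stays continuous. A sketch of the per-element formula (the Keras class applies this elementwise and then reduces over the batch):

```python
def huber(x, d=1.0):
    # 0.5 * x^2 for small errors, linear growth beyond |x| = d
    if abs(x) <= d:
        return 0.5 * x * x
    return 0.5 * d * d + d * (abs(x) - d)

# both branches give 0.5 * d^2 at |x| = d, so the transition is seamless
print(huber(0.5), huber(1.0), huber(3.0))  # 0.125 0.5 2.5
```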
LogCosh class

logcosh = log((exp(x) + exp(-x)) / 2), where x is the error y_pred - y_true.
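log(cosh(x)) behaves like the Huber loss without a tunable d: approximately 0.5·x² for small errors and approximately |x| - log(2) for large ones, but smooth everywhere. A direct sketch of the formula (the actual Keras implementation uses a numerically stabler form based on softplus):

```python
import math

def logcosh(y_true, y_pred):
    # log((exp(x) + exp(-x)) / 2) with x = y_pred - y_true
    x = y_pred - y_true
    return math.log((math.exp(x) + math.exp(-x)) / 2.0)

print(logcosh(1.0, 1.0))  # 0.0 — no error, no loss
```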
Metrics

Surprisingly, even the official Keras docs list losses among the metrics.

Classification metrics based on true/false positives & negatives
AUC class (area under the curve)

tf.keras.metrics.AUC(
    num_thresholds=200,
    curve="ROC",
    summation_method="interpolation",
    name=None,
    dtype=None,
    thresholds=None,
    multi_label=False,
    label_weights=None,
)
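For intuition about what this metric measures: ROC AUC equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one. The sketch below computes that exact pairwise statistic in pure Python; note this is not how `tf.keras.metrics.AUC` works internally — it approximates the curve with a Riemann sum over `num_thresholds` buckets, which is cheaper on large streams of data.

```python
def roc_auc(labels, scores):
    """Exact ROC AUC as the fraction of (positive, negative) pairs
    where the positive is scored higher; ties count as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.4, 0.3]))  # 1.0 — positives all outrank negatives
```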
Precision class
Recall class

PrecisionAtRecall class
Computes the precision at the point where recall >= a specified value.

TruePositives class, TrueNegatives class, FalsePositives class, FalseNegatives class
SensitivityAtSpecificity class
SpecificityAtSensitivity class
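All of the metric classes above are built from the same four counts: TP, TN, FP, FN. A minimal sketch of how precision and recall fall out of those counts on hard binary predictions (the Keras classes are stateful, accumulate counts across batches via `update_state`, and can threshold soft scores; the toy labels below are made up):

```python
def precision_recall(y_true, y_pred):
    # count true positives, false positives, and false negatives
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of actual positives, how many were found
    return precision, recall

print(precision_recall([1, 1, 0, 0], [1, 0, 1, 0]))  # (0.5, 0.5)
```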