
TensorFlow Linear Regression

Using TensorFlow 2.0.0a0.

1. Set up a one-variable linear equation

2. Compute the cost (loss) function

3. Minimize the cost via gradient descent

4. Output W and b
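The four steps can be sketched first in plain NumPy as subgradient descent on the mean absolute error. This is only an illustrative sketch (it reuses the toy data from the TensorFlow code below), not the original implementation:

```python
import numpy as np

# Same toy data as in the TensorFlow example
tx = np.array([1, 2, 2, 3, 3, 4, 5, 6, 6, 6, 8, 10], dtype=float)
ty = np.array([-890, -1411, -1560, -2220, -2091, -2878,
               -3537, -3268, -3920, -4163, -5471, -5157], dtype=float)

w, b = 0.0, 0.0
lr = 1.0  # tolerable here because the MAE (sub)gradient is bounded

for epoch in range(9000):
    pred = w * tx + b                  # step 1: linear model
    sign = np.sign(pred - ty)          # subgradient of |pred - y|
    w -= lr * np.mean(sign * tx)       # step 3: gradient descent on w
    b -= lr * np.mean(sign)            # ...and on b

cost = np.mean(np.abs(w * tx + b - ty))  # step 2: MAE cost
print("w =", w, "b =", b, "cost =", cost)  # step 4
```

With a constant step size the iterates oscillate around the L1 optimum instead of converging exactly, but the fitted slope and intercept land in the same neighborhood as the TensorFlow run.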


The code is as follows:

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

tf.compat.v1.disable_eager_execution()

tx = np.asanyarray([1,2,2,3,3,4,5,6,6,6,8,10])
ty = np.asanyarray([-890,-1411,-1560,-2220,-2091,-2878,-3537,-3268,-3920,-4163,-5471,-5157])

n = len(tx)

X = tf.compat.v1.placeholder(tf.float32)
Y = tf.compat.v1.placeholder(tf.float32)

W = tf.Variable(np.random.rand(), name="w")
b = tf.Variable(np.random.rand(), name="b")

pred = tf.add(tf.multiply(X, W), b)

# Mean absolute error (L1) cost over the n samples
cost = tf.reduce_sum(tf.abs(pred - Y)) / n

# A learning rate of 1 is workable here because the L1 (sub)gradient is bounded
optimizer = tf.compat.v1.train.GradientDescentOptimizer(1).minimize(cost)

init = tf.compat.v1.global_variables_initializer()

with tf.compat.v1.Session() as sess:
    sess.run(init)
    for epoch in range(9000):
        for (x, y) in zip(tx, ty):
            sess.run(optimizer, feed_dict={X: x, Y: y})

        if (epoch + 1) % 50 == 0:
            # Evaluate the cost on the full data set, not just the last sample
            c = sess.run(cost, feed_dict={X: tx, Y: ty})
            print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(c),
                  "W=", sess.run(W), "b=", sess.run(b))

    print("Optimization Finished!")
    training_cost = sess.run(cost, feed_dict={X: tx, Y: ty})
    print("Training cost=", training_cost, "W=", sess.run(W), "b=", sess.run(b), '\n')
    plt.plot(tx, ty, 'ro', label='Original data')
    plt.plot(tx, sess.run(W) * tx + sess.run(b), label='Fitted line')
    plt.legend()
    plt.show()
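Since the script already runs on TF 2.x through the compat.v1 shims, the same fit can be written in native eager style with tf.GradientTape and tf.keras.optimizers.SGD. This is a sketch under the assumption of full-batch updates (the original feeds one sample at a time), so fewer epochs suffice:

```python
import numpy as np
import tensorflow as tf

tx = np.array([1, 2, 2, 3, 3, 4, 5, 6, 6, 6, 8, 10], dtype=np.float32)
ty = np.array([-890, -1411, -1560, -2220, -2091, -2878,
               -3537, -3268, -3920, -4163, -5471, -5157], dtype=np.float32)

W = tf.Variable(np.random.rand(), dtype=tf.float32)
b = tf.Variable(np.random.rand(), dtype=tf.float32)
opt = tf.keras.optimizers.SGD(learning_rate=1.0)

for epoch in range(2000):
    with tf.GradientTape() as tape:
        pred = W * tx + b
        cost = tf.reduce_mean(tf.abs(pred - ty))  # same MAE cost
    grads = tape.gradient(cost, [W, b])           # d(cost)/dW, d(cost)/db
    opt.apply_gradients(zip(grads, [W, b]))

print("W =", W.numpy(), "b =", b.numpy(), "cost =", cost.numpy())
```

No placeholders, Session, or explicit initialization are needed; the tape records the forward pass and differentiates the cost directly.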
