[TensorFlow 2.0] Low-level API: tensor operations, computation graphs, automatic differentiation


The example below implements a linear regression model using TensorFlow's low-level API.

The low-level API mainly covers tensor operations, computation graphs, and automatic differentiation.
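As a minimal sketch of the automatic-differentiation part (assuming TensorFlow 2.x is installed), `tf.GradientTape` records the tensor operations executed inside its context and can then differentiate through them:

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x + 2.0 * x          # y = x^2 + 2x
dy_dx = tape.gradient(y, x)      # dy/dx = 2x + 2 = 8 at x = 3

print(float(dy_dx))              # 8.0
```

The same tape mechanism drives the training loop further below, where the tape differentiates the loss with respect to the model parameters.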

import tensorflow as tf

# print a timestamped divider line
@tf.function
def printbar():
    ts = tf.timestamp()
    today_ts = ts % (24 * 60 * 60)

    hour = tf.cast(today_ts // 3600 + 8, tf.int32) % tf.constant(24)  # +8: UTC+8 timezone
    minute = tf.cast((today_ts % 3600) // 60, tf.int32)
    second = tf.cast(tf.floor(today_ts % 60), tf.int32)

    def timeformat(m):
        if tf.strings.length(tf.strings.format("{}", m)) == 1:
            return tf.strings.format("0{}", m)
        else:
            return tf.strings.format("{}", m)

    timestring = tf.strings.join([timeformat(hour), timeformat(minute),
                                  timeformat(second)], separator=":")
    tf.print("==========" * 8, end="")
    tf.print(timestring)
 
# number of samples
n = 400

# generate a test dataset
X = tf.random.uniform([n, 2], minval=-10, maxval=10)
w0 = tf.constant([[2.0], [-1.0]])
b0 = tf.constant(3.0)
Y = X@w0 + b0 + tf.random.normal([n, 1], mean=0.0, stddev=2.0)  # @ is matrix multiplication; add Gaussian noise
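Note that `@` is Python's matrix-multiplication operator, which TensorFlow overloads as `tf.matmul`. A small sketch of what it computes:

```python
import tensorflow as tf

A = tf.constant([[1.0, 2.0]])      # shape (1, 2): one sample with two features
v = tf.constant([[2.0], [-1.0]])   # shape (2, 1): same values as w0 above
print(float(A @ v))                # 1*2 + 2*(-1) = 0.0
```

So `X@w0` with `X` of shape `(n, 2)` and `w0` of shape `(2, 1)` yields an `(n, 1)` column of predictions, to which the scalar `b0` is broadcast.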
 
# debug with eager (dynamic-graph) execution

w = tf.Variable(tf.random.normal(w0.shape))
b = tf.Variable(0.0)

def train(epoches):
    for epoch in tf.range(1, epoches + 1):
        with tf.GradientTape() as tape:
            # forward pass: compute the loss
            Y_hat = X@w + b
            loss = tf.squeeze(tf.transpose(Y - Y_hat)@(Y - Y_hat)) / (2.0 * n)

        # backward pass: compute the gradients
        dloss_dw, dloss_db = tape.gradient(loss, [w, b])
        # update the parameters by gradient descent
        w.assign(w - 0.001 * dloss_dw)
        b.assign(b - 0.001 * dloss_db)
        if epoch % 1000 == 0:
            printbar()
            tf.print("epoch =", epoch, " loss =", loss)
            tf.print("w =", w)
            tf.print("b =", b)
            tf.print("")

train(5000)

Result:

================================================================================15:18:17
epoch = 1000  loss = 2.66289544
w = [[2.0176034]
 [-1.02091444]]
b = 1.92718041

================================================================================15:18:19
epoch = 2000  loss = 2.12707591
w = [[2.01378]
 [-1.01979101]]
b = 2.63039422

================================================================================15:18:21
epoch = 3000  loss = 2.05447602
w = [[2.01237178]
 [-1.01937926]]
b = 2.88924217

================================================================================15:18:23
epoch = 4000  loss = 2.04463911
w = [[2.01185489]
 [-1.01922464]]
b = 2.98452425

================================================================================15:18:24
epoch = 5000  loss = 2.04330635
w = [[2.01166272]
 [-1.01917028]]
b = 3.01959634
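As a sanity check not in the original: linear regression also has a closed-form least-squares solution, which `tf.linalg.lstsq` computes once `X` is augmented with a column of ones for the intercept. The gradient-descent estimates above should land near it (and near the true `w0`, `b0`):

```python
import tensorflow as tf

tf.random.set_seed(42)
n = 400
X = tf.random.uniform([n, 2], minval=-10, maxval=10)
w0 = tf.constant([[2.0], [-1.0]])
b0 = tf.constant(3.0)
Y = X @ w0 + b0 + tf.random.normal([n, 1], mean=0.0, stddev=2.0)

# augment with a ones column so the intercept is estimated too
X_aug = tf.concat([X, tf.ones([n, 1])], axis=1)  # shape (n, 3)
theta = tf.linalg.lstsq(X_aug, Y)                # shape (3, 1): [w1, w2, b]

print(theta)  # close to [2, -1, 3]
```

With noise of standard deviation 2 over 400 samples, the standard errors are small, so both the closed-form and the iterative estimates recover the true parameters to within a few hundredths.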

Converting to a static graph for speed:

# use the autograph mechanism to convert the training loop into a static graph for speed
w = tf.Variable(tf.random.normal(w0.shape))
b = tf.Variable(0.0)

@tf.function
def train(epoches):
    # same body as the eager version above
    for epoch in tf.range(1, epoches + 1):
        with tf.GradientTape() as tape:
            Y_hat = X@w + b
            loss = tf.squeeze(tf.transpose(Y - Y_hat)@(Y - Y_hat)) / (2.0 * n)
        dloss_dw, dloss_db = tape.gradient(loss, [w, b])
        w.assign(w - 0.001 * dloss_dw)
        b.assign(b - 0.001 * dloss_db)
        if epoch % 1000 == 0:
            printbar()
            tf.print("epoch =", epoch, " loss =", loss)
            tf.print("w =", w)
            tf.print("b =", b)
            tf.print("")

train(5000)

Result:

================================================================================15:19:50
epoch = 1000  loss = 2.6668539
w = [[2.01762223]
 [-1.02092016]]
b = 1.92363214

================================================================================15:19:51
epoch = 2000  loss = 2.12761354
w = [[2.01378703]
 [-1.01979291]]
b = 2.6290853

================================================================================15:19:52
epoch = 3000  loss = 2.0545485
w = [[2.0123744]
 [-1.01938]]
b = 2.888762

================================================================================15:19:53
epoch = 4000  loss = 2.04464912
w = [[2.01185584]
 [-1.019225]]
b = 2.98434567

================================================================================15:19:54
epoch = 5000  loss = 2.04330778
w = [[2.0116632]
 [-1.0191704]]
b = 3.01952934
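Comparing the two timestamp columns, the static-graph run finishes each 1000 epochs in roughly half the time. A sketch of why `@tf.function` helps (assuming TF 2.x): the Python body is traced into a graph once per input signature, and later calls run the compiled graph without re-executing the Python code:

```python
import tensorflow as tf

trace_count = []

@tf.function
def square(x):
    trace_count.append(1)   # Python side effect: runs only while tracing
    return x * x

square(tf.constant(2.0))
square(tf.constant(3.0))    # same dtype/shape signature: reuses the graph
print(len(trace_count))     # 1
```

This is why the decorated `train` above loops with `tf.range` and prints with `tf.print`: those become graph ops, whereas plain `range` and `print` would only run during tracing.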

 

References:

Open-source e-book: https://lyhue1991.github.io/eat_tensorflow2_in_30_days/

GitHub project: https://github.com/lyhue1991/eat_tensorflow2_in_30_days
