import tensorflow as tf

class CustGradClass:

    def __init__(self):
        pass

    @tf.custom_gradient
    def f(self, x):
        fx = x
        def grad(dy):
            return dy * 1
        return fx, grad
I am getting the following error:
ValueError: Attempt to convert a value (<__main__.CustGradClass object at 0x12ed91710>) with an unsupported type (<class '__main__.CustGradClass'>) to a Tensor.
The reason is that tf.custom_gradient accepts a function f(*x), where x is a sequence of Tensors, and the first argument being passed here is the object itself, i.e. self.
From the documentation:
f: function f(*x) that returns a tuple (y,grad_fn) where:
x is a sequence of Tensor inputs to the function.
y is a Tensor or sequence of Tensor outputs of applying TensorFlow operations in f to x.
grad_fn is a function with the signature g(*grad_ys)
How can I make this work? Do I need to inherit from some Python TensorFlow class?
I am using TF version 1.12.0 and eager mode.
Best answer
Here is a possible simple workaround:
import tensorflow as tf

class CustGradClass:

    def __init__(self):
        self.f = tf.custom_gradient(lambda x: CustGradClass._f(self, x))

    @staticmethod
    def _f(self, x):
        fx = x * 1
        def grad(dy):
            return dy * 1
        return fx, grad

with tf.Graph().as_default(), tf.Session() as sess:
    x = tf.constant(1.0)
    c = CustGradClass()
    y = c.f(x)
    print(tf.gradients(y, x))
    # [<tf.Tensor 'gradients/IdentityN_grad/mul:0' shape=() dtype=float32>]
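Since the question mentions eager mode, here is a minimal sketch (not part of the original answer) of the same workaround under eager execution, assuming TF 1.12 with tf.enable_eager_execution() called at program startup and that tf.custom_gradient behaves the same way eagerly:

import tensorflow as tf
tf.enable_eager_execution()  # must run before any TF operations are created

class CustGradClass:

    def __init__(self):
        # Bind self up front so tf.custom_gradient only ever sees a function of tensors.
        self.f = tf.custom_gradient(lambda x: CustGradClass._f(self, x))

    @staticmethod
    def _f(self, x):
        fx = x * 1
        def grad(dy):
            return dy * 1
        return fx, grad

c = CustGradClass()
x = tf.constant(1.0)
with tf.GradientTape() as tape:
    tape.watch(x)              # constants are not watched automatically
    y = c.f(x)
print(tape.gradient(y, x))     # expected: a scalar tensor with value 1.0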
EDIT:
If you want to do this multiple times on different classes, or just want a more reusable solution, you can use a decorator like this:
import functools
import tensorflow as tf

def tf_custom_gradient_method(f):
    @functools.wraps(f)
    def wrapped(self, *args, **kwargs):
        if not hasattr(self, '_tf_custom_gradient_wrappers'):
            self._tf_custom_gradient_wrappers = {}
        if f not in self._tf_custom_gradient_wrappers:
            self._tf_custom_gradient_wrappers[f] = tf.custom_gradient(
                lambda *a, **kw: f(self, *a, **kw))
        return self._tf_custom_gradient_wrappers[f](*args, **kwargs)
    return wrapped
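On the first call, the decorator lazily builds a tf.custom_gradient-wrapped closure with self already bound and caches it per instance (keyed by the undecorated method) in _tf_custom_gradient_wrappers, so TensorFlow only ever sees a function of tensor arguments.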
Then you can simply do:
class CustGradClass:

    def __init__(self):
        pass

    @tf_custom_gradient_method
    def f(self, x):
        fx = x * 1
        def grad(dy):
            return dy * 1
        return fx, grad

    @tf_custom_gradient_method
    def f2(self, x):
        fx = x * 2
        def grad(dy):
            return dy * 2
        return fx, grad
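For completeness, a short usage sketch (not in the original answer), in the same graph-mode style as the first example and assuming the decorator and class defined above:

with tf.Graph().as_default(), tf.Session() as sess:
    x = tf.constant(3.0)
    c = CustGradClass()
    y1 = c.f(x)
    y2 = c.f2(x)
    # f scales gradients by 1 and f2 by 2, so this should print [1.0, 2.0]
    print(sess.run(tf.gradients(y1, x) + tf.gradients(y2, x)))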