Can't use tf.timestamp() from within @tf.function with XLA / jit_compile=True
I would like to use tf.timestamp() when it is available (eager mode and graph mode without XLA), and fall back to 0. (or a better fallback if there is one) when it is not available (under XLA, i.e. @tf.function(jit_compile=True)).
I tried this:
import tensorflow as tf

def tf_timestamp_or_zero():
    try:
        return tf.timestamp()
    except tf.errors.InvalidArgumentError:
        return 0.

#@tf.function
@tf.function(jit_compile=True)
def __call__(self):
    #...
    #t = tf.timestamp()  # same error as below
    t = tf_timestamp_or_zero()  # error, see below
    #...
Error:
tensorflow.python.framework.errors_impl.InvalidArgumentError:
Detected unsupported operations when trying to compile graph
_inference___call___5559[
_XlaMustCompile=true,
config_proto=3175580994766145631,
executor_type=11160318154034397263
] on XLA_CPU_JIT:
Timestamp (No registered 'Timestamp' OpKernel for XLA_CPU_JIT devices
compatible with node {{node Timestamp}}){{node Timestamp}}
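For illustration, a minimal sketch of one possible fallback, assuming the caller knows in advance whether the function will be XLA-compiled (the use_timestamp flag and the two wrapper functions below are hypothetical names, not part of the code above): because a plain Python bool is resolved while tf.function traces the graph, the unsupported Timestamp op never ends up in the jit-compiled graph.

import tensorflow as tf

def timestamp_or_zero(use_timestamp):
    # use_timestamp is a plain Python bool, so this branch is decided while
    # tf.function traces the graph, not at run time.
    if use_timestamp:
        return tf.timestamp()
    # tf.timestamp() returns a float64 scalar, so the fallback matches it.
    return tf.constant(0.0, dtype=tf.float64)

@tf.function(jit_compile=True)
def compiled_step():
    # Under XLA the Timestamp op is unsupported, so skip it at trace time.
    return timestamp_or_zero(use_timestamp=False)

@tf.function
def graph_step():
    # Plain graph mode supports Timestamp, so use the real clock.
    return timestamp_or_zero(use_timestamp=True)

print(compiled_step().numpy())  # 0.0
print(graph_step().numpy())     # current Unix time in seconds

This only works if the choice can be made up front, though; it does not detect at run time whether the surrounding graph is being XLA-compiled.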