save model for inference later #15

Open
neuralminds opened this issue May 7, 2020 · 0 comments

neuralminds commented May 7, 2020

Thanks for the example code. I experimented with text classification using the character RNN example (https://github.com/baidu-research/tensorflow-allreduce/blob/master/tensorflow/examples/learn/text_classification_character_rnn.py).

How can I write a `serving_input_fn` for it? I want to save this model and restore it later.

I extended the code to save the model, but I'm getting an error. Please help. I added:

```python
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils

feature_spec = {"feature": tf.FixedLenFeature([100], tf.int64)}
serving_input_fn = input_fn_utils.build_parsing_serving_input_fn(feature_spec)
```

and then:

```python
classifier.export_savedmodel(export_dir_base='model', serving_input_receiver_fn=serving_input_fn)
```

and I'm getting this error:

```
TypeError: Failed to convert object of type <class 'dict'> to Tensor. Contents: {'feature': <tf.Tensor 'ParseExample/ParseExample:0' shape=(?, 100) dtype=int64>}. Consider casting elements to a supported type.
```
