"grpc_message":"Serving signature key "serving_default" not found.","grpc_status":9

【"grpc_message":"Serving signature key "serving_default" not found.","grpc_status":9】最近在学习Tensorflow Serving部署服务的时候,遇到了一个很大的问题,这里做一下记录。我主要是通过这个 tensorflow-serving-example 学习Tensorflow serving的使用。在最后一步,运行grpc-mnist-client.py时遇到这个问题,调试栈如下:

Traceback (most recent call last):
  File "/home/wuyenan/LiZeB/tensorflow-serving-example/python/grpc_mnist_client.py", line 55, in <module>
    run(args.host, args.port, args.image, args.model, args.signature_name)
  File "/home/wuyenan/LiZeB/tensorflow-serving-example/python/grpc_mnist_client.py", line 33, in run
    result = stub.Predict(request, 10.0)
  File "/home/wuyenan/anaconda3/envs/tensorflow-serving-example/lib/python2.7/site-packages/grpc/_channel.py", line 565, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/home/wuyenan/anaconda3/envs/tensorflow-serving-example/lib/python2.7/site-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
    raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "Serving signature key "serving_default" not found."
    debug_error_string = "{"created":"@1562657172.223509298","description":"Error received from peer ipv4:127.0.0.1:8500","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Serving signature key "serving_default" not found.","grpc_status":9}"
>

The key part is this line:
debug_error_string = "{"created":"@1562657172.223509298","description":"Error received from peer ipv4:127.0.0.1:8500","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Serving signature key "serving_default" not found.","grpc_status":9}"
From this we can tell that the serving signatures don't match: the server could not find a signature named "serving_default". I had already run the model in Docker, i.e. the server-side program started without errors, so a serving signature must have been exported correctly; the client just wasn't requesting the right one.
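For reference, here is a minimal sketch of that client call, assuming the standard tensorflow-serving-api gRPC stubs. The model name 'mnist', the input name 'image', the shape, and the dummy data are illustrative assumptions and must match whatever the exported model actually defines (depending on the TensorFlow version, make_tensor_proto may live under tf.contrib.util instead of tf):

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the gRPC port of the serving container (8500 by default).
channel = grpc.insecure_channel('127.0.0.1:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'                      # model name the server is serving (assumed)
request.model_spec.signature_name = 'serving_default'  # must be a signature key present in the SavedModel

image = np.zeros((1, 784), dtype=np.float32)           # dummy input for illustration
request.inputs['image'].CopyFrom(tf.make_tensor_proto(image, shape=[1, 784]))

result = stub.Predict(request, 10.0)                   # 10-second deadline
print(result)

The signature_name set on request.model_spec is exactly the string the error message complains about, which is why a mismatch here produces FAILED_PRECONDITION.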
I found a related question on Stack Overflow:
https://stackoverflow.com/questions/53199932/serving-retrained-tensorflow-hub-module-with-new-features
From it I learned that there is also a CLI tool for inspecting a saved serving model; the official CLI documentation is here: https://www.tensorflow.org/guide/saved_model#cli_to_inspect_and_execute_savedmodel
There I learned that the following command shows the signature information of a saved model. Sure enough, the serving signature names did not match; after fixing the name, the client ran normally.
saved_model_cli show --dir /tmp/saved_model_dir --tag_set serve
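The same check can also be done from Python. A minimal sketch, assuming TF 1.x and the same hypothetical /tmp/saved_model_dir path as in the CLI example above:

import tensorflow as tf

# Load the SavedModel with the 'serve' tag and print its signature keys.
# The client's signature_name must be one of these keys.
with tf.Session(graph=tf.Graph()) as sess:
    meta_graph_def = tf.saved_model.loader.load(sess, ['serve'], '/tmp/saved_model_dir')
    print(list(meta_graph_def.signature_def.keys()))

You can also pass --signature_def <name> to saved_model_cli show to list the inputs and outputs of a single signature.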

Notes:
  1. The path here must be changed to the server-side model directory; the model the server runs is a .pb file, generated by the Estimator class after training finishes (see the sketch after this list);
  2. The client's target IP must be correct, and the port must match the communication API you use: gRPC goes through port 8500 and the REST API through port 8501; see the TensorFlow Serving Docker documentation: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/docker.md
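For point 1, here is a minimal sketch (TF 1.x Estimator API, matching the Python 2.7-era setup above) of how an exported SavedModel ends up with a signature keyed "serving_default". The model_fn, the 'image' feature name, and the paths are illustrative assumptions, not the repo's exact code:

import tensorflow as tf

def model_fn(features, labels, mode):
    logits = tf.layers.dense(features['image'], 10)
    predictions = {'scores': tf.nn.softmax(logits)}
    if mode == tf.estimator.ModeKeys.PREDICT:
        # With this key, the exported signature is saved as "serving_default"
        # (DEFAULT_SERVING_SIGNATURE_DEF_KEY), which is what the client requests.
        export_outputs = {
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                tf.estimator.export.PredictOutput(predictions)}
        return tf.estimator.EstimatorSpec(mode, predictions=predictions,
                                          export_outputs=export_outputs)
    loss = tf.losses.sparse_softmax_cross_entropy(labels, logits)
    train_op = tf.train.AdamOptimizer().minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

def serving_input_receiver_fn():
    # Input tensor the serving signature exposes; name and shape are assumptions.
    image = tf.placeholder(tf.float32, shape=[None, 784], name='image')
    return tf.estimator.export.ServingInputReceiver({'image': image}, {'image': image})

estimator = tf.estimator.Estimator(model_fn=model_fn, model_dir='/tmp/mnist_model')
# After estimator.train(...), export the .pb SavedModel that the serving container loads:
# estimator.export_savedmodel('/tmp/saved_model_dir', serving_input_receiver_fn)

The signature key comes from the export_outputs dict: if the key used there is anything other than DEFAULT_SERVING_SIGNATURE_DEF_KEY ("serving_default"), the client has to request that other name instead.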
    "grpc_message":"Serving signature key "serving_default" not found.","grpc_status":9
    文章图片

    推荐阅读