web service address: http://172.28.6.55:9292/uci/prediction
 * Serving Flask app "serve" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://...
This command turns off debug mode directly via the --no-debug option of the flask run command, and allows the app to be reached from any IP address. To verify that the Flask app is running and that debug mode is off, check the output printed after startup; it should look similar to:
 * Serving Flask app "app" (lazy loading)
 * Environment: production
 * Debug mode: off
 * Running...
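As a minimal sketch of the invocation described above (the entry module name app.py and port 9292 are assumptions; adjust FLASK_APP and --port to match your project):

```shell
# Point the Flask CLI at the application module (hypothetical name).
export FLASK_APP=app.py

# Run with debug mode off, listening on all interfaces so the
# service is reachable from other hosts on the network.
flask run --no-debug --host=0.0.0.0 --port=9292
```

Binding to 0.0.0.0 is what makes the development server accept connections from any IP address; by default it binds only to 127.0.0.1. Note the WARNING in the startup log still applies: this server is for development, and a production WSGI server (e.g. gunicorn or uWSGI) should front the app in deployment.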
I've resorted to running my model on Flask; though I would have loved to use TensorFlow Serving, I just can't get it to work.

soar-zhengjian commented on Dec 27, 2017
I get the same error when I try to run mnist_saved_model with bazel. But it's OK w...