
Model Serving made Efficient in the Cloud

Project description

Framework environment variables

Environment variable           Description                          Default
TI_MODEL_DIR                   Model path                           /data/model
TI_PREPROCESS_NUMS             Number of preprocessing processes    0
TI_INFERENCE_NUMS              Number of inference processes        1
TI_POSTPROCESS_NUMS            Number of postprocessing processes   0
TI_INFERENCE_MAX_BATCH_SIZE    Maximum inference batch size         1
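
For illustration only, the snippet below shows one way to override these defaults from a Python launcher before the serving framework starts. The variable names come from the table above; the specific values are hypothetical examples, not recommendations.

    import os

    # Illustrative overrides of the variables listed above.
    os.environ.setdefault("TI_MODEL_DIR", "/data/model")        # where the model is read from
    os.environ.setdefault("TI_PREPROCESS_NUMS", "2")            # 2 preprocessing worker processes
    os.environ.setdefault("TI_INFERENCE_NUMS", "1")             # 1 inference process
    os.environ.setdefault("TI_POSTPROCESS_NUMS", "2")           # 2 postprocessing worker processes
    os.environ.setdefault("TI_INFERENCE_MAX_BATCH_SIZE", "8")   # batch up to 8 requests per predict call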

Notes

  • When TI_PREPROCESS_NUMS == 0 and TI_POSTPROCESS_NUMS == 0, the preprocess, postprocess, and predict functions in model_service.py all run in a single process.

  • Otherwise, the preprocess, postprocess, and predict functions in model_service.py each run in a separate process, and the load function runs in the same process as predict (see the sketch below).
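
As a minimal sketch, a model_service.py providing these four hooks might look like the following. The exact signatures expected by ti_cloud_infer_framework are not documented here, so the argument names, return values, and module-level layout are assumptions.

    # model_service.py -- a minimal sketch of the hooks described above (assumed interface).
    import os

    _model = None  # set by load(); lives in the same process as predict()

    def load():
        """Load the model from TI_MODEL_DIR; runs in the same process as predict."""
        global _model
        model_dir = os.environ.get("TI_MODEL_DIR", "/data/model")
        # Replace with real deserialization, e.g. loading weights found under model_dir.
        _model = None

    def preprocess(data):
        """Convert a raw request into model input.
        May run in separate worker processes when TI_PREPROCESS_NUMS > 0."""
        return data

    def predict(data):
        """Run inference; batching is bounded by TI_INFERENCE_MAX_BATCH_SIZE."""
        return _model(data) if callable(_model) else data

    def postprocess(data):
        """Convert model output into a response.
        May run in separate worker processes when TI_POSTPROCESS_NUMS > 0."""
        return data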

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

ti_cloud_infer_framework-0.1.6-py3-none-any.whl (8.2 kB), uploaded for Python 3
