This repo contains code for benchmarking deep learning serving solutions, including RedisAI. The code is based on a fork of work initially made public by TSBS at https://github.com/timescale/tsbs. The following serving solutions are currently covered:
- RedisAI: an AI serving engine for real-time applications built by Redis Labs and Tensorwerk, seamlessly plugged into Redis.
- Nvidia Triton Inference Server: an open-source inference serving software that lets teams deploy trained AI models from any framework (TensorFlow, TensorRT, PyTorch, ONNX Runtime, or a custom framework), from local storage, Google Cloud Platform, or AWS S3, on any GPU- or CPU-based infrastructure.
- TorchServe: built and maintained by Amazon Web Services (AWS) in collaboration with Facebook, TorchServe is available as part of the PyTorch open-source project.
- Tensorflow Serving: a high-performance serving system, wrapping TensorFlow and maintained by Google.
- Common REST API serving: a common production-grade DL setup with Gunicorn (a Python WSGI HTTP server) communicating with Flask via the WSGI protocol and using TensorFlow as the backend (a minimal sketch follows this list).
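For context on the last item, the sketch below shows roughly what such a Gunicorn + Flask setup looks like: a Flask app exposing a prediction endpoint that runs a TensorFlow/Keras model, fronted by Gunicorn. The module name (`app.py`), endpoint path, and model file are illustrative assumptions, not aibench's actual serving code.

```python
# app.py - minimal sketch of a Flask + TensorFlow REST serving endpoint
# (illustrative only; endpoint name and model path are assumptions).
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
model = tf.keras.models.load_model("model.h5")  # hypothetical model file

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body such as {"inputs": [[...feature values...]]}
    inputs = np.array(request.get_json()["inputs"], dtype=np.float32)
    outputs = model.predict(inputs)
    return jsonify({"outputs": outputs.tolist()})

# In production this app is typically run behind Gunicorn, e.g.:
#   gunicorn --workers 4 --bind 0.0.0.0:8000 app:app
```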
Currently, aibench supports two use cases:
- creditcard-fraud [details here]: based on a Kaggle dataset, extended with reference data. This use case aims to detect fraudulent transactions from anonymized credit card transactions combined with reference data (see the RedisAI sketch after this list).
- vision-image-classification [details here]: an image-focused use case built around a single network "backbone": MobileNet V1, which is widely regarded as a standard by the AI community. To assess inference performance we rely on the COCO 2017 validation dataset (a large-scale object detection, segmentation, and captioning dataset).
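For the fraud use case, each benchmark request ultimately amounts to setting the input tensor(s), running the model, and reading back the output tensor on the serving engine. The sketch below shows what one such round trip might look like against RedisAI using plain redis-py and the AI.* commands; the key names, tensor shape, and model key ("financialNet") are illustrative assumptions, not the exact keys or shapes used by aibench.

```python
# Minimal sketch of one fraud-detection inference round trip against RedisAI.
# Assumptions (not aibench's actual keys/shapes): a model is already loaded
# under the key "financialNet" and takes a single 1x30 float input tensor.
import numpy as np
import redis

r = redis.Redis(host="localhost", port=6379)

# Stand-in for one anonymized transaction record.
transaction = np.random.rand(1, 30).astype(np.float32)

# Store the input tensor, run the model, and fetch the output values.
r.execute_command("AI.TENSORSET", "transaction", "FLOAT", 1, 30,
                  "BLOB", transaction.tobytes())
r.execute_command("AI.MODELRUN", "financialNet",
                  "INPUTS", "transaction", "OUTPUTS", "out")
reply = r.execute_command("AI.TENSORGET", "out", "VALUES")
print(reply)
```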
| Use case / Inference Server | Model | RedisAI | TensorFlow Serving | TorchServe | Nvidia Triton | REST API |
|---|---|---|---|---|---|---|
| Vision Benchmark (CPU/GPU) (details) | mobilenet-v1 (224_224) | ✔️ | Not supported | Not supported | ✔️ | Not supported |
| Fraud Benchmark (CPU) (details) | Non-standard Kaggle model with the extension of reference data | ✔️ docs | ✔️ docs | ✔️ docs | Not supported | ✔️ docs |
The easiest way to get and install the Go benchmark programs is to use `go get` and then issue `make`:
```bash
# Fetch aibench and its dependencies
go get github.com/RedisAI/aibench
cd $GOPATH/src/github.com/RedisAI/aibench
make
```