0.5.0 (14/12/2022) Inference on Android with ONNX Runtime
Features:
- Added Android inference support
  - Built Android artifacts for "impl", "onnx" and "visualization" modules #422
  - Added Android-specific models to the model zoo
  - Implemented preprocessing operations working on Android `Bitmap` #416 #478: `Resize`, `Rotate`, `Crop`, `ConvertToFloatArray`
  - Added utility functions to convert `ImageProxy` to `Bitmap` #458
  - Added `NNAPI` execution provider #420
  - Added an API to create `OnnxInferenceModel` from a `ByteArray` representation #415
  - Introduced a Gradle task to download model hub models before the build #444
  - Added utility functions to draw detection results on an Android `Canvas` #450
- Implemented new preprocessing API #425
  - Introduced an `Operation` interface to represent a preprocessing operation for any input and output
  - Added a `PreprocessingPipeline` class to combine operations together in a type-safe manner
  - Re-implemented old operations with the new API
  - Added convenience functions such as `pipeline` to start a new preprocessing pipeline, `call` to invoke operations defined elsewhere, and `onResult` to access intermediate preprocessing results
  - Converted the `ModelType#preprocessInput` function to an `Operation` #429
  - Converted common preprocessing functions for models trained on ImageNet to `Operation` #429
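The core idea of the new API is that typed operations compose into pipelines checked at compile time. A minimal illustrative sketch of that idea (only the `Operation` and `PreprocessingPipeline` names come from the changelog; the signatures and the `then` helper are invented here and differ from the real KotlinDL API):

```kotlin
// Illustrative sketch of the Operation/PreprocessingPipeline idea;
// not the real KotlinDL signatures.
interface Operation<I, O> {
    fun apply(input: I): O
}

// Chains two operations; the output type of the first must match the
// input type of the second, so pipelines are type-safe at compile time.
class PreprocessingPipeline<I, M, O>(
    private val first: Operation<I, M>,
    private val second: Operation<M, O>
) : Operation<I, O> {
    override fun apply(input: I): O = second.apply(first.apply(input))
}

// Convenience extension mirroring the pipeline-style chaining.
fun <I, M, O> Operation<I, M>.then(next: Operation<M, O>): Operation<I, O> =
    PreprocessingPipeline(this, next)

fun main() {
    // Toy operations: normalize pixel values into floats, then scale them.
    val toFloats = object : Operation<IntArray, FloatArray> {
        override fun apply(input: IntArray) =
            FloatArray(input.size) { input[it] / 255f }
    }
    val scale = object : Operation<FloatArray, FloatArray> {
        override fun apply(input: FloatArray) =
            FloatArray(input.size) { input[it] * 2f }
    }
    val pipeline = toFloats.then(scale)
    println(pipeline.apply(intArrayOf(255, 0)).toList())  // [2.0, 0.0]
}
```

Because the intermediate type is part of the pipeline's signature, combining operations with mismatched input/output types fails at compile time rather than at runtime.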
- Added new ONNX features
  - Added execution providers support (`CPU`, `CUDA`, `NNAPI`) and convenient extensions for inference with them #386
  - Introduced the `OnnxInferenceModel#predictRaw` function, which allows custom `OrtSession.Result` processing, and extension functions to extract common data types from the result #465
  - Added validation of input shape #385
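Handing the caller the raw session result lets extension functions do the type extraction once, instead of every model wrapper re-implementing it. A hypothetical plain-Kotlin sketch of that pattern (the `RawResult` type and `getFloatArray` helper are invented for illustration; the real type is ONNX Runtime's `OrtSession.Result`):

```kotlin
// Hypothetical stand-in for a raw inference result: named output tensors.
// Invented for illustration only.
class RawResult(private val outputs: Map<String, Any>) {
    fun output(name: String): Any? = outputs[name]
}

// Extension functions like this one pull common data types out of the
// raw result, so callers get typed data without per-model boilerplate.
fun RawResult.getFloatArray(name: String): FloatArray =
    output(name) as? FloatArray ?: error("Output '$name' is not a FloatArray")

fun main() {
    val result = RawResult(mapOf("logits" to floatArrayOf(0.1f, 0.9f)))
    println(result.getFloatArray("logits").toList())  // [0.1, 0.9]
}
```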
- Added an `Imagenet` enum to represent different ImageNet dataset labels and added support for zero-indexed COCO labels #438 #446
- Implemented unified summary printing for TensorFlow and ONNX models #368
- Added a `FlatShape` interface to allow manipulating the detected shapes in a unified way #480
- Introduced a `DataLoader` interface for loading and preprocessing data for dataset implementations #424
- Improved Swing visualization utilities #379 #388
- Simplified the `Layer` interface so that only the `build` function needs to be implemented, and removed explicit output shape computation #408
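A shape that exposes a coordinate-wise transform can be rescaled or flipped without the caller knowing whether it is a box, a landmark, or a pose. A hypothetical sketch of that idea (only the `FlatShape` name comes from the changelog; the `map` signature and `DetectedBox` type are invented for illustration):

```kotlin
// Illustrative sketch: a shape that can map all of its coordinates
// through one function, enabling uniform manipulation of detections.
interface FlatShape<T : FlatShape<T>> {
    fun map(f: (Float, Float) -> Pair<Float, Float>): T
}

data class DetectedBox(
    val xMin: Float, val yMin: Float,
    val xMax: Float, val yMax: Float
) : FlatShape<DetectedBox> {
    override fun map(f: (Float, Float) -> Pair<Float, Float>): DetectedBox {
        val (x1, y1) = f(xMin, yMin)
        val (x2, y2) = f(xMax, yMax)
        return DetectedBox(x1, y1, x2, y2)
    }
}

fun main() {
    // Scale a box from normalized [0, 1] coordinates to a 640x480 image.
    val box = DetectedBox(0.25f, 0.5f, 0.75f, 1.0f)
    val scaled = box.map { x, y -> x * 640f to y * 480f }
    println(scaled)  // DetectedBox(xMin=160.0, yMin=240.0, xMax=480.0, yMax=480.0)
}
```

The same `map` call works for any shape implementing the interface, which is what makes drawing utilities and coordinate conversions shape-agnostic.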
Breaking changes:
- Refactored module structure and packages #412 #469
  - Extracted a "tensorflow" module for learning and inference with the TensorFlow backend
  - Extracted an "impl" module for implementation classes and utilities
  - Moved preprocessing operation implementations to the "impl" module
  - Removed the dependency of the "api" module on the "dataset" module
  - Changed packages for the "api", "impl", "dataset" and "onnx" modules to match the corresponding module names
- Removed preprocessing classes such as `Preprocessing`, `ImagePreprocessing`, `ImagePreprocessor`, `ImageSaver`, `ImageShape`, `TensorPreprocessing` and `Preprocessor` in favor of the new preprocessing API #425
- Removed the `Sharpen` preprocessor since the `ModelType#preprocessor` field was introduced, which can be used in the preprocessing pipeline via the `call` function #429
Bugfixes:
- Fixed loading of JPEG files not supported by the standard Java ImageIO #384
- Updated ONNX Runtime version to enable inference on M1 chips #361
- Fixed channel ordering for image recognition models #400
- Avoided warnings from the `loadWeightsForFrozenLayers` function for layers without parameters #382
New documentation and examples:
- Inference with KotlinDL and ONNX Runtime on desktop and Android
- KotlinDL ONNX Model Zoo
- Sample Android App
Thanks to our contributors:
- Nikita Ermolenko (@ermolenkodev)
- Julia Beliaeva (@juliabeliaeva)
- Burak Akgün (@mbakgun)
- Pavel Gorgulov (@devcrocod)