Build a deep learning inference framework from scratch
Clone the repository:

```shell
# https
git clone --recursive https://github.com/masteryi-0018/MiniNN.git

# ssh
git clone --recursive git@github.com:masteryi-0018/MiniNN.git
```
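If the repository was cloned without `--recursive`, the submodules can still be fetched afterwards:

```shell
# Fetch submodules for an existing checkout
git submodule update --init --recursive
```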
Build on Windows:

- cmake

```powershell
.\build.ps1
```

- bazel

```powershell
.\bazel.ps1
```

For more details, see: https://github.com/masteryi-0018/MiniNN/blob/main/docs/README.md
Build on Linux:

- cmake

```shell
./build.sh
```

- bazel

```shell
./bazel.sh
```

For more details, see: https://github.com/masteryi-0018/MiniNN/blob/main/docs/README.md
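If you would rather drive the build tools directly than go through the helper scripts, something along the following lines should work on either platform. This is only a sketch: the exact generators, build types, and Bazel targets that the scripts use may differ.

```shell
# Configure into ./build and compile (build.sh / build.ps1 may pass extra options)
cmake -S . -B build
cmake --build build -j

# Build every target in the workspace (the scripts may build specific packages instead)
bazel build //...
```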
Run the tests on Windows:

- cmake

```powershell
.\build\mininn\test-main.exe
.\build\mininn\test\gtest-main.exe
```

- bazel

```powershell
.\bazel-bin\mininn\test-main.exe
.\bazel-bin\mininn\gtest-main.exe
```
Run the tests on Linux:

- cmake

```shell
./build/mininn/test-main
./build/mininn/test/gtest-main
```

- bazel

```shell
./bazel-bin/mininn/test-main
./bazel-bin/mininn/gtest-main
```
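`gtest-main` looks like a GoogleTest runner, so the standard GoogleTest command-line flags should apply; for example (the filter pattern below is just a placeholder):

```shell
# List the registered test cases without running them
./bazel-bin/mininn/gtest-main --gtest_list_tests

# Run only the cases whose names match a pattern
./bazel-bin/mininn/gtest-main --gtest_filter='Tensor*'
```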
TODO:

- mininn convertor
  - Support converting ONNX models to the gynn format
  - Support converting PyTorch models to the gynn format
  - Support converting TensorFlow models to the gynn format
- mininn IR
  - Support building graphs with multiple operators
- mininn kernel
  - Add an OpenCL backend
  - Add a CUDA backend
- Tooling
  - Add an example program that links against mininn.so
- Build system
  - Add the FlatBuffers headers and integrate them into the Bazel scripts
  - Build with Bazel on Windows
  - Fix the MSVC compilation issues on Windows
  - Use the Clang compiler
- Open questions
  - Diagram of heap memory allocation (see the sketch after this list)
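Regarding the heap-allocation question above, here is a minimal C++ sketch of the layout such a diagram would describe; the `Tensor` class is hypothetical, not MiniNN's actual API. The object and its shape metadata can live on the stack, while the element buffer is allocated on the heap.

```cpp
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Hypothetical illustration only (not the actual MiniNN API): the Tensor object
// itself can live on the stack, while its element buffer is allocated on the heap.
class Tensor {
 public:
  explicit Tensor(std::vector<std::size_t> shape) : shape_(std::move(shape)) {
    std::size_t count = 1;
    for (std::size_t d : shape_) count *= d;
    data_ = std::make_unique<float[]>(count);  // heap allocation of the payload
  }

  float* data() { return data_.get(); }

 private:
  std::vector<std::size_t> shape_;  // small metadata held inside the object
  std::unique_ptr<float[]> data_;   // owning pointer to the heap buffer
};

int main() {
  Tensor t({1, 3, 224, 224});  // object on the stack, 1*3*224*224 floats on the heap
  t.data()[0] = 1.0f;          // buffer is freed automatically when t goes out of scope
  return 0;
}
```

Whatever the real tensor type ends up looking like, the diagram would show the owning pointer inside the object pointing at the separately allocated buffer.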