Add KaNN format support #1333
base: main
Conversation
but adapting an existing format does not address the specific requirements for handling KaNN's images and layers, among other things.
Can you elaborate on what these requirements are? For example, would it be possible to expose these as operator extensions or metadata in ONNX instead of introducing yet another graph format? What process is used to add this information to KaNN files?
source/kann-metadata.json (Outdated)
    "category": "Activation"
  },
  {
    "name": "ImageSMEM",
Several of these operators are not included in other implementations. Remove the category so they show without one.
Could you clarify which operators are concerned? As explained, KaNN handles tensors and layers differently than other implementations, so it would be valuable to keep model-specific operators to help users visually understand its structure.
Any operators which are not standard across formats. For example, "ImageSMEM" does not seem to be documented anywhere online; its function, and how it relates to similar operators in other frameworks, is not clear. Start by leaving the category blank and create a follow-up issue in a KaNN repository to do such an investigation across formats and update categories for all similar operators in all formats.
This would be ImageSMEM and ImageDDR then. However, in relation to the global answer, those two operators are not actual layers; they are necessary for Netron to visualize KaNN's unique tensor memory management at a lower level, and will not be encountered in any other framework. They will be removed if it's non-negotiable, but doing so will be a significant loss of information for understanding our models.
These ones as well: Erf, Input, Output, MaxWithConstant, Reciprocal, MulElementwise, Copy, Division, MatMul, MaxElementwise, LRN, Abs, MulWithConstant, Invert, Sqrt, MinElementwise, Logistic, AddElementwise, Saturate, ScalingVector, Convert, MinWithConstant, WriteValue, ImageSMEM, ImageDDR.
The goal is to have consistent categories across formats. For now, let's remove them and approach this separately. For example, instead of some frameworks categorizing "LRN" as "Normalization" and others as "Activation", this should be applied consistently. Similarly, "MatMul" and other basic math functions currently have no category in other frameworks. For new operators there should be some public documentation of functionality and a discussion of what the category is based on. There might be a need for new categories.
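A consistency check like the one discussed above could be scripted. The sketch below is a hypothetical helper, not part of Netron's tooling: it flags metadata entries whose category is not in an assumed set of categories already used across formats (the `KNOWN` set and the sample entries are illustrative assumptions).

```python
import json

# Assumed subset of categories used across Netron metadata files;
# not an authoritative list.
KNOWN = {"Activation", "Normalization", "Pool", "Layer", "Shape",
         "Tensor", "Data", "Transform"}

# Illustrative sample in the shape of a *-metadata.json file.
metadata = json.loads("""
[
  {"name": "LRN",       "category": "Normalization"},
  {"name": "MatMul"},
  {"name": "ImageSMEM", "category": "Memory"}
]
""")

def unknown_categories(entries):
    """Return names of operators whose category is not a known one."""
    return [e["name"] for e in entries
            if "category" in e and e["category"] not in KNOWN]

print(unknown_categories(metadata))  # ['ImageSMEM']
```

Operators with no `category` key (like `MatMul` here) pass silently, matching the "leave it blank for now" approach.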
OK, this makes sense. Those operators have been removed from the metadata, and we'll investigate how to categorize them properly, as ImageSMEM and ImageDDR really need to be displayed distinctly.
LRN is category Normalization.
tools/kann.fbs (Outdated)
@@ -0,0 +1,60 @@
namespace kann;
This file should not be checked in to this repo. Instead, download it from an official repo which always contains the latest version and documentation for this format. Follow the tflite implementation as an example; e.g. Kalray should formally take ownership of the .fbs content and publish and maintain it under an open source license.
done
Remove it from ./tools and move the source file to an officially maintained repository for this format. For example, "netron-test-models" is not the official main location to learn about this format; it signals Netron as the owner of this format, which is not the case.
In progress, but this will result in the .fbs not being accessible for now, as the repo is currently private.
Great to hear. We can block merging this pull request until this work is done.
test/models.json (Outdated)
  {
    "type": "kann",
    "target": "SqueezeNet-v1.0-ILSVRC2012",
    "source": "https://github.com/kalray/netron-test-models/raw/main/SqueezeNetV1_0-ILSVRC2012-fp16-5c.kann",
Move to an official model zoo repo that is being maintained.
Our model zoo repo isn't public yet, but this repository is completely maintained and will be merged into the official zoo repo once it becomes public.
Please front-load this work to avoid churn and to make it clear that these are officially supported files not directly related to Netron.
In progress, but this will result in the models not being accessible for now.
(force-pushed from 4931574 to 3b43032)
First, the current ONNX implementation doesn't meet the modularity needed to fully address KaNN's data type requirements. Additionally, not all KaNN layers have corresponding ONNX equivalents, for instance custom layers. It's important to clarify that KaNN is an inference engine similar to TensorRT, so the graphs we handle are more low-level than ONNX models. We have our own internal model format, and we'd like to use Netron for visualizing it, but the difference in structure and purpose makes adapting existing formats insufficient to meet KaNN's specific needs.
Can you provide an example of such a type or modularity need? ONNX can act as a container with an entirely custom set of operators provided via a "domain".
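To make the reviewer's point concrete, here is a minimal sketch of the custom-domain idea, using plain dataclasses rather than the onnx library so it stands alone. The domain name "com.kalray.kann" and the ImageSMEM attributes are illustrative assumptions, not real identifiers.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    op_type: str
    domain: str = ""          # "" denotes the default ONNX operator set
    attributes: dict = field(default_factory=dict)

@dataclass
class Model:
    opset_imports: dict       # maps each domain to an opset version
    nodes: list

# A model mixing a standard ONNX op with a hypothetical KaNN-domain op.
model = Model(
    opset_imports={"": 17, "com.kalray.kann": 1},
    nodes=[
        Node("Conv"),                                # standard ONNX operator
        Node("ImageSMEM", domain="com.kalray.kann",  # custom-domain operator
             attributes={"bank": 0, "bytes": 4096}),
    ],
)

# A loader only needs to check that every node's domain is declared.
undeclared = [n.op_type for n in model.nodes
              if n.domain not in model.opset_imports]
print(undeclared)  # []
```

In real ONNX, the same shape is expressed with `NodeProto.domain` and the model's `opset_import` list; tools that don't know the custom domain can still display the graph structure.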
Can you elaborate on this? What is an "Arc", and is there some context for why this is needed?
Indeed, we could extend ONNX to support KaNN's data types and layer types, but the core difference lies elsewhere. KaNN's purpose is not to introduce yet another framework for neural network design, but rather to translate models from popular frameworks into a lower-level, processor-executable format.
Unlike ONNX, where layers and tensors are distinct (layers as nodes and tensors as arcs), KaNN manipulates tensors as individual graph nodes, much like layers. The input/output relationships between these nodes are then not whole tensors but subviews (smaller parts of the tensors). These subviews correspond to the arcs in our graph, an Arc being a connection between nodes, as defined in graph theory. Adopting the ONNX format would not work effectively, and information would not be displayed properly (a subview's information being different from a standard tensor's).
Although the ONNX implementation does support many data types and could potentially meet our needs at some point, our KaNN implementation allows us to use any data type (whether a single value or a list) at ANY point in the code without needing to specify individual compatibility for different parts (e.g. nodes, quantization...). This corresponds to our goal of improving our models in the near future to accept a wider variety of data types without having to modify Netron with each update. Hope this helps you understand our need for a specific implementation for KaNN.
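The tensors-as-nodes / subviews-as-arcs structure described above can be sketched in a few lines. This is a hypothetical model of the idea, not the real KaNN schema; all type and field names are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Subview:
    tensor: str           # name of the tensor node this view slices
    offset: tuple         # start index per dimension
    shape: tuple          # extent per dimension

@dataclass
class Arc:
    src: str              # producing node (a tensor or a layer)
    dst: str              # consuming node
    view: Subview         # the arc carries a subview, not a whole tensor

tensor_shapes = {"act0": (1, 64, 56, 56)}

# Two layers each consume half of act0's channels via distinct subviews.
arcs = [
    Arc("act0", "conv_a", Subview("act0", (0, 0, 0, 0), (1, 32, 56, 56))),
    Arc("act0", "conv_b", Subview("act0", (0, 32, 0, 0), (1, 32, 56, 56))),
]

def in_bounds(view, shapes):
    """Check that offset + extent stays inside the full tensor shape."""
    full = shapes[view.tensor]
    return all(o + s <= f for o, s, f in zip(view.offset, view.shape, full))

print(all(in_bounds(a.view, tensor_shapes) for a in arcs))  # True
```

In an ONNX-style graph both arcs would just reference the whole tensor `act0`, losing the offset/extent information the comment says matters for KaNN's memory management.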
(force-pushed from 241285e to f3e981f, plus resolve lint error)
Hi 👋,
We've added support for .kann files to visualize KaNN graphs, a format developed by Kalray Inc. for accelerating ML computations with our MPPA processors. Compatibility with Netron is essential for both our employees and customers, but adapting an existing format does not address the specific requirements for handling KaNN's images and layers, among other things.
kann.js, kann-metadata.json, and kann-schema.js have been added in source/, as well as kann.fbs and the script to process it in tools/. Examples are also available in test/models.json.
Could you please review our pull request? Let us know if you have any comments or need further information.
Thanks!