Add API docs for ONNX Runtime and Intel #100
Conversation
```diff
@@ -48,7 +48,7 @@ jobs:
       cd ..
       cd optimum
-      pip install .[dev]
+      pip install .[dev,onnxruntime,intel]
```
I have to include all the dependencies here to build the docs
```diff
@@ -11,3 +11,9 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
+from .neural_compressor.config import IncConfig, IncPruningConfig, IncQuantizationConfig
```
Here I have to include the `neural_compressor` imports in the root init of the `intel` subpackage to build the docs.
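As an aside, the effect of that re-export can be sketched with a minimal, hypothetical package. The `intel_demo` package and the stand-in `IncConfig` below are illustrative only, not the real `optimum` code; the point is that importing a name into the root `__init__.py` makes it resolvable from the package root, which is what doc tooling typically needs.

```python
import importlib
import pathlib
import sys
import tempfile

# Build a throwaway package on disk that mirrors the layout in the diff:
# intel_demo/neural_compressor/config.py defines IncConfig, and the root
# __init__.py re-exports it (names here are illustrative placeholders).
root = pathlib.Path(tempfile.mkdtemp())
nested = root / "intel_demo" / "neural_compressor"
nested.mkdir(parents=True)
(nested / "__init__.py").write_text("")
(nested / "config.py").write_text("class IncConfig:\n    pass\n")

# The root __init__ re-exports the nested name, mirroring the diff above:
(root / "intel_demo" / "__init__.py").write_text(
    "from .neural_compressor.config import IncConfig\n"
)

sys.path.insert(0, str(root))
intel_demo = importlib.import_module("intel_demo")
print("IncConfig" in dir(intel_demo))  # True: the name is visible at the root
```

With the re-export in place, `dir(intel_demo)` contains `IncConfig`, so a doc builder resolving `intel_demo.IncConfig` succeeds.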
```diff
+## IncOptimizer
+
+[[autodoc]] intel.neural_compressor.optimizer.IncOptimizer
```
For consistency with the `onnxruntime` package, I suggest we rename this module to `intel.neural_compressor.optimization`. I realise this is a breaking change, but I think it will help `optimum` users more easily navigate the API. We can discuss this in a separate issue / thread since it doesn't relate to this PR very strongly.
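One common way to soften a rename like this, sketched here as an assumption rather than anything the PR commits to, is to keep the old module as a thin shim that re-exports the new names and warns on use. `load_old_module` below stands in for importing the old `intel.neural_compressor.optimizer` path.

```python
import warnings

def load_old_module():
    """Stand-in for importing the deprecated old module path.

    In a real shim, optimizer.py would contain the warning below plus
    `from .optimization import IncOptimizer` to re-export the new name.
    """
    warnings.warn(
        "intel.neural_compressor.optimizer is deprecated; "
        "use intel.neural_compressor.optimization instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return "IncOptimizer"  # the re-exported class would come from .optimization

# Demonstrate that old-path users still get the name, plus a warning:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    name = load_old_module()

print(name)                                      # IncOptimizer
print(caught[0].category is DeprecationWarning)  # True
```

This lets existing imports keep working for a release while steering users to the new path.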
Yes, that's a good point. I was also thinking of renaming `intel.neural_compressor.config` to `intel.neural_compressor.configuration` for the same reason. I agree that this is an important change that should be done as soon as possible (before our next release, which should be soon).
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
OK great, let's tackle that in a separate PR :)
docs/source/intel/configuration.mdx (Outdated)
```diff
+# Trainer
+
+## IncTrainer
+
+[[autodoc]] intel.neural_compressor.trainer_inc.IncTrainer
```
There seems to be a mismatch with `docs/source/intel/trainer.mdx`.
Yes, I wasn't sure whether to use the same hierarchy as we have in the `onnxruntime` docs. Happy to nest it though so it lines up with the source code!
What I meant is that `IncTrainer` is displayed in `configuration.mdx` while `IncOptimizer` is displayed in `trainer.mdx`.
Good catch, fixed now!
Great addition, thanks a lot @lewtun !!
```diff
@@ -48,7 +48,7 @@ jobs:
       cd ..
       cd optimum
-      pip install .[dev,onnxruntime,intel]
+      pip install -e .[dev,onnxruntime,intel]
```
Sorry, I am not very knowledgeable on the subject, but why do we need to install the packages in development mode?
Sorry, that was me trying to debug the `doc-builder` in the GitHub actions - it's removed now!
```diff
@@ -88,6 +88,7 @@ jobs:
         NODE_OPTIONS: --max-old-space-size=6656
       run: |
         cd doc-build-dev && git pull
+        python -c "from optimum import intel; print(dir(intel))"
```
This is a temporary debug for the GH actions - will remove before merging
What does this PR do?

This PR adds the API docs for the `intel` and `onnxruntime` packages. It depends on the fixes to `doc-builder` in this PR, so the CI should pass once that's merged. The docs are structured as follows:

While putting together the docs, I ran into a few `optimum` quirks that I would like feedback on:

- Unlike `transformers`, `optimum` doesn't have a dev install that includes the `intel` and `onnxruntime` packages. Is this by design? Would it make sense to have `pip install '.[dev]'` install all subpackages?
- The module naming isn't consistent between `intel` and `onnxruntime`. For example, we have `intel.neural_compressor.optimizer` and `onnxruntime.optimization`. I've standardised this in the docs to have an "optimization" section per subpackage, and am happy to rename the `intel` module name if you want.
- To build the docs, I had to include the `intel.neural_compressor` imports in the root init of the `intel` subpackage. Is this OK?
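On the first question, one way a combined `dev` extra could be defined is to merge the per-subpackage extras in `setup.py`. This is a hedged sketch, not the real `optimum` setup: the dependency names and extra keys below are illustrative placeholders.

```python
# Per-subpackage extras, as they might appear in setup.py's extras_require
# (dependency names here are placeholders, not the real pins):
EXTRAS = {
    "onnxruntime": ["onnxruntime"],
    "intel": ["neural-compressor"],
    "quality": ["black", "isort"],
}

# "dev" is the deduplicated union of every subpackage extra plus tooling deps,
# so `pip install '.[dev]'` would pull in all subpackages:
EXTRAS["dev"] = sorted(
    {dep for key in ("onnxruntime", "intel", "quality") for dep in EXTRAS[key]}
)
print(EXTRAS["dev"])  # ['black', 'isort', 'neural-compressor', 'onnxruntime']
```

The merged dict would then be passed as `extras_require=EXTRAS` to `setup()`, which is the standard setuptools mechanism for optional dependency groups.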