The purpose of this repository is to bring back to life the snips-nlu binary that was never fully open source.
The snips-nlu training part is provided by this repository: https://github.com/snipsco/snips-nlu.
You can build from source or download and install pre-built binaries I built. These binaries target armhf architectures such as the Raspberry Pi and work with Python 3.7. Skip steps 1 and 2 if you want to build snips-nlu yourself.
- Download all the wheels by running the following commands (MD5 and SHA256 checksums are listed in the prebuilt wheels README.md):
sudo apt install libatlas3-base libgfortran5
cd /home/pi
wget --content-disposition https://github.com/jr-k/snips-nlu-rebirth/blob/master/wheels/scipy-1.3.3-cp37-cp37m-linux_armv7l.whl?raw=true
wget --content-disposition https://github.com/jr-k/snips-nlu-rebirth/blob/master/wheels/scikit_learn-0.22.1-cp37-cp37m-linux_armv7l.whl?raw=true
wget --content-disposition https://github.com/jr-k/snips-nlu-rebirth/blob/master/wheels/snips_nlu_utils-0.9.1-cp37-cp37m-linux_armv7l.whl?raw=true
wget --content-disposition https://github.com/jr-k/snips-nlu-rebirth/blob/master/wheels/snips_nlu_parsers-0.4.3-cp37-cp37m-linux_armv7l.whl?raw=true
wget --content-disposition https://github.com/jr-k/snips-nlu-rebirth/blob/master/wheels/snips_nlu-0.20.2-py3-none-any.whl?raw=true
- Install them all in this order:
sudo pip3 install scipy-1.3.3-cp37-cp37m-linux_armv7l.whl
sudo pip3 install scikit_learn-0.22.1-cp37-cp37m-linux_armv7l.whl
sudo pip3 install snips_nlu_utils-0.9.1-cp37-cp37m-linux_armv7l.whl
sudo pip3 install snips_nlu_parsers-0.4.3-cp37-cp37m-linux_armv7l.whl
sudo pip3 install snips_nlu-0.20.2-py3-none-any.whl
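- (Optional) Sanity-check the installation by importing the packages from Python; this assumes snips_nlu exposes a __version__ attribute, as it normally does:
python3 -c "import scipy, sklearn, snips_nlu; print('snips-nlu', snips_nlu.__version__)"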
- Thanks to the snips-nlu tools you'll be able to train a model, but first we need to download the resources for the target language. (Warning: if you installed the wheels as the pi user without sudo, the snips-nlu path will be /home/pi/.local/bin/snips-nlu; see the PATH note just below.)
snips-nlu download en
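If the snips-nlu command isn't found after a user-level (non-sudo) install, one option is to put that local bin directory on your PATH first; the path below assumes the default pip3 --user layout for the pi user mentioned above:
export PATH="/home/pi/.local/bin:$PATH"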
- Then train an engine on a dataset. Let's use the sample available in the snips-nlu repository.
git clone https://github.com/snipsco/snips-nlu
cd snips-nlu/
snips-nlu train sample_datasets/lights_dataset.json path_to_output_trained_engine/
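You can give the freshly trained engine a quick try before wiring it to MQTT; recent snips-nlu CLI versions ship a parse command that loads an engine and prompts for queries interactively (skip this if your version doesn't have it):
snips-nlu parse path_to_output_trained_engine/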
- rustup
- mqtt server/client (Mosquitto)
- clang
Get Mosquitto and clang from the apt repositories by running the command below (rustup is installed via its installer script in the build steps):
sudo apt install mosquitto mosquitto-clients clang
! You won't be able to compile this on a Raspberry Pi itself; it needs more power, so you'll have to cross-compile using a specific toolchain. More information is available here: https://github.com/jr-k/snips-nlu-rebirth/blob/master/XCOMPILE.md !
A prebuilt binary is now available; see the section "Download prebuilt package" below for more information.
- We need a Rust compiler, so let's install rustup:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
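The rustup installer places the toolchain under ~/.cargo/bin and generates an environment file; in the same shell session you may need to source it before cargo is available:
source "$HOME/.cargo/env"
rustc --version && cargo --version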
- Download repository
git clone https://github.com/jr-k/snips-nlu-rebirth && cd snips-nlu-rebirth
- Set up your configuration and edit it:
cp snips-nlu.toml.dist snips-nlu.toml && nano snips-nlu.toml
- Don't forget to set path_to_output_trained_engine in the engine_dir variable of the [global] section of the configuration file snips-nlu.toml (from this project); you're then ready to parse any query trained from the lights_dataset model (a sketch of that entry follows below).
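A minimal sketch of what that entry could look like, reusing the placeholder path from the training step; the exact syntax (quoted path string) is an assumption, so check snips-nlu.toml.dist for the expected format:
grep -A1 '\[global\]' snips-nlu.toml
# expected output (sketch):
# [global]
# engine_dir = "path_to_output_trained_engine/"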
- Finally, build/run the project:
cargo run # or build
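For a production-style build rather than a debug run, the usual cargo flow applies; the release binary ends up under target/release (cross-compilation targets and flags are covered in XCOMPILE.md):
cargo build --release
./target/release/snips-nlu-rebirth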
- If you don't want to cross-compile a binary for the Raspberry Pi, you can find a prebuilt one here (MD5 and SHA256 checksums are listed in the prebuilt binary README.md):
mkdir -p /home/pi/snips-nlu-rebirth && cd /home/pi/snips-nlu-rebirth
wget -O snips-nlu-rebirth https://github.com/jr-k/snips-nlu-rebirth/blob/master/dist/snips-nlu-rebirth?raw=true
wget -O snips-nlu.toml https://github.com/jr-k/snips-nlu-rebirth/blob/master/snips-nlu.toml.dist?raw=true
chmod +x ./snips-nlu-rebirth
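Since checksums are published for the prebuilt binary, you may want to verify the download before executing it and compare the output with the values listed in the repository's README.md:
sha256sum snips-nlu-rebirth
md5sum snips-nlu-rebirth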
- As above, don't forget to set path_to_output_trained_engine in the engine_dir variable of the [global] section of the configuration file snips-nlu.toml (from this project); you're then ready to parse any query trained from the lights_dataset model.
- Finally, run the project:
./snips-nlu-rebirth
- Run mosquitto_sub -t '#' -v to see what's going on.
- You can trigger the NLU by sending an MQTT message:
mosquitto_pub -t 'hermes/nlu/query' -m '{"input":"light in the garage", "sessionId":"42"}'
The output on topic hermes/nlu/intentParsed would be:
{
"input": "light in the garage",
"id": null,
"sessionId": "42",
"intent": {
"intentName": "turnLightOn",
"confidenceScore": 0.3685922
},
"slots": [{
"rawValue": "garage",
"value": {
"kind": "Custom",
"value": "garage"
},
"alternatives": [],
"range": {
"start": 13,
"end": 19
},
"entity": "room",
"slotName": "room"
}]
}
This project follows the hermes protocol described here: https://docs.snips.ai/reference/hermes#natural-language-understanding-nlu
API for NLU:
- hermes/nlu/query : ✅ (added intentWhitelist, which is an alias for intentFilter; added intentBlacklist)
- hermes/nlu/partialQuery : ❌
- hermes/nlu/intentParsed : ✅
- hermes/nlu/slotParsed : ❌
- hermes/nlu/intentNotRecognized : ✅
- hermes/error/nlu : ✅ (for backward compatibility with snips-nlu)
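For example, a query restricted to a single intent could look like the line below; the field names come from the list above, while the array-of-intent-names format is an assumption based on the hermes intentFilter convention:
mosquitto_pub -t 'hermes/nlu/query' -m '{"input":"light in the garage", "sessionId":"42", "intentWhitelist":["turnLightOn"]}'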
Extras:
- hermes/nlu/exit : ✅ (exits the program gracefully)
- hermes/nlu/reload/engine : ✅ (hot-reloads the trained engine)
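The extra topics can be driven the same way; an empty payload is assumed here, since the section doesn't document one:
mosquitto_pub -t 'hermes/nlu/exit' -m ''
mosquitto_pub -t 'hermes/nlu/reload/engine' -m ''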
- TLS for MQTT server
This library is provided by JRK as Open Source software. See LICENSE for more information.