chore: add detailed example #12

Merged 4 commits on Aug 11, 2024
17 changes: 7 additions & 10 deletions README.md
@@ -1,21 +1,17 @@
- tantivy-jieba
- ============================
+ # tantivy-jieba

[![Crates.io version][crate-img]][crate]
[![docs.rs][docs-img]][docs]
[![Changelog][changelog-img]][changelog]
[![FOSSA Status](https://app.fossa.io/api/projects/git%2Bgithub.com%2Fjiegec%2Ftantivy-jieba.svg?type=shield)](https://app.fossa.io/projects/git%2Bgithub.com%2Fjiegec%2Ftantivy-jieba?ref=badge_shield)


An adapter that bridges between tantivy and jieba-rs.

- Usage
- ===========================
+ ## Usage

Add dependency `tantivy-jieba` to your `Cargo.toml`.

- Example
- ---------------------------
+ ### Example

```rust
use tantivy::tokenizer::*;
@@ -25,8 +21,7 @@
assert_eq!(token_stream.next().unwrap().text, "测试");
assert!(token_stream.next().is_none());
```
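
The diff collapses the middle of this snippet; only the import and the final assertions are visible. A minimal sketch of the complete example, assuming the `tantivy_jieba::JiebaTokenizer {}` constructor from examples/mod.rs below and tantivy's `Tokenizer::token_stream` API:

```rust
use tantivy::tokenizer::*;

// Assumed reconstruction of the collapsed lines: build the jieba tokenizer
// and tokenize a short string.
let mut tokenizer = tantivy_jieba::JiebaTokenizer {};
let mut token_stream = tokenizer.token_stream("测试");
assert_eq!(token_stream.next().unwrap().text, "测试");
assert!(token_stream.next().is_none());
```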

- Register tantivy tokenizer
- ---------------------------
+ ### Register tantivy tokenizer

```rust
use tantivy::schema::Schema;
@@ -38,13 +33,15 @@
index.tokenizers()
.register("jieba", tokenizer);
```
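
The registration snippet is likewise collapsed in the diff. A sketch of a complete version, assuming an empty in-RAM index; the `Index::create_in_ram` and `tokenizers().register` calls mirror examples/mod.rs below:

```rust
use tantivy::schema::Schema;
use tantivy::Index;

// Assumed setup: an empty schema and an in-RAM index are enough to
// demonstrate registering the tokenizer under the name "jieba".
let tokenizer = tantivy_jieba::JiebaTokenizer {};
let index = Index::create_in_ram(Schema::builder().build());
index.tokenizers()
    .register("jieba", tokenizer);
```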

+ See [examples/mod.rs](examples/mod.rs) for a detailed example.

[crate-img]: https://img.shields.io/crates/v/tantivy-jieba.svg
[crate]: https://crates.io/crates/tantivy-jieba
[changelog-img]: https://img.shields.io/badge/changelog-online-blue.svg
[changelog]: https://github.com/jiegec/tantivy-jieba/blob/master/CHANGELOG.md
[docs-img]: https://docs.rs/tantivy-jieba/badge.svg
[docs]: https://docs.rs/tantivy-jieba


## License

[![FOSSA Status](https://app.fossa.io/api/projects/git%2Bgithub.com%2Fjiegec%2Ftantivy-jieba.svg?type=large)](https://app.fossa.io/projects/git%2Bgithub.com%2Fjiegec%2Ftantivy-jieba?ref=badge_large)
63 changes: 63 additions & 0 deletions examples/mod.rs
@@ -0,0 +1,63 @@
use tantivy::collector::TopDocs;
use tantivy::doc;
use tantivy::query::QueryParser;
use tantivy::schema::{IndexRecordOption, Schema, TextFieldIndexing, TextOptions, Value};
use tantivy::tokenizer::*;
use tantivy::Index;
use tantivy::TantivyDocument;

fn main() {
    // Build schema
    let mut schema_builder = Schema::builder();
    let name = schema_builder.add_text_field(
        "name",
        TextOptions::default()
            .set_indexing_options(
                TextFieldIndexing::default()
                    .set_tokenizer("jieba")
                    .set_index_option(IndexRecordOption::WithFreqsAndPositions),
            )
            .set_stored(),
    );
    let schema = schema_builder.build();

    // Register tantivy tokenizer
    let tokenizer = tantivy_jieba::JiebaTokenizer {};
    let index = Index::create_in_ram(schema);
    let analyzer = TextAnalyzer::builder(tokenizer)
        .filter(RemoveLongFilter::limit(40))
        .filter(LowerCaser)
        .filter(Stemmer::default())
        .build();
    index.tokenizers().register("jieba", analyzer);

    // Index some documents
    let mut index_writer = index.writer(50_000_000).unwrap();
    index_writer.add_document(doc!(
        name => "张华考上了北京大学;李萍进了中等技术学校;我在百货公司当售货员:我们都有光明的前途",
    )).unwrap();
    index_writer.commit().unwrap();

    // Search keywords
    let reader = index.reader().unwrap();
    let searcher = reader.searcher();
    let query_parser = QueryParser::for_index(&index, vec![name]);
    let query = query_parser.parse_query("售货员").unwrap();
    let top_docs = searcher.search(&query, &TopDocs::with_limit(10)).unwrap();
    println!("Search Result:");
    for (_, doc_address) in top_docs {
        let retrieved_doc: TantivyDocument = searcher.doc(doc_address).unwrap();
        let val = retrieved_doc.get_first(name).unwrap();
        let res = val.as_str().unwrap_or_default().to_string();
        println!("{res}");
        assert_eq!(
            res,
            *"张华考上了北京大学;李萍进了中等技术学校;我在百货公司当售货员:我们都有光明的前途"
        );
    }
}

#[test]
fn test() {
    main();
}