Updated README to recommend use of an external data store with large corpora.

Signed-off-by: Govind Kamat <govkamat@amazon.com>
gkamat committed Jul 9, 2024
1 parent f4a830e commit a6be2cb
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion big5/README.md
@@ -182,9 +182,10 @@ Running range-auto-date-histo-with-metrics [

### Considerations when Using the 1 TB Data Corpus

-*Caveat*: This corpus is being made available as a feature that is currently being alpha tested. Some points to note when carrying out performance runs using this corpus:
+*Caveat*: This corpus is being made available as a feature that is currently in beta test. Some points to note when carrying out performance runs using this corpus:

* Due to CloudFront download size limits, the uncompressed size of the 1 TB corpus is actually 0.95 TB (~0.9 TiB). This [issue has been noted](https://github.com/opensearch-project/opensearch-benchmark/issues/543) and will be resolved in due course.
+* Use an external data store to record metrics (see the configuration sketch below the diff). Using the in-memory store will likely cause the system to run out of memory and become unresponsive, resulting in inaccurate performance numbers.
* Use a load generation host with sufficient disk space to hold the corpus.
* Ensure the target cluster has adequate storage and at least 3 data nodes.
* Specify an appropriate shard count and number of replicas so that shards are evenly distributed and appropriately sized.
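As a sketch of the external data store recommendation added above, the excerpt below shows how an OpenSearch-backed metrics store is typically configured in OpenSearch Benchmark's `~/.benchmark/benchmark.ini`. The section and key names are assumed from the tool's Rally lineage, and the host, port, and credential values are placeholders; verify the exact settings against the OpenSearch Benchmark documentation for the version in use.

```ini
# ~/.benchmark/benchmark.ini (excerpt) -- illustrative sketch, values are placeholders
[results_publishing]
# Write benchmark metrics to an external OpenSearch cluster instead of the in-memory store
datastore.type = opensearch
datastore.host = metrics.example.com
datastore.port = 9200
datastore.secure = true
datastore.user = benchmark_user
datastore.password = <password>
```

With an external store in place, metrics from 1 TB corpus runs are persisted outside the load generation host's memory, avoiding the out-of-memory condition described in the bullet above.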

