Commit

Merge remote-tracking branch 'origin/main' into cve

Signed-off-by: Aman Khare <amkhar@amazon.com>
Aman Khare committed Mar 12, 2024
2 parents 1afd6f5 + 69fc7dd commit 630de5b
Showing 59 changed files with 342 additions and 42 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -53,6 +53,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `org.bouncycastle:bcmail-jdk15to18` to `org.bouncycastle:bcmail-jdk18on` version 1.77 ([#12317](https://github.com/opensearch-project/OpenSearch/pull/12317))
- Bump `org.bouncycastle:bcpkix-jdk15to18` to `org.bouncycastle:bcpkix-jdk18on` version 1.77 ([#12317](https://github.com/opensearch-project/OpenSearch/pull/12317))
- Bump `org.apache.commons:commons-compress` from 1.24.0 to 1.26.0 ([#12604](https://github.com/opensearch-project/OpenSearch/pull/12604))
- Bump Jackson version from 2.16.1 to 2.16.2 ([#12611](https://github.com/opensearch-project/OpenSearch/pull/12611))

### Changed
- [CCR] Add getHistoryOperationsFromTranslog method to fetch the history snapshot from translogs ([#3948](https://github.com/opensearch-project/OpenSearch/pull/3948))
@@ -64,6 +65,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Add task completion count in search backpressure stats API ([#10028](https://github.com/opensearch-project/OpenSearch/pull/10028/))
- Deprecate CamelCase `PathHierarchy` tokenizer name in favor of lowercase `path_hierarchy` ([#10894](https://github.com/opensearch-project/OpenSearch/pull/10894))
- Switched to more reliable OpenSearch Lucene snapshot location ([#11728](https://github.com/opensearch-project/OpenSearch/pull/11728))
- Breaking change: Do not request "search_pipelines" metrics by default in NodesInfoRequest ([#12497](https://github.com/opensearch-project/OpenSearch/pull/12497))

### Deprecated

@@ -105,6 +107,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Add toString methods to MultiSearchRequest, MultiGetRequest and CreateIndexRequest ([#12163](https://github.com/opensearch-project/OpenSearch/pull/12163))
- Support for returning scores in matched queries ([#11626](https://github.com/opensearch-project/OpenSearch/pull/11626))
- Add shard id property to SearchLookup for use in field types provided by plugins ([#1063](https://github.com/opensearch-project/OpenSearch/pull/1063))
- Force merge API supports performing on primary shards only ([#11269](https://github.com/opensearch-project/OpenSearch/pull/11269))
- [Tiered caching] Make IndicesRequestCache implementation configurable [EXPERIMENTAL] ([#12533](https://github.com/opensearch-project/OpenSearch/pull/12533))
- Add kuromoji_completion analyzer and filter ([#4835](https://github.com/opensearch-project/OpenSearch/issues/4835))
- The org.opensearch.bootstrap.Security should support codebase for JAR files with classifiers ([#12586](https://github.com/opensearch-project/OpenSearch/issues/12586))
4 changes: 2 additions & 2 deletions buildSrc/version.properties
@@ -7,8 +7,8 @@ bundled_jdk = 21.0.2+13
# optional dependencies
spatial4j = 0.7
jts = 1.15.0
jackson = 2.16.1
jackson_databind = 2.16.1
jackson = 2.16.2
jackson_databind = 2.16.2
snakeyaml = 2.1
icu4j = 70.1
supercsv = 2.4.0
1 change: 0 additions & 1 deletion client/sniffer/licenses/jackson-core-2.16.1.jar.sha1

This file was deleted.

1 change: 1 addition & 0 deletions client/sniffer/licenses/jackson-core-2.16.2.jar.sha1
@@ -0,0 +1 @@
b4f588bf070f77b604c645a7d60b71eae2e6ea09

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932
1 change: 0 additions & 1 deletion libs/core/licenses/jackson-core-2.16.1.jar.sha1

This file was deleted.

1 change: 1 addition & 0 deletions libs/core/licenses/jackson-core-2.16.2.jar.sha1
@@ -0,0 +1 @@
b4f588bf070f77b604c645a7d60b71eae2e6ea09
1 change: 0 additions & 1 deletion libs/x-content/licenses/jackson-core-2.16.1.jar.sha1

This file was deleted.

1 change: 1 addition & 0 deletions libs/x-content/licenses/jackson-core-2.16.2.jar.sha1
@@ -0,0 +1 @@
b4f588bf070f77b604c645a7d60b71eae2e6ea09

This file was deleted.

@@ -0,0 +1 @@
1a1a3036016ea2ae3061c0bb46cba6968ff7faae

This file was deleted.

@@ -0,0 +1 @@
209fd9ae0e6c6b233b0c14baa8f17acea71e5766

This file was deleted.

@@ -0,0 +1 @@
13088f6762211f264bc0ebf5467be96d8e9e3ebf

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932

This file was deleted.

@@ -0,0 +1 @@
796c3141d3bbcf67dc06751695dca116b2838a73

This file was deleted.

@@ -0,0 +1 @@
58e86108e4b1b1e893e7a69b1bbca880acfca143

This file was deleted.

@@ -0,0 +1 @@
d1274db656edefe242fbd26d3266f7b4abb6f57b

This file was deleted.

@@ -0,0 +1 @@
dfcd11c847ea7276aa073c25f5fe8ee361748d7f

This file was deleted.

@@ -0,0 +1 @@
7fda67535b54d74eebf6157682b835c847410932
@@ -63,6 +63,10 @@
"wait_for_completion": {
"type" : "boolean",
"description" : "If false, the request will return a task immediately and the operation will run in the background. Defaults to true."
},
"primary_only": {
"type" : "boolean",
"description" : "Specify whether the operation should be performed only on primary shards. Defaults to false."
}
}
}
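The spec above maps directly onto query parameters of the `indices.forcemerge` endpoint. As a rough sketch of how a request path is assembled (the `force_merge_url` helper is ours for illustration, not part of this PR):

```python
from urllib.parse import urlencode

def force_merge_url(index, primary_only=False, max_num_segments=None, wait_for_completion=True):
    """Build an indices.forcemerge request path from the parameters in the spec above."""
    params = {}
    if primary_only:
        params["primary_only"] = "true"
    if max_num_segments is not None:
        params["max_num_segments"] = str(max_num_segments)
    if not wait_for_completion:
        params["wait_for_completion"] = "false"
    query = "?" + urlencode(params) if params else ""
    return f"/{index}/_forcemerge{query}"

print(force_merge_url("test", primary_only=True, max_num_segments=1))
# → /test/_forcemerge?primary_only=true&max_num_segments=1
```

Defaults (`primary_only=false`, `wait_for_completion=true`) are omitted from the query string, matching the spec's documented default values.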
@@ -27,3 +27,23 @@
index: test
max_num_segments: 10
only_expunge_deletes: true

---
"Test primary_only parameter":
- skip:
version: " - 2.99.99"
reason: "primary_only is available in 3.0+"

- do:
indices.create:
index: test
body:
settings:
index.number_of_shards: 2
index.number_of_replicas: 1

- do:
indices.forcemerge:
index: test
primary_only: true
- match: { _shards.total: 2 }
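The `_shards.total: 2` assertion follows from simple arithmetic: with `primary_only: true`, only the 2 primaries are merged, whereas a regular force merge would touch every copy of every shard. A small sketch of that count (the same arithmetic the integration tests below assert against `getSuccessfulShards()`):

```python
def merged_shard_copies(shards, replicas, primary_only):
    # primary_only=true limits the force merge to primary shards;
    # otherwise every copy (primary plus each replica) is merged.
    return shards if primary_only else shards * (1 + replicas)

# The YAML test above: 2 shards, 1 replica each.
print(merged_shard_copies(2, 1, True))   # → 2
print(merged_shard_copies(2, 1, False))  # → 4
```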
@@ -25,7 +25,7 @@
wait_for_completion: true
task_id: $taskId
- match: { task.action: "indices:admin/forcemerge" }
- match: { task.description: "Force-merge indices [test_index], maxSegments[1], onlyExpungeDeletes[false], flush[true]" }
- match: { task.description: "Force-merge indices [test_index], maxSegments[1], onlyExpungeDeletes[false], flush[true], primaryOnly[false]" }

# .tasks index is created when the force-merge operation completes, so we should delete .tasks index finally,
# if not, the .tasks index may introduce unexpected warnings and then cause other test cases to fail.
@@ -100,6 +100,24 @@ public void testForceMergeUUIDConsistent() throws IOException {
assertThat(primaryForceMergeUUID, is(replicaForceMergeUUID));
}

public void testForceMergeOnlyOnPrimaryShards() throws IOException {
internalCluster().ensureAtLeastNumDataNodes(2);
final String index = "test-index";
createIndex(
index,
Settings.builder().put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, 1).put(IndexMetadata.SETTING_NUMBER_OF_REPLICAS, 1).build()
);
ensureGreen(index);
final ForceMergeResponse forceMergeResponse = client().admin()
.indices()
.prepareForceMerge(index)
.setMaxNumSegments(1)
.setPrimaryOnly(true)
.get();
assertThat(forceMergeResponse.getFailedShards(), is(0));
assertThat(forceMergeResponse.getSuccessfulShards(), is(1));
}

private static String getForceMergeUUID(IndexShard indexShard) throws IOException {
try (GatedCloseable<IndexCommit> wrappedIndexCommit = indexShard.acquireLastIndexCommit(true)) {
return wrappedIndexCommit.get().getUserData().get(Engine.FORCE_MERGE_UUID_KEY);
@@ -25,6 +25,7 @@
import org.opensearch.action.admin.cluster.stats.ClusterStatsResponse;
import org.opensearch.action.admin.indices.alias.Alias;
import org.opensearch.action.admin.indices.flush.FlushRequest;
import org.opensearch.action.admin.indices.forcemerge.ForceMergeResponse;
import org.opensearch.action.admin.indices.stats.IndicesStatsRequest;
import org.opensearch.action.admin.indices.stats.IndicesStatsResponse;
import org.opensearch.action.get.GetResponse;
@@ -400,6 +401,14 @@ public void testMultipleShards() throws Exception {
}

public void testReplicationAfterForceMerge() throws Exception {
performReplicationAfterForceMerge(false, SHARD_COUNT * (1 + REPLICA_COUNT));
}

public void testReplicationAfterForceMergeOnPrimaryShardsOnly() throws Exception {
performReplicationAfterForceMerge(true, SHARD_COUNT);
}

private void performReplicationAfterForceMerge(boolean primaryOnly, int expectedSuccessfulShards) throws Exception {
final String nodeA = internalCluster().startDataOnlyNode();
final String nodeB = internalCluster().startDataOnlyNode();
createIndex(INDEX_NAME);
@@ -430,8 +439,16 @@ public void testReplicationAfterForceMerge() throws Exception {
waitForDocs(expectedHitCount, indexer);
waitForSearchableDocs(expectedHitCount, nodeA, nodeB);

// Force a merge here so that the in memory SegmentInfos does not reference old segments on disk.
client().admin().indices().prepareForceMerge(INDEX_NAME).setMaxNumSegments(1).setFlush(false).get();
// Perform force merge only on the primary shards.
final ForceMergeResponse forceMergeResponse = client().admin()
.indices()
.prepareForceMerge(INDEX_NAME)
.setPrimaryOnly(primaryOnly)
.setMaxNumSegments(1)
.setFlush(false)
.get();
assertThat(forceMergeResponse.getFailedShards(), is(0));
assertThat(forceMergeResponse.getSuccessfulShards(), is(expectedSuccessfulShards));
refresh(INDEX_NAME);
verifyStoreContent();
}
@@ -53,7 +53,7 @@
@PublicApi(since = "1.0.0")
public class NodesInfoRequest extends BaseNodesRequest<NodesInfoRequest> {

private Set<String> requestedMetrics = Metric.allMetrics();
private Set<String> requestedMetrics = Metric.defaultMetrics();

/**
* Create a new NodeInfoRequest from a {@link StreamInput} object.
@@ -73,7 +73,7 @@ public NodesInfoRequest(StreamInput in) throws IOException {
*/
public NodesInfoRequest(String... nodesIds) {
super(nodesIds);
all();
defaultMetrics();
}

/**
@@ -85,13 +85,24 @@ public NodesInfoRequest clear() {
}

/**
* Sets to return all the data.
* Sets to return data for all the metrics.
* See {@link Metric}
*/
public NodesInfoRequest all() {
requestedMetrics.addAll(Metric.allMetrics());
return this;
}

/**
* Sets to return data for default metrics only.
* See {@link Metric#defaultMetrics()}.
*/
public NodesInfoRequest defaultMetrics() {
requestedMetrics.addAll(Metric.defaultMetrics());
return this;
}

/**
* Get the names of requested metrics
*/
@@ -156,7 +167,7 @@ public void writeTo(StreamOutput out) throws IOException {

/**
* An enumeration of the "core" sections of metrics that may be requested
* from the nodes information endpoint. Eventually this list list will be
* from the nodes information endpoint. Eventually this list will be
* pluggable.
*/
public enum Metric {
@@ -187,8 +198,25 @@ boolean containedIn(Set<String> metricNames) {
return metricNames.contains(this.metricName());
}

/**
* Return all available metrics.
* See {@link Metric}
*/
public static Set<String> allMetrics() {
return Arrays.stream(values()).map(Metric::metricName).collect(Collectors.toSet());
}

/**
* Return "the default" set of metrics.
* Similar to {@link #allMetrics()} except {@link Metric#SEARCH_PIPELINES} metric is not included.
* <br>
* The motivation to define the default set of metrics was to keep the default response
* size at bay. Metrics that are NOT included in the default set were typically introduced later
and are considered to contain a specific type of information that is not usually useful unless you
* know that you really need it.
*/
public static Set<String> defaultMetrics() {
return allMetrics().stream().filter(metric -> !(metric.equals(SEARCH_PIPELINES.metricName()))).collect(Collectors.toSet());
}
}
}
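The default-versus-all distinction in `defaultMetrics()` boils down to one set difference over the metric names. A minimal sketch of that filter — note that apart from `search_pipelines`, the metric names below are illustrative assumptions, since the diff only shows the `SEARCH_PIPELINES` constant:

```python
# Illustrative metric names; only "search_pipelines" is confirmed by this diff.
ALL_METRICS = {
    "settings", "os", "process", "jvm", "thread_pool", "transport",
    "http", "plugins", "ingest", "aggregations", "indices", "search_pipelines",
}

def default_metrics():
    # Same filter as Metric.defaultMetrics(): every metric except search_pipelines.
    return ALL_METRICS - {"search_pipelines"}

print(sorted(ALL_METRICS - default_metrics()))  # → ['search_pipelines']
```

Callers that still need pipeline information can opt back in with `all()` (or by requesting the `search_pipelines` metric explicitly), which is what makes this a breaking change for clients relying on the old default.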