
[Ingest Manager] Don't retain POST /setup results. fixes #74587 #75372

Merged
merged 23 commits into elastic:master on Aug 20, 2020

Conversation

jfsiii
Contributor

@jfsiii jfsiii commented Aug 18, 2020

Summary

Extracted the /setup-only changes from #74507.

  • Don't retain the setup error. Run setupIngestManager on every POST /setup. Fixes [Ingest Manager] Plugin won't retry failed initial setup  #74587 (see the sketch after this list).
  • Refactor setupIngestManager to separate a) the code for the current "one-setup-at-a-time" behavior from b) the code that creates the required side effects. Only cut/pasted blocks into functions; no behavior was changed. Added tests to confirm the current/expected behavior.
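
A rough sketch of the handler behavior described in the first bullet, assuming a simplified handler signature and a stand-in RegistryError class rather than the actual Kibana router types:

```
// Hypothetical, simplified sketch: every POST /setup re-runs setupIngestManager,
// and nothing from a previous call (success or error) is retained, so a failed
// initial setup can simply be retried with another POST /setup.
class RegistryError extends Error {}

async function ingestManagerSetupHandler(
  setupIngestManager: () => Promise<{ isInitialized: boolean }>
): Promise<{ status: number; body: unknown }> {
  try {
    const body = await setupIngestManager(); // runs again on every request
    return { status: 200, body };
  } catch (e) {
    // Map registry failures to 502 and anything else to 500, matching the
    // statuses asserted by the handler tests listed under "Added tests".
    const message = e instanceof Error ? e.message : String(e);
    return e instanceof RegistryError
      ? { status: 502, body: { message } }
      : { status: 500, body: { message } };
  }
}
```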

Checklist

Delete any items that are not applicable to this PR.

Added tests

ingestManagerSetupHandler
  ✓ POST /setup succeeds w/200 and body of resolved value (135ms)
  ✓ POST /setup fails w/500 on custom error (140ms)
  ✓ POST /setup fails w/502 on RegistryError (46ms)

setupIngestManager
  should reject with any error thrown underneath
    ✓ SO client throws plain Error (83ms)
    ✓ SO client throws other error (32ms)

  awaitIfPending
    ✓ first promise called blocks others (3ms)
    ✓ does not block other calls after batch is fulfilled. can call again for a new result (7ms)
    first promise created, not necessarily first fulfilled, sets value for all in queue
      ✓ succeeds (1001ms)
      ✓ throws (1018ms)
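
The `awaitIfPending` behavior those tests describe could look roughly like the following sketch (an assumption about the shape of the code, not the actual Kibana implementation):

```
// Minimal sketch: the first call starts the work; callers that arrive while it is
// pending await the same promise; once it settles, the slot is cleared so the next
// call starts a fresh run. Results are shared within a batch but never retained.
function awaitIfPending<T>(fn: () => Promise<T>): () => Promise<T> {
  let pending: Promise<T> | undefined;

  return () => {
    if (!pending) {
      pending = fn().finally(() => {
        pending = undefined; // clear so a later call re-runs fn
      });
    }
    return pending;
  };
}

// e.g. const setup = awaitIfPending(setupIngestManager);
// Concurrent POST /setup requests share one run; a request after it settles starts a new one.
```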

John Schulz and others added 22 commits August 6, 2020 08:18
works, afaict. no tests. one TS issue.
```
  firstSuccessOrTryAgain
    ✓ reject/throws is called again & its value returned (18ms)
    ✓ the first success value is cached (2ms)
```
Terrible tests. Committing & pushing to see if it fixes failures like https://github.com/elastic/kibana/pull/74507/checks?check_run_id=980178887

https://kibana-ci.elastic.co/job/elastic+kibana+pipeline-pull-request/67892/execution/node/663/log/

```
07:36:56               └-> "before all" hook
07:36:56               └-> should not allow to enroll an agent with a invalid enrollment
07:36:56                 └-> "before each" hook: global before each
07:36:56                 └-> "before each" hook: beforeSetupWithDockerRegistry
07:36:56                 │ proc [kibana]  error  [11:36:56.369]  Error: Internal Server Error
07:36:56                 │ proc [kibana]     at HapiResponseAdapter.toError (/dev/shm/workspace/parallel/5/kibana/build/kibana-build-xpack/src/core/server/http/router/response_adapter.js:132:19)
07:36:56                 │ proc [kibana]     at HapiResponseAdapter.toHapiResponse (/dev/shm/workspace/parallel/5/kibana/build/kibana-build-xpack/src/core/server/http/router/response_adapter.js:86:19)
07:36:56                 │ proc [kibana]     at HapiResponseAdapter.handle (/dev/shm/workspace/parallel/5/kibana/build/kibana-build-xpack/src/core/server/http/router/response_adapter.js:81:17)
07:36:56                 │ proc [kibana]     at Router.handle (/dev/shm/workspace/parallel/5/kibana/build/kibana-build-xpack/src/core/server/http/router/router.js:164:34)
07:36:56                 │ proc [kibana]     at process._tickCallback (internal/process/next_tick.js:68:7)
07:36:56                 │ proc [kibana]   log   [11:36:56.581] [info][authentication][plugins][security] Authentication attempt failed: [security_exception] missing authentication credentials for REST request [/_security/_authenticate], with { header={ WWW-Authenticate={ 0="ApiKey" & 1="Basic realm=\"security\" charset=\"UTF-8\"" } } }
07:36:56                 └- ✓ pass  (60ms) "Ingest Manager Endpoints Fleet Endpoints fleet_agents_enroll should not allow to enroll an agent with a invalid enrollment"
07:36:56               └-> should not allow to enroll an agent with a shared id if it already exists
07:36:56                 └-> "before each" hook: global before each
07:36:56                 └-> "before each" hook: beforeSetupWithDockerRegistry
07:36:56                 └- ✓ pass  (111ms) "Ingest Manager Endpoints Fleet Endpoints fleet_agents_enroll should not allow to enroll an agent with a shared id if it already exists "
07:36:56               └-> should not allow to enroll an agent with a version > kibana
07:36:56                 └-> "before each" hook: global before each
07:36:56                 └-> "before each" hook: beforeSetupWithDockerRegistry
07:36:56                 └- ✓ pass  (58ms) "Ingest Manager Endpoints Fleet Endpoints fleet_agents_enroll should not allow to enroll an agent with a version > kibana"
07:36:56               └-> should allow to enroll an agent with a valid enrollment token
07:36:56                 └-> "before each" hook: global before each
07:36:56                 └-> "before each" hook: beforeSetupWithDockerRegistry
07:36:56                 └- ✖ fail: Ingest Manager Endpoints Fleet Endpoints fleet_agents_enroll should allow to enroll an agent with a valid enrollment token
07:36:56                 │      Error: expected 200 "OK", got 500 "Internal Server Error"
07:36:56                 │       at Test._assertStatus (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:268:12)
07:36:56                 │       at Test._assertFunction (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:283:11)
07:36:56                 │       at Test.assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:173:18)
07:36:56                 │       at assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:131:12)
07:36:56                 │       at /dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:128:5
07:36:56                 │       at Test.Request.callback (/dev/shm/workspace/kibana/node_modules/superagent/lib/node/index.js:718:3)
07:36:56                 │       at parser (/dev/shm/workspace/kibana/node_modules/superagent/lib/node/index.js:906:18)
07:36:56                 │       at IncomingMessage.res.on (/dev/shm/workspace/kibana/node_modules/superagent/lib/node/parsers/json.js:19:7)
07:36:56                 │       at endReadableNT (_stream_readable.js:1145:12)
07:36:56                 │       at process._tickCallback (internal/process/next_tick.js:63:19)
07:36:56                 │
07:36:56                 │
```
`firstPromiseBlocksAndFufills` for "the first promise created blocks others from being created, then fufills all with that first result"
Add explicit `isPending` value instead of overloading role of `status`. Could probably do without it, but it makes the intent more clear.
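
A small sketch of the state shape that change implies (hypothetical field names; the real module is not shown here): a boolean that only answers "is a setup call in flight?", kept separate from the `status` that records the outcome.

```
// Hypothetical sketch: isPending tracks "a call is in flight" on its own, instead
// of overloading status, which only records how the last completed call ended.
interface SetupStatus {
  isPending: boolean;
  status?: 'success' | 'error';
  error?: Error;
}
```
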
```
@@ -40,14 +47,19 @@ export const registerRoutes = (router: IRouter, config: IngestManagerConfigType)
},
getFleetStatusHandler
);
};

export const registerRoutes = (router: IRouter, config: IngestManagerConfigType) => {
```
Contributor Author

I split this function up and exported the individual register*Route functions for some now-removed tests. I left it because it seems useful and is common in other Kibana plugins.

#74507 (comment)
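
A minimal sketch of the pattern that comment describes (route paths and handler bodies here are placeholders, not the plugin's real routes): one `registerRoutes` entry point delegating to individually exported `register*Routes` functions, so each group can be registered and tested on its own.

```
// Hypothetical sketch of splitting route registration into exported helpers.
interface SimpleRouter {
  post(path: string, handler: () => void): void;
}

export const registerSetupRoutes = (router: SimpleRouter) => {
  router.post('/setup', () => {
    /* setup handler */
  });
};

export const registerFleetRoutes = (router: SimpleRouter) => {
  router.post('/fleet/agents/enroll', () => {
    /* enroll handler */
  });
};

export const registerRoutes = (router: SimpleRouter) => {
  registerSetupRoutes(router);
  registerFleetRoutes(router);
};
```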

@jfsiii jfsiii added release_note:skip Skip the PR/issue when compiling release notes Team:Fleet Team label for Observability Data Collection Fleet team v7.10.0 v8.0.0 labels Aug 19, 2020
@ph ph requested review from nchaulet and jen-huang and removed request for nchaulet August 19, 2020 15:32
@ph ph requested a review from nchaulet August 19, 2020 15:32
@jfsiii jfsiii requested a review from neptunian August 19, 2020 15:35
@jfsiii jfsiii changed the title 74587 retry failed setup [Ingest Manager] Don't retain POST /setup error fixes #74587 Aug 19, 2020
@jfsiii jfsiii changed the title [Ingest Manager] Don't retain POST /setup error fixes #74587 [Ingest Manager] Don't retain POST /setup results. fixes #74587 Aug 19, 2020
@jfsiii jfsiii marked this pull request as ready for review August 19, 2020 15:55
@jfsiii jfsiii requested a review from a team August 19, 2020 15:55
@elasticmachine
Contributor

Pinging @elastic/ingest-management (Team:Ingest Management)

@jfsiii jfsiii self-assigned this Aug 19, 2020
@jfsiii jfsiii added the v7.9.1 label Aug 19, 2020
@jfsiii
Contributor Author

jfsiii commented Aug 19, 2020

@elasticmachine merge upstream

@elasticmachine
Contributor

merge conflict between base and head

@kibanamachine
Contributor

💛 Build succeeded, but was flaky


Test Failures

Chrome UI Functional Tests.test/functional/apps/visualize/_vega_chart·ts.visualize app vega chart in visualize app vega chart with filters should render different data in response to filter change

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: visualize app
[00:00:00]           └-> "before all" hook
[00:00:00]           └-> "before all" hook
[00:00:00]             │ debg Starting visualize before method
[00:00:00]             │ info [logstash_functional] Loading "mappings.json"
[00:00:00]             │ info [logstash_functional] Loading "data.json.gz"
[00:00:01]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [logstash-2015.09.22] creating index, cause [api], templates [], shards [1]/[0]
[00:00:01]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.22][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.22][0]]"
[00:00:01]             │ info [logstash_functional] Created index "logstash-2015.09.22"
[00:00:01]             │ debg [logstash_functional] "logstash-2015.09.22" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:01]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [logstash-2015.09.20] creating index, cause [api], templates [], shards [1]/[0]
[00:00:01]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.20][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.20][0]]"
[00:00:01]             │ info [logstash_functional] Created index "logstash-2015.09.20"
[00:00:01]             │ debg [logstash_functional] "logstash-2015.09.20" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:01]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [logstash-2015.09.21] creating index, cause [api], templates [], shards [1]/[0]
[00:00:01]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.21][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.21][0]]"
[00:00:01]             │ info [logstash_functional] Created index "logstash-2015.09.21"
[00:00:01]             │ debg [logstash_functional] "logstash-2015.09.21" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info progress: 3007
[00:00:20]             │ info progress: 10524
[00:00:24]             │ info [logstash_functional] Indexed 4633 docs into "logstash-2015.09.22"
[00:00:24]             │ info [logstash_functional] Indexed 4757 docs into "logstash-2015.09.20"
[00:00:24]             │ info [logstash_functional] Indexed 4614 docs into "logstash-2015.09.21"
[00:00:24]             │ info [long_window_logstash] Loading "mappings.json"
[00:00:24]             │ info [long_window_logstash] Loading "data.json.gz"
[00:00:24]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [long-window-logstash-0] creating index, cause [api], templates [], shards [1]/[0]
[00:00:25]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[long-window-logstash-0][0]]])." previous.health="YELLOW" reason="shards started [[long-window-logstash-0][0]]"
[00:00:25]             │ info [long_window_logstash] Created index "long-window-logstash-0"
[00:00:25]             │ debg [long_window_logstash] "long-window-logstash-0" settings {"index":{"analysis":{"analyzer":{"makelogs_url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:34]             │ info progress: 9332
[00:00:39]             │ info [long_window_logstash] Indexed 14005 docs into "long-window-logstash-0"
[00:00:40]             │ info [visualize] Loading "mappings.json"
[00:00:40]             │ info [visualize] Loading "data.json"
[00:00:40]             │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_1/CrR2FdI5Q5Oj2mrxb-v4EA] deleting index
[00:00:40]             │ info [visualize] Deleted existing index [".kibana_1"]
[00:00:40]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana] creating index, cause [api], templates [], shards [1]/[1]
[00:00:40]             │ info [visualize] Created index ".kibana"
[00:00:40]             │ debg [visualize] ".kibana" settings {"index":{"number_of_replicas":"1","number_of_shards":"1"}}
[00:00:40]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana/-cUHxc5LSBSkAfcCDAyXDA] update_mapping [_doc]
[00:00:40]             │ info [visualize] Indexed 12 docs into ".kibana"
[00:00:40]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana/-cUHxc5LSBSkAfcCDAyXDA] update_mapping [_doc]
[00:00:40]             │ debg Migrating saved objects
[00:00:40]             │ proc [kibana]   log   [11:12:05.216] [info][savedobjects-service] Creating index .kibana_2.
[00:00:40]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_2] creating index, cause [api], templates [], shards [1]/[1]
[00:00:40]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] updating number_of_replicas to [0] for indices [.kibana_2]
[00:00:40]             │ proc [kibana]   log   [11:12:05.365] [info][savedobjects-service] Reindexing .kibana to .kibana_1
[00:00:40]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_1] creating index, cause [api], templates [], shards [1]/[1]
[00:00:40]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] updating number_of_replicas to [0] for indices [.kibana_1]
[00:00:41]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.tasks] creating index, cause [auto(task api)], templates [], shards [1]/[1]
[00:00:41]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] updating number_of_replicas to [0] for indices [.tasks]
[00:00:41]             │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] 1548 finished with response BulkByScrollResponse[took=138.6ms,timed_out=false,sliceId=null,updated=0,created=12,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:00:41]             │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana/-cUHxc5LSBSkAfcCDAyXDA] deleting index
[00:00:41]             │ proc [kibana]   log   [11:12:06.147] [info][savedobjects-service] Migrating .kibana_1 saved objects to .kibana_2
[00:00:41]             │ proc [kibana]   log   [11:12:06.190] [error][savedobjects-service] Error: Unable to migrate the corrupt Saved Object document index-pattern:test_index*. To prevent Kibana from performing a migration on every restart, please delete or fix this document by ensuring that the namespace and type in the document's id matches the values in the namespace and type fields.
[00:00:41]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_2/L0gdyyqtQ9-hnlKTFQ--UA] update_mapping [_doc]
[00:00:41]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_2/L0gdyyqtQ9-hnlKTFQ--UA] update_mapping [_doc]
[00:00:41]             │ proc [kibana]   log   [11:12:06.350] [info][savedobjects-service] Pointing alias .kibana to .kibana_2.
[00:00:42]             │ proc [kibana]   log   [11:12:06.445] [info][savedobjects-service] Finished in 1232ms.
[00:00:42]             │ debg applying update to kibana config: {"accessibility:disableAnimations":true,"dateFormat:tz":"UTC"}
[00:00:42]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-16-tests-xxl-1597920941231118364] [.kibana_2/L0gdyyqtQ9-hnlKTFQ--UA] update_mapping [_doc]
[00:00:44]             │ debg replacing kibana config doc: {"defaultIndex":"logstash-*","format:bytes:defaultPattern":"0,0.[000]b"}
[00:00:45]           └-: 
[00:00:45]             └-> "before all" hook
[00:22:42]             └-: vega chart in visualize app
[00:22:42]               └-> "before all" hook
[00:22:42]               └-> "before all" hook
[00:22:42]                 │ debg navigateToApp visualize
[00:22:42]                 │ debg navigating to visualize url: http://localhost:61201/app/visualize#/
[00:22:42]                 │ debg navigate to: http://localhost:61201/app/visualize#/
[00:22:43]                 │ debg browser[INFO] http://localhost:61201/app/visualize?_t=1597923247390#/ 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:22:43]                 │
[00:22:43]                 │ debg browser[INFO] http://localhost:61201/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:22:43]                 │ debg ... sleep(700) start
[00:22:43]                 │ debg ... sleep(700) end
[00:22:43]                 │ debg returned from get, calling refresh
[00:22:44]                 │ debg browser[INFO] http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js 452:106112 "INFO: 2020-08-20T11:34:08Z
[00:22:44]                 │        Adding connection to http://localhost:61201/elasticsearch
[00:22:44]                 │
[00:22:44]                 │      "
[00:22:44]                 │ERROR browser[SEVERE] http://localhost:61201/35726/bundles/core/core.entry.js 83:273753 TypeError: Failed to fetch
[00:22:44]                 │          at Fetch._callee3$ (http://localhost:61201/35726/bundles/core/core.entry.js:34:105310)
[00:22:44]                 │          at l (http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:368:164563)
[00:22:44]                 │          at Generator._invoke (http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:368:164316)
[00:22:44]                 │          at Generator.forEach.e.<computed> [as throw] (http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js:368:164920)
[00:22:44]                 │          at fetch_asyncGeneratorStep (http://localhost:61201/35726/bundles/core/core.entry.js:34:99403)
[00:22:44]                 │          at _throw (http://localhost:61201/35726/bundles/core/core.entry.js:34:99811)
[00:22:44]                 │ debg browser[INFO] http://localhost:61201/app/visualize?_t=1597923247390#/ 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:22:44]                 │
[00:22:44]                 │ debg browser[INFO] http://localhost:61201/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:22:44]                 │ debg currentUrl = http://localhost:61201/app/visualize#/
[00:22:44]                 │          appUrl = http://localhost:61201/app/visualize#/
[00:22:44]                 │ debg TestSubjects.find(kibanaChrome)
[00:22:44]                 │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"]') with timeout=60000
[00:22:45]                 │ debg browser[INFO] http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js 452:106112 "INFO: 2020-08-20T11:34:10Z
[00:22:45]                 │        Adding connection to http://localhost:61201/elasticsearch
[00:22:45]                 │
[00:22:45]                 │      "
[00:22:45]                 │ debg ... sleep(501) start
[00:22:46]                 │ debg ... sleep(501) end
[00:22:46]                 │ debg in navigateTo url = http://localhost:61201/app/visualize#/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-15m,to:now))
[00:22:46]                 │ debg --- retry.try error: URL changed, waiting for it to settle
[00:22:46]                 │ debg ... sleep(501) start
[00:22:47]                 │ debg ... sleep(501) end
[00:22:47]                 │ debg in navigateTo url = http://localhost:61201/app/visualize#/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-15m,to:now))
[00:22:47]                 │ debg TestSubjects.exists(statusPageContainer)
[00:22:47]                 │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="statusPageContainer"]') with timeout=2500
[00:22:50]                 │ debg --- retry.tryForTime error: [data-test-subj="statusPageContainer"] is not displayed
[00:22:50]                 │ debg TestSubjects.exists(newItemButton)
[00:22:50]                 │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="newItemButton"]') with timeout=10000
[00:22:50]                 │ debg TestSubjects.click(newItemButton)
[00:22:50]                 │ debg Find.clickByCssSelector('[data-test-subj="newItemButton"]') with timeout=10000
[00:22:50]                 │ debg Find.findByCssSelector('[data-test-subj="newItemButton"]') with timeout=10000
[00:22:50]                 │ debg TestSubjects.find(visNewDialogTypes)
[00:22:50]                 │ debg Find.findByCssSelector('[data-test-subj="visNewDialogTypes"]') with timeout=10000
[00:22:50]                 │ debg clickVega
[00:22:50]                 │ debg TestSubjects.click(visType-vega)
[00:22:50]                 │ debg Find.clickByCssSelector('[data-test-subj="visType-vega"]') with timeout=10000
[00:22:50]                 │ debg Find.findByCssSelector('[data-test-subj="visType-vega"]') with timeout=10000
[00:22:50]                 │ debg isGlobalLoadingIndicatorVisible
[00:22:50]                 │ debg TestSubjects.exists(globalLoadingIndicator)
[00:22:50]                 │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="globalLoadingIndicator"]') with timeout=1500
[00:22:50]                 │ debg browser[INFO] http://localhost:61201/app/visualize#/create?type=vega 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:22:50]                 │
[00:22:50]                 │ debg browser[INFO] http://localhost:61201/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:22:52]                 │ debg browser[INFO] http://localhost:61201/35726/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js 452:106112 "INFO: 2020-08-20T11:34:16Z
[00:22:52]                 │        Adding connection to http://localhost:61201/elasticsearch
[00:22:52]                 │
[00:22:52]                 │      "
[00:22:52]                 │ debg TestSubjects.exists(globalLoadingIndicator-hidden)
[00:22:52]                 │ debg Find.existsByCssSelector('[data-test-subj="globalLoadingIndicator-hidden"]') with timeout=100000
[00:22:52]                 │ debg Waiting up to 20000ms for rendering count to stabilize...
[00:22:52]                 │ debg TestSubjects.find(visualizationLoader)
[00:22:52]                 │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:22:52]                 │ proc [kibana]   log   [11:34:16.796] [info][data][data][plugins] Get strategy es
[00:22:52]                 │ proc [kibana]   log   [11:34:16.797] [info][data][data][plugins] search _all
[00:22:52]                 │ debg -- firstCount=0
[00:22:52]                 │ debg ... sleep(2000) start
[00:22:52]                 │ debg browser[INFO] http://localhost:61201/35726/bundles/plugin/visTypeVega/visTypeVega.chunk.1.js 0:16084 "%cWelcome to Kibana Vega Plugin!" "font-size: 16px; font-weight: bold;"
[00:22:52]                 │ debg browser[INFO] http://localhost:61201/35726/bundles/plugin/visTypeVega/visTypeVega.chunk.1.js 0:16170 "You can access the Vega view with VEGA_DEBUG. Learn more at https://vega.github.io/vega/docs/api/debugging/."
[00:22:54]                 │ debg ... sleep(2000) end
[00:22:54]                 │ debg TestSubjects.find(visualizationLoader)
[00:22:54]                 │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:22:54]                 │ debg -- secondCount=1
[00:22:54]                 │ debg TestSubjects.find(visualizationLoader)
[00:22:54]                 │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:22:54]                 │ debg -- firstCount=1
[00:22:54]                 │ debg ... sleep(2000) start
[00:22:56]                 │ debg ... sleep(2000) end
[00:22:56]                 │ debg TestSubjects.find(visualizationLoader)
[00:22:56]                 │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:22:56]                 │ debg -- secondCount=1
[00:22:56]               └-: vega chart
[00:22:56]                 └-> "before all" hook
[00:22:59]                 └-: with filters
[00:22:59]                   └-> "before all" hook
[00:22:59]                   └-> "before all" hook
[00:22:59]                     │ debg setAbsoluteRange
[00:22:59]                     │ debg Setting absolute range to Sep 19, 2015 @ 06:31:44.000 to Sep 23, 2015 @ 18:31:44.000
[00:22:59]                     │ debg TestSubjects.exists(superDatePickerToggleQuickMenuButton)
[00:22:59]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="superDatePickerToggleQuickMenuButton"]') with timeout=20000
[00:22:59]                     │ debg TestSubjects.exists(superDatePickerShowDatesButton)
[00:22:59]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="superDatePickerShowDatesButton"]') with timeout=2500
[00:22:59]                     │ debg TestSubjects.click(superDatePickerShowDatesButton)
[00:22:59]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerShowDatesButton"]') with timeout=10000
[00:22:59]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerShowDatesButton"]') with timeout=10000
[00:22:59]                     │ debg TestSubjects.exists(superDatePickerstartDatePopoverButton)
[00:22:59]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="superDatePickerstartDatePopoverButton"]') with timeout=2500
[00:22:59]                     │ debg TestSubjects.click(superDatePickerendDatePopoverButton)
[00:22:59]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerendDatePopoverButton"]') with timeout=10000
[00:22:59]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerendDatePopoverButton"]') with timeout=10000
[00:22:59]                     │ debg Find.findByCssSelector('div.euiPopover__panel-isOpen') with timeout=10000
[00:22:59]                     │ debg TestSubjects.click(superDatePickerAbsoluteTab)
[00:22:59]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteTab"]') with timeout=10000
[00:22:59]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteTab"]') with timeout=10000
[00:22:59]                     │ debg TestSubjects.click(superDatePickerAbsoluteDateInput)
[00:22:59]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:22:59]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:00]                     │ debg TestSubjects.setValue(superDatePickerAbsoluteDateInput, Sep 23, 2015 @ 18:31:44.000)
[00:23:00]                     │ debg TestSubjects.click(superDatePickerAbsoluteDateInput)
[00:23:00]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:00]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:00]                     │ debg ... sleep(500) start
[00:23:01]                     │ debg ... sleep(500) end
[00:23:01]                     │ debg TestSubjects.click(superDatePickerstartDatePopoverButton)
[00:23:01]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerstartDatePopoverButton"]') with timeout=10000
[00:23:01]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerstartDatePopoverButton"]') with timeout=10000
[00:23:01]                     │ debg Find.waitForElementStale with timeout=10000
[00:23:01]                     │ debg Find.findByCssSelector('div.euiPopover__panel-isOpen') with timeout=10000
[00:23:01]                     │ debg TestSubjects.click(superDatePickerAbsoluteTab)
[00:23:01]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteTab"]') with timeout=10000
[00:23:01]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteTab"]') with timeout=10000
[00:23:01]                     │ debg TestSubjects.click(superDatePickerAbsoluteDateInput)
[00:23:01]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:01]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:01]                     │ debg TestSubjects.setValue(superDatePickerAbsoluteDateInput, Sep 19, 2015 @ 06:31:44.000)
[00:23:01]                     │ debg TestSubjects.click(superDatePickerAbsoluteDateInput)
[00:23:01]                     │ debg Find.clickByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:01]                     │ debg Find.findByCssSelector('[data-test-subj="superDatePickerAbsoluteDateInput"]') with timeout=10000
[00:23:02]                     │ debg TestSubjects.exists(superDatePickerApplyTimeButton)
[00:23:02]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="superDatePickerApplyTimeButton"]') with timeout=2500
[00:23:04]                     │ debg --- retry.tryForTime error: [data-test-subj="superDatePickerApplyTimeButton"] is not displayed
[00:23:05]                     │ debg TestSubjects.click(querySubmitButton)
[00:23:05]                     │ debg Find.clickByCssSelector('[data-test-subj="querySubmitButton"]') with timeout=10000
[00:23:05]                     │ debg Find.findByCssSelector('[data-test-subj="querySubmitButton"]') with timeout=10000
[00:23:05]                     │ proc [kibana]   log   [11:34:29.748] [info][data][data][plugins] Get strategy es
[00:23:05]                     │ proc [kibana]   log   [11:34:29.748] [info][data][data][plugins] search _all
[00:23:05]                     │ debg Find.waitForElementStale with timeout=10000
[00:23:05]                     │ debg TestSubjects.exists(globalLoadingIndicator-hidden)
[00:23:05]                     │ debg Find.existsByCssSelector('[data-test-subj="globalLoadingIndicator-hidden"]') with timeout=100000
[00:23:05]                   └-> should render different data in response to filter change
[00:23:05]                     └-> "before each" hook: global before each
[00:23:05]                     │ debg TestSubjects.find(vega-editor)
[00:23:05]                     │ debg Find.findByCssSelector('[data-test-subj="vega-editor"]') with timeout=10000
[00:23:06]                     │ debg TestSubjects.find(visualizationLoader)
[00:23:06]                     │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:23:06]                     │ debg Before Rendering count 2
[00:23:06]                     │ debg TestSubjects.clickWhenNotDisabled(visualizeEditorRenderButton)
[00:23:06]                     │ debg Find.clickByCssSelectorWhenNotDisabled('[data-test-subj="visualizeEditorRenderButton"]') with timeout=10000
[00:23:06]                     │ debg Find.findByCssSelector('[data-test-subj="visualizeEditorRenderButton"]') with timeout=10000
[00:23:06]                     │ debg Waiting up to 20000ms for rendering count to be greater than or equal to [3]...
[00:23:06]                     │ debg TestSubjects.find(visualizationLoader)
[00:23:06]                     │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:23:06]                     │ debg -- currentRenderingCount=2
[00:23:07]                     │ debg TestSubjects.find(visualizationLoader)
[00:23:07]                     │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:23:07]                     │ debg -- currentRenderingCount=3
[00:23:07]                     │ debg Waiting up to 20000ms for rendering count to stabilize...
[00:23:07]                     │ debg TestSubjects.find(visualizationLoader)
[00:23:07]                     │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:23:07]                     │ debg -- firstCount=3
[00:23:07]                     │ debg ... sleep(2000) start
[00:23:09]                     │ debg ... sleep(2000) end
[00:23:09]                     │ debg TestSubjects.find(visualizationLoader)
[00:23:09]                     │ debg Find.findByCssSelector('[data-test-subj="visualizationLoader"]') with timeout=10000
[00:23:09]                     │ debg -- secondCount=3
[00:23:09]                     │ debg Find.findByCssSelector('[aria-label^="Y-axis"]') with timeout=10000
[00:23:19]                     │ info Taking screenshot "/dev/shm/workspace/parallel/20/kibana/test/functional/screenshots/failure/visualize app  vega chart in visualize app vega chart with filters should render different data in response to filter change.png"
[00:23:19]                     │ info Current URL is: http://localhost:61201/app/visualize#/create?type=vega&_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:%272015-09-19T06:31:44.000Z%27,to:%272015-09-23T18:31:44.000Z%27))&_a=(filters:!(),linked:!f,query:(language:kuery,query:%27%27),uiState:(),vis:(aggs:!(),params:(spec:%27%7B%0A%2F*%0A%0AWelcome%20to%20Vega%20visualizations.%20%20Here%20you%20can%20design%20your%20own%20dataviz%20from%20scratch%20using%20a%20declarative%20language%20called%20Vega,%20or%20its%20simpler%20form%20Vega-Lite.%20%20In%20Vega,%20you%20have%20the%20full%20control%20of%20what%20data%20is%20loaded,%20even%20from%20multiple%20sources,%20how%20that%20data%20is%20transformed,%20and%20what%20visual%20elements%20are%20used%20to%20show%20it.%20%20Use%20help%20icon%20to%20view%20Vega%20examples,%20tutorials,%20and%20other%20docs.%20%20Use%20the%20wrench%20icon%20to%20reformat%20this%20text,%20or%20to%20remove%20comments.%0A%0AThis%20example%20graph%20shows%20the%20document%20count%20in%20all%20indexes%20in%20the%20current%20time%20range.%20%20You%20might%20need%20to%20adjust%20the%20time%20filter%20in%20the%20upper%20right%20corner.%0A*%2F%0A%0A%20%20$schema:%20https:%2F%2Fvega.github.io%2Fschema%2Fvega-lite%2Fv4.json%0A%20%20title:%20Event%20counts%20from%20all%20indexes%0A%0A%20%20%2F%2F%20Define%20the%20data%20source%0A%20%20data:%20%7B%0A%20%20%20%20url:%20%7B%0A%2F*%0AAn%20object%20instead%20of%20a%20string%20for%20the%20%22url%22%20param%20is%20treated%20as%20an%20Elasticsearch%20query.%20Anything%20inside%20this%20object%20is%20not%20part%20of%20the%20Vega%20language,%20but%20only%20understood%20by%20Kibana%20and%20Elasticsearch%20server.%20This%20query%20counts%20the%20number%20of%20documents%20per%20time%20interval,%20assuming%20you%20have%20a%20@timestamp%20field%20in%20your%20data.%0A%0AKibana%20has%20a%20special%20handling%20for%20the%20fields%20surrounded%20by%20%22%25%22.%20%20They%20are%20processed%20before%20the%20the%20query%20is%20sent%20to%20Elasticsearch.%20This%20way%20the%20query%20becomes%20context%20aware,%20and%20can%20use%20the%20time%20range%20and%20the%20dashboard%20filters.%0A*%2F%0A%0A%20%20%20%20%20%20%2F%2F%20Apply%20dashboard%20context%20filters%20when%20set%0A%20%20%20%20%20%20%25context%25:%20true%0A%20%20%20%20%20%20%2F%2F%20Filter%20the%20time%20picker%20(upper%20right%20corner)%20with%20this%20field%0A%20%20%20%20%20%20%25timefield%25:%20@timestamp%0A%0A%2F*%0ASee%20.search()%20documentation%20for%20:%20%20https:%2F%2Fwww.elastic.co%2Fguide%2Fen%2Felasticsearch%2Fclient%2Fjavascript-api%2Fcurrent%2Fapi-reference.html%23api-search%0A*%2F%0A%0A%20%20%20%20%20%20%2F%2F%20Which%20index%20to%20search%0A%20%20%20%20%20%20index:%20_all%0A%20%20%20%20%20%20%2F%2F%20Aggregate%20data%20by%20the%20time%20field%20into%20time%20buckets,%20counting%20the%20number%20of%20documents%20in%20each%20bucket.%0A%20%20%20%20%20%20body:%20%7B%0A%20%20%20%20%20%20%20%20aggs:%20%7B%0A%20%20%20%20%20%20%20%20%20%20time_buckets:%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20date_histogram:%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%2F%2F%20Use%20date%20histogram%20aggregation%20on%20@timestamp%20field%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20field:%20@timestamp%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%2F%2F%20The%20interval%20value%20will%20depend%20on%20the%20daterange%20picker%20(true),%20or%20use%20an%20integer%20to%20set%20an%20approximate%20bucket%20count%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20interval:%20%7B%25autointerval%
25:%20true%7D%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%2F%2F%20Make%20sure%20we%20get%20an%20entire%20range,%20even%20if%20it%20has%20no%20data%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20extended_bounds:%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%2F%2F%20Use%20the%20current%20time%20range!%27s%20start%20and%20end%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20min:%20%7B%25timefilter%25:%20%22min%22%7D%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20max:%20%7B%25timefilter%25:%20%22max%22%7D%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%2F%2F%20Use%20this%20for%20linear%20(e.g.%20line,%20area)%20graphs.%20%20Without%20it,%20empty%20buckets%20will%20not%20show%20up%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20min_doc_count:%200%0A%20%20%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%2F%2F%20Speed%20up%20the%20response%20by%20only%20including%20aggregation%20results%0A%20%20%20%20%20%20%20%20size:%200%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%2F*%0AElasticsearch%20will%20return%20results%20in%20this%20format:%0A%0Aaggregations:%20%7B%0A%20%20time_buckets:%20%7B%0A%20%20%20%20buckets:%20%5B%0A%20%20%20%20%20%20%7B%0A%20%20%20%20%20%20%20%20key_as_string:%202015-11-30T22:00:00.000Z%0A%20%20%20%20%20%20%20%20key:%201448920800000%0A%20%20%20%20%20%20%20%20doc_count:%200%0A%20%20%20%20%20%20%7D,%0A%20%20%20%20%20%20%7B%0A%20%20%20%20%20%20%20%20key_as_string:%202015-11-30T23:00:00.000Z%0A%20%20%20%20%20%20%20%20key:%201448924400000%0A%20%20%20%20%20%20%20%20doc_count:%200%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20...%0A%20%20%20%20%5D%0A%20%20%7D%0A%7D%0A%0AFor%20our%20graph,%20we%20only%20need%20the%20list%20of%20bucket%20values.%20%20Use%20the%20format.property%20to%20discard%20everything%20else.%0A*%2F%0A%20%20%20%20format:%20%7Bproperty:%20%22aggregations.time_buckets.buckets%22%7D%0A%20%20%7D%0A%0A%20%20%2F%2F%20%22mark%22%20is%20the%20graphics%20element%20used%20to%20show%20our%20data.%20%20Other%20mark%20values%20are:%20area,%20bar,%20circle,%20line,%20point,%20rect,%20rule,%20square,%20text,%20and%20tick.%20%20See%20https:%2F%2Fvega.github.io%2Fvega-lite%2Fdocs%2Fmark.html%0A%20%20mark:%20line%0A%0A%20%20%2F%2F%20%22encoding%22%20tells%20the%20%22mark%22%20what%20data%20to%20use%20and%20in%20what%20way.%20%20See%20https:%2F%2Fvega.github.io%2Fvega-lite%2Fdocs%2Fencoding.html%0A%20%20encoding:%20%7B%0A%20%20%20%20x:%20%7B%0A%20%20%20%20%20%20%2F%2F%20The%20%22key%22%20value%20is%20the%20timestamp%20in%20milliseconds.%20%20Use%20it%20for%20X%20axis.%0A%20%20%20%20%20%20field:%20key%0A%20%20%20%20%20%20type:%20temporal%0A%20%20%20%20%20%20axis:%20%7Btitle:%20false%7D%20%2F%2F%20Customize%20X%20axis%20format%0A%20%20%20%20%7D%0A%20%20%20%20y:%20%7B%0A%20%20%20%20%20%20%2F%2F%20The%20%22doc_count%22%20is%20the%20count%20per%20bucket.%20%20Use%20it%20for%20Y%20axis.%0A%20%20%20%20%20%20field:%20doc_count%0A%20%20%20%20%20%20type:%20quantitative%0A%20%20%20%20%20%20axis:%20%7Btitle:%20%22Document%20count%22%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A%22config%22:%20%7B%20%22kibana%22:%20%7B%22renderer%22:%20%22svg%22%7D%20%7D,%27),title:%27%27,type:vega))
[00:23:19]                     │ info Saving page source to: /dev/shm/workspace/parallel/20/kibana/test/functional/failure_debug/html/visualize app  vega chart in visualize app vega chart with filters should render different data in response to filter change.html
[00:23:19]                     └- ✖ fail: visualize app  vega chart in visualize app vega chart with filters should render different data in response to filter change
[00:23:19]                     │      TimeoutError: Waiting for element to be located By(css selector, [aria-label^="Y-axis"])
[00:23:19]                     │ Wait timed out after 10013ms
[00:23:19]                     │       at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
[00:23:19]                     │       at process._tickCallback (internal/process/next_tick.js:68:7)
[00:23:19]                     │ 
[00:23:19]                     │ 

Stack Trace

{ TimeoutError: Waiting for element to be located By(css selector, [aria-label^="Y-axis"])
Wait timed out after 10013ms
    at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
    at process._tickCallback (internal/process/next_tick.js:68:7) name: 'TimeoutError', remoteStacktrace: '' }

Build metrics

✅ unchanged

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@neptunian
Contributor

Looks good. Wondering if there's been any thought given to @nchaulet's idea of using a SO to track the setup status, since we haven't solved the issue of multiple Kibana instances running setup?

@jfsiii
Contributor Author

jfsiii commented Aug 20, 2020

@neptunian there's an issue tracking it: #70333. Take a look and please leave any thoughts on the open questions.

@jfsiii jfsiii merged commit caf3bac into elastic:master Aug 20, 2020
jfsiii pushed a commit that referenced this pull request Aug 21, 2020
…) (#75583)

* add retries for registry requests.

works, afaict. no tests. one TS issue.

* Fix TS issue. Add link to node-fetch error docs

* Restore some accidentally deleted code.

* Add more comments. Remove logging.

* Add tests for plugin setup service & handlers

* Add tests for Registry retry logic

* Extract setup retry logic to separate function/file

* Add tests for setup retry logic

```
  firstSuccessOrTryAgain
    ✓ reject/throws is called again & its value returned (18ms)
    ✓ the first success value is cached (2ms)
```

* More straightforward(?) tests for setup caching

* Revert cached setup. Still limit 1 call at a time

* New name & tests for one-at-a-time /setup behavior

`firstPromiseBlocksAndFufills` for "the first promise created blocks others from being created, then fufills all with that first result"

* More (better?) renaming

* Fix name in test description

* Fix spelling typo.

* Remove registry retry code & tests

* Use async fn's .catch to avoid unhandled rejection

Add explicit `isPending` value instead of overloading role of `status`. Could probably do without it, but it makes the intent more clear.

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
thomasneirynck pushed a commit to thomasneirynck/kibana that referenced this pull request Aug 21, 2020
elastic#75372)

jfsiii pushed a commit that referenced this pull request Aug 22, 2020
…) (#75587)

# Conflicts:
#	x-pack/plugins/ingest_manager/server/services/setup.ts

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
Labels
release_note:skip Skip the PR/issue when compiling release notes Team:Fleet Team label for Observability Data Collection Fleet team v7.9.1 v7.10.0 v8.0.0
Development

Successfully merging this pull request may close these issues.

[Ingest Manager] Plugin won't retry failed initial setup
5 participants