
🌱 Add Pathfinder assessment migration to CLI #507

Merged
merged 35 commits
Oct 30, 2023

Conversation


@aufi aufi commented Oct 6, 2023

Extending the tackle CLI tool with an option to export Pathfinder assessments and import them to the Konveyor API assessments endpoint.

$ ./tackle migrate-assessments

More information about CLI usage: https://github.com/konveyor/tackle2-hub/tree/main/hack/tool#readme

Related to #489


Signed-off-by: Marek Aufart <maufart@redhat.com>
hack/tool/tackle Outdated
@@ -722,6 +725,65 @@ class TackleTool:
        # Push the updated assessment
        apiJSON(self.tackle2Url + tackle2path("assessments/%d" % assmnt2['id']), self.tackle2Token, data=assmnt2, method='PATCH', ignoreErrors=ignoreErrors)

    def migrateAssessments(self, assessmentsFileName="assessments", ignoreErrors=False):
        cnt = 0
        ## Export Pathfinder data for each Application
@aufi aufi (Member Author) Oct 6, 2023

@jmontleon @jortel Thinking about usage in the operator upgrade:

  • option 1 - first dump Pathfinder data on Konveyor 0.2, then upgrade Konveyor and import the assessments to 0.3 as part of the upgrade steps, or
  • option 2 - upgrade Konveyor but keep Pathfinder running, then run the assessment migration live, directly from Pathfinder (without the Hub proxy) to the Hub API, without creating temp files, and kill Pathfinder after the migration succeeds?

Member

I'm trying to think about this from a workflow where the operator is orchestrating it. Option 2 sounds cleaner to me, from a logic standpoint. Is it possible to run the migration utility process within a pod (maybe a job would be appropriate) that can do the migration between pathfinder and hub?

Member Author

A job sounds good to me too to execute the python migration script (@dymurray, was it planned to have the script part of the hub image or a special one?). It'd need to get endpoints for Pathfinder (probably the internal one) and Hub API to run the migration.

Contributor

Perhaps I am being confused by the terminology. When I read execute the python migration script, I feel like we are rolling something special for this.
I think the original plan is the simplest:

  • Add functionality to the export/import tool to work with Assessments. This is what the tool is for.
  • Add support for hub API version 3 (for all things, including assessment).
  • Package the tool with the operator.
  • The operator uses the tool to export/import assessments.

This is an application-level migration that should be performed/orchestrated by the operator, not the hub.

The workflow:

  1. The operator uses the tool to export assessments through the hub API (this is how the tool works).
  2. Upgrade tackle.
  3. Import the assessments.
  4. On success, get rid of pathfinder.

If having the operator do this in a job is better, it seems to just add moving parts, but I'm open to understanding why it's better.
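The four-step workflow above can be sketched as a minimal orchestration. This is illustrative only: apart from `migrate-assessments` (added in this PR), the subcommands and the upgrade/cleanup steps are hypothetical placeholders, not real commands.

```python
import subprocess


def workflow_commands(tool="./tackle", data_dir="./assessments-dump"):
    """Build the export -> upgrade -> import -> cleanup command list."""
    return [
        [tool, "export", "-d", data_dir],               # 1. export via the hub API (hypothetical subcommand)
        ["operator-upgrade-tackle"],                    # 2. placeholder for the operator's upgrade step
        [tool, "migrate-assessments", "-d", data_dir],  # 3. import assessments (subcommand from this PR)
        ["operator-remove-pathfinder"],                 # 4. placeholder: retire pathfinder on success
    ]


def run_workflow(commands, runner=subprocess.run):
    """Execute each step in order, stopping on the first failure."""
    for cmd in commands:
        runner(cmd, check=True)
```

Whether these steps run directly in the operator or inside a job, the sequencing stays the same; only the execution environment changes.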

Member Author

Previous comments also included the "execution" part of the assessments migration, which will be part of the operator repo, so that was misleading; sorry for mixing it in.

One more open question is about credentials/authorization: the tool expects a username & password, which is not known to the Hub (AFAIK); only Keycloak knows it. But as @mansam suggested, an addon token could be used for this assessment-migration purpose. Would this addon token (which is known to the operator too) be suitable for the tool to use when calling the Hub API, instead of login/password, @jortel?

Contributor

The hub has a builtin auth provider that is used for addon tokens.
Yes, we can have the operator build/sign an auth token with assessment scopes, signed using the key in the secret. I will prototype that.
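As an illustration only, minting an HS256 token with assessment scopes from a shared secret could look like the following. The claim names and scope strings are assumptions for the sketch, not the hub's actual builtin provider.

```python
import base64
import hashlib
import hmac
import json
import time


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, per the JWT convention."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def sign_token(secret: bytes, scopes, ttl=3600) -> str:
    """Build a compact HS256 JWT; claim/scope names are hypothetical."""
    header = {"alg": "HS256", "typ": "JWT"}
    claims = {"scope": " ".join(scopes), "exp": int(time.time()) + ttl}
    signing_input = (
        b64url(json.dumps(header).encode())
        + "."
        + b64url(json.dumps(claims).encode())
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)
```

The operator would read the signing key from the secret and pass the resulting token to the migration tool via `--token`.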

hack/tool/tackle Outdated
hack/tool/tackle Outdated
        # Skip if Assessment for given Application already exists
        if len(apiJSON(self.tackle2Url + "/hub/applications/%d/assessments" % passmnt["applicationId"], self.tackle2Token, data={"questionnaire": {"id": 1}})) > 0:
            print(" Assessment already exists, skipping.")
            continue
Member Author

If there is already an assessment on the application, the import is skipped, so the script can run multiple times without failing.
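The skip-if-exists guard can be sketched with the HTTP lookup stubbed out, which makes the idempotency logic easy to test in isolation; the function names here are illustrative, not the tool's actual API.

```python
def should_migrate(app_id, existing_assessments_for):
    """Return True only when the application has no assessment yet."""
    return len(existing_assessments_for(app_id)) == 0


def migrate_all(pathfinder_assessments, existing_assessments_for, push):
    """Push each Pathfinder assessment unless one already exists; rerunnable."""
    migrated = 0
    for passmnt in pathfinder_assessments:
        if not should_migrate(passmnt["applicationId"], existing_assessments_for):
            print("  Assessment already exists, skipping.")
            continue
        push(passmnt)
        migrated += 1
    return migrated
```

Running the loop a second time migrates nothing, since every application now reports an existing assessment.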


aufi commented Oct 6, 2023

The export part is commented out until we decide the best-matching scenario for execution with the operator upgrade. Examples from the import part:

$ ./tackle migrate-assessments -d ./tackle-data-010223
Getting auth token via Hub from http://192.168.39.5
Starting Pathfinder Assessments to Konveyor Assessment migration.
Processing Pathfinder assessments..
# Assessment for Application 2
Category 1 Application details
  Question 1 Does the application development team understand and actively develop the application?
    Answer 5 greenfield application
  Question 2 How is the application supported in production?
    Answer 5 DevOps approach with the same team building the application and supporting it in production
  Question 3 How much time passes from when code is committed until the application is deployed to production?
    Answer 5 1-7 days
  Question 4 How often is the application deployed to production?
    Answer 4 Weekly
  Question 5 What is the application's mean time to recover (MTTR) from failure in a production environment?
    Answer 5 Less than 1 hour
  Question 6 Does the application have legal and/or licensing requirements?
    Answer 4 None
  Question 7 Which model best describes the application architecture?
    Answer 5 Independently deployable components
Category 2 Application dependencies
  Question 1 Does the application require specific hardware?
    Answer 4 Requires CPU that is supported by Red Hat
  Question 2 What operating system does the application require?
    Answer 5 Standard Linux distribution
  Question 3 Does the vendor provide support for a third-party component running in a container?
    Answer 6 No third-party components required
  Question 4 Incoming/northbound dependencies
    Answer 3 Many dependencies exist, can be changed because the systems are internally managed
  Question 5 Outgoing/southbound dependencies
    Answer 5 No outgoing/southbound dependencies
Category 3 Application architecture
  Question 1 How resilient is the application? How well does it recover from outages and restarts?
    Answer 4 Application employs resilient architecture patterns (examples: circuit breakers, retry mechanisms)
  Question 2 How does the external world communicate with the application?
    Answer 5 HTTP/HTTPS
  Question 3 How does the application manage its internal state?
    Answer 5 Stateless or ephemeral container storage
  Question 4 How does the application handle service discovery?
    Answer 4 Uses Kubernetes DNS name resolution
  Question 5 How is the application clustering managed?
    Answer 4 No cluster management required
Category 4 Application observability
  Question 1 How does the application use logging and how are the logs accessed?
    Answer 6 Logs are configurable (example: can be sent to stdout)
  Question 2 Does the application provide metrics?
    Answer 4 Metrics exposed using a third-party solution (examples: Dynatrace, AppDynamics)
  Question 3 How easy is it to determine the application's health and readiness to handle traffic?
    Answer 4 Dedicated, independent liveness and readiness endpoints
  Question 4 What best describes the application's runtime characteristics?
    Answer 4 Intermittent traffic with predictable CPU and memory usage
  Question 5 How long does it take the application to be ready to handle traffic?
    Answer 5 Less than 10 seconds
Category 5 Application cross-cutting concerns
  Question 1 How is the application tested?
    Answer 4 Highly repeatable automated testing (examples: unit, integration, smoke tests) before deploying to production; modern test practices are followed
  Question 2 How is the application configured?
    Answer 6 Configuration loaded from files in a single configurable location; environment variables used
  Question 4 How is the application deployed?
    Answer 6 Fully automated (GitOps), blue-green, or canary deployment
  Question 5 Where is the application deployed?
    Answer 6 Hybrid cloud (public and private cloud providers)
  Question 6 How mature is the containerization process, if any?
    Answer 4 Proficient with containers and container platforms (examples: Swarm, Kubernetes)
  Question 3 How does the application acquire security keys or certificates?
    Answer 7 Not required
PUT filled Assessment.
Done. 1 new Assessment(s) for Applications were migrated!
$ ./tackle migrate-assessments -d ./tackle-data-010223
Getting auth token via Hub from http://192.168.39.5
Starting Pathfinder Assessments to Konveyor Assessment migration.
Processing Pathfinder assessments..
# Assessment for Application 2
  Assessment already exists, skipping.
Done. Assessments for 0 Application(s) were migrated!

@aufi aufi changed the title Add Pathfinder assessment migration to CLI 🌱 Add Pathfinder assessment migration to CLI Oct 6, 2023
@aufi aufi requested review from mansam, jortel and dymurray October 6, 2023 12:54
hack/tool/tackle Outdated
hack/tool/tackle Outdated
@aufi aufi requested a review from jmontleon October 9, 2023 13:56
Signed-off-by: Marek Aufart <maufart@redhat.com>
@aufi aufi requested a review from mansam October 11, 2023 12:21
hack/tool/tackle Outdated
            print(" Assessment already exists, skipping.")
            continue

        # Prepare new Assessment
Collaborator

This all looks great.

aufi added a commit to konveyor/operator that referenced this pull request Oct 19, 2023
Since the Pathfinder component is being replaced by the Konveyor Hub API,
already existing Assessments need to be migrated. Adding a script based
on Hub's CLI tool
https://github.com/konveyor/tackle2-hub/tree/main/hack/tool that
connects to a running Pathfinder and migrates Assessments to a running
Hub API v0.3.

This script is expected to be executed by the operator during the
Konveyor 0.2->0.3 upgrade.

Example usage (only the auth token argument is optional; both URLs are
required):

```
$ ./migrate-pathfinder-assessments --pathfinder-url http://pathfinder.svc --hub-base-url http://hub.svc:8080 --token ""
```

Related to konveyor/tackle2-hub#489 and
konveyor/tackle2-hub#507

---------

Signed-off-by: Marek Aufart <maufart@redhat.com>
Signed-off-by: Marek Aufart <maufart@redhat.com>
@aufi aufi marked this pull request as ready for review October 27, 2023 12:38

aufi commented Oct 27, 2023

Updated the assessment migration to query pathfinder and migrate to the Hub API without temporary files. Also added a --token option to provide an auth token, which skips login with a username/password that might not be known. Ready for review.
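A minimal sketch of that temp-file-free flow, with the Pathfinder fetch and Hub push injected as callables; the names and shapes here are assumptions for illustration, not the tool's actual code.

```python
def migrate_live(fetch_pathfinder, push_to_hub, token=None):
    """Stream assessments Pathfinder -> Hub without writing to disk.

    fetch_pathfinder: callable yielding assessment dicts.
    push_to_hub: callable taking (assessment, headers=...).
    token: optional pre-built auth token, replacing username/password login.
    """
    headers = {"Authorization": "Bearer %s" % token} if token else {}
    count = 0
    for assessment in fetch_pathfinder():
        push_to_hub(assessment, headers=headers)
        count += 1
    return count
```

Because nothing is buffered to disk, the migration can run in a short-lived process (or job) holding only the two endpoints and the token.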

@aufi aufi requested a review from mansam October 27, 2023 14:49
@mansam mansam (Collaborator) left a comment

LGTM. I recommend rebasing on main to clear the go vet errors.

jortel and others added 9 commits October 30, 2023 08:55
The `/applications/:id/analysis` and `/applications/:id/analysis/report` endpoints
returned the report for analysis.ID=:id (the application ID) instead of the
last analysis for application.ID=:id.

Potentially related to: https://issues.redhat.com/browse/MTA-1344

Signed-off-by: Jeff Ortel <jortel@redhat.com>
Better support for test-run analysis with seed PRs using
`--build-arg SEED_PROJECT=<url>`
`--build-arg SEED_BRANCH=<branch>`

Example:
```
$ docker build . -t quay.io/jortel/tackle2-hub:latest \
  --build-arg 'SEED_PROJECT=jortel/tackle2-seed' \
  --build-arg 'SEED_BRANCH=add-targets' \
  && docker push quay.io/jortel/tackle2-hub:latest
```

Signed-off-by: Jeff Ortel <jortel@redhat.com>
Marshalling the Assessment resource's `Sections` field unconditionally
results in populating the model's `Sections` with the byte string
`null`, which breaks the test for empty sections, making it always look
like the assessment is being imported with sections "as-is". To fix
this, only marshal the sections in the Assessment's `Model()` method if
they are not empty. Additionally, replace the `if m.Sections == nil`
test with a test for length, which works on nil as well as empty slices.

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Previously PrepareForArchetype was using only the membership criteria to
autofill assessments, but the enhancement specifies that it should use
the archetype tags as well as the criteria tags.

Fixes https://issues.redhat.com/browse/MTA-1396

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Only changing the `required` field is permitted for builtin
questionnaires, all other changes and deletion are disallowed.

---------

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
The HTML report (tarball) is composed/streamed by the hub in steps:
1. Add the static report directory (template).
2. Add the composed `output.js` file which contains the analysis data.

Using this method, the ReportWriter was adding the `output.js` (entry)
twice in the tar. First from the static report (template) and the second
generated by the hub.
The tar command line seems to deal with this by overwriting the file.
However the _extract_ functionality used by browsers ignores the 2nd
occurrence.

The fix is to filter out the 1st output.js when using
tar.Writer.AddDir().

The `Filter` in the `tar` package only supported _include_ filtering.
This PR updates the `tar.Filter` to support both _included_ and
_excluded_ patterns.

Refitted and tested the /bucket.

---------

Signed-off-by: Jeff Ortel <jortel@redhat.com>
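The include/exclude pattern matching described for `tar.Filter` can be illustrated in Python; the hub's actual implementation is Go, and the exact pattern semantics here are an assumption.

```python
import fnmatch


class Filter:
    """Path filter with both include and exclude glob patterns.

    Excludes win over includes; with no include patterns, everything
    not excluded passes.
    """

    def __init__(self, included=None, excluded=None):
        self.included = included or []
        self.excluded = excluded or []

    def match(self, path):
        if self.excluded and any(fnmatch.fnmatch(path, p) for p in self.excluded):
            return False
        if self.included:
            return any(fnmatch.fnmatch(path, p) for p in self.included)
        return True
```

Applied to the bug above, excluding `*/output.js` while adding the static report directory keeps the template's copy out of the tarball, so only the hub-generated `output.js` entry is written.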
This was missing from the Assessment/Questionnaire sections and is
necessary to completely implement the enhancement.

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
jortel and others added 23 commits October 30, 2023 08:55
The static report JS needs the app ID to be a string.

Signed-off-by: Jeff Ortel <jortel@redhat.com>
Updating the Maven Identity sample to use the (currently) correct "maven" kind
instead of "mvn", to make it work with the addon
https://github.com/konveyor/tackle2-addon-analyzer/blob/61027b29f36ffa5b884ceac81c61d74341bdd762/cmd/mode.go#L151-L153

Signed-off-by: Marek Aufart <maufart@redhat.com>
Add the ticket API Test

---------

Signed-off-by: Yash Khare <yash2010118@akgec.ac.in>
Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Addresses: https://bugzilla.redhat.com/show_bug.cgi?id=2242803.

CVE which impacts http/2 and rapid reset DDOS attacks.

Signed-off-by: Dylan Murray <dymurray@redhat.com>
Using the container status `Reason` for more accurate reporting.
```
state: Failed
image: quay.io/jortel/tackle2-addon-analyzer:debug
pod: konveyor-tackle/task-13-gcmjs
retries: 1
started: 2023-10-16T10:36:30.221282042-07:00
terminated: 2023-10-16T10:36:40.301254088-07:00
bucket:
    id: 17
    name: ""
errors:
    - severity: Error
      description: 'Pod failed: OOMKilled'
```

Also, the RWX should be disabled by default.

---------

Signed-off-by: Jeff Ortel <jortel@redhat.com>
Fixes FOSSA complaint.

Signed-off-by: Jeff Ortel <jortel@redhat.com>
`ArchivedIssue` in the API package being an alias to another type
confuses the swag tool. To mitigate this, I had to overwrite the swagger
type of that field on the Analysis resource with `object`. This means
that the api documentation for that field will be unhelpfully vague, but
it allows the documentation to build.

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
The refactor to using the new `tar` package accidentally removed setting
the X-Directory:Expand header.
As a result, the client does not expand the downloaded directory (tarball).

---

Fixed API tests. The bucket directory test really needs 2+ files to be
accurate. It missed this issue.
Added the task bucket (subresource) test for symmetry.

---------

Signed-off-by: Jeff Ortel <jortel@redhat.com>
Fixing resources cleanup at the end of the test, by sending the
identity's ID to `Identity.Delete` instead of the tracker's ID

Signed-off-by: Maayan Hadasi <mguetta@redhat.com>
…konveyor#530)

* Add `confidence` and `risk` fields to Application and Archetype
resources. Overall confidence calculated using the algorithm from
Pathfinder for each assessment, then summed and divided by total number
of assessments. Overall risk level is `green` if all assessments are
`green`, otherwise the overall risk is `red`, `unknown`, or `yellow` in
that priority order if any assessments have that risk level.
* Moved some assessment status logic out of the resource and into the
assessment package where it belongs.
* Added ArchetypeResolver to better encapsulate the related behaviors
and for symmetry with ApplicationResolver.
* Simplified the Application and Archetype endpoints by replacing
several related resource `With*` methods with `WithResolver`.
* Improved consistency of the APIs within the `assessment` package.

---------

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
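The overall-risk rollup rule described above can be sketched as follows; this is a hedged reading of the commit message, not the hub's actual code, and the treatment of an empty input is an assumption.

```python
# Priority order when not all assessments are green: red > unknown > yellow.
RISK_PRIORITY = ["red", "unknown", "yellow"]


def overall_risk(assessment_risks):
    """Green only if every assessment is green; otherwise the
    highest-priority risk level present among the assessments."""
    if assessment_risks and all(r == "green" for r in assessment_risks):
        return "green"
    for risk in RISK_PRIORITY:
        if risk in assessment_risks:
            return risk
    return "unknown"  # assumed fallback for no assessments
```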
Fixes konveyor#527

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Add a hack script to generate JWT tokens for _builtin_ auth. Likely used
by actors such as the operator for migration related to upgrade.

Related:
- auth.Validator signature changed to return error. This is more
flexible and _failures_ can be propagated.
- auth.Builtin.Authenticate() returned NotAuthenticated when it should
have returned NotValid, for clarity.
- auth.NotValid includes `Reason`.

---------

Signed-off-by: Jeff Ortel <jortel@redhat.com>
closes konveyor#539

---------

Signed-off-by: Jeff Ortel <jortel@redhat.com>
This also adds some convenience methods to the `Setting` model to make
reading and updating their values somewhat less tedious.

Fixes konveyor#505

---------

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Signed-off-by: Sam Lucidi <slucidi@redhat.com>
The yaml encoder/decoder needs to be explicitly told to inline the
`Resource` struct's fields on each of the API resources; otherwise,
attempts to bind requests that include Resource fields (`id`,
`createTime`, `createUser`, `updateUser`) fail.

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
The assessment stakeholder relationships need to be preloaded so that
the `applications/id/assessments` and `archetypes/id/assessments`
subresources populate their association fields properly.

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
This PR represents a first step towards improving our automatically
generated API documentation. It includes the following changes:

* Adds general API level doc comments that are a requirement for
Swagger/OpenAPI.
* Adds godoc comments for parameters where they were missing. This was
causing Swagger validation to fail and was preventing QE from
successfully converting our Swagger 2 documentation to OpenAPI 3.0.
* Fixes some incorrect parameter documentation.
* Adds a new target to convert our Swagger 2 documentation to OpenAPI
3.0 using the Swagger conversion API at `converter.swagger.io`.
* Corrects the existing docs-swagger target command invocation.
* Updates the redoc target to use the newly generated OpenAPI 3.0
documentation.

---------

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
Signed-off-by: Sam Lucidi <slucidi@redhat.com>
This was missed in the original PR. konveyor#544

Signed-off-by: Sam Lucidi <slucidi@redhat.com>
@aufi aufi merged commit 2e323af into konveyor:main Oct 30, 2023
10 of 11 checks passed