GCP package billing datastream tests #2312
Conversation
💔 Build Failed
import csv
I will remove this script, as the `bq` CLI tool makes it easy to export in JSON format (which is not possible from the GCP console).
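For reference, the kind of export mentioned above can be done with `bq` roughly as follows. This is a sketch: the project, dataset, table, and bucket names are made-up placeholders, and the commands require an authenticated Cloud SDK setup.

```shell
# Export query results as JSON directly to stdout
# (my_project.billing_dataset.billing_export is a hypothetical table):
bq query --format=prettyjson --use_legacy_sql=false \
  'SELECT * FROM `my_project.billing_dataset.billing_export` LIMIT 10'

# Or extract a whole table to Cloud Storage as newline-delimited JSON
# (gs://my-bucket is a hypothetical bucket):
bq extract --destination_format=NEWLINE_DELIMITED_JSON \
  billing_dataset.billing_export gs://my-bucket/billing.json
```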
Let me ask a few questions:
- What is missing in the Terraform Docker image, so that you don't have to add any Go source code here?
- Do you prefer to add the `bq` tool to the image, or to include a special script here that installs the `bq` tool at runtime? For example, an `init.sh` with `yum install bq` (probably not so simple).

Both approaches require modifications in elastic-package, but I think it is worth it. Let me know what you think.
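To make the second option above concrete, a runtime install script might look something like the sketch below. Note that `bq` is not a standalone package; it ships with the Google Cloud SDK, so the script installs the whole SDK via Google's install script. The install directory is an assumption, and this has not been validated in the elastic-package CI.

```shell
#!/bin/sh
# Hypothetical init.sh: install the Google Cloud SDK (which bundles bq)
# when the Terraform deployer container starts. Assumes an Alpine base.
set -e

# Python and curl are needed by the Cloud SDK installer.
apk add --no-cache python3 curl

# Non-interactive install of the Cloud SDK into /opt.
curl -sSL https://sdk.cloud.google.com | sh -s -- --disable-prompts --install-dir=/opt
export PATH="/opt/google-cloud-sdk/bin:$PATH"

# Sanity check that bq is now on the PATH.
bq version
```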
- Nothing. The issue is with the Terraform Google provider, which does not support loading data. The proposed solution is to go either through the `bq` tool or through the BigQuery SDK, so we need either a programming language or the GCP Cloud SDK CLI to be available. I would prefer it in the image, so that at least every test uses the same base image, much as GitHub Actions runners guarantee a tooling baseline through preinstalled software.

About point 2, there are some considerations to make:
- Tests should be able to use different base image versions to support different toolchains, for example a toolchain for some database or cloud provider with breaking changes (think AWS v1 vs v2); this version must be selectable by test developers.
- Security-wise, it is easier to scan for and check vulnerabilities if the baseline image does not change, and developers/contributors cannot insert vulnerable software as easily.
- The disadvantage is flexibility: developers cannot just add some init code but have to discuss it first. I think this limitation is actually a positive forcing function, though, since it prevents each integration from setting up CI differently, which would result in an exponential combination of CI configurations.
Yes, the security constraint is important here, so I think we could add the `bq` tool to the Terraform image.
Regarding different toolchains, let's focus on solving your problem for the moment; we don't need to invent a super-flexible option on day 1.
If you're happy to work on this, feel free to open a pull request against elastic-package or report an issue there. As you wish :)
I will work on it. Just a question: should I just change the `Dockerfile.terraform_deployer`?
Well, that's the goal of the `hashicorp/terraform:light` image: not to bring in heavy dependencies or the entire image. Could you please try to extend `terraform:light` with a Python distribution? It shouldn't be a problem.
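The suggested extension could be sketched as a minimal Dockerfile like the one below. The tag and package names are assumptions (though `terraform:light` is Alpine-based, so `apk` should be available); this is not the actual `Dockerfile.terraform_deployer` change.

```dockerfile
# Sketch only: extend the light Terraform image with a Python distribution.
FROM hashicorp/terraform:light

# terraform:light is Alpine-based, so apk is the package manager.
RUN apk add --no-cache python3 py3-pip
```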
I tried adding Python to the Alpine image, but I went down the system-dependencies rabbit hole. In the end, the "light" Terraform image is not light anymore.
I tried approaching this the other way around, starting from `google/cloud-sdk:latest` and adding Terraform, which is easier. Can this be a viable route?
I wouldn't recommend depending on the Google image, as we may end up with similar requests for other SDKs (AWS, Azure).
Did you try using `python:3-alpine`, or Alpine with `apk add --update python3 py3-pip`?
I tried the latter. I will try again starting from `python:3-alpine` and add things on top of that. The AWS and Azure CLIs are Python-based too!
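The `python:3-alpine` approach discussed above might look roughly like the Dockerfile sketch below. The Terraform version, download URL layout, and install directory are assumptions for illustration, not the image that was actually built.

```dockerfile
# Sketch: start from a Python base and layer Terraform plus the Cloud SDK
# on top, instead of adding Python to the Terraform image.
FROM python:3-alpine

# Hypothetical pinned Terraform version; pick whatever the project needs.
ARG TERRAFORM_VERSION=1.0.11
RUN apk add --no-cache curl unzip \
    && curl -sSLo /tmp/terraform.zip \
       "https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip" \
    && unzip /tmp/terraform.zip -d /usr/local/bin \
    && rm /tmp/terraform.zip

# gcloud and bq via Google's non-interactive install script.
RUN curl -sSL https://sdk.cloud.google.com | sh -s -- --disable-prompts --install-dir=/opt
ENV PATH="/opt/google-cloud-sdk/bin:${PATH}"
```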
A proper PR adding `gcloud` SDK support is available at elastic/elastic-package#638.
While testing I discovered that:
- The Metricbeat metricset expects a different key name.
- The project ID should be specified via the proper CLI flag instead of being part of the table name.
- The test file should contain data in a specified time range, so that it is collected by Beats and is useful for testing.
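On the last point, one way to keep a static fixture inside the collection window is to shift its timestamps relative to the current time before loading it. The sketch below is illustrative only; the record shape and the `usage_start_time` field name are hypothetical, not the actual fixture format used in this PR.

```python
import datetime as dt

def shift_timestamps(records, field="usage_start_time"):
    """Shift each record's timestamp so the newest one lands at 'now'.

    This keeps a static test fixture inside the time range that the
    system test collects from. Relative spacing between records is
    preserved. The field name is a hypothetical example.
    """
    times = [dt.datetime.fromisoformat(r[field]) for r in records]
    offset = dt.datetime.utcnow() - max(times)
    for record, original in zip(records, times):
        record[field] = (original + offset).isoformat()
    return records
```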
What does this PR do?
Adds system tests for the `billing` data_stream in the `gcp` package.
Checklist
- I have added an entry to the `changelog.yml` file.
Author's Checklist
How to test this PR locally
Run `elastic-package test system --data-streams billing -v` from within the `gcp` package.
Note: this is not working yet, as the Terraform Docker image used for running tests does not support Go.
Related issues
Screenshots