client: improve error handling and reporting #372
Conversation
// If err is non-nil, this function will return true and take one of the following actions:
//   - if it is a client.DetailedError, a DetailedErrorDiagnostic will be added to diag.
//   - otherwise a generic error diagnostic will be added to diag.
func AddDiagnosticOnError(diag *diag.Diagnostics, summary string, err error) bool {
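For reference, a plausible reconstruction of the helper's body based on its doc comment and the diagnostic types used elsewhere in this PR (not copied verbatim from the diff):

func AddDiagnosticOnError(diag *diag.Diagnostics, summary string, err error) bool {
    if err == nil {
        return false
    }
    var detailedErr *client.DetailedError
    if errors.As(err, &detailedErr) {
        // Detailed (RFC7807-style) errors get a richer diagnostic.
        diag.Append(DetailedErrorDiagnostic{
            summary: "Error " + summary,
            e:       detailedErr,
        })
    } else {
        // Anything else falls back to a generic error diagnostic.
        diag.AddError("Error "+summary, err.Error())
    }
    return true
}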
I attempted to come up with a signature for this helper that could expose the 404 case that needs to be handled in all the Read methods for resources, but didn't love anything I came up with.
the best thing I can think of would be passing in a callback func to the function, like
trigger, err := r.client.Triggers.Get(ctx, state.Dataset.ValueString(), state.ID.ValueString())
if helper.AddDiagnosticOnError(&resp.Diagnostics, "Creating Honeycomb Trigger", err, func(detailedErr *client.DetailedError) bool {
    if detailedErr.IsNotFound() {
        // not super sure if this works with Go's closures like it would if this was JS
        resp.State.RemoveResource(ctx)
        return false
    }
    return true
}) {
    return
}

// ...

func AddDiagnosticOnError(diag *diag.Diagnostics, summary string, err error, callback func(detailedErr *client.DetailedError) bool) bool {
    if err == nil {
        return false
    }
    var detailedErr *client.DetailedError
    if errors.As(err, &detailedErr) {
        // the return val of callback determines if we try to append the error, letting the caller decide
        if callback(detailedErr) {
            diag.Append(DetailedErrorDiagnostic{
                summary: "Error " + summary,
                e:       detailedErr,
            })
        }
    } else {
        diag.AddError("Error "+summary, err.Error())
    }
    return true
}
...but I don't think it makes it much clearer at all. what you have seems like a great improvement already!
this looks great to me!
// AsStringSlice converts a slice of string-like things to a slice of strings.
func AsStringSlice[S ~string](in []S) []string {
    out := make([]string, len(in))
yay generics!
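For context, the complete helper most likely reads something like the following; the loop body isn't shown in the hunk above, so this is an inference rather than the PR's exact code:

// AsStringSlice converts a slice of string-like things to a slice of strings.
func AsStringSlice[S ~string](in []S) []string {
    out := make([]string, len(in))
    for i, v := range in {
        out[i] = string(v)
    }
    return out
}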
client/client.go
Outdated
// if API Key is still unset, try using the legacy environment variable
if c.APIKey == "" {
    c.APIKey = os.Getenv("HONEYCOMBIO_API_KEY")
nit: what would you think of moving these HONEYCOMB_API_KEY & HONEYCOMBIO_API_KEY strings into some constants? I definitely thought they were the same value for a solid 5 minutes until I noticed the IO in the legacy env var 😅
Oops! Sure, no problem :)
( I'd like to nuke the 'legacy' one for 1.0, but having it as a const makes good sense )
Done!
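The resulting change presumably looks something along these lines; the constant names here are illustrative, not taken from the PR:

const (
    // DefaultAPIKeyEnv is the primary environment variable for the API key.
    DefaultAPIKeyEnv = "HONEYCOMB_API_KEY"
    // LegacyAPIKeyEnv is the deprecated variable slated for removal in 1.0.
    LegacyAPIKeyEnv = "HONEYCOMBIO_API_KEY"
)

// if API Key is still unset, try using the legacy environment variable
if c.APIKey == "" {
    c.APIKey = os.Getenv(LegacyAPIKeyEnv)
}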
An overhaul of the provider's embedded API client with the goal of making it more resilient to transient errors (429, 502, 504) and doing a better job of making error details available when an unrecoverable error happens.
The retries are brought to us by the standard (in Terraform land) go-retryablehttp in a fairly simplistic initial setup [1].
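For anyone unfamiliar with go-retryablehttp, the wiring is roughly the following; the retry count and wait bounds here are placeholders, not the values chosen in this PR:

package client

import (
    "net/http"
    "time"

    "github.com/hashicorp/go-retryablehttp"
)

// newRetryingHTTPClient builds an http.Client that retries transient
// failures (429s and most 5xx responses) with exponential backoff.
// The counts and wait bounds below are placeholders, not the PR's values.
func newRetryingHTTPClient() *http.Client {
    rc := retryablehttp.NewClient()
    rc.RetryMax = 5
    rc.RetryWaitMin = 1 * time.Second
    rc.RetryWaitMax = 15 * time.Second
    rc.Logger = nil // keep retry chatter out of Terraform's output
    return rc.StandardClient()
}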
Richer error messages are brought in by making as much use as possible of the RFC7807-style errors coming back from our API and then wiring them into detailed errors (Diagnostics) during Terraform runs.

[1]: very open to feedback on the max retries and the min and max wait values (link). They feel safe and conservative to me.
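To make the RFC7807 wiring concrete, here is a rough sketch of the kind of error type it implies. The field names follow RFC 7807's "problem details" vocabulary; the client's actual struct and method bodies may differ:

// DetailedError models an RFC7807 "problem details" response body.
type DetailedError struct {
    Status int    `json:"status"` // HTTP status code of the response
    Type   string `json:"type"`   // URI reference identifying the problem type
    Title  string `json:"title"`  // short, human-readable summary
    Detail string `json:"detail"` // explanation specific to this occurrence
}

// Error satisfies the error interface so a DetailedError can flow through
// ordinary err returns and be recovered later with errors.As.
func (e *DetailedError) Error() string {
    if e.Detail != "" {
        return e.Title + ": " + e.Detail
    }
    return e.Title
}

// IsNotFound reports whether the API answered with a 404, which the
// resource Read methods use to drop missing resources from state.
func (e *DetailedError) IsNotFound() bool {
    return e.Status == http.StatusNotFound
}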