limit task retry when system error #3340

Merged

Conversation

chengjoey
Contributor

What type of PR is this?

Add one of the following kinds:
/kind bugfix

What this PR does / why we need it:

Limit the number of task retries when a system error occurs.

Which issue(s) this PR fixes:

Specified Reviewers:

/assign @your-reviewer

ChangeLog

Need cherry-pick to release versions?

Add comment like /cherry-pick release/1.0 when this PR is merged.

For details on the cherry pick process, see the cherry pick requests section under CONTRIBUTING.md.

@chengjoey
Contributor Author

[two screenshots attached]

@codecov

codecov bot commented Dec 10, 2021

Codecov Report

Merging #3340 (942443f) into master (dd78a2f) will increase coverage by 0.00%.
The diff coverage is 53.84%.

Impacted file tree graph

@@            Coverage Diff            @@
##           master    #3340     +/-   ##
=========================================
  Coverage   17.85%   17.85%             
=========================================
  Files        1377     1386      +9     
  Lines      142888   144326   +1438     
=========================================
+ Hits        25511    25773    +262     
- Misses     114700   115875   +1175     
- Partials     2677     2678      +1     
Impacted Files Coverage Δ
modules/pipeline/pipengine/reconciler/task.go 16.92% <0.00%> (-1.11%) ⬇️
...pipeline/pipengine/reconciler/taskrun/framework.go 0.00% <0.00%> (ø)
apistructs/pipeline_task.go 59.49% <100.00%> (+45.60%) ⬆️
...ashboard-workload-detail/operationButton/render.go 37.83% <0.00%> (-16.64%) ⬇️
pkg/database/sqlparser/snapshot/snapshot.go 14.81% <0.00%> (-8.89%) ⬇️
...p-dashboard-workloads-list/workloadTable/render.go 49.01% <0.00%> (-8.55%) ⬇️
...tocol/components/issue-manage/issueTable/render.go 35.81% <0.00%> (-6.16%) ⬇️
...e/monitor/metric/storage/elasticsearch/provider.go 14.42% <0.00%> (-5.58%) ⬇️
...rotocol/components/cmp-cluster-list/list/render.go 30.55% <0.00%> (-2.46%) ⬇️
.../components/cmp-dashboard-pods/podsTable/render.go 14.84% <0.00%> (-1.29%) ⬇️
... and 26 more

@chengjoey force-pushed the fix/pipeline-task-long-time-retry branch 2 times, most recently from 26f9358 to 4d61ffd on December 13, 2021 05:20
@@ -23,7 +23,9 @@ import (

 const (
 	// TerminusDefineTag add this tag env to container for collecting logs
-	TerminusDefineTag = "TERMINUS_DEFINE_TAG"
+	TerminusDefineTag         = "TERMINUS_DEFINE_TAG"
+	PipelineTaskMaxRetryLimit = 100
Member

How long would 100 retries take in the normal case? Please compare it to 24 hours.
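
As an aside, here is a minimal sketch of what an error-count check like IsErrorsExceed could look like against the new constant. The struct shape and field name below are assumptions for illustration, not the actual definitions in apistructs/pipeline_task.go:

// Hypothetical sketch only; ErrorCount and the struct layout are assumed,
// not copied from apistructs/pipeline_task.go.
package apistructs

const PipelineTaskMaxRetryLimit = 100

// PipelineTaskInspect is assumed to track how many system errors the
// task has accumulated while being reconciled.
type PipelineTaskInspect struct {
	ErrorCount int
}

// IsErrorsExceed reports whether the task has failed often enough that
// the reconciler should stop retrying it.
func (i *PipelineTaskInspect) IsErrorsExceed() bool {
	return i.ErrorCount >= PipelineTaskMaxRetryLimit
}

With a fixed reconcile interval this turns an unbounded retry loop into a bounded one; whether 100 retries stays under 24 hours depends on that interval, which is what the question above is asking.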

@@ -96,6 +96,10 @@ func reconcileTask(tr *taskrun.TaskRun) error {
 			rlog.TErrorf(tr.P.ID, tr.Task.ID, "failed to handle taskOp: %s, user abnormalErr: %v, don't need retry", taskOp.Op(), abnormalErr)
 			return abnormalErr
 		}
+		if tr.Task.Inspect.IsErrorsExceed() {
+			rlog.TErrorf(tr.P.ID, tr.Task.ID, "failed to handle taskOp: %s, errors exceed limit, stop retry", taskOp.Op())
Member

Please add the concrete limit number to the log message.
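
One way to act on this suggestion, as a sketch: it assumes the constant is exported from apistructs and relies on rlog.TErrorf being printf-style, as the diff above shows:

// Sketch only: include the concrete limit in the log message.
rlog.TErrorf(tr.P.ID, tr.Task.ID,
	"failed to handle taskOp: %s, errors exceed limit (%d), stop retry",
	taskOp.Op(), apistructs.PipelineTaskMaxRetryLimit)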

@chengjoey force-pushed the fix/pipeline-task-long-time-retry branch from 4d61ffd to 942443f on December 13, 2021 11:30
@sfwn
Member

sfwn commented Dec 16, 2021

/approve

@erda-bot merged commit 1450a91 into erda-project:master on Dec 16, 2021
@chengjoey deleted the fix/pipeline-task-long-time-retry branch on April 18, 2022 02:10