Describe the bug
After upgrading to 1.5.0, SSO logins fail in some environments due to intermittent Redis failures.
As part of the 1.5.0 release, the bundled Redis was upgraded to 4.3.4 with HAProxy enabled. It appears that the default HAProxy check interval is too tight, which might cause intermittent failures.
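The checks in question are the tcp-check probes in the haproxy.cfg used by argocd-redis-ha-haproxy. As a rough illustration only, below is a sketch of the kind of relaxed check timing that might avoid the Layer7 timeouts seen in the logs further down; the backend layout and announce hostnames are assumptions based on the HA manifests and log output, and the 3s values are placeholders rather than proposed defaults:

```
# Hypothetical excerpt of the haproxy.cfg for argocd-redis-ha-haproxy.
# Backend name and tcp-check steps mirror the log output; the announce
# hostnames and relaxed timings are illustrative, not the chart defaults.
backend check_if_redis_is_master_0
    mode tcp
    option tcp-check
    tcp-check connect
    # Ask the node about its replication state and expect the announce
    # address of node 0 (placeholder IP taken from the logs).
    tcp-check send info\ replication\r\n
    tcp-check expect string 172.20.3.51
    # Give each probe more headroom than the 1s that is timing out in the
    # logs, and require several consecutive failures before marking DOWN.
    timeout check 3s
    default-server check inter 3s fall 3 rise 1
    server R0 argocd-redis-ha-announce-0:6379
    server R1 argocd-redis-ha-announce-1:6379
    server R2 argocd-redis-ha-announce-2:6379
```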
Version
v1.5.0
Logs
argocd-redis-ha-haproxy haproxy [WARNING] 093/131235 (6) : Server check_if_redis_is_master_1/R2 is DOWN, reason: Layer7 timeout, info: " at step 5 of tcp-check (expect string '172.20.31.132')", check duration: 1000ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
argocd-redis-ha-haproxy haproxy [ALERT] 093/131235 (6) : backend 'check_if_redis_is_master_1' has no server available!
argocd-redis-ha-haproxy haproxy [WARNING] 093/131241 (6) : Server check_if_redis_is_master_0/R1 is UP, reason: Layer7 check passed, code: 0, info: "(tcp-check)", check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131242 (6) : Server bk_redis_master/R0 is DOWN, reason: Layer4 timeout, info: " at step 1 of tcp-check (connect)", check duration: 1000ms. 0 active and 0 backup servers left. 1 sessions active, 0 requeued, 0 remaining in queue.
argocd-redis-ha-haproxy haproxy [ALERT] 093/131242 (6) : backend 'bk_redis_master' has no server available!
argocd-redis-ha-haproxy haproxy [WARNING] 093/131244 (6) : Server check_if_redis_is_master_0/R0 is DOWN, reason: Layer4 connection problem, info: "Connection refused at step 1 of tcp-check (connect)", check duration: 1001ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131246 (6) : Server bk_redis_master/R2 is UP, reason: Layer7 check passed, code: 0, info: "(tcp-check)", check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131248 (6) : Server check_if_redis_is_master_2/R1 is UP, reason: Layer7 check passed, code: 0, info: "(tcp-check)", check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131248 (6) : Server check_if_redis_is_master_2/R2 is UP, reason: Layer7 check passed, code: 0, info: "(tcp-check)", check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131251 (6) : Server check_if_redis_is_master_0/R2 is DOWN, reason: Layer7 timeout, info: " at step 5 of tcp-check (expect string '172.20.3.51')", check duration: 1001ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
argocd-redis-ha-haproxy haproxy [WARNING] 093/131251 (6) : Server check_if_redis_is_master_0/R1 is DOWN, reason: Layer7 timeout, info: " at step 5 of tcp-check (expect string '172.20.3.51')", check duration: 1001ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
argocd-redis-ha-haproxy haproxy [ALERT] 093/131251 (6) : backend 'check_if_redis_is_master_0' has no server available!
argocd-redis-ha-haproxy haproxy [WARNING] 093/131257 (6) : Server check_if_redis_is_master_2/R0 is UP, reason: Layer7 check passed, code: 0, info: "(tcp-check)", check duration: 2ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.