Exceptions and High CPU on Packetbeat 1.0.0 GA for Redis traffic #410

Closed
chiriaev opened this issue Dec 1, 2015 · 5 comments

chiriaev commented Dec 1, 2015

Hi. I'm submitting an exception that I get at least once per second; my configuration is below.

Packetbeat also uses a lot of CPU. My Redis traffic is heavy, but not out of the ordinary.

Thanks in advance.

Regards, Andrei


---
  logging: {}
  shipper: {}
  interfaces:
    device: any
  output:
    elasticsearch:
      username: username
      hosts:
        - "host:9200"
      password: password
  protocols:
    redis:
      ports:
        - 6379
    dns:
      include_additionals: true
      include_authorities: true
      ports:
        - 53
    http:
      ports:
        - 8002
Dec  1 19:01:48 c068amw /usr/bin/packetbeat[14563]: log.go:114: Stacktrace:
/go/src/github.com/elastic/packetbeat/Godeps/_workspace/src/github.com/elastic/libbeat/logp/log.go:114 (0x48c3f6)
/usr/local/go/src/runtime/asm_amd64.s:437 (0x47d72e)
/usr/local/go/src/runtime/panic.go:423 (0x44d329)
/usr/local/go/src/runtime/panic.go:42 (0x44b9e9)
/usr/local/go/src/runtime/sigpanic_unix.go:24 (0x46228a)
/go/src/github.com/elastic/packetbeat/protos/redis/redis.go:579 (0x51b542)
/go/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:87 (0x5208d3)
/go/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:173 (0x521a0d)
/go/src/github.com/elastic/packetbeat/decoder/decoder.go:136 (0x6c7481)
/go/src/github.com/elastic/packetbeat/sniffer/sniffer.go:352 (0x532fe9)
/go/src/github.com/elastic/packetbeat/packetbeat.go:212 (0x422f2b)
/usr/local/go/src/runtime/asm_amd64.s:1696 (0x47fa71)
Dec  1 19:01:48 c068amw /usr/bin/packetbeat[14563]: log.go:113: ParseRedis exception. Recovering, but please report this: runtime error: invalid memory address or nil pointer dereference.
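
As an aside for readers (not part of the original report): the "Recovering, but please report this" line comes from a deferred recover guard around the protocol parser. A minimal, self-contained sketch of that pattern in Go, with hypothetical function names rather than the actual Packetbeat code, looks roughly like this:

```go
package main

import (
	"fmt"
	"log"
	"runtime/debug"
)

// parsePayload stands in for a protocol parser such as ParseRedis.
// The nil dereference below mimics the kind of bug reported above.
func parsePayload(data []byte) {
	var msg *string
	fmt.Println(*msg) // panics: invalid memory address or nil pointer dereference
}

// safeParse shows the defer/recover guard pattern: the panic is caught,
// logged together with a stack trace, and the process keeps running.
func safeParse(data []byte) {
	defer func() {
		if r := recover(); r != nil {
			log.Printf("ParseRedis exception. Recovering, but please report this: %v", r)
			log.Printf("Stacktrace: %s", debug.Stack())
		}
	}()
	parsePayload(data)
}

func main() {
	safeParse([]byte("*1\r\n$4\r\nPING\r\n"))
	log.Println("process is still alive after the recovered panic")
}
```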

urso commented Dec 1, 2015

The nil pointer has been fixed in #384 / #402. Recent nightly builds should contain the fixes: https://beats-nightlies.s3.amazonaws.com/index.html?prefix=packetbeat/

Without proper profiling, traffic stats, and CPU/memory usage stats, I can't tell much about the required CPU usage. The nightlies may already help, since they won't generate this many panics and stack traces (basically one per transaction).
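
As a side note (not from the thread): one generic way to gather the kind of CPU data mentioned above for any Go program is the standard runtime/pprof package; the profile file can then be inspected with `go tool pprof`. The sketch below uses a placeholder file name and a dummy workload, and is not Packetbeat's own profiling setup:

```go
package main

import (
	"log"
	"os"
	"runtime/pprof"
	"time"
)

func main() {
	// Write a CPU profile for later inspection with `go tool pprof`.
	f, err := os.Create("cpu.pprof")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	if err := pprof.StartCPUProfile(f); err != nil {
		log.Fatal(err)
	}
	defer pprof.StopCPUProfile()

	// Stand-in for the workload being measured (e.g. packet parsing).
	busy(2 * time.Second)
}

func busy(d time.Duration) {
	deadline := time.Now().Add(d)
	for time.Now().Before(deadline) {
	}
}
```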


chiriaev commented Dec 1, 2015

Thanks! I'll keep you posted. Regards, Andrei


chiriaev commented Dec 1, 2015

Hi, Urso. Now I get the exception below. CPU consumption was cut in half, which is a good trend. Regards, Andrei

Dec 1 21:18:21 c068amw /usr/bin/packetbeat[19388]: log.go:113: ParseRedis exception. Recovering, but please report this: runtime error: index out of range.
Dec 1 21:18:21 c068amw /usr/bin/packetbeat[19388]: log.go:114: Stacktrace:
/go/src/github.com/elastic/packetbeat/Godeps/_workspace/src/github.com/elastic/libbeat/logp/log.go:114 (0x4c2936)
/usr/local/go/src/runtime/asm_amd64.s:437 (0x47ac1e)
/usr/local/go/src/runtime/panic.go:423 (0x44a819)
/usr/local/go/src/runtime/panic.go:12 (0x448cf9)
/go/src/github.com/elastic/packetbeat/protos/redis/redis_parse.go:359 (0x52326f)
/go/src/github.com/elastic/packetbeat/protos/redis/redis_parse.go:243 (0x521e97)
/go/src/github.com/elastic/packetbeat/protos/redis/redis_parse.go:217 (0x521aca)
/go/src/github.com/elastic/packetbeat/protos/redis/redis.go:176 (0x51f70b)
/go/src/github.com/elastic/packetbeat/protos/redis/redis.go:126 (0x51ed86)
/go/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:87 (0x5268f3)
/go/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:173 (0x527a2d)
/go/src/github.com/elastic/packetbeat/decoder/decoder.go:153 (0x6cd9f5)
/go/src/github.com/elastic/packetbeat/sniffer/sniffer.go:356 (0x539069)
/go/src/github.com/elastic/packetbeat/beat/packetbeat.go:219 (0x47fd7b)
/usr/local/go/src/runtime/asm_amd64.s:1696 (0x47cf61)


urso commented Dec 1, 2015

Ah, a missing length check. I didn't find this with my test traces; it's definitely a regression in the master branch.

Fixing this should be quite easy. We're in the process of restructuring the repos right now, so I'll do the fix afterwards and share a new build once it's done.

Thanks for finding this.
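
For illustration (hypothetical code, not the actual redis_parse.go): a "missing length check" in a parser typically means indexing a slice before verifying its length, and the fix is a bounds check like the one in this sketch:

```go
package main

import (
	"errors"
	"fmt"
)

// parseBulkLength reads a RESP bulk-string header such as "$5" and returns
// the declared length. Indexing line[1:] without first checking len(line)
// is the kind of omission that causes "index out of range" panics.
func parseBulkLength(line []byte) (int, error) {
	if len(line) < 2 || line[0] != '$' {
		return 0, errors.New("malformed bulk length header")
	}
	n := 0
	for _, c := range line[1:] {
		if c < '0' || c > '9' {
			return 0, errors.New("non-digit in bulk length")
		}
		n = n*10 + int(c-'0')
	}
	return n, nil
}

func main() {
	for _, in := range [][]byte{[]byte("$5"), []byte("$"), []byte("")} {
		n, err := parseBulkLength(in)
		fmt.Printf("%q -> %d, %v\n", in, n, err)
	}
}
```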


tsg commented Jul 19, 2016

I think this ticket fell through the cracks. In the meantime we have done quite a lot of refactoring of the Redis output and of the outputs in general, so I doubt this is still a problem. Please reopen if you think otherwise.

tsg closed this as completed Jul 19, 2016