
In logtrail.json, changing the "timezone" field to a value other than "UTC" raises an error #24

Closed
fbaligand opened this issue Dec 21, 2016 · 33 comments

@fbaligand

In logtrail.json, if I set "timezone": "Europe/Paris", I get the following error each time the logtrail plugin posts a search request:

Error while executing search { [Error: [parse_exception] failed to parse date field [1478552461000] with format [epoch_millis]]
  status: 400,
  message: '[parse_exception] failed to parse date field [1478552461000] with format [epoch_millis]',
  path: '/logstash-*/_search',
  query: { size: 500 },
  body:
   { error:
      { root_cause: [Object],
        type: 'search_phase_execution_exception',
        reason: 'all shards failed',
        phase: 'query_fetch',
        grouped: true,
        failed_shards: [Object] },
     status: 400 },
  statusCode: 400,
  response: '{"error":{"root_cause":[{"type":"parse_exception","reason":"failed to parse date field [1478552461000] with format [epoch_millis]"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query_fetch","grouped":true,"failed_shards":[{"shard":0,"index":"logstash-2016.11.05","node":"sD0Fz592SCmmMebQHT4sTQ","reason":{"type":"parse_exception","reason":"failed to parse date field [1478552461000] with format [epoch_millis]","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \\"1478552461000\\""}}}]},"status":400}',
  toString: [Function],
  toJSON: [Function] }
@sivasamyk sivasamyk added the bug label Jan 6, 2017
@brandond
Contributor

brandond commented Jan 12, 2017

Getting the same with America/Los_Angeles. Removing the timezone setting appears to fix the issue.

{
  "msg":"[parse_exception] failed to parse date field [1484267001673] with format [epoch_millis]",
  "path":"/logstash-*/_search",
  "query":{
    "size":500
  },
  "body":{
    "sort":[
      {
        "@timestamp":{
          "order":"asc",
          "unmapped_type":"boolean"
        }
      }
    ],
    "query":{
      "bool":{
        "must":{
          "query_string":{
            "analyze_wildcard":true,
            "default_field":"message",
            "query":"logGroup.keyword: \"/var/log/squid/access.log\""
          }
        },
        "filter":{
          "bool":{
            "must":[
              {
                "range":{
                  "@timestamp":{
                    "gte":1484267001673,
                    "time_zone":"America/Los_Angeles",
                    "format":"epoch_millis"
                  }
                }
              }
            ],
            "must_not":[

            ]
          }
        }
      }
    }
  },
  "statusCode":400,
  "response":{
    "error":{
      "root_cause":[
        {
          "type":"parse_exception",
          "reason":"failed to parse date field [1484267001673] with format [epoch_millis]"
        }
      ],
      "type":"search_phase_execution_exception",
      "reason":"all shards failed",
      "phase":"query",
      "grouped":true,
      "failed_shards":[
        {
          "shard":0,
          "index":"logstash-2016.12.07",
          "node":"v13nGpFcQrC9EUTjRh8FGA",
          "reason":{
            "type":"parse_exception",
            "reason":"failed to parse date field [1484267001673] with format [epoch_millis]",
            "caused_by":{
              "type":"illegal_argument_exception",
              "reason":"Parse failure at index [0] of [1484267001673]"
            }
          }
        }
      ],
      "caused_by":{
        "type":"parse_exception",
        "reason":"failed to parse date field [1484267001673] with format [epoch_millis]",
        "caused_by":{
          "type":"illegal_argument_exception",
          "reason":"Parse failure at index [0] of [1484267001673]"
        }
      }
    },
    "status":400
  }
}

@brandond
Contributor

brandond commented Jan 13, 2017

Guessing this may be an issue with ES itself, as the search filter is valid per the documentation. It also barfs on absolute TZ offsets ("-8:00", etc.), which the docs claim are supported.
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-range-query.html

@brandond
Contributor

brandond commented Jan 13, 2017

Found it:
https://github.com/elastic/elasticsearch/blob/master/core/src/main/java/org/elasticsearch/common/joda/Joda.java#L337
which is called from here:
https://github.com/elastic/elasticsearch/blob/master/core/src/main/java/org/elasticsearch/common/joda/DateMathParser.java#L204

If I'm reading this correctly, timestamps in epoch formats (epoch_millis and epoch_second) don't support time zones other than UTC. Not sure how it was working previously?

@sivasamyk
Owner

Logtrail uses time_zone in range queries. Per this documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-range-query.html#_time_zone_in_range_queries, time_zone specifies the timezone of the timestamps used in range queries; those timestamps are then converted to UTC based on the timezone specified.

Can I know the reason why you want to change the time_zone in logtrail.json? Is your @timestamp field in event doc not in UTC?

@fbaligand
Author

The reason is that I want to search logs using a time range in my time zone.
For example, if I search for yesterday's logs, I want to search yesterday in my time zone, not in UTC. I mean "now-1d/d" doesn't mean the same thing in UTC as in UTC+1.
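The difference can be made concrete with a small sketch (illustrative only, not Logtrail code): rounding "now-1d/d" down to the start of yesterday yields a different absolute instant depending on the timezone used for the rounding.

```javascript
// Illustration only (not Logtrail code): "now-1d/d" rounds "now" down to
// the start of yesterday, and that start is a different absolute instant
// depending on the timezone used for the rounding.
// Take 21 Dec 2016 as "now". In UTC, yesterday begins at 2016-12-20T00:00Z.
// In Europe/Paris (UTC+1 in winter), yesterday begins at
// 2016-12-20T00:00+01:00, i.e. 2016-12-19T23:00Z, one hour earlier.
const startOfYesterdayUtc = Date.UTC(2016, 11, 20, 0, 0, 0);   // months are 0-based
const startOfYesterdayParis = Date.UTC(2016, 11, 19, 23, 0, 0);
console.log(startOfYesterdayUtc - startOfYesterdayParis); // 3600000 (1 hour in ms)
```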

@brandond
Contributor

brandond commented Jan 13, 2017

I guess I was also expecting it to convert the event timestamps to the specified timezone for display. Kibana does this based on the browser's TZ setting, but it seems like Logtrail does not - the timestamps are all UTC.

I agree that the documentation SAYS they should be supported, but it seems that the code itself does not allow them for epoch timestamps. You have to provide the timestamp in a different format (not epoch_seconds or epoch_millis) if you want to use a TZ.
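A minimal sketch of that workaround, assuming the epoch value is converted to an ISO-8601 string before the range filter is built (the function and field choices here are hypothetical, not Logtrail's actual code):

```javascript
// Hypothetical sketch of the workaround: send the range boundary as an
// ISO-8601 string with a non-epoch format instead of epoch_millis, since
// this ES version rejects time_zone combined with epoch formats.
// Note: time_zone then only affects values without an explicit offset
// (e.g. date math like "now-1d/d"); the "Z" suffix below already pins
// the instant.
function rangeFilter(epochMillis, timeZone) {
  return {
    range: {
      '@timestamp': {
        gte: new Date(epochMillis).toISOString(),
        time_zone: timeZone,
        format: 'date_optional_time', // instead of 'epoch_millis'
      },
    },
  };
}
console.log(JSON.stringify(rangeFilter(1484267001673, 'America/Los_Angeles'), null, 2));
```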

@brandond
Contributor

Opened upstream issue: elastic/elasticsearch#22621

@fbaligand
Author

Thanks for the elasticsearch issue and logtrail PR!

@fbaligand
Author

Automatic timezone detection in the browser would be great, and thus consistent with Kibana.
But I imagine it's more work :)

@brandond
Contributor

@fbaligand it's not too bad, it looks like I could steal most of it from the Kibana code. Might take a shot at it this weekend.

@fbaligand
Author

Great news!
I'm available to test it if you need.

@sivasamyk
Owner

@fbaligand To restate your requirement: suppose you have 2 systems, A (running in IST) and B (running in PST), shipping logs to ES via logstash, with @timestamp stored in UTC. When you launch logtrail from a browser running in EST and query for logs from yesterday midnight (15/Jan/2017 00:00 AM), logtrail should show all events from A (IST) from 15/Jan/2017 10:30 AM and all events from B (PST) from 14/Jan/2017 9 PM.

Correct me if I am wrong?

Today, since display_timestamp is mapped to source host time by default, each event will show its time in its respective source time zone (the default config assumes all hosts are running in the same timezone). If we map @timestamp to display_timestamp and convert it to the browser/configured timezone (@brandond's pull request does this), it should solve the problem.

@brandond
Contributor

@sivasamyk can you publish a release with the latest tz stuff in it? I'm trying to do a local build but I cannot for the life of me get the plugin to package properly.

@sivasamyk
Owner

@brandond You can download the latest release from https://github.com/sivasamyk/logtrail/releases/tag/0.1.7

This release also contains support for multiple index patterns, and introduces display_timezone and display_timestamp_format in logtrail.json. Check out the README for more info.

Let me know if you face any issues.
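As an illustration of what a display_timezone setting does conceptually (Logtrail itself uses moment-timezone; this sketch uses only the standard Intl API and is not the plugin's actual code):

```javascript
// Illustrative sketch, not Logtrail code: convert a UTC @timestamp to a
// configured display timezone. Logtrail uses moment-timezone for this;
// Intl.DateTimeFormat is used here only to keep the example dependency-free.
function formatForDisplay(isoTimestamp, timeZone) {
  return new Intl.DateTimeFormat('en-US', {
    timeZone,
    month: 'short',
    day: '2-digit',
    hour: '2-digit',
    minute: '2-digit',
    second: '2-digit',
    hour12: false,
  }).format(new Date(isoTimestamp));
}
// 10:19 UTC is 02:19 in America/Los_Angeles (UTC-8 in January)
console.log(formatForDisplay('2017-01-20T10:19:01.779Z', 'America/Los_Angeles'));
```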

@brandond
Contributor

Thanks for the release. It is currently broken if you set display_timezone to something other than 'local', due to a bad require of moment-timezone. See #37

@sivasamyk
Owner

Thanks!
I have updated the release with the fix. I think the issue was that in 5.x I was just importing moment. Merge issue!

@brandond
Contributor

Now something else weird is going on... it loads properly the first time, showing the most recent data, but the first auto-refresh gives me data from December 8th, which is the oldest index I currently have loaded in Elasticsearch.

@brandond
Contributor

brandond commented Jan 20, 2017

I uncommented the log statement on line 116 of server.js; here's what it shows:

{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"desc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gt":1484905521000,"format":"epoch_millis"}}}],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gt":1481187931881,"format":"epoch_millis"}}}],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gt":1481187978645,"format":"epoch_millis"}}}],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[],"must_not":[]}}}}}}
{"index":"logstash-*","size":500,"body":{"sort":[{"@timestamp":{"order":"asc","unmapped_type":"boolean"}}],"query":{"bool":{"must":{"query_string":{"analyze_wildcard":true,"default_field":"message","query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gt":1481188026481,"format":"epoch_millis"}}}],"must_not":[]}}}}}}

Not sure why it does a search with no timestamp in ascending order; that would definitely explain the jump back to the start of my indexes though.
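To see why that matters: an ascending search with no timestamp lower bound starts from the oldest document in the index. A hypothetical guard (this is not the actual app.js fix) might look like:

```javascript
// Hypothetical guard, not the actual app.js fix: never issue an ascending
// tail query without a timestamp lower bound, because with no bound it
// returns the oldest events in the index instead of new ones.
function buildTailParams(lastEventTime, index) {
  if (lastEventTime == null) {
    // No anchor timestamp yet: fetch the newest events instead of
    // scanning forward from the beginning of the index.
    return { timestamp: null, rangeType: null, order: 'desc', index };
  }
  return { timestamp: lastEventTime, rangeType: 'gt', order: 'asc', index };
}
```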

@sivasamyk
Owner

Can you let me know the timestamp of the last event shown in the event viewer when you open logtrail for the first time? Can you also let me know the changes you made in logtrail.json from the default values? I assume you did not edit the es_index_margin_in_seconds param.

@brandond
Contributor

brandond commented Jan 20, 2017

Not sure how to get the raw timestamp of the last event? Is there a command I can run on the JS console?

Here's my config:

{
  "index_patterns" : [
    {
      "es": {
        "default_index": "logstash-*",
        "allow_url_parameter": true
      },
      "tail_interval_in_seconds": 15,
      "es_index_time_offset_in_seconds": 0,
      "display_timezone": "America/Los_Angeles",
      "display_timestamp_format": "MMM DD HH:mm:ss",
      "max_buckets": 500,
      "nested_objects" : false,
      "default_time_range_in_days" : 0,
      "fields" : {
        "mapping" : {
            "timestamp" : "@timestamp",
            "display_timestamp" : "@timestamp",
            "hostname" : "logStream",
            "program": "logGroup",
            "message": "message"
        }
      }
    }
  ]
}

@sivasamyk
Owner

The timestamp of the last event in the event viewer will be in PST based on your config. If we convert that to UTC (epoch_millis) we get the timestamp used for tail. So if you can paste the last few events shown in the event viewer, that should be good.

@brandond
Contributor

brandond commented Jan 20, 2017

Here's the last event returned by the first search:

{"id":"33114544732589944933745996832382567856447798587079393285","timestamp":"2017-01-20T10:19:01.779Z","display_timestamp":"2017-01-20T10:19:01.779Z","hostname":"i-0cc87399e3e7f7154","message":"type=USER_END msg=audit(1484907541.779:136134): pid=9462 uid=0 auid=0 ses=19843 msg='op=PAM:session_close grantors=pam_loginuid,pam_limits acct=\"root\" exe=\"/usr/sbin/crond\" hostname=? addr=? terminal=cron res=success'","program":"/var/log/audit/audit.log"}

which converts to 1484936341779

@sivasamyk
Owner

logtrail should be sending the 1484936341779 timestamp to the server during tail. I am trying to reproduce this in my local setup; it's working fine. Are you seeing any exceptions in the JS debug console? I assume you are using version 5.1.2 of Kibana and Elasticsearch.

@sivasamyk
Owner

@brandond I have added a few logger statements in this build. The logs will be printed in the JS console of the browser. Can you please use this build and send me the logs?
logtrail-5.x-0.1.7.zip

@brandond
Contributor

brandond commented Jan 20, 2017

Here's what I've got in the console log in Chrome after opening it and letting it sit for about a minute.

2017-01-20 08:49:57.380 Navigated to https://KIBANAHOST/app/logtrail
2017-01-20 08:49:58.621 logtrail.bundle.js?v=14588:1 connection to elasticsearch successful
2017-01-20 08:49:58.623 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":null,"rangeType":null,"order":"desc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:00.156 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:00.156 logtrail.bundle.js?v=14588:1 last event time set to 1484929894880
2017-01-20 08:50:00.475 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":1484929894880,"rangeType":"gt","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:00.622 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:00.622 logtrail.bundle.js?v=14588:1 last event time set to 1484929894880
2017-01-20 08:50:13.647 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":null,"rangeType":"gte","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:14.788 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:14.789 logtrail.bundle.js?v=14588:1 last event time set to 1481187931881
2017-01-20 08:50:15.008 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":1481187931881,"rangeType":"gt","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:15.815 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:15.816 logtrail.bundle.js?v=14588:1 last event time set to 1481187978645
2017-01-20 08:50:28.629 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":null,"rangeType":"gte","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:29.951 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:29.952 logtrail.bundle.js?v=14588:1 last event time set to 1481187978645
2017-01-20 08:50:29.963 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":1481187978645,"rangeType":"gt","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:30.750 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:30.750 logtrail.bundle.js?v=14588:1 last event time set to 1481188026481
2017-01-20 08:50:43.628 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":null,"rangeType":"gte","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:45.273 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:45.274 logtrail.bundle.js?v=14588:1 last event time set to 1481188026481
2017-01-20 08:50:45.279 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":1481188026481,"rangeType":"gt","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:50:46.076 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:50:46.077 logtrail.bundle.js?v=14588:1 last event time set to 1481188064532
2017-01-20 08:50:58.628 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":null,"rangeType":"gte","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:51:00.698 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:51:00.699 logtrail.bundle.js?v=14588:1 last event time set to 1481188064532
2017-01-20 08:51:00.712 logtrail.bundle.js?v=14588:1 sending search request with params {"searchText":null,"timestamp":1481188064532,"rangeType":"gt","order":"asc","hostname":null,"index":"logstash-*"}
2017-01-20 08:51:01.515 logtrail.bundle.js?v=14588:1 first event time : 1484929865004
2017-01-20 08:51:01.515 logtrail.bundle.js?v=14588:1 last event time set to 1481188132024

Notice that it runs searches back to back in pairs. When I open the page it runs two searches, one after another. It waits 15 seconds and then runs two more searches, one after another, etc.

@sivasamyk
Owner

Thanks again. These debug logs helped. From these logs I see there are some issues with the queries sent by the logtrail client. Will debug over the weekend!

@fbaligand
Author

fbaligand commented Jan 21, 2017

Hi @sivasamyk ,
First, sorry for my late answer.

Actually, my need is simple: I want the same timestamp behaviour as Kibana, so:

  • when I define a "from" date range in logtrail, I want the entered "from" date to be interpreted using the browser timezone
  • when a timestamp is displayed in logtrail, it is displayed using the browser timezone (not UTC). On top of that, it would be great if the dateFormat Kibana advanced setting were used to format the timestamp

The first need already works as I expected. Nice!

Concerning the second need: currently, dates are displayed in logtrail using the UTC timezone.
I thought that changing the timezone field in logtrail.json would fix it, but it makes logtrail fail. That's why I opened this issue.
But now, given your explanations, if I understand correctly, the timezone field in logtrail.json indicates the timezone of the stored timestamp field, not the timezone logtrail uses to display timestamps, right?

Finally, as you say, I think that @brandond's PR solves my second need.

By the way, I use Kibana 4.6.4.

@sivasamyk
Owner

@brandond The issue is fixed (it was due to a wrong variable reference in app.js). Please download the latest plugin from https://github.com/sivasamyk/logtrail/releases/download/0.1.7/logtrail-5.x-0.1.7.zip

@fbaligand regarding the second issue, download the latest 0.1.7 release and set display_timezone to local (the default value). Let me know if this solves your issue.

@brandond
Contributor

@sivasamyk Thanks, I'm getting consistent data refreshes now.

However, it no longer stays scrolled to the bottom of the window: when new data is added, it scrolls off the bottom of the page. It also seems to take a TON of CPU time to add new events. I'm guessing this is related to the change that added sorting and deduplication of events as they're added? Hopefully there's some opportunity for optimization there.

Also, is there any setting to control how many entries it keeps in the window? After a few refreshes it gets REALLY slow. I'm guessing this is a combination of not rolling off old events and having to sort and filter each new event as it's added.

@fbaligand
Author

I have tested it with Kibana 5.1.2, and it works great!
Just as I expected:

  • the time is displayed in my browser's local timezone
  • I can customize the date format, to a French format for example

Thanks a lot for the work done!

That said, I use Kibana 4.6.4 in production at my company; is it possible to port the fix to the 4.x branch?

@sivasamyk
Owner

@fbaligand The same changes are available in 4.x-0.1.7 release

https://github.com/sivasamyk/logtrail/releases/download/0.1.7/logtrail-4.x-0.1.7.tar.gz

@sivasamyk
Owner

@brandond Can you let me know your average incoming event rate and tail refresh interval? Also, what is the max number of events shown in your event viewer (like 10K events)?

@fbaligand
Author

Hi @sivasamyk,

I have tested logtrail-4.x-0.1.7 and it works great!
Thanks a lot for the fix and the backport!
