No date field error when log entry has too many rows #103
There is no limit on the number of rows in a log entry. It should work fine.
The reason is probably that line 2 does not have a date.
Weird, because the following works:
The format detector ignores lines starting with " at", so there are two lines with a date and one line without a date (java.security.SignatureException: token ...). The format detector considers the format correct if the number of recognized lines is greater than the number of unrecognized lines multiplied by 2/3.
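As a rough illustration of that heuristic, here is a hypothetical Java sketch; it is not the actual log-viewer implementation, and the class name, method name, and regex are made up for the example:

```java
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch of the detection heuristic described above; not log-viewer's real code.
class FormatDetectionSketch {

    // Assume a "recognized" line starts with a timestamp like "2022-01-19 14:41:44,459".
    private static final Pattern DATE_PREFIX =
            Pattern.compile("^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3}\\b.*");

    static boolean formatLooksCorrect(List<String> lines) {
        int recognized = 0;
        int unrecognized = 0;

        for (String line : lines) {
            if (line.startsWith(" at ") || line.startsWith("\tat ")) {
                continue; // stack-trace frames (" at ...") are ignored entirely
            }
            if (DATE_PREFIX.matcher(line).matches()) {
                recognized++;
            } else {
                unrecognized++;
            }
        }
        // Accept the format when recognized lines outnumber unrecognized lines * 2/3
        // (written with multiplication to avoid integer-division surprises).
        return recognized * 3 > unrecognized * 2;
    }
}
```

With the example from this issue, the two dated lines outnumber 2/3 of the single undated SignatureException line, so the format is accepted.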
That explains it, thank you. I do not use the provided UI to browse through the system files; I only use the log-paths = {} section to point to logs on machines. Is it possible to specify the log formats for this approach?
Yes.
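For reference, a rough sketch of what such a log definition might look like; the key names and the format type used below (logs, path, format.type, format.pattern, "log4j") are assumptions for illustration and should be checked against the log-viewer documentation:

```
# Hypothetical sketch only; verify the key names against the log-viewer docs.
logs = [
  {
    path = "/var/log/myapp/*.log"      # machine-side path, as with log-paths
    format = {
      type = "log4j"                   # assumed format type name
      pattern = "%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %t [%c]: %m%n"   # assumed example pattern
    }
  }
]
```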
I got it working up to a point, but not yet how I want. For example, take the following log row: it is correctly detected with the following config. The problem seems to be with the subseconds (SSS): the auto format detector does capture the subseconds, but a manual pattern does not. My working filter results in detection of the time and merging of the lines, but not of the subseconds or the log level (DEBUG, INFO, etc.). For your info, the pattern specified in log4j.xml for the logs does not work in log-viewer.
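To make the subseconds point concrete, here is a minimal Java sketch (not log-viewer code) that parses the SSS milliseconds field from a timestamp in the layout used by the example log line in this issue, using the standard java.time API:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Minimal illustration of a date pattern that captures SSS milliseconds.
public class SubsecondParseDemo {
    public static void main(String[] args) {
        // Same timestamp layout as the example log line: "2022-01-19 14:41:44,459"
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss,SSS");
        LocalDateTime ts = LocalDateTime.parse("2022-01-19 14:41:44,459", fmt);

        // The ",SSS" part of the pattern is what carries the 459 milliseconds;
        // a pattern without it cannot reproduce the subseconds in the parsed value.
        System.out.println(ts); // prints 2022-01-19T14:41:44.459
    }
}
```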
I tried to open the provided line.
This pattern really doesn't work. There is a bug in parsing the `…p {%X{username} %X{client} %X{location}} %t [%c]: %m%n` pattern.
The 1.0.3 release contains a fix for parsing this pattern.
I will update my version and configuration and test the new release.
My automated script fails due to a new versioning format: I run your tool on a few servers configured with Ansible for deployment/configuration.
I have tested version 1.0.3 and it looks great! The auto format detection seems to work now :)
Some of my logs contain stacktraces as long as 65 rows. When this happens in a logfile, log-viewer cannot correctly identify the next log entry and returns a "No date field, log cannot be merged" error. When I remove those log entries from the log, it functions correctly again. I assume this is due to a maximum number of rows an entry is allowed to take. Maybe this could be added as a config variable?
Example:
2022-01-19 14:41:44,459 DEBUG { } http-nio-XXX-7070-exec-9 [XXX.DefaultTokenService]: Fout bij verifieren token.
java.security.SignatureException: token expired at X, now is 202X44.459Z
at net.oauth.jsontoken.JsonTokenParser.verifyAndDeserialize(JsonTokenParser.java:132) ~[jsontoken-1.0.jar!/:?]
..... 63 rows
2022-01-19 14:41:44,461 ERROR { } X [XTokenRestController]: Foutcode: X
Thanks, and I like your tool!