04-10-2017 05:10 AM
Hi there,
I am trying to understand the pattern matching system used for grok patterns and multi-line matching.
I am running a Java app which uses log4j for logging, configured to use the console appender. The console output is being piped to syslog using logger.
Run command:
java -jar /home/target/app.jar 2>&1 | logger -t app &
Example of my logs in /var/log/messages:
1. Apr 10 11:49:02 e0c50fc64857 app: [INFO] com.example.service.App [main]: This is a info message {correlation-id: 76230e6a-8351-44d8-8dfb-94d039772c29}
2. Apr 10 11:49:02 e0c50fc64857 app: [ERROR] com.example.service.App [main]: This is a error message {correlation-id: 76230e6a-8351-44d8-8dfb-94d039772c29}
3. Apr 10 11:49:02 e0c50fc64857 app: [WARN] com.example.service.App [main]: This is a warn message {correlation-id: 76230e6a-8351-44d8-8dfb-94d039772c29}
4. Apr 10 11:49:03 e0c50fc64857 app: java.rmi.UnexpectedException: Uncaught Exception
5. Apr 10 11:49:03 e0c50fc64857 app: #011at com.example.service.App.throwUncaughtException(App.java:37)
6. Apr 10 11:49:03 e0c50fc64857 app: #011at com.example.service.App.main(App.java:17)
7. Apr 10 11:49:05 e0c50fc64857 app: Exception in thread "Thread-1" java.lang.RuntimeException: I failed to execute something at runtime
8. Apr 10 11:49:05 e0c50fc64857 app: #011at com.example.service.App.lambda$main$0(App.java:20)
9. Apr 10 11:49:05 e0c50fc64857 app: #011at java.lang.Thread.run(Thread.java:745)
Job file being used (I only expect lines 1 to 3 to be logged to AppD):
version: 2
enabled: true
source:
  type: file   # I'm using file because syslog was not working for me
  path: /var/log
  nameGlob: messages
  startAtEnd: false
fields:
  sourceType: syslog
  applicationname: services
  applicationtier: logger
grok:
  patterns:
    - '%{SYSLOGBASE2:syslogbase} \[%{LOGLEVEL:logLevel}\] %{JAVACLASS:class} \[%{DATA:threadName}\]: %{DATA:logMessage} \{correlation-id: %{UUID:correlationId}\}'
Version information:
buildName=113-4.2.1.next-build
buildHash=b3b8138e547e0d2dd1cee3f6b3ef9ab6e14fdf33
buildTimestamp=2016-04-26T02:31:06+0000
buildNumber=12590
jobName=build-machine-agent
version=4.2.1.8
Question 1
When I use the aforementioned job file, I get all of the logged lines in AppD. I was expecting to see only the lines that match the grok pattern (lines 1 to 3). Is the default behaviour to log everything, even lines that do not match?
Question 2
If I add the following multiline config to grab the RuntimeException, then I get an entry with the items specifically relating to the exception (lines 7 to 9).
multiline:
  regex: '^Apr 10 \d{2}:\d{2}:\d{2} [a-zA-Z0-9]* app: Exception in thread .*$'
However, the remaining lines (1 to 6) come back as a single log entry in AppD. Does that mean the job files don't support single-line and multi-line logs simultaneously?
Kind Regards
04-11-2017 03:29 AM
Log entries that do not match the grok pattern will still be sent to the events service in their raw format.
Having different log formats with different timestamps in one log model is not recommended.
If such a model exists, one way to solve this issue is to come up with a grok pattern that strips off the common prefix and uses %{GREEDYDATA:logmessage} for the remaining text. Then you can use the regex feature to extract those fields through our UI. Hence it is by design that all log messages read and processed by the analytics agent are published to the events service.
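As an illustration of that suggestion (my own sketch, not a config posted in this thread; the field name logmessage is an assumption), a pattern along these lines would keep the shared syslog prefix structured and leave everything after it in a single field for later extraction in the UI:

grok:
  patterns:
    # Sketch: capture the common syslog prefix, treat the rest of the line as free text
    - '%{SYSLOGBASE2:syslogbase} %{GREEDYDATA:logmessage}'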
For the second part: if you configure the multiline property, it takes effect on the complete job. It is a way for the analytics agent to understand where the next record begins so that it can group the rows in the log properly. It will not apply the setting selectively. To configure it properly, I would recommend the documentation below:
https://docs.appdynamics.com/display/PRO43/Configure+Log+Analytics+Using+Job+Files
04-11-2017 03:51 AM
Log entries that do not match the grok pattern will still be sent to the events service in their raw format.
- This is good to know, and I am not sure it is clearly stated in the documentation you referred me to. The documentation in general is very sparse.
A log model having different log formats with different timestamps is not recommended at all.
- But it is very likely. I could be using a structured logger for all developer-controlled actions, but the moment I have an unhandled exception it will be in a different format, as per my example.
If such a model exists, one way to solve this issue is to come up with a grok pattern to strip off the common stuff
- In order to collect the exceptions, I need to use the multiline property, which then bundles all my other single-line entries into a single blob (see the following section).
Then you can use the regex feature to extract those fields through our UI
- I can't see any functionality in the Analytics UI which allows me to do this.
Regarding the second part:
According to the documentation:
Whenever the Log tailer sees the matching pattern at the start of the line, it starts a new match and passes the previously collected data as one log record. Each matching pattern identifies when to start accumulating the log lines in a buffer until the matching pattern is found again at the beginning of another record.
That would suggest that I can collect Java stacktraces as a single entry and all other single line logs, between exceptions, as a single blob. I do not understand how that is useful.
All I am really looking for is a single entry in the AppD Analytics logs for each single-line log and a single entry for each exception stack trace.
04-24-2017 05:50 AM
I managed to create a pattern which matches the logs as expected at http://grokconstructor.appspot.com/do/match; however, when I use the same pattern in the AppD job file, it globs the entire list of log entries into a single entry on the AppD console.
I have logged a support request, which AppD are currently looking into.
The pattern, for those interested, is:
multiline:
  regex: '^[a-zA-Z]+ \d{2} \d{2}:\d{2}:\d{2} (\w)+ app: (#|E)'
The documentation was not that clear to me, so I'll try to put it in different words. The multiline regex (from my experimentation) marks the start of a line that indicates you are already inside a multiline entry. Put differently, it matches the start of the 2nd line of a multiline statement.
For example, if the multiline log entry is:
[INFO] This
# is
# a
# multiline
Then the multiline regex should be ^# .* and not ^\[INFO\] .*\n#
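In job-file terms, that interpretation would correspond to something like the sketch below (my own illustration, not a snippet posted in the thread; as the next reply shows, this reading turns out to describe grokconstructor rather than the AppD agent):

multiline:
  # Matches only the continuation lines, i.e. from the 2nd line of the entry onwards
  regex: '^# .*'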
04-25-2017 02:20 AM
Finally got it!!! Thanks to AppD for the regex
multiline: regex: "^[A-z][a-z]{2} \\d{2} \\d{2}:\\d{2}:\\d{2} [a-zA-Z0-9]* app: [^#].*$"
My earlier understanding was incorrect; it specifically applies to grokconstructor and not to AppD's grokking.
So to summarise why this works: with the AppD multiline regex you define the start of each unique line. In this case anything without a # is the start of a line, so each line that doesn't have a # becomes a unique entry, and each line with a # is tagged onto the preceding line without one, until another line without a # is seen. This is very different from the way grokconstructor works.
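Pulling the pieces of this thread together, a complete job file would look roughly like the sketch below. This is my own consolidation of the snippets above rather than a config posted in the thread, and the exact nesting of the multiline block should be checked against the job-file documentation linked earlier:

version: 2
enabled: true
source:
  type: file
  path: /var/log
  nameGlob: messages
  startAtEnd: false
  # Placement of multiline under source is an assumption; verify against the docs.
  multiline:
    # Every record starts with a syslog-prefixed line whose payload does NOT begin with '#'.
    # Stack-trace continuations arrive as '#011at ...', so they attach to the preceding record.
    regex: "^[A-z][a-z]{2} \\d{2} \\d{2}:\\d{2}:\\d{2} [a-zA-Z0-9]* app: [^#].*$"
fields:
  sourceType: syslog
  applicationname: services
  applicationtier: logger
grok:
  patterns:
    - '%{SYSLOGBASE2:syslogbase} \[%{LOGLEVEL:logLevel}\] %{JAVACLASS:class} \[%{DATA:threadName}\]: %{DATA:logMessage} \{correlation-id: %{UUID:correlationId}\}'

Based on the explanation above, lines 1 to 3 of the sample log would each become their own grok-parsed entry, while lines 4 to 6 and lines 7 to 9 would each be grouped into one entry per stack trace.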