
Java (Java Agent, Installation, JVM, and Controller Installation)

Calculating downtime of tier per month


Hi everyone! 


I'm currently trying to calculate the downtime of a tier for the last month.
By downtime I mean: a tier has no available agents reporting data, i.e. the metric "App > Agent > Availability" is 0. So if a tier has two nodes, both nodes must have no available agent for an interval to count as downtime. If one of the two nodes has an active agent, that interval should not contribute to the downtime calculation.


So far I use this equation: [(App > Agent > Availability sum ÷ (#minutes × total nodes)) × 100] %
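The equation above can be sketched as follows. The per-minute sample list here is made up for illustration; in practice it would come from the Controller (e.g. via the metric-data REST API), with each value being the number of nodes reporting in that minute:

```python
# Hypothetical per-minute samples of "App > Agent > Availability" for one tier.
# Each value = number of nodes reporting in that minute (0..total_nodes).
availability = [2, 2, 1, 0, 2]  # made-up data: 5-minute window, 2-node tier

total_nodes = 2
minutes = len(availability)

# The equation from the question: availability % over the window
availability_pct = sum(availability) / (minutes * total_nodes) * 100

# Tier downtime per the definition in the question:
# only minutes where NO node reported count as downtime.
downtime_minutes = sum(1 for v in availability if v == 0)
downtime_pct = downtime_minutes / minutes * 100

print(f"availability: {availability_pct:.1f}%")   # 70.0%
print(f"tier downtime: {downtime_pct:.1f}%")      # 20.0%
```

Note the two numbers differ: the summed metric counts node-minutes lost (one node down still lowers it), while tier downtime only counts minutes where every node was down, which is exactly why per-minute resolution matters.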


The problem I face is that when I choose "Last month", the data resolution is rolled up to 1-hour intervals. Suppose the tier has two nodes and the App > Agent > Availability sum is 110 for a rolled-up 1-hour window (two nodes 100% available would give 120). I can't tell whether one node was down for 10 minutes, both nodes were down for 5 minutes each, or whether they were down at the same time.
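To illustrate the ambiguity, here is a small sketch (made-up per-minute data) of two schedules that both sum to 110 node-minutes over an hour, yet yield different tier downtime:

```python
# Two hypothetical per-minute schedules for a 2-node tier over one hour.
# 1 = node's agent reporting that minute, 0 = not reporting.

# Schedule A: one node down for 10 minutes, the other always up.
node_a = [1] * 50 + [0] * 10
node_b = [1] * 60

# Schedule B: both nodes down during the same 5 minutes.
node_c = [1] * 55 + [0] * 5
node_d = [1] * 55 + [0] * 5

def hourly_sum(*nodes):
    # What a 1-hour rollup of the summed availability metric would show.
    return sum(sum(n) for n in nodes)

def tier_downtime(*nodes):
    # Minutes where EVERY node is down (the question's definition of downtime).
    return sum(1 for samples in zip(*nodes) if all(v == 0 for v in samples))

print(hourly_sum(node_a, node_b), tier_downtime(node_a, node_b))  # 110 0
print(hourly_sum(node_c, node_d), tier_downtime(node_c, node_d))  # 110 5
```

Both schedules roll up to the same hourly sum of 110, but schedule A has zero tier downtime while schedule B has 5 minutes, so the hourly rollup alone cannot distinguish them.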


Does anyone know of a solution to this, or have some pointers for me?


Best Regards, 




