Have you received an email notification about an employee badge? Please note, this is in error. The employee badge is triggered by a set of actions and unfortunately it was going out to all community members and not just AppDynamics employees. We have subsequently turned off the badge.
If your account had previously met the criteria for the badge (before the badge was deleted), the final act of logging in completed the evaluation and sent the badge out to you. This is why you may still have received the email notification.
In any case, no action is required on your part. Please ignore the email about the employee badge; you will not actually receive the employee badge unless you have an AppDynamics email account.
Sorry for the confusion and needless email. We are working to resolve the issue moving forward.
Business iQ - Transaction Analytics Webinar [Air-Date February 21, 2018] Run Time: 56 minutes.
@Mohammed.Rayan led this introductory transaction analytics session. During this session Mohammed also demonstrated the capabilities of transaction analytics and shed light on the art-of-the-possible with Business iQ. See below for attached slide deck and additional resource links.
If you have trouble viewing the recording - please try to Stream from Webex.
We're listening, please take our post-webinar survey: https://survey.clicktools.com/app/survey/go.jsp?iv=1y416hk5g68xu
GitHub link, Self-Paced Demo: https://github.com/Appdynamics/AD-Capital-Docker
Self-paced tutorials explaining Business iQ central concepts and capabilities: https://www.appdynamics.com/university/biq-accelerator/
Business iQ Community Resources:
https://community.appdynamics.com/t5/Knowledge-Base/How-to-implement-Business-iQ-for-AppDynamics/ta-p/29140
https://goo.gl/sV1rdT
Webinar Q&A Transcript
Table of Contents
How do I enable Transaction Analytics for an application?
How can I run a query and view Transaction Analytics?
Where can I find a demo application for Transaction Analytics with Docker support?
What are best practices for enabling Business Transaction collections for analytics?
How do SQL Data Collectors work? Are they strictly for transaction data stored in the database (limited to just transactions) or do they access other event types?
Q: How do I enable Transaction Analytics for an application?
A: Within the Controller, go to the Analytics tab at the top and select the Configuration tab from the left-hand navigation bar. In the Transaction Analytics tab, select the application of interest from the Configure Analytics For Application drop-down menu. Check the box to Enable Analytics Data Collection For XXXX, then select the Business Transactions that will report analytics data. You can add and configure Method Invocation Data Collectors to capture custom data. Specify the data you would like to retrieve and display in Transaction Analytics by extracting a particular parameter from your code. You can enable HTTP Request Data Collectors, as well.
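Once collection is enabled, a quick way to confirm that data is flowing is a simple count query in the Analytics Searches view. A minimal sketch in ADQL — the application name below is a placeholder, and exact field names can vary by version, so consult the ADQL Reference:

```sql
SELECT count(*)
FROM transactions
WHERE application = "MyApp"
```

A non-zero count indicates the selected Business Transactions are reporting analytics events.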
Q: How can I run a query and view Transaction Analytics?
A: To run a query, navigate to the Analytics tab and select the Searches tab. Click the +Add button and select either Drag and Drop Search or Query Language Search. In the Data view, you can customize your search criteria and filter by time frame and fields (e.g., Application or Business Transaction) to see raw data. Within the Visualization view, you can create and save a widget to your Dashboard. One helpful widget type is the Funnel widget. Additional properties were added as part of 4.4 to provide more control over the widget appearance.
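For reference, a Query Language Search uses ADQL, which is SQL-like. A hedged example that ranks Business Transactions by average response time (field names here are illustrative and may differ by version; check the ADQL Reference for your schema):

```sql
SELECT transactionName, avg(responseTime)
FROM transactions
ORDER BY avg(responseTime) DESC
LIMIT 10
```

The same query can be saved as a widget from the Visualization view.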
If you click the gear icon in the top-right corner of the Controller and select AppDynamics Agents and then the Analytics Agents tab, you can see the Transaction Pipeline Health and Log Pipeline Health. You can also sort by All Agents, Agents Publishing Log Data, and Agents Publishing Transaction Data, which can help you identify which agents are publishing the transaction data.
Q: Where can I find a demo application for Transaction Analytics with Docker support?
A: We recommend that everyone tries the Self-Paced Demo, which has instructions and is straightforward to use. All you need to do is install the Docker daemon and download the project.
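Getting the demo running generally looks like the following. The exact startup command is documented in the repository's README, so treat the last step as an assumption and follow the README if it differs:

```shell
# Clone the self-paced demo project
git clone https://github.com/Appdynamics/AD-Capital-Docker.git
cd AD-Capital-Docker

# Start the demo containers (assumes Docker and Docker Compose are installed;
# verify the startup command against the repo README)
docker-compose up -d
```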
Q: What are best practices for enabling Business Transaction collections for analytics?
A: As a best practice, we recommend you start by enabling just a few Business Transactions that you know well and matter most to see what the patterns and contents are and then slowly increase from there. Avoid enabling Business Transaction collections for analytics for everything initially, as it can be challenging and can cause you to run into other types of issues (e.g., connectivity or licensing issues).
Additionally, on the Business Transaction schema that we have, you may know that we have an All Other Traffic bucket. Many times what we find is that Business Transactions are enabled but in reality, that information is going into the All Other Traffic bucket because you are limited in terms of Business Transaction numbers per node or per application. That’s why being iterative here with your Business Transaction collections makes a lot more sense and will be useful for you in learning the flow as you enable them. Start with one Business Transaction that you know really well and incrementally go from there.
Q: How do SQL Data Collectors work? Are they strictly for transaction data stored in the database (limited to just transactions) or do they access other event types?
A: It pertains only to transaction collection. SQL Data Collectors will show you the parameter value as part of that transaction, so in the analytics data set you’ll actually see the value that was passed to that particular SQL call. That’s what this will allow you to do.
It doesn’t send any other event types. SQL Data Collectors look at the SQL database calls made by your Business Transactions, and from the parameters of those SQL calls we capture specific custom or business information.
Sorry to hear of your frustration. Please only share non-sensitive company information/screenshots in the community. If you have sensitive information to share please open a support ticket and send files over secure support channels.
FYI, you may also exchange private messages with Yogesh as needed via the community.
To send a private message to Yogesh, hover over his name in the thread and click on 'send message'.
Lastly, as you uncover helpful tips, workarounds, or solutions, please share them with the community. We are striving to be a closed-loop community, and responsiveness is important to us. Thank you for your diligence and for raising the question.
@Radhika.Puthiyetath crafts the information experience for the Business iQ product lines at AppDynamics (part of Cisco). Radhika was the Release Notes Lead for the Kubernetes 1.8 release. Prior to Kubernetes, she was actively involved in the Apache Software Foundation, GNOME, and openSUSE.
The term “Unity in Diversity” refers to a state of oneness despite the presence of diversity of all kinds. That oneness ensures the presence of, participation by, and respect for people of different interests, backgrounds, and experiences. An empathetic and diverse community fosters inclusiveness. Open Source (OS) communities in general uphold their commitment to this philosophy. Among OS communities, Kubernetes is probably the second largest of all time in terms of code contribution and participation. Contributors belong to different organizations with diverse interests, dabbling in different languages, focusing on different components, coexisting in harmony, and contributing to a common goal: Kubernetes. Reflecting its sheer size and impact, the Kubernetes community has gone many steps further in championing diversity and inclusion in almost all aspects of its community ecosystem. Read more...
By now you’ve certainly heard of – or perhaps been impacted by – Meltdown and Spectre, two newly discovered vulnerabilities that affect nearly every modern processor. If not, you might want to take a moment and visit Meltdownattack.com for a good overview, and Ars Technica for some good examples of how the vulnerabilities affect processors.
Read more of the AppDynamics blog>>
Tales from the Field: Spectre, Meltdown, and Patching Performance
Business iQ - Troubleshooting and using Browser Analytics Webinar [Air-Date January 17, 2018]
@Gaurav.Soni leads this webinar.
Calling all users and owners of AppD Analytics licenses. We are starting a Business iQ Ask the Expert Webinar Series just for you. This webinar is intended to provide the required knowledge, instruction and guidance to help you become more successful in bridging the gap between the app and the business. The goal for this webinar series is to help as you are planning, installing, configuring and customizing dashboards. We will also cover best practices to improve your day-to-day activities with AppD Analytics and discuss how to get started utilizing specific features.
*If you have trouble viewing the recording - please try to stream from Webex.
Webinar Q&A Transcript
Table of Contents:
How can I analyze the worst performing pages by average end-user response time while excluding pages that have very low traffic?
How do we access Business Outcomes and is this a new feature of 4.4?
How is a metric created after a minute?
Can an ADQL metric query or business metric refer to another defined ADQL-based metric or can it only be defined on the underlying event data?
How does the data age over time and for how long can I see browser analytics data?
How is the data set powered and are there limits on the data set?
Q: How can I analyze the worst performing pages by average end-user response time while excluding pages that have very low traffic?
A: There are two streams by which you can visualize the End-User Monitoring (EUM) or browser-related data. One stream is available through the Controller via Metrics, where you can go to Pages & AJAX Requests and see individual page registrations and average response times.
The second stream, which is more pertinent for this scenario, is the analytics stream that is explained on the “In Action” slide of the presentation (Slide 3 at 02:35). The Events Service piece, which is the platform that supports the analytics data set, contains raw data. You can write a query that looks at average response times for a given window of time and tell it to exclude pages with fewer than X page counts. You can write advanced queries that are only limited by what ADQL offers you.
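A hedged sketch of such a query against browser data: the event type and field names below are illustrative and vary by version, and the HAVING-style filter assumes your ADQL version supports it, so consult the ADQL Reference before relying on this shape:

```sql
SELECT pagename, avg(metrics.`End User Response Time (ms)`), count(*)
FROM browser_records
HAVING count(*) > 100
ORDER BY avg(metrics.`End User Response Time (ms)`) DESC
```

The count threshold (100 here) is the knob that excludes low-traffic pages from the ranking.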
Q: How do we access Business Outcomes and is this a new feature of 4.4?
A: Yes, this feature was introduced in 4.4. If your controller is hosted on SaaS, Business Outcomes is enabled by default as part of your analytics entitlement. If you are on-premise, Business Outcomes is disabled by default but can be enabled from your Events Service.
Q: How is a metric created after a minute?
A: Metric calculations work the same whether they are APM-triggered or EUM-triggered. For the purposes of browser data, you can visualize the EUM collector as one giant agent. It will look at calculations of average response time for beacons, which are user-triggered data captures that happen for a given controller minute. It compiles and aggregates the data for a given controller minute and performs various metric calculations (including average), after which the controller fetches the data.
That is one of the limitations in the Metric Browser. It is helpful for trending and quick troubleshooting scenarios but is less helpful when trying to narrow in on a particular user’s experience. This is where the Business iQ or Browser Analytics piece is beneficial, because the entire data set is made available to run queries. As a best practice, use a combination of both when investigating and looking at browser data. People tend to use the Metric Browser and the core Controller visualization as the starting point for standardized queries with respect to performance, then dive into the ADQL or dashboard-based approach on the Business iQ side to complete the picture.
Q: Can an ADQL metric query or business metric refer to another defined ADQL-based metric or can it only be defined on the underlying event data?
A: You cannot nest analytics metrics at this time. For the vast majority of cases, the current Metric Browser gives you adequate capability. If you are having to create nested metrics, consider whether you can visualize the result via the Dashboard instead. The reason behind this is that you could run a query for a metric over the last 24 hours, but if you introduce any sub-selections, it becomes a fairly expensive operation on the back-end and can slow down the whole metric calculation model.
Q: How does the data age over time and for how long can I see browser analytics data?
A: The data is time-limited to eight days. Historically, Web Analyze was a lightweight analytics functionality for browser data sets. One of the reasons we built it is that we wanted users to ask questions about their data. Many customers could set metrics and relevant alerting, but those who were new to EUM and browser monitoring didn’t know what questions to ask. One of the ways we could help was by making the data set available (Web Analyze), which is the exact raw data capture that EUM creates and pushes into the Events Service. By default, you can see the last week’s worth of data, though some of the advanced querying techniques are only available through Browser Analytics. The exposed data set should help you ask questions that you didn’t know to ask. If you do not have a Business iQ license or entitlement but you do use EUM, you can still see your data set in Web Analyze. If you find value in that data, talk to your Sales rep about exploring Business iQ’s enriched capabilities.
Q: How is the data set powered and are there limits on the data set?
A: The engine that powers the analytics data set, an Elasticsearch-based data store, is common across both SaaS and on-premises deployments. Because of backend Elasticsearch limitations, we also have some soft thresholds on how you can pull data and how much data you can pull. The API is evolving with your needs, but there are certain restrictions on how many result buckets can make their way through to a user. If there are more than 2,000, we will limit it and the query will not run.
Depending on the type of query being issued, there are different default and max limits that come into play. These limits are outlined in our public documentation. We’re continuously looking at expanding those limits. We’ve heard your feedback and are looking at expanding the number of rows you can export from the UI and the number of rows that we show when you issue different queries.