Run time: 60 minutes
@Paul.Saivetz, Solution Consultant at AppDynamics, demonstrated how to create a BiQ segmentation dashboard step-by-step in the app.
To get you started, this session covered the following topics:
• Highlighting the most common BiQ use cases
• Preparing requirements for BiQ use cases
• Creating a segmentation dashboard
Resource links shared during the session:
Apply to Participate in the Transaction Analytics Beta Program: Agentless Analytics on SaaS
Get Started, Business iQ Tutorials
Business Health Monitoring [Session Available for Playback]
User Journeys Use Case [Session Available for Playback]
A: Yes, this is typically handled by having a dashboard per role. You would create a set of dashboards for one business unit (role) and another set of dashboards for another business unit. To clarify, it's not one dashboard in which the widgets change depending on your role. You'll have to create a separate dashboard for each business unit.
A: Yes, you can.
A: Yes, you can do this for any metric. The Drag and Drop widgets default to showing you the 10 slowest or 10 highest count, but that's configurable. You also have the option within ADQL to say "select business transaction, average response time, order by DESC." There is a LIMIT clause you can use to limit to the top 10 transactions.
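As a rough sketch of that query (the field names `transactionName` and `transactionTime` are illustrative assumptions; the exact names depend on your Analytics schema), the ADQL might look like:

```sql
SELECT transactionName, avg(transactionTime)
FROM transactions
ORDER BY avg(transactionTime) DESC
LIMIT 10
```

The time range is typically applied from the widget's time picker rather than written into the query itself.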
A: That’s a matter of working with your business and developers to come up with an algorithm that associates value with what you’re doing. It's really a business question.
Here's an option we've seen: from the UI, let's say I'm paying a credit card statement. I fill in the credit card information, and when I submit that transaction, the payment object is passed to the server side (or you may be able to capture it on the UI side). You can find that payment object in the Java code, for example, and set up a data collector on the payment code so that every time the transaction is called, the payment amount is captured in Analytics. You can then use ADQL to get the sum of the payment field for the past 1 day to get a sense of revenue. Technically, you can capture and show those values.
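A minimal sketch of that revenue query, assuming you've configured a data collector that surfaces the payment amount as an Analytics field (the name `paymentAmount` below is hypothetical):

```sql
SELECT sum(paymentAmount)
FROM transactions
```

With the widget or dashboard time range set to the last 1 day, this returns the total payment value captured over that period.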
A: That depends. In this demo, the agents were installed on an application called the "AppDynamics Travel Agency" that uses those services. The "AppDynamics Travel Agency" has permission to use data from those providers, so that concern doesn't apply in this situation: the information is being used by the travel agency app and isn't being extracted directly from Kayak, Travelocity, etc.
Here's another way to look at the information we're capturing: think of it the same way as capturing snapshot data. When you configure data collectors to be captured with snapshots, Analytics captures those same data collectors. If you flag those data collectors and push them to Analytics, we'll capture that data for every single transaction. That's the big difference between snapshot data and analytics data: they use the same data collectors, but snapshot data is only collected when you cross a threshold, while analytics data is captured for every single call. We can collect the data as it passes through your application code. In the demo, the travel agency is just a field/parameter being passed into those transactions, and we're collecting data on which travel agency is making a call to those transactions.
A: This is by far the most complex part of the implementation. It's really important to inform your AppD Account Reps early in the process. When you're reaching out to developers and you have a complex idea and code, get your Reps involved so they can provide assistance. You can also reach out to the Support team at firstname.lastname@example.org if you set up data collectors and aren't seeing any data. Additionally, here is our documentation on the process:
A: Yes, AppDynamics has an integration with Splunk. You can learn more about that in our docs: Integrate AppDynamics with Splunk
That being said, the Splunk use case and AppDynamics use case are very similar. Splunk is just capturing the data from your logs, while AppDynamics is capturing it from the actual code. We capture data in real time. Splunk pulls from the logs in its version of real time, but it's actually after the fact. Since we're capturing the data within the code execution, we can tie it back to performance much more easily than Splunk can.
A: You can use a tool called the Config Exporter, which allows you to export configurations from one environment to another, including dashboards. Contact your Sales Engineer to obtain the tool.
Since it's ADQL, there's also another option. If you're on separate Controllers (let's say your UAT Controller and Production Controller point to separate applications), as long as the data collectors and the application name are set up the same on both Controllers, you can export the dashboard as a JSON. If you look at the JSON file, you'll see the ADQL queries. You can then import it into the other Controller. As long as the ADQL has the exact same data collector or field names, it's just a matter of exporting and importing. If the names don't match, the query will break.