There is increasing demand for APM data in our common data reservoir.
Currently we extract data from AppDynamics via the REST API, transform it into a meaningful format, and then publish it to the data reservoir.
Has anyone tried a specific Kafka connector to stream AppDynamics data to other destinations?
If so, I would appreciate it if you could share the details here.
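For reference, the extract-transform-publish flow described above can be sketched roughly as below. This is a minimal sketch, not an official integration: the controller URL, application name, metric path, and topic name are all placeholders, authentication is omitted, and the Kafka publishing side (which would typically use a client such as kafka-python) is shown only in comments to keep the sketch dependency-free.

```python
"""Sketch: pull metric data from the AppDynamics metric-data REST endpoint,
flatten it, and hand it to a downstream publisher. Placeholders throughout."""
import json
import urllib.parse
import urllib.request

CONTROLLER = "https://example.saas.appdynamics.com"  # placeholder controller URL
APP = "MyApp"                                        # placeholder application name

def fetch_metrics(metric_path, duration_mins=5):
    """Query the metric-data endpoint for the last few minutes of data points."""
    query = urllib.parse.urlencode({
        "metric-path": metric_path,
        "time-range-type": "BEFORE_NOW",
        "duration-in-mins": duration_mins,
        "output": "JSON",
    })
    url = (f"{CONTROLLER}/controller/rest/applications/"
           f"{urllib.parse.quote(APP)}/metric-data?{query}")
    with urllib.request.urlopen(url) as resp:  # add auth headers in practice
        return json.load(resp)

def flatten(payload):
    """Turn the nested AppD metric payload into flat per-datapoint records."""
    records = []
    for metric in payload:
        for point in metric.get("metricValues", []):
            records.append({
                "metricPath": metric.get("metricPath"),
                "startTimeInMillis": point.get("startTimeInMillis"),
                "value": point.get("value"),
            })
    return records

# Publishing side (requires kafka-python, not stdlib; names are placeholders):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092",
#                          value_serializer=lambda v: json.dumps(v).encode())
# for rec in flatten(fetch_metrics("Overall Application Performance|*")):
#     producer.send("appd-metrics", rec)
# producer.flush()
```

A dedicated Kafka Connect source connector would replace this polling script with managed offsets and restarts, which is presumably the appeal of finding one.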
I'm very sorry that I haven't gotten back to you yet; I was hoping to have a little more information.
I will reach out to the internal AppD team to see if we know of anyone who has tried this. I am personally unaware of it being tried. Let me do some more investigation.
What kind of data are you extracting with your scripts? Metric data? Events? Which ones?
Are you aware of the https://github.com/Appdynamics/AppDynamics.DEXTER/wiki toolkit? It extracts a lot of data you can't ordinarily get to and delivers it in data warehouse format, ready to go into your data lake.
Thanks for your response @Daniel.Odievich
I am subscribing to metric data (we already get event data from the AppD-Netcool integration, and the Netcool data is already onboarded to our data lake).
Yes, I am aware of it, but since I have a constraint on external dependencies (from an organizational security point of view), I cannot use that product in production. Moreover, I felt DEXTER is useful for extracting data and presenting detailed output, but my requirement is not just extraction but data publishing as well. That is where I believe a product/utility like a Kafka connector for AppDynamics would be of much help in meeting the requirements.
I hope you understand my pain point; I am looking for someone who has already gone through this.
I also have a similar requirement to extract AppD data and publish it into a data lake.
Currently, I am trying to use the DEXTER CSVs to transform and upload to the data lake.
But I don't know how to make it real time.
Were you able to do it in real time, or has there been any progress since your last query?
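Since DEXTER produces batch CSV extracts, "real time" in practice usually means micro-batching on a short polling interval. A minimal sketch of that idea is below; the directory layout is an assumption, and `handle_row` stands in for whatever actually uploads a record to your data lake (or publishes it to Kafka).

```python
"""Sketch: micro-batch DEXTER CSV output downstream on a polling interval.
The directory and the handler are placeholders, not DEXTER specifics."""
import csv
import time
from pathlib import Path

def read_rows(csv_path):
    """Yield each CSV row as a dict keyed by the header row."""
    with open(csv_path, newline="", encoding="utf-8") as fh:
        yield from csv.DictReader(fh)

def poll_directory(directory, handle_row, interval_secs=60):
    """Repeatedly scan `directory` for unseen CSVs and push their rows downstream.

    `handle_row` is hypothetical: plug in your data-lake upload or
    Kafka-producer call here. Files are processed at most once per run.
    """
    seen = set()
    while True:
        for path in sorted(Path(directory).glob("*.csv")):
            if path not in seen:
                for row in read_rows(path):
                    handle_row(row)
                seen.add(path)
        time.sleep(interval_secs)
```

This only approximates real time: latency is bounded below by how often you rerun the DEXTER extract plus the polling interval, so for genuinely streaming delivery the REST-API-to-Kafka route discussed earlier in the thread is likely the better fit.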