Has anyone used Kafka Connect to Publish AppDynamics Data to Other Systems?

Vaibhav Vir.Singh
Architect

Experts,

There is an increasing demand for APM data in our common data reservoir.

Currently we are extracting data from AppDynamics via the REST API, transforming it into a meaningful format, and then publishing it to the data reservoir.

 

But has anyone tried a specific Kafka connector to stream AppDynamics data to other systems?

If yes, I would appreciate it if you could share the details here.

 

Regards,

Vaibhav

12 REPLIES

David.Hope
AppDynamics Team

Hi Vaibhav,

 

You are currently extracting data from AppDynamics via our REST API.

 

Did you want to extract from the REST API into Kafka, or directly out of AppDynamics via Kafka? We don't currently support any method other than the REST API.

 

Kind Regards,

 

David Hope - APM Architect

 

Thanks, David, for your response.

I am referring to REST API -> Kafka, and asking whether anyone has used Kafka Connect or NiFi for the integration.

Currently I am using a Python script to extract data via the REST API, transforming the data, and then publishing it to my data lake via an AMPS bus and, going forward, to Kafka (in simple terms, using a POST API to push the data).

So I was wondering whether any other option would be more feasible and scalable for achieving the same.
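For context, here is a stripped-down sketch of what the current script does. The controller URL, credentials, metric path, and topic name are placeholders, and it assumes the standard Controller metric-data REST endpoint and the kafka-python client:

```python
# Minimal sketch: pull one AppDynamics metric over REST and publish it to Kafka.
# All connection details below are placeholders.
import json

import requests
from kafka import KafkaProducer

CONTROLLER = "https://mycontroller.saas.appdynamics.com"  # placeholder
APP_NAME = "MyApplication"                                # placeholder
METRIC_PATH = "Overall Application Performance|Average Response Time (ms)"

producer = KafkaProducer(
    bootstrap_servers=["kafka-broker:9092"],              # placeholder
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Fetch the last 15 minutes of metric data as JSON from the Controller.
resp = requests.get(
    f"{CONTROLLER}/controller/rest/applications/{APP_NAME}/metric-data",
    params={
        "metric-path": METRIC_PATH,
        "time-range-type": "BEFORE_NOW",
        "duration-in-mins": 15,
        "output": "JSON",
    },
    auth=("apiuser@customer1", "secret"),                 # user@account format
)
resp.raise_for_status()

# Publish each metric value as its own Kafka message.
for metric in resp.json():
    for value in metric.get("metricValues", []):
        producer.send("appdynamics-metrics", {
            "metricPath": metric.get("metricPath"),
            "startTimeInMillis": value.get("startTimeInMillis"),
            "value": value.get("value"),
        })

producer.flush()
```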

Vaibhav Vir.Singh
Architect

Hello,

Has anyone tried the above by any means?

Hi Vaibhav,

 


I'm very sorry that I haven't gotten back to you yet; I was hoping to have a little more information.

 

I will reach out to the internal AppD team to see if we know of anyone who has tried this. I am personally unaware of this being tried. Let me do some more investigation.

 

Happy Holidays.

 

Kind Regards,

 

David

Vaibhav,

What kind of data are you exfiltrating with your scripts? Metric data? Events? Which ones?

 

Are you aware of the https://github.com/Appdynamics/AppDynamics.DEXTER/wiki toolkit? It wraps a lot of things you can't ordinarily get into and gives them to you in data warehouse format, ready to go into your data lake.

 




Thanks for your response, @Daniel.Odievich.

I am subscribing to metric data (event data we are already getting from the AppD-Netcool integration, and the Netcool data is already onboarded to our data lake).

 

 

Yes, I am aware of it, but as I have a constraint on using external dependencies (from an organizational security point of view), I cannot use that product in production. Moreover, I felt DEXTER is useful for extracting data and presenting detailed output, but my requirement is not just extraction but data publishing as well, where I believe a product/utility like a Kafka connector for AppDynamics would be of much help in meeting the requirements (a rough illustration of what I have in mind is below).
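To illustrate what I have in mind on the Connect side: a source connector registered through the Kafka Connect REST API, roughly as below. The connector class and its settings are placeholders only; I am not aware of an official AppDynamics source connector, which is exactly what I am asking about:

```python
# Hypothetical example: registering a polling source connector via the Kafka
# Connect REST API (port 8083 by default). The connector class and its config
# keys are placeholders, not a real AppDynamics connector.
import requests

connector_config = {
    "name": "appdynamics-metrics-source",                      # placeholder name
    "config": {
        "connector.class": "com.example.HttpSourceConnector",  # hypothetical class
        "tasks.max": "1",
        "http.request.url": "https://mycontroller.saas.appdynamics.com"
                            "/controller/rest/applications/MyApplication/metric-data",
        "kafka.topic": "appdynamics-metrics",
        "poll.interval.ms": "60000",
    },
}

resp = requests.post("http://kafka-connect:8083/connectors", json=connector_config)
resp.raise_for_status()
print(resp.json())
```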

 

Hope you understand my pain point; I am looking for someone who has already gone through this.

 

Regards,

Vaibhav

 

I am also extracting data with Python scripts. If anyone has anything on this, please let me know.

Any thoughts on this?

Hi Vaibhav,

 

I also have a similar requirement to extract AppD data and publish it to a data lake.

Currently, I am trying to use the DEXTER CSVs to transform and upload to the data lake; roughly, the flow looks like the sketch below.
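(This is only a sketch of the current batch hand-off. The file path, column layout, topic name, and the use of Kafka as the hand-off to the lake are all assumptions on my side.)

```python
# Sketch: read a DEXTER-exported CSV and forward each row to a Kafka topic.
# Path, topic, and the assumption that Kafka feeds the lake are placeholders.
import csv
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["kafka-broker:9092"],              # placeholder
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

with open("dexter-output/metrics.csv", newline="") as f:  # placeholder path
    for row in csv.DictReader(f):
        producer.send("appdynamics-dexter-metrics", row)  # placeholder topic

producer.flush()
```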

 

But I don't know how to make it real time.

Were you able to do it in real time, or has there been any progress on this since your last query?

 

Please guide.