
Hi Team,

What is the standard way to document your pipelines, from Analysis -> Data Model -> Transformation -> Extraction?

Is there a holistic view in Celonis that shows these relationships in one shot?

Hi @ravi.kumar23,

 

Unfortunately, there is no single holistic view that presents this.

What can partially help is to generate an HTML documentation report by right-clicking an Analysis in Studio. There you can find all your definitions and data sources, which links your Analysis information to your data model.

To generate an overview of the Data Ingestion part, you can create a backup of your data jobs using PyCelonis (see: https://celonis.github.io/pycelonis/1.7.3/reference/celonis_api/event_collection/data_job/#celonis_api.event_collection.data_job.Transformation.backup_content). However, this is in JSON format, which is less readable than the HTML report.

Also, you have to make the link between both reports yourself. You can of course automate this by enriching the Python script (see the sketch below), but in my opinion that is quite time-consuming.
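A minimal sketch of that backup step with PyCelonis 1.x is shown below. The team URL, API token, and pool/job names are placeholders, and exact argument names and defaults may differ between PyCelonis versions, so check the linked reference:

```python
from pycelonis import get_celonis

# Connect to your EMS team (placeholder URL and API token).
celonis = get_celonis("https://<your-team>.celonis.cloud", api_token="<API_TOKEN>")

# Placeholder pool and data job names for illustration.
pool = celonis.pools.find("My Data Pool")
data_job = pool.data_jobs.find("My Data Job")

# Write a JSON backup of every transformation in the data job.
for transformation in data_job.transformations:
    transformation.backup_content()
```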

 

Hopefully this helps.

 

Kind regards,

@janpeter.van.d


Hi Ravi,

 

Once you enable monitoring in your data pool, you can use the standard data pipeline monitor to understand your Extraction, Transformation, and Model Load.

 

https://cardinalhealth-prod.us-1.celonis.cloud/store/ui/execution-apps/c1edcff8-42d7-4194-bddd-3975b04947c3

 

-Muthappan



@janpeter.van.d How do you normally monitor your Celonis EMS landscape? If holistic monitoring is not available, it is difficult to ensure data accuracy, and users often come back with issues that could have been caused by any failure from extraction through to the analyses / views.

We are trying to work on a holistic monitoring solution for Celonis projects, starting from the triggers created in SAP tables through to the analyses or views.



Hi Ravi,

 

Have you seen the new data pool layout in Celonis (see https://docs.celonis.com/en/transfer-data-between-data-pools.html#UUID-f0fadade-58df-c9c5-aba5-16ace495034a_id_TransferdatabetweenDataPools-ExportingDataConnections)? That might be something that can help you.

Next to that, you could indeed use the data pool monitoring analyses, as suggested by Muthappan (see Custom Data Pipeline Monitoring (celonis.com)).

Normally, these tools are sufficient for us, especially when notification triggers are set for the data pipeline monitoring analysis.

