
Hi all, I just wanted to know: is there any way to get an overview of the space consumed per data pool/workspace?

 

Currently, in the Data Consumption section of the Celonis Event Collection, we can see the data pool size, but it is not in a consolidated form, and it is a tedious task to add up all the table sizes every time we run the process data model.

 

Any help will be much appreciated.

Hi Rohit,

You can use the Custom Data Pipeline Monitoring feature. It comes with pre-built transformations, data models, and analyses that enable both aggregated and individual analyses of APC consumption. All details are in the Help Space.
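
To give an idea of the kind of consolidation those transformations enable, a per-pool aggregation could look roughly like the sketch below. This is only an illustration, not the shipped transformation, and the column names on the raw log table are assumptions on my side:

    -- Illustrative sketch: consolidate table-level consumption into one row per data pool.
    -- Assumption: the log table data_consumption_updated_events exposes a pool name
    -- and a size column; the actual column names in your team may differ.
    SELECT
        data_pool_name,
        SUM(size_in_gb) AS total_size_gb
    FROM data_consumption_updated_events
    GROUP BY data_pool_name
    ORDER BY total_size_gb DESC;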

 



Hi Svenja,

Thanks for the response, I will surely try this out.

 


Hi Rohit,

 

Did this setup work for you?

 

Best,

Kevin


Hi there,

I am trying this too, but I am stuck. I did all the steps under 'How to set it up > Enabling the Custom Data Pipeline Monitoring'. I then tried to execute the data jobs in the Monitoring Pool, but I got an error (Relation "data_consumption_updated_events" does not exist).

 

Am I supposed to run the jobs? And how can this be connected to the EMS analysis 'Data Pipeline & Consumption Monitor'?

 

All the best,

Saša

 

 



Hi Sasa,

 

The tables are created once the first logs are generated for them. The table data_consumption_updated_events is therefore only created with the first APC calculation for the team, which can take up to several days. Until then, the Data Job 'Data Consumption Monitoring' will fail. I hope the table got created in the meantime and you can start working with it.
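
If you want to check whether the table exists yet before triggering the job again, an ad-hoc statement along these lines should do it. This assumes the pool's data layer is Vertica-based and exposes the standard catalog view v_catalog.tables, which is an assumption on my side rather than part of the shipped data job:

    -- Returns 1 once data_consumption_updated_events has been created, 0 before that.
    SELECT COUNT(*) AS table_exists
    FROM v_catalog.tables
    WHERE table_name = 'data_consumption_updated_events';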

 

The package 'Data Pipeline & Consumption Monitor' from the EMS Store contains three analyses that build on the data models included in the Monitoring Pool.

 

Best,

Svenja



Thanks for the reply, Svenja. I got there in the end. I just needed to wait a day or so :)

