I'm curious, as we only have limited consumption in the instance and we're using a live data connection.

I can think of two approaches:

1. Schedule a job that executes a full load at a given time. That would mean setting it up with certain parameters before scheduling, so there is still manual work every now and then, and it also involves the source system.

2. Do it just within Celonis: is there a script that can be scheduled for execution and checks whether tables have exceeded the set scope (e.g. one year)? I understand this might have to be customized for different process blocks, but this kind of optimization would be really helpful, and I'd like to know if there's already a process/template being used. To make the idea concrete, see the sketch below.
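
Purely as a hypothetical illustration of the kind of check I mean (VBAK and its creation date ERDAT are placeholder names, and the exact functions would depend on the data job's SQL dialect):

--Count sales document headers older than the one-year scope; a scheduled job could then delete them.
SELECT COUNT(*) AS rows_out_of_scope
FROM VBAK
WHERE DATEDIFF(MM, ERDAT, CURRENT_DATE()) > 12;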

Hi @dianne.marie.jalos11,

Unfortunately, there is no script available to achieve what you describe. Such a script is normally developed in an implementation project and can be reused afterwards. Do you already have a snippet that you have developed and can share?

We can work on such an idea together in the Community!

Best,

Justin


Hi Dianne,

You can add a transformation job that drops older data. See an SAP example below that deletes data older than 24 months.

--Deleting data. Creating a rolling 24-month timeframe.
--BSAD: cleared customer line items (CPUDT = entry date of the accounting document)
DELETE FROM BSAD WHERE DATEDIFF(MM, CPUDT, CURRENT_DATE()) > 24;
--BSID: open customer line items (CPUDT = entry date of the accounting document)
DELETE FROM BSID WHERE DATEDIFF(MM, CPUDT, CURRENT_DATE()) > 24;
--JCDS: change documents for system/user statuses (UDATE = change date)
DELETE FROM JCDS WHERE DATEDIFF(MM, UDATE, CURRENT_DATE()) > 24;
--LIKP: delivery header (ERDAT = record creation date)
DELETE FROM LIKP WHERE DATEDIFF(MM, ERDAT, CURRENT_DATE()) > 24;

Before executing it, I'd suggest creating a temp table as a backup.
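
For instance, a minimal sketch of such a backup step for one of the tables above, assuming the same SQL dialect as the example (the name BSAD_BACKUP is just illustrative, and CREATE TABLE AS syntax varies by engine):

--Copy the rows that are about to be deleted into a backup table first.
CREATE TABLE BSAD_BACKUP AS
SELECT * FROM BSAD WHERE DATEDIFF(MM, CPUDT, CURRENT_DATE()) > 24;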

Best,

Gabriel

