Master Data Check, e.g. INCO1 (EKKO.INCO1 vs. TINC.INCO1)

Dear all,

I want to run a master data check: for certain fields, check whether they have been used within the last 3 years (we have 3 years of data in Celonis).

E.g. I want to mark all Incoterms from the master table (TINC) which are not used in EKKO.INCO1.

The first problem was joining the master table. We did it via
EKPO <-> T001 <-> TINC
with the case keys SID, BUKRS and MANDT. I don't know if there is a better way.

The idea was then to store the distinct values of EKKO.INCO1 in a static variable
like the following (there is no parent table between TINC & EKKO):

static_var = distinct EKKO.INCO1

CASE WHEN TINC.INCO1 IN <%=static_var%> THEN 'USED' ELSE 'NOT USED' END
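For illustration, the intended logic boils down to a set membership check; here is a plain-Python sketch with made-up sample values (not real data, just to show what the variable plus CASE WHEN should produce):

```python
# Hypothetical illustration: mark each Incoterm in the master table
# TINC as USED / NOT USED depending on whether it appears anywhere in
# EKKO.INCO1. The sample values below are invented.
tinc_inco1 = ["EXW", "FOB", "CIF", "DDP"]   # TINC.INCO1 (master data)
ekko_inco1 = ["FOB", "FOB", "CIF"]          # EKKO.INCO1 (3 years of POs)

used = set(ekko_inco1)                      # "distinct EKKO.INCO1"
status = {code: ("USED" if code in used else "NOT USED")
          for code in tinc_inco1}
# {'EXW': 'NOT USED', 'FOB': 'USED', 'CIF': 'USED', 'DDP': 'NOT USED'}
```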

Any ideas how to do this?
Thanks in advance :slight_smile:
Stefan

Dear @stefgwh,

thank you for your question!

This can only be done with a custom script, as this functionality has not been implemented yet. Note that we can only see whether a column is being used in Celonis, not whether it is used in SAP or somewhere else!

It works through the so-called APC reduction app. The app loops through each workspace and data job. Every column that is named in either an analysis or a data job is saved to the list of used columns. Columns that are not explicitly named in a data job, but only referenced via TABLENAME.*, will not be included in the list of used columns.
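The collection step described above can be sketched as follows. The real app parses the statements with sqlparse; this simplified, regex-based version (all names are illustrative assumptions) only demonstrates the rule that explicitly named TABLE.COLUMN pairs are recorded while TABLENAME.* wildcards are skipped:

```python
import re

# Matches TABLE.COLUMN references, including the wildcard form TABLE.*
COLUMN_REF = re.compile(r'([A-Za-z_][A-Za-z0-9_]*)\.([A-Za-z0-9_]+|\*)')

def used_columns(sql):
    """Collect explicitly named columns from one data-job SQL statement."""
    used = set()
    for table, column in COLUMN_REF.findall(sql):
        if column == "*":          # TABLENAME.* is not expanded into columns
            continue
        used.add(f"{table}.{column}")
    return used

sql = "SELECT EKKO.INCO1, TINC.* FROM EKKO JOIN TINC ON EKKO.MANDT = TINC.MANDT"
used_columns(sql)  # {'EKKO.INCO1', 'EKKO.MANDT', 'TINC.MANDT'}
```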

Step-by-step guide:

  1. Download the most recent version from the ML workbench script repository (https://ml-workbench.eu-1.celonis.cloud/machine-learning/ui/notebooks/a78bb1ad-62d7-4881-b3ad-b4bf4b759134/lab). The script is located in “APC Reduction” → “Extraction”. You can select all files in the Extraction folder and right-click on them to download the files.
  2. Upload all files into the customer's ML workbench. The files must be located in a folder called “Extraction”.
  3. Open the APC_Reduction.ipynb file.
  4. Run the first two cells to install the required packages (sqlparse and ftfy), then restart the kernel (“Kernel” → “Restart Kernel…”). The restart is not needed if sqlparse and ftfy were already installed.
  5. Now fill out the required information in the third cell:
    login : Fill in the URL of the IBC and a valid API key that has access to all analyses and data jobs.
    pool_id : Fill in the ID of the data pool on which you want to execute the script.
    on_prem : Can only be True or False. If the data pool is on-premise, set it to True; if it is full cloud, set it to False.
    excel_only : Can only be True or False. If True, the app will only generate an Excel file containing all used columns. If False, it will also generate Data Extractions for each Data Connection.
  6. Run the third cell and wait for the results. The Excel file containing all used columns can be found in the same folder where the app is located in the ML workbench. The automatically created Data Extractions will always be named “Automated Job:” + name of the data connection.
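A possible layout of the configuration cell from step 5 is sketched below. The variable names and structure are assumptions for illustration only; use the names that APC_Reduction.ipynb actually expects:

```python
# Hypothetical configuration cell (step 5). Placeholders must be
# replaced with real values before running the third cell.
login = {
    "url": "https://<team>.eu-1.celonis.cloud",  # URL of the IBC team
    "api_key": "<API key with access to all analyses and data jobs>",
}
pool_id = "<ID of the target data pool>"

on_prem = False     # True for on-premise data pools, False for full cloud
excel_only = True   # True: only generate the Excel file of used columns;
                    # False: also create Data Extractions per Data Connection
```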

Can you let me know whether this works for you?

Have a great week!

Your
Celonis Team

Hi,
thanks for the info.

I am somehow not able to access the URL.

I reset my password two times, but it seems it's not working.

Can I somehow directly download the scripts from our ML workbench?
(https://voestalpine.eu-1.celonis.cloud/machine-learning/ui/notebook)

thanks!