
I would like to create a use case to analyse the throughput times of the finished goods flow from one warehouse to another.
Unfortunately, there are no pre-existing data connectors available for this particular use case, where I would especially be looking into goods issue and goods receipt.

I did go through the Celonis Academy trainings, but they were basic and I cannot seem to apply what I learnt in them to my use case.

Does anybody have any help/leads?

Best Regards
Kenny

Just to give you an idea: if you go to the MSEG table, you could do this:

 

Goods Receipt (GR) → When goods arrive at the target warehouse
SELECT MBLNR, MJAHR, MATNR, BUDAT, WERKS AS Target_Warehouse, LGORT, MENGE, MEINS
FROM MSEG
WHERE BWART = '101'; -- Movement Type 101 (GR into warehouse)

 

Goods Issue (GI) → When goods leave the source warehouse
SELECT MBLNR, MJAHR, MATNR, BUDAT, WERKS AS Source_Warehouse, LGORT, MENGE, MEINS
FROM MSEG
WHERE BWART = '601'; -- Movement Type 601 (GI for delivery)
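
One caveat worth planning for: GI and GR are posted as separate material documents, so you need a common case key that ties the issue and the receipt of the same stock transfer together. A minimal sketch, assuming the delivery number is stored on the material document item (MSEG.VBELN_IM; whether this field is populated for your movement types is an assumption you should verify in your system):

-- Hypothetical helper table mapping material documents to deliveries.
-- The delivery number will later serve as the case key linking the
-- GI (601) and GR (101) postings of the same stock transfer.
CREATE TABLE STOCK_TRANSFERS AS
SELECT DISTINCT
    M.MBLNR,                    -- material document number
    M.MJAHR,                    -- material document year
    M.VBELN_IM AS DELIVERY_NO   -- delivery referenced by the posting (assumed populated)
FROM MSEG M
WHERE M.BWART IN ('601', '101')
  AND M.VBELN_IM IS NOT NULL
  AND M.VBELN_IM <> '';

If VBELN_IM is empty in your data, the same mapping can usually be derived by joining through the delivery items (LIPS) instead.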
 


Thank you very much for your response.

I do know this logic, and I also know the data sources from which I am supposed to get the data for the analysis.

However, there is no existing process connector or data model in Celonis with which I can analyse the throughput times of the goods movements.
Hence my question: which transformations, attributes and activities should I define in the data pool?
And roughly how much PQL code would I need for this?

Thank you very much in advance

Best Regards
Kenny


Since there isn’t an existing process connector or data model for throughput time analysis of goods movements, the key first step is ensuring that all relevant activities are included in your transformation. This means defining the event log with attributes like Material Document Number, Goods Movement Type, Posting Date, and Storage Locations, depending on your specific use case.
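
Building on the MSEG queries above, such a transformation could look roughly like the sketch below. It unions the GI and GR postings into one activity table; the STOCK_TRANSFERS helper is the hypothetical material-document-to-delivery mapping sketched earlier in the thread, and all table and column names here are illustrative assumptions, not a fixed Celonis schema:

-- Sketch of an activity (event log) table for the goods movement process.
CREATE TABLE ACTIVITIES_GOODS_MOVEMENT AS
SELECT
    ST.DELIVERY_NO         AS CASE_KEY,           -- assumed case key: delivery number
    'Goods Issue (GI)'     AS ACTIVITY,
    M.BUDAT                AS EVENTTIME,          -- posting date (BUDAT carries no time of day)
    M.MBLNR                AS MATERIAL_DOCUMENT,
    M.BWART                AS MOVEMENT_TYPE,
    M.WERKS                AS WAREHOUSE,
    M.LGORT                AS STORAGE_LOCATION
FROM MSEG M
JOIN STOCK_TRANSFERS ST
  ON ST.MBLNR = M.MBLNR
 AND ST.MJAHR = M.MJAHR
WHERE M.BWART = '601'
UNION ALL
SELECT
    ST.DELIVERY_NO,
    'Goods Receipt (GR)',
    M.BUDAT,
    M.MBLNR,
    M.BWART,
    M.WERKS,
    M.LGORT
FROM MSEG M
JOIN STOCK_TRANSFERS ST
  ON ST.MBLNR = M.MBLNR
 AND ST.MJAHR = M.MJAHR
WHERE M.BWART = '101';

In the data model you would then register this table as the activity table, with CASE_KEY as the case ID, ACTIVITY as the activity name and EVENTTIME as the timestamp. Note that because BUDAT is only a date, events on the same day will need a sorting column or a finer timestamp if you can get one (e.g. from MKPF's CPUDT/CPUTM entry date and time).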

Once you have these activities structured in your data model, calculating throughput time is straightforward with the standard PQL throughput formula:

CALC_THROUGHPUT ( FIRST_OCCURRENCE [ 'Goods Issue (GI)' ] TO LAST_OCCURRENCE [ 'Goods Receipt (GR)' ] , REMAP_TIMESTAMPS ( "Activities"."Timestamp" , DAYS ) )

(The goods issue happens first, so the interval runs from the first GI to the last GR of the case.)


Thank you very much for your response, Abhishek. Really appreciate it.

My idea with the analysis is to calculate throughput times from one warehouse to the next, and then to the next, and also to map this process and use a variant explorer.

- I have created a data pool;

- connected the S/4HANA data sources;

- extracted the relevant tables (MSEG, MKPF, LIPS, LIKP, VBAP, VBAK); I have also selected only a few columns under "configure columns" and enabled the change date filter for delta loads, but even then the data job has been running for four days now;

- I do not know how to proceed from here.

What exactly are the transformations I should create?
What exactly are these attributes I have to define, and where?

I did go through the Celonis Academy trainings, and while they were helpful in giving me a basic picture, I cannot seem to apply them to my use case and am facing obstacles at this transformation stage.

Best Regards
Kenny 

