One data pool with two processes (SAP O2C and P2P)

Dear all,

Could you tell me how we can share extracted data in one data pool between two processes, such as SAP O2C and P2P, using two blocks of DCP licence?

Simply put, when I downloaded the O2C and P2P process connectors in the sandbox, two separate data pools were created.

I am thinking of reusing PO data in the O2C process model, for example in an individual purchasing scenario.

Looking at the Help documentation, I believe it is possible:

Performance and system load
The data in Data Pools is separated and if you use one Data Pool per process, you will need to extract the data that is shared between processes multiple times.

Due to multiple loads, the overall time needed to extract all required data is higher. Moreover, data is duplicated. Therefore, it is recommended to have multiple process data models in one Data Pool if they share data.

You want to analyze your ServiceNow Incident Management and Service Request process. Among the required data is customer data which is needed for both cases. This needs to be extracted multiple times if you use one Data Pool per process. If you combine both processes in one Data Pool you can handle everything in one extraction and you also have to create the Data Connection only once.

Dear Kaztakata,

We were facing the same topic two years ago. Within O2C we have a customer-order-related purchase order. This vehicle is used to fulfill the financial view of the order flow between the factories (= suppliers) and the individual Sales Units all over the world. From a process point of view these two objects belong to each other, and the process flow is modeled accordingly.

For the supply of the factories we also have purchase orders; this is a process by itself and thus completely separated from the O2C-related purchasing. Technically, of course, we use the same SAP tables, but the activities generated and the data extracted depend on whether a Customer Order is related to the PO or not. We do not have “double data” in our Celonis data. In principle you need to decide which process you want to follow end to end.
Good luck!
Have :grinning: Hans.

Dear Hans,

Thanks for your comment.
I think an SO-related PO should be visible in both the O2C process and the P2P process.
This PO should be analyzed in P2P similarly to a stand-alone PO;
on the other hand, the throughput times of SO→PO and PO→Customer Delivery should be analyzed in O2C.

Technically, did you share the PO data between the two processes, or store duplicated PO data in both?

Best regards,

Hi Kazuhiko,

You can have multiple data models in one pool. If you do want to merge two existing pools, then I would recommend taking one of your existing data pools, e.g. O2C, and adding P2P by copying over every part of that process from the P2P data pool. Here are the most important steps; there may be others that I can’t think of right now.

  1. Add all the data pool parameters used in P2P to O2C

  2. Modify the Extraction of O2C to also extract the tables needed for P2P (remember to copy all filters). Also, some tables are used in both O2C and P2P; they may currently be filtered to only include records relevant to O2C, so remember to modify these filters so that you also extract the rows needed for P2P.

  3. Make a new Data Job and copy over all the transformations from P2P. Note:
    • Check whether they use parameters; if they do, remember to add those as well.
    • Some of the transformations create new tables. Some of the P2P and O2C tables are the same, except that they are filtered differently, so that each only includes the rows relevant to its process. This is why it is good practice to name these tables after the process, e.g. P2P_“Table name”.
    • Because you are reusing tables extracted in a different data job, whenever an extracted table (i.e. one that is not created in a transformation) appears after “FROM” or “JOIN”, you have to specify the data source. I.e. instead of “USR02” you need to write <%=DATASOURCE:INSERT DATA SOURCE NAME%>.“USR02”.

  4. Run the data extraction so that you have loaded all the P2P-only tables.

  5. Run all the transformations for P2P

  6. Create a new data model and copy the P2P setup so it’s exactly like the one in the other pool.

  7. Load the data model

  8. If you already have P2P analysis sheets, change the data model of those workspaces to the one in the shared data pool.
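As a sketch of steps 2 and 3, a copied P2P transformation in the merged pool might look like the following. All table names, filters, and the data-source name "P2P_SAP_ECC" are illustrative assumptions, not taken from your setup; the original post deliberately leaves the data-source name as a placeholder.

```sql
-- Hypothetical copied P2P transformation inside the merged O2C pool.
-- "P2P_SAP_ECC" stands in for whatever your P2P data connection is called.
DROP TABLE IF EXISTS "P2P_EKPO";

-- Prefix the process name so this table does not collide with an
-- O2C table built from the same source (step 3, second bullet).
CREATE TABLE "P2P_EKPO" AS
SELECT EKPO.*
FROM <%=DATASOURCE:P2P_SAP_ECC%>."EKPO" AS EKPO  -- extracted table: data-source prefix required
JOIN <%=DATASOURCE:P2P_SAP_ECC%>."EKKO" AS EKKO
  ON EKKO."MANDT" = EKPO."MANDT"
 AND EKKO."EBELN" = EKPO."EBELN"
WHERE EKKO."BSTYP" = 'F';  -- example filter: purchase orders only
```

Tables that a transformation itself creates (like "P2P_EKPO" here) are referenced later without the `<%=DATASOURCE:…%>` prefix; only extracted tables need it.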

This kind of pool merging is a bit fiddly to get working properly, as there are lots of steps and it’s easy to overlook something. If there is anything unusual about the way your data extraction is set up that needs to be considered when setting up the data model, then it can get very technically complicated.

This is really the limit of the support we can give you without access to your IBC team, as troubleshooting any issues you might have is pretty much only possible for someone who can see what you’re doing. In my experience it requires looking in all corners of the event collector on a hunch that it might reveal the source of the problem, which isn’t really feasible if you have to ask another person to send you screenshots. So, I recommend getting in touch with your Celonis Customer Success Manager to see if they can arrange for a Data Scientist to be assigned the job of supporting you in this.

I wish you all the best in your endeavor.

Best wishes,


Thanks Calandra,

That is what I thought before, though I was hoping for a smarter idea. Anyway, thanks for your answer. I got it.

Best regards,

Dear Kaztakata,

As mentioned, from a business point of view the SO-related purchasing is done by the same people within the Sales unit. The “normal” purchasing there is very minimal compared to the purchasing in the factories.
The business wants to follow the logistics chain from the customer order until the delivery. We have different variants of SO depending on regional differences. Some Sales units have a local stock. SOs with capital goods normally first go to our own local warehouse before delivery, since the delivery needs to be prepared, etc.
It does not make sense to mix both “purchase processes”.

Technically there is no reason to duplicate the data: we just load all the EKKO, EKPO, etc. data once into SAP HANA. The only thing that varies is the activity table. In the SO-related process the VBAP is leading; in the purchasing process it is the EKPO. We thus have two data models using the same SAP HANA tables.
The question will be which activities need to be shown to improve efficiency, fulfillment and compliance. We also include the financial data as far as possible, for example to show the late payments for an SO or PO, and we have specialised cockpits for that, since this is again a different group of business users.
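The setup Hans describes, one shared extraction with a different leading object per process, could be sketched roughly like this. The table names (EKPO, VBAP) are from the thread, but the column choices, join fields, and activity names are illustrative assumptions, not Hans's actual transformations.

```sql
-- Sketch: two activity tables built from the same extracted SAP tables.

-- O2C activity table: the sales-order item (VBAP) is leading, so PO
-- activities are generated only for SO-related purchase orders.
CREATE TABLE "O2C_ACTIVITIES" AS
SELECT VBAP."VBELN" AS "_CASE_KEY",
       'Create Purchase Order Item' AS "ACTIVITY_EN",
       EKPO."AEDAT" AS "EVENTTIME"
FROM "VBAP" AS VBAP
JOIN "EKPO" AS EKPO
  ON EKPO."VBELN" = VBAP."VBELN"   -- PO item references a sales order
 AND EKPO."VBELP" = VBAP."POSNR";

-- P2P activity table: the purchase-order item (EKPO) is leading,
-- restricted to stand-alone POs with no sales-order reference.
CREATE TABLE "P2P_ACTIVITIES" AS
SELECT EKPO."EBELN" || EKPO."EBELP" AS "_CASE_KEY",
       'Create Purchase Order Item' AS "ACTIVITY_EN",
       EKPO."AEDAT" AS "EVENTTIME"
FROM "EKPO" AS EKPO
WHERE EKPO."VBELN" IS NULL;        -- not related to a customer order
```

Because both transformations read the same extracted tables, the data is stored once; only the case definitions and generated activities differ per data model.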

I can recommend trying to put yourself in your users’ chair to find out what makes sense and what does not. My experience is that the setup of activities is a bit of a “trial and error” game, combined with common sense. To give you an idea: in our extended O2C process (PO + Finance + Service) we have 126 unique activities. In our P2P we have 29 unique activities…

Good luck.
Have :grinning:!
