Python API: automating transports from DEV to PRD

Hi all,

We have two on-premise Celonis environments running (so not IBC): one where we develop our dashboards, and one production environment where finished and tested dashboards run.

Transporting dashboards from DEV to PRD is only done by us, on request. However, we’re looking into automating the manual steps to reduce errors.

Creating the transports from one or more dashboards and/or datamodels seems to be feasible through the Python API. However, I’m wondering if the ‘overwrite’ option is available in the Python API when uploading transports. The documentation does not mention this. Does anyone have any insights/ideas/suggestions?

Hi Joos,

I don’t think there is a built-in option for this. If it’s only about analyses, one option could be to use pycelonis to back up the analysis on DEV, find the analysis with the same name on PRD, and overwrite that one with the backup.

for example:

from pycelonis import get_celonis

# Connect to both environments
dev = get_celonis("dev_url", "api_secret", "api_id", "username")
prod = get_celonis("prod_url", "api_secret", "api_id", "username")

# Analyses to copy, looked up by name on both systems
analyses = ['nice dashboard', 'cool app']

for name in analyses:
    a1 = dev.analyses.names[name]      # source analysis on DEV
    a2 = prod.analyses.names[name]     # target analysis on PRD
    p = a1.backup_content()            # back up the DEV content
    a2.rebuild_content_from_backup(p)  # overwrite the PRD analysis with it

This will give an error if one of the analyses doesn’t exist on one of the teams but could be adjusted to be more flexible (e.g. create new analysis if it doesn’t exist yet in prod).

Reference:
https://python.celonis.cloud/docs/pycelonis/en/latest/reference/pycelonis.objects_base.BaseAnalysis.html#pycelonis.objects_base.BaseAnalysis.backup_content
https://python.celonis.cloud/docs/pycelonis/en/latest/reference/pycelonis.objects_base.BaseAnalysis.html#pycelonis.objects_base.BaseAnalysis.rebuild_content_from_backup
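To sketch the "more flexible" variant: the helper below is plain Python that only decides, per analysis name, whether to overwrite, create, or skip. The loop underneath is hedged pseudocode in comments, since the exact call to create a new analysis differs per pycelonis version and is an assumption here.

```python
def plan_sync(wanted, dev_names, prod_names):
    """Split the wanted analysis names into (overwrite, create, missing)."""
    overwrite, create, missing = [], [], []
    for name in wanted:
        if name not in dev_names:
            missing.append(name)    # nothing to copy from DEV
        elif name in prod_names:
            overwrite.append(name)  # back up DEV, rebuild PRD
        else:
            create.append(name)     # needs a new PRD analysis first
    return overwrite, create, missing

# Hedged usage sketch, assuming the dev/prod objects from the example above:
# overwrite, create, missing = plan_sync(analyses,
#                                        dev.analyses.names,
#                                        prod.analyses.names)
# for name in overwrite:
#     p = dev.analyses.names[name].backup_content()
#     prod.analyses.names[name].rebuild_content_from_backup(p)
# for name in create:
#     # create the analysis in the right PRD workspace first (the exact
#     # API for this varies per pycelonis version), then rebuild it
#     # from the DEV backup as above.
#     pass
```

Keeping the decision logic separate from the API calls makes it easy to log or dry-run what a sync would do before touching PRD.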

With celonis_tools one option might be to restore the transport in a temporary folder, delete analyses that have the same name as analyses in the temporary folder, and then replace them.

Best regards,
Simon Riezebos

Hi Simon,

Thanks for thinking with me/us :slight_smile: (always a pleasure)

So far, I have developed the code below, which creates two transports (one for the dashboards and one for the associated data models) from a given folder and stores them on a shared drive.
I don’t think your approach works for us, as we are on-premise and therefore have a different Python API package (assumption).

Overwriting analyses while keeping URLs and shared bookmarks working is crucial.
Secondly, transferring data models is very cumbersome, as the datamodel-table-to-DB-table mappings, as well as the loading settings, are not included in the transport.
Thirdly, as datamodels cannot be overwritten, analyses need to be reconnected to the newly uploaded datamodel and the old one needs to be removed.

We’re planning to extend the script below in the coming weeks further. Any ideas, suggestions, or connections to people with similar challenges, are appreciated :slight_smile:

with CelonisSession(URL, logins[env]["username"],
                    api_token=logins[env]["token"],
                    api_secret=logins[env]["secret"],
                    verify=use_tls) as session:
    print("# dashboards {}".format(len(Analysis.load_all().all())))
    print("# datamodels {}".format(len(DataModel.load_all().all())))

    # get all analyses in the folder
    folder = Folder("2355......82a")
    a_list = list(folder.get_analyses())

    name = todayString + "_KEY"
    pwd = "abcdefg"
    t = Transport.create_new(name + "_Analysis", pwd, analyses=a_list,
                             include_datamodels=False, include_data=False)
    t.download(r'X:\Processmining\Transports')

    # second transport with the datamodels belonging to those analyses
    dm_list = [a.datamodel for a in a_list]
    t_dm = Transport.create_new(name + "_DM", pwd, datamodels=dm_list,
                                include_datamodels=False, include_data=False)
    t_dm.download(r'X:\Processmining\Transports')

Hi Joos,

Pycelonis supports CPM4.5, and I’m pretty sure that URLs and shared bookmarks would not be overwritten.

It would be possible to solve it with celonis_tools too, but that might be difficult. With your requirements I’d try connecting to both systems in the same script and getting the sheets, variables and saved formulas from the DEV analysis and overwriting those in the PROD analysis. You might be able to do something similar for datamodels, looking up the linked datamodel in DEV and PROD and overwriting tables, columns and foreign keys from DEV to PROD without overwriting database connections etc.
This last option for datamodels would also be possible with pycelonis.
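The "overwrite only the content" idea could look roughly like the helper below, which is plain dict surgery on an analysis' published document. The keys "sheets", "variables" and "savedFormulas" are assumptions; the real key names differ per version, so inspect the backup JSON or the document data on your own system first.

```python
def copy_content(dev_doc, prod_doc, keys=("sheets", "variables", "savedFormulas")):
    """Return a copy of prod_doc with the content keys taken from dev_doc.

    Everything else (IDs, URLs, sharing settings) stays as it was in PROD,
    which is what should keep links and shared bookmarks alive.
    """
    merged = dict(prod_doc)
    for key in keys:
        if key in dev_doc:
            merged[key] = dev_doc[key]
    return merged
```

The point of merging into a copy of the PROD document, rather than pushing the DEV document wholesale, is exactly the requirement above: only the content moves, the PROD identifiers do not.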

If you want to know more about which objects and functions are supported in CPM4 with pycelonis, check out: https://python.celonis.cloud/docs/pycelonis/en/latest/reference.html#cpm4-objects

Best regards,
Simon

Thanks a lot for sharing your further ideas, and the confirmation that PyCelonis supports 4.5.

We discussed this with our product owner, and we will work on it during our planning and innovation weeks around the holidays. We’ll update this topic once we’ve made progress.

Hi @s.riezebos

This week we have been working on this topic again, using pyCelonis.

However, when trying to overwrite datamodels we run into the following issue: we can overwrite a datamodel with the content of another (e.g. from the source Celonis CPM4 environment), but the new datamodel then, understandably, refers to DB connections and DB connection tables that do not exist on the target. A Datamodel has a method to add a table from a DB connection table, but we could not find how to ‘easily’ set the new DB connection and DB connection table IDs in a given DatamodelTable.
(Finding the new IDs is done, but setting them in the data is our current hurdle.)

Any advice?

Hi Joos,

Unfortunately, the straightforward pythonic way is not available for datamodel tables at the moment.
It should be possible by changing the store attribute of the datamodel table:

datamodel_table.data["store"] = something

You should be able to find the format of the something in an existing table.

Would this work for now?

Best,
Simon

Hi Simon,

Your tip was the push/nudge in the right direction; the following bit did the trick:

table.data["store"]["dbConnectionId"] = targetDBconn.data["id"]

Where we find targetDBconn by looking through target_table.database_connections (target_table being a DatamodelTable) for a connection with the same name as the DB connection used in the source table.
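That name-based lookup can be sketched as a small pure function over the connections’ .data dicts. The "name" and "id" keys follow the .data usage shown above, but may differ on other versions, so treat them as assumptions and check an actual connection object first.

```python
def find_connection_id(connections, source_name):
    """Return the id of the target DB connection whose name matches source_name.

    connections is expected to be a list of plain dicts, e.g.
    [c.data for c in target_table.database_connections].
    """
    for conn in connections:
        if conn["name"] == source_name:
            return conn["id"]
    raise KeyError("no DB connection named {!r} on the target system".format(source_name))

# Hedged usage sketch, following the snippet above:
# table.data["store"]["dbConnectionId"] = find_connection_id(
#     [c.data for c in target_table.database_connections],
#     source_connection_name,
# )
```

Raising instead of returning None makes a missing connection fail loudly during the transport run rather than silently leaving a dangling ID in the datamodel.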

Thanks a lot! This makes ‘transporting’ datamodels from one Celonis environment to another super easy and we get to overwrite an existing datamodel.

And of course a bonus question:

The datamodel settings seem to be copied over correctly, except for the load schema (did not thoroughly test all settings, but cache parameters are copied).
Any idea where load schema settings are stored and how to copy these through pyCelonis?

Sorry, I haven’t worked with the load schema in Python. If it is not in the datamodel.data, it is probably not implemented in pycelonis…