Hello Community,
I am playing around with the backup content function for Celonis analysis.
https://python.celonis.cloud/docs/pycelonis/en/latest/notebooks/99_Use_Case_Version_control.html#Create-backups-of-all-analyses-that-are-published-in-separate-workspace-directories
While investigating, I realised that you have to store the files locally before you can push them to Git or AWS S3. I was wondering if anyone has tried sending the backup directly to S3 or Git?
Hi Paul,
Currently the backup_content functions download directly to a Path object from Python's built-in pathlib. pathlib currently seems to support only filesystem paths, so downloading directly to S3 or a git hosting provider seems to be impossible for now. You could, however, add functionality to your Jupyter notebook to upload each downloaded file to S3 right afterwards: https://docs.aws.amazon.com/code-samples/latest/catalog/python-s3-s3-python-example-upload-file.py.html
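As an illustration, here is a minimal sketch of that idea using boto3 (the bucket name and backup directory are placeholders, not from your setup) that walks the local backup folder after the download and uploads every file to S3:

from pathlib import Path

import boto3  # AWS SDK for Python; assumes credentials are configured in the workbench

# Placeholders: the directory backup_content wrote to and your target bucket
BACKUP_DIR = Path("analysis_backups")
BUCKET = "my-celonis-backups"

s3 = boto3.client("s3")

# Mirror the local folder structure as S3 object keys
for file_path in BACKUP_DIR.rglob("*"):
    if file_path.is_file():
        key = file_path.relative_to(BACKUP_DIR).as_posix()
        s3.upload_file(str(file_path), BUCKET, key)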
In a use case where I am doing something similar I created a git repo directly in the ML workbench and at the end of my scheduled notebook the following commands run:
!git add .
!git commit -m "Update $(date +"%D")"
!git push
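If the backup ever runs as a plain Python script instead of a notebook, where the ! shell syntax is not available, the same sequence can be run via the standard library's subprocess module. A minimal sketch, assuming the script executes inside the cloned repository:

import subprocess
from datetime import date

# Same add/commit/push sequence as above, callable from a scheduled script
subprocess.run(["git", "add", "."], check=True)
# Note: "git commit" exits non-zero when there is nothing new to commit
subprocess.run(["git", "commit", "-m", f"Update {date.today().isoformat()}"], check=True)
subprocess.run(["git", "push"], check=True)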
Would a similar solution work for you as well?
Best regards,
Simon Riezebos