To my knowledge, there is no off-the-shelf way to do this in real time. Combining Action Flows and/or Python to "poll" your Excel file periodically, detect the changes, and upload them into Celonis could get you something close... but it is still quite complex.
HTH
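To illustrate the polling idea from the answer above, here is a minimal stdlib-only sketch. It detects changes by hashing the raw workbook bytes; the actual "parse the sheet and push to Celonis" step is left as a callback stub, and all function names here are my own, not part of any Celonis API.

```python
import hashlib
import time
from pathlib import Path

def file_fingerprint(data: bytes) -> str:
    """Return a stable hash of the file contents, used to detect changes."""
    return hashlib.sha256(data).hexdigest()

def poll_excel(path: str, interval_s: float, on_change) -> None:
    """Poll the workbook every `interval_s` seconds; call `on_change` when it changes."""
    last = None
    while True:
        fp = file_fingerprint(Path(path).read_bytes())
        if fp != last:
            last = fp
            on_change(path)  # stub: parse the sheet here and push the rows to Celonis
        time.sleep(interval_s)
```

Note that this only tells you *that* the file changed, not *what* changed; computing a row-level delta would need the previous parsed state as well.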
If possible, you could add the data to a Google Sheet and easily set up a connector. Then create a schedule to extract every couple of minutes, for example.
Although this workaround (or Guillermo's suggestion) will work, I have the feeling it won't be sufficient for your (future) needs.
Instead, I recommend looking for a more long-term solution by connecting to a database instead of a spreadsheet.
Hi @raul.rodri12,
If you need to use Excel instead of Google Sheets, you could use the Microsoft 365 Excel (celonis.com) connector in Action Flows. After reading your data, you can push it to your data pool using the Data Push API; see this example: Create and Push CSV to Data Integration (celonis.com)
Kind regards,
Jan-peter
thank you all very much for your answers!
Hi Alejandro,
everything depends on where those files will be stored. Jan-peter and Gabriel proposed solutions for Google Sheets and Microsoft 365; however, I'm not a fan of using Action Flows for such purposes, as these tables should probably be pushed to the backend. A data connection seems more useful in that regard, but then you need to store the files in the Google service.
Some time ago I proposed a recipe for SharePoint in the link below. The first setup may be a bit problematic, but after rewriting the functions it can be pretty easy to use. It's nice when a few files and tables can be pushed with 3-4 lines of code that scale easily (e.g. 20 files with 30 tables to different data pools, or 100 different files) :)
Best Regards,
Mateusz Dudek
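The "3-4 lines of code, easily scalable" idea above might look something like the following sketch. The file-to-pool mapping and the `push_fn` helper are hypothetical placeholders for whatever push function your own recipe provides (e.g. a SharePoint-aware one).

```python
# Hypothetical mapping of spreadsheet files to (data pool, target table).
PUSH_PLAN = {
    "sales.xlsx":   ("pool-sales", "SALES"),
    "vendors.xlsx": ("pool-master", "VENDORS"),
}

def push_all(plan: dict, push_fn) -> list:
    """Push every configured file; return the list of (file, pool, table) pushed."""
    done = []
    for path, (pool_id, table) in plan.items():
        push_fn(path, pool_id, table)  # your push helper goes here
        done.append((path, pool_id, table))
    return done
```

Adding a 21st file is then a one-line change to the mapping, which is the scalability point being made.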
By using the Action Flow, the table is pushed to the data pool via the Data Push API, if that's what you mean by the back-end. Using it in transformations, data models, etc. is therefore possible.