
Studio: Process Copilot (2024-11-22)

Process Copilots have been upgraded to let you share chats with teammates, view your chat history, and use enhanced feedback features. In addition, Process Copilot capabilities were expanded to include asset-level filters on the data used by your Process Copilots, and the ability to connect your Process Copilots to Action Flows so you can trigger actions from a conversation.

Conversation history

Process Copilots now store your last 100 conversations, which you can access by clicking the Chat History icon. You can pin frequently accessed conversations to the top of this list, edit the name of any past chat, or delete a conversation from your history.

Share a conversation

You can now share conversations from a Process Copilot with other teammates to distribute information, report issues, or help debug problems. Note that when you share a conversation with another user, only the conversation up to the time it was shared is included; the shared copy will not be updated as additional questions are asked or responses are generated. To share a conversation, click the Share icon and copy the link to send to other users.

Enhanced feedback flows

The feedback mechanisms in your Process Copilots were updated to let users rate the quality of the responses that were generated. Encourage users to provide feedback consistently, as it helps detect quality issues and improve the accuracy of responses.

Set asset-level filters

The new Preset Filters tab lets you select filters that apply to the entire Process Copilot asset. These preset filters are applied to all conversations with this Process Copilot and cannot be removed or cleared by end users.

Trigger actions from a Process Copilot

The new trigger_action_flow tool lets you connect your Process Copilot to an Action Flow so that the action can be triggered from within a conversation. Before the action is executed, users must confirm it to make sure it is performed as expected.

Object-centric process mining - Changes to SQL validation for transformations (2024-10-28)

We've made some changes to how we parse the SQL for object-centric transformation scripts when you publish them. We'll now validate more things at publish time, instead of when you run the transformations, which makes errors easier to identify and fix or mitigate. These changes apply to both your custom transformation scripts and our supplied transformation scripts for Celonis catalog processes. A phased rollout of the changes to teams starts from now.

As a result of these changes, when you publish your object-centric data model, you might see new validation errors that you weren't getting before. Here's what you might see, and how to fix or handle it:

We'll now always add parentheses to expressions when we output the transformations. This might mean that an expression now fails to evaluate, or evaluates to a different result. To fix this error, follow best practice in your custom scripts and include parentheses in SQL expressions wherever the order of evaluation could be ambiguous.

We'll now validate that each column data type supplied from your source system matches, or can be assigned to, the required data type for the attribute in our underlying database. The data types we use are Boolean, long, float, timestamp, and string. You might see a new issue if a column in your source system data has a data type that our Celonis catalog transformations don't expect. To fix these errors, use the suggestions in Troubleshooting data extraction and pre-processing to account for the unexpected column data types. To handle these errors instead, activate the Skip missing data option for the data connection, as described in Skipping missing data for objects and events; we'll then cast the data types to match the expected data types. Note that it's best to fix the errors, as this option might introduce other unexpected issues. The transformation might still fail to run even with the Skip missing data option enabled if a data type can't be cast to the one required. If that's the case, you'll need to fix the data type.
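To illustrate the two kinds of fixes above, here is a sketch using hypothetical table and column names (orders, status, amount, order_created are illustrative, not part of any catalog process): explicit parentheses to remove ambiguity in a WHERE clause, and an explicit cast where a source column's type doesn't match the expected attribute type.

```sql
-- Ambiguous: AND binds more tightly than OR, so this filters
-- (status = 'HELD' AND amount > 1000) OR status = 'OPEN',
-- which may differ from the intended grouping once parentheses are added.
SELECT order_id
FROM orders
WHERE status = 'OPEN' OR status = 'HELD' AND amount > 1000;

-- Explicit parentheses make the intended evaluation order unambiguous:
SELECT order_id
FROM orders
WHERE (status = 'OPEN' OR status = 'HELD') AND amount > 1000;

-- If a source column arrives as a string but the target attribute expects
-- a timestamp, cast it explicitly in the transformation rather than
-- relying on the Skip missing data option:
SELECT order_id,
       CAST(order_created AS TIMESTAMP) AS order_created
FROM orders;
```

Casting explicitly in the script keeps the behavior visible and version-controlled, whereas Skip missing data applies an implicit cast that can hide genuine data quality problems.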