We have recently changed the database connection from Uplink to Direct, and we need to move all tables (nearly 50) to the Direct connection to avoid any re-extraction. Any suggestion would be helpful.

Dear @abhay.singh12 ,

 

You may want to check and ensure the data connection name remains the same if it is within the same data pool. The connection name is what the scope in the data jobs is associated with. I don't think changing from Uplink to Direct alone should force you to move tables, unless the name / scope changes.
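
For illustration only (MY_CONNECTION and EKPO are made-up names here), a transformation in a data job references tables through the connection's scope, which is why that name matters:

SELECT COUNT(*) FROM <%=DATASOURCE:MY_CONNECTION%>.EKPO;
-- if the connection (scope) name changes, references like this one break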

 

Let me know your thoughts, and sorry if I didn't get the question correctly.


@Teuku Faruq , thanks for your reply.

I have 2 connections, one with UPLINK and the other with DIRECT.

I want to move all the extracted tables from the UPLINK connection schema to the DIRECT connection schema. And yes, the Data Pool is the same for both connections.

Let me know if I was able to explain the question. Looking forward to your help :)


Ok, so as I said: instead of creating a new DIRECT connection, is it possible to just change the setting of the existing UPLINK connection?


The DIRECT connection is faster to extract from, and we need to change from UPLINK to DIRECT.

As we have already extracted a few tables and want to avoid any re-extraction, we want to move them from the UPLINK to the DIRECT connection.


Not sure if this will work. From the Global scope, we do manipulate tables in another scope.

As a scope is just a schema, in theory you should be able to copy full tables between scopes.

 

This is how we write the schema and table names:

<%=DATASOURCE:SCOPE_NAME%>.TABLE_NAME

 

so something like

CREATE TABLE <%=DATASOURCE:NEW_SCOPE%>.TABLE_NAME AS (
    SELECT * FROM <%=DATASOURCE:OLD_SCOPE%>.TABLE_NAME
);

 

should work....in theory... I haven't tested it.
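
One small, untested addition to the above (just a sanity check, nothing Celonis-specific): after copying, you could compare row counts between the two scopes to confirm the copy is complete:

SELECT COUNT(*) FROM <%=DATASOURCE:OLD_SCOPE%>.TABLE_NAME;
SELECT COUNT(*) FROM <%=DATASOURCE:NEW_SCOPE%>.TABLE_NAME;
-- both counts should match if the copy succeeded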

 

HTH


Hi @Guillermo Gost , thanks for your answer.

I am currently doing the same by creating a Global scope and moving tables one by one from one schema to another, but it is time-consuming, so I thought I would check whether anyone has another idea.


Yes. Write Python code to list all the tables of a schema and generate the SQL code to copy all of them.

 

Something like

 

from pycelonis import get_celonis

celonis = get_celonis(permissions=False)

datapool = 'your data pool name'
pool = celonis.pools.find(datapool)
tables = pool.tables()

sqlstatement = ''
for t in tables:
    tablename = t['name']
    # one CREATE TABLE ... AS SELECT per table, copying from the old scope into the new one
    sqlstatement += (
        'CREATE TABLE <%=DATASOURCE:NEW_SCOPE%>.{0} AS '
        '(SELECT * FROM <%=DATASOURCE:OLD_SCOPE%>.{0});\n\n'.format(tablename)
    )

datajobname = 'datajobname'
datajob = pool.data_jobs.find(datajobname)
datajob.create_transformation(name="script_name", description="whatever", statement=sqlstatement)

 

I wrote it off the top of my head, so I am not sure it is 100% correct, and you will need to check how to get the scope... but I hope it gives you some ideas.

 

Enjoy! won't you? 🙂

 

 


@Guillermo Gost , thanks for this idea, I had not thought about it. I will give it a try. Many thanks :)


Keep us posted, hope you can share with us the working solution later :)

