Question

How can we move all tables from one schema to another schema

  • December 19, 2022
  • 9 replies
  • 15 views


We recently changed our database connection from uplink to direct, and now need to move all tables (around 50) to the direct connection to avoid re-extracting them. Any suggestion would be helpful.

9 replies

teuku.faruq13
Level 9

Dear @abhay.singh12 ,

 

You may want to check and ensure the data connection name remains the same if it is within the same data pool. The connection name is associated with the scope in the data jobs. I don't think changing from uplink to direct should force you to move tables, unless the name / scope changes.

 

Let me know your thoughts, and sorry if I didn't get the question correctly.


  • Author
  • Level 2
  • December 20, 2022

@Teuku Faruq , thanks for your reply.

I have 2 connections, one UPLINK and the other DIRECT.

And I want to move all the extracted tables from the UPLINK schema to the DIRECT connection schema. And yes, the Data Pool is the same for both connections.

Let me know if I was able to explain the question. Looking forward to your help :)


teuku.faruq13
Level 9

Ok, so as I said, instead of creating a new connection with DIRECT, is it possible to change the setting of the existing UPLINK connection instead?


  • Author
  • Level 2
  • December 20, 2022

The DIRECT connection is faster to extract from, and we need to change from UPLINK to DIRECT.

We have already extracted a few tables, and to avoid any re-extraction we want to move them from the UPLINK to the DIRECT connection.


Not sure if this will work: from the Global scope, we manipulate tables in another scope.

As a scope is just a schema, in theory you should be able to copy full tables between scopes.

 

This is how we write the schema and tables:

<%=DATASOURCE:SCOPE_NAME%>.TABLE_NAME

 

so something like

CREATE TABLE <%=DATASOURCE:NEW_SCOPE%>.TABLE_NAME AS
(SELECT * FROM <%=DATASOURCE:OLD_SCOPE%>.TABLE_NAME);

 

should work... in theory... I haven't tested it.

 

HTH


  • Author
  • Level 2
  • December 20, 2022

Hi @Guillermo Gost , thanks for your answer.

I'm currently doing the same: creating a Global scope and moving tables one by one from one schema to another, but it's time-consuming, so I thought I'd ask if anyone has another idea.



Yes. Write a Python script to list all the tables of a schema and generate the SQL code to copy all of them.

 

Something like

 

from pycelonis import get_celonis

celonis = get_celonis(permissions=False)

datapool = 'datapool_name'  # placeholder: your data pool's name
pool = celonis.pools.find(datapool)
tables = pool.tables()  # check the pycelonis docs for the exact accessor

sqlstatement = ''
for t in tables:
    tablename = t['name']
    # copy each table from the old scope into the new one, using the
    # <%=DATASOURCE:...%> placeholders shown earlier ('%%' escapes the
    # literal '%' for Python's string formatting)
    sqlstatement += (
        'CREATE TABLE <%%=DATASOURCE:NEW_SCOPE%%>.%s AS '
        '(SELECT * FROM <%%=DATASOURCE:OLD_SCOPE%%>.%s);\n\n'
        % (tablename, tablename)
    )

datajobname = 'datajobname'
datajob = pool.data_jobs.find(datajobname)
datajob.create_transformation(name="script_name",
                              description="whatever",
                              statement=sqlstatement)

 

I wrote it off the top of my head, so I'm not sure it's 100% correct, and you will need to check how to get the scope... but I hope it will give you some ideas.

 

Enjoy! won't you? :)
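For anyone adapting this: here is a minimal, self-contained sketch of just the statement-building loop, with no pycelonis connection needed. The table names (EKKO, EKPO) are hypothetical, and I've added a DROP TABLE IF EXISTS before each CREATE so the transformation can be re-run, assuming your target database supports that syntax.

```python
# Self-contained sketch of the statement-building loop above.
# Table names are hypothetical; NEW_SCOPE / OLD_SCOPE are the
# <%=DATASOURCE:...%> placeholders from the earlier reply ('%%'
# escapes the literal '%' for Python's string formatting).
tables = [{'name': 'EKKO'}, {'name': 'EKPO'}]

sqlstatement = ''
for t in tables:
    tablename = t['name']
    # DROP first so the transformation can be re-run (assumes the
    # target database supports DROP TABLE IF EXISTS)
    sqlstatement += (
        'DROP TABLE IF EXISTS <%%=DATASOURCE:NEW_SCOPE%%>.%s;\n'
        'CREATE TABLE <%%=DATASOURCE:NEW_SCOPE%%>.%s AS '
        '(SELECT * FROM <%%=DATASOURCE:OLD_SCOPE%%>.%s);\n\n'
        % (tablename, tablename, tablename)
    )

print(sqlstatement)
```

You can paste the printed output straight into a transformation to sanity-check it before wiring up the pycelonis part.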

 

 


  • Author
  • Level 2
  • December 20, 2022

@Guillermo Gost , thanks for this idea, I had not thought about it. I will give it a try. Many thanks :)


teuku.faruq13
Level 9

Keep us posted, hope you can share with us the working solution later :)