
I have 2 tables, A and B. Table A has columns Q1, Q2, Q3, Q4, and Table B has columns Q1 and Q2 along with others. In the future, Table A will gain more columns (Q5, Q6, Q7, etc.). I want to write dynamic SQL that alters Table B (adding the new columns) whenever new columns are added to Table A. The added columns will always follow the pattern Q1, Q2, Q3, ... Q40.

Example of my tables:

create table tableA 
(
    [sid] [varchar](50) not null, 
    [rid] [varchar](50) not null,
    [Q_URL] [varchar](500) null,
    [Q1] int null,
    [Q2] int null,
    [Q3] int null,
    [Q4] int null
);

create table tableB 
(
    [s_id] [varchar](50) not null,
    [rid] [varchar](50) not null,
    [Q_URL] [varchar](500) null,
    [Q1] int null,
    [Q2] int null
);
  • What's the purpose of doing this? Commented May 10, 2018 at 22:23
  • I think this is not an accurate approach for a SQL-based DB. In fact, if you need to keep adding columns to a table, it seems you're missing a bit on the DB table design, unless you're planning to use XML or JSON fields. You could also consider a more dynamic, non-SQL DB such as MongoDB, which is JSON-based. Commented May 10, 2018 at 22:28
  • You can use the ordinal position and column name columns from information_schema.columns to build out your dynamic SQL, but it will be terribly brittle and you will probably have to trigger it with DDL triggers, which will get messy. It sounds like you want a database versioning solution rather than just some dynamic SQL. Commented May 10, 2018 at 22:29
  • How about designing your table so that you have sid varchar(50), rid varchar(50), Q_URL varchar(500), Q_NUMBER int, value int? This way you don't need to keep adding columns; instead you have one row per sid, rid, Q_NUMBER combination. Commented May 10, 2018 at 23:03
  • Stop right now. This is a bad design. If you have the opportunity, you need to change this. If you are making dynamic changes to fields, you need to stop doing this and instead move this data into rows. Commented May 11, 2018 at 1:02
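
The row-per-question design suggested in the comments would look something like this (table name and sample values here are illustrative, not from the question):

    create table tableA_normalized
    (
        [sid]      [varchar](50) not null,
        [rid]      [varchar](50) not null,
        [Q_URL]    [varchar](500) null,
        [Q_NUMBER] int not null,   -- 1, 2, 3, ... 40
        [value]    int null
    );

    -- a new question becomes a new row, not a new column,
    -- so no ALTER TABLE is ever needed:
    insert into tableA_normalized ([sid], [rid], [Q_URL], [Q_NUMBER], [value])
    values ('s1', 'r1', null, 5, 42);

With this shape, downstream tables and queries never need schema changes when question Q41 appears; they just see more rows.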

2 Answers

    if object_id('tempdb..#source_col') is not null
        drop table #source_col;

    -- Q columns currently in the source table
    select distinct column_name as source_q_columns
    into #source_col
    from INFORMATION_SCHEMA.COLUMNS
    where table_name = 'tableA'
      and column_name like 'Q%'
      and column_name <> 'Q_URL';

    if object_id('tempdb..#destination_col') is not null
        drop table #destination_col;

    -- Q columns currently in the destination table
    select distinct column_name as destination_q_columns
    into #destination_col
    from INFORMATION_SCHEMA.COLUMNS
    where table_name = 'tableB'
      and column_name like 'Q%'
      and column_name <> 'Q_URL';

    if object_id('tempdb..#newcols') is not null
        drop table #newcols;

    -- columns present in the source but missing from the destination
    select row_number() over (order by t1.source_q_columns) as colnum,
           t1.source_q_columns,
           'int' as datatype
    into #newcols
    from #source_col t1
    left join #destination_col t2
        on t1.source_q_columns = t2.destination_q_columns
    where t2.destination_q_columns is null;

    declare @sql nvarchar(max);
    declare @column sysname;
    declare @max_colnum int;

    set @max_colnum = (select max(colnum) from #newcols);

    while (@max_colnum > 0)
    begin
        set @column = (select source_q_columns from #newcols where colnum = @max_colnum);
        set @sql = 'alter table tableB add ' + quotename(@column) + ' int null';
        print @sql;
        exec (@sql);
        set @max_colnum = @max_colnum - 1;
    end

This is what I came up with. I believe this will work for any case. :)
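
A set-based variant of the loop above can also build the missing-column list in one pass (a sketch, assuming SQL Server 2017 or later for STRING_AGG; the `Q[0-9]%` pattern is one way to skip Q_URL):

    declare @sql nvarchar(max);

    -- one ALTER statement per missing Q column, concatenated together
    select @sql = string_agg(
        'alter table tableB add ' + quotename(c.column_name) + ' int null',
        ';' + char(10))
    from INFORMATION_SCHEMA.COLUMNS c
    where c.table_name = 'tableA'
      and c.column_name like 'Q[0-9]%'
      and not exists (select 1
                      from INFORMATION_SCHEMA.COLUMNS d
                      where d.table_name = 'tableB'
                        and d.column_name = c.column_name);

    if @sql is not null
        exec sp_executesql @sql;

This avoids the temp tables and the counter variable entirely; if nothing is missing, @sql stays null and nothing runs.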


3 Comments

Compare this to simply inserting a row. This is a bad design and you need to change it.
Why is it a bad design? What's the issue with the code? Thanks!
The bad design is adding columns to your table when you should be adding rows. Adding a row is a one-liner and downstream apps don't need to know about new columns. Dynamically adding columns is fundamentally a bad idea.

You can create something called a DDL Trigger for the Alter Table event. The trigger is created at the database level, so it will fire for any Alter Table command executed in the database.

A sample DDL trigger for the Alter Table event at the database level for TableA would look something like this.

CREATE TRIGGER tr_Replicate_DDL_For_TableA
ON DATABASE
FOR ALTER_TABLE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @data      xml = EVENTDATA()
           ,@TableName SYSNAME
           ,@Sql       NVARCHAR(MAX);

    SET @TableName = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'SYSNAME');
    SET @Sql       = @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(max)');

    IF (@TableName = 'TableA')  --<-- only replicate changes if TableA's schema changed
    BEGIN
        SET @Sql = REPLACE(@Sql, 'TableA', 'TableB');

        EXEC sp_executesql @Sql;
    END

END
GO
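
With the trigger installed, propagating a new column needs no extra step; a single ALTER against TableA should be enough (a sketch of the expected behavior, not tested here):

    -- this one statement...
    ALTER TABLE TableA ADD [Q5] int null;

    -- ...fires the DDL trigger, which rewrites and executes:
    --     ALTER TABLE TableB ADD [Q5] int null;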

1 Comment

Nice idea if Table A is only ever altered, never dropped. But Table A is the source table and Table B is the destination table in a data warehouse, and Table A can be dropped and re-created. In that case your code won't work, since it only tracks ALTER changes. That's the only reason I'm looking for dynamic SQL. Thanks for your time and effort; I highly appreciate your idea.
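
For the drop-and-recreate case raised in this comment, a DDL trigger can also be scoped to the CREATE_TABLE event (a sketch only; since a CREATE carries no ALTER command to replay, the trigger would still need the column diff from the other answer):

    CREATE TRIGGER tr_Sync_On_TableA_Recreate
    ON DATABASE
    FOR CREATE_TABLE
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @data xml = EVENTDATA();

        IF @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'SYSNAME') = 'TableA'
        BEGIN
            -- TableA was just re-created: compare its Q columns against
            -- TableB via INFORMATION_SCHEMA.COLUMNS (as in the dynamic-SQL
            -- answer) and ALTER TableB to add whatever is missing.
            PRINT 'TableA re-created; re-sync TableB columns here';
        END
    END
    GO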
