
I have the table below:

create table test(id serial, key int, type text, words text[], numbers int[]);

insert into test(key,type,words) select 1,'Name',array['Table'];
insert into test(key,type,numbers) select 1,'product_id',array[2];
insert into test(key,type,numbers) select 1,'price',array[40];
insert into test(key,type,numbers) select 1,'Region',array[23,59];
insert into test(key,type,words) select 2,'Name',array['Table1'];
insert into test(key,type,numbers) select 2,'product_id',array[1];
insert into test(key,type,numbers) select 2,'price',array[34];
insert into test(key,type,numbers) select 2,'Region',array[23,59,61];
insert into test(key,type,words) select 3,'Name',array['Chair'];
insert into test(key,type,numbers) select 3,'product_id',array[5];

I was using the query below to pivot the table:

select key,
  max(array_to_string(words,',')) filter (where type='Name') as "Name",
  cast(max(array_to_string(numbers,',')) filter (where type='product_id') as int) as "product_id",
  cast(max(array_to_string(numbers,',')) filter (where type='price') as int) as "price",
  max(array_to_string(numbers,',')) filter (where type='Region') as "Region"
from test
group by key;


But I couldn't unnest the Region column during the pivot so that I could join it with another table.

My expected output is one row per key and Region value, with a null Region for products that have none.

2 Answers


Since we are using unnest("Region") to do the pivot, there must be a row with Region data for each product. Otherwise, the code below does the trick by substituting an array containing a single null:

unnest(CASE WHEN array_length("Region", 1) >= 1
                    THEN "Region"
                    ELSE '{null}'::int[] END) 

select key, "Name", product_id, price,
  unnest(CASE WHEN array_length("Region", 1) >= 1
              THEN "Region"
              ELSE '{null}'::int[] END)
from
(
  select key,
    max(array_to_string(words,',')) filter (where type='Name') as "Name",
    cast(max(array_to_string(numbers,',')) filter (where type='product_id') as int) as "product_id",
    cast(max(array_to_string(numbers,',')) filter (where type='price') as int) as "price",
    max(numbers) filter (where type='Region') as "Region"
  from test
  group by key
) t
order by key;
key  Name    product_id  price  unnest
1    Table   2           40     23
1    Table   2           40     59
2    Table1  1           34     23
2    Table1  1           34     59
2    Table1  1           34     61
3    Chair   5           null   null

db<>fiddle here


2 Comments

Hi, I have added a new row (key=3 and Name='Chair') to the table, but this new row does not appear in the result of the query you suggested. Could you please update your query?
Isn't it working, @Hari? I just noticed the downvote, that's why I'm asking.

Very strange database design... I'm assuming you inherited it?

If none of the other array values will ever have a cardinality > 1, then you can simply unnest:

select
  key,
  (max (words) filter (where type = 'Name'))[1] as name,
  (max (numbers) filter (where type = 'product_id'))[1] as product_id,
  (max (numbers) filter (where type = 'price'))[1] as price,
  unnest (max (numbers) filter (where type = 'Region')) as region
from test
group by key

If they can have multiple values, that can also be handled.
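For instance, handling multiple values per attribute could look something like the sketch below, which unnests each array in a lateral cross join so that one row comes out per combination of values. The column names product_ids, prices, and regions are illustrative, not from the original post, and note that a null array here produces zero rows for that key (the coalesce trick in the edit below addresses exactly that):

```sql
-- Hypothetical sketch: if price or product_id could also hold
-- multiple values, unnest each aggregated array in a lateral join.
select t.key, t."Name", p.product_id, pr.price, r.region
from (
  select key,
    max (words) filter (where type = 'Name') as "Name",
    max (numbers) filter (where type = 'product_id') as product_ids,
    max (numbers) filter (where type = 'price') as prices,
    max (numbers) filter (where type = 'Region') as regions
  from test
  group by key
) t
cross join lateral unnest (t.product_ids) as p (product_id)
cross join lateral unnest (t.prices) as pr (price)
cross join lateral unnest (t.regions) as r (region);
```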

-- EDIT 3/15/2021 --

Short version: an unnest against a null won't produce a row, so if you coalesce the null value into an array containing a single null element, that should take care of this part:

select
  key,
  (max (words) filter (where type = 'Name'))[1] as name,
  (max (numbers) filter (where type = 'product_id'))[1] as product_id,
  (max (numbers) filter (where type = 'price'))[1] as price,
  unnest (coalesce (max (numbers) filter (where type = 'Region'), array[null]::integer[])) as region
from test
group by key
order by key

Now for the part you didn't ask about... I and at least one other person have been gently nudging you that your database design is going to cause problems at every turn. The fact that it's in production doesn't mean you shouldn't fix it as soon as you can.

This design is what's known as EAV (Entity-Attribute-Value). It has its use cases, but like most good things, it can also be applied where it shouldn't be. The use case that comes to mind is letting users dynamically add attributes to certain objects. Even then, there might be better/easier ways.

As one example, if you have one million objects, five attributes means you have to store five million rows, and the majority of that space is occupied by repeating the key and attribute names.

Just food for thought. We can continue to triage this with every new scenario you find, but it would be better to redo the design.
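To make the suggestion concrete, here is one possible normalized layout. The table and column names are purely illustrative, not from the original post:

```sql
-- Each product is one row; fixed attributes become columns.
create table product (
    product_id int primary key,
    name       text not null,
    price      int
);

-- The multi-valued Region attribute becomes a child table.
create table product_region (
    product_id int references product (product_id),
    region_id  int,
    primary key (product_id, region_id)
);

-- The pivoted result then becomes a plain join, no unnest needed:
select p.product_id, p.name, p.price, pr.region_id
from product p
left join product_region pr using (product_id);
```

With this layout, joining Region against another table is a direct foreign-key join rather than an unnest over an aggregated array.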

7 Comments

Not inherited. This design is flexible for introducing new features without changing the table structure, like a new product_category column, etc.
Hi @Hambone, I have added a new row (key=3 and Name='Chair') to the table, but this new row does not appear in the result of the query you suggested. Could you please update your query to return all keys?
@Hari: using a single JSON column for key/value pairs would probably make your life a lot easier
@Hari: you mean like this? dbfiddle.uk/…
@Hari: you should seriously reconsider that design. I think it will give you a lot of headaches in the long run. Looking at the sample, I am inclined to think that a properly normalized model would be a better choice to begin with.
