Re: ERROR: cannot pass more than 100 arguments to a function
From | raf |
---|---|
Subject | Re: ERROR: cannot pass more than 100 arguments to a function |
Date | |
Msg-id | 20200408213413.kyohygx2ick2zo7f@raf.org |
In reply to | ERROR: cannot pass more than 100 arguments to a function (Ján Máté <jan.mate@inf-it.com>) |
List | pgsql-bugs |
Ján Máté wrote:
> Hi PostgreSQL,
>
> I noticed an interesting problem with the default max. number of
> function arguments, especially related to JSON processing.
>
> I understand that there is a need to limit the max. number of
> arguments for functions, but the current limit (100) is simply too
> restrictive for tables with large number of columns (according to my
> findings it is >250 depending on column types).
>
> A simple example:
>
> - I have a table with 70 columns
>
> - I want to select the most simple JSON output from that table:
>
>       SELECT
>         json_build_object(
>           'a', "col1",
>           'b', "col2",
>           'c', "col2",
>           ...
>         )
>       FROM "schema"."table";
>
> - with the 100 arguments limit I can select only 50 columns - 50 x
>   (jsonKeyName + ColumnName) = 100 arguments
>
> - if the limit of columns in the table is >250, why I cannot select at
>   least 250 columns using the built-in json_build_object(...) function?
>
> Please someone consider to increase the default limit from 100 to at
> least 500 (2x max. number of columns) for json_build_object(...) and
> other similar JSON functions.
>
> Kind regards,
> JM

If you need that many arguments to represent the columns in a table,
then you can use a single argument whose type is the table, e.g.

    create table t (...);
    create or replace function f(a t) ...;
    select * from f(row(...));

If you need that many arguments, and they don't represent a table, you
can create an empty table just for its type definition and then do the
above.

cheers,
raf
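To make the sketch above concrete, here is a minimal worked example of the single-composite-argument approach. The table `t`, its columns, and the function `f` are placeholder names, and the three-column table stands in for one with many more columns:

    -- placeholder table whose row type replaces many separate arguments
    create table t (col1 int, col2 text, col3 boolean);

    -- the function takes ONE argument of the table's composite type,
    -- so the per-call argument count stays at 1 regardless of column count
    create or replace function f(a t) returns json
    language sql
    as $$
        select json_build_object('a', a.col1, 'b', a.col2, 'c', a.col3)
    $$;

    insert into t values (1, 'x', true);

    -- pass the whole row as a single argument
    select f(t) from t;

    -- or construct the row value explicitly
    select f(row(1, 'x', true)::t);

The same pattern works for a table created only for its type definition: the rows passed in never have to be stored, since `row(...)::t` builds the composite value directly.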