Data loss when "json_populate_recordset" is used with a long column name
From | Денис Романенко |
---|---|
Subject | Data loss when "json_populate_recordset" is used with a long column name |
Date | |
Msg-id | CALSd-cppwDQ5+AmvrZ7a+XKQBCE9amS1uRK3X60=q1iL7x0SaQ@mail.gmail.com |
Replies | Re: Data loss when "json_populate_recordset" is used with a long column name |
List | pgsql-hackers |
If we create a column whose name exceeds the identifier limit (NAMEDATALEN - 1, i.e. 63 bytes by default), PostgreSQL truncates it to that length.
For example, "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName" is truncated in the database to "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVer".
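The truncation above can be sketched in Python (a minimal illustration, assuming the behavior of PostgreSQL's identifier truncation: keep at most NAMEDATALEN - 1 bytes and never split a multi-byte UTF-8 character; function and constant names here are illustrative, not the server's C code):

```python
NAMEDATALEN = 64  # PostgreSQL compile-time constant


def truncate_identifier(name: str) -> str:
    """Keep at most NAMEDATALEN - 1 bytes of the UTF-8 encoding,
    backing off so a multi-byte character is never cut in half."""
    limit = NAMEDATALEN - 1
    encoded = name.encode("utf-8")
    if len(encoded) <= limit:
        return name
    # 0b10xxxxxx bytes are UTF-8 continuation bytes; step back to a
    # character boundary before slicing
    while limit > 0 and (encoded[limit] & 0xC0) == 0x80:
        limit -= 1
    return encoded[:limit].decode("utf-8")


name = "VeryLongName" * 6  # 72 ASCII bytes, same as the example above
print(truncate_identifier(name))  # last "VeryLongName" shrinks to "Ver"
```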
Elsewhere we can still work with the full column name: SQL statements such as INSERT and UPDATE accept the long name without problems and automatically resolve it to the matching (truncated) column (thank you for that).
But if we try to populate the table with "json_populate_recordset" using the full name, the long-named column is not merely ignored: the value in that field comes back NULL.
How to reproduce:
1. create table wow("VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName" text);
2. select * from json_populate_recordset(null::wow,'[{"VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName": "haha"}]');
3. "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVer" becomes null.
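A plausible reading of the steps above: the column name stored in the catalog is the truncated one, and json_populate_recordset matches JSON keys against it exactly, so the full-length key in the JSON is never found and the field is filled with NULL. A hypothetical Python sketch of that matching logic (names and structure are illustrative, not the actual C implementation):

```python
def populate_record(column_names, json_obj):
    # The attribute name from the catalog is already truncated; the
    # lookup is an exact match against the JSON keys, so a key spelled
    # with the full, untruncated name is treated as absent -> None.
    return {col: json_obj.get(col) for col in column_names}


truncated_col = "VeryLongName" * 5 + "Ver"      # 63-byte catalog name
row = populate_record([truncated_col],
                      {"VeryLongName" * 6: "haha"})  # full 72-byte key
print(row)  # the only column maps to None -- the reported data loss
```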
P.S. Why do I need column names longer than 64 bytes: I use non-Latin characters in column and table names, so in fact I get only about 32 characters because of multi-byte Unicode encoding. (See: PostgreSQL: NAMEDATALEN increase because of non-Latin languages)
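The arithmetic behind the P.S., as a small sketch (assuming 2-byte UTF-8 characters such as Cyrillic; the 63-byte identifier budget then holds only 31 whole characters, roughly half the ASCII limit):

```python
limit_bytes = 64 - 1                              # NAMEDATALEN - 1
cyrillic_char_bytes = len("ы".encode("utf-8"))    # 2 bytes per character
print(limit_bytes // cyrillic_char_bytes)         # 31 usable characters
```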