Re: json/jsonb inconsistence - 2
From: Andrew Dunstan
Subject: Re: json/jsonb inconsistence - 2
Date:
Msg-id: 5387510B.2090601@dunslane.net
In reply to: Re: json/jsonb inconsistence - 2 (Andrew Dunstan <andrew@dunslane.net>)
Responses: Re: json/jsonb inconsistence - 2
List: pgsql-hackers
On 05/29/2014 08:15 AM, Andrew Dunstan wrote:
>
> On 05/29/2014 08:00 AM, Teodor Sigaev wrote:
>> postgres=# select '["\u0000"]'::json->0;
>>  ?column?
>> ----------
>>  "\u0000"
>> (1 row)
>>
>> Time: 1,294 ms
>> postgres=# select '["\u0000"]'::jsonb->0;
>>  ?column?
>> -----------
>>  "\\u0000"
>> (1 row)
>>
>> It seems to me that escape_json() is wrongly used in
>> jsonb_put_escaped_value(); the right name for escape_json() would be
>> escape_to_json().
>
>
> That's a bug. I will look into it. I think we might need to
> special-case \u0000 on output, just as we do on input.

Actually, this is just the tip of the iceberg. Here's what 9.3 does:

andrew=# select array_to_json(array['a','\u0000','b']::text[]);
    array_to_json
---------------------
 ["a","\\u0000","b"]

I'm now wondering if we should pass through any unicode escape
(presumably validated to some extent). I guess we can't change this in
9.2/9.3 because it would be a behaviour change.

These unicode escapes have given us more trouble than any other part of
the JSON spec :-(

cheers

andrew
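
To make the pass-through idea concrete, here is a minimal standalone C
sketch. It is not the actual PostgreSQL escape_json() source; the names
escape_json_passthrough() and is_unicode_escape() are hypothetical, and
real output code would also have to handle multibyte text and the rest of
the JSON escaping rules. The point is the one branch that emits an
already-validated \uXXXX escape verbatim instead of doubling its
backslash, which is what turns "\u0000" into "\\u0000" above:

#include <ctype.h>
#include <stdio.h>

/* s points at a backslash; accept \u followed by four hex digits */
static int
is_unicode_escape(const char *s)
{
    int i;

    if (s[0] != '\\' || s[1] != 'u')
        return 0;
    for (i = 2; i < 6; i++)
    {
        if (!isxdigit((unsigned char) s[i]))
            return 0;
    }
    return 1;
}

static void
escape_json_passthrough(const char *in, FILE *out)
{
    const char *p;

    fputc('"', out);
    for (p = in; *p; p++)
    {
        if (*p == '\\' && is_unicode_escape(p))
        {
            fwrite(p, 1, 6, out);   /* emit the \uXXXX escape verbatim */
            p += 5;                 /* loop's p++ skips the sixth char */
        }
        else if (*p == '"' || *p == '\\')
        {
            fputc('\\', out);       /* ordinary specials still escaped */
            fputc(*p, out);
        }
        else if ((unsigned char) *p < 0x20)
            fprintf(out, "\\u%04x", (unsigned char) *p);
        else
            fputc(*p, out);
    }
    fputc('"', out);
}

int
main(void)
{
    /*
     * Assuming, as the thread implies, that jsonb still holds the
     * six-character escape for '["\u0000"]', pass-through prints
     * "\u0000", matching the json type's output.
     */
    escape_json_passthrough("\\u0000", stdout);
    fputc('\n', stdout);

    /* An ordinary backslash is still doubled: prints "a\\path" */
    escape_json_passthrough("a\\path", stdout);
    fputc('\n', stdout);
    return 0;
}

Compiled and run, the first call prints "\u0000" rather than "\\u0000",
while the second shows that a backslash not starting a valid escape is
doubled exactly as before, so only the pre-validated sequences change
behaviour.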