copying json data and backslashes
From | Alastair McKinley |
---|---|
Subject | copying json data and backslashes |
Date | |
Msg-id | PAXPR02MB7600EDF4AA4898094EBD8B10E30D9@PAXPR02MB7600.eurprd02.prod.outlook.com |
Responses | Re: copying json data and backslashes |
List | pgsql-general |
Hi all,
I have come across this apparently common issue when COPY-ing json data and am wondering whether there is a better solution.
I am copying data originating from a 3rd party API into a jsonb column. The data may contain literal \r, \t and \n sequences as well as double backslashes.
I discovered that I can cast this data to a jsonb value directly but I can't COPY the data without pre-processing.
The example below illustrates my issue (only with \r, but the problem extends to other \X combinations).
do $$
lines = [r'{"test" : "\r this data has a carriage return"}']
with open("/tmp/test1.json", "w") as f:
    for line in lines:
        f.write(line.strip() + "\n")
$$ language plpython3u;

create temp table testing (data jsonb);

-- this works
insert into testing (data)
select l::jsonb
from pg_read_file('/tmp/test1.json') f,
     lateral regexp_split_to_table(f, '\n') l
where l <> '';

-- fails
copy testing (data) from '/tmp/test1.json';

-- works
copy testing (data) from program $c$ sed -e 's/\\r/\\\\u000a/g' /tmp/test1.json $c$;
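For completeness, a more general version of the same pre-processing (just a sketch, I have not tested it beyond this example) is to double every backslash, so that COPY's text-format unescaping hands the jsonb parser exactly the bytes that were in the file:

-- doubling every backslash means COPY's backslash unescaping restores the
-- original \r, \t, \n and \\ sequences before the jsonb input function sees them
copy testing (data) from program $c$ sed -e 's/\\/\\\\/g' /tmp/test1.json $c$;

That still relies on a manual search/replace step, though.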
Is there any other solution with COPY that doesn't require manually implementing a search-and-replace step to handle these edge cases?
Why does ::jsonb work but COPY doesn't? It seems a bit inconsistent.
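If it helps, I think the same two-layer unescaping can be reproduced without COPY by using an E'' string literal, which converts \r into a real carriage-return byte before the jsonb parser sees it (a minimal illustration, assuming standard_conforming_strings is on):

-- the plain literal keeps the two characters backslash + r, so jsonb sees a valid \r escape
select '{"test" : "\r ok"}'::jsonb;

-- the E'' literal converts \r to an actual 0x0D byte first, which jsonb rejects
-- as an unescaped control character inside a string
select e'{"test" : "\r fails"}'::jsonb;

The second statement fails with what looks like the same error the plain COPY produces.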
Best regards,
Alastair