Re: BUG #12664: numeric_recv does not accept large values
From: Tom Lane
Subject: Re: BUG #12664: numeric_recv does not accept large values
Msg-id: 29641.1422338035@sss.pgh.pa.us
In reply to: BUG #12664: numeric_recv does not accept large values (emil.lenngren@gmail.com)
List: pgsql-bugs
emil.lenngren@gmail.com writes:
> According to the documentation for data type "numeric" at
> http://www.postgresql.org/docs/9.4/static/datatype-numeric.html:
> "up to 131072 digits before the decimal point; up to 16383 digits after the
> decimal point"
> "Note: The maximum allowed precision when explicitly specified in the type
> declaration is 1000; NUMERIC without a specified precision is subject to the
> limits described in Table 8-2."

Note that that's not saying anything about how many significant digits
you can write.

> In the binary representation, digits are grouped into groups of 4. The
> number of such groups is checked in numeric_recv:
>     len = (uint16) pq_getmsgint(buf, sizeof(uint16));
>     if (len < 0 || len > NUMERIC_MAX_PRECISION + NUMERIC_MAX_RESULT_SCALE)
>         ereport(ERROR,
>                 (errcode(ERRCODE_INVALID_BINARY_REPRESENTATION),
>                  errmsg("invalid length in external \"numeric\" value")));
> but NUMERIC_MAX_PRECISION + NUMERIC_MAX_RESULT_SCALE is 3000 which means a
> limit of only 12000 digits. numeric_in has no such limit.

It's true that numeric_in and numeric_recv aren't too consistent about
the limits they enforce. I doubt that "remove the limits" is the right
answer for that, though. In particular, allowing hundreds of thousands
of digits would be a good way to lock a backend up for very long times
in simple arithmetic operations ...

			regards, tom lane
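As a rough sketch of where the 12000-digit figure comes from, assuming the constants used by numeric.c at the time (NUMERIC_MAX_PRECISION = 1000, NUMERIC_MAX_RESULT_SCALE = NUMERIC_MAX_PRECISION * 2, and 4 decimal digits per base-10000 digit group), a standalone C snippet like the following reproduces the arithmetic behind the check quoted above:

    #include <stdio.h>

    /* Assumed values, mirroring PostgreSQL's numeric.c of that era. */
    #define NUMERIC_MAX_PRECISION    1000
    #define NUMERIC_MAX_RESULT_SCALE (NUMERIC_MAX_PRECISION * 2)
    #define DEC_DIGITS               4  /* decimal digits per base-10000 group */

    int
    main(void)
    {
        /* numeric_recv rejects input with more digit groups than this ... */
        int max_groups = NUMERIC_MAX_PRECISION + NUMERIC_MAX_RESULT_SCALE;

        /* ... which caps binary input at max_groups * DEC_DIGITS decimal
         * digits, while numeric_in (text input) enforces no such limit. */
        printf("max digit groups accepted by numeric_recv: %d\n", max_groups);
        printf("max decimal digits over the binary protocol: %d\n",
               max_groups * DEC_DIGITS);
        return 0;
    }

In practice this is the inconsistency the report describes: a value with, say, 13000 significant digits loads fine through numeric_in as a text literal, but the same value is rejected by numeric_recv when it arrives over the binary protocol.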