Discussion: Consistency problem with unlabeled intervals

Consistency problem with unlabeled intervals

From: Tom Lane
Date:
In current sources:

regression=# select '60'::interval;
 interval
----------
 60:00
(1 row)

regression=# select '1.5'::interval;
  interval
-------------
 00:00:01.50
(1 row)

That is, '60' is read as so many hours, '1.5' is read as so many
seconds.  This seems a tad inconsistent.

7.2 does the same thing, 7.1 says
ERROR:  Bad interval external representation '60'
but takes '1.5' as meaning 1.5 seconds.

I'd prefer to standardize on a unit of seconds myself.
        regards, tom lane
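
The ambiguity disappears once units are spelled out; a minimal sketch
(display format varies with server version and DateStyle):

select '60 hours'::interval;    -- unambiguously 60 hours
select '60 seconds'::interval;  -- unambiguously 60 seconds (1 minute)
select '1.5 seconds'::interval; -- fractional seconds, stated explicitly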


Re: Consistency problem with unlabeled intervals

From: Thomas Lockhart
Date:
...
> That is, '60' is read as so many hours, '1.5' is read as so many
> seconds.  This seems a tad inconsistent.

They fulfill two separate use cases. Time zones can now be specified as
intervals, and the default unit must be hours. A number with a decimal
point is usually in units of seconds, and matches past behavior afaik.

The current behavior embodies a choice that would likely break "expected
behavior" if it were changed, not to mention the upgrade issues that
changing the conventions would raise.

I don't have my heels dug in on this, but this example doesn't cover the
range of cases the behavior was designed to handle. I'll go back and
research it if it turns out to be required.
                   - Thomas
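
The time-zone use case mentioned above looks roughly like this, using the
SQL-standard form that PostgreSQL accepts (exact version support is an
assumption here):

-- A time zone offset expressed as an interval; an unlabeled number in
-- this context is naturally a count of hours:
SET TIME ZONE INTERVAL '-08:00' HOUR TO MINUTE;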


Re: Consistency problem with unlabeled intervals

From: Tom Lane
Date:
Thomas Lockhart <lockhart@fourpalms.org> writes:
>> That is, '60' is read as so many hours, '1.5' is read as so many
>> seconds.  This seems a tad inconsistent.

> They fulfill two separate use cases. Time zones can now be specified as
> intervals, and the default unit must be hours. A number with a decimal
> point is usually in units of seconds, and matches past behavior afaik.

Hm.  Well, if this behavior is intentional, it'd be nice to document it.
The existing paragraphs about interval's I/O format don't mention
behavior for unitless numbers at all, much less explain that a decimal
point has semantic significance.
        regards, tom lane
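
For reference, the undocumented rule under discussion would read something
like this; a sketch assuming the parsing behavior described in this thread
(actual results depend on server version):

select '2'::interval;   -- unlabeled integer: taken as 2 hours
select '2.0'::interval; -- decimal point present: taken as 2.0 seconds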