TimeTest zone offset weirdness
| From | Oliver Jowett |
|---|---|
| Subject | TimeTest zone offset weirdness |
| Date | |
| Msg-id | 42E5A7CA.6070109@opencloud.com |
| List | pgsql-jdbc |
I'm currently looking at TimeTest.testGetTimeZone() which has:

    final Time midnight = new Time(0, 0, 0);
    Calendar cal = Calendar.getInstance();
    cal.setTimeZone(TimeZone.getTimeZone("GMT"));
    long localOffset = Calendar.getInstance().get(Calendar.ZONE_OFFSET);
    [...]
    time = rs.getTime(1, cal);
    assertEquals(midnight.getTime(), time.getTime() + localOffset);

The value in the database that's being retrieved is '00:00:00' -- it is a TIME WITHOUT TIME ZONE.

As I interpret things:

- 'midnight' should represent 1970-01-01 00:00:00 in the local timezone, i.e. local midnight.
- 'time' should represent 1970-01-01 00:00:00 GMT, i.e. the database value interpreted in the provided (GMT) calendar.

However, as localOffset is the number of milliseconds that the local timezone is *ahead* of GMT, shouldn't that test be for 'time.getTime() - localOffset'?

E.g. in the NZ timezone, localOffset is 43200000 (milliseconds) as NZ is 12 hours ahead of GMT in January 1970. Then midnight.getTime() is -43200000 as NZ reaches local midnight before GMT.

Given that this doesn't fail in the current driver, presumably this means that getTime() is currently returning a Time with milliseconds = -86400000 ... which is surely wrong.

With the timestamp-related changes I'm doing I also overhauled the Time and Date parsers; now getTime() is returning a Time with milliseconds = 0 and the assertion fails.

Is this actually an existing test/driver error, or am I misinterpreting what the behaviour should be?

-O
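A minimal standalone sketch of the arithmetic above, assuming the NZ example from the post; the class name ZoneOffsetSketch is illustrative, it is not part of the driver or its test suite, and it pins the offset at the epoch rather than at the current date:

```java
import java.util.Calendar;
import java.util.TimeZone;

public class ZoneOffsetSketch {
    public static void main(String[] args) {
        // Assumption: use Pacific/Auckland as the "local" zone from the NZ example.
        TimeZone nz = TimeZone.getTimeZone("Pacific/Auckland");

        // Offset (ms) the local zone is ahead of GMT at the epoch: 43200000 for NZ.
        Calendar local = Calendar.getInstance(nz);
        local.setTimeInMillis(0);
        long localOffset = local.get(Calendar.ZONE_OFFSET);

        // Local midnight on 1970-01-01 as epoch milliseconds: -43200000 for NZ,
        // because NZ reaches its midnight 12 hours before GMT does.
        local.clear();
        local.set(1970, Calendar.JANUARY, 1, 0, 0, 0);
        long midnightMillis = local.getTimeInMillis();

        // '00:00:00' read through a GMT calendar is epoch millisecond 0.
        long timeMillis = 0L;

        // Local midnight is *earlier* than the GMT value by localOffset,
        // so the assertion should subtract the offset, not add it.
        System.out.println(midnightMillis == timeMillis - localOffset); // true
        System.out.println(midnightMillis == timeMillis + localOffset); // false
    }
}
```

With the NZ zone, the first comparison holds and the second does not, matching the 'time.getTime() - localOffset' form argued for in the post; the exact constants naturally depend on the zone chosen.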