Re: When to store data that could be derived
From | Frank
---|---
Subject | Re: When to store data that could be derived
Date |
Msg-id | 3b9874b0-1e69-d273-9be5-b3cf65811b54@chagford.com
In reply to | Re: When to store data that could be derived ("Peter J. Holzer" <hjp-pgsql@hjp.at>)
Responses | Re: When to store data that could be derived
List | pgsql-general
On 2019-03-24 2:41 PM, Peter J. Holzer wrote:
> On 2019-03-24 10:05:02 +0200, Frank wrote:

Many thanks to Peter et al for their valuable insights. I have learned a lot.

> So the important part here is not whether data is added, but whether
> data is changed. Sure, new transactions are added all the time. But is
> it expected that the data used to derive amount_cust and amount_local
> (e.g. the exchange rate) is changed retroactively, and if it is, should
> the computed amount change? (I'm a bit worried about the join with the
> customers table here - what happens when a customer moves their
> headquarters to a country with a different currency?)

I think I have both of those covered. I store the exchange rates in
physical columns on the transaction, so the computed expressions will
always return the same values.

I have separate tables for 'organisations' (O) and for 'customers' (C).
C has a foreign key reference to O, and most static data, such as
addresses and contact details, is stored on O. So if a customer moved,
I would create a new C record with the new currency and flag the old C
record as inactive. They can happily co-exist, so receipts can be
processed against the old C record until it is paid up.

I have started doing some volume tests, and at this stage, for the kind
of volumes I am concerned about, it looks as if performance is a
non-issue. I generated about 22000 invoices and 22000 receipts, over 12
customers and 6 months. Invoices and receipts are stored in separate
tables, and a VIEW presents them as a single table. Using the VIEW, I
selected all transactions for a given customer for a given month. It
returned 620 rows and (on my slow desktop computer) took 20ms. I can
live with that.

I will generate some higher volumes overnight and see if it makes a big
difference. If you do not hear from me, you can consider it 'problem
solved' :-)

Again, thanks to all.

Frank
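A minimal sketch of the first point (the exchange rate stored physically on the transaction, so the derived local amount never changes retroactively) might look like the table below. All table and column names are assumptions for illustration, not taken from Frank's actual schema, and the generated column needs PostgreSQL 12 or later; a plain column filled in by the application, or an expression in a view, achieves the same thing.

    -- Sketch only: names are hypothetical, not from the post.
    -- The exchange rate in force at posting time is copied onto the row,
    -- so amounts derived from it stay stable even if rates change later.
    CREATE TABLE invoices (
        inv_id        serial PRIMARY KEY,
        cust_id       int NOT NULL,            -- references customers (see below)
        inv_date      date NOT NULL,
        amount_cust   numeric(15,2) NOT NULL,  -- amount in the customer's currency
        exch_rate     numeric(15,6) NOT NULL,  -- rate captured when the invoice is posted
        amount_local  numeric(15,2) GENERATED ALWAYS AS
                          (round(amount_cust * exch_rate, 2)) STORED
    );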
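The organisations/customers split could be sketched as follows; again the names and columns are assumptions. The point is that a currency change is handled by inserting a new customers row and flagging the old one inactive, rather than updating a row that existing transactions already reference.

    -- Sketch only: static data (addresses, contacts) lives on organisations;
    -- the currency lives on customers, so old and new customer records can
    -- co-exist until the old one is paid up.
    CREATE TABLE organisations (
        org_id   serial PRIMARY KEY,
        org_name text NOT NULL,
        address  text,
        contact  text
    );

    CREATE TABLE customers (
        cust_id  serial PRIMARY KEY,
        org_id   int NOT NULL REFERENCES organisations,
        currency text NOT NULL,
        active   boolean NOT NULL DEFAULT true
    );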
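The post does not show the VIEW definition, but a UNION ALL over the two transaction tables is one common way to present them as a single table. The sketch below builds on the hypothetical invoices table above and assumes a receipts table of the same shape; the customer id and dates in the final query are example values only.

    -- Sketch only: receipts is assumed to mirror invoices.
    CREATE TABLE receipts (
        rec_id        serial PRIMARY KEY,
        cust_id       int NOT NULL,
        rec_date      date NOT NULL,
        amount_cust   numeric(15,2) NOT NULL,
        exch_rate     numeric(15,6) NOT NULL,
        amount_local  numeric(15,2) GENERATED ALWAYS AS
                          (round(amount_cust * exch_rate, 2)) STORED
    );

    CREATE VIEW ar_transactions AS
        SELECT 'inv'::text AS tran_type, inv_id AS tran_id, cust_id,
               inv_date AS tran_date, amount_local
          FROM invoices
        UNION ALL
        SELECT 'rec', rec_id, cust_id, rec_date, amount_local
          FROM receipts;

    -- All transactions for one (hypothetical) customer for one month:
    SELECT *
      FROM ar_transactions
     WHERE cust_id = 42
       AND tran_date >= date '2019-03-01'
       AND tran_date <  date '2019-04-01';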