PostgreSQL DB in prod, test, debug
| From | Simon Connah |
|---|---|
| Subject | PostgreSQL DB in prod, test, debug |
| Date | |
| Msg-id | NaMscvF8PaDKNcHJGR28wV7Kljc6kw4y0agCt2E3qI8hkUYUSUyj_beL825S4GfK2SjTuNVaHmRkW30ihuYmobAitd1uuLNis8nCkFCZT-0=@protonmail.com |
| Responses | Re: PostgreSQL DB in prod, test, debug<br>Re: PostgreSQL DB in prod, test, debug |
| List | pgsql-general |
Hi,

This project uses Python 3.12, Flask, psycopg 3 and PostgreSQL 15.

This is probably a stupid question, so I apologize in advance. I'm building a website using PostgreSQL, and since I've just been doing dev work on it I've manually played around with the database whenever I needed new tables or functions, for example. Now I want to start doing automated testing, which means importing a clean snapshot of the database with no data and then running the automated tests against it to check that everything still works.

What I think is the best way to do this is a pg_dump of the database (using the --schema-only flag), loaded into a test-only database that gets created at the start of the unit tests and destroyed at the end. The automated tests will insert, update, delete and select data to check that it all still works.

My main question is: does this sound OK? And if so, is there a nice way to automate the dump / restore in Python? (A rough sketch of the sort of thing I mean is below.)

Simon.
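For concreteness, here is a rough sketch of the sort of automation I mean, assuming pytest as the test runner, psycopg 3, and that pg_dump and psql are on PATH. The database names and connection strings are just placeholders, not my real setup:

```python
# Sketch only: dump the dev schema, load it into a throwaway test database
# for the test session, and drop that database afterwards.
import subprocess

import psycopg
import pytest
from psycopg import sql

SOURCE_DB = "myapp"            # placeholder: dev database to take the schema from
TEST_DB = "myapp_test"         # placeholder: throwaway database used by the tests
ADMIN_DSN = "dbname=postgres"  # placeholder: maintenance connection for CREATE/DROP DATABASE


@pytest.fixture(scope="session")
def test_db_dsn():
    # 1. Dump only the schema (tables, functions, etc.) -- no data.
    schema_sql = subprocess.run(
        ["pg_dump", "--schema-only", "--no-owner", SOURCE_DB],
        check=True, capture_output=True, text=True,
    ).stdout

    # 2. (Re)create an empty test database. CREATE/DROP DATABASE cannot run
    #    inside a transaction block, hence autocommit.
    with psycopg.connect(ADMIN_DSN, autocommit=True) as admin:
        admin.execute(sql.SQL("DROP DATABASE IF EXISTS {}").format(sql.Identifier(TEST_DB)))
        admin.execute(sql.SQL("CREATE DATABASE {}").format(sql.Identifier(TEST_DB)))

    # 3. Load the schema dump into the fresh database via psql.
    subprocess.run(
        ["psql", "-X", "-q", "-v", "ON_ERROR_STOP=1", "-d", TEST_DB],
        input=schema_sql, check=True, text=True, capture_output=True,
    )

    yield f"dbname={TEST_DB}"

    # 4. Tear down: drop the test database once the session is finished.
    with psycopg.connect(ADMIN_DSN, autocommit=True) as admin:
        admin.execute(sql.SQL("DROP DATABASE IF EXISTS {}").format(sql.Identifier(TEST_DB)))


def test_can_connect(test_db_dsn):
    # Trivial smoke test; the real tests would insert, update, delete and
    # select against the tables loaded from the schema dump.
    with psycopg.connect(test_db_dsn) as conn:
        assert conn.execute("SELECT 1").fetchone() == (1,)
```

The idea is to create and drop the database once per test session for speed; individual tests could still isolate themselves by running inside transactions that get rolled back.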