pg_clog error
From | terry@greatgulfhomes.com |
---|---|
Subject | pg_clog error |
Date | |
Msg-id | 002401c233d5$0d40db00$2766f30a@development.greatgulfhomes.com |
Responses | Re: pg_clog error |
List | pgsql-general |
Every night I pull data from a legacy system. Last night, for the first time, I got this error message:

```
FATAL 2: open of /usr/local/pgsql/data/pg_clog/0081 failed: No such file or directory
server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.
connection to server was lost
```

Earlier in the script, imports into two other tables, both quite large (>300k tuples), completed successfully. I looked in the directory in question: the directory itself is there, but the file 0081 is not, just 0000. This of course causes the rest of the actions in the script to fail while the backend is restarting. The script has never had this problem before that I have noticed, and I confirmed that it had not happened on previous nights.

Does anyone know what causes this? Do I need to increase the number of file handles somewhere?

A detailed snippet of the occurrence is below:

<snip>
```
psql -c 'CREATE INDEX "customer_extra_budget_rm_idx" on "customer_extra_budget" using btree (
    "division_id" "bpchar_ops", "elevation" "bpchar_ops", "model_id" "bpchar_ops",
    "option_code" "bpchar_ops", "project_id" "bpchar_ops", "room_id" "bpchar_ops" );' -d devtest2
CREATE
psql -c "DROP INDEX customer_extra_costs_idx; DROP INDEX customer_extra_costs_ct_idx" -d devtest2
DROP
psql -c "delete from customer_extra_costs where division_id ='GGH';" -d devtest2
FATAL 2: open of /usr/local/pgsql/data/pg_clog/0081 failed: No such file or directory
server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.
connection to server was lost
```

Thanks

Terry Fielder
Network Engineer
Great Gulf Homes / Ashton Woods Homes
terry@greatgulfhomes.com
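[Editor's note: a minimal diagnostic sketch, not part of the original post, showing how one might relate the missing segment to a transaction-ID range. The data directory path follows the report; the pg_controldata output format is an assumption about 7.2-era builds.]

```sh
# List the clog segments that actually exist on disk:
ls -l /usr/local/pgsql/data/pg_clog/

# Compare against the current transaction counter. Each clog segment is
# 256 kB and stores 2 commit-status bits per transaction, i.e. 1,048,576
# transactions per segment. Segment 0081 (hex) = 129 (decimal), so it
# would cover XIDs 135266304 through 136314879.
pg_controldata /usr/local/pgsql/data | grep NextXID
```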
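[Editor's note: on the file-handle question, a hedged sketch of the usual places to check, assuming a Linux host; none of these values come from the original post, and whether this error is descriptor-related is not established.]

```sh
# Per-process descriptor limit for the account that starts the postmaster:
ulimit -n

# System-wide open-file limit on Linux:
cat /proc/sys/fs/file-max

# PostgreSQL also caps per-backend file usage via the max_files_per_process
# setting in postgresql.conf (check whether your release supports it), e.g.:
#   max_files_per_process = 1000
```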