Re: Non-reentrant plperlu function & concurrent access
From | zhong ming wu
Subject | Re: Non-reentrant plperlu function & concurrent access
Date |
Msg-id | AANLkTim3ZSzvw9WJ5Htr31g3wM=yvTv33DrqL1=Y_RZx@mail.gmail.com
Response to | Re: Non-reentrant plperlu function & concurrent access (Philippe Lang <philippe.lang@attiksystem.ch>)
List | pgsql-general
On Tue, Aug 17, 2010 at 4:15 AM, Philippe Lang <philippe.lang@attiksystem.ch> wrote:
>> Hi,
>>
>> I have a non-reentrant plperlu function, which does no database
>> modification. It basically stores input data into a file, calls a unix
>> shell command, and reads the result back from another file.
>>
>> I don't really care about database isolation here, phantom reads and
>> such. That is not likely to be a problem. What could be a problem is if
>> another call to this function is fired while one is already running.
>>
>> In this specific case, I could solve the problem by generating random
>> input and output filenames, but I would prefer a more general solution,
>> like using some sort of mutex for the function. What is the best way to
>> do that under PostgreSQL? Although not designed for this (if I
>> understand correctly), would a "serializable" isolation level help
>> here?
>
> I'm answering my own post, sorry...
>
> Maybe the PostgreSQL functions "pg_try_advisory_lock_shared" and "pg_advisory_unlock_shared" are the solution?

Perl has a self-locking mechanism that uses flock on the __DATA__ filehandle. Have you tried that? Google is your friend.
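Since the goal is a mutex around the whole function body, the exclusive advisory-lock functions (pg_advisory_lock / pg_advisory_unlock) are a closer fit than the shared variants mentioned above, because shared locks can be held by several sessions at once. Below is a minimal sketch of taking the lock from inside the plperlu function itself; the function name, lock key (12345), file paths, and external command are hypothetical, not anything given in this thread.

CREATE OR REPLACE FUNCTION run_external_tool(input text) RETURNS text AS $$
    use strict;
    use warnings;

    my $input = shift;

    # Exclusive session-level advisory lock: blocks until no other backend
    # holds key 12345, so only one call runs the critical section at a time.
    spi_exec_query("SELECT pg_advisory_lock(12345)");

    my $result = eval {
        # Hypothetical fixed paths and external command.
        my $in_file  = '/tmp/tool_input.txt';
        my $out_file = '/tmp/tool_output.txt';

        open my $fh, '>', $in_file or die "cannot write $in_file: $!";
        print $fh $input;
        close $fh;

        system('/usr/local/bin/mytool', $in_file, $out_file) == 0
            or die "external command failed: $?";

        open my $rfh, '<', $out_file or die "cannot read $out_file: $!";
        local $/;                      # slurp the whole output file
        my $data = <$rfh>;
        close $rfh;
        $data;
    };
    my $err = $@;

    # Always release the lock, even if the external command failed.
    spi_exec_query("SELECT pg_advisory_unlock(12345)");
    die $err if $err;

    return $result;
$$ LANGUAGE plperlu;

If blocking is not wanted, pg_try_advisory_lock(12345) returns a boolean immediately instead of waiting. The flock-on-__DATA__ trick from the reply is a standalone-Perl-script idiom; since a plperlu function body has no __DATA__ section of its own, the pure-Perl equivalent would be to flock() a dedicated lock file on disk.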