Discussion: Automation again, more info.. HELP
Sorry about the lack of info..
OS: Red Hat 6.2
mailbox01 --> PostgreSQL 6.5
mailbox02 --> PostgreSQL 7.0.2
Preferred language --> Tcl
I have a database of email users with a lot of tables, e.g. BASIC_INFO, ADS_STATS, etc.
I want to automate dumping data from mailbox01 (the machine's name) and appending it to the existing database on mailbox02 (a backup of mailbox01) so the marketing dept. can do their job. My idea so far is just to query for the rows I need, but I'd welcome suggestions and better ideas on how to do this.
I hate typing the same query again and again: WHERE date is ... bla bla ...
ONE BIG PROBLEM....
If I'm going to create a single script to place in cron, what if I only want, say, the month of October? I'd be dumping all the data again.... a waste of time and CPU hours hehehehe...
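A small sketch of one way to avoid re-dumping old data (all names here are invented for illustration, not from the thread): the script remembers the date of its last successful run in a state file, so each cron run only builds a query for rows newer than the previous dump.

```python
import datetime

STATE_FILE = "/var/tmp/last_dump.txt"   # hypothetical state file


def build_incremental_query(table, state_file=STATE_FILE):
    """Build a SELECT covering only rows since the last recorded dump.

    Returns (query, today) so the caller can record `today` after the
    dump succeeds. On the very first run there is no state file, so the
    query covers everything.
    """
    try:
        with open(state_file) as f:
            last = f.read().strip()
    except IOError:
        last = "1970-01-01"             # first run: take everything
    today = datetime.date.today().isoformat()
    query = ("SELECT * FROM %s WHERE date >= '%s' AND date < '%s'"
             % (table, last, today))
    return query, today


def record_run(today, state_file=STATE_FILE):
    """Remember where this dump ended, so the next run starts there."""
    with open(state_file, "w") as f:
        f.write(today)
```

In use, you would feed the generated query to psql (or pgtcl) against mailbox01, append the resulting rows to mailbox02, and only then call `record_run(today)`, so a failed dump gets retried from the same point.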
Thanks in advance
On Fri, 13 Oct 2000, Jerome Macaranas wrote:

> Sorry about the lack of info..
>
> OS: Redhat 6.2
> mailbox01-->Postgresql 6.5
> mailbox02-->Postgresql 7.02
> Preferred language-->TCL

PostgreSQL can be run in one of two configurations: allowing network connections or not. If both computers allow network connections, either can be "home". If one allows network connections and the other doesn't, I would use the non-network machine as home.

I believe there is a PG library for Tcl, but if it is like the Perl one, it is specific to a version of PostgreSQL, which complicates things a little.

Whichever machine you decide to call home, I think you should put your query server there (or whatever you want to call it). This is what is started by cron (or you could just have it run all the time :-). (I am going to assume these two computers agree on what time it is.) Your server is probably going to "calculate" an SQL SELECT query for each server, and it is going to start a "remote" script which is linked against the specific PostgreSQL library needed for that machine. This script will take the SQL statement to execute as an argument. The script will execute the SQL statement and then open a socket back to the query server to hand it the results. You might want to open two sockets if two-way communication is needed.

That is how I think I would approach the problem. However, I have never implemented something like this, just read bits here and there. Maybe someone else has a better idea.

Gord

Matter Realisations                     http://www.materialisations.com/
Gordon Haverland, B.Sc. M.Eng. President
101 9504 182 St. NW  Edmonton, AB, CA  T5T 3A7
780/481-8019   ghaverla @ freenet.edmonton.ab.ca
780/993-1274 (cell)
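The query-server idea above can be sketched roughly as follows (everything here is invented for illustration; a real remote script would execute the SQL through the version-specific PostgreSQL library instead of returning canned rows, and would be launched on the other machine rather than in a thread): the home side listens on a socket, hands the remote script an SQL statement, and collects whatever the script pushes back.

```python
import socket
import threading


def run_remote_script(sql, server_addr):
    """Stand-in for the per-machine remote script. In reality it would
    run `sql` against the local PostgreSQL (6.5 or 7.0.2) and ship the
    rows back; here the rows are canned so the sketch is self-contained."""
    rows = "jerome\t2000-10-13\nmacky\t2000-10-13\n"  # fake query output
    conn = socket.create_connection(server_addr)
    conn.sendall(rows.encode())
    conn.close()


def query_server(sql):
    """The 'home' side: open a listening socket, start the remote
    script, and collect the results it sends back over the socket."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))          # any free port
    srv.listen(1)
    addr = srv.getsockname()
    # In real life this would be an rsh/ssh invocation on the other box.
    worker = threading.Thread(target=run_remote_script, args=(sql, addr))
    worker.start()
    conn, _ = srv.accept()
    chunks = []
    while True:
        data = conn.recv(4096)
        if not data:                    # remote closed: results complete
            break
        chunks.append(data)
    conn.close()
    srv.close()
    worker.join()
    return b"".join(chunks).decode()


results = query_server("SELECT * FROM basic_info WHERE date >= '2000-10-01'")
```

The single result socket matches the one-way handoff described above; a second socket (or a request/reply protocol on the same one) would be needed if the server had to talk back to the remote script.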
On Fri, 13 Oct 2000, Macky wrote:

> Hi thanks for the suggestion but as much as possible I want to minimize
> production of our servers.... if going to use an SQL SELECT it would have
> great impact on performance before we usually do a COPY ...

There must be something to this situation that I am missing. If you need to find the information, the computer has to do work. What cron "buys" you is the ability to schedule that work at some time of the day when the load is low enough to support the added computation. If there is no time of the day when this is true, the problem you have is that you don't have enough computing power, and you need to buy more hardware. Even doing a pg_dump to a remote machine is going to take CPU.

Gord
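Concretely, the scheduling that cron "buys" you is just a crontab entry pinned to a low-load hour (the script path here is hypothetical):

```
# min hour dom mon dow  command
30   3    *   *   *    /usr/local/bin/incremental_dump.tcl
```

The dump still costs the same CPU; it just spends it at 03:30 instead of during the marketing department's working hours.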