Discussion: Bulk Load and Extract from PostgreSQL


Bulk Load and Extract from PostgreSQL

From:
Sandeep Khandelwal
Date:
Hi All.

I want to extract and load data from PostgreSQL using the libpq C API. Please let me know which approach will be good
for loading a large number of rows into PostgreSQL (INSERT or COPY FROM), and which approach will be good for
extracting a large number of rows from PostgreSQL (COPY TO or SELECT).
I want to handle all the data types supported in PostgreSQL.

Thanks,
Sandeep




Re: Bulk Load and Extract from PostgreSQL

From:
Sean Davis
Date:
On Monday 16 October 2006 03:07, Sandeep Khandelwal wrote:
> Hi All.
>
> I want to extract and load data from PostgreSQL using the libpq C API. Please
> let me know which approach will be good for loading a large number of rows
> into PostgreSQL (INSERT or COPY FROM), and which approach will be good for
> extracting a large number of rows from PostgreSQL (COPY TO or SELECT). I want
> to handle all the data types supported in PostgreSQL.

COPY is the faster way to go for a single table.

Sean
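For the COPY FROM STDIN route, each row is handed to the server as one text line via PQputCopyData. Below is a minimal sketch of the client-side row formatting in plain C, with no libpq calls; the helper name is mine, and it assumes COPY's default TEXT mode (tab delimiters, newline row terminator):

```c
#include <stddef.h>

/* Hypothetical helper: format one row for COPY ... FROM STDIN in the
 * default TEXT mode, where fields are tab-separated and the row ends in
 * a newline.  Tab, newline, carriage return, and backslash occurring
 * inside a field value must therefore be backslash-escaped.  The caller
 * must size `out` generously (worst case 2 bytes per input byte, plus
 * delimiters, terminator, and NUL). */
static size_t format_copy_row(const char **fields, size_t nfields, char *out)
{
    size_t n = 0;
    size_t f;

    for (f = 0; f < nfields; f++) {
        const char *p;

        if (f > 0)
            out[n++] = '\t';              /* field delimiter */
        for (p = fields[f]; *p; p++) {
            switch (*p) {
            case '\\': out[n++] = '\\'; out[n++] = '\\'; break;
            case '\t': out[n++] = '\\'; out[n++] = 't';  break;
            case '\n': out[n++] = '\\'; out[n++] = 'n';  break;
            case '\r': out[n++] = '\\'; out[n++] = 'r';  break;
            default:   out[n++] = *p;                    break;
            }
        }
    }
    out[n++] = '\n';                      /* row terminator */
    out[n] = '\0';
    return n;
}
```

The usual bulk-load pattern is to issue COPY tablename FROM STDIN, pass each such line to PQputCopyData(conn, line, n), and finish with PQputCopyEnd; NULL markers (\N) and output-buffer sizing are left out of the sketch.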


Re: Bulk Load and Extract from PostgreSQL

From:
Sandeep Khandelwal
Date:
Hi Sean.

Thanks for the reply.

I have one more question. While extracting data using the COPY TO command (in TEXT mode), we have an API,
PQunescapeBytea, to convert the string representation of binary data (bytea) into binary. Similarly, we need to convert
binary data into its string representation while loading data into PostgreSQL using the COPY FROM command in TEXT mode,
with PQputCopyData. But there is no API to convert binary data into its string representation. Could you tell me the
API, or am I misinterpreting something?

Thanks in advance,
Sandeep
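Regarding the bytea question above, one workable approach, sketched here in plain C (the helper is hypothetical, not a libpq function): emit each non-printable byte as a double-backslash octal escape. COPY's TEXT-mode parser reduces \\ to \, and the bytea input routine then consumes the remaining \nnn octal form. PQescapeBytea targets SQL literals rather than COPY data, which is why a client-side helper like this fills the gap.

```c
#include <stdio.h>
#include <stddef.h>

/* Hypothetical helper (not part of libpq): escape raw binary data for a
 * bytea field in a COPY ... FROM STDIN (TEXT mode) stream.  Two escaping
 * layers stack: COPY turns \\ back into \, and the bytea input routine
 * then reads \nnn as an octal byte, so each escaped byte is emitted here
 * as "\\nnn".  Printable ASCII other than backslash passes through
 * untouched.  `out` must hold at least 5*len + 1 bytes. */
static size_t copy_text_escape_bytea(const unsigned char *in, size_t len,
                                     char *out)
{
    size_t i, n = 0;

    for (i = 0; i < len; i++) {
        unsigned char c = in[i];

        if (c >= 0x20 && c <= 0x7e && c != '\\')
            out[n++] = (char) c;          /* printable: copy as-is */
        else
            n += (size_t) sprintf(out + n, "\\\\%03o", c);  /* e.g. \\007 */
    }
    out[n] = '\0';
    return n;
}
```

The reverse direction after COPY TO is what PQunescapeBytea already covers, as noted in the message above.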



---------------------------(end of broadcast)---------------------------
TIP 9: In versions below 8.0, the planner will ignore your desire to choose an index scan if your joining column's
datatypes do not match




Re: Bulk Load and Extract from PostgreSQL

From:
"Davis, Sean (NIH/NCI) [E]"
Date:
Sandeep,

Try looking here:

http://www.postgresql.org/docs/8.1/interactive/libpq-exec.html

Under "Escaping Binary Strings for Inclusion in SQL Commands"?  Will that satisfy your needs, or did I miss what you
are trying to do?

Sean








Re: Bulk Load and Extract from PostgreSQL

From:
Sandeep Khandelwal
Date:
Hi Sean.

The reference that you mentioned is just for inclusion in SQL commands directly. But...

I am trying to load comma-separated data using the COPY FROM STDIN command (TEXT mode) and the PQputCopyData API. The
COPY FROM STDIN command requires the user to supply a comma-separated text version of all the data.

My question is: if I have binary data (bytea), then I have to convert this data into string (char *) form and pass it
to PostgreSQL using the PQputCopyData API. And I don't have any API in libpq to convert binary data into a string.

Thanks,
Sandeep






---------------------------(end of broadcast)---------------------------
TIP 1: if posting/reading through Usenet, please send an appropriate subscribe-nomail command to
majordomo@postgresql.org so that your message can get through to the mailing list cleanly




Re: Bulk Load and Extract from PostgreSQL

From:
Sean Davis
Date:
On Tuesday 17 October 2006 08:18, Sandeep Khandelwal wrote:
> Hi Sean.
>
> The reference that you mentioned is just for inclusion in SQL commands
> directly. But...
>
> I am trying to load comma-separated data using the COPY FROM STDIN command
> (TEXT mode) and the PQputCopyData API. The COPY FROM STDIN command requires
> the user to supply a comma-separated text version of all the data.
>
> My question is: if I have binary data (bytea), then I have to convert this
> data into string (char *) form and pass it to PostgreSQL using the
> PQputCopyData API. And I don't have any API in libpq to convert binary data
> into a string.

Sandeep, 

I see the issue--sorry about the misunderstanding.  I'm not sure what the 
"correct" answer is, but you could simply do base64 client-side encoding and 
decoding of your data.  

Sean
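Sean's base64 suggestion can be implemented entirely client-side. Here is a minimal encoder sketch in C; the function name is illustrative (libpq offers no base64 call), and storing base64 text implies either a text column or something like the server-side decode(..., 'base64') function turning it back into bytea:

```c
#include <stddef.h>

/* Minimal base64 encoder, one way to do the client-side encoding
 * suggested above.  `out` must hold 4 * ((len + 2) / 3) + 1 bytes.
 * Base64 output is plain ASCII with no COPY-special characters, so it
 * needs no further escaping in a TEXT-mode COPY stream. */
static size_t base64_encode(const unsigned char *in, size_t len, char *out)
{
    static const char tbl[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
        "0123456789+/";
    size_t i, n = 0;

    for (i = 0; i < len; i += 3) {
        unsigned long v = (unsigned long) in[i] << 16;

        if (i + 1 < len) v |= (unsigned long) in[i + 1] << 8;
        if (i + 2 < len) v |= in[i + 2];
        out[n++] = tbl[(v >> 18) & 63];       /* 6 bits at a time */
        out[n++] = tbl[(v >> 12) & 63];
        out[n++] = (i + 1 < len) ? tbl[(v >> 6) & 63] : '=';  /* pad */
        out[n++] = (i + 2 < len) ? tbl[v & 63] : '=';
    }
    out[n] = '\0';
    return n;
}
```

The matching decoder runs on the client after COPY TO, reversing the transformation before the data is used as binary.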


Re: Bulk Load and Extract from PostgreSQL

From:
Sandeep Khandelwal
Date:
Hi Sean.

I will try your option. Thanks a lot for the information.

Thanks,
Sandeep

