Discussion: Postgres jdbc bulk insert stuck


Postgres jdbc bulk insert stuck

From: dmachop
Date:
Hi,

I have a large number of records (about a million) being sent to a
Postgres 8.4 table (around 250+ columns, 25+ of them text columns). I decided
to split the load into batches of 1000 records. The strange thing I observed is
that the batch insert gets stuck while individual inserts work fine. I enabled
logging to see what's happening on the server; it seems to be stuck at this
phase (statement shortened):

2016-05-20 10:06:38 EDT LOG:  duration: 1.334 ms  parse S_102: INSERT INTO
Test (<col_names>) VALUES($1,..., $290)

AFAIK, the log should show a parse, then a bind with parameters, then an
execute with those parameters; here only the parse is printed.
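For reference, the $1 … $290 in that log line are the server-side parameter numbers that pgjdbc assigns to the JDBC ? placeholders when it prepares the statement. A hypothetical helper for building such a wide parameterized INSERT (the names here are illustrative, not from the original code) might look like:

```java
import java.util.StringJoiner;

public class InsertSqlBuilder {
    // Builds "INSERT INTO <table> (c1, c2, ...) VALUES (?, ?, ...)".
    // pgjdbc turns each JDBC ? into a server-side $n parameter, which is
    // why the server log shows VALUES($1, ..., $290) for a 290-parameter row.
    public static String buildInsertSql(String table, String[] columns) {
        StringJoiner cols = new StringJoiner(", ");
        StringJoiner params = new StringJoiner(", ");
        for (String c : columns) {
            cols.add(c);
            params.add("?");
        }
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + params + ")";
    }
}
```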

A sample snippet where the issue occurs:
try
{
    dbUtil.connection.setAutoCommit(false);
    for(...)
    {
        batchCount++;
        if (batchCount % BATCHSIZE == 0)
        {
            //dbUtil.nonBatch(insertSql, Utility.listtoArray(recordList));
            dbUtil.batch(insertSql, Utility.listtoArray(recordList));
            recordList.clear();
        }
    }

    dbUtil.commit();
}

finally
{
    dbUtil.connection.setAutoCommit(true);
}
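One thing worth noting about a loop of this shape: it only flushes when the running count hits an exact multiple of BATCHSIZE, so if the total row count is not a multiple of the batch size, the trailing partial batch is never sent before the commit. A minimal sketch of the flushing logic, with a hypothetical BatchSink standing in for dbUtil.batch (assumed names, not the original code), that also flushes the remainder:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlushSketch {
    /** Stand-in for dbUtil.batch(insertSql, rows). */
    interface BatchSink {
        void batch(List<String[]> rows);
    }

    /**
     * Feeds rows to the sink in batches of batchSize and returns the number
     * of flushes performed. Unlike a "count % BATCHSIZE == 0" check alone,
     * it also flushes the trailing partial batch, so no rows are silently
     * left behind at commit time.
     */
    public static int insertAll(List<String[]> rows, int batchSize, BatchSink sink) {
        List<String[]> pending = new ArrayList<>();
        int flushes = 0;
        for (String[] row : rows) {
            pending.add(row);
            if (pending.size() == batchSize) {
                sink.batch(new ArrayList<>(pending));
                pending.clear();
                flushes++;
            }
        }
        if (!pending.isEmpty()) { // trailing partial batch
            sink.batch(new ArrayList<>(pending));
            flushes++;
        }
        return flushes;
    }
}
```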

These are the things I have tried:
1. Maybe the batch size is too big to handle, so I reduced it to 256; the
issue still persists.
2. A data issue? I skipped a few rows in between to check whether a particular
row was the culprit; the issue still appears.
3. OS-specific? I was originally running the application on a Linux box,
sending to Postgres installed on Linux. I then tried running the same
application on Windows, sending to the same Postgres on Linux.
*Surprisingly, this worked when the application ran on Windows.*
4. Committing in between to ensure the rows are inserted immediately. Still
to no avail.



*Details:*
Box details:

LSB_VERSION=base-4.0-amd64:base-4.0-noarch:core-4.0-amd64:core-4.0-noarch:graphics-4.0-amd64:graphics-4.0-noarch:printing-4.0-amd64:printing-4.0-noarch
Red Hat Enterprise Linux Server release 6.5 (Santiago)
Java:
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
Postgres driver:
// http://mvnrepository.com/artifact/org.postgresql/postgresql
compile group: 'org.postgresql', name: 'postgresql', version:
'9.4.1208.jre7'




--
View this message in context: http://postgresql.nabble.com/Postgres-jdbc-bulk-insert-stuck-tp5904350.html
Sent from the PostgreSQL - jdbc mailing list archive at Nabble.com.


Re: Postgres jdbc bulk insert stuck

From: Dave Cramer
Date:
Yes, that does not look like it is doing a batch insert. It looks like it is doing plain inserts.
Does this happen with any other version of the driver ?


On 20 May 2016 at 10:42, dmachop <dmachop@gmail.com> wrote:


--
Sent via pgsql-jdbc mailing list (pgsql-jdbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-jdbc


Re: Postgres jdbc bulk insert stuck

From: dmachop
Date:
Tried with an older version of the postgres jdbc driver
<http://mvnrepository.com/artifact/org.postgresql/postgresql/9.3-1104-jdbc41>
and the issue still exists.

I recall reading on Stack Overflow that batch insert in postgres does
individual inserts, and that COPY is recommended instead. However, I cannot
use COPY for certain reasons, which is why I have to resort to batch insert.
Is there a way I could change some parameters so that it does not hit a
maximum limit? (It seems to be an issue with larger tables; batch insert
works on smaller tables.)
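On the question of a maximum limit: with JDBC batching over the extended protocol, each row travels as its own Bind/Execute pair, so there is no per-batch cap on total parameters. A hard cap only appears if rows are merged into a single multi-row VALUES statement, because the protocol's Bind message carries a 16-bit parameter count (at most 65535 bound parameters per statement). A quick back-of-the-envelope check for a 290-column table:

```java
public class ParamLimit {
    // The extended-protocol Bind message uses an Int16 parameter count,
    // so one statement can carry at most 65535 bound parameters.
    static final int MAX_BIND_PARAMS = 65535;

    /** Max rows that fit in a single multi-row VALUES statement. */
    public static int maxRowsPerStatement(int columns) {
        return MAX_BIND_PARAMS / columns;
    }
}
```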

Just some additional info: the data may contain non-ASCII characters
(Chinese, German), and I am using the UTF-8 option on the DB as well as in
the JDBC connection properties. I don't see that as the issue, since manual
inserts work fine.





Re: Postgres jdbc bulk insert stuck

From: Dave Cramer
Date:
Pretty sure we don't do single inserts.

Do you have a sense of where this breaks down? How many inserts does it take before it fails?


On 21 May 2016 at 16:34, dmachop <dmachop@gmail.com> wrote:

Re: Postgres jdbc bulk insert stuck

From: Vitalii Tymchyshyn
Date:
Just a wild guess.
Does it stop on the first batch?
Have you got any triggers on the table?
Are there any notices sent back to the client? There is a certain limitation related to client TCP read/write sequencing.

Best regards, Vitalii Tymchyshyn

Fri, 20 May 2016 at 10:47, dmachop <dmachop@gmail.com> wrote:


Re: Postgres jdbc bulk insert stuck

From: dmachop
Date:
The table doesn't contain any keys, indices, triggers, etc.
This is the first time the table is created: I run a create statement and
then run the inserts.


Vitalii Tymchyshyn-2 wrote
> Does it stop on the first batch?

No. In fact, four batches are processed and it gets stuck at the fifth.
I tried different batch sizes, 256 and 1000 (stuck after processing 1280 and
5000 rows respectively).


Vitalii Tymchyshyn-2 wrote
> Have you got any triggers on the table?

None whatsoever. Just created a table and nothing else.


Vitalii Tymchyshyn-2 wrote
> Are there any notices sent back to the client? There is a certain
> limitation related to client TCP read/write sequencing.

I don't receive any notices. I suspended the thread in the debugger, and this
is the stack dump where it appears to have been stuck for an hour or more:
Thread [BulkDownloader-1] (Suspended)
owns: BufferedOutputStream (id=5108)
owns: QueryExecutorImpl (id=4483)
SocketOutputStream.socketWrite0(FileDescriptor, byte[], int, int) line: not available [native method] [local variables unavailable]
SocketOutputStream.socketWrite(byte[], int, int) line: 109
SocketOutputStream.write(byte[], int, int) line: 153
BufferedOutputStream.flushBuffer() line: 82
BufferedOutputStream.write(byte[], int, int) line: 121
BufferedOutputStream(FilterOutputStream).write(byte[]) line: 97
PGStream.Send(byte[]) line: 229
QueryExecutorImpl.sendParse(SimpleQuery, SimpleParameterList, boolean) line: 1327
QueryExecutorImpl.sendOneQuery(SimpleQuery, SimpleParameterList, int, int, int) line: 1629
QueryExecutorImpl.sendQuery(V3Query, V3ParameterList, int, int, int, QueryExecutorImpl$ErrorTrackingResultHandler, BatchResultHandler) line: 1216
QueryExecutorImpl.execute(Query[], ParameterList[], BatchResultHandler, int, int, int) line: 351
PgPreparedStatement(PgStatement).executeBatch() line: 1019
DelegatingPreparedStatement(DelegatingStatement).executeBatch() line: 345
DelegatingPreparedStatement(DelegatingStatement).executeBatch() line: 345
QueryRunner.batch(Connection, boolean, String, Object[][]) line: 152
QueryRunner.batch(Connection, String, Object[][]) line: 92
DBUtil.batch(String, Object[][]) line: 141
DBWriter.merge(SalesforceObject) line: 172
...









Re: Postgres jdbc bulk insert stuck

From: Vladimir Sitnikov
Date:
Can you please share code that reproduces the issue? I mean the DDL to create the table, and the data for the batch insert.

Alternatively, can you try rebuilding pgjdbc with MAX_BUFFERED_RECV_BYTES=250 and check if that helps? (see https://github.com/pgjdbc/pgjdbc/blob/94549ffe6c2e643efb1f779a23673b2518e465c1/pgjdbc/src/main/java/org/postgresql/core/v3/QueryExecutorImpl.java#L291)
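For context, my reading of the idea behind that constant (a simplified model, not the actual pgjdbc implementation) is deadlock avoidance: while the client is still writing a large batch, the server is already sending per-statement responses. If the client never reads them, the server's send buffer and the client's receive buffer can both fill up, and the client's next write blocks forever, which would match the socketWrite0 stack shown earlier in the thread. A toy model of the flush-if-deadlock-risk check:

```java
public class DeadlockRiskSketch {
    // Illustrative threshold; the real pgjdbc constant and accounting differ.
    static final int MAX_BUFFERED_RECV_BYTES = 64000;

    private int estimatedPendingReplyBytes = 0;

    /**
     * Called when queueing one more statement whose reply could be up to
     * worstCaseReplyBytes. Returns true when the caller should stop writing
     * and drain replies first, so neither side's socket buffer can fill up
     * while both ends are blocked on writes.
     */
    public boolean mustSyncBeforeQueueing(int worstCaseReplyBytes) {
        estimatedPendingReplyBytes += worstCaseReplyBytes;
        if (estimatedPendingReplyBytes > MAX_BUFFERED_RECV_BYTES) {
            estimatedPendingReplyBytes = 0; // replies drained after the sync
            return true;
        }
        return false;
    }
}
```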

Vladimir
