Re: large resultset
From: Jasen Betts
Subject: Re: large resultset
Msg-id: hv7r80$ogq$3@reversiblemaps.ath.cx
In reply to: large resultset (AI Rumman <rummandba@gmail.com>)
List: pgsql-php
On 2010-06-15, Andrew McMillan <andrew@morphoss.com> wrote:
> Fundamentally sending 2 million of anything can get problematic pretty
> darn quickly, unless the 'thing' is less than 100 bytes.
>
> My personal favourite would be to write a record somewhere saying 'so
> and so wants these 2 million records', and give the user a URL where
> they can fetch them from. Or e-mail them to the user, or... just about
> anything, except try and generate them in-line with the page, in a
> reasonable time for their browser to not give up, or their proxy to not
> give up, or their ISP's transparent proxy to not give up.

Email often fails for sizes over 10 MB.

> Why do they want 2 million records anyway? 2 million of what? Will
> another user drop by 10 seconds later and also want 2 million records?
> The same 2 million? Why does the user want 2 million records? Is there
> something that can be done to the 2 million records to make them a
> smaller but more useful set of information?

/Nobody/ wants a web page with 2 million lines on it (scrolling gets
tricky when each pixel is 2000 lines of data, and most browsers aren't
designed to handle it well). Still, if it's served with
"Content-Disposition: attachment" they'll be offered it for download
instead (unless they use IE and you use cookies and SSL, in which case
it doesn't work).
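As a minimal sketch of the download approach: the headers below make the browser offer the response as a file instead of rendering it inline, and the streaming helper writes rows out one at a time so the full result set never sits in PHP's memory. The function names, the filename, and the CSV format are illustrative assumptions, not anything from the thread; in a real handler the rows would come from fetching the query result row by row (e.g. pg_fetch_row in a loop) rather than from an array.

```php
<?php
// Sketch only: hypothetical helpers for serving a large result set as a
// download. Names and filename are assumptions for illustration.

// Headers to emit (via header()) before any output is sent:
// the attachment disposition triggers a download prompt instead of an
// inline 2-million-line page.
function attachmentHeaders(string $filename): array {
    return [
        'Content-Type: text/csv; charset=utf-8',
        'Content-Disposition: attachment; filename="' . $filename . '"',
    ];
}

// Write rows to an output handle one at a time, so memory use stays flat
// no matter how many rows the query produced. Returns the row count.
function streamCsv($handle, iterable $rows): int {
    $n = 0;
    foreach ($rows as $row) {
        fputcsv($handle, $row);   // one CSV line per row, flushed as we go
        $n++;
    }
    return $n;
}
```

In use, one would send each string from attachmentHeaders() with header(), open php://output, and pass streamCsv() an iterator that fetches one row from PostgreSQL per step.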