The idea behind a streaming CSV export was to reduce memory use by avoiding building the entire CSV file in memory before sending it to the client. In practice it didn't work out that way: the query objects created to represent each line caused Postgres to generate a very large temp file (~200MB on bookwyrm.social), and the memory used by each Query object was likely comparable to, if not larger than, the memory used by the finalized CSV row. While we should eventually run CSV exports as a Celery task, this change should allow CSV exports to work on large servers without causing disk-space problems.

Fixes: #2157
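For context, a minimal sketch of the non-streaming approach this change moves toward: build every finalized row up front with `csv.writer` and return a plain `HttpResponse`, rather than handing a generator of per-row queries to `StreamingHttpResponse`. The view and helper names (`export_user_csv`, `get_books_for_export`) and the column fields are hypothetical, not taken from the actual diff.

```python
import csv
import io

from django.http import HttpResponse


def get_books_for_export(user):
    """Hypothetical helper: return the queryset of books to export for this user."""
    return user.shelfbook_set.select_related("book")


def export_user_csv(request):
    """Build the whole CSV in memory, then send it in a single response."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["title", "author", "isbn"])

    # Evaluate the queryset and write finalized rows immediately, instead of
    # keeping per-row query objects alive for the lifetime of a streamed response.
    for shelf_book in get_books_for_export(request.user):
        book = shelf_book.book
        writer.writerow([book.title, book.author_text, book.isbn_13])

    return HttpResponse(
        buffer.getvalue(),
        content_type="text/csv",
        headers={"Content-Disposition": 'attachment; filename="export.csv"'},
    )
```

The trade-off is that the finished CSV is held in memory for the duration of the response, but each row is a small string rather than a live query, which avoids the large Postgres temp files described above.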