Looking at the tracing data for this function in prod, only ~500ms is
spent in the database. My best guess is that the rest of the time goes
to transferring and constructing the user objects, which we don't use,
since we only need the IDs.
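A minimal sketch of the idea, assuming a Django-style queryset; the model,
field, and function names here are placeholders, not the actual code:

```python
from bookwyrm import models  # assumed import path; illustrative only


def follower_ids(user):
    """Return follower IDs without instantiating full User objects."""
    # values_list("id", flat=True) asks the database for just the id
    # column, so Django skips transferring and building whole user rows.
    return list(
        models.User.objects.filter(following=user).values_list("id", flat=True)
    )
```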
Persist book.subjects and add_author when form validation fails.
This does not resolve the problem of cover image uploads being dropped:
that is a broader issue and is covered separately in #2760
(we should investigate the `django-file-resubmit` plugin).
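A rough sketch of the kind of change involved, assuming a Django view; the
form class, template path, and context keys are assumptions rather than the
actual BookWyrm code:

```python
from django.shortcuts import render

from bookwyrm import forms  # assumed import; sketch only


def edit_book(request, book):
    form = forms.EditionForm(request.POST, request.FILES, instance=book)
    if not form.is_valid():
        # Re-send the submitted values so they survive the failed
        # validation instead of vanishing from the re-rendered form.
        data = {
            "book": book,
            "form": form,
            "add_author": request.POST.getlist("add_author"),
            "subjects": request.POST.getlist("subjects"),
        }
        return render(request, "book/edit/edit_book.html", data)
    # ... normal save path continues here
```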
Previously, when the deduplicate_book_data script tried to merge an
edition that was on a shelf or in a list, it would fail: when the
canonical book was added to the shelf or list, the extra fields of the
linking table for the field's "through" model were not set. They ended
up defaulting to NULL, which is not valid for some of the fields in
ShelfItem and ListItem, so Postgres wouldn't accept the rows.
To fix that, this patch skips updating fields whose linking table is
not auto-generated. The linking table appears as a separate model
anyway, so the book will be moved via that model instead.
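A minimal sketch of that check, assuming Django's model meta API; the real
helper in the merge logic may be shaped differently:

```python
def should_copy_field(field):
    """Skip many-to-many fields whose linking table is an explicit model.

    Django sets _meta.auto_created on through models it generated itself;
    explicitly defined through models (like the ShelfItem/ListItem links)
    are handled separately via their own model.
    """
    if not field.many_to_many:
        return True
    through = field.remote_field.through
    return bool(through._meta.auto_created)
```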
Fixes: #2817
GoToSocial sends 'tag' values as a single object when only one user is
mentioned, rather than as an array containing that object.
This causes BookWyrm to reject the tag, since it comes through as a
dict rather than a list.
This commit fixes this at the point where the incoming AP object is
transformed, so that "mention" tags are turned into a mention_user.
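A small sketch of that normalisation, assuming the incoming activity is
handled as a plain dict (the function name is illustrative, not the actual
code):

```python
def normalize_tags(activity):
    """Ensure the ActivityPub "tag" field is always a list.

    GoToSocial sends a bare object when only one user is mentioned, so
    wrap it in a list before the Mention entries are resolved into
    mention_users.
    """
    tags = activity.get("tag", [])
    if isinstance(tags, dict):
        tags = [tags]
    activity["tag"] = tags
    return activity
```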
Docker is removing support for docker-compose, and it no longer appears
to be possible to install it. It has been replaced by Compose V2, a
Docker plugin invoked as 'docker compose' (no hyphen).
See https://docs.docker.com/compose/compose-v2/
Thanks to @Neriderc for noticing this in #2781.