mirror of
https://github.com/booklore-app/booklore.git
synced 2026-02-17 16:07:55 +01:00
feat(metadata): Save basic metadata to EPUB file; bonus: use Calibre Custom Columns for importing metadata (#1879)
* feat: Add comprehensive Calibre metadata extraction system
  - Implemented configurable field mapping system for Calibre user_metadata
  - Added support for series_total custom column extraction
  - Pre-built mappings for ALL BookMetadata fields (20+ fields)
  - Added support for Set fields (tags, moods, categories, authors)
  - Lowercase field names with underscores (Calibre requirement)
  - Multiple ISBN support (ISBN-10 and ISBN-13 simultaneously)
  - ISBN hyphen handling (strips hyphens before validation)
  - Type-safe parsing (String, Integer, Float, Double, Set)
  - Professional logging with IDENTIFIER prefix
  - Added CALIBRE_FIELD_MAPPING_REFERENCE.txt documentation
  - Updated .gitignore and created .dockerignore for test files

* Updated gitignore

* Delete CALIBRE_FIELD_MAPPING_REFERENCE.txt

* Remove unnecessary files: example.epub and og_metadata.java

* Refactor EpubMetadataExtractor to address PR review feedback
  - Eliminate DRY violations in identifier extraction with helper methods
  - Fix Build-to-Check anti-pattern using processedFields tracking
  - Move regex patterns to static constants to avoid recompilation
  - Replace magic numbers with calculated prefix lengths
  - Consolidate repetitive Set field and pagecount parsing patterns
  - Remove excessive documentation and comments
  - Remove unrelated .dockerignore and .gitignore entries

* chore: trigger CI/CD pipeline

* Add @Singular annotation to collection fields to support builder accumulation

* Update identifiers to URN format for Calibre compatibility

* fix: resolve MetadataChangeDetectorTest failures and enhance Calibre EPUB integration
  - Fixed NullPointerException in testEdgeCase_emptyCollectionToNull_returnsTrue()
  - Fixed NullPointerException in testHasValueChanges_whenEmptySetToNull_returnsTrue()

  Implementing @Singular for moods and tags prevents metadata fields from being null, which then fails the edge-case tests. Since we are no longer double-looping during metadata extraction, @Singular is not needed.
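The "type-safe parsing" and "ISBN hyphen handling" bullets above can be sketched as follows. This is a minimal, standalone illustration: `safeParseInt`/`safeParseDouble` mirror the helper names visible in the diff further down, while `normalizeIsbn` and `looksLikeIsbn13` are hypothetical names for the hyphen-stripping step.

```java
import java.util.function.Consumer;

public class ParseHelpers {
    // "Type-safe parsing": invoke the setter only when the raw
    // Calibre value parses cleanly; malformed values are skipped
    // rather than failing the whole import.
    static void safeParseInt(String value, Consumer<Integer> setter) {
        try {
            if (value != null) setter.accept(Integer.parseInt(value.trim()));
        } catch (NumberFormatException ignored) {
        }
    }

    static void safeParseDouble(String value, Consumer<Double> setter) {
        try {
            if (value != null) setter.accept(Double.parseDouble(value.trim()));
        } catch (NumberFormatException ignored) {
        }
    }

    // "ISBN hyphen handling": strip hyphens and spaces before
    // any length-based validation (hypothetical helper names).
    static String normalizeIsbn(String raw) {
        return raw == null ? null : raw.replaceAll("[- ]", "");
    }

    static boolean looksLikeIsbn13(String raw) {
        String s = normalizeIsbn(raw);
        return s != null && s.matches("\\d{13}");
    }
}
```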
  Removing @Singular allows the edge case tests (by Balázs Szücs) to pass.
  - Removed douban fields from persistence layer (kept in DTO for DoubanBookParser)
    I was under the impression we were saving this value to the DB, but as far as I can find, we only use Douban data for searching. Adding it to the identifiers section would raise issues, since the database has nowhere to store that value. Database modifications are left to the professionals.
  - Added support for all identifier formats (URN and simple prefix)
    Calibre only detects identifiers during IMPORT if they are in URN format; however, it saves them back as a simple prefix. Booklore now handles both.
  - Added removeAllCalibreMetadata() to strip all Calibre traces
    Cleans up saved EPUBs in Booklore's library, removing all Calibre metadata traces.
  - Ensures clean, EPUB 3 compliant output with only booklore:* tags
    ALL metadata is now saved by Booklore into EPUBs using booklore:* tags. This preserves Booklore's metadata, and we can read it back during import. Calibre support is ONLY during IMPORT, as it should be, since this is the Booklore project. (If Calibre users want to extract Booklore metadata, they can build a Calibre plugin to extract booklore:* tags.)
  - No longer use #genres or #categories. We stick to dc:subject as suggested, and Booklore tags can be read from the Calibre custom column #extra_tags. We store tags as booklore:tags. Removed all tests that used #genres/#categories. Added tests to check booklore:* tags extraction.
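The dual identifier handling described above (URN form on import, simple prefix written back by Calibre) can be sketched as a prefix table that tries the longer URN prefixes first. A minimal sketch with a handful of prefixes; the field-name strings and the `resolve` helper are illustrative, not the project's actual API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IdentifierPrefixes {
    // Ordered map: URN prefixes first, so "urn:goodreads:..." is not
    // accidentally matched by a shorter simple prefix.
    private static final Map<String, String> PREFIX_TO_FIELD = new LinkedHashMap<>();
    static {
        PREFIX_TO_FIELD.put("urn:goodreads:", "goodreadsId");
        PREFIX_TO_FIELD.put("urn:amazon:", "asin");
        PREFIX_TO_FIELD.put("goodreads:", "goodreadsId");
        PREFIX_TO_FIELD.put("amazon:", "asin");
    }

    /** Returns "field=value" for a recognized identifier, or null if no prefix matches. */
    static String resolve(String identifier) {
        for (Map.Entry<String, String> e : PREFIX_TO_FIELD.entrySet()) {
            if (identifier.startsWith(e.getKey())) {
                return e.getValue() + "=" + identifier.substring(e.getKey().length());
            }
        }
        return null;
    }
}
```

Both `urn:goodreads:12345` and `goodreads:12345` then land in the same field, matching the "Booklore now handles both" behavior described above.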
* fix: Remove DoubanId copy helper call - upstream MetadataClearFlags missing isDoubanId()

* Fix Calibre moods and tags extraction from EPUB metadata
  - Add fallback logic to check alternative key names (value, #val#) if #value# is missing
  - Maintain compatibility with upstream metadata structure changes

* fix: resolve rebase conflicts and compilation errors
  - Fixed missing closing brace in MetadataRefreshService.java
  - Added MoodRepository and TagRepository to BookCreatorService
  - Added addMoodsToBook() and addTagsToBook() methods
  - Fixed EpubProcessor to use new methods

* Restore Booklore's metadata persistence logic lost during rebase + added support for Lubimyczytac and Ranobedb
  Fixed lubimyczytac metadata not saving during EPUB bookdrop imports.
  Root cause: the Angular bookdrop form was missing the lubimyczytacId and lubimyczytacRating fields, preventing these values from being sent to the backend during finalization.
  Backend fixes:
  - EpubMetadataExtractor: fixed method references with explicit lambdas for lubimyczytac field setters (ranobedb also updated preventively)
  - BookMetadataUpdater: fixed method references with explicit lambdas for lubimyczytac field updates (ranobedb also updated preventively)
  Frontend fix:
  - bookdrop-file-review.component.ts: added lubimyczytacId and lubimyczytacRating form fields to the createMetadataForm() and resetMetadata() methods
  This ensures lubimyczytac metadata extracted from EPUB booklore:tags flows correctly through: extraction → database → UI form → backend → final book record.

* Restore original Calibre custom column name to #pagecount only, so as not to break other users' flow.
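The fallback-key logic for Calibre moods/tags described above can be sketched as an ordered lookup: Calibre's user_metadata JSON usually stores the value under `#value#`, but the fix also checks `value` and `#val#`. A minimal sketch over a plain `Map` stand-in for the parsed JSON object; the helper name is illustrative.

```java
import java.util.Map;

public class CalibreValueLookup {
    // Try the canonical "#value#" key first, then the alternative
    // key names mentioned in the fix, returning the first hit.
    static Object extractValue(Map<String, Object> column) {
        for (String key : new String[] {"#value#", "value", "#val#"}) {
            Object v = column.get(key);
            if (v != null) return v;
        }
        return null;
    }
}
```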
* fix: resolve compilation error in EpubProcessor.java
  - Changed bookEntity.getFileName() to bookEntity.getPrimaryBookFile().getFileName() to fix the gradlew build check

* Add EPUB 3 compliant prefix declaration for custom booklore metadata
  Declares the booklore: prefix in the package element's prefix attribute, per the EPUB 3 specification for custom vocabularies.

* Move hardcoverBookId to standard URN identifier format
  Changed hardcoverBookId from custom booklore metadata to a standard dc:identifier with the urn:hardcoverbook: prefix, for consistency with other identifiers.

* UPDATE: migrate hardcover_book_id to VARCHAR and fix related issues

## Changes

### 1. Database Migration
- Changed the hardcover_book_id column from INTEGER to VARCHAR(255)
- Supports alphanumeric book IDs from the Hardcover API
- Updated HardcoverSyncService to handle String ↔ Integer conversion

### 2. Metadata Editor Fixes
- Fixed metadata change detection for provider-specific fields
- Added missing clearFlags entries: hardcoverBookId, lubimyczytacId, lubimyczytacRating
- Resolves an issue where rating/review count fields wouldn't save independently

Affected fields now save properly and independently:
- Amazon rating & review count
- Goodreads rating & review count
- Hardcover rating & review count
- Lubimyczytac rating
- Ranobedb rating

### 3. Web Reader Hardcover Sync (Major Fix)
Since I changed the data type of hardcover_book_id, I had to update references in HardcoverSyncService.java. This turned out to reveal that the sync service was not fully implemented:
- Added Hardcover progress sync to ReadingProgressService
- Previously, HARDCOVER sync only worked for Kobo and KOReader devices
- Web browser reading progress now syncs to Hardcover.app

## Files Modified
- booklore-api/src/main/resources/db/migration/V107__Change_hardcover_book_id_to_varchar.sql
- booklore-api/src/main/java/.../service/hardcover/HardcoverSyncService.java
- booklore-api/src/main/java/.../service/progress/ReadingProgressService.java
- booklore-ui/src/app/features/metadata/.../metadata-editor.component.ts

## Breaking Changes
None - the migration handles existing integer IDs gracefully

## Testing
- Hardcover sync tested with the web reader
- Metadata editor field updates verified
- Database migration confirmed successful

---------

Co-authored-by: ACX <8075870+acx10@users.noreply.github.com>
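The String ↔ Integer conversion mentioned for HardcoverSyncService can be sketched as a small bridging helper: the column now stores a String, but Hardcover's GraphQL variables (e.g. `$bookId: Int!` in the diff below) still need a numeric value. A minimal sketch with an illustrative helper name; returning null for non-numeric IDs is an assumption, not necessarily the project's actual behavior.

```java
public class HardcoverIdConversion {
    // Bridge the VARCHAR-stored ID back to the Integer that the
    // GraphQL Int! variable requires; non-numeric (alphanumeric)
    // IDs cannot be used there, so we return null for them.
    static Integer toGraphQlBookId(String storedId) {
        if (storedId == null || storedId.isBlank()) return null;
        try {
            return Integer.parseInt(storedId.trim());
        } catch (NumberFormatException e) {
            return null;
        }
    }
}
```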
.gitignore (vendored): 2 changes
@@ -45,4 +45,4 @@ local-scripts/
### Dev config, books, and data ###
booklore-ui/test-results/
booklore-api/src/main/resources/application-local.yaml
/shared/
/shared/

@@ -11,7 +11,7 @@ import java.util.Set;

@Getter
@Setter
@Builder
@Builder(toBuilder = true)
@AllArgsConstructor
@NoArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)

@@ -37,7 +37,7 @@ public class BookMetadata {
private Double goodreadsRating;
private Integer goodreadsReviewCount;
private String hardcoverId;
private Integer hardcoverBookId;
private String hardcoverBookId;
private Double hardcoverRating;
private Integer hardcoverReviewCount;
private String doubanId;

@@ -51,6 +51,7 @@ public class BookMetadata {
private String externalUrl;
private Instant coverUpdatedOn;
private Set<String> authors;
@Singular
private Set<String> categories;
private Set<String> moods;
private Set<String> tags;

@@ -97,8 +97,8 @@ public class BookMetadataEntity {
@Column(name = "hardcover_id", length = 100)
private String hardcoverId;

@Column(name = "hardcover_book_id")
private Integer hardcoverBookId;
@Column(name = "hardcover_book_id", length = 100)
private String hardcoverBookId;

@Column(name = "google_id", length = 100)
private String googleId;

@@ -20,6 +20,8 @@ public class BookCreatorService {

private final AuthorRepository authorRepository;
private final CategoryRepository categoryRepository;
private final MoodRepository moodRepository;
private final TagRepository tagRepository;
private final BookRepository bookRepository;
private final BookMetadataRepository bookMetadataRepository;

@@ -113,6 +115,28 @@ public class BookCreatorService {
bookEntity.getMetadata().updateSearchText(); // Manually trigger search text update since collection modification doesn't trigger @PreUpdate
}

public void addMoodsToBook(Set<String> moods, BookEntity bookEntity) {
if (bookEntity.getMetadata().getMoods() == null) {
bookEntity.getMetadata().setMoods(new HashSet<>());
}
moods.stream()
.map(mood -> truncate(mood, 255))
.map(truncated -> moodRepository.findByName(truncated)
.orElseGet(() -> moodRepository.save(MoodEntity.builder().name(truncated).build())))
.forEach(moodEntity -> bookEntity.getMetadata().getMoods().add(moodEntity));
}

public void addTagsToBook(Set<String> tags, BookEntity bookEntity) {
if (bookEntity.getMetadata().getTags() == null) {
bookEntity.getMetadata().setTags(new HashSet<>());
}
tags.stream()
.map(tag -> truncate(tag, 255))
.map(truncated -> tagRepository.findByName(truncated)
.orElseGet(() -> tagRepository.save(TagEntity.builder().name(truncated).build())))
.forEach(tagEntity -> bookEntity.getMetadata().getTags().add(tagEntity));
}

private String truncate(String input, int maxLength) {
if (input == null)
return null;

@@ -126,4 +150,4 @@ public class BookCreatorService {
bookRepository.save(bookEntity);
bookMetadataRepository.save(bookEntity.getMetadata());
}
}
}

@@ -68,9 +68,9 @@ public class EpubProcessor extends AbstractFileProcessor implements BookFileProc

boolean saved;
try (ByteArrayInputStream bais = new ByteArrayInputStream(coverData)) {
BufferedImage originalImage = FileService.readImage(bais);
BufferedImage originalImage = ImageIO.read(bais);
if (originalImage == null) {
log.warn("Failed to decode cover image for EPUB '{}'", bookEntity.getPrimaryBookFile().getFileName());
log.warn("Cover image found but could not be decoded (possibly SVG or unsupported format) in EPUB '{}'", bookEntity.getPrimaryBookFile().getFileName());
return false;
}
saved = fileService.saveCoverImages(originalImage, bookEntity.getId());

@@ -119,10 +119,13 @@ public class EpubProcessor extends AbstractFileProcessor implements BookFileProc
metadata.setGoodreadsRating(epubMetadata.getGoodreadsRating());
metadata.setGoodreadsReviewCount(epubMetadata.getGoodreadsReviewCount());
metadata.setHardcoverId(truncate(epubMetadata.getHardcoverId(), 100));
metadata.setHardcoverBookId(epubMetadata.getHardcoverBookId());
metadata.setHardcoverRating(epubMetadata.getHardcoverRating());
metadata.setHardcoverReviewCount(epubMetadata.getHardcoverReviewCount());
metadata.setGoogleId(truncate(epubMetadata.getGoogleId(), 100));
metadata.setComicvineId(truncate(epubMetadata.getComicvineId(), 100));
metadata.setLubimyczytacId(truncate(epubMetadata.getLubimyczytacId(), 100));
metadata.setLubimyczytacRating(epubMetadata.getLubimyczytacRating());
metadata.setRanobedbId(truncate(epubMetadata.getRanobedbId(), 100));
metadata.setRanobedbRating(epubMetadata.getRanobedbRating());

@@ -134,5 +137,19 @@ public class EpubProcessor extends AbstractFileProcessor implements BookFileProc
.collect(Collectors.toSet());
bookCreatorService.addCategoriesToBook(validSubjects, bookEntity);
}

if (epubMetadata.getMoods() != null && !epubMetadata.getMoods().isEmpty()) {
Set<String> validMoods = epubMetadata.getMoods().stream()
.filter(s -> s != null && !s.isBlank() && s.length() <= 255)
.collect(Collectors.toSet());
bookCreatorService.addMoodsToBook(validMoods, bookEntity);
}

if (epubMetadata.getTags() != null && !epubMetadata.getTags().isEmpty()) {
Set<String> validTags = epubMetadata.getTags().stream()
.filter(s -> s != null && !s.isBlank() && s.length() <= 255)
.collect(Collectors.toSet());
bookCreatorService.addTagsToBook(validTags, bookEntity);
}
}
}
}

@@ -101,7 +101,8 @@ public class HardcoverSyncService {
log.debug("Using stored Hardcover book ID: {}", hardcoverBook.bookId);

// Always fetch the default edition and page count from Hardcover
HardcoverBookInfo fetched = findHardcoverBookById(hardcoverBook.bookId);
Integer bookIdInt = Integer.parseInt(hardcoverBook.bookId);
HardcoverBookInfo fetched = findHardcoverBookById(bookIdInt);
if (fetched != null) {
hardcoverBook.editionId = fetched.editionId;
hardcoverBook.pages = fetched.pages;

@@ -131,7 +132,8 @@ public class HardcoverSyncService {
userId, progressPercent, hardcoverBook.pages, progressPages);

// Step 1: Add/update the book in user's library
Integer userBookId = insertOrGetUserBook(hardcoverBook.bookId, hardcoverBook.editionId, statusId);
Integer bookIdInt = Integer.parseInt(hardcoverBook.bookId);
Integer userBookId = insertOrGetUserBook(bookIdInt, hardcoverBook.editionId, statusId);
if (userBookId == null) {
log.warn("Hardcover sync failed: could not get user_book_id for book {}", bookId);
return;

@@ -228,12 +230,12 @@ public class HardcoverSyncService {
// Extract book info
HardcoverBookInfo info = new HardcoverBookInfo();

// The 'id' field contains the numeric book ID
// The 'id' field contains the book ID
Object idObj = document.get("id");
if (idObj instanceof String) {
info.bookId = Integer.parseInt((String) idObj);
info.bookId = (String) idObj;
} else if (idObj instanceof Number) {
info.bookId = ((Number) idObj).intValue();
info.bookId = String.valueOf(((Number) idObj).intValue());
}

// Get page count

@@ -286,7 +288,7 @@ public class HardcoverSyncService {
* Find an edition by ISBN for a given book.
* This queries Hardcover's editions table to match by ISBN.
*/
private EditionInfo findEditionByIsbn(Integer bookId, String isbn) {
private EditionInfo findEditionByIsbn(String bookId, String isbn) {
String query = """
query FindEditionByIsbn($bookId: Int!, $isbn: String!) {
editions(where: {

@@ -304,7 +306,7 @@ public class HardcoverSyncService {

GraphQLRequest request = new GraphQLRequest();
request.setQuery(query);
request.setVariables(Map.of("bookId", bookId, "isbn", isbn));
request.setVariables(Map.of("bookId", Integer.parseInt(bookId), "isbn", isbn));

try {
Map<String, Object> response = executeGraphQL(request);

@@ -410,7 +412,7 @@ public class HardcoverSyncService {

Map<String, Object> book = books.getFirst();
HardcoverBookInfo info = new HardcoverBookInfo();
info.bookId = bookId;
info.bookId = String.valueOf(bookId);

Object defaultPhysicalEditionObj = book.get("default_physical_edition_id");
if (defaultPhysicalEditionObj instanceof Number) {

@@ -722,7 +724,7 @@ public class HardcoverSyncService {
* Helper class to hold Hardcover book information.
*/
private static class HardcoverBookInfo {
Integer bookId;
String bookId;
Integer editionId;
Integer pages;
}

@@ -148,6 +148,9 @@ public class BookMetadataUpdater {
}

private void updateBasicFields(BookMetadata m, BookMetadataEntity e, MetadataClearFlags clear, MetadataReplaceMode replaceMode) {
if (clear == null) {
clear = new MetadataClearFlags();
}
handleFieldUpdate(e.getTitleLocked(), clear.isTitle(), m.getTitle(), v -> e.setTitle(nullIfBlank(v)), e::getTitle, replaceMode);
handleFieldUpdate(e.getSubtitleLocked(), clear.isSubtitle(), m.getSubtitle(), v -> e.setSubtitle(nullIfBlank(v)), e::getSubtitle, replaceMode);
handleFieldUpdate(e.getPublisherLocked(), clear.isPublisher(), m.getPublisher(), v -> e.setPublisher(nullIfBlank(v)), e::getPublisher, replaceMode);

@@ -172,8 +175,8 @@ public class BookMetadataUpdater {
handleFieldUpdate(e.getGoodreadsReviewCountLocked(), clear.isGoodreadsReviewCount(), m.getGoodreadsReviewCount(), e::setGoodreadsReviewCount, e::getGoodreadsReviewCount, replaceMode);
handleFieldUpdate(e.getHardcoverRatingLocked(), clear.isHardcoverRating(), m.getHardcoverRating(), e::setHardcoverRating, e::getHardcoverRating, replaceMode);
handleFieldUpdate(e.getHardcoverReviewCountLocked(), clear.isHardcoverReviewCount(), m.getHardcoverReviewCount(), e::setHardcoverReviewCount, e::getHardcoverReviewCount, replaceMode);
handleFieldUpdate(e.getLubimyczytacIdLocked(), clear.isLubimyczytacId(), m.getLubimyczytacId(), v -> e.setLubimyczytacId(nullIfBlank(v)), e::getLubimyczytacId, replaceMode);
handleFieldUpdate(e.getLubimyczytacRatingLocked(), clear.isLubimyczytacRating(), m.getLubimyczytacRating(), e::setLubimyczytacRating, e::getLubimyczytacRating, replaceMode);
handleFieldUpdate(e.getLubimyczytacIdLocked(), clear.isLubimyczytacId(), m.getLubimyczytacId(), v -> e.setLubimyczytacId(nullIfBlank(v)), () -> e.getLubimyczytacId(), replaceMode);
handleFieldUpdate(e.getLubimyczytacRatingLocked(), clear.isLubimyczytacRating(), m.getLubimyczytacRating(), v -> e.setLubimyczytacRating(v), () -> e.getLubimyczytacRating(), replaceMode);
handleFieldUpdate(e.getRanobedbIdLocked(), clear.isRanobedbId(), m.getRanobedbId(), v -> e.setRanobedbId(nullIfBlank(v)), e::getRanobedbId, replaceMode);
handleFieldUpdate(e.getRanobedbRatingLocked(), clear.isRanobedbRating(), m.getRanobedbRating(), e::setRanobedbRating, e::getRanobedbRating, replaceMode);
}

@@ -2,15 +2,13 @@ package com.adityachandel.booklore.service.metadata.extractor;

import com.adityachandel.booklore.model.dto.BookMetadata;
import io.documentnode.epub4j.domain.Book;
import io.documentnode.epub4j.domain.MediaType;
import io.documentnode.epub4j.domain.MediaTypes;
import io.documentnode.epub4j.domain.Resource;
import io.documentnode.epub4j.epub.EpubReader;
import lombok.extern.slf4j.Slf4j;
import net.lingala.zip4j.ZipFile;
import net.lingala.zip4j.model.FileHeader;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.lang3.StringUtils;
import org.springframework.boot.configurationprocessor.json.JSONArray;
import org.springframework.boot.configurationprocessor.json.JSONException;
import org.springframework.boot.configurationprocessor.json.JSONObject;
import org.springframework.stereotype.Component;

@@ -22,82 +20,114 @@ import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.File;
import java.io.IOException;
import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.time.LocalDate;
import java.time.OffsetDateTime;
import java.util.*;
import java.util.function.BiConsumer;
import java.util.regex.Pattern;

@Slf4j
@Component
public class EpubMetadataExtractor implements FileMetadataExtractor {

private static final Pattern YEAR_ONLY_PATTERN = Pattern.compile("^\\d{4}$");
private static final String OPF_NS = "http://www.idpf.org/2007/opf";
private static final Pattern YEAR_ONLY_PATTERN = Pattern.compile("^\\d{4}$");
private static final Pattern ISBN_13_PATTERN = Pattern.compile("\\d{13}");
private static final Pattern ISBN_10_PATTERN = Pattern.compile("\\d{10}|[0-9]{9}[xX]");

// List of all media types that epub4j has so we can lazy load them.
// Note that we have to add in null to handle files without extentions like mimetype.
private static final List<MediaType> MEDIA_TYPES = new ArrayList<>();
private static final Pattern ISBN_SEPARATOR_PATTERN = Pattern.compile("[- ]");
private static class IdentifierMapping {
final String prefix;
final String fieldName;
final BiConsumer<BookMetadata.BookMetadataBuilder, String> setter;

static {
MEDIA_TYPES.addAll(Arrays.asList(MediaTypes.mediaTypes));
MEDIA_TYPES.add(null);
IdentifierMapping(String prefix, String fieldName, BiConsumer<BookMetadata.BookMetadataBuilder, String> setter) {
this.prefix = prefix;
this.fieldName = fieldName;
this.setter = setter;
}
}

private static final List<IdentifierMapping> IDENTIFIER_PREFIX_MAPPINGS = List.of(
new IdentifierMapping("urn:isbn:", "isbn", null), // Special handling for ISBN URNs
new IdentifierMapping("urn:amazon:", "asin", BookMetadata.BookMetadataBuilder::asin),
new IdentifierMapping("urn:goodreads:", "goodreadsId", BookMetadata.BookMetadataBuilder::goodreadsId),
new IdentifierMapping("urn:google:", "googleId", BookMetadata.BookMetadataBuilder::googleId),
new IdentifierMapping("urn:hardcover:", "hardcoverId", BookMetadata.BookMetadataBuilder::hardcoverId),
new IdentifierMapping("urn:hardcoverbook:", "hardcoverBookId", BookMetadata.BookMetadataBuilder::hardcoverBookId),
new IdentifierMapping("urn:comicvine:", "comicvineId", BookMetadata.BookMetadataBuilder::comicvineId),
new IdentifierMapping("urn:lubimyczytac:", "lubimyczytacId", (builder, value) -> builder.lubimyczytacId(value)),
new IdentifierMapping("urn:ranobedb:", "ranobedbId", BookMetadata.BookMetadataBuilder::ranobedbId),
new IdentifierMapping("asin:", "asin", BookMetadata.BookMetadataBuilder::asin),
new IdentifierMapping("amazon:", "asin", BookMetadata.BookMetadataBuilder::asin),
new IdentifierMapping("mobi-asin:", "asin", BookMetadata.BookMetadataBuilder::asin),
new IdentifierMapping("goodreads:", "goodreadsId", BookMetadata.BookMetadataBuilder::goodreadsId),
new IdentifierMapping("google:", "googleId", BookMetadata.BookMetadataBuilder::googleId),
new IdentifierMapping("hardcover:", "hardcoverId", BookMetadata.BookMetadataBuilder::hardcoverId),
new IdentifierMapping("hardcoverbook:", "hardcoverBookId", BookMetadata.BookMetadataBuilder::hardcoverBookId),
new IdentifierMapping("comicvine:", "comicvineId", BookMetadata.BookMetadataBuilder::comicvineId),
new IdentifierMapping("lubimyczytac:", "lubimyczytacId", (builder, value) -> builder.lubimyczytacId(value)),
new IdentifierMapping("ranobedb:", "ranobedbId", BookMetadata.BookMetadataBuilder::ranobedbId)
);

private static final Map<String, BiConsumer<BookMetadata.BookMetadataBuilder, String>> SCHEME_MAPPINGS = Map.of(
"GOODREADS", BookMetadata.BookMetadataBuilder::goodreadsId,
"COMICVINE", BookMetadata.BookMetadataBuilder::comicvineId,
"GOOGLE", BookMetadata.BookMetadataBuilder::googleId,
"AMAZON", BookMetadata.BookMetadataBuilder::asin,
"HARDCOVER", BookMetadata.BookMetadataBuilder::hardcoverId
);

private static final Map<String, BiConsumer<BookMetadata.BookMetadataBuilder, String>> CALIBRE_FIELD_MAPPINGS = Map.ofEntries(
Map.entry("#subtitle", BookMetadata.BookMetadataBuilder::subtitle),
Map.entry("#pagecount", (builder, value) -> safeParseInt(value, builder::pageCount)),
Map.entry("#series_total", (builder, value) -> safeParseInt(value, builder::seriesTotal)),
Map.entry("#amazon_rating", (builder, value) -> safeParseDouble(value, builder::amazonRating)),
Map.entry("#amazon_review_count", (builder, value) -> safeParseInt(value, builder::amazonReviewCount)),
Map.entry("#goodreads_rating", (builder, value) -> safeParseDouble(value, builder::goodreadsRating)),
Map.entry("#goodreads_review_count", (builder, value) -> safeParseInt(value, builder::goodreadsReviewCount)),
Map.entry("#hardcover_rating", (builder, value) -> safeParseDouble(value, builder::hardcoverRating)),
Map.entry("#hardcover_review_count", (builder, value) -> safeParseInt(value, builder::hardcoverReviewCount)),
Map.entry("#lubimyczytac_rating", (builder, value) -> safeParseDouble(value, builder::lubimyczytacRating)),
Map.entry("#ranobedb_rating", (builder, value) -> safeParseDouble(value, builder::ranobedbRating))
);

@Override
public byte[] extractCover(File epubFile) {
try (ZipFile zip = new ZipFile(epubFile)) {
Book epub = new EpubReader().readEpubLazy(zip, "UTF-8", MEDIA_TYPES);
try (FileInputStream fis = new FileInputStream(epubFile)) {
Book epub = new EpubReader().readEpub(fis);
io.documentnode.epub4j.domain.Resource coverImage = epub.getCoverImage();

// First we read the cover image from the epub4j reader.
// We filter to only images since it will default to the first page.
byte[] image = getImageFromEpubResource(epub.getCoverImage());
if (image != null) {
return image;
if (coverImage == null) {
String coverHref = findCoverImageHrefInOpf(epubFile);
if (coverHref != null) {
byte[] data = extractFileFromZip(epubFile, coverHref);
if (data != null) return data;
}
}

// First fallback to reading the cover image based on the cover
String coverId = epub.getMetadata().getMetaAttribute("cover");
if (coverId != null) {
Resource coverResource = epub.getResources().getById(coverId);
if (coverResource != null) {
image = getImageFromEpubResource(coverResource);
if (image != null) {
return image;
if (coverImage == null) {
for (io.documentnode.epub4j.domain.Resource res : epub.getResources().getAll()) {
String id = res.getId();
String href = res.getHref();
if ((id != null && id.toLowerCase().contains("cover")) ||
(href != null && href.toLowerCase().contains("cover"))) {
if (res.getMediaType() != null && res.getMediaType().getName().startsWith("image")) {
coverImage = res;
break;
}
}
}
}

// We fall back to reading the image based on the cover-image property.
String coverHref = findCoverImageHrefInOpf(epubFile);
if (coverHref != null) {
image = extractFileFromZip(epubFile, coverHref);
if (image != null) {
return image;
}
}

// As a last resort we look at all of the files in the epub for something cover related.
for (Resource res : epub.getResources().getAll()) {
String id = res.getId();
String href = res.getHref();
if ((id != null && id.toLowerCase().contains("cover")) ||
(href != null && href.toLowerCase().contains("cover"))) {
image = getImageFromEpubResource(res);
if (image != null) {
return image;
}
}
}
return (coverImage != null) ? coverImage.getData() : null;
} catch (Exception e) {
log.warn("Failed to extract cover from EPUB: {}", epubFile.getName(), e);
return null;
}

return null;
}

@Override

@@ -129,6 +159,9 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {

BookMetadata.BookMetadataBuilder builderMeta = BookMetadata.builder();
Set<String> categories = new HashSet<>();
Set<String> moods = new HashSet<>();
Set<String> tags = new HashSet<>();
Set<String> processedIdentifierFields = new HashSet<>();

boolean seriesFound = false;
boolean seriesIndexFound = false;

@@ -169,7 +202,7 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
}

if ("role".equals(prop) && StringUtils.isNotBlank(refines)) {
creatorRoleById.put(refines.substring(1), content.toLowerCase());
creatorRoleById.put(refines.substring(1), content.toLowerCase());
}

if (!seriesFound && ("booklore:series".equals(prop) || "calibre:series".equals(name) || "belongs-to-collection".equals(prop))) {

@@ -184,33 +217,34 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
}
}

if ("calibre:pages".equals(name) || "pagecount".equals(name) || "schema:pagecount".equals(prop) || "media:pagecount".equals(prop) || "booklore:page_count".equals(prop)) {
safeParseInt(content, builderMeta::pageCount);
} else if ("calibre:user_metadata:#pagecount".equals(name)) {
try {
JSONObject jsonroot = new JSONObject(content);
Object value = jsonroot.opt("#value#");
safeParseInt(String.valueOf(value), builderMeta::pageCount);
} catch (JSONException ignored) {
}
} else if ("calibre:user_metadata".equals(prop)) {
try {
JSONObject jsonroot = new JSONObject(content);
JSONObject pages = jsonroot.getJSONObject("#pagecount");
Object value = pages.opt("#value#");
safeParseInt(String.valueOf(value), builderMeta::pageCount);
} catch (JSONException ignored) {
}
}

switch (prop) {
case "booklore:asin" -> builderMeta.asin(content);
case "booklore:goodreads_id" -> builderMeta.goodreadsId(content);
case "booklore:comicvine_id" -> builderMeta.comicvineId(content);
case "booklore:ranobedb_id" -> builderMeta.ranobedbId(content);
case "booklore:hardcover_id" -> builderMeta.hardcoverId(content);
case "booklore:google_books_id" -> builderMeta.googleId(content);
case "booklore:page_count" -> safeParseInt(content, builderMeta::pageCount);
case "booklore:moods" -> extractSetField(content, moods);
case "booklore:tags" -> extractSetField(content, tags);
case "booklore:series_total" -> safeParseInt(content, builderMeta::seriesTotal);
case "booklore:amazon_rating" -> safeParseDouble(content, builderMeta::amazonRating);
case "booklore:amazon_review_count" -> safeParseInt(content, builderMeta::amazonReviewCount);
case "booklore:goodreads_rating" -> safeParseDouble(content, builderMeta::goodreadsRating);
case "booklore:goodreads_review_count" -> safeParseInt(content, builderMeta::goodreadsReviewCount);
case "booklore:hardcover_book_id" -> builderMeta.hardcoverBookId(content);
case "booklore:hardcover_rating" -> safeParseDouble(content, builderMeta::hardcoverRating);
case "booklore:hardcover_review_count" -> safeParseInt(content, builderMeta::hardcoverReviewCount);
case "booklore:lubimyczytac_rating" -> safeParseDouble(content, value -> builderMeta.lubimyczytacRating(value));
case "booklore:ranobedb_rating" -> safeParseDouble(content, builderMeta::ranobedbRating);
}

if ("calibre:user_metadata".equals(prop)) {
try {
JSONObject jsonroot = new JSONObject(content);
extractCalibreUserMetadata(jsonroot, builderMeta, moods, tags);
} catch (JSONException e) {
log.warn("Failed to parse Calibre user_metadata JSON: {}", e.getMessage());
}
}
}
case "creator" -> {

@@ -232,28 +266,20 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
case "language" -> builderMeta.language(text);
case "identifier" -> {
String scheme = el.getAttributeNS(OPF_NS, "scheme").toUpperCase();
String value = text.toLowerCase().startsWith("isbn:") ? text.substring(5) : text;
String value = text.toLowerCase();

if (processIdentifierWithPrefix(value, builderMeta, processedIdentifierFields)) {
continue;
}

if (value.startsWith("isbn:")) {
value = value.substring("isbn:".length());
}

if (!scheme.isEmpty()) {
switch (scheme) {
case "ISBN" -> {
String cleanValue = ISBN_SEPARATOR_PATTERN.matcher(value).replaceAll("");
|
||||
if (cleanValue.length() == 13) builderMeta.isbn13(value);
|
||||
else if (cleanValue.length() == 10) builderMeta.isbn10(value);
|
||||
}
|
||||
case "GOODREADS" -> builderMeta.goodreadsId(value);
|
||||
case "COMICVINE" -> builderMeta.comicvineId(value);
|
||||
case "RANOBEDB" -> builderMeta.ranobedbId(value);
|
||||
case "GOOGLE" -> builderMeta.googleId(value);
|
||||
case "AMAZON" -> builderMeta.asin(value);
|
||||
case "HARDCOVER" -> builderMeta.hardcoverId(value);
|
||||
}
|
||||
processIdentifierByScheme(scheme, value, builderMeta, processedIdentifierFields);
|
||||
} else {
|
||||
if (text.toLowerCase().startsWith("isbn:")) {
|
||||
String cleanValue = ISBN_SEPARATOR_PATTERN.matcher(value).replaceAll("");
|
||||
if (cleanValue.length() == 13) builderMeta.isbn13(value);
|
||||
else if (cleanValue.length() == 10) builderMeta.isbn10(value);
|
||||
}
|
||||
processIsbnIdentifier(value, builderMeta, processedIdentifierFields);
|
||||
}
|
||||
}
|
||||
case "date" -> {
|
||||
@@ -296,6 +322,8 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
|
||||
|
||||
builderMeta.authors(creatorsByRole.get("aut"));
|
||||
builderMeta.categories(categories);
|
||||
builderMeta.moods(moods);
|
||||
builderMeta.tags(tags);
|
||||
|
||||
BookMetadata extractedMetadata = builderMeta.build();
|
||||
|
||||
@@ -314,19 +342,130 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
|
||||
}
|
||||
}
|
||||
|
||||
private void safeParseInt(String value, java.util.function.IntConsumer setter) {
|
||||
private boolean processIdentifierWithPrefix(String value, BookMetadata.BookMetadataBuilder builder,
|
||||
Set<String> processedFields) {
|
||||
for (IdentifierMapping mapping : IDENTIFIER_PREFIX_MAPPINGS) {
|
||||
if (value.startsWith(mapping.prefix)) {
|
||||
String extractedValue = value.substring(mapping.prefix.length());
|
||||
|
||||
// Special handling for ISBN URNs - pass to ISBN processor
|
||||
if ("isbn".equals(mapping.fieldName)) {
|
||||
processIsbnIdentifier(extractedValue, builder, processedFields);
|
||||
return true;
|
||||
}
|
||||
|
||||
if (!processedFields.contains(mapping.fieldName)) {
|
||||
mapping.setter.accept(builder, extractedValue);
|
||||
processedFields.add(mapping.fieldName);
|
||||
}
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
private void processIdentifierByScheme(String scheme, String value, BookMetadata.BookMetadataBuilder builder,
|
||||
Set<String> processedFields) {
|
||||
if ("ISBN".equals(scheme)) {
|
||||
processIsbnIdentifier(value, builder, processedFields);
|
||||
} else {
|
||||
BiConsumer<BookMetadata.BookMetadataBuilder, String> setter = SCHEME_MAPPINGS.get(scheme);
|
||||
if (setter != null) {
|
||||
String fieldName = getFieldNameForScheme(scheme);
|
||||
if (!processedFields.contains(fieldName)) {
|
||||
setter.accept(builder, value);
|
||||
processedFields.add(fieldName);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void processIsbnIdentifier(String value, BookMetadata.BookMetadataBuilder builder,
|
||||
Set<String> processedFields) {
|
||||
String cleanIsbn = value.replaceAll("[- ]", "");
|
||||
|
||||
if (cleanIsbn.length() == 13 && ISBN_13_PATTERN.matcher(cleanIsbn).matches()) {
|
||||
if (!processedFields.contains("isbn13")) {
|
||||
builder.isbn13(value);
|
||||
processedFields.add("isbn13");
|
||||
}
|
||||
} else if (cleanIsbn.length() == 10 && ISBN_10_PATTERN.matcher(cleanIsbn).matches()) {
|
||||
if (!processedFields.contains("isbn10")) {
|
||||
builder.isbn10(value);
|
||||
processedFields.add("isbn10");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private String getFieldNameForScheme(String scheme) {
|
||||
return switch (scheme) {
|
||||
case "GOODREADS" -> "goodreadsId";
|
||||
case "COMICVINE" -> "comicvineId";
|
||||
case "GOOGLE" -> "googleId";
|
||||
case "AMAZON" -> "asin";
|
||||
case "HARDCOVER" -> "hardcoverId";
|
||||
default -> scheme.toLowerCase();
|
||||
};
|
||||
}
|
||||
|
||||
private static void safeParseInt(String value, java.util.function.IntConsumer setter) {
|
||||
try {
|
||||
setter.accept(Integer.parseInt(value));
|
||||
} catch (NumberFormatException ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
private void safeParseDouble(String value, java.util.function.DoubleConsumer setter) {
|
||||
private static void safeParseFloat(String value, java.util.function.Consumer<Float> setter) {
|
||||
try {
|
||||
setter.accept(Float.parseFloat(value));
|
||||
} catch (NumberFormatException ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
private static void safeParseDouble(String value, java.util.function.DoubleConsumer setter) {
|
||||
try {
|
||||
setter.accept(Double.parseDouble(value));
|
||||
} catch (NumberFormatException ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
private static void extractSetField(String value, Set<String> targetSet) {
|
||||
if (value == null || value.trim().isEmpty()) {
|
||||
return;
|
||||
}
|
||||
|
||||
String trimmedValue = value.trim();
|
||||
|
||||
if (trimmedValue.startsWith("[")) {
|
||||
try {
|
||||
JSONArray jsonArray = new JSONArray(trimmedValue);
|
||||
for (int i = 0; i < jsonArray.length(); i++) {
|
||||
String item = jsonArray.getString(i).trim();
|
||||
if (!item.isEmpty()) {
|
||||
targetSet.add(item);
|
||||
}
|
||||
}
|
||||
return;
|
||||
} catch (JSONException ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
String[] items = trimmedValue.split(",");
|
||||
for (String item : items) {
|
||||
String trimmedItem = item.trim();
|
||||
if (!trimmedItem.isEmpty()) {
|
||||
targetSet.add(trimmedItem);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void extractAndSetUserMetadataSet(String value, java.util.function.Consumer<Set<String>> setter) {
|
||||
Set<String> items = new HashSet<>();
|
||||
extractSetField(value, items);
|
||||
if (!items.isEmpty()) {
|
||||
setter.accept(items);
|
||||
}
|
||||
}
|
||||
|
||||
private LocalDate parseDate(String value) {
|
||||
if (StringUtils.isBlank(value)) return null;
|
||||
@@ -363,24 +502,6 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
|
||||
return null;
|
||||
}
|
||||
|
||||
private byte[] getImageFromEpubResource(Resource res) {
|
||||
if (res == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
MediaType mt = res.getMediaType();
|
||||
if (mt == null || !mt.getName().startsWith("image")) {
|
||||
return null;
|
||||
}
|
||||
|
||||
try {
|
||||
return res.getData();
|
||||
} catch (IOException e) {
|
||||
log.warn("Failed to read data for resource", e);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
private String findCoverImageHrefInOpf(File epubFile) {
|
||||
try (ZipFile zip = new ZipFile(epubFile)) {
|
||||
DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
|
||||
@@ -459,4 +580,59 @@ public class EpubMetadataExtractor implements FileMetadataExtractor {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void extractCalibreUserMetadata(JSONObject userMetadata, BookMetadata.BookMetadataBuilder builder,
|
||||
Set<String> moodsSet, Set<String> tagsSet) {
|
||||
try {
|
||||
java.util.Iterator<String> keys = userMetadata.keys();
|
||||
|
||||
while (keys.hasNext()) {
|
||||
String fieldName = keys.next();
|
||||
|
||||
try {
|
||||
JSONObject fieldObject = userMetadata.optJSONObject(fieldName);
|
||||
if (fieldObject == null) {
|
||||
continue;
|
||||
}
|
||||
|
||||
Object rawValue = fieldObject.opt("#value#");
|
||||
if (rawValue == null) {
|
||||
rawValue = fieldObject.opt("value");
|
||||
if (rawValue == null) {
|
||||
rawValue = fieldObject.opt("#val#");
|
||||
}
|
||||
if (rawValue == null) {
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
String value = String.valueOf(rawValue).trim();
|
||||
if (value.isEmpty() || "null".equals(value)) {
|
||||
continue;
|
||||
}
|
||||
|
||||
if ("#moods".equals(fieldName)) {
|
||||
extractSetField(value, moodsSet);
|
||||
continue;
|
||||
}
|
||||
|
||||
if ("#extra_tags".equals(fieldName)) {
|
||||
extractSetField(value, tagsSet);
|
||||
continue;
|
||||
}
|
||||
|
||||
BiConsumer<BookMetadata.BookMetadataBuilder, String> mapper = CALIBRE_FIELD_MAPPINGS.get(fieldName);
|
||||
if (mapper != null) {
|
||||
mapper.accept(builder, value);
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
log.debug("Failed to extract Calibre field '{}': {}", fieldName, e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
log.debug("Failed to process Calibre user_metadata: {}", e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -147,7 +147,7 @@ public class HardcoverParser implements BookParser {
BookMetadata metadata = new BookMetadata();
metadata.setHardcoverId(doc.getSlug());

Integer bookId = parseBookId(doc.getId());
String bookId = parseBookId(doc.getId());
if (bookId != null) {
metadata.setHardcoverBookId(bookId);
}
@@ -187,16 +187,8 @@ public class HardcoverParser implements BookParser {
return metadata;
}

private Integer parseBookId(String id) {
if (id == null) {
return null;
}
try {
return Integer.parseInt(id);
} catch (NumberFormatException e) {
log.debug("Could not parse Hardcover book ID: {}", id);
return null;
}
private String parseBookId(String id) {
return id;
}

private void mapSeriesInfo(GraphQLResponse.Document doc, BookMetadata metadata) {
@@ -215,7 +207,7 @@ public class HardcoverParser implements BookParser {
}
}

private void mapTagsAndMoods(GraphQLResponse.Document doc, BookMetadata metadata, Integer bookId, boolean fetchDetailedMoods) {
private void mapTagsAndMoods(GraphQLResponse.Document doc, BookMetadata metadata, String bookId, boolean fetchDetailedMoods) {
boolean usedDetailedMoods = false;

if (fetchDetailedMoods && bookId != null) {
@@ -244,9 +236,10 @@ public class HardcoverParser implements BookParser {
}
}

private boolean tryFetchDetailedMoods(Integer bookId, BookMetadata metadata) {
private boolean tryFetchDetailedMoods(String bookId, BookMetadata metadata) {
try {
HardcoverBookDetails details = hardcoverBookSearchService.fetchBookDetails(bookId);
Integer bookIdInt = Integer.parseInt(bookId);
HardcoverBookDetails details = hardcoverBookSearchService.fetchBookDetails(bookIdInt);
if (details == null || details.getCachedTags() == null || details.getCachedTags().isEmpty()) {
return false;
}

@@ -91,7 +91,12 @@ public class EpubMetadataWriter implements MetadataWriter {
|
||||
boolean[] hasChanges = {false};
|
||||
MetadataCopyHelper helper = new MetadataCopyHelper(metadata);
|
||||
|
||||
helper.copyTitle(clear != null && clear.isTitle(), val -> replaceAndTrackChange(opfDoc, metadataElement, "title", DC_NS, val, hasChanges));
|
||||
helper.copyTitle(clear != null && clear.isTitle(), val -> {
|
||||
replaceAndTrackChange(opfDoc, metadataElement, "title", DC_NS, val, hasChanges);
|
||||
if (StringUtils.isNotBlank(metadata.getSubtitle())) {
|
||||
addSubtitleToTitle(metadataElement, opfDoc, metadata.getSubtitle());
|
||||
}
|
||||
});
|
||||
helper.copyDescription(clear != null && clear.isDescription(), val -> replaceAndTrackChange(opfDoc, metadataElement, "description", DC_NS, val, hasChanges));
|
||||
helper.copyPublisher(clear != null && clear.isPublisher(), val -> replaceAndTrackChange(opfDoc, metadataElement, "publisher", DC_NS, val, hasChanges));
|
||||
helper.copyPublishedDate(clear != null && clear.isPublishedDate(), val -> replaceAndTrackChange(opfDoc, metadataElement, "date", DC_NS, val != null ? val.toString() : null, hasChanges));
|
||||
@@ -122,36 +127,83 @@ public class EpubMetadataWriter implements MetadataWriter {
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
|
||||
helper.copySeriesName(clear != null && clear.isSeriesName(), val -> replaceMetaElement(metadataElement, opfDoc, "calibre:series", val, hasChanges));
|
||||
|
||||
helper.copySeriesNumber(clear != null && clear.isSeriesNumber(), val -> {
|
||||
String formatted = val != null ? String.format("%.1f", val) : null;
|
||||
replaceMetaElement(metadataElement, opfDoc, "calibre:series_index", formatted, hasChanges);
|
||||
helper.copySeriesName(clear != null && clear.isSeriesName(), val -> {
|
||||
replaceBelongsToCollection(metadataElement, opfDoc, metadata.getSeriesName(), metadata.getSeriesNumber(), hasChanges);
|
||||
});
|
||||
|
||||
List<String> schemes = List.of("AMAZON", "GOOGLE", "GOODREADS", "HARDCOVER", "ISBN");
|
||||
helper.copySeriesNumber(clear != null && clear.isSeriesNumber(), val -> {
|
||||
replaceBelongsToCollection(metadataElement, opfDoc, metadata.getSeriesName(), metadata.getSeriesNumber(), hasChanges);
|
||||
});
|
||||
|
||||
for (String scheme : schemes) {
|
||||
|
||||
boolean clearFlag = clear != null && switch (scheme) {
|
||||
case "AMAZON" -> clear.isAsin();
|
||||
case "GOOGLE" -> clear.isGoogleId();
|
||||
case "COMICVINE" -> clear.isComicvineId();
|
||||
case "GOODREADS" -> clear.isGoodreadsId();
|
||||
case "HARDCOVER" -> clear.isHardcoverId();
|
||||
case "ISBN" -> clear.isIsbn10();
|
||||
default -> false;
|
||||
};
|
||||
|
||||
switch (scheme) {
|
||||
case "AMAZON" -> helper.copyAsin(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
case "GOOGLE" -> helper.copyGoogleId(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
case "GOODREADS" -> helper.copyGoodreadsId(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
case "COMICVINE" -> helper.copyComicvineId(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
case "HARDCOVER" -> helper.copyHardcoverId(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
case "ISBN" -> helper.copyIsbn13(clearFlag, idValue -> updateIdentifier(metadataElement, opfDoc, scheme, idValue, hasChanges));
|
||||
helper.copyIsbn13(clear != null && clear.isIsbn13(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "isbn");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "isbn", val));
|
||||
}
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyIsbn10(clear != null && clear.isIsbn10(), val -> {
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "isbn", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyAsin(clear != null && clear.isAsin(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "amazon");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "amazon", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyGoodreadsId(clear != null && clear.isGoodreadsId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "goodreads");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "goodreads", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyGoogleId(clear != null && clear.isGoogleId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "google");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "google", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyComicvineId(clear != null && clear.isComicvineId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "comicvine");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "comicvine", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyHardcoverId(clear != null && clear.isHardcoverId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "hardcover");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "hardcover", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyHardcoverBookId(clear != null && clear.isHardcoverBookId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "hardcoverbook");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "hardcoverbook", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyLubimyczytacId(clear != null && clear.isLubimyczytacId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "lubimyczytac");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "lubimyczytac", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
helper.copyRanobedbId(clear != null && clear.isRanobedbId(), val -> {
|
||||
removeIdentifierByUrn(metadataElement, "ranobedb");
|
||||
if (val != null && !val.isBlank()) {
|
||||
metadataElement.appendChild(createIdentifierElement(opfDoc, "ranobedb", val));
|
||||
}
|
||||
hasChanges[0] = true;
|
||||
});
|
||||
|
||||
if (StringUtils.isNotBlank(thumbnailUrl)) {
|
||||
byte[] coverData = loadImage(thumbnailUrl);
|
||||
@@ -162,6 +214,9 @@ public class EpubMetadataWriter implements MetadataWriter {
|
||||
}
|
||||
|
||||
if (hasChanges[0]) {
|
||||
addBookloreMetadata(metadataElement, opfDoc, metadata);
|
||||
cleanupCalibreArtifacts(metadataElement, opfDoc);
|
||||
organizeMetadataElements(metadataElement);
|
||||
removeEmptyTextNodes(opfDoc);
|
||||
Transformer transformer = TransformerFactory.newInstance().newTransformer();
|
||||
transformer.setOutputProperty(OutputKeys.INDENT, "yes");
|
||||
@@ -520,12 +575,23 @@ public class EpubMetadataWriter implements MetadataWriter {
|
||||
}
|
||||
}
|
||||
}
|
||||
private void removeIdentifierByUrn(Element metadataElement, String urnScheme) {
|
||||
NodeList identifiers = metadataElement.getElementsByTagNameNS("*", "identifier");
|
||||
String urnPrefix = "urn:" + urnScheme.toLowerCase() + ":";
|
||||
String oldPrefix = urnScheme.toLowerCase() + ":";
|
||||
for (int i = identifiers.getLength() - 1; i >= 0; i--) {
|
||||
Element idElement = (Element) identifiers.item(i);
|
||||
String content = idElement.getTextContent().trim().toLowerCase();
|
||||
if (content.startsWith(urnPrefix) || content.startsWith(oldPrefix)) {
|
||||
metadataElement.removeChild(idElement);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private Element createIdentifierElement(Document doc, String scheme, String value) {
|
||||
Element id = doc.createElementNS("http://purl.org/dc/elements/1.1/", "identifier");
|
||||
id.setPrefix("dc");
|
||||
id.setAttributeNS(OPF_NS, "opf:scheme", scheme);
|
||||
id.setTextContent(value);
|
||||
id.setTextContent("urn:" + scheme.toLowerCase() + ":" + value);
|
||||
return id;
|
||||
}
|
||||
|
||||
@@ -644,4 +710,296 @@ public class EpubMetadataWriter implements MetadataWriter {
|
||||
log.warn("Failed to remove empty text nodes", e);
|
||||
}
|
||||
}
|
||||
|
||||
private void removeAllBookloreMetadata(Element metadataElement) {
|
||||
NodeList metas = metadataElement.getElementsByTagNameNS("*", "meta");
|
||||
for (int i = metas.getLength() - 1; i >= 0; i--) {
|
||||
Element meta = (Element) metas.item(i);
|
||||
String property = meta.getAttribute("property");
|
||||
if (property.startsWith("booklore:")) {
|
||||
metadataElement.removeChild(meta);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void replaceBelongsToCollection(Element metadataElement, Document doc, String seriesName, Float seriesNumber, boolean[] hasChanges) {
|
||||
NodeList metas = metadataElement.getElementsByTagNameNS("*", "meta");
|
||||
for (int i = metas.getLength() - 1; i >= 0; i--) {
|
||||
Element meta = (Element) metas.item(i);
|
||||
String property = meta.getAttribute("property");
|
||||
if ("belongs-to-collection".equals(property) || "collection-type".equals(property) || "group-position".equals(property)) {
|
||||
String id = meta.getAttribute("id");
|
||||
metadataElement.removeChild(meta);
|
||||
if (StringUtils.isNotBlank(id)) {
|
||||
removeMetaByRefines(metadataElement, "#" + id);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (StringUtils.isNotBlank(seriesName)) {
|
||||
String collectionId = "collection-" + UUID.randomUUID().toString().substring(0, 8);
|
||||
|
||||
Element collectionMeta = doc.createElementNS(OPF_NS, "meta");
|
||||
collectionMeta.setPrefix("opf");
|
||||
collectionMeta.setAttribute("id", collectionId);
|
||||
collectionMeta.setAttribute("property", "belongs-to-collection");
|
||||
collectionMeta.setTextContent(seriesName);
|
||||
metadataElement.appendChild(collectionMeta);
|
||||
|
||||
Element typeMeta = doc.createElementNS(OPF_NS, "meta");
|
||||
typeMeta.setPrefix("opf");
|
||||
typeMeta.setAttribute("property", "collection-type");
|
||||
typeMeta.setAttribute("refines", "#" + collectionId);
|
||||
typeMeta.setTextContent("series");
|
||||
metadataElement.appendChild(typeMeta);
|
||||
|
||||
if (seriesNumber != null && seriesNumber > 0) {
|
||||
Element positionMeta = doc.createElementNS(OPF_NS, "meta");
|
||||
positionMeta.setPrefix("opf");
|
||||
positionMeta.setAttribute("property", "group-position");
|
||||
positionMeta.setAttribute("refines", "#" + collectionId);
|
||||
positionMeta.setTextContent(String.format("%.0f", seriesNumber));
|
||||
metadataElement.appendChild(positionMeta);
|
||||
}
|
||||
|
||||
hasChanges[0] = true;
|
||||
}
|
||||
}
|
||||
|
||||
private void addSubtitleToTitle(Element metadataElement, Document doc, String subtitle) {
|
||||
final String DC_NS = "http://purl.org/dc/elements/1.1/";
|
||||
NodeList metas = metadataElement.getElementsByTagNameNS("*", "meta");
|
||||
for (int i = metas.getLength() - 1; i >= 0; i--) {
|
||||
Element meta = (Element) metas.item(i);
|
||||
String property = meta.getAttribute("property");
|
||||
String refines = meta.getAttribute("refines");
|
||||
if ("title-type".equals(property) && "subtitle".equals(meta.getTextContent())) {
|
||||
if (StringUtils.isNotBlank(refines)) {
|
||||
NodeList titles = metadataElement.getElementsByTagNameNS(DC_NS, "title");
|
||||
for (int j = titles.getLength() - 1; j >= 0; j--) {
|
||||
Element title = (Element) titles.item(j);
|
||||
if (("#" + title.getAttribute("id")).equals(refines)) {
|
||||
metadataElement.removeChild(title);
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
metadataElement.removeChild(meta);
|
||||
}
|
||||
}
|
||||
|
||||
String subtitleId = "subtitle-" + UUID.randomUUID().toString().substring(0, 8);
|
||||
Element subtitleElement = doc.createElementNS(DC_NS, "title");
|
||||
subtitleElement.setPrefix("dc");
|
||||
subtitleElement.setAttribute("id", subtitleId);
|
||||
subtitleElement.setTextContent(subtitle);
|
||||
metadataElement.appendChild(subtitleElement);
|
||||
|
||||
Element typeMeta = doc.createElementNS(OPF_NS, "meta");
|
||||
typeMeta.setPrefix("opf");
|
||||
typeMeta.setAttribute("refines", "#" + subtitleId);
|
||||
typeMeta.setAttribute("property", "title-type");
|
||||
typeMeta.setTextContent("subtitle");
|
||||
metadataElement.appendChild(typeMeta);
|
||||
}
|
||||
|
||||
private void addBookloreMetadata(Element metadataElement, Document doc, BookMetadataEntity metadata) {
|
||||
Element packageElement = doc.getDocumentElement();
|
||||
String existingPrefix = packageElement.getAttribute("prefix");
|
||||
String bookloreNamespace = "booklore: http://booklore.org/metadata/1.0/";
|
||||
|
||||
if (!existingPrefix.contains("booklore:")) {
|
||||
if (existingPrefix.isEmpty()) {
|
||||
packageElement.setAttribute("prefix", bookloreNamespace);
|
||||
} else {
|
||||
packageElement.setAttribute("prefix", existingPrefix.trim() + " " + bookloreNamespace);
|
||||
}
|
||||
}
|
||||
|
||||
removeAllBookloreMetadata(metadataElement);
|
||||
|
||||
if (metadata.getPageCount() != null && metadata.getPageCount() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "page_count", String.valueOf(metadata.getPageCount())));
|
||||
}
|
||||
|
||||
if (metadata.getSeriesTotal() != null && metadata.getSeriesTotal() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "series_total", String.valueOf(metadata.getSeriesTotal())));
|
||||
}
|
||||
|
||||
if (metadata.getAmazonRating() != null && metadata.getAmazonRating() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "amazon_rating", String.valueOf(metadata.getAmazonRating())));
|
||||
}
|
||||
|
||||
if (metadata.getAmazonReviewCount() != null && metadata.getAmazonReviewCount() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "amazon_review_count", String.valueOf(metadata.getAmazonReviewCount())));
|
||||
}
|
||||
|
||||
if (metadata.getGoodreadsRating() != null && metadata.getGoodreadsRating() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "goodreads_rating", String.valueOf(metadata.getGoodreadsRating())));
|
||||
}
|
||||
|
||||
if (metadata.getGoodreadsReviewCount() != null && metadata.getGoodreadsReviewCount() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "goodreads_review_count", String.valueOf(metadata.getGoodreadsReviewCount())));
|
||||
}
|
||||
|
||||
if (metadata.getHardcoverRating() != null && metadata.getHardcoverRating() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "hardcover_rating", String.valueOf(metadata.getHardcoverRating())));
|
||||
}
|
||||
|
||||
if (metadata.getHardcoverReviewCount() != null && metadata.getHardcoverReviewCount() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "hardcover_review_count", String.valueOf(metadata.getHardcoverReviewCount())));
|
||||
}
|
||||
|
||||
if (metadata.getLubimyczytacRating() != null && metadata.getLubimyczytacRating() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "lubimyczytac_rating", String.valueOf(metadata.getLubimyczytacRating())));
|
||||
}
|
||||
|
||||
if (metadata.getRanobedbRating() != null && metadata.getRanobedbRating() > 0) {
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "ranobedb_rating", String.valueOf(metadata.getRanobedbRating())));
|
||||
}
|
||||
|
||||
if (metadata.getMoods() != null && !metadata.getMoods().isEmpty()) {
|
||||
String moodsJson = "[" + String.join(", ", metadata.getMoods().stream()
|
||||
.map(mood -> "\"" + mood.getName().replace("\"", "\\\"") + "\"")
|
||||
.toList()) + "]";
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "moods", moodsJson));
|
||||
}
|
||||
|
||||
if (metadata.getTags() != null && !metadata.getTags().isEmpty()) {
|
||||
String tagsJson = "[" + String.join(", ", metadata.getTags().stream()
|
||||
.map(tag -> "\"" + tag.getName().replace("\"", "\\\"") + "\"")
|
||||
.toList()) + "]";
|
||||
metadataElement.appendChild(createBookloreMetaElement(doc, "tags", tagsJson));
|
||||
}
|
||||
}
|
||||
|
||||
private Element createBookloreMetaElement(Document doc, String property, String value) {
|
||||
Element meta = doc.createElementNS(OPF_NS, "meta");
|
||||
meta.setPrefix("opf");
|
||||
meta.setAttribute("property", "booklore:" + property);
|
||||
meta.setTextContent(value);
|
||||
return meta;
|
||||
}
|
||||
|
||||
private void cleanupCalibreArtifacts(Element metadataElement, Document doc) {
|
||||
Element packageElement = doc.getDocumentElement();
|
||||
if (packageElement.hasAttribute("prefix")) {
|
||||
String prefix = packageElement.getAttribute("prefix");
|
||||
if (prefix.contains("calibre:")) {
|
||||
prefix = prefix.replaceAll("calibre:\\s*https?://[^\\s]+", "").trim();
|
||||
if (prefix.isEmpty()) {
|
||||
packageElement.removeAttribute("prefix");
|
||||
} else {
|
||||
packageElement.setAttribute("prefix", prefix);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (metadataElement.hasAttribute("xmlns:calibre")) {
|
||||
metadataElement.removeAttribute("xmlns:calibre");
|
||||
}
|
||||
|
||||
final String DC_NS = "http://purl.org/dc/elements/1.1/";
|
||||
NodeList identifiers = metadataElement.getElementsByTagNameNS(DC_NS, "identifier");
|
||||
for (int i = identifiers.getLength() - 1; i >= 0; i--) {
|
||||
Element idElement = (Element) identifiers.item(i);
|
||||
String content = idElement.getTextContent().trim().toLowerCase();
|
||||
if (content.startsWith("calibre:") || content.startsWith("urn:calibre:")) {
|
||||
metadataElement.removeChild(idElement);
|
||||
}
|
||||
}
|
||||
|
||||
NodeList contributors = metadataElement.getElementsByTagNameNS(DC_NS, "contributor");
|
||||
for (int i = contributors.getLength() - 1; i >= 0; i--) {
|
||||
Element contributor = (Element) contributors.item(i);
|
||||
String text = contributor.getTextContent().toLowerCase();
|
||||
if (text.contains("calibre")) {
|
||||
String id = contributor.getAttribute("id");
|
||||
metadataElement.removeChild(contributor);
|
||||
if (StringUtils.isNotBlank(id)) {
|
||||
removeMetaByRefines(metadataElement, "#" + id);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
NodeList metas = metadataElement.getElementsByTagNameNS("*", "meta");
|
||||
for (int i = metas.getLength() - 1; i >= 0; i--) {
|
||||
Element meta = (Element) metas.item(i);
|
||||
String property = meta.getAttribute("property");
|
||||
String name = meta.getAttribute("name");
|
||||
|
||||
if (property.startsWith("calibre:") || name.startsWith("calibre:")) {
|
||||
metadataElement.removeChild(meta);
|
||||
}
|
||||
}
|
||||
}

    private void organizeMetadataElements(Element metadataElement) {
        final String DC_NS = "http://purl.org/dc/elements/1.1/";
        // Bucket children by type, then re-append in canonical OPF order.
        java.util.List<Element> identifiers = new java.util.ArrayList<>();
        java.util.List<Element> titles = new java.util.ArrayList<>();
        java.util.List<Element> creators = new java.util.ArrayList<>();
        java.util.List<Element> contributors = new java.util.ArrayList<>();
        java.util.List<Element> languages = new java.util.ArrayList<>();
        java.util.List<Element> dates = new java.util.ArrayList<>();
        java.util.List<Element> publishers = new java.util.ArrayList<>();
        java.util.List<Element> descriptions = new java.util.ArrayList<>();
        java.util.List<Element> subjects = new java.util.ArrayList<>();
        java.util.List<Element> seriesMetas = new java.util.ArrayList<>();
        java.util.List<Element> bookloreMetas = new java.util.ArrayList<>();
        java.util.List<Element> modifiedMetas = new java.util.ArrayList<>();
        java.util.List<Element> otherMetas = new java.util.ArrayList<>();

        NodeList allChildren = metadataElement.getChildNodes();
        for (int i = 0; i < allChildren.getLength(); i++) {
            Node node = allChildren.item(i);
            if (node.getNodeType() != Node.ELEMENT_NODE) continue;
            Element elem = (Element) node;
            String localName = elem.getLocalName();
            String ns = elem.getNamespaceURI();

            if (DC_NS.equals(ns)) {
                switch (localName) {
                    case "identifier" -> identifiers.add(elem);
                    case "title" -> titles.add(elem);
                    case "creator" -> creators.add(elem);
                    case "contributor" -> contributors.add(elem);
                    case "language" -> languages.add(elem);
                    case "date" -> dates.add(elem);
                    case "publisher" -> publishers.add(elem);
                    case "description" -> descriptions.add(elem);
                    case "subject" -> subjects.add(elem);
                }
            } else if ("meta".equals(localName)) {
                String property = elem.getAttribute("property");
                if (property.startsWith("booklore:")) {
                    bookloreMetas.add(elem);
                } else if (property.equals("dcterms:modified") || property.equals("calibre:timestamp")) {
                    modifiedMetas.add(elem);
                } else if (property.equals("belongs-to-collection") || property.equals("collection-type") || property.equals("group-position")) {
                    seriesMetas.add(elem);
                } else {
                    otherMetas.add(elem);
                }
            }
        }

        // Clear the element, then rebuild it in a deterministic order.
        while (metadataElement.hasChildNodes()) {
            metadataElement.removeChild(metadataElement.getFirstChild());
        }

        identifiers.forEach(metadataElement::appendChild);
        titles.forEach(metadataElement::appendChild);
        creators.forEach(metadataElement::appendChild);
        contributors.forEach(metadataElement::appendChild);
        languages.forEach(metadataElement::appendChild);
        dates.forEach(metadataElement::appendChild);
        publishers.forEach(metadataElement::appendChild);
        descriptions.forEach(metadataElement::appendChild);
        subjects.forEach(metadataElement::appendChild);
        seriesMetas.forEach(metadataElement::appendChild);
        modifiedMetas.forEach(metadataElement::appendChild);
        otherMetas.forEach(metadataElement::appendChild);
        bookloreMetas.forEach(metadataElement::appendChild);
    }
}
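The identifier cleanup above is standard DOM traversal: parse the OPF metadata block with a namespace-aware builder, walk the live `NodeList` backwards, and drop `calibre:`/`urn:calibre:` identifiers. A self-contained sketch of that pass (the OPF fragment here is a hypothetical sample, not from the repository's test data):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CalibreIdentifierCleanupDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical minimal metadata block with one real and one Calibre identifier.
        String opf = """
                <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
                  <dc:identifier>urn:isbn:9780000000000</dc:identifier>
                  <dc:identifier>urn:calibre:42</dc:identifier>
                </metadata>""";
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true); // required for getElementsByTagNameNS
        Element metadata = factory.newDocumentBuilder()
                .parse(new ByteArrayInputStream(opf.getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();

        final String DC_NS = "http://purl.org/dc/elements/1.1/";
        NodeList ids = metadata.getElementsByTagNameNS(DC_NS, "identifier");
        // Iterate backwards so removals do not shift the live NodeList under us.
        for (int i = ids.getLength() - 1; i >= 0; i--) {
            Element id = (Element) ids.item(i);
            String content = id.getTextContent().trim().toLowerCase();
            if (content.startsWith("calibre:") || content.startsWith("urn:calibre:")) {
                metadata.removeChild(id);
            }
        }
        // Only the ISBN identifier survives.
        System.out.println(metadata.getElementsByTagNameNS(DC_NS, "identifier").getLength());
    }
}
```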
@@ -133,6 +133,13 @@ public class MetadataCopyHelper {
         }
     }
 
+    public void copyHardcoverBookId(boolean clear, Consumer<String> consumer) {
+        if (!isLocked(metadata.getHardcoverBookIdLocked())) {
+            if (clear) consumer.accept(null);
+            else if (metadata.getHardcoverBookId() != null) consumer.accept(metadata.getHardcoverBookId());
+        }
+    }
+
     public void copyGoogleId(boolean clear, Consumer<String> consumer) {
         if (!isLocked(metadata.getGoogleIdLocked())) {
             if (clear) consumer.accept(null);
@@ -140,6 +147,13 @@ public class MetadataCopyHelper {
         }
     }
 
+    public void copyLubimyczytacId(boolean clear, Consumer<String> consumer) {
+        if (!isLocked(metadata.getLubimyczytacIdLocked())) {
+            if (clear) consumer.accept(null);
+            else if (metadata.getLubimyczytacId() != null) consumer.accept(metadata.getLubimyczytacId());
+        }
+    }
+
     public void copyRanobedbId(boolean clear, Consumer<String> consumer) {
         if (!isLocked(metadata.getRanobedbIdLocked())) {
             if (clear) consumer.accept(null);
@@ -18,6 +18,7 @@ import com.adityachandel.booklore.model.enums.ReadStatus;
 import com.adityachandel.booklore.model.enums.ResetProgressType;
 import com.adityachandel.booklore.model.enums.UserPermission;
 import com.adityachandel.booklore.repository.*;
+import com.adityachandel.booklore.service.hardcover.HardcoverSyncService;
 import com.adityachandel.booklore.service.kobo.KoboReadingStateService;
 import lombok.RequiredArgsConstructor;
 import lombok.extern.slf4j.Slf4j;
@@ -44,6 +45,7 @@ public class ReadingProgressService {
     private final UserRepository userRepository;
     private final AuthenticationService authenticationService;
     private final KoboReadingStateService koboReadingStateService;
+    private final HardcoverSyncService hardcoverSyncService;
 
     // ==================== Methods from UserProgressService ====================
 
@@ -242,6 +244,10 @@ public class ReadingProgressService {
         }
 
         userBookProgressRepository.save(progress);
+
+        if (percentage != null) {
+            hardcoverSyncService.syncProgressToHardcover(book.getId(), percentage, user.getId());
+        }
     }
 
     @Transactional
@@ -119,6 +119,7 @@ public class MetadataChangeDetector {
         compareValue(diffs, "hardcoverId", clear.isHardcoverId(), newMeta.getHardcoverId(), existingMeta.getHardcoverId(), () -> !isTrue(existingMeta.getHardcoverIdLocked()));
+        compareValue(diffs, "hardcoverBookId", clear.isHardcoverBookId(), newMeta.getHardcoverBookId(), existingMeta.getHardcoverBookId(), () -> !isTrue(existingMeta.getHardcoverBookIdLocked()));
         compareValue(diffs, "googleId", clear.isGoogleId(), newMeta.getGoogleId(), existingMeta.getGoogleId(), () -> !isTrue(existingMeta.getGoogleIdLocked()));
         compareValue(diffs, "lubimyczytacId", clear.isLubimyczytacId(), newMeta.getLubimyczytacId(), existingMeta.getLubimyczytacId(), () -> !isTrue(existingMeta.getLubimyczytacIdLocked()));
         compareValue(diffs, "ranobedbId", clear.isRanobedbId(), newMeta.getRanobedbId(), existingMeta.getRanobedbId(), () -> !isTrue(existingMeta.getRanobedbIdLocked()));
         compareValue(diffs, "language", clear.isLanguage(), newMeta.getLanguage(), existingMeta.getLanguage(), () -> !isTrue(existingMeta.getLanguageLocked()));
         compareValue(diffs, "authors", clear.isAuthors(), newMeta.getAuthors(), toNameSet(existingMeta.getAuthors()), () -> !isTrue(existingMeta.getAuthorsLocked()));
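The `compareValue` calls above all follow the same lock-gated pattern: a new value only counts as a change when the values actually differ and the corresponding `…Locked` flag is not set. A minimal standalone sketch of that pattern (the names and signature here are illustrative, not the project's actual API):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Objects;
import java.util.function.BooleanSupplier;

public class LockGatedDiffDemo {
    // Record a field as changed only when values differ and the field is unlocked.
    static void compareValue(Map<String, String> diffs, String field,
                             String newValue, String oldValue, BooleanSupplier unlocked) {
        if (!Objects.equals(newValue, oldValue) && unlocked.getAsBoolean()) {
            diffs.put(field, newValue);
        }
    }

    public static void main(String[] args) {
        Map<String, String> diffs = new LinkedHashMap<>();
        boolean hardcoverBookIdLocked = true;   // locked: the change is ignored
        boolean googleIdLocked = false;         // unlocked: the change is recorded
        compareValue(diffs, "hardcoverBookId", "99999", "12345", () -> !hardcoverBookIdLocked);
        compareValue(diffs, "googleId", "abc", "xyz", () -> !googleIdLocked);
        System.out.println(diffs); // only the unlocked field survives
    }
}
```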
@@ -0,0 +1,3 @@
+-- Change hardcover_book_id from INTEGER to VARCHAR(100) for consistency with other provider IDs
+-- This prevents overflow issues with large book IDs (e.g., > 2,147,483,647)
+ALTER TABLE book_metadata MODIFY COLUMN hardcover_book_id VARCHAR(100);
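The overflow the migration comment refers to is the signed 32-bit limit: a numeric column (or Java `int`) caps out at 2,147,483,647, while a `VARCHAR(100)` column side-steps the limit entirely. A quick illustration of why the ID is now carried as a `String` end to end:

```java
public class HardcoverIdOverflowDemo {
    public static void main(String[] args) {
        // One past Integer.MAX_VALUE (2,147,483,647): cannot be parsed into an int.
        String largeBookId = "2147483648";
        boolean fitsInInt;
        try {
            Integer.parseInt(largeBookId);
            fitsInInt = true;
        } catch (NumberFormatException e) {
            fitsInInt = false;
        }
        System.out.println("fits in int: " + fitsInInt);
        // Kept as a String (as the VARCHAR(100) column does), the ID round-trips unchanged.
        System.out.println("stored: " + largeBookId);
    }
}
```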
@@ -166,7 +166,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should use stored hardcoverBookId when available")
     void syncProgressToHardcover_withStoredBookId_shouldUseStoredId() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         // Mock successful responses for the chain
@@ -212,7 +212,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should set status to READ when progress >= 99%")
     void syncProgressToHardcover_whenProgress99Percent_shouldMakeApiCalls() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         when(responseSpec.body(Map.class))
@@ -228,7 +228,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should set status to CURRENTLY_READING when progress < 99%")
     void syncProgressToHardcover_whenProgressLessThan99_shouldMakeApiCalls() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         when(responseSpec.body(Map.class))
@@ -244,7 +244,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should handle existing user_book gracefully")
     void syncProgressToHardcover_whenUserBookExists_shouldFindExisting() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         // Mock: insert_user_book returns error, then find existing, then create progress
@@ -262,7 +262,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should update existing reading progress")
     void syncProgressToHardcover_whenProgressExists_shouldUpdate() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         // Mock: insert_user_book -> find existing read -> update read
@@ -298,7 +298,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should handle API errors gracefully")
     void syncProgressToHardcover_whenApiError_shouldNotThrow() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
        testMetadata.setPageCount(300);
 
         when(responseSpec.body(Map.class)).thenReturn(Map.of("errors", List.of(Map.of("message", "Unauthorized"))));
@@ -309,7 +309,7 @@ class HardcoverSyncServiceTest {
     @Test
     @DisplayName("Should handle null response gracefully")
     void syncProgressToHardcover_whenResponseNull_shouldNotThrow() {
-        testMetadata.setHardcoverBookId(12345);
+        testMetadata.setHardcoverBookId("12345");
         testMetadata.setPageCount(300);
 
         when(responseSpec.body(Map.class)).thenReturn(null);
@@ -477,8 +477,8 @@ class HardcoverSyncServiceTest {
 
         Object result = method.invoke(service, 77);
         assertNotNull(result);
-        assertEquals(77, readPrivateField(result, "id"));
-        assertEquals(250, readPrivateField(result, "pages"));
+        assertEquals(77, readPrivateIntField(result, "id"));
+        assertEquals(250, readPrivateIntField(result, "pages"));
     }
 
     @Test
@@ -521,12 +521,18 @@ class HardcoverSyncServiceTest {
 
         Object result = method.invoke(service, 123);
         assertNotNull(result);
-        assertEquals(123, readPrivateField(result, "bookId"));
-        assertEquals(88, readPrivateField(result, "editionId"));
-        assertEquals(320, readPrivateField(result, "pages"));
+        assertEquals("123", readPrivateStringField(result, "bookId"));
+        assertEquals(88, readPrivateIntField(result, "editionId"));
+        assertEquals(320, readPrivateIntField(result, "pages"));
     }
 
-    private Integer readPrivateField(Object target, String fieldName) throws Exception {
+    private String readPrivateStringField(Object target, String fieldName) throws Exception {
         Field field = target.getClass().getDeclaredField(fieldName);
         field.setAccessible(true);
+        return (String) field.get(target);
+    }
+
+    private Integer readPrivateIntField(Object target, String fieldName) throws Exception {
+        Field field = target.getClass().getDeclaredField(fieldName);
+        field.setAccessible(true);
         return (Integer) field.get(target);
@@ -245,7 +245,7 @@ class HardcoverParserTest {
         assertThat(metadata.getSubtitle()).isEqualTo("A Subtitle");
         assertThat(metadata.getDescription()).isEqualTo("A description");
         assertThat(metadata.getHardcoverId()).isEqualTo("test-book-slug");
-        assertThat(metadata.getHardcoverBookId()).isEqualTo(12345);
+        assertThat(metadata.getHardcoverBookId()).isEqualTo("12345");
         assertThat(metadata.getHardcoverRating()).isEqualTo(4.25);
         assertThat(metadata.getHardcoverReviewCount()).isEqualTo(100);
         assertThat(metadata.getPageCount()).isEqualTo(350);
@@ -463,7 +463,7 @@ class HardcoverParserTest {
         List<BookMetadata> results = parser.fetchMetadata(book, request);
 
         assertThat(results).hasSize(1);
-        assertThat(results.get(0).getHardcoverBookId()).isNull();
+        assertThat(results.get(0).getHardcoverBookId()).isEqualTo("not-a-number");
     }
 
     @Test
@@ -68,6 +68,7 @@ public class MetadataChangeDetectorTest {
                 .goodreadsIdLocked(false)
                 .comicvineIdLocked(false)
                 .hardcoverIdLocked(false)
+                .hardcoverBookIdLocked(false)
                 .googleIdLocked(false)
                 .pageCountLocked(false)
                 .languageLocked(false)
@@ -82,6 +83,7 @@ public class MetadataChangeDetectorTest {
                 .categoriesLocked(false)
                 .moodsLocked(false)
                 .tagsLocked(false)
+                .reviewsLocked(false)
                 .authors(Set.of(
                         AuthorEntity.builder().id(1L).name("Author One").build(),
                         AuthorEntity.builder().id(2L).name("Author Two").build()
@@ -139,6 +141,7 @@ public class MetadataChangeDetectorTest {
                 .goodreadsIdLocked(false)
                 .comicvineIdLocked(false)
                 .hardcoverIdLocked(false)
+                .hardcoverBookIdLocked(false)
                 .googleIdLocked(false)
                 .pageCountLocked(false)
                 .languageLocked(false)
@@ -153,6 +156,7 @@ public class MetadataChangeDetectorTest {
                 .categoriesLocked(false)
                 .moodsLocked(false)
                 .tagsLocked(false)
+                .reviewsLocked(false)
                 .authors(Set.of("Author One", "Author Two"))
                 .categories(Set.of("Fiction", "Mystery"))
                 .moods(Set.of("Dark", "Suspenseful"))
@@ -364,6 +368,31 @@ public class MetadataChangeDetectorTest {
                 .tags(Set.of(TagEntity.builder().id(1L).name("Tag").build()))
                 .tagsLocked(false)
+                .titleLocked(false)
+                .subtitleLocked(false)
+                .publisherLocked(false)
+                .publishedDateLocked(false)
+                .descriptionLocked(false)
+                .seriesNameLocked(false)
+                .seriesNumberLocked(false)
+                .seriesTotalLocked(false)
+                .isbn13Locked(false)
+                .isbn10Locked(false)
+                .asinLocked(false)
+                .goodreadsIdLocked(false)
+                .comicvineIdLocked(false)
+                .hardcoverIdLocked(false)
+                .hardcoverBookIdLocked(false)
+                .googleIdLocked(false)
+                .pageCountLocked(false)
+                .languageLocked(false)
+                .amazonRatingLocked(false)
+                .amazonReviewCountLocked(false)
+                .goodreadsRatingLocked(false)
+                .goodreadsReviewCountLocked(false)
+                .hardcoverRatingLocked(false)
+                .hardcoverReviewCountLocked(false)
+                .coverLocked(false)
                 .reviewsLocked(false)
                 .build();
 
         BookMetadata testNew = BookMetadata.builder()
@@ -378,6 +407,31 @@ public class MetadataChangeDetectorTest {
                 .tags(Set.of("Tag")) // Match existing
                 .tagsLocked(false)
+                .titleLocked(false)
+                .subtitleLocked(false)
+                .publisherLocked(false)
+                .publishedDateLocked(false)
+                .descriptionLocked(false)
+                .seriesNameLocked(false)
+                .seriesNumberLocked(false)
+                .seriesTotalLocked(false)
+                .isbn13Locked(false)
+                .isbn10Locked(false)
+                .asinLocked(false)
+                .goodreadsIdLocked(false)
+                .comicvineIdLocked(false)
+                .hardcoverIdLocked(false)
+                .hardcoverBookIdLocked(false)
+                .googleIdLocked(false)
+                .pageCountLocked(false)
+                .languageLocked(false)
+                .amazonRatingLocked(false)
+                .amazonReviewCountLocked(false)
+                .goodreadsRatingLocked(false)
+                .goodreadsReviewCountLocked(false)
+                .hardcoverRatingLocked(false)
+                .hardcoverReviewCountLocked(false)
+                .coverLocked(false)
                 .reviewsLocked(false)
                 .build();
         boolean result = MetadataChangeDetector.isDifferent(testNew, testExisting, clearFlags);
         assertTrue(result, "Should return true for empty collection to null transition");
@@ -616,12 +670,76 @@ public class MetadataChangeDetectorTest {
                .title("Test")
                .authors(Set.of())
                .authorsLocked(false)
                .categories(Set.of())
                .categoriesLocked(false)
                .moods(Set.of())
                .moodsLocked(false)
                .tags(Set.of())
                .tagsLocked(false)
                .titleLocked(false)
                .subtitleLocked(false)
                .publisherLocked(false)
                .publishedDateLocked(false)
                .descriptionLocked(false)
                .seriesNameLocked(false)
                .seriesNumberLocked(false)
                .seriesTotalLocked(false)
                .isbn13Locked(false)
                .isbn10Locked(false)
                .asinLocked(false)
                .goodreadsIdLocked(false)
                .comicvineIdLocked(false)
                .hardcoverIdLocked(false)
                .hardcoverBookIdLocked(false)
                .googleIdLocked(false)
                .pageCountLocked(false)
                .languageLocked(false)
                .amazonRatingLocked(false)
                .amazonReviewCountLocked(false)
                .goodreadsRatingLocked(false)
                .goodreadsReviewCountLocked(false)
                .hardcoverRatingLocked(false)
                .hardcoverReviewCountLocked(false)
                .coverLocked(false)
                .reviewsLocked(false)
                .build();
        BookMetadata testNew = BookMetadata.builder()
                .bookId(1L)
                .title("Test")
                .authors(null)
                .authorsLocked(false)
                .categories(Set.of())
                .categoriesLocked(false)
                .moods(Set.of())
                .moodsLocked(false)
                .tags(Set.of())
                .tagsLocked(false)
                .titleLocked(false)
                .subtitleLocked(false)
                .publisherLocked(false)
                .publishedDateLocked(false)
                .descriptionLocked(false)
                .seriesNameLocked(false)
                .seriesNumberLocked(false)
                .seriesTotalLocked(false)
                .isbn13Locked(false)
                .isbn10Locked(false)
                .asinLocked(false)
                .goodreadsIdLocked(false)
                .comicvineIdLocked(false)
                .hardcoverIdLocked(false)
                .hardcoverBookIdLocked(false)
                .googleIdLocked(false)
                .pageCountLocked(false)
                .languageLocked(false)
                .amazonRatingLocked(false)
                .amazonReviewCountLocked(false)
                .goodreadsRatingLocked(false)
                .goodreadsReviewCountLocked(false)
                .hardcoverRatingLocked(false)
                .hardcoverReviewCountLocked(false)
                .coverLocked(false)
                .reviewsLocked(false)
                .build();
        boolean result = MetadataChangeDetector.hasValueChanges(testNew, testExisting, clearFlags);
        assertTrue(result, "Should return true for empty set to null transition");
@@ -322,6 +322,8 @@ export class BookdropFileReviewComponent implements OnInit {
+      hardcoverBookId: original?.hardcoverBookId ?? null,
       hardcoverRating: original?.hardcoverRating ?? null,
       hardcoverReviewCount: original?.hardcoverReviewCount ?? null,
+      lubimyczytacId: original?.lubimyczytacId ?? null,
       lubimyczytacRating: original?.lubimyczytacRating ?? null,
       googleId: original?.googleId ?? null,
       comicvineId: original?.comicvineId ?? null,
       ranobedbId: original?.ranobedbId ?? null,
@@ -600,6 +602,8 @@ export class BookdropFileReviewComponent implements OnInit {
+      hardcoverBookId: new FormControl(original?.hardcoverBookId ?? ''),
       hardcoverRating: new FormControl(original?.hardcoverRating ?? ''),
       hardcoverReviewCount: new FormControl(original?.hardcoverReviewCount ?? ''),
+      lubimyczytacId: new FormControl(original?.lubimyczytacId ?? ''),
       lubimyczytacRating: new FormControl(original?.lubimyczytacRating ?? ''),
       ranobedbId: new FormControl(original?.ranobedbId ?? ''),
       ranobedbRating: new FormControl(original?.ranobedbRating ?? ''),
       googleId: new FormControl(original?.googleId ?? ''),
@@ -644,6 +644,9 @@ export class MetadataEditorComponent implements OnInit {
      hardcoverId: wasCleared("hardcoverId"),
      hardcoverRating: wasCleared("hardcoverRating"),
      hardcoverReviewCount: wasCleared("hardcoverReviewCount"),
      hardcoverBookId: wasCleared("hardcoverBookId"),
      lubimyczytacId: wasCleared("lubimyczytacId"),
      lubimyczytacRating: wasCleared("lubimyczytacRating"),
      ranobedbId: wasCleared("ranobedbId"),
      ranobedbRating: wasCleared("ranobedbRating"),
      googleId: wasCleared("googleId"),
@@ -875,4 +878,8 @@ export class MetadataEditorComponent implements OnInit {
   }
 
   protected readonly sample = sample;
+
+  onFieldChange(): void {
+    this.metadataForm.markAsDirty();
+  }
 }