We can see in the `uv.lock` changes that we aren't actually removing
anything, but since these dependencies are not accessed directly, it's
better to declare only direct dependencies, in case they change in the
future and these indirect dependencies are no longer needed.
* Add type annotations for FastAPI header, query, and path parameters
  (see the sketch after this list).
* Add type annotations for request body content.
* Update docstrings to clarify endpoint functionality, and remove
unnecessary details.
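For illustration, a minimal sketch of the annotation style, assuming
FastAPI's `Annotated` parameter declarations (the endpoint, parameter
names, and body schema below are hypothetical, not RomM's actual routes):

```python
from typing import Annotated

from fastapi import APIRouter, Header, Path, Query
from pydantic import BaseModel

router = APIRouter()


class RomUpdate(BaseModel):
    """Hypothetical request body schema."""

    name: str | None = None
    summary: str | None = None


@router.put("/roms/{rom_id}")
async def update_rom(
    rom_id: Annotated[int, Path(description="ID of the ROM to update")],
    remove_cover: Annotated[bool, Query(description="Remove the current cover")] = False,
    x_upload_filename: Annotated[str | None, Header()] = None,
    body: RomUpdate | None = None,
) -> None:
    """Update a ROM's metadata."""
    ...
```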
The breaking code in `streaming-form-data` has been fixed, and Poetry
was also patched based on @gantoine's proposed fix. This allows us
to use the official releases again instead of a fork.
This change replaces the `httpx` client with `aiohttp` for the
RetroAchievements API service.
The main reason for this change is that `httpx` emits an unavoidable
`INFO`-level log line that includes the full request URL, which contains
the user's API key.
`httpx` has had an
[open discussion](https://github.com/encode/httpx/discussions/2765)
regarding this security issue for almost two years.
The change to `aiohttp` is painless, and would allow us to migrate more
of the codebase to it in the future, to avoid leaking sensitive
information in logs.
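As a rough sketch of the new request pattern (the endpoint and query
parameter names below are illustrative, not necessarily the exact
RetroAchievements API shape):

```python
import aiohttp


async def get_game(api_user: str, api_key: str, game_id: int) -> dict:
    # The API key travels as a query parameter. Unlike httpx, aiohttp does
    # not log the full request URL at INFO level, so the key stays out of
    # the logs.
    params = {"z": api_user, "y": api_key, "i": game_id}
    async with aiohttp.ClientSession() as session:
        async with session.get(
            "https://retroachievements.org/API/API_GetGame.php", params=params
        ) as response:
            response.raise_for_status()
            return await response.json()
```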
This change only verifies that Alembic can upgrade and downgrade through
all the current migrations. It does not verify that the application
works correctly with PostgreSQL.
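A minimal sketch of what such a round-trip check can look like (the test
name and configuration path are assumptions):

```python
from alembic import command
from alembic.config import Config


def test_migrations_upgrade_and_downgrade() -> None:
    # Assumes the database settings already point at a disposable
    # PostgreSQL instance (e.g. one started by the CI job).
    config = Config("alembic.ini")
    command.upgrade(config, "head")    # walk up through every migration
    command.downgrade(config, "base")  # and back down again
```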
This change initializes the Sentry SDK, which enables error tracking
when the `SENTRY_DSN` environment variable is set.
Drop-in alternatives to Sentry are also supported, like GlitchTip.
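Roughly, the initialization looks like the sketch below (the sample rate
and any option other than the DSN are illustrative):

```python
import os

import sentry_sdk

SENTRY_DSN = os.environ.get("SENTRY_DSN")

if SENTRY_DSN:
    # Without a DSN the SDK is never initialized, so error tracking stays
    # disabled by default. A GlitchTip instance only needs its own DSN here.
    sentry_sdk.init(dsn=SENTRY_DSN, traces_sample_rate=0.1)
```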
Trying to run MySQL as the database (instead of MariaDB) was failing due
to incompatible migrations and lack of support for the MySQL driver.
These small changes fix the issue and restore support for MySQL as a
database driver.
Support for MySQL is on a best-effort basis, as the project's main
database is MariaDB.
At the moment, 7zip files are causing memory issues and even OOM
errors on user installations. This is because the current stable release
of `py7zr` does not support streaming decompression, and RomM needs to
decompress each 7zip file in the library into memory to be able to
calculate hashes.
This change introduces a `py7zr` fork I created to have a stable commit
SHA to refer to in case upstream is ever force-pushed. It includes the
contents of the pull request the `py7zr` creator is working on to
support streaming decompression [1].
The way streaming decompression is implemented in `py7zr` is different
from the other compression utilities. Instead of providing a `bytes`
iterator, we need to provide a `Py7zIO` implementation whose read and
write methods are invoked as callbacks on each operation.
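Roughly, the integration looks like the sketch below. The `Py7zIO` and
factory hooks come from the linked pull request, so treat the exact
method names and the `extract(factory=...)` call as assumptions rather
than a stable API; the hashing logic is simplified:

```python
import hashlib

import py7zr
from py7zr.io import Py7zIO, WriterFactory  # interface names from the linked PR


class HashingIO(Py7zIO):
    """Feeds each decompressed chunk into a hash instead of keeping it in RAM."""

    def __init__(self, filename: str) -> None:
        self.filename = filename
        self.md5 = hashlib.md5()
        self._size = 0

    def write(self, data: bytes) -> int:
        # Called by py7zr once per decompressed chunk.
        self.md5.update(data)
        self._size += len(data)
        return len(data)

    def read(self, size: int | None = None) -> bytes:
        return b""

    def seek(self, offset: int, whence: int = 0) -> int:
        return offset

    def flush(self) -> None:
        pass

    def size(self) -> int:
        return self._size


class HashingFactory(WriterFactory):
    def create(self, filename: str) -> Py7zIO:
        return HashingIO(filename)


with py7zr.SevenZipFile("archive.7z") as archive:
    archive.extract(factory=HashingFactory())
```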
[1] https://github.com/miurahr/py7zr/pull/620
The short-term goal is to fully type-hint the IGDB API responses. This
first change adds the base structures and enums RomM currently uses.
The `ExpandableField` type will allow us to model the expansion
mechanism the IGDB API provides, where a field can include either an ID,
or the full nested structure.
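Conceptually, the expandable fields can be modeled along these lines (a
simplified sketch; the structures below are illustrative, not the exact
definitions added by this change):

```python
from typing import TypedDict, TypeVar, Union

T = TypeVar("T")

# An expandable IGDB field holds either the referenced entity's ID or the
# fully expanded nested structure, depending on the query.
ExpandableField = Union[int, T]


class IGDBCover(TypedDict, total=False):
    id: int
    url: str


class IGDBGame(TypedDict, total=False):
    id: int
    name: str
    # A plain numeric ID unless the query expands the `cover` field.
    cover: ExpandableField[IGDBCover]
```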
This change replaces the creation of the Redis URL, switching from simple
string interpolation to `yarl.URL`. The main benefit, besides not having
to remember to set all five different variables on every Redis client
initialization, is that user credentials are correctly URL-encoded, if
present.
Up until now, if a password had special characters, it could break the
generated URL.
This change also introduces support for a `REDIS_SSL` setting, which
allows the user to specify if the Redis connection should use SSL or not.
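A condensed sketch of the approach (the exact setting names and defaults
are assumptions):

```python
import os

from yarl import URL

REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
REDIS_USERNAME = os.environ.get("REDIS_USERNAME", "")
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD", "")
REDIS_DB = int(os.environ.get("REDIS_DB", "0"))
REDIS_SSL = os.environ.get("REDIS_SSL", "false").lower() == "true"

# yarl percent-encodes the credentials, so a password with special
# characters no longer breaks the generated URL.
REDIS_URL = str(
    URL.build(
        scheme="rediss" if REDIS_SSL else "redis",
        user=REDIS_USERNAME or None,
        password=REDIS_PASSWORD or None,
        host=REDIS_HOST,
        port=REDIS_PORT,
        path=f"/{REDIS_DB}",
    )
)
```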
This change improves memory usage by keeping only a single archive
member file in memory at a time during 7zip decompression.
The `py7zr` library does not support streaming decompression yet, so
this change is the best we can do for now.
Potential fix for #1211, but it won't improve memory usage for
single-file 7zip archives.
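A sketch of the resulting pattern, assuming `py7zr`'s `read(targets=...)`
and `reset()` calls (the hash handling is simplified):

```python
import hashlib

import py7zr


def hash_7z_members(path: str) -> dict[str, str]:
    """Hash each archive member while holding only one of them in memory."""
    hashes: dict[str, str] = {}
    with py7zr.SevenZipFile(path) as archive:
        for name in archive.getnames():
            # Reading a single target decompresses only that member.
            extracted = archive.read(targets=[name]) or {}
            if name in extracted:
                hashes[name] = hashlib.md5(extracted[name].read()).hexdigest()
            archive.reset()  # required before reading further members
    return hashes
```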
This change installs and configures the `mod_zip` nginx module [1],
which allows nginx to stream ZIP files directly.
It includes a workaround needed to correctly calculate CRC-32 values for
the included files: a new `server` section listening on port 8081, used
only so that file requests become upstream subrequests, which correctly
trigger the CRC-32 calculation logic.
Also, to be able to provide an `m3u` file generated on the fly, we add a
`/decode` endpoint fully implemented in nginx using NJS, which receives
a `value` URL parameter and decodes it using base64. The decoded value is
returned as the response.
That way, the contents of the `m3u` file are base64-encoded and set as
part of the response, for `mod_zip` to include it in the ZIP file.
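A condensed sketch of the file-list response the backend hands to
`mod_zip`, assuming a FastAPI endpoint (the paths, sizes, and exact
`/decode` URL shape below are illustrative):

```python
import base64
from urllib.parse import quote

from fastapi import APIRouter
from fastapi.responses import PlainTextResponse

router = APIRouter()


@router.get("/roms/{rom_id}/download")
async def download_rom(rom_id: int) -> PlainTextResponse:
    # Hypothetical member files; in practice they come from the ROM entry.
    files = ["/library/roms/game/disc1.bin", "/library/roms/game/disc2.bin"]
    sizes = [734_003_200, 734_003_200]

    # The m3u playlist is generated on the fly and smuggled through the
    # NJS /decode endpoint as a base64-encoded query parameter.
    m3u_content = "\n".join(path.rsplit("/", 1)[-1] for path in files).encode()
    m3u_b64 = base64.b64encode(m3u_content).decode()

    # mod_zip consumes one "<crc32> <size> <location> <name>" line per entry.
    # CRC-32 is left as "-" here; the internal :8081 server turns each file
    # location into an upstream subrequest, which triggers the CRC-32
    # calculation described above.
    lines = [
        f"- {size} {quote(path)} {path.rsplit('/', 1)[-1]}"
        for path, size in zip(files, sizes)
    ]
    lines.append(f"- {len(m3u_content)} /decode?value={quote(m3u_b64)} game.m3u")

    return PlainTextResponse(
        "\n".join(lines) + "\n", headers={"X-Archive-Files": "zip"}
    )
```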
[1] https://github.com/evanmiller/mod_zip