
Restoring large (1.4GB) postgres backup using powershell encounters System.OutOfMemoryException #14619

Closed

artus42 opened this issue Dec 10, 2024 · 2 comments


artus42 commented Dec 10, 2024

The bug

I tried following the procedure described in the documentation (https://immich.app/docs/administration/backup-and-restore/) to manually restore a backup of the postgres database. Note that I'm on Windows, so I followed the PowerShell variant of the procedure.

However, at the step where the database is restored from the .sql file, the documentation says to run `gc "C:\path\to\backup\dump.sql" | docker exec -i immich_postgres psql --username=postgres`.

This then errors in my case with a System.OutOfMemoryException:

Error running the program "docker.exe": An exception of type "System.OutOfMemoryException" was thrown. (message translated from German)
At line:1 char:69
+ ... 556194.sql" | docker exec -i immich_postgres psql --username=postgres
+                   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~.
At line:1 char:1
+ gc "F:\immich\library\backups\immich-db-backup-1733405556194.sql" | d ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ResourceUnavailable: (:) [], ApplicationFailedException
    + FullyQualifiedErrorId : NativeCommandFailed

I assume this is due to PowerShell's Get-Content not coping with the size of the 1.4 GB .sql file. After the exception was thrown, the PowerShell process in which the command was run occupied roughly 8 GB in Task Manager, far more than the size of the restoration file itself.
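One way to sidestep the per-line object overhead of Get-Content is to avoid the PowerShell pipeline for the file transfer entirely, or at least stream the file in batches. The following is an untested sketch using the paths from this report; `-ReadCount` is a standard Get-Content parameter, but I have not verified either variant against a dump of this size:

```powershell
# Untested sketch: let cmd.exe stream the file into the container via
# stdin redirection, bypassing PowerShell's object pipeline entirely.
cmd /c 'docker exec -i immich_postgres psql --username=postgres < "F:\immich\library\backups\immich-db-backup-1733405556194.sql"'

# Alternative sketch: stay in PowerShell but emit lines in batches of 1000
# instead of one string object per line, reducing pipeline overhead.
Get-Content "F:\immich\library\backups\immich-db-backup-1733405556194.sql" -ReadCount 1000 |
    docker exec -i immich_postgres psql --username=postgres
```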

Note that my system had 16 GB of RAM free out of a total of 32 GB, so in theory it should have been able to read the entire file into memory; I do not know why it struggled so much.
Also note: I've since managed to restore the database by manually removing the offending files (and it seems to work again), so while this is not critical for me right now, it would be nice to have a working backup and restore strategy.

The OS that Immich Server is running on

Windows 11 version 23H2

Version of Immich Server

v1.122.2

Version of Immich Mobile App

N/A

Platform with the issue

  • Server
  • Web
  • Mobile

Your docker-compose.yml content

#
# WARNING: Make sure to use the docker-compose.yml of the current release:
#
# https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml
#
# The compose file on main may not be compatible with the latest release.
#

name: immich

services:
  immich-server:
    container_name: immich_server
    image: ghcr.io/immich-app/immich-server:${IMMICH_VERSION:-release}
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
      - /etc/localtime:/etc/localtime:ro
      - [REDACTED]:/mnt/f_drive/Fotos:ro
      - [REDACTED]:/mnt/d_drive/Fotos:ro
    env_file:
      - .env
    ports:
      - 2283:2283
    depends_on:
      - redis
      - database
    restart: always

  immich-machine-learning:
    container_name: immich_machine_learning
    # For hardware acceleration, add one of -[armnn, cuda, openvino] to the image tag.
    # Example tag: ${IMMICH_VERSION:-release}-cuda
    image: ghcr.io/immich-app/immich-machine-learning:${IMMICH_VERSION:-release}
    extends: # uncomment this section for hardware acceleration - see https://immich.app/docs/features/ml-hardware-acceleration
      file: hwaccel.ml.yml
      service: cuda # set to one of [armnn, cuda, openvino, openvino-wsl] for accelerated inference - use the `-wsl` version for WSL2 where applicable
    volumes:
      - model-cache:/cache
    env_file:
      - .env
    restart: always

  redis:
    container_name: immich_redis
    image: registry.hub.docker.com/library/redis:6.2-alpine@sha256:84882e87b54734154586e5f8abd4dce69fe7311315e2fc6d67c29614c8de2672
    restart: always

  database:
    container_name: immich_postgres
    image: registry.hub.docker.com/tensorchord/pgvecto-rs:pg14-v0.2.0@sha256:90724186f0a3517cf6914295b5ab410db9ce23190a2d9d0b9dd6463e3fa298f0
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_USER: ${DB_USERNAME}
      POSTGRES_DB: ${DB_DATABASE_NAME}
      POSTGRES_INITDB_ARGS: '--data-checksums'
    volumes:
      - ${DB_DATA_LOCATION}:/var/lib/postgresql/data
    restart: always
    command: ["postgres", "-c" ,"shared_preload_libraries=vectors.so", "-c", 'search_path="$$user", public, vectors', "-c", "logging_collector=on", "-c", "max_wal_size=2GB", "-c", "shared_buffers=512MB", "-c", "wal_compression=on"]

volumes:
  model-cache:

Your .env content

# You can find documentation for all the supported env variables at https://immich.app/docs/install/environment-variables

# The location where your uploaded files are stored
UPLOAD_LOCATION=./library
# The location where your database files are stored
DB_DATA_LOCATION=./postgres

# The Immich version to use. You can pin this to a specific version like "v1.71.0"
IMMICH_VERSION=release

# Connection secret for postgres. You should change it to a random password
DB_PASSWORD=[REDACTED]

# The values below this line do not need to be changed
###################################################################################
DB_USERNAME=postgres
DB_DATABASE_NAME=immich

Reproduction steps

  1. Have a ~1.4 GB .sql dump of the postgres database.
  2. Follow the restore steps in the documentation using PowerShell, up to `gc "C:\path\to\backup\dump.sql" | docker exec -i immich_postgres psql --username=postgres`.
  3. This step fails with the error message above (System.OutOfMemoryException).

Relevant log output

No response

Additional information

No response

@mmomjian
Contributor

There doesn’t seem to be a simple solution to this: https://www.reddit.com/r/PowerShell/comments/lgepo1/parsing_large_500_mb_text_file/

I am going to write a PR to remove this example because at this point it's broken in multiple ways. Hopefully someone else can come up with a working restore method on Windows.
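One possible Windows-friendly approach (an untested sketch, and not necessarily what the eventual docs change adopted) is to copy the dump into the container and have psql read it from the container's own filesystem, so no data ever passes through the PowerShell pipeline:

```powershell
# Untested sketch: copy the dump into the container, then run psql against
# it in place; --file is a standard psql option.
docker cp "C:\path\to\backup\dump.sql" immich_postgres:/tmp/dump.sql
docker exec immich_postgres psql --username=postgres --file=/tmp/dump.sql
docker exec immich_postgres rm /tmp/dump.sql
```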

@mmomjian
Contributor

Fixed in #14620
