This module offers private media file storage, so user uploads can be protected behind a login.
It uses the Django storage APIs internally, so all form rendering and admin integration work out of the box.
pip install django-private-storage
Add to the settings:
INSTALLED_APPS += (
    'private_storage',
)

PRIVATE_STORAGE_ROOT = '/path/to/private-media/'
PRIVATE_STORAGE_AUTH_FUNCTION = 'private_storage.permissions.allow_staff'
Add to urls.py:
from django.conf.urls import include, url

import private_storage.urls

urlpatterns += [
    url('^private-media/', include(private_storage.urls)),
]
In a Django model, add the PrivateFileField:
from django.db import models
from private_storage.fields import PrivateFileField

class MyModel(models.Model):
    title = models.CharField("Title", max_length=200)
    file = PrivateFileField("File")
The PrivateFileField also accepts the following kwargs (a combined example follows the list):

upload_to: the optional subfolder in the PRIVATE_STORAGE_ROOT.
upload_subfolder: a function that defines the folder; it receives the current model instance.
content_types: the allowed content types.
max_file_size: the maximum file size.
storage: the storage object to use, defaults to private_storage.storage.private_storage.
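For example, several of these kwargs can be combined on one field. The Report model, the subfolder name and the limits below are made-up values, and max_file_size is assumed to be in bytes:

from django.db import models
from private_storage.fields import PrivateFileField

class Report(models.Model):
    title = models.CharField("Title", max_length=200)
    # Restrict uploads to PDF files of at most 5 MiB, stored in a
    # 'reports' subfolder of PRIVATE_STORAGE_ROOT.
    pdf = PrivateFileField(
        "PDF",
        upload_to='reports',
        content_types=['application/pdf'],
        max_file_size=5 * 1024 * 1024,
    )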
The PRIVATE_STORAGE_CLASS setting can be redefined to point to a different storage class.
The default is private_storage.storage.files.PrivateFileSystemStorage, which uses a private media folder that PRIVATE_STORAGE_ROOT points to.
To serve files from an S3 bucket instead, define the following settings:
PRIVATE_STORAGE_CLASS = 'private_storage.storage.s3boto3.PrivateS3BotoStorage'
AWS_PRIVATE_STORAGE_BUCKET_NAME = 'private-files' # bucket name
This uses django-storages settings. Replace the prefix AWS_ with AWS_PRIVATE_.
The following settings are reused when they don't have a corresponding AWS_PRIVATE_... setting:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_S3_URL_PROTOCOL
AWS_S3_REGION_NAME
AWS_IS_GZIPPED
All other settings should be explicitly defined with AWS_PRIVATE_... settings.
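For example, a settings module could look roughly like this; the bucket name, region and credentials are placeholders:

PRIVATE_STORAGE_CLASS = 'private_storage.storage.s3boto3.PrivateS3BotoStorage'

# Reused by the private storage because no AWS_PRIVATE_... override is defined:
AWS_ACCESS_KEY_ID = '...'
AWS_SECRET_ACCESS_KEY = '...'
AWS_S3_REGION_NAME = 'eu-west-1'

# Settings specific to the private bucket use the AWS_PRIVATE_ prefix:
AWS_PRIVATE_STORAGE_BUCKET_NAME = 'private-files'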
By default, all URLs in the admin return the direct S3 bucket URLs, with query-string authentication enabled.
When AWS_PRIVATE_QUERYSTRING_AUTH = False, all file downloads are proxied through the PrivateFileView URL.
This behavior can also be enabled explicitly using PRIVATE_STORAGE_S3_REVERSE_PROXY = True.
To enable encryption, either configure AWS_PRIVATE_S3_ENCRYPTION and AWS_PRIVATE_S3_SIGNATURE_VERSION, or use:
PRIVATE_STORAGE_CLASS = 'private_storage.storage.s3boto3.PrivateEncryptedS3BotoStorage'
Make sure an encryption key is generated on Amazon.
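A rough sketch of the first option, assuming an older django-storages release where AWS_S3_ENCRYPTION is a boolean enabling AES256 server-side encryption (newer releases may expect different values, so check your django-storages documentation):

AWS_PRIVATE_S3_ENCRYPTION = True
AWS_PRIVATE_S3_SIGNATURE_VERSION = 's3v4'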
The PRIVATE_STORAGE_AUTH_FUNCTION setting defines which users may access the files.
By default, this only includes superusers.
The following options are available out of the box:
private_storage.permissions.allow_authenticated
private_storage.permissions.allow_staff
private_storage.permissions.allow_superuser
You can create a custom function, and use that instead.
The function receives a private_storage.models.PrivateFile object, which has the following fields:

request: the Django request.
storage: the storage engine used to retrieve the file.
relative_name: the file name in the storage.
full_path: the full file system path.
exists(): whether the file exists.
content_type: the HTTP content type.
parent_object: only set when PrivateStorageDetailView was used.
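As a sketch, such a function could combine the request with the related object; the module path and the owner field below are made-up examples:

# myapp/permissions.py (hypothetical module)
def allow_owner_or_staff(private_file):
    user = private_file.request.user
    if not user.is_authenticated:
        return False
    if user.is_staff:
        return True
    # parent_object is only available when PrivateStorageDetailView is used.
    parent = getattr(private_file, 'parent_object', None)
    return parent is not None and getattr(parent, 'owner_id', None) == user.pk

Then point the setting to it:

PRIVATE_STORAGE_AUTH_FUNCTION = 'myapp.permissions.allow_owner_or_staff'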
To implement more object-based access permissions, create a custom view that provides the download.
from private_storage.views import PrivateStorageDetailView

from .models import MyModel

class MyDocumentDownloadView(PrivateStorageDetailView):
    model = MyModel
    model_file_field = 'file'

    def get_queryset(self):
        # Make sure only certain objects can be accessed.
        return super().get_queryset().filter(...)

    def can_access_file(self, private_file):
        # When the object can be accessed, the file may be downloaded.
        # This overrides PRIVATE_STORAGE_AUTH_FUNCTION.
        return True
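The view still needs to be exposed in urls.py; a minimal sketch, assuming the object is looked up by primary key (the URL pattern and name are arbitrary):

from django.conf.urls import url

from .views import MyDocumentDownloadView

urlpatterns += [
    url(r'^documents/(?P<pk>\d+)/download/$', MyDocumentDownloadView.as_view(),
        name='document-download'),
]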
Sending large files can be inefficient in some configurations.
In the worst-case scenario, the whole file has to be read in chunks and passed through the WSGI buffers, OS kernel, webserver and proxy server. In effect, the complete file is copied several times through memory buffers.
There are more efficient ways to transfer files, such as the sendfile() system call on UNIX.
Django uses this feature when the WSGI server provides wsgi.file_wrapper support.
In some situations, this optimization is nullified, for example by a local HTTP server sitting in front of the WSGI container. A typical case would be running Gunicorn behind an Nginx or Apache webserver.
In such situations, the native support of the webserver can be enabled with one of the following settings:
PRIVATE_STORAGE_SERVER = 'apache'
PRIVATE_STORAGE_SERVER = 'nginx'
PRIVATE_STORAGE_INTERNAL_URL = '/private-x-accel-redirect/'
For Nginx, add the following location block to the server config:
location /private-x-accel-redirect/ {
    internal;
    alias /path/to/private-media/;
}
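For PRIVATE_STORAGE_SERVER = 'apache', offloading typically relies on mod_xsendfile instead; a rough sketch of the matching Apache configuration (the path is an example, and the exact directives depend on your mod_xsendfile setup):

XSendFile On
XSendFilePath /path/to/private-media/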
The PRIVATE_STORAGE_SERVER setting may also point to a dotted Python class path.
Implement a class with a static serve(private_file) method.
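A minimal sketch of such a class, assuming a front-end server that understands the X-Sendfile header; the module path and class name are made up:

# myapp/fileservers.py (hypothetical module)
from django.http import HttpResponse

class XSendFileServer(object):
    @staticmethod
    def serve(private_file):
        # Return an empty response; the webserver streams the file itself
        # based on the X-Sendfile header.
        response = HttpResponse()
        response['Content-Type'] = private_file.content_type
        response['X-Sendfile'] = private_file.full_path
        return response

It can then be activated with:

PRIVATE_STORAGE_SERVER = 'myapp.fileservers.XSendFileServer'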
The PrivateFileField accepts a storage kwarg, hence you can initialize multiple private_storage.storage.files.PrivateFileSystemStorage objects, each providing files from a different location and base_url.
For example:
from django.db import models
from private_storage.fields import PrivateFileField
from private_storage.storage.files import PrivateFileSystemStorage

my_storage = PrivateFileSystemStorage(
    location='/path/to/storage2/',
    base_url='/private-documents2/'
)

class MyModel(models.Model):
    file = PrivateFileField(storage=my_storage)
Then create a view to serve those files:
from private_storage.views import PrivateStorageView

from .models import my_storage

class MyStorageView(PrivateStorageView):
    storage = my_storage

    def can_access_file(self, private_file):
        # This overrides PRIVATE_STORAGE_AUTH_FUNCTION.
        return self.request.user.is_superuser
And expose that URL:
from django.conf.urls import url
from . import views

urlpatterns += [
    url(r'^private-documents2/(?P<path>.*)$', views.MyStorageView.as_view()),
]
This module is designed to be generic. If there is anything you don't like about it, or you think it's not flexible enough, please let us know. We'd love to improve it!