DATA_UPLOAD_MAX_MEMORY_SIZE in Middleware #61

Open

AnkurBegining opened this issue Mar 4, 2019 · 7 comments
Comments

@AnkurBegining

I don't want to set DATA_UPLOAD_MAX_MEMORY_SIZE in my settings.py, but because of this middleware the response from my Django app now tells me that I have exceeded the maximum memory size while uploading files. I don't want to set any maximum memory size in my app for now. Is there some way to keep using this middleware and still upload a larger file?

@jongoncalves

@AnkurBegining
Do you have any additional details? Version of Django, file size you are attempting to upload, repro steps, etc.?

Also, are you sure this is due to the middleware?

You can always set DATA_UPLOAD_MAX_MEMORY_SIZE to None to disable the check.
Django Docs Reference
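
For example, in settings.py (this disables the check globally, so use it with care):

# settings.py
# None disables Django's total-request-size check entirely.
DATA_UPLOAD_MAX_MEMORY_SIZE = None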

@AnkurBegining

AnkurBegining commented Mar 13, 2019

@jongoncalves
I am using Django 2.0 and the CSV file is 33.00 MB. I tried to debug the middleware and I think the problem is its use of request.body. I don't think setting DATA_UPLOAD_MAX_MEMORY_SIZE to None is a good idea. When I remove this middleware I can upload the larger file, but with the middleware enabled it just stops. Debugging the middleware code pointed me to request.body.

@andreparames

Hi @AnkurBegining @jongoncalves, we're hitting the same problem. The issue is that request.body is actually a property, and accessing it runs a check that verifies whether the full body (including uploaded files, etc.) fits in memory, i.e. whether it is smaller than DATA_UPLOAD_MAX_MEMORY_SIZE. It does that because reading request.body forces Django to load the whole request into memory as a single string.

The check is here: https://github.com/django/django/blob/master/django/http/request.py#L295

Yes, setting DATA_UPLOAD_MAX_MEMORY_SIZE = None will technically stop the error because it disables the check, but that means if one uploads a 2GB file, it will load 2GB as a Python string, which is not good 😄
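
For reference, the shape of that check as a small standalone helper (a sketch paraphrasing the linked source; the real property also reads and caches the body, and the helper name here is just for illustration):

from django.conf import settings


def would_exceed_data_upload_limit(request):
    """Return True if accessing request.body would raise RequestDataTooBig."""
    max_size = settings.DATA_UPLOAD_MAX_MEMORY_SIZE
    if max_size is None:
        # DATA_UPLOAD_MAX_MEMORY_SIZE = None disables the check.
        return False
    try:
        content_length = int(request.META.get('CONTENT_LENGTH') or 0)
    except (ValueError, TypeError):
        return False
    return content_length > max_size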

We're currently trying to find a solution, but any ideas would be appreciated!

@pchiquet

As a workaround, you can disable logging only for your file upload views, thanks to the no_logging decorator.

For example with a django-rest-framework view:

from rest_framework import viewsets
from request_logging.decorators import no_logging

class MyAPIView(viewsets.ModelViewSet):
    serializer_class = ...
    ...

    @no_logging("Disable logger for file upload.")
    def create(self, *args, **kwargs):
        return super().create(*args, **kwargs)

@shc261392

My workaround is to override methods of the LoggingMiddleware class so that request-body logging is skipped for multipart/form-data requests.

from request_logging.middleware import LoggingMiddleware


class CustomLoggingMiddleware(LoggingMiddleware):
    def __call__(self, request):
        content_type = request.META.get('CONTENT_TYPE', '')
        is_multipart = content_type.startswith('multipart/form-data')
        if not is_multipart:
            self.cached_request_body = request.body
        else:
            # Never touch request.body for multipart requests: accessing it
            # triggers the DATA_UPLOAD_MAX_MEMORY_SIZE check and can raise
            # RequestDataTooBig. Reset the cache so a body from a previous
            # request is not logged by mistake.
            self.cached_request_body = None
        response = self.get_response(request)
        self.process_request(request, response)
        self.process_response(request, response)
        return response

    def _log_multipart(self, body, logging_context, log_level):
        # Skip request-body logging for multipart/form-data entirely.
        return

@brandon-clair

brandon-clair commented Sep 28, 2022

@pchiquet Wrote:

As a workaround, you can disable logging only for your file upload views, thanks to the no_logging decorator.

Unfortunately, since the decorator lives on the view and middleware is processed before requests are passed on to views, the RequestDataTooBig exception is raised before the request can even reach the decorator logic.

@shc261392 Wrote:

My workaround is to override methods of the LoggingMiddleware class so that request-body logging is skipped for multipart/form-data requests.

This is a more appropriate workaround. To make it more elegant, we can do a simple Content-Length header check against the current maximum request size limit, rather than entirely disabling multipart logging. That way, multipart/form-data requests that are unlikely to trigger the exception still get logged, while problematic requests are skipped:

# Third Party
from django.conf import settings
from request_logging.middleware import LoggingMiddleware


class SafeLoggingMiddleware(LoggingMiddleware):
    """Custom request_logging middleware to handle edge cases."""

    def __call__(self, request):
        """Override to provide custom logic."""

        # Inspect headers
        content_length = request.META.get('CONTENT_LENGTH') or 0
        content_type = request.META.get('CONTENT_TYPE', '')
        is_multipart = content_type.startswith('multipart/form-data')

        # Do not read potentially large request bodies. Django raises
        # RequestDataTooBig when the declared Content-Length exceeds
        # DATA_UPLOAD_MAX_MEMORY_SIZE, so compare against that setting
        # (a value of None disables Django's check entirely).
        max_upload_size = settings.DATA_UPLOAD_MAX_MEMORY_SIZE
        if is_multipart and max_upload_size is not None and int(content_length) > max_upload_size:
            # Accessing request.body here would trigger that check and raise
            # RequestDataTooBig before any view runs, rendering the no_logging
            # decorator ineffective. Skip caching the body in this case so the
            # middleware passes the request through and the view can validate
            # the file size itself.
            cached_request_body = None
        else:
            cached_request_body = request.body

        # Honor standard processing
        #   https://github.com/Rhumbix/django-request-logging/blob/master/request_logging/middleware.py#L155
        response = self.get_response(request)
        self.process_request(request, response, cached_request_body)
        self.process_response(request, response)

        return response
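
If you go with one of these subclasses, remember to point MIDDLEWARE at it instead of the stock class (the dotted path below is only a placeholder for wherever you define it):

# settings.py
MIDDLEWARE = [
    # ...
    # Replaces 'request_logging.middleware.LoggingMiddleware':
    'myproject.middleware.SafeLoggingMiddleware',  # placeholder path
    # ...
]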

@dolohow

dolohow commented Feb 17, 2023

Attaching the traceback, for completeness:

RequestDataTooBig: Request body exceeded settings.DATA_UPLOAD_MAX_MEMORY_SIZE.
  File "django/core/handlers/exception.py", line 56, in inner
    response = get_response(request)
  File "request_logging/middleware.py", line 159, in __call__
    cached_request_body = request.body
  File "django/http/request.py", line 339, in body
    raise RequestDataTooBig(
django.core.exceptions.RequestDataTooBig: Request body exceeded settings.DATA_UPLOAD_MAX_MEMORY_SIZE.
