LOH fragmentation when user posts a form with section content larger than 85KB #2720

Closed
aspnet-hello opened this issue Jan 2, 2018 · 12 comments
Labels
affected-very-few · area-networking · bug · feature-http-abstractions · Perf · severity-minor · Stress

Comments

@aspnet-hello

From @yuwaMSFT on Friday, March 25, 2016 3:24:48 PM

Posting a multipart form where each section's content exceeds 85KB causes heavy LOH fragmentation in an MVC app.

Repro: post a multipart form with 100 sections, each containing randomly sized content averaging 120KB.
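
For reference, a minimal repro client along these lines could look like the sketch below. It is hypothetical, not from the original report: the endpoint URL and field names are made up, and it assumes a local MVC app that model-binds the posted form.

// Hypothetical repro sketch: posts a multipart form with 100 text fields
// of ~120KB each, so every parsed field value lands on the LOH
// (objects over ~85,000 bytes are allocated there).
using System;
using System.Net.Http;
using System.Threading.Tasks;

class LohRepro
{
    static async Task Main()
    {
        var random = new Random();
        using var client = new HttpClient();
        using var form = new MultipartFormDataContent();

        for (var i = 0; i < 100; i++)
        {
            // Random size between 85KB and 155KB, averaging ~120KB.
            var size = 85_000 + random.Next(70_000);
            form.Add(new StringContent(new string('x', size)), $"field{i}");
        }

        var response = await client.PostAsync("http://localhost:5000/form", form);
        Console.WriteLine(response.StatusCode);
    }
}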

After the first request:

         segment             begin         allocated              size
0000021c56c50000  0000021c56c51000  0000021c582fa2a0  0x16a92a0(23761568)
Total Size:              Size: 0x1f0bcb8 (32554168) bytes.
Statistics:
              MT    Count    TotalSize Class Name
0000021c46a5d8e0      107         3204 Free
00007ffbbe854418        9       146136 System.Object[]
00007ffbbe84e210       16      2097536 System.Byte[]
00007ffbbe862498       81     21514154 System.String

After the 2nd request:

Large object heap starts at 0x0000021c56c51000
         segment             begin         allocated              size
0000021c56c50000  0000021c56c51000  0000021c5983f700  0x2bee700(46065408)
Statistics:
              MT    Count    TotalSize Class Name
0000021c46a5d8e0      194         5814 Free
00007ffbbe854418        9       146136 System.Object[]
00007ffbbe84e210       22      2884112 System.Byte[]
00007ffbbe862498      162     43028308 System.String

After the 3rd request:

Large object heap starts at 0x0000021c56c51000
         segment             begin         allocated              size
0000021c56c50000  0000021c56c51000  0000021c5983f700  0x2bee700(46065408)
Total Size:              Size: 0x362f488 (56816776) bytes.
Statistics:
              MT    Count    TotalSize Class Name
0000021c46a5d8e0      276         8274 Free
00007ffbbe854418        9       146136 System.Object[]
00007ffbbe84e210       23      3015208 System.Byte[]
00007ffbbe862498      243     64542462 System.String

So each request adds one large string to the LOH per section (field) of the form. Also note the "Free" entries: they are mostly 30-byte blocks in the LOH, one accompanying each large string. It may be related to string interning.

As for the string objects: we are reading each section (field) of the form into a single string:

else if (HasFormDataContentDisposition(contentDisposition))
{
    // Content-Disposition: form-data; name="key"
    //
    // value

    var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name);
    MediaTypeHeaderValue mediaType;
    MediaTypeHeaderValue.TryParse(section.ContentType, out mediaType);
    var encoding = FilterEncoding(mediaType?.Encoding);
    using (var reader = new StreamReader(section.Body, encoding, detectEncodingFromByteOrderMarks: true, bufferSize: 1024, leaveOpen: true))
    {
        // Buffers the entire section into a single string.
        var value = await reader.ReadToEndAsync();
        formAccumulator.Append(key, value);
    }
}

This can easily be exploited to drive the server out of memory, so it is a security concern as well.

Copied from original issue: aspnet/HttpAbstractions#597

@aspnet-hello aspnet-hello added this to the Backlog milestone Jan 2, 2018
@aspnet-hello aspnet-hello added 0 - Backlog bug This issue describes a behavior which is not expected - a bug. feature-http-abstractions Stress labels Jan 2, 2018
@aspnet-hello

From @yuwaMSFT on Friday, March 25, 2016 3:26:42 PM

@sajayantony @sivagms @halter73 @Tratcher @blowdart

@aspnet-hello

From @sajayantony on Friday, March 25, 2016 3:27:48 PM

@yuwaMSFT Does the LOH fragmentation cause OOM or crash?

@aspnet-hello

From @Tratcher on Friday, March 25, 2016 3:33:59 PM

Related: aspnet/HttpAbstractions#583

@aspnet-hello

From @yuwaMSFT on Friday, March 25, 2016 3:35:02 PM

It causes memory to accumulate. At some point it will cause an OOM (the app would crash on a memory-allocation exception, or be killed by the OOM killer before that happens).

The LOH looks like this:
0000021c5a90d830 0000021c46a5d8e0       30 Free
0000021c5a90d850 00007ffbbe862498   353606     
0000021c5a963d98 0000021c46a5d8e0       30 Free
0000021c5a963db8 00007ffbbe862498   289318     
0000021c5a9aa7e0 0000021c46a5d8e0       30 Free
0000021c5a9aa800 00007ffbbe862498   474146     
0000021c5aa1e428 0000021c46a5d8e0       30 Free
0000021c5aa1e448 00007ffbbe862498   345570     
0000021c5aa72a30 0000021c46a5d8e0       30 Free
0000021c5aa72a50 00007ffbbe862498   297354     
0000021c5aabb3e0 0000021c46a5d8e0       30 Free
0000021c5aabb400 00007ffbbe862498   208958     
0000021c5aaee440 0000021c46a5d8e0       30 Free
0000021c5aaee460 00007ffbbe862498   112526     
0000021c5ab09bf0 0000021c46a5d8e0       30 Free
0000021c5ab09c10 00007ffbbe862498   305390     
0000021c5ab54500 0000021c46a5d8e0       30 Free
0000021c5ab54520 00007ffbbe862498   249138     
0000021c5ab91258 0000021c46a5d8e0       30 Free
0000021c5ab91278 00007ffbbe862498   241102     
0000021c5abcc048 0000021c46a5d8e0       30 Free
0000021c5abcc068 00007ffbbe862498   305390     
0000021c5ac16958 0000021c46a5d8e0       30 Free
0000021c5ac16978 00007ffbbe862498   216994     
0000021c5ac4b920 0000021c46a5d8e0       30 Free
0000021c5ac4b940 00007ffbbe862498   233066     
0000021c5ac847b0 0000021c46a5d8e0       30 Free
0000021c5ac847d0 00007ffbbe862498   136634     
0000021c5aca5d90 0000021c46a5d8e0       30 Free
0000021c5aca5db0 00007ffbbe862498   257174  

@aspnet-hello

From @yuwaMSFT on Friday, March 25, 2016 3:39:14 PM

Related to #583, but a different issue. This one is easier to exploit, which makes it a security concern.

@aspnet-hello

From @blowdart on Saturday, March 26, 2016 5:29:12 AM

Ouchie :( And I think you're right, this is certainly a potential DoS concern (as is 583)

@aspnet-hello

From @muratg on Tuesday, April 5, 2016 2:49:05 PM

@blowdart do we need to fix this?

@aspnet-hello

From @blowdart on Tuesday, April 5, 2016 2:50:12 PM

Yes. If it's spiking memory so badly that the app pool will recycle, then it's a DoS.

@aspnet-hello

From @muratg on Wednesday, July 6, 2016 9:26:31 AM

@yuwaMSFT Do you know if this is still an issue?

@aspnet-hello

From @yuwaMSFT on Wednesday, July 6, 2016 10:53:32 AM

I checked the latest code. It seems to still have the same issue for multipart form data content.
We seem to apply KeyLengthLimit and ValueLengthLimit (which is 4MB anyway) only to URL-encoded forms, not to multipart forms. The only guard for multipart forms is MultipartBodyLengthLimit, which defaults to 128MB. So I believe the same issue still exists.
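
For illustration, these limits live on FormOptions. A minimal sketch of tightening them might look like the following; the values are illustrative, and which limits actually guard multipart forms has varied across versions, as discussed above.

// Sketch: tightening form limits via FormOptions (values are illustrative).
// Per the comment above, at the time KeyLengthLimit/ValueLengthLimit only
// guarded URL-encoded forms, leaving MultipartBodyLengthLimit (default
// 128MB) as the only multipart guard.
using Microsoft.AspNetCore.Http.Features;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.Configure<FormOptions>(options =>
        {
            options.KeyLengthLimit = 2048;
            options.ValueLengthLimit = 4 * 1024 * 1024;           // default 4MB
            options.MultipartBodyLengthLimit = 16 * 1024 * 1024;  // default 128MB
        });
        services.AddMvc();
    }
}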

@Tratcher

I don't think the object model gives us any choice here; we're supposed to render form fields as strings. There's no way to split up a string to keep it under the LOH limit.

Yes, these LOH strings can cause a memory spike, but they should still be collected by the GC before an actual OOM. Has anyone here actually experienced an OOM?

The MultipartBodyLengthLimit (128 MB) and the server's request body size limit (30 MB) can both be used to mitigate this if you're not expecting large bodies. The alternative is to read the form directly using the MultipartReader.
https://docs.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-5.0#upload-large-files-with-streaming
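
For completeness, the streaming approach from that doc looks roughly like the sketch below. The method name and the temp-file sink are illustrative; MultipartReader and ReadNextSectionAsync are the Microsoft.AspNetCore.WebUtilities API.

// Sketch: reading a multipart body section-by-section with MultipartReader,
// streaming each section to a sink instead of buffering it into one large
// string, so no single >85KB object is ever allocated.
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

public static class StreamingFormReader
{
    public static async Task ReadFormAsync(HttpRequest request)
    {
        var boundary = HeaderUtilities.RemoveQuotes(
            MediaTypeHeaderValue.Parse(request.ContentType).Boundary).Value;
        var reader = new MultipartReader(boundary, request.Body);

        MultipartSection section;
        while ((section = await reader.ReadNextSectionAsync()) != null)
        {
            // Copy the section body to a temp file in small chunks.
            using var target = File.Create(Path.GetTempFileName());
            await section.Body.CopyToAsync(target);
        }
    }
}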

I recommend closing this.

@Tratcher Tratcher added affected-very-few This issue impacts very few customers severity-minor This label is used by an internal tool labels Nov 11, 2020 — with ASP.NET Core Issue Ranking
@davidfowl davidfowl removed this from the Backlog milestone Mar 28, 2021
@davidfowl davidfowl added the Perf label Mar 28, 2021
@davidfowl

We should make sure we document that the existing API is unsuitable for large file uploads (I believe we do call that out a little).

@ghost ghost locked as resolved and limited conversation to collaborators Apr 27, 2021
@amcasey amcasey added area-networking Includes servers, yarp, json patch, bedrock, websockets, http client factory, and http abstractions and removed area-runtime labels Jun 2, 2023