
Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Unexpected end of request content. #23949


Closed
Tino92 opened this issue Jul 15, 2020 · 89 comments · Fixed by #60359
Labels
affected-most This issue impacts most of the customers area-mvc Includes: MVC, Actions and Controllers, Localization, CORS, most templates enhancement This issue represents an ask for new feature or an enhancement to an existing one feature-mvc-execution-pipeline Features related to how MVC executes a controller or razor page investigate severity-nice-to-have This label is used by an internal tool

@Tino92

Tino92 commented Jul 15, 2020

We are intermittently facing `BadHttpRequestException: Unexpected end of request content`.

Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException:
   at Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException.Throw (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.Http1ContentLengthMessageBody+<ReadAsyncInternal>d__9.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream+<ReadAsyncInternal>d__30.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.WebUtilities.FileBufferingReadStream+<ReadAsync>d__36.MoveNext (Microsoft.AspNetCore.WebUtilities, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.WebUtilities.StreamHelperExtensions+<DrainAsync>d__3.MoveNext (Microsoft.AspNetCore.WebUtilities, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Formatters.NewtonsoftJsonInputFormatter+<ReadRequestBodyAsync>d__13.MoveNext (Microsoft.AspNetCore.Mvc.NewtonsoftJson, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder+<BindModelAsync>d__7.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder+<BindModelAsync>d__7.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider+<>c__DisplayClass0_0+<<CreateBinderDelegate>g__Bind|0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker+<<InvokeInnerFilterAsync>g__Awaited|13_0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker+<<InvokeNextResourceFilter>g__Awaited|24_0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Rethrow (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Next (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.InvokeFilterPipelineAsync (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker+<<InvokeAsync>g__Logged|17_1>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Routing.EndpointMiddleware+<<Invoke>g__AwaitRequestTask|6_0>d.MoveNext (Microsoft.AspNetCore.Routing, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware+<Invoke>d__5.MoveNext (Microsoft.AspNetCore.Authorization.Policy, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware+<Invoke>d__6.MoveNext (Microsoft.AspNetCore.Authentication, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpProtocol+<ProcessRequests>d__214`1.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)

We are running on .NET Core 3.1.5; this exception only seemed to appear after we moved over to .NET Core 3.0.

There have been similar issues opened in the past: #19476 (comment) and #6575.

This exception seems to be thrown when a client aborts mid-request.
My question is, should this be logged as a warning instead of an exception? It creates a lot of noise in our logs.

@Tratcher
Member

The exception needs to be thrown from the request body APIs so you don't think the request finished gracefully, but the caller should catch it. In this case that's MVC, and I think they're adding something in 5.0.
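For code that reads the request body directly rather than through MVC model binding, catching the exception at the call site might look like the sketch below (the endpoint name and shape are illustrative, not from this thread; in .NET 6+ the catchable base type is `Microsoft.AspNetCore.Http.BadHttpRequestException`, while in 3.1 it lives under `Microsoft.AspNetCore.Server.Kestrel.Core`):

```csharp
// Sketch: treat "Unexpected end of request content" as a client abort
// instead of letting it bubble up as an unhandled application error.
app.MapPost("/upload", async (HttpContext context) =>
{
    try
    {
        using var reader = new StreamReader(context.Request.Body);
        var body = await reader.ReadToEndAsync();
        return Results.Ok(new { length = body.Length });
    }
    catch (BadHttpRequestException)
    {
        // The client aborted mid-request; the body is incomplete and
        // there is usually no one left to receive a response.
        return Results.StatusCode(StatusCodes.Status400BadRequest);
    }
});
```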

@Tratcher Tratcher added area-mvc Includes: MVC, Actions and Controllers, Localization, CORS, most templates and removed area-servers feature-kestrel labels Jul 15, 2020
@mkArtakMSFT mkArtakMSFT added this to the Next sprint planning milestone Jul 15, 2020
@mkArtakMSFT
Contributor

Thanks for contacting us.
We're moving this issue to the Next sprint planning milestone for future evaluation/consideration. We will evaluate the request when we plan the work for the next milestone. To learn more about what to expect next and how this issue will be handled, you can read more about our triage process here.

@ghost

ghost commented Jul 21, 2020

We've moved this issue to the Backlog milestone. This means that it is not going to be worked on in the coming release. We will reassess the backlog following the current release and consider this item at that time. To learn more about our issue management process and to set better expectations regarding different types of issues, you can read about our Triage Process.

@Tino92
Author

Tino92 commented Jul 22, 2020

Do you recommend just filtering out the exceptions?

@Tratcher
Member

Yes

@SteveSandersonMS SteveSandersonMS added affected-most This issue impacts most of the customers bug This issue describes a behavior which is not expected - a bug. severity-nice-to-have This label is used by an internal tool enhancement This issue represents an ask for new feature or an enhancement to an existing one labels Oct 6, 2020 — with ASP.NET Core Issue Ranking
@SteveSandersonMS SteveSandersonMS removed the bug This issue describes a behavior which is not expected - a bug. label Oct 6, 2020
@sunliusi

The issue is still present in 5.0; version 2.2 behaves normally.

@sunliusi

Can this error be ignored globally?

@mynkow

mynkow commented Mar 4, 2021

Can this error be ignored globally?

@javiercn javiercn added the feature-mvc-execution-pipeline Features related to how MVC executes a controller or razor page label Apr 18, 2021
@pccai

pccai commented Jun 16, 2021

The issue is still here in 5.0.7

@xsoheilalizadeh

We are facing the same issue on .NET 6 Preview 7:

fail: Microsoft.AspNetCore.Server.Kestrel[13]
Connection id "0HMBJDQ4IBONH", Request id "0HMBJDQ4IBONH:00000002": An unhandled exception was thrown by the application.
Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Unexpected end of request content.
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.Http1ContentLengthMessageBody.ReadAsyncInternal(CancellationToken cancellationToken) in Microsoft.AspNetCore.Server.Kestrel.Core.dll:token 0x60009ee+0x1a2
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream.ReadAsyncInternal(Memory`1 destination, CancellationToken cancellationToken) in Microsoft.AspNetCore.Server.Kestrel.Core.dll:token 0x6000e12+0x77
at System.Text.Json.JsonSerializer.ReadFromStreamAsync(Stream utf8Json, ReadBufferState bufferState, CancellationToken cancellationToken) in System.Text.Json.dll:token 0x60003a8+0x95
at System.Text.Json.JsonSerializer.ReadAllAsync[TValue](Stream utf8Json, JsonTypeInfo jsonTypeInfo, CancellationToken cancellationToken) in System.Text.Json.dll:token 0x60003a6+0xfa
at Microsoft.AspNetCore.Mvc.Formatters.SystemTextJsonInputFormatter.ReadRequestBodyAsync(InputFormatterContext context, Encoding encoding) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x6000b0e+0x103
at Microsoft.AspNetCore.Mvc.Formatters.SystemTextJsonInputFormatter.ReadRequestBodyAsync(InputFormatterContext context, Encoding encoding) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x6000b0e+0x26a
at Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext bindingContext) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x60007fc+0x1e8
at Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext actionContext, IModelBinder modelBinder, IValueProvider valueProvider, ParameterDescriptor parameter, ModelMetadata metadata, Object value, Object container) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x600066e+0x220
at Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.<<CreateBinderDelegate>g__Bind|0>d.MoveNext() in Microsoft.AspNetCore.Mvc.Core.dll:token 0x60010f6+0x16d
--- End of stack trace from previous location ---
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeInnerFilterAsync>g__Awaited|13_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x60009c2+0x65
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeFilterPipelineAsync>g__Awaited|20_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x6000a81+0x65
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x6000a7d+0x77
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope) in Microsoft.AspNetCore.Mvc.Core.dll:token 0x6000a7d+0xfb
at Microsoft.AspNetCore.Routing.EndpointMiddleware.<Invoke>g__AwaitRequestTask|6_0(Endpoint endpoint, Task requestTask, ILogger logger) in Microsoft.AspNetCore.Routing.dll:token 0x60000ab+0x5e
at Microsoft.AspNetCore.Authorization.Policy.AuthorizationMiddlewareResultHandler.HandleAsync(RequestDelegate next, HttpContext context, AuthorizationPolicy policy, PolicyAuthorizationResult authorizeResult) in Microsoft.AspNetCore.Authorization.Policy.dll:token 0x600001b+0x338
at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context) in Microsoft.AspNetCore.Authorization.Policy.dll:token 0x6000013+0x448
at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context) in Microsoft.AspNetCore.Authentication.dll:token 0x6000049+0x401
at Microsoft.AspNetCore.Routing.EndpointRoutingMiddleware.<Invoke>g__AwaitMatcher|8_0(EndpointRoutingMiddleware middleware, HttpContext httpContext, Task`1 matcherTask) in Microsoft.AspNetCore.Routing.dll:token 0x60000b9+0x130
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpProtocol.ProcessRequests[TContext](IHttpApplication`1 application) in Microsoft.AspNetCore.Server.Kestrel.Core.dll:token 0x6000d6b+0x54c

@senj

senj commented Sep 27, 2021

How can I filter this exception in the logs without ignoring other error messages from Kestrel?

@WeihanLi
Contributor

How can I filter this exception in the logs without ignoring other error messages from Kestrel?

I added a custom exception handler to ignore the BadHttpRequestException.

@supermihi

@WeihanLi could you share a code snippet for filtering that exception?
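@WeihanLi did not post his handler, but one possible shape for such a filter is a small piece of middleware that swallows only this specific exception. A sketch under my own assumptions (not his actual code; the `Use` overload taking a `RequestDelegate` is .NET 6+, and matching on the message string is brittle but there is no more specific exception type to catch):

```csharp
// Sketch: swallow only "Unexpected end of request content" so other
// Kestrel errors still surface. Register early in the pipeline.
app.Use(async (context, next) =>
{
    try
    {
        await next(context);
    }
    catch (BadHttpRequestException ex)
        when (ex.Message.Contains("Unexpected end of request content"))
    {
        // Client disconnected mid-request; abort quietly instead of
        // letting this surface as an application error. Optionally log
        // at Debug level here.
        context.Abort();
    }
});
```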

@pranavkm pranavkm added old-area-web-frameworks-do-not-use *DEPRECATED* This label is deprecated in favor of the area-mvc and area-minimal labels and removed area-mvc Includes: MVC, Actions and Controllers, Localization, CORS, most templates labels Oct 19, 2021
@AnisTigrini

AnisTigrini commented Oct 1, 2024

This is still happening. We are using .NET 8 and running on k8s on Azure. As soon as we put the pod under a little stress with multiple concurrent connections, the exception is thrown.

@michaelmarcuccio

michaelmarcuccio commented Oct 1, 2024

@AnisTigrini are you using app insights profiler?

services.AddServiceProfiler

After I disabled this, it fixed this issue for me. This is the info that got me there:
https://stackoverflow.com/questions/77855606/should-we-enable-azure-application-insights-profiler-in-production

@AnisTigrini

Hey there @michaelmarcuccio, thanks for the quick reply.
We actually do not have the App Insights profiler enabled on Azure,
so it makes me think it might be the number of concurrent connections instead.

@AnisTigrini

By the way, for anyone who has hit the same problem, here is what I found. We are running a pod with .NET 8 using the official Microsoft image mcr.microsoft.com/dotnet/sdk:8.0.
The pod is a .NET REST API that makes a lot of HTTP calls to other services; it is essentially a proxy, to be honest.

I tried contacting API endpoints that do not make any HTTP calls, and the server returned a response immediately, so I knew the problem occurred when the server was making HTTP calls to other services.

I took a closer look by downloading netstat into the pod and realized the problem was socket starvation.
In short, we were using a deprecated API that .NET recommends avoiding (WebClient).

The solution for us was to migrate and replace all those calls with the recommended API (HttpClient).
Also, avoid creating a new HttpClient instance per request, as that can itself cause socket starvation; it is meant to be used as a singleton or via HttpClientFactory with DI.

I hope this helps some of you dealing with this issue.
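As a sketch of the migration described above (the client name, base address, and endpoint are illustrative assumptions, not from the poster's code): instead of constructing `WebClient`/`HttpClient` per request, register a named client once and let `IHttpClientFactory` manage the pooled connections:

```csharp
// Sketch: IHttpClientFactory reuses pooled handlers, avoiding the
// socket starvation caused by creating a new client per request.
builder.Services.AddHttpClient("upstream", client =>
{
    client.BaseAddress = new Uri("https://upstream.example.com/");
    client.Timeout = TimeSpan.FromSeconds(10);
});

app.MapGet("/proxy/{path}", async (string path, IHttpClientFactory factory) =>
{
    var client = factory.CreateClient("upstream");
    return Results.Text(await client.GetStringAsync(path));
});
```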

@adityamandaleeka
Member

Hey everyone, this issue has a lot of responses, but from a quick read through these comments, it looks like there are lots of ways people end up hitting this exception. This makes sense, since this is a pretty generic exception that can occur in many cases (client side aborts, network issues, timeouts/disconnects under load, errors in proxies/LBs/etc.).

Unfortunately, it also means that this issue isn't really actionable for us unless we have a specific case (with a repro) that we can investigate and address.

@macias

macias commented Jan 24, 2025

It is actionable right away, and it has been from the beginning. If you throw a "pretty generic exception", then it is pure common sense to include the actual reason for the problem in the message, so that we (end developers) at least understand what is going on.

Reproducing is tricky if it happens, say, once in 10,000 requests, but when it does hit, at least we are not left in the dark.

@adityamandaleeka
Member

include the reason of the problem (as the message), so we (end-developers) at least understand what is going on.

Can you describe what you're looking for here? The reason for the BadHttpRequestException is the unexpected end of request content, which is what it already says. There are other forms of logging you can enable to investigate further (e.g. debug/trace-level logging, client logs if available, etc.), but it sounds like you are looking for something else to show up in the exception message.

@macias

macias commented Jan 25, 2025

@adityamandaleeka "Can you describe what you're looking for here?" I don't know because I am not developer of this part/framework. But since you mention few possibilities, I can only rely on those, and thus I would expect message: "client side aborted connection", or "connection timeout", or "proxy error".

I am especially interested in core/root error message because I run my server locally, with no proxies, so basically none of so far mentioned reasons you mentioned yesterday apply to me.

@taylaninan

@adityamandaleeka If you want, I can provide the source code of a simple API (approx. 150-200 lines) for inspection, which causes this specific error under load (stress tests) when tested with Autocannon, a NodeJS + npm-packaged JavaScript tool.

I don't want to open source all of the code I would provide, so I'm not adding it here as a ZIP file or GitHub repo.

taylaninan ['@'] yahoo.com is my email address. Just write me an email, and I'll attach the ZIP file and send some instructions by email, too.

@taylaninan

@adityamandaleeka Five days have passed since my last offer of help, but unfortunately you have not contacted me for the sample project to reproduce the bug so that it can be fixed. This bug has been open for 4.5 years. Does Microsoft even care about developers? Or, considering the situation with Windows 11's "problematic" updates, does MS care about its end users?

I mentioned this bug 6-7 months ago, and in the meantime I have learned Java, and you know what? I'm not looking back. MS is simply losing developers and end users because of the lack of support, which will eventually cost MS money.

I'm avoiding .NET Core like the plague. In my projects I'm using either the "outdated" .NET Framework or Java, but no .NET Core, because of the bugs and lack of support.

Developers and end users are starting to get "angry" with MS.
Before spending 80 billion dollars on AI, first fix Windows 11, then fix .NET Core and your development platforms.

@adityamandaleeka
Member

adityamandaleeka commented Jan 31, 2025

@taylaninan Sorry about the delay; this is on my list but I am juggling lots of things here and doing my best.

Unfortunately I can't accept a zip repro, but if you can describe your scenario or create a minimal repro (which doesn't include any of your private code that you don't want to OSS) and put it on a public repo, we can investigate. However, I want to be clear that it is not likely that addressing that particular case will "solve" this issue for everyone else. From the server's perspective, when this happens, we can only know that there was an unexpected end of request content, not what other parts of the system can or should do about it.

@SteveSmith16384

Does anyone know what other servers, like Apache, do under these circumstances? Or proxy servers like Nginx?

@nefarius

This is such an incredibly weird one. The very first time I saw this happen was when a colleague of mine living in the USA (I am in Central Europe; the server in question is hosted in Germany) started using a POST endpoint that I have used many times before without issue. When he uses it, this exception pops up fairly frequently, and for now we use Polly to just retry until the upload succeeds. It's a bit cursed, TBH.

@taylaninan

@adityamandaleeka
I have created a public GitHub repo for you with a very simplified version of the code:
https://github.com/taylaninan/WebAPICore8-BadHttpRequest

Just concentrate on the file "Program.cs", which is only 129 lines long in total.
The repo has a folder named "Autocannon" for load-testing and reproducing the "BadHttpRequest" exception.

Just open the project in Visual Studio 2022 and hit "Run" to start the Kestrel server.

For Autocannon to work, you must have Node.js and npm installed on your computer.
Then install Autocannon with "npm install autocannon".
After that, start the Autocannon test script ("autocannon-tests.bat").
You will see a lot of "BadHttpRequest" errors on the console/terminal.

If you have any questions, don't hesitate to ask me for help.
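The batch script's exact contents aren't shown here, but a typical Autocannon invocation along these lines would reproduce the setup discussed below (the connection count, duration, port, and request body are my assumptions, not taken from the repo):

```shell
npm install -g autocannon
# Hammer the endpoint with 400 concurrent connections for 60 seconds.
# In-flight requests aborted when the run ends surface on the server
# as "Unexpected end of request content".
autocannon -c 400 -d 60 -m POST -b "{}" \
  -H "content-type: application/json" http://localhost:5000/
```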

@adityamandaleeka
Member

Thank you @taylaninan. That does indeed reproduce this exception. The high volume of concurrent connections seems like a key ingredient here and suggests that resource exhaustion of some sort is the issue.

@adityamandaleeka
Member

Actually, it looks like these always appear at the end... I wonder if these are just partial requests that the client aborts when it needs to stop (in your case after 60 seconds).

@nenadcinober

Actually, it looks like these always appear at the end... I wonder if these are just partial requests that the client aborts when it needs to stop (in your case after 60 seconds).

In that case, this sounds like some kind of expected behaviour. If that is the case, does it make sense to reduce the log level of this to Warning or Info? It is not an application error.

@adityamandaleeka
Member

I did a wireshark capture to confirm my theory.

[Wireshark capture screenshot]

In this scenario, I modified your sample to use 400 concurrent connections for a 10 second warmup period, then back off to 40 concurrent connections for the actual test.

We can see that there are a series of RST, ACK packets right around the 10 second mark, which indicate that the client has forcefully closed the connection. The Win=0 thing means the client isn't willing to receive any more data at this point.

Notably, after this set of BadHttpRequestExceptions, the rest of the test (the post-warmup period from the autocannon perspective) went fine, indicating there are no lasting effects on the Kestrel side.

@adityamandaleeka
Member

@nenadcinober I agree, that makes sense. Let me consider that a bit more. Maybe that's the best outcome from this.

@sergey-litvinov
Contributor

sergey-litvinov commented Feb 4, 2025

It might be a breaking change, but could this logically be an OperationCanceledException? When a client sends a request that takes some time and then just disconnects, an OperationCanceledException is thrown (at least if we honor the CancellationToken for that request).
Maybe this specific case should follow the same behavior? The difference is that the client's request was cancelled during the initial request and header parsing rather than while waiting for the response body. Both cases mean the client cancelled its own request (because the response took too long, or because of thread starvation: there were no free threads to handle the incoming request and the server couldn't scale fast enough to handle the incoming wall of requests).

@nenadcinober

@nenadcinober I agree, that makes sense. Let me consider that a bit more. Maybe that's the best outcome from this.

@adityamandaleeka It seems that this issue was introduced long ago in this PR:
#16725
with this change: e3b971a#diff-6724c4b674672fff80410c379bdc2b86ee5e6b797ef797cbf667a80877113d7cR94

I feel we should replace "ThrowUnexpectedEndOfRequestContent()" with just "break;" on this line, as the client has canceled the request, so we can break gracefully.

@adityamandaleeka
Member

I've just opened a PR (#60359) to solve this issue in a safe way that's fairly straightforward. Basically, we can skip logging these bad requests as an application error while continuing to allow people who want to see them to opt into bad request logging.
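For those who do want to keep seeing these events, Kestrel's bad-request diagnostics can be surfaced via a log filter. A sketch (the category name is my assumption; verify it against your ASP.NET Core version):

```csharp
// Sketch: bad-request details are logged at Debug level under a
// dedicated Kestrel category; raise its visibility with a filter.
builder.Logging.AddFilter(
    "Microsoft.AspNetCore.Server.Kestrel.BadRequests", LogLevel.Debug);
```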

@taylaninan

@adityamandaleeka
Very good indeed! Thanks for the solution...

1 down, 3430 issues to go...
Keep up the good work!

@lindsay-duncan-tylertech

Will this solution be available in Dotnet 8?

@adityamandaleeka
Member

@lindsay-duncan-tylertech We usually backport fixes based on severity and impact. In this case, the log message is unexpected, but it doesn’t actually cause any issues for your application. Plus, this behavior has been around for years. Given that, while I'm glad we've improved it, I don’t think there’s a strong enough reason to backport this one.

@danmoseley
Member

@lindsay-duncan-tylertech perhaps you or others here would be willing to try out the next .NET 10 preview to confirm the fix.

@irfankhanteo

I am facing the same issue. I am using dotnet 8.

Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Reading the request body timed out due to data arriving too slowly. See MinRequestBodyDataRate.
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.Http1ContentLengthMessageBody.ReadAsyncInternal(CancellationToken cancellationToken)
   at System.Runtime.CompilerServices.PoolingAsyncValueTaskMethodBuilder`1.StateMachineBox`1.System.Threading.Tasks.Sources.IValueTaskSource.GetResult(Int16 token)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream.ReadAsyncInternal(Memory`1 destination, CancellationToken cancellationToken)
   at System.Runtime.CompilerServices.PoolingAsyncValueTaskMethodBuilder`1.StateMachineBox`1.System.Threading.Tasks.Sources.IValueTaskSource.GetResult(Int16 token)
   at Microsoft.AspNetCore.WebUtilities.FileBufferingReadStream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
   at

@adityamandaleeka
Member

adityamandaleeka commented Mar 17, 2025

@irfankhanteo Yours is a different issue than the client disconnects we discussed above. As the exception message suggests, your request body is arriving too slowly so it timed out. This doc will help you: https://learn.microsoft.com/dotnet/api/microsoft.aspnetcore.server.kestrel.core.kestrelserverlimits.minrequestbodydatarate
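For that case, the relevant knob is `KestrelServerLimits.MinRequestBodyDataRate`, which defaults to 240 bytes/second with a 5-second grace period. A sketch of relaxing it for slow clients (the numbers here are illustrative, not recommendations):

```csharp
builder.WebHost.ConfigureKestrel(options =>
{
    // Tolerate very slow request bodies: 100 bytes/s, enforced only
    // after a 30-second grace period.
    options.Limits.MinRequestBodyDataRate =
        new MinDataRate(bytesPerSecond: 100, gracePeriod: TimeSpan.FromSeconds(30));

    // Or disable the limit entirely (not generally recommended):
    // options.Limits.MinRequestBodyDataRate = null;
});
```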
