Description
The following functions work as you would expect in the in-process model:
using System.Threading.Tasks;
using System.Net.Http;
using System.Net;
using System.Collections.Generic;
using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

namespace MyStreamExamples;

public class StreamExamples
{
    private static HttpClient Client { get; } = new HttpClient();

    [FunctionName("StreamExample1")]
    public async Task<HttpResponseMessage> Run1([HttpTrigger(AuthorizationLevel.Anonymous)] HttpRequest req)
    {
        // Get a stream to some large-ish resource
        var stream = await Client.GetStreamAsync("https://git.kernel.org/torvalds/t/linux-6.2-rc7.tar.gz");

        // This immediately starts streaming the upstream content to the client without having to buffer the entire file first
        return new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StreamContent(stream),
        };
    }

    [FunctionName("StreamExample2")]
    public async Task<IActionResult> Run2([HttpTrigger(AuthorizationLevel.Anonymous)] HttpRequest req)
    {
        // Manipulate the response directly
        var response = req.HttpContext.Response;
        response.StatusCode = 200;
        response.ContentType = "text/plain";

        await using var sw = new StreamWriter(response.Body);
        await foreach (var msg in GetDataAsync())
        {
            // This streams a line to the client every 200 ms
            await sw.WriteLineAsync(msg);
            await sw.FlushAsync();
        }

        // Required to avoid the host trying to add headers to the response
        return new EmptyResult();
    }

    private async IAsyncEnumerable<string> GetDataAsync()
    {
        for (var i = 0; i < 50; ++i)
        {
            await Task.Delay(200);
            yield return $"Hello World no. {i+1}!";
        }
    }
}
Neither of these scenarios appears to be possible in the out-of-process model, as the HttpRequestMessage / HttpResponseMessage instances are handled at the host, which marshals HttpRequestData and HttpResponseData to and from the workers as gRPC messages.
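For illustration, here is a minimal sketch of the closest equivalent to StreamExample2 that I can write in the isolated worker model today (the class/function names and the loop are mine, mirroring the in-process sample above). Everything written to HttpResponseData ends up buffered in the worker; nothing reaches the client until the function returns and the complete response is marshalled back to the host:

using System.Net;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class IsolatedStreamExample
{
    [Function("StreamExample2Isolated")]
    public async Task<HttpResponseData> Run([HttpTrigger(AuthorizationLevel.Anonymous)] HttpRequestData req)
    {
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.Headers.Add("Content-Type", "text/plain");

        for (var i = 0; i < 50; ++i)
        {
            await Task.Delay(200);
            // Accumulates in the in-memory response body; the client sees nothing yet
            await response.WriteStringAsync($"Hello World no. {i + 1}!\n");
        }

        // Only now is the buffered HttpResponseData handed back to the host as a gRPC message
        return response;
    }
}

In other words, the client does not receive a line every 200 ms; it gets the entire body in one go only after the whole loop has completed.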
This comment hints that work is being done to bring the HTTP functionality available in the in-process model to workers, and refers to this epic and the most recent roadmap update. However, the issue referred to from both the roadmap and the epic for tracking HTTP-related improvements hasn't seen any activity for (almost exactly) two years.
Is there any information you can share at this point about supporting scenarios like the above? It appears to me (based on my cursory knowledge) that this will require substantial design/architecture changes for the entire worker-based model, so I am very curious to hear from you what approaches you are considering and what this will end up looking like in "userland".
(Adding this as a new issue, as #366 - while obviously related - addresses streaming requests, not responses.)