Description of the new feature/enhancement
As a PowerShell user,
I want to receive prediction results from external sources such as large language models (ChatGPT/Copilot),
even when those results take longer than 20 milliseconds to return.
Since the original design of Predictive IntelliSense, the world has changed with the arrival of LLMs like GitHub Copilot that can predict code for you based on a prompt. However, the round trip for such a prediction will likely take several hundred milliseconds, if not multiple seconds, which is well beyond the default 20 ms allowed by the PowerShell CommandPrediction subsystem:
https://github.com/PowerShell/PowerShell/blob/master/src/System.Management.Automation/engine/Subsystem/PredictionSubsystem/CommandPrediction.cs#L75
The subsystem does allow callers to override that default, however, so PSReadLine could empower users to opt into a longer timeout, with the understanding that they are trading "slower" prompt responses for potentially more detailed ones.
I ran into this while trying to write a Copilot predictor; thanks to @SeeminglyScience for the heads-up.
CC @daxian-dbw
Proposed technical implementation details (optional)
```powershell
Set-PSReadLineOption -PredictionTimeout 5000
```
If this option is set, PSReadLine would call the PredictInputAsync method using the overload that allows specifying a timeout.
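For illustration, here is a rough sketch of what the end-to-end usage could look like. The -PredictionTimeout parameter is hypothetical (it is the option being proposed here); -PredictionSource, Get-PSReadLineOption, and the plugin prediction source already exist today:

```powershell
# Hypothetical usage sketch: -PredictionTimeout is the proposed option and does not exist yet.

# Enable plugin-based predictors (e.g. an LLM-backed one) alongside history-based predictions.
Set-PSReadLineOption -PredictionSource HistoryAndPlugin

# Allow plugin predictors up to 5 seconds (5000 ms) to return results,
# instead of being cut off at the subsystem's 20 ms default.
Set-PSReadLineOption -PredictionTimeout 5000

# Inspect the configured values (assumes the new option is surfaced like existing ones).
Get-PSReadLineOption | Format-List PredictionSource, PredictionTimeout
```

Predictors that respond quickly should be unaffected, since the timeout would only be an upper bound on how long the subsystem waits for results.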