Replies: 3 comments
-
Would this be the solution? In the POST body parsing plugin, use:

```yaml
_format_version: "3.0"
_transform: true

services:
  - name: gpt-4o
    host: gpt-4o.internal.cluster
    port: 80
    protocol: http
    plugins:
      - name: rate-limiting
        config:
          minute: 10
          policy: local
  - name: gpt-35
    host: gpt-35.internal.cluster
    port: 80
    protocol: http
    plugins:
      - name: rate-limiting
        config:
          minute: 50
          policy: local
  - name: dummy-service
    host: dummy.internal
    port: 80
    protocol: http

routes:
  - name: chat-route
    paths:
      - /v1/chat/completions
    strip_path: false
    service: dummy-service
  - name: gpt-4o-route
    paths:
      - /v1/chat/completions/gpt-4o
    strip_path: true
    service: gpt-4o
  - name: gpt-35-route
    paths:
      - /v1/chat/completions/gpt-3.5-turbo
    strip_path: true
    service: gpt-35

plugins:
  - name: body-model-router
    route: chat-route
```
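The config above presumes a custom `body-model-router` plugin exists. As a rough sketch (assuming the standard Kong PDK and a JSON body carrying a `model` field; the dispatch step itself is left open, since that is exactly the open question in this thread), its handler module might be shaped like:

```lua
-- handler.lua for the hypothetical body-model-router plugin.
local cjson = require "cjson.safe"

local BodyModelRouter = {
  PRIORITY = 800,   -- run after auth plugins, before proxying
  VERSION  = "0.1.0",
}

function BodyModelRouter:access(conf)
  local body = kong.request.get_raw_body()
  local json = body and cjson.decode(body)
  local model = json and json.model
  if not model then
    return kong.response.exit(400, { message = "request body has no model field" })
  end
  -- Dispatch on `model` here (the open question in this thread).
  kong.log.debug("routing request for model: ", model)
end

return BodyModelRouter
```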
-
@frosk1 To achieve what you want (route based on POST body content while still applying service-level plugins), you can:
-
Hey, thanks for the answer. I think I tried something like this in the meantime, though I'm not 100% sure. Having found no solution, I ended up letting the plugin send a second HTTP request to localhost, targeting the new route, so that its default plugins are triggered. I believe I read somewhere that nginx/OpenResty, and therefore Kong, provides no way for a plugin to send an already-received request back through the routing phase. I'm not sure I'm outlining this correctly, so to put it more plainly: is the only way to trigger the default plugins of one route from another route to have the entrypoint route's plugin issue a new HTTP request to localhost? PS: I know this roughly doubles the memory load, since every request to the entrypoint generates a second request with the exact same payload.
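For illustration, the re-request workaround described above could look roughly like this in the plugin's access phase (a sketch, assuming lua-resty-http is available, the proxy listens on 127.0.0.1:8000, and the model-specific route paths from the config above):

```lua
-- Sketch: re-issue the request against a model-specific route on the
-- local proxy so that the plugins configured on that route/service run.
local http  = require "resty.http"
local cjson = require "cjson.safe"

local function access(conf)
  local body  = kong.request.get_raw_body()
  local json  = body and cjson.decode(body) or {}
  local model = json.model or "unknown"

  local httpc = http.new()
  local res, err = httpc:request_uri(
    "http://127.0.0.1:8000/v1/chat/completions/" .. model, {
      method  = "POST",
      body    = body,
      headers = kong.request.get_headers(),
    })
  if not res then
    return kong.response.exit(502, { message = err })
  end
  -- Short-circuit: answer the client directly with the proxied response,
  -- which is what makes every request exist twice in the gateway.
  return kong.response.exit(res.status, res.body, res.headers)
end
```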
-
Hey,
I wanted to ask whether it is possible in Kong to have a single entrypoint route that uses a custom POST-body-parsing plugin to route traffic to the corresponding backend service, while still applying the default plugins on those backend services.
The following config YAML exemplifies this:
I would want Kong to act like a unified API gateway for the OpenAI API specifics, routing traffic to a specific backend based on the model parameter in the POST body.
I guess this comes down to: can `kong.service.set_target()` be used to reference the service object defined in the config, along with its service-specific plugins? Or is there another way for the route's plugin to trigger the service plugins, instead of directly setting the upstream host and port?