# Feature: Query endpoint API tests
# TODO: fix test

# Background:
# Given The service is started locally
# And REST API service hostname is localhost
# And REST API service port is 8080
# And REST API service prefix is /v1


# Scenario: Check if LLM responds to sent question
# Given The system is in default state
# When I use "query" to ask question "Say hello"
# Then The status code of the response is 200
# And The response should have proper LLM response format
# And The response should contain following fragments
# | Fragments in LLM response |
# | Hello |

# Scenario: Check if LLM responds to sent question with different system prompt
# Given The system is in default state
# And I change the system prompt to "new system prompt"
# When I use "query" to ask question "Say hello"
# Then The status code of the response is 200
# And The response should have proper LLM response format
# And The response should contain following fragments
# | Fragments in LLM response |
# | Hello |

# Scenario: Check if LLM responds with error for malformed request
# Given The system is in default state
# And I modify the request body by removing the "query"
# When I use "query" to ask question "Say hello"
# Then The status code of the response is 422
# And The body of the response is the following
# """
# { "type": "missing", "loc": [ "body", "system_query" ], "msg": "Field required" }
# """

# Scenario: Check if LLM responds to sent question with error when not authenticated
# Given The system is in default state
# And I remove the auth header
# When I use "query" to ask question "Say hello"
# Then The status code of the response is 400
# And The body of the response is the following
# """
# {"detail": "Unauthorized: No auth header found"}
# """

# Scenario: Check if LLM responds to sent question with error when not authorized
# Given The system is in default state
# And I modify the auth header so that the user is not authorized
# When I use "query" to ask question "Say hello"
# Then The status code of the response is 403
# And The body of the response is the following
# """
# {"detail": "Forbidden: User is not authorized to access this resource"}
# """

@Authorized
Feature: Query endpoint API tests

Background:
Given The service is started locally
And REST API service hostname is localhost
And REST API service port is 8080
And REST API service prefix is /v1

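# With this Background the tests run against http://localhost:8080/v1.
# Illustrative request shape only; the exact endpoint path and header names are
# assumptions inferred from the scenarios below, not defined in this file:
#   curl -X POST http://localhost:8080/v1/query \
#     -H "Authorization: Bearer <token>" \
#     -H "Content-Type: application/json" \
#     -d '{"query": "Say hello"}'
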
Scenario: Check if LLM responds properly to a restrictive system prompt
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything but openshift questions"}
"""
Then The status code of the response is 200
And The response should contain following fragments
| Fragments in LLM response |
| ask |
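# The restrictive system prompt above should make the model refuse the GitHub
# Actions request; expecting the "ask" fragment assumes the refusal invites an
# OpenShift-related question instead.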

Scenario: Check if LLM responds properly to a non-restrictive system prompt
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "you are linguistic assistant"}
"""
Then The status code of the response is 200
And The response should contain following fragments
| Fragments in LLM response |
| checkout |
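# With a permissive system prompt the model should actually generate the
# workflow; the "checkout" fragment is expected because generated GitHub
# Actions workflows typically use the actions/checkout step.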

Scenario: Check if LLM ignores new system prompt in same conversation
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything but openshift questions"}
"""
Then The status code of the response is 200
And I store conversation details
And I use "query" to ask question with same conversation_id
"""
{"query": "Write a simple code for reversing string", "system_prompt": "provide coding assistance", "model": "gpt-4-turbo", "provider": "openai"}
"""
Then The status code of the response is 200
And The response should contain following fragments
| Fragments in LLM response |
| ask |
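# The second request reuses the stored conversation_id, so the original
# restrictive system prompt should still apply and the refusal fragment "ask"
# is expected despite the new system_prompt sent with the follow-up query.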

Scenario: Check if LLM responds to sent question with error when not authenticated
Given The system is in default state
When I use "query" to ask question
"""
{"query": "Write a simple code for reversing string"}
"""
Then The status code of the response is 400
And The body of the response is the following
"""
{"detail": "No Authorization header found"}
"""
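# Note that a missing Authorization header is reported as 400 with this detail
# message rather than as 401.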

Scenario: Check if LLM responds to sent question with error when attempting to access a conversation the user is not authorized for
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"conversation_id": "123e4567-e89b-12d3-a456-426614174000", "query": "Write a simple code for reversing string"}
"""
Then The status code of the response is 403
And The body of the response contains User is not authorized to access this resource
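# The hard-coded conversation_id does not belong to the authenticated test
# user, so the service is expected to reject the request with 403.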

Scenario: Check if LLM responds to query request with error for missing query
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"provider": "openai"}
"""
Then The status code of the response is 422
And The body of the response is the following
"""
{ "detail": [{"type": "missing", "loc": [ "body", "query" ], "msg": "Field required", "input": {"provider": "openai"}}] }
"""
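# The 422 body matches FastAPI's standard request-validation error format: a
# "detail" list of Pydantic errors with type, loc, msg and input fields.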

Scenario: Check if LLM responds to query request with error for missing model
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Say hello", "provider": "openai"}
"""
Then The status code of the response is 422
And The body of the response contains Value error, Model must be specified if provider is specified

Scenario: Check if LLM responds to query request with error for missing provider
Given The system is in default state
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Say hello", "model": "gpt-4-turbo"}
"""
Then The status code of the response is 422
And The body of the response contains Value error, Provider must be specified if model is specified
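# model and provider are only accepted together; the two scenarios above cover
# each field being supplied without the other.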

Scenario: Check if LLM responds with error when the llama-stack connection is disrupted
Given The system is in default state
And The llama-stack connection is disrupted
And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
When I use "query" to ask question with authorization header
"""
{"query": "Say hello"}
"""
Then The status code of the response is 500
And The body of the response contains Unable to connect to Llama Stack
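# How the llama-stack connection is disrupted is left to the step
# implementation; the feature only asserts the resulting 500 and error message.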