-# Feature: Info endpoint API tests
-#TODO: fix test
-
-# Background:
-#   Given The service is started locally
-#   And REST API service hostname is localhost
-#   And REST API service port is 8080
-#   And REST API service prefix is /v1
-
-# Scenario: Check if the OpenAPI endpoint works as expected
-#   Given The system is in default state
-#   When I access endpoint "openapi.json" using HTTP GET method
-#   Then The status code of the response is 200
-#   And The body of the response contains OpenAPI
-
-# Scenario: Check if info endpoint is working
-#   Given The system is in default state
-#   When I access REST API endpoint "info" using HTTP GET method
-#   Then The status code of the response is 200
-#   And The body of the response has proper name "lightspeed_stack" and version "0.2.0"
-
-# Scenario: Check if models endpoint is working
-#   Given The system is in default state
-#   When I access REST API endpoint "models" using HTTP GET method
-#   Then The status code of the response is 200
-#   And The body of the response contains gpt
-
-
-# Scenario: Check if models endpoint is working
-#   Given The system is in default state
-#   And The llama-stack connection is disrupted
-#   When I access REST API endpoint "models" using HTTP GET method
-#   Then The status code of the response is 503
-
-# Scenario: Check if metrics endpoint is working
-#   Given The system is in default state
-#   When I access REST API endpoint "metrics" using HTTP GET method
-#   Then The status code of the response is 200
-#   And The body of the response has proper metrics
-
-# Scenario: Check if metrics endpoint is working
-#   Given The system is in default state
-#   And The llama-stack connection is disrupted
-#   When I access REST API endpoint "metrics" using HTTP GET method
-#   Then The status code of the response is 500
-
+Feature: Info tests
+
+
+  Background:
+    Given The service is started locally
+    And REST API service hostname is localhost
+    And REST API service port is 8080
+    And REST API service prefix is /v1
+
+  Scenario: Check if the OpenAPI endpoint works as expected
+    Given The system is in default state
+    When I access endpoint "openapi.json" using HTTP GET method
+    Then The status code of the response is 200
+    And The body of the response contains OpenAPI
+
+  Scenario: Check if info endpoint is working
+    Given The system is in default state
+    When I access REST API endpoint "info" using HTTP GET method
+    Then The status code of the response is 200
+    And The body of the response has proper name Lightspeed Core Service (LCS) and version 0.2.0
+    And The body of the response has llama-stack version 0.2.19
+
+  Scenario: Check if info endpoint reports error when llama-stack connection is not working
+    Given The system is in default state
+    And The llama-stack connection is disrupted
+    When I access REST API endpoint "info" using HTTP GET method
+    Then The status code of the response is 500
+    And The body of the response is the following
+      """
+      {"detail": {"response": "Unable to connect to Llama Stack", "cause": "Connection error."}}
+      """
+
+  Scenario: Check if models endpoint is working
+    Given The system is in default state
+    When I access REST API endpoint "models" using HTTP GET method
+    Then The status code of the response is 200
+    And The body of the response for model gpt-4o-mini has proper structure
+
+
+  Scenario: Check if models endpoint reports error when llama-stack connection is not working
+    Given The system is in default state
+    And The llama-stack connection is disrupted
+    When I access REST API endpoint "models" using HTTP GET method
+    Then The status code of the response is 500
+    And The body of the response is the following
+      """
+      {"detail": {"response": "Unable to connect to Llama Stack", "cause": "Connection error."}}
+      """
+
+
+  Scenario: Check if metrics endpoint is working
+    Given The system is in default state
+    When I access endpoint "metrics" using HTTP GET method
+    Then The status code of the response is 200
+    And The body of the response contains ls_provider_model_configuration
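The "The body of the response is the following" steps above compare a doc-string against the actual response body. A minimal sketch of how such a comparison could work (pure stdlib; `body_matches` is a hypothetical helper name, not the project's actual step implementation — comparing parsed JSON rather than raw strings keeps the step robust to key order and whitespace in the doc-string):

```python
import json

def body_matches(actual_body: str, expected_doc_string: str) -> bool:
    """Compare two JSON bodies structurally, so differences in key
    order or whitespace between the doc-string and the real response
    do not cause spurious failures."""
    return json.loads(actual_body) == json.loads(expected_doc_string)

# Same JSON with different key order and spacing still matches.
expected = '{"detail": {"response": "Unable to connect to Llama Stack", "cause": "Connection error."}}'
actual = '{"detail":{"cause":"Connection error.","response":"Unable to connect to Llama Stack"}}'
print(body_matches(actual, expected))  # True
```

In a behave-style step definition this check would typically run against the stored response (e.g. something like `context.response.text`) with `context.text` holding the doc-string.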