中文 | 日本語 | Русский язык
A production-ready template for LLM (Large Language Model) application development, integrating AI tools, TypeScript type safety, Zod validation, and modern dev utilities.
- Type Safety: Leverages TypeScript to improve code quality and reduce runtime errors.
- Fast Development Workflow: Utilizes Vite for quick server starts and hot module replacement.
- Optimized Builds: Employs Rslib for efficient library bundling and optimized production outputs.
- AI Integration: Pre-configured with `@ai-sdk/openai` and `ai` for seamless interaction with large language models.
- Robust Validation: Utilizes Zod for runtime schema validation, ensuring data integrity.
- Focus on Testing: Includes Vitest for fast and reliable unit testing.
- Code Consistency: Enforces code style and quality using Prettier and Antfu's ESLint configuration.
- Environment Management: Uses `dotenv` for secure configuration of API keys and environment-specific settings (see the sketch after this list).
- Cross-Platform Paths: Employs `pathe` for consistent file path handling across different operating systems.
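For orientation, the sketch below shows one way these pieces typically fit together: `dotenv` supplies the API key, `pathe` resolves a path, and `@ai-sdk/openai` creates the provider. The `OPENAI_API_KEY` variable name and the `src/prompts` directory are assumptions made for this sketch, not conventions of the template.

```ts
// Minimal sketch: wire up dotenv, pathe, and the OpenAI provider.
// OPENAI_API_KEY and src/prompts are assumed names, not template conventions.
import 'dotenv/config'
import { createOpenAI } from '@ai-sdk/openai'
import { resolve } from 'pathe'

// pathe normalizes separators, so the same code works on Windows and POSIX systems.
export const promptsDir = resolve(process.cwd(), 'src/prompts')

// The key is read from the environment instead of being hard-coded.
export const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
})
```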
Key technologies used in this project include:
- Language: TypeScript
- LLM Framework: AI SDK (`@ai-sdk/openai`, `ai`)
- Validation: Zod
- Testing Framework: Vitest (see the example below)
- Build Tool: Rslib
- Development Server: Vite
- Code Quality: ESLint (Antfu's config), Prettier
- Utilities: Dotenv, Pathe
See the `package.json` for a full list of dependencies.
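To illustrate the Vitest setup, here is a minimal unit test; `slugify` is a hypothetical helper written only for this example and is not part of the template.

```ts
// Minimal Vitest sketch; slugify is a hypothetical function, not part of the template.
import { describe, expect, it } from 'vitest'

function slugify(input: string): string {
  return input.trim().toLowerCase().replace(/\s+/g, '-')
}

describe('slugify', () => {
  it('converts spaces to hyphens and lowercases the result', () => {
    expect(slugify('  Trapar Waves ')).toBe('trapar-waves')
  })
})
```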
Follow these instructions to get the project running locally.
Ensure you have the following installed:
- Node.js (>= 18.x recommended)
- Package manager (npm, yarn, or pnpm)
Verify your Node.js and npm versions:

```bash
node -v
npm -v
```

Run the scaffolding script:

```bash
pnpm create trapar-waves
```

Install the dependencies with your preferred package manager:

```bash
npm install
# or
yarn install
# or
pnpm install
```

Common scripts available via `npm run <script>`, `yarn <script>`, or `pnpm <script>`:
- `build`: Creates a production-ready build using Rslib.
- `build:watch`: Creates a production-ready build using Rslib in watch mode.
- `lint`: Checks the code for style and errors using ESLint.
Example:

```bash
# Create production build
npm run build
```

This template provides a foundational structure for building LLM applications. It includes:
- A basic project structure with a `src/` directory.
- Integration with the OpenAI API through `@ai-sdk/openai`.
- Example usage of `ai` and `zod` for generating structured output (sketched below).
- Configuration for building and testing with Rslib and Vitest.
Developers can extend this template by adding their own prompts, models, and application logic within the `src/` directory.
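As a rough sketch of the structured-output pattern, the example below combines `generateObject` from `ai` with a Zod schema; the model id, schema, and prompt are illustrative assumptions, not code taken from this repository.

```ts
// Minimal sketch of structured output with `ai`, `@ai-sdk/openai`, and `zod`.
// The model id, schema, and prompt are illustrative assumptions.
import { openai } from '@ai-sdk/openai'
import { generateObject } from 'ai'
import { z } from 'zod'

// Zod schema describing the shape the model should return.
const RecipeSchema = z.object({
  name: z.string(),
  ingredients: z.array(z.string()),
  steps: z.array(z.string()),
})

async function main() {
  // generateObject validates the model response against the schema at runtime.
  const { object } = await generateObject({
    model: openai('gpt-4o-mini'),
    schema: RecipeSchema,
    prompt: 'Generate a simple pancake recipe.',
  })

  console.log(object.name, object.steps.length)
}

main().catch(console.error)
```

Because the response is validated against the Zod schema, malformed model output surfaces as an error instead of silently propagating through the application.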
Contributions are welcome and greatly appreciated! Please follow these steps to contribute:
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Distributed under the MIT License. See the LICENSE file for more information.
- Rikka: ([email protected])
- GitHub Profile: Muromi-Rikka