86 changes: 40 additions & 46 deletions .github/workflows/build-validation.yml
```diff
@@ -1,64 +1,58 @@
 name: .NET Build Validation
 
-'on':
+"on":
   pull_request:
-    branches: [ main ]
+    branches: [main]
     paths:
-      - '02-SetupDevEnvironment/**/*.cs'
-      - '02-SetupDevEnvironment/**/*.csproj'
-      - '02-SetupDevEnvironment/**/*.sln'
-      - '02-SetupDevEnvironment/**/*.json'
-      - '03-CoreGenerativeAITechniques/**/*.cs'
-      - '03-CoreGenerativeAITechniques/**/*.csproj'
-      - '03-CoreGenerativeAITechniques/**/*.sln'
-      - '03-CoreGenerativeAITechniques/**/*.json'
-      - '04-PracticalSamples/**/*.cs'
-      - '04-PracticalSamples/**/*.csproj'
-      - '04-PracticalSamples/**/*.sln'
-      - '04-PracticalSamples/**/*.razor'
-      - '04-PracticalSamples/**/*.json'
-      - '05-AppCreatedWithGenAI/**/*.cs'
-      - '05-AppCreatedWithGenAI/**/*.csproj'
-      - '05-AppCreatedWithGenAI/**/*.sln'
-      - '05-AppCreatedWithGenAI/**/*.razor'
-      - '05-AppCreatedWithGenAI/**/*.json'
-      - '.github/workflows/build-validation.yml'
+      - "samples/CoreGenerativeAITechniques/**/*.cs"
+      - "samples/CoreGenerativeAITechniques/**/*.csproj"
+      - "samples/CoreGenerativeAITechniques/**/*.sln"
+      - "samples/CoreGenerativeAITechniques/**/*.json"
+      - "samples/PracticalSamples/**/*.cs"
+      - "samples/PracticalSamples/**/*.csproj"
+      - "samples/PracticalSamples/**/*.sln"
+      - "samples/PracticalSamples/**/*.razor"
+      - "samples/PracticalSamples/**/*.json"
+      - "05-AppCreatedWithGenAI/**/*.cs"
+      - "05-AppCreatedWithGenAI/**/*.csproj"
+      - "05-AppCreatedWithGenAI/**/*.sln"
+      - "05-AppCreatedWithGenAI/**/*.razor"
+      - "05-AppCreatedWithGenAI/**/*.json"
+      - ".github/workflows/build-validation.yml"
 
 jobs:
   build:
     name: Build .NET Projects
     runs-on: ubuntu-latest
 
     strategy:
-      fail-fast: false # Show results for all projects even if some fail
+      fail-fast: false # Show results for all projects even if some fail
       matrix:
         solution:
-          - path: "02-SetupDevEnvironment/src/GettingReadySamples.sln"
-            name: "Setup Samples"
-          - path: "03-CoreGenerativeAITechniques/src/CoreGenerativeAITechniques.sln"
-            name: "Core Techniques"
-          - path: "04-PracticalSamples/src/Aspire.MCP.Sample.sln"
-            name: "Practical Samples"
-          - path: "05-AppCreatedWithGenAI/HFMCP.GenImage/HFMCP.GenImage.sln"
-            name: "HFMCP GenImage App"
-          - path: "05-AppCreatedWithGenAI/SpaceAINet/SpaceAINet.sln"
-            name: "SpaceAINet App"
+          - path: "samples/CoreGenerativeAITechniques/CoreGenerativeAITechniques.sln"
+            name: "Core Techniques"
+          - path: "samples/PracticalSamples/Aspire.MCP.Sample.sln"
+            name: "Practical Samples"
+          - path: "05-AppCreatedWithGenAI/HFMCP.GenImage/HFMCP.GenImage.sln"
+            name: "HFMCP GenImage App"
+          - path: "05-AppCreatedWithGenAI/SpaceAINet/SpaceAINet.sln"
+            name: "SpaceAINet App"
 
     steps:
-      - name: Checkout code
-        uses: actions/checkout@v4
+      - name: Checkout code
+        uses: actions/checkout@v4
 
-      - name: Setup .NET 9.0
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: '9.0.x'
+      - name: Setup .NET 9.0
+        uses: actions/setup-dotnet@v4
+        with:
+          dotnet-version: "9.0.x"
 
-      - name: Restore dependencies for ${{ matrix.solution.name }}
-        run: dotnet restore "${{ matrix.solution.path }}"
+      - name: Restore dependencies for ${{ matrix.solution.name }}
+        run: dotnet restore "${{ matrix.solution.path }}"
 
-      - name: Build ${{ matrix.solution.name }}
-        run: dotnet build "${{ matrix.solution.path }}" --no-restore --configuration Release --verbosity minimal
+      - name: Build ${{ matrix.solution.name }}
+        run: dotnet build "${{ matrix.solution.path }}" --no-restore --configuration Release --verbosity minimal
 
-      - name: Display build result
-        if: success()
-        run: echo "✅ ${{ matrix.solution.name }} build succeeded"
+      - name: Display build result
+        if: success()
+        run: echo "✅ ${{ matrix.solution.name }} build succeeded"
```

8 changes: 4 additions & 4 deletions 02-SetupDevEnvironment/readme.md
````diff
@@ -37,13 +37,13 @@ Here's a quick rundown of the services:
 
 The Ollama Codespace will provision all the necessary models that you need. However, if you are working in local mode, once you have installed Ollama, you need to pull the models for the lessons you want to run.
 
-- For lesson "**02 - Setting Up for .NET Development with Generative AI**" and project [MEAIFunctionsOllama](https://github.com/microsoft/Generative-AI-for-beginners-dotnet/tree/main/02-SetupDevEnvironment/src/BasicChat-03Ollama) you need to pull a model like [phi4-mini](https://ollama.com/library/phi4-mini) or [llama3.2](https://ollama.com/library/llama3.2) by entering in terminal
+- For lesson "**02 - Setting Up for .NET Development with Generative AI**" and project [BasicChat-03Ollama](../samples/CoreGenerativeAITechniques/BasicChat-03Ollama/) you need to pull a model like [phi4-mini](https://ollama.com/library/phi4-mini) or [llama3.2](https://ollama.com/library/llama3.2) by entering in terminal
 
   ```bash
   ollama pull phi4-mini
   ```
 
-- For lesson "**03 - Core Generative AI Techniques with .NET**", when running the ollama projects like [RAGSimple-10SKOllama](https://github.com/microsoft/Generative-AI-for-beginners-dotnet/tree/main/03-CoreGenerativeAITechniques/src/RAGSimple-10SKOllama), you need to pull the models [all-minilm](https://ollama.com/library/all-minilm) and [phi4-mini](https://ollama.com/library/phi4-mini) by entering in terminal:
+- For lesson "**03 - Core Generative AI Techniques with .NET**", when running the ollama projects like [RAGSimple-10SKOllama](../samples/CoreGenerativeAITechniques/RAGSimple-10SKOllama/), you need to pull the models [all-minilm](https://ollama.com/library/all-minilm) and [phi4-mini](https://ollama.com/library/phi4-mini) by entering in terminal:
 
   ```bash
   ollama pull phi4-mini
@@ -126,11 +126,11 @@ Once your Codespace is fully loaded and configured, let's run a sample app to ve
 1. Switch to the proper directory by running the following command:
    If you're using Windows Command Prompt (CMD) or PowerShell:
    ```bash
-   cd 02-SetupDevEnvironment\src\BasicChat-01MEAI
+   cd samples\CoreGenerativeAITechniques\BasicChat-01MEAI
    ```
    or If you're using Linux, macOS, Git Bash, WSL, or the VS Code terminal
    ```bash
-   cd 02-SetupDevEnvironment/src/BasicChat-01MEAI
+   cd samples/CoreGenerativeAITechniques/BasicChat-01MEAI
    ```
 
 1. Then run the application with the following command:
````


This file was deleted.

12 changes: 0 additions & 12 deletions 02-SetupDevEnvironment/src/BasicChat-01MEAI/Program.cs

This file was deleted.

This file was deleted.

13 changes: 0 additions & 13 deletions 02-SetupDevEnvironment/src/BasicChat-03Ollama/Program.cs

This file was deleted.

28 changes: 0 additions & 28 deletions 02-SetupDevEnvironment/src/GettingReadySamples.sln

This file was deleted.

18 changes: 9 additions & 9 deletions 03-CoreGenerativeAITechniques/01-lm-completions-functions.md
````diff
@@ -18,7 +18,7 @@ A text completion itself is not a chat application, it is a one and done interac
 
 Let's see how you would use text completions using the **Microsoft.Extensions.AI** library in .NET.
 
-> 🧑‍💻**Sample code**: [Here is a working example of this application](./src/BasicChat-01MEAI/) you can follow along with.
+> 🧑‍💻**Sample code**: [Here is a working example of this application](../../samples/CoreGenerativeAITechniques/BasicChat-01MEAI/) you can follow along with.
 
 #### How to run the sample code
 
@@ -41,7 +41,7 @@ To run the sample code, you'll need to:
 IChatClient client = new ChatCompletionsClient(
     endpoint: new Uri("https://models.github.ai/inference"),
     new AzureKeyCredential(githubToken))
-    .AsIChatClient("Phi-3.5-MoE-instruct");
+    .AsIChatClient("openai/gpt-5-mini");
 
 // here we're building the prompt
 StringBuilder prompt = new StringBuilder();
@@ -58,13 +58,13 @@ var response = await client.GetResponseAsync(prompt.ToString());
 Console.WriteLine(response.Text);
 ```
 
-> 🗒️**Note:** This example showed GitHub Models as the hosting service. If you want to use Ollama, [check out this example](./src/BasicChat-03Ollama/) (it instantiates a different `IChatClient`).
+> 🗒️**Note:** This example showed GitHub Models as the hosting service. If you want to use Ollama, [check out this example](../../samples/CoreGenerativeAITechniques/BasicChat-03Ollama/) (it instantiates a different `IChatClient`).
 >
 > If you want to use Azure AI Foundry you can use the same code, but you will need to change the endpoint and the credentials.
 >
 > **GitHub Models Endpoint:** The endpoint `https://models.github.ai/inference` is the new dedicated GitHub Models endpoint as announced in the [GitHub Models deprecation notice](https://github.blog/changelog/2025-07-17-deprecation-of-azure-endpoint-for-github-models/), replacing the previous Azure-based endpoint.
 >
-> If you want to use both Ollama and Semantic Kernel together, [check out the BasicChat-04OllamaSK example](./src/BasicChat-04OllamaSK/).
+> If you want to use both Ollama and Semantic Kernel together, [check out the BasicChat-04OllamaSK example](../../samples/CoreGenerativeAITechniques/BasicChat-04OllamaSK/).
 >
 > For instructions on how to set up Ollama, refer to [Getting Started with Ollama](../02-SetupDevEnvironment/getting-started-ollama.md).
````

````diff
@@ -88,7 +88,7 @@ During the chat with the model, you will need to keep track of the chat history.
 
 Let's take a look at how you would build a chat application using MEAI.
 
-> 🧑‍💻**Sample code**: You can find complete chat application examples in the [BasicChat-01MEAI](./src/BasicChat-01MEAI/) and [BasicChat-02SK](./src/BasicChat-02SK/) directories.
+> 🧑‍💻**Sample code**: You can find complete chat application examples in the [BasicChat-01MEAI](../../samples/CoreGenerativeAITechniques/BasicChat-01MEAI/) and [BasicChat-02SK](../../samples/CoreGenerativeAITechniques/BasicChat-02SK/) directories.
 
 ```csharp
 // assume IChatClient is instantiated as before
@@ -120,7 +120,7 @@ while (true)
 }
 ```
 
-> 🗒️**Note:** This can also be done with Semantic Kernel. [Check out the code here](./src/BasicChat-02SK/).
+> 🗒️**Note:** This can also be done with Semantic Kernel. [Check out the code here](../../samples/CoreGenerativeAITechniques/BasicChat-02SK/).
 
 > 🙋 **Need help?**: If you encounter any issues running the chat application examples, [open an issue in the repository](https://github.com/microsoft/Generative-AI-for-beginners-dotnet/issues/new?template=Blank+issue) and we'll help you troubleshoot.
````

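The body of the chat loop is collapsed in the diff view above. A minimal sketch of what such a loop looks like with Microsoft.Extensions.AI follows; this is an approximation for illustration, not the exact sample code (the system prompt text and the `chatHistory` variable name are assumptions):

```csharp
// Sketch only: assumes `client` is an IChatClient instantiated as in the earlier example.
List<ChatMessage> chatHistory = new()
{
    new(ChatRole.System, "You are a helpful assistant.")
};

while (true)
{
    Console.Write("You: ");
    string? userInput = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(userInput)) break;

    // Keep track of the conversation so the model sees all prior turns
    chatHistory.Add(new ChatMessage(ChatRole.User, userInput));

    var response = await client.GetResponseAsync(chatHistory);
    Console.WriteLine($"AI: {response.Text}");

    // Append the assistant's reply to the history as well
    chatHistory.AddRange(response.Messages);
}
```

The key point is that the full `chatHistory` list, not just the latest message, is sent on every call, which is how the model "remembers" the conversation.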
````diff
@@ -132,15 +132,15 @@ _⬆️Click the image to watch the video⬆️_
 
 When building AI applications you are not limited to just text-based interactions. It is possible to extend the functionality of the chatbot by calling pre-defined functions in your code based off user input. In other words, function calls serve as a bridge between the model and external systems.
 
-> 🧑‍💻**Sample code**: [Here is a working example of this application](./src/MEAIFunctions/) you can follow along with.
+> 🧑‍💻**Sample code**: [Here is a working example of this application](../../samples/CoreGenerativeAITechniques/MEAIFunctions/) you can follow along with.
 
 ### Function calling in chat applications
 
 There are a couple of setup steps you need to take in order to call functions with MEAI.
 
-> 🧑‍💻**Sample code**: [Here is a working example of function calling](./src/MEAIFunctions/) you can follow along with. To run this example, follow the same steps as for the previous examples, but navigate to `03-CoreGenerativeAITechniques/src/MEAIFunctions` directory.
+> 🧑‍💻**Sample code**: [Here is a working example of function calling](../../samples/CoreGenerativeAITechniques/MEAIFunctions/) you can follow along with. To run this example, follow the same steps as for the previous examples, but navigate to `samples/CoreGenerativeAITechniques/MEAIFunctions` directory.
 >
-> We also have examples showing function calling with [Azure OpenAI](./src/MEAIFunctionsAzureOpenAI/) and [Ollama](./src/MEAIFunctionsOllama/).
+> We also have examples showing function calling with [Azure OpenAI](../../samples/CoreGenerativeAITechniques/MEAIFunctionsAzureOpenAI/) and [Ollama](../../samples/CoreGenerativeAITechniques/MEAIFunctionsOllama/).
 
 1. First, of course, define the function that you want the chatbot to be able to call. In this example we're going to get the weather forecast.
````

Expand Down
````diff
@@ -35,16 +35,16 @@ You may have heard of vector databases. These are databases that store data in a
 
 We'll use the Microsoft.Extension.AI along with the [Microsoft.Extensions.VectorData](https://www.nuget.org/packages/Microsoft.Extensions.VectorData.Abstractions/) and [Microsoft.SemanticKernel.Connectors.InMemory](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.InMemory) libraries to implement RAG below.
 
-> 🧑‍💻**Sample code:** You can follow along with the [sample code here](./src/RAGSimple-02MEAIVectorsMemory/).
+> 🧑‍💻**Sample code:** You can follow along with the [sample code here](../../samples/CoreGenerativeAITechniques/RAGSimple-02MEAIVectorsMemory/).
 >
-> You can also see how to implement a RAG app [using Semantic Kernel by itself in our sample source code here](./src/RAGSimple-01SK/).
+> You can also see how to implement a RAG app [using Semantic Kernel by itself in our sample source code here](../../samples/CoreGenerativeAITechniques/RAGSimple-01SK/).
 >
 > We have additional RAG examples for different vector stores and models:
 >
-> - [RAGSimple-03MEAIVectorsAISearch](./src/RAGSimple-03MEAIVectorsAISearch/) - Using Azure AI Search as a vector store
-> - [RAGSimple-04MEAIVectorsQdrant](./src/RAGSimple-04MEAIVectorsQdrant/) - Using Qdrant as a vector store
-> - [RAGSimple-10SKOllama](./src/RAGSimple-10SKOllama/) - Using Semantic Kernel with Ollama
-> - [RAGSimple-15Ollama-DeepSeekR1](./src/RAGSimple-15Ollama-DeepSeekR1/) - Using Ollama with DeepSeek-R1 model
+> - [RAGSimple-03MEAIVectorsAISearch](../../samples/CoreGenerativeAITechniques/RAGSimple-03MEAIVectorsAISearch/) - Using Azure AI Search as a vector store
+> - [RAGSimple-04MEAIVectorsQdrant](../../samples/CoreGenerativeAITechniques/RAGSimple-04MEAIVectorsQdrant/) - Using Qdrant as a vector store
+> - [RAGSimple-10SKOllama](../../samples/CoreGenerativeAITechniques/RAGSimple-10SKOllama/) - Using Semantic Kernel with Ollama
+> - [RAGSimple-15Ollama-DeepSeekR1](../../samples/CoreGenerativeAITechniques/RAGSimple-15Ollama-DeepSeekR1/) - Using Ollama with DeepSeek-R1 model
 
 ### Populating the knowledge store
````

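The "Populating the knowledge store" step is collapsed in the diff above. As a rough sketch of what it can look like with these libraries — attribute and method names have shifted across preview releases of Microsoft.Extensions.VectorData, so treat this as an outline rather than the exact sample code; the `Movie` record, its data, and the `embeddingGenerator` variable are illustrative assumptions:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.VectorData;
using Microsoft.SemanticKernel.Connectors.InMemory;

// A record type describing what each entry in the store looks like
public class Movie
{
    [VectorStoreKey]
    public int Key { get; set; }

    [VectorStoreData]
    public string Description { get; set; } = "";

    [VectorStoreVector(Dimensions: 384)] // all-minilm produces 384-dimensional vectors
    public ReadOnlyMemory<float> Vector { get; set; }
}

// Create the in-memory store and a typed collection
var vectorStore = new InMemoryVectorStore();
var movies = vectorStore.GetCollection<int, Movie>("movies");
await movies.EnsureCollectionExistsAsync();

// `embeddingGenerator` (an IEmbeddingGenerator<string, Embedding<float>>) is assumed
// to be created elsewhere, e.g. backed by an Ollama all-minilm endpoint.
var description = "A toy cowboy feels threatened by a new space-ranger toy.";
var movie = new Movie
{
    Key = 1,
    Description = description,
    Vector = await embeddingGenerator.GenerateVectorAsync(description)
};
await movies.UpsertAsync(movie);
```

At query time the same generator embeds the user's question, and the collection's similarity search returns the closest `Movie` records to ground the model's answer.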