
Tutorial: A basic plugin using Semantic Kernel

The source code below creates a basic plugin (testplugin.cs) and shows how it is called from a prompt in Program.cs.

This example answers a weather question such as “What is the weather in London on 18th June 2024?”, and it always returns a hard-coded value of 29 degrees Celsius. You can modify the function to do more complex logic (see the sketch after the plugin code below).

testplugin.cs

This is a very basic plugin in which I purposely did not include any logic. Comments are added inline to explain what each line/function does.

using System.ComponentModel; // Provides the DescriptionAttribute used to add human-friendly descriptions
using Microsoft.SemanticKernel; // The Semantic Kernel provides attributes like KernelFunction used for plugin discovery

// A simple plugin class that the Semantic Kernel can discover and call
public class testplugin
{
    // Marks this method as a Kernel function with the logical name "check_weather".
    // The Semantic Kernel uses this attribute to expose the method as a callable skill/function.
    [KernelFunction("check_weather")]

    // Adds a short human-readable description for the method. Useful for documentation or developer tools.
    [Description("Checks the weather when  in and city date is provided")]

    // Adds a description to the return value indicating units or format.
    [return: Description("Weather information in Celsius")]

    // Public method exposed as a plugin function. It takes a city and a date and returns a string.
    // Parameters:
    //  - city: the name of the city to check weather for
    //  - date: the date to check weather for (string format in this sample)
    public string FindWeather(string city, string date)
    {
        // This is a placeholder implementation. A real implementation would call a weather API
        // or perform lookups based on the city and date. Here we return a fixed string so the
        // plugin can be used for testing or demonstration.
        return "29 degrees Celsius";
    }

}
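
The fixed return value keeps the sample easy to follow. If you want the plugin to do real work, a sketch along the lines below could call an external weather service instead. The URL, query parameters, and response handling here are assumptions for illustration only, not any specific provider's API.

using System; // Uri.EscapeDataString
using System.ComponentModel; // DescriptionAttribute
using System.Net.Http; // HttpClient for calling the (hypothetical) weather service
using System.Threading.Tasks; // Task-based async return type
using Microsoft.SemanticKernel; // KernelFunction attribute

public class weatherplugin
{
    // A single shared HttpClient instance for all calls.
    private static readonly HttpClient _http = new();

    [KernelFunction("check_weather")]
    [Description("Checks the weather when a city and date are provided")]
    [return: Description("Weather information in Celsius")]
    public async Task<string> FindWeatherAsync(string city, string date)
    {
        // Hypothetical endpoint and query string; substitute a real weather API here.
        string url = $"https://weather.example.com/api?city={Uri.EscapeDataString(city)}&date={Uri.EscapeDataString(date)}";

        // Call the service and return its raw response. A real implementation would
        // parse the JSON body and format a friendlier answer in Celsius.
        HttpResponseMessage response = await _http.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}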

Program.cs

using Microsoft.Extensions.Configuration; // Library that helps us read configuration files like appsettings.json
using Microsoft.SemanticKernel; // Core types for Microsoft Semantic Kernel
using Microsoft.SemanticKernel.ChatCompletion; // Types for chat completion features (messages, settings)
using Microsoft.SemanticKernel.Connectors.OpenAI; // Connector that integrates Azure/OpenAI with Semantic Kernel

// Resolve "appsettings.json" to an absolute path (relative to the current working directory)
// and print it, so you can see exactly which file is loaded and diagnose "file not found" issues.
string filePath = Path.GetFullPath("appsettings.json");
Console.WriteLine(filePath); // Print the resolved path to the console for debugging

// Build a configuration object by loading the JSON file. This lets us read keys like PROJECT_KEY.
var config = new ConfigurationBuilder()
    .AddJsonFile(filePath) // Add the JSON file as a configuration provider
    .Build(); // Build the IConfigurationRoot so we can read values

// Read required settings from configuration. The '!' operator only suppresses the compiler's
// nullable warning; if a key is missing the value will be null, so null-check in production code.
string apiKey = config["PROJECT_KEY"]!; // Your Azure OpenAI API key (keep this secret!)
string endpoint = config["PROJECT_ENDPOINT"]!; // The Azure OpenAI service endpoint URL
string deploymentName = config["DEPLOYMENT_NAME"]!; // The name of the model deployment in Azure

// Create a Semantic Kernel builder and add the Azure OpenAI chat completion connector.
// This wires up the kernel to use your Azure-hosted model for chat completions.
var builder = Kernel.CreateBuilder().AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);

// Build the kernel instance which manages plugins, AI services, and execution context.
Kernel kernel = builder.Build();

// Register a plugin class (testplugin) so its methods can be invoked as functions by the kernel.
// The string "check_weather" is the skill name used when invoking functions.
kernel.Plugins.AddFromType<testplugin>("check_weather");

// Resolve the chat completion service from the kernel. This service sends the chat history
// to the model and returns the generated assistant reply.
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Create execution settings for the OpenAI prompt. Here we set function choice behavior
// to automatic so the SDK can decide when to call registered functions/plugins.
OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Create a chat history object and add the question as a user message.
var history = new ChatHistory();
history.AddUserMessage("What is the weather in London on 18th June 2024?");

// Call the helper that sends the chat to the model and prints the reply.
await GetReply();

// Helper async function that sends the chat history to the model and prints the assistant reply.
async Task GetReply()
{
    // Ask the chat completion service to get the assistant's message content based on the
    // current history and execution settings. The kernel is provided so functions can run.
    ChatMessageContent reply = await chatCompletionService.GetChatMessageContentAsync(
        history,
        executionSettings: openAIPromptExecutionSettings,
        kernel: kernel
    );

    // Print the assistant's response to the console so you can see the output.
    Console.WriteLine("Assistant: " + reply.ToString());

    // Save the assistant's reply back into the history so further calls keep context.
    history.AddAssistantMessage(reply.ToString());
}
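
Because GetReply saves the assistant's reply back into the history, you can keep the conversation going across multiple turns. Below is a minimal sketch of an interactive loop, assuming it replaces the single hard-coded question above and reuses the same kernel, chat service, settings, and history objects from Program.cs.

// Minimal interactive loop (sketch): read a question, send it, print the reply.
while (true)
{
    Console.Write("You: ");
    string? input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input))
    {
        break; // An empty line ends the chat.
    }

    // Add the user's question to the history so the model sees the full conversation.
    history.AddUserMessage(input);

    // Reuse the helper defined above; it prints the reply and appends it to the history.
    await GetReply();
}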

appsettings.json

I used this file to avoid hardcoding sensitive information in the source files. Use your own keys from Azure OpenAI / ai.azure.com.

{
    "DEPLOYMENT_NAME": "gpt-4o-mini", 
    "PROJECT_KEY": "2PDaYMJde5o9UuD8dUMsxtryQNnJjCzRlSufq0Z2em6BW5jvZ9cEasdfasdfasdfasdfasdfasdfasdf",
    "PROJECT_ENDPOINT": "https://example.openai.azure.com/"
}
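
If you prefer not to keep the key in a local file at all, the same ConfigurationBuilder can fall back to environment variables. This is a sketch and assumes the Microsoft.Extensions.Configuration.EnvironmentVariables package is referenced; the JSON file then becomes optional.

using Microsoft.Extensions.Configuration; // ConfigurationBuilder and providers

// Sketch: read settings from appsettings.json when it exists, otherwise from
// environment variables named PROJECT_KEY, PROJECT_ENDPOINT and DEPLOYMENT_NAME.
var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true) // the file is no longer required
    .AddEnvironmentVariables()                        // e.g. set PROJECT_KEY in your shell
    .Build();

string? apiKey = config["PROJECT_KEY"]; // null-check instead of using '!' in production code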
