
A Pythonista’s Intro to Semantic Kernel | by Chris Hughes | Sep, 2023

We can see that this is equivalent to our manual approach.

Creating Custom Plugins

Now that we understand how to create semantic functions, and how to use plugins, we have everything we need to start making our own plugins!

Plugins can contain two types of functions:

  • Semantic functions: use natural language to perform actions

  • Native functions: use Python code to perform actions

which can be combined within a single plugin.

The choice of whether to use a semantic or a native function depends on the task that you are performing. For tasks involving understanding or generating language, semantic functions are the obvious choice. However, for more deterministic tasks, such as performing mathematical operations, downloading data or accessing the time, native functions are better suited.

Let’s explore how we can create each type. First, let’s create a folder to store our plugins.

from pathlib import Path

plugins_path = Path("Plugins")
plugins_path.mkdir(exist_ok=True)

Creating a Poem Generator Plugin

For our example, let’s create a plugin which generates poems; for this, using a semantic function seems a natural choice. We can create a folder for this plugin in our plugins directory.

poem_gen_plugin_path = plugins_path / "PoemGeneratorPlugin"
poem_gen_plugin_path.mkdir(exist_ok=True)

Recalling that plugins are just a collection of functions, and that we are creating a semantic function, the next part should be quite familiar. The key difference is that, instead of defining our prompt and config inline, we will create individual files for these, to make them easier to load.

Let’s create a folder for our semantic function, which we shall call write_poem.

poem_sc_path = poem_gen_plugin_path / "write_poem"
poem_sc_path.mkdir(exist_ok=True)

Next, we create our prompt, saving it as skprompt.txt.
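The prompt text itself is not reproduced here; as an illustrative stand-in (the wording below is my assumption, not the original prompt), we could write the file with pathlib instead of a %%writefile magic:

```python
from pathlib import Path

# Recreate the plugin folder structure so this snippet is self-contained.
poem_sc_path = Path("Plugins") / "PoemGeneratorPlugin" / "write_poem"
poem_sc_path.mkdir(parents=True, exist_ok=True)

# Hypothetical prompt text; {{$input}} is Semantic Kernel's template
# variable, which is replaced with the caller's topic at run time.
prompt = "Write a short poem about the following topic: {{$input}}"

(poem_sc_path / "skprompt.txt").write_text(prompt)
```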

Now, let’s create our config and store this in a JSON file.

Whilst it is always good practice to set meaningful descriptions in our config, this becomes more important when we are defining plugins; plugins should provide clear descriptions that describe how they behave, what their inputs and outputs are, and what their side effects are. The reason for this is that this is the interface that is presented by our kernel and, if we want to be able to use an LLM to orchestrate tasks, it needs to be able to understand the plugin’s functionality and how to call it, so that it can select appropriate functions.

config_path = poem_sc_path / "config.json"
%%writefile {config_path}

{
    "schema": 1,
    "type": "completion",
    "description": "A poem generator, that writes a short poem based on user input",
    "default_services": ["azure_gpt35_chat_completion"],
    "completion": {
        "temperature": 0.0,
        "top_p": 1,
        "max_tokens": 250,
        "number_of_responses": 1,
        "presence_penalty": 0,
        "frequency_penalty": 0
    },
    "input": {
        "parameters": [{
            "name": "input",
            "description": "The topic that the poem should be written about",
            "defaultValue": ""
        }]
    }
}

Note that, as we are saving our config as a JSON file, we need to remove the comments to make this valid JSON.
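Alternatively, rather than deleting the comments by hand, a small helper of my own (not part of Semantic Kernel) can strip // line comments before parsing:

```python
import json
import re


def strip_line_comments(text: str) -> str:
    """Remove // line comments so the result is valid JSON.

    Note: this naive version would also mangle a // that appears
    inside a string value, so use it only on trusted templates.
    """
    return re.sub(r"^\s*//.*$", "", text, flags=re.MULTILINE)


# A commented config template, as we might keep it for readability.
commented = """
{
    // The type of prompt
    "type": "completion",
    // The variables that are used inside the prompt
    "input": {"parameters": [{"name": "input", "defaultValue": ""}]}
}
"""

# json.loads would raise json.JSONDecodeError on the commented text.
config = json.loads(strip_line_comments(commented))
```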

Now, we are able to import our plugin:

poem_gen_plugin = kernel.import_semantic_skill_from_directory(
    plugins_path, "PoemGeneratorPlugin"
)

Inspecting our plugin, we can see that it exposes our write_poem semantic function.

We can call our semantic function directly:

result = poem_gen_plugin["write_poem"]("Munich")

or, we can use it in another semantic function:

chat_config_dict = {
    "schema": 1,
    # The type of prompt
    "type": "completion",
    # A description of what the semantic function does
    "description": "Wraps a plugin to write a poem",
    # Specifies which model service(s) to use
    "default_services": ["azure_gpt35_chat_completion"],
    # The parameters that will be passed to the connector and model service
    "completion": {
        "temperature": 0.0,
        "top_p": 1,
        "max_tokens": 500,
        "number_of_responses": 1,
        "presence_penalty": 0,
        "frequency_penalty": 0,
    },
    # Defines the variables that are used inside the prompt
    "input": {
        "parameters": [
            {
                "name": "input",
                "description": "The input given by the user",
                "defaultValue": "",
            },
        ]
    },
}

prompt = """

"""

write_poem_wrapper = kernel.register_semantic_function(
    skill_name="PoemWrapper",
    function_name="poem_wrapper",
    function_config=create_semantic_function_chat_config(
        prompt, chat_config_dict, kernel
    ),
)

result = write_poem_wrapper("Munich")

Creating an Image Classifier Plugin

Now that we have seen how to use a semantic function in a plugin, let’s take a look at how we can use a native function.

Here, let’s create a plugin that takes an image url, then downloads and classifies the image. Once again, let’s create a folder for our new plugin.

image_classifier_plugin_path = plugins_path / "ImageClassifierPlugin"
image_classifier_plugin_path.mkdir(exist_ok=True)

download_image_sc_path = image_classifier_plugin_path / "download_image.py"
download_image_sc_path.touch(exist_ok=True)

Now, we can create our Python module. Inside the module, we can be quite flexible. Here, we have created a class with two methods; the key step is to use the sk_function decorator to specify which methods should be exposed as part of the plugin.

In this example, our function only requires a single input. For functions that require multiple inputs, the sk_function_context_parameter decorator can be used, as demonstrated in the documentation.

import requests
from PIL import Image
import timm
from timm.data.imagenet_info import ImageNetInfo

from semantic_kernel.skill_definition import (
    sk_function,
)


class ImageClassifierPlugin:
    def __init__(self):
        self.model = timm.create_model("convnext_tiny.in12k_ft_in1k", pretrained=True)
        self.model.eval()
        data_config = timm.data.resolve_model_data_config(self.model)
        self.transforms = timm.data.create_transform(**data_config, is_training=False)
        self.imagenet_info = ImageNetInfo()

    @sk_function(
        description="Takes a url as an input and classifies the image",
        name="classify_image",
        input_description="The url of the image to classify",
    )
    def classify_image(self, url: str) -> str:
        image = self.download_image(url)
        pred = self.model(self.transforms(image)[None])
        return self.imagenet_info.index_to_description(pred.argmax())

    def download_image(self, url):
        return Image.open(requests.get(url, stream=True).raw).convert("RGB")

For this example, I have used the excellent Pytorch Image Models (timm) library to provide our classifier. For more information on how this library works, check out this blog post.

Now, we can simply import our plugin as seen below.

image_classifier = ImageClassifierPlugin()

classify_plugin = kernel.import_skill(image_classifier, skill_name="classify_image")

Inspecting our plugin, we can see that only our decorated function is exposed.

We can verify that our plugin works using an image of a cat from Pixabay.

url = "https://cdn.pixabay.com/photo/2016/02/10/16/37/cat-1192026_1280.jpg"
response = classify_plugin["classify_image"](url)

Manually calling our function, we can see that our image has been classified correctly! In the same way as before, we could also reference this function directly from a prompt. However, as we have already demonstrated this, let’s try something slightly different in the following section.

Chaining multiple plugins

It is also possible to chain multiple plugins together using the kernel, as demonstrated below.

context = kernel.create_new_context()
context["input"] = url

answer = await kernel.run_async(
    classify_plugin["classify_image"],
    poem_gen_plugin["write_poem"],
    input_context=context,
)

We can see that, using both plugins sequentially, we have classified the image and written a poem about it!
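Conceptually, what run_async is doing here is a simple sequential pipeline: each function receives the previous function's output as its input. Stripped of Semantic Kernel's machinery, the idea can be sketched in plain Python with stand-in functions (the bodies below are placeholders; the real functions call a vision model and an LLM):

```python
from functools import reduce


# Stand-ins for classify_image and write_poem.
def classify_image(url: str) -> str:
    return f"a photo of a cat (from {url})"


def write_poem(topic: str) -> str:
    return f"An ode to {topic}"


def run_pipeline(functions, initial_input):
    # Thread each function's output into the next function's input.
    return reduce(lambda value, fn: fn(value), functions, initial_input)


poem = run_pipeline([classify_image, write_poem], "https://example.com/cat.jpg")
```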

At this point, we have thoroughly explored semantic functions, understand how functions can be grouped and used as part of a plugin, and have seen how we can chain plugins together manually. Now, let’s explore how we can create and orchestrate workflows using LLMs. To do this, Semantic Kernel provides Planner objects, which can dynamically create chains of functions to try to achieve a goal.

A planner is a class that takes a user prompt and a kernel, and uses the kernel’s services to create a plan of how to perform the task, using the functions and plugins that have been made available to the kernel. As the plugins are the main building blocks of these plans, the planner relies heavily on the descriptions provided; if plugins and functions don’t have clear descriptions, the planner will not be able to use them correctly. Additionally, as a planner can combine functions in various different ways, it is important to ensure that we only expose functions that we are happy for the planner to use.
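To make this concrete, here is a plain-Python sketch (my own illustration, not Semantic Kernel code) of why descriptions matter so much: a rendered list along these lines is essentially all the planning model sees when deciding which function to call.

```python
# Hypothetical registry of exposed functions, using the descriptions
# we defined for our two plugins above.
functions = [
    {
        "name": "PoemGeneratorPlugin.write_poem",
        "description": "A poem generator, that writes a short poem based on user input",
    },
    {
        "name": "classify_image.classify_image",
        "description": "Takes a url as an input and classifies the image",
    },
]


def render_function_list(functions) -> str:
    # This rendered text is all the model has to go on when choosing
    # a function; a vague description means a bad choice.
    return "\n".join(f"- {f['name']}: {f['description']}" for f in functions)


print(render_function_list(functions))
```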

As the planner relies on a model to generate a plan, errors can be introduced; these usually arise when the planner doesn’t properly understand how to use a function. In these cases, I have found that providing explicit instructions in the descriptions, such as describing the inputs and outputs, and stating whether inputs are required, can lead to better results. Additionally, I have had better results using instruction-tuned models than base models; base text completion models tend to hallucinate functions that don’t exist, or to create multiple plans. Despite these limitations, when everything works correctly, planners can be incredibly powerful!

Let’s explore this by seeing whether we can create a plan to write a poem about an image, based on its url, using the plugins we created earlier. As we have defined several functions that we no longer need, let’s create a new kernel, so we can control which functions are exposed.

kernel = sk.Kernel()

To create our plan, let’s use our OpenAI chat service.

kernel.add_chat_service(
    service_id="azure_gpt35_chat_completion",
    service=AzureChatCompletion(
        OPENAI_DEPLOYMENT_NAME, OPENAI_ENDPOINT, OPENAI_API_KEY
    ),
)

Inspecting our registered services, we can see that our service can be used for both text completion and chat completion tasks.

Now, let’s import our plugins.

classify_plugin = kernel.import_skill(
    ImageClassifierPlugin(), skill_name="classify_image"
)
poem_gen_plugin = kernel.import_semantic_skill_from_directory(
    plugins_path, "PoemGeneratorPlugin"
)

We can see which functions our kernel has access to as demonstrated below.

Now, let’s import our planner object.

from semantic_kernel.planning.basic_planner import BasicPlanner

planner = BasicPlanner()

To use our planner, all we need is a prompt. Often, we will need to tweak this depending on the plans that are generated. Here, I have tried to be as explicit as possible about the input that is required.

ask = f"""
I would like you to write a poem about what is contained in this image with this url: {url}. This url should be used as input.

"""

Next, we can use our planner to create a plan for how it will solve the task.

plan = await planner.create_plan_async(ask, kernel)

Inspecting our plan, we can see that the model has correctly identified our input, and the correct functions to use!
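For reference, BasicPlanner expresses its plan as a small JSON structure listing the subtasks to run in order; the sketch below is illustrative of the general shape rather than captured output, and the exact plan will vary from run to run:

```json
{
    "input": "https://cdn.pixabay.com/photo/2016/02/10/16/37/cat-1192026_1280.jpg",
    "subtasks": [
        {"function": "classify_image.classify_image"},
        {"function": "PoemGeneratorPlugin.write_poem"}
    ]
}
```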

Finally, all that is left to do is execute our plan.

poem = await planner.execute_plan_async(plan, kernel)

Wow, it worked! For a model trained to predict the next word, that is quite powerful!

As a word of warning, I was quite lucky when making this example that the generated plan worked first time. However, running this multiple times with the same prompt, we can see that this is not always the case, so it is important to double check your plan before running it! For me personally, in a production system, I would feel much more comfortable manually creating the workflow to execute, rather than leaving it to the LLM! As the technology continues to improve, especially at the current rate, hopefully this recommendation will become outdated!