Smarter’s Origin Story
In mid-2023 we stumbled upon a feature of the OpenAI API, and more recently of a growing number of other LLMs, called "function calling." It gives prompt engineers a way to safely integrate private data into LLM prompts without divulging your broader data sets. Prompt response quality is noticeably better than with alternatives such as Retrieval-Augmented Generation (RAG) and fine-tuning, and importantly, this approach naturally protects your private data.
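For orientation, here is a minimal sketch of what function calling looks like with the OpenAI Python SDK. The function name, its parameters, and the model shown are hypothetical placeholders, not part of Smarter; only the function's output for a specific request is ever shared with the model, which is why the underlying data set stays private.

```python
# Minimal sketch of OpenAI function calling (illustrative only).
# The tool name and parameters are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_account_balance",  # hypothetical private-data lookup
            "description": "Look up a customer's current account balance.",
            "parameters": {
                "type": "object",
                "properties": {
                    "account_id": {
                        "type": "string",
                        "description": "Internal account identifier.",
                    },
                },
                "required": ["account_id"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is the balance on account 42?"}],
    tools=tools,
)

# If the model decides the function is relevant, it returns a tool call
# instead of a final answer; your code runs the function and sends the
# result back in a follow-up request.
print(response.choices[0].message.tool_calls)
```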
The problem is that using it requires advanced Python programming skills that most prompt engineers lack.
Moreover, adding function calls to LLM prompts drives up your token cost linearly: a prompt with 10 function calls can easily cost around 10x more than a simple prompt. Frustratingly, this means function calling can't be generalized, due to cost and I/O constraints. Lastly, analyzing and monitoring whether, how, and to what extent an LLM makes use of function calling in any given prompt is cumbersome.
These are the principal problems that Smarter strives to solve. But we also saw that large organizations around the world trying to leverage generative AI face additional requirements that, to put it mildly, can be challenging to satisfy:
Running at scale
Data privacy and security
Reporting
Logging and audit capabilities
Unified access to a range of different LLMs
Budgeting and cost controls
Meeting all of these challenges was a tall order that required many different technologies. To build an efficient software platform that avoids generative AI hallucination and prevents the unintended disclosure of corporate secrets to LLM providers, we created the Smarter.sh architecture. Three architectural concepts are unique to Smarter and require some orientation to get a prompt engineer up to speed: Smarter.sh Plugins, Smarter.sh Manifests, and the Smarter.sh CLI.
Smarter.sh Plugins
Smarter.sh Plugins extend the knowledge of your AI chatbot with your own facts. These might be static facts, facts generated via REST API calls to other software, or facts obtained from database queries.
Smarter.sh Manifests
You can author static facts using Manifests. A manifest specifies a set of facts in text, plus a set of prompt search terms that determines when those facts are used in the results of a chatbot prompt.
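As a rough illustration, a static-fact manifest might look something like the sketch below. This is not the authoritative schema; the field names here are hypothetical placeholders assuming a Kubernetes-style YAML layout, and the real structure is defined by the Smarter.sh documentation.

```yaml
# Illustrative sketch only -- field names are hypothetical; consult the
# Smarter.sh documentation for the actual manifest schema.
apiVersion: smarter.sh/v1
kind: Plugin
metadata:
  name: store-hours
  description: Static facts about our retail store hours.
spec:
  selector:
    searchTerms:          # prompts mentioning these terms pull in the facts below
      - "store hours"
      - "opening times"
      - "holiday schedule"
  data:
    staticData: |
      Our stores are open Monday through Saturday, 9am to 6pm local time.
      All locations are closed on national holidays.
```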
Smarter.sh CLI
You control your chatbot and interact with it using a command-line interface (CLI). CLI commands exist for each type of resource that Smarter.sh manages.
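As a rough sketch of the workflow, assuming a kubectl-style verb/resource command layout (the exact command names and flags may differ from the published CLI reference):

```bash
# Illustrative only -- commands and flags are assumptions based on a
# kubectl-style workflow, not a definitive Smarter CLI reference.
smarter apply -f store-hours-manifest.yaml   # create or update a Plugin from a manifest
smarter get plugins                          # list the Plugins in your account
smarter chat my-chatbot                      # open an interactive prompt session
```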