Build Your AI-Powered

Features That Shine

Scoopika is an open-source system for building controllable, predictable,
AI-powered, context-aware products that let users
interact with their data in natural language.

Compatible with large AI players

Built for developers

Stop building the same f**king chatbot. Come build something special that works.

Function-calling that makes sense

We built a function-calling system that works with any LLM and gives you rules and configuration options to control each step in the process.

Wide range of configuration options for how each argument is treated.

Pre-execution custom validation steps, along with manual approval.

Your function will NEVER receive inputs it does not expect.
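As a rough illustration of what such pre-execution validation can look like (the type names and rule shapes below are invented for this sketch, not Scoopika's actual API): arguments the LLM made up are dropped, required arguments that are missing or of the wrong type block the call entirely.

```typescript
// Hypothetical sketch of per-argument rules checked before a tool runs.
type ArgRule = {
  type: "string" | "number" | "boolean";
  required?: boolean;
  default?: unknown;
  validate?: (value: unknown) => boolean; // custom pre-execution check
};

function validateArgs(
  rules: Record<string, ArgRule>,
  llmOutput: Record<string, unknown>
): { ok: boolean; args: Record<string, unknown> } {
  const args: Record<string, unknown> = {};
  for (const [name, rule] of Object.entries(rules)) {
    let value = llmOutput[name];
    if (value === undefined) {
      if (rule.default !== undefined) value = rule.default;
      else if (rule.required) return { ok: false, args: {} }; // refuse the call
      else continue;
    }
    if (typeof value !== rule.type) return { ok: false, args: {} };
    if (rule.validate && !rule.validate(value)) return { ok: false, args: {} };
    args[name] = value;
  }
  // Arguments the LLM invented (not present in the rules) are simply dropped.
  return { ok: true, args };
}
```

Only arguments declared in the rules ever reach the function, which is the guarantee the list above describes.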

Lower costs, better performance

Our cloud-based solution offers per-request pricing (start for free). Your LLM costs are much lower too: in function-calling, only the selected tool counts toward the context window, history is summarized and retrieved with RAG using vector stores, and everything is optimized for performance.

Ready APIs

Leave the heavy lifting to our APIs and focus on building your ideas with ease. The code below shows how to use Scoopika with function-calling:

import { Client, loadSecret } from "scoopika"
import { tools } from "./tools" // where the tools are defined

// Initialize the client
const client = new Client({
    token: "SCOOPIKA_TOKEN",
    tools: tools,
    llm: {
        host: "together", 
        api_key: loadSecret("TOGETHER_KEY"), // load from platform
        model: "mistralai/Mixtral-8x7B-Instruct-v0.1"
    },
})

// Invoke the client
client.invoke({
    input: "Look up most popular remixes for Get Lucky by Daft Punk",
})

// The invoke will run the tool "search-songs" and return its result
// arguments:
// title="Get Lucky"
// artists=["Daft Punk"]
// query="popular remixes for Get Lucky"
// order={"field": "views", "desc": true}

Runs anywhere

You can run Scoopika on our Serverless cloud platform with client libraries for both TypeScript and Python, or you can host Scoopika's core locally on your own.
Cloud (TypeScript & Python)
Local (Python)

Scoopika

Designed for function calling

Function calling is when an AI system is tasked to call a function from a set of
predefined functions based on the user's input and context.
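A minimal, library-agnostic sketch of that idea (the tool name and arguments here are invented for illustration, and the LLM's choice is hard-coded): the model is shown a set of tools, picks one, and emits arguments, and the application dispatches to the matching function.

```typescript
// A set of predefined functions the AI system can choose from.
const availableTools = {
  "search-songs": (args: { title: string; artists: string[] }) =>
    `results for ${args.title} by ${args.artists.join(", ")}`,
};

// Pretend this selection came back from the LLM based on the user's input:
const llmChoice = {
  tool: "search-songs" as const,
  arguments: { title: "Get Lucky", artists: ["Daft Punk"] },
};

// The application, not the LLM, actually executes the chosen function.
const result = availableTools[llmChoice.tool](llmChoice.arguments);
```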

But... why Scoopika?

The concept of "function-calling" is brilliant, but working with it is almost impossible in real products, usually because LLMs send unexpected argument values to the function, make up new arguments, or miss arguments entirely. That's not to mention other problems, like costs and tools taking up the whole context window. Scoopika works with ANY LLM and tries to solve all of these issues:

1. Multi-stage approach instead of feeding everything to a function-calling LLM at once.
2. A LOT of configuration options to control how the system should treat each argument, along with validation steps before executing the tool.
3. For history, it uses RAG and vector stores, and it takes a "user-actions" approach instead of "multi-round" function calling, which costs fewer tokens and is easier to work with.

85% isn't enough

Scoopika might not make the LLM better, but it makes the process better. Scoopika is the bridge between the LLM and your functions. The config options you set for each argument will not force the LLM to output what you expect; they give the system instructions on how to validate the LLM's output before calling the function. Your function will never receive a value you don't expect: if the system can't validate all required arguments (using the options you provide), it will not call the function, and in some cases it will ask about missing values instead. But what are the available options? Check this out:
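The call/ask/reject behaviour described above can be sketched roughly like this (the names and shapes below are hypothetical, not Scoopika's actual API): missing required values lead to a question back to the user, while values that fail validation block the call outright.

```typescript
// Hypothetical outcomes of checking LLM output against per-argument validators.
type Outcome =
  | { action: "call"; args: Record<string, unknown> }
  | { action: "ask"; missing: string[] }
  | { action: "reject"; reason: string };

function decide(
  required: Record<string, (v: unknown) => boolean>, // per-argument validators
  llmArgs: Record<string, unknown>
): Outcome {
  const missing: string[] = [];
  for (const [name, isValid] of Object.entries(required)) {
    const value = llmArgs[name];
    if (value === undefined) {
      missing.push(name); // don't guess; ask the user for it instead
      continue;
    }
    if (!isValid(value)) {
      return { action: "reject", reason: `invalid value for "${name}"` };
    }
  }
  if (missing.length > 0) return { action: "ask", missing };
  return { action: "call", args: llmArgs };
}
```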

Join us now

Scoopika v1.0 is coming soon. For the moment, you can join the waitlist
and check out more information about the project (see Products).
