ConstraintsTranslator.jl
Documentation for ConstraintsTranslator.jl.
ConstraintsTranslator.GoogleLLM Type
GoogleLLM
Structure encapsulating the parameters for accessing the Google LLM API.
- `api_key`: an API key for accessing the Google Gemini API (https://ai.google.dev/gemini-api/docs/), read from the environment variable `GOOGLE_API_KEY`.
- `model_id`: a string identifier for the model to query. See https://ai.google.dev/gemini-api/docs/models/gemini for the list of available models.
- `url`: the URL for chat completions. Defaults to `https://generativelanguage.googleapis.com/v1beta/models/`.
ConstraintsTranslator.GroqLLM Type
GroqLLM
Structure encapsulating the parameters for accessing the Groq LLM API.
- `api_key`: an API key for accessing the Groq API (https://groq.com), read from the environment variable `GROQ_API_KEY`.
- `model_id`: a string identifier for the model to query. See https://console.groq.com/docs/models for the list of available models.
- `url`: the URL for chat completions. Defaults to `https://api.groq.com/openai/v1/chat/completions`.
ConstraintsTranslator.LlamaCppLLM Type
LlamaCppLLM
Structure encapsulating the parameters for accessing the llama.cpp server API.
- `api_key`: an optional API key for accessing the server.
- `model_id`: a string identifier for the model to query. Unused, kept for API compatibility.
- `url`: the URL of the llama.cpp server OpenAI API endpoint (e.g., http://localhost:8080).
NOTE: we do not apply the appropriate chat templates to the prompt. This must be handled either in an external code path or by the server.
ConstraintsTranslator.MetadataMessage Type
MetadataMessage
Represents the metadata information of a prompt template. The templates follow the specifications of PromptingTools.jl.
Fields
- `content::String`: the content of the metadata message.
- `description::String`: a description of the metadata message.
- `version::String`: the version of the metadata message.
ConstraintsTranslator.Prompt Type
Prompt
Simple data structure encapsulating a system prompt and a user prompt for LLM generation.
Fields
- `system`: the system prompt.
- `user`: the user prompt.
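As a sketch, a struct matching this description could be defined as follows. This is illustrative only (the name `PromptSketch` is hypothetical); the actual definition lives in the package.

```julia
# Illustrative stand-in for the package's Prompt type; the field names
# follow the documentation above, but this is not the package's code.
struct PromptSketch
    system::String  # the system prompt
    user::String    # the user prompt
end

p = PromptSketch("You are an expert in constraint programming.",
                 "Model the N-queens problem.")
```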
ConstraintsTranslator.PromptTemplate Type
PromptTemplate
Represents a complete prompt template, comprising metadata, system, and user messages.
Fields
- `metadata::MetadataMessage`: the metadata message of the prompt template.
- `system::SystemMessage`: the system message of the prompt template.
- `user::UserMessage`: the user message of the prompt template.
ConstraintsTranslator.SystemMessage Type
SystemMessage
Represents the prompt template of a system message. The template can optionally contain string placeholders enclosed in double curly braces, e.g., `{{name}}`. Placeholders must be replaced with actual values when generating prompts.
Fields
- `content::String`: the content of the system message.
- `variables::Vector{String}`: a list of variables used in the system message.
ConstraintsTranslator.UserMessage Type
UserMessage
Represents the prompt template of a user message. The template can optionally contain string placeholders enclosed in double curly braces, e.g., `{{name}}`. Placeholders must be replaced with actual values when generating prompts.
Fields
- `content::String`: the content of the user message.
- `variables::Vector{String}`: a list of variables used in the user message.
ConstraintsTranslator.check_syntax_errors Method
check_syntax_errors(s::String)
Parses the string `s` as Julia code. In the case of syntax errors, it returns the error message of the parser as a string. Otherwise, it returns an empty string.
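The behavior described above can be approximated with Base's own parser. The following is a hedged sketch, not the package's implementation, using `Meta.parse` with `raise = false`, which returns an `Expr` with head `:error` or `:incomplete` instead of throwing:

```julia
# Sketch of a syntax check along the lines described above; not the
# package's actual code. Checks a single Julia expression: on success
# it returns "", otherwise the parser's error message as a string.
function check_syntax_sketch(s::String)
    expr = Meta.parse(s; raise = false)
    if expr isa Expr && expr.head in (:error, :incomplete)
        return string(expr.args[1])
    end
    return ""
end
```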
ConstraintsTranslator.edit_in_editor Method
edit_in_editor(s::String)
Edits the input string `s` in a temporary file using the Vim editor. Returns the modified string after the editor is closed.
ConstraintsTranslator.extract_structure Method
extract_structure(model::AbstractLLM, description::AbstractString)
Extracts the parameters, decision variables, and constraints of an optimization problem given a natural-language `description`. Returns Markdown-formatted text containing the above information.
ConstraintsTranslator.fix_syntax_errors Method
fix_syntax_errors(model::AbstractLLM, code::AbstractString, error::AbstractString)
Fixes syntax errors in the `code` by querying the Large Language Model `model`, based on an `error` produced by the Julia parser. Returns Markdown-formatted text containing the corrected code in a Julia code block.
ConstraintsTranslator.format_template Method
format_template(template::PromptTemplate; kwargs...)::Prompt
Formats a `PromptTemplate` by substituting all variables in the system and user messages with user-provided values.
Arguments
- `template::PromptTemplate`: the prompt template containing metadata, system, and user messages.
- `kwargs...`: a variable number of keyword arguments where keys are variable names and values are the corresponding replacements.
Returns
- `Prompt`: a `Prompt` struct with the system and user messages containing the substituted values.
Raises
- `ErrorException`: if any variables specified in the system or user templates are not present in the `kwargs`.
- `Warning`: if there are extra `kwargs` that are not used in the templates.
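The substitution mechanics can be sketched as follows. `fill_placeholders` is a hypothetical helper shown for illustration, not the package API:

```julia
# Hypothetical helper demonstrating {{variable}} substitution as described
# above; not the package's actual implementation. Each keyword argument
# replaces the placeholder of the same name in the template string.
function fill_placeholders(template::String; kwargs...)
    out = template
    for (key, value) in kwargs
        out = replace(out, "{{$(key)}}" => string(value))
    end
    return out
end
```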
ConstraintsTranslator.get_completion Method
get_completion(llm::AbstractLLM, prompt::AbstractPrompt)
Returns a completion for a `prompt` using the `llm` model API.
ConstraintsTranslator.get_completion Method
get_completion(llm::OpenAILLM, prompt::Prompt)
Returns a completion for the given prompt using an OpenAI API-compatible LLM.
ConstraintsTranslator.get_completion Method
get_completion(llm::GoogleLLM, prompt::Prompt)
Returns a completion for the given prompt using the Google Gemini LLM API.
ConstraintsTranslator.get_package_path Method
get_package_path()
Returns the absolute path of the root directory of ConstraintsTranslator.jl.
ConstraintsTranslator.jumpify_model Method
jumpify_model(model::AbstractLLM, description::AbstractString, examples::AbstractString)
Translates the natural-language `description` of an optimization problem into a JuMP constraint programming model to be solved with CBL by querying the Large Language Model `model`. The `examples` are snippets from ConstraintModels.jl used as in-context examples for the LLM. To work optimally, the model expects the `description` to be a structured, Markdown-formatted description such as those generated by `extract_structure`. Returns Markdown-formatted text containing Julia code in a code block.
ConstraintsTranslator.parse_code Method
parse_code(s::String)
Parses the code blocks in the input string `s` delimited by triple backticks and an optional language annotation. Returns a dictionary keyed by language. Code blocks from the same language are concatenated.
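A minimal sketch of this kind of extraction (illustrative only, not the package's implementation) is:

```julia
# Illustrative sketch of extracting fenced code blocks keyed by language;
# blocks with the same language tag are concatenated, as described above.
# Untagged blocks are filed under "plain" here (an arbitrary choice).
function extract_code_blocks(s::String)
    blocks = Dict{String, String}()
    for m in eachmatch(r"```(\w*)\n(.*?)```"s, s)
        lang = isempty(m.captures[1]) ? "plain" : m.captures[1]
        blocks[lang] = get(blocks, lang, "") * m.captures[2]
    end
    return blocks
end
```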
ConstraintsTranslator.read_template Method
read_template(data_path::String)
Reads a prompt template from a JSON file specified by `data_path`. The function parses the JSON data and constructs a `PromptTemplate` object containing metadata, system, and user messages. TODO: validate the JSON data against a schema to ensure it is valid before parsing.
Arguments
- `data_path::String`: the path to the JSON file containing the prompt template.
Returns
- `PromptTemplate`: a `PromptTemplate` structure encapsulating the metadata, system, and user messages.
Raises
- `ErrorException`: if the template does not match the specification format.
ConstraintsTranslator.stream_completion Method
stream_completion(llm::AbstractLLM, prompt::AbstractPrompt)
Returns a completion for a `prompt` using the `llm` model API. The completion is streamed to the terminal as it is generated.
ConstraintsTranslator.stream_completion Method
stream_completion(llm::OpenAILLM, prompt::Prompt)
Returns a completion for the given prompt using an OpenAI API-compatible model. The completion is streamed to the terminal as it is generated.
ConstraintsTranslator.stream_completion Method
stream_completion(llm::GoogleLLM, prompt::Prompt)
Returns a completion for the given prompt using the Google Gemini LLM API. The completion is streamed to the terminal as it is generated.
ConstraintsTranslator.translate Method
translate(model::AbstractLLM, description::AbstractString; interactive::Bool = false)
Translates the natural-language `description` of an optimization problem into a Constraint Programming model by querying the Large Language Model `model`. If `interactive` is `true`, the user will be prompted via the command line to inspect the intermediate outputs of the LLM and possibly modify them.