ConstraintsTranslator.jl

Documentation for ConstraintsTranslator.jl.

ConstraintsTranslator.GoogleLLM Type
```julia
GoogleLLM
```

Structure encapsulating the parameters for accessing the Google LLM API.

ConstraintsTranslator.GroqLLM Type
```julia
GroqLLM
```

Structure encapsulating the parameters for accessing the Groq LLM API.

ConstraintsTranslator.LlamaCppLLM Type
```julia
LlamaCppLLM
```

Structure encapsulating the parameters for accessing the llama.cpp server API.

  • api_key: an optional API key for accessing the server

  • model_id: a string identifier for the model to query. Unused, kept for API compatibility.

  • url: the URL of the llama.cpp server OpenAI API endpoint (e.g., http://localhost:8080)

NOTE: we do not apply the appropriate chat templates to the prompt. This must be handled either in an external code path or by the server.

ConstraintsTranslator.MetadataMessage Type
```julia
MetadataMessage
```

Represents the metadata information of a prompt template. The templates follow the specifications of PromptingTools.jl.

Fields

  • content::String: The content of the metadata message.

  • description::String: A description of the metadata message.

  • version::String: The version of the metadata message.

ConstraintsTranslator.Prompt Type
```julia
Prompt
```

Simple data structure encapsulating a system prompt and a user prompt for LLM generation.

Fields

  • system: the system prompt.

  • user: the user prompt.

ConstraintsTranslator.PromptTemplate Type
```julia
PromptTemplate
```

Represents a complete prompt template, comprising metadata, system, and user messages.

Fields

  • metadata::MetadataMessage: The metadata message of the prompt template.

  • system::SystemMessage: The system message of the prompt template.

  • user::UserMessage: The user message of the prompt template.

ConstraintsTranslator.SystemMessage Type
```julia
SystemMessage
```

Represents the prompt template of a system message. The template can optionally contain string placeholders enclosed in double curly braces, e.g., `{{variable}}`. Placeholders must be replaced with actual values when generating prompts.

Fields

  • content::String: The content of the system message.

  • variables::Vector{String}: A list of variables used in the system message.

ConstraintsTranslator.UserMessage Type
```julia
UserMessage
```

Represents the prompt template of a user message. The template can optionally contain string placeholders enclosed in double curly braces, e.g., `{{variable}}`. Placeholders must be replaced with actual values when generating prompts.

Fields

  • content::String: The content of the user message.

  • variables::Vector{String}: A list of variables used in the user message.

ConstraintsTranslator.check_syntax_errors Method
```julia
check_syntax_errors(s::String)
```

Parses the string s as Julia code. In the case of syntax errors, it returns the error message of the parser as a string. Otherwise, it returns an empty string.
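The behavior described above can be sketched with Base's own parser. The following is an illustrative reimplementation, not the package's actual code; the name check_syntax is hypothetical, and Meta.parse only examines the first expression in the string, so this is a simplification:

```julia
# Hypothetical sketch of a syntax check (not the package's actual
# implementation). With raise = false, Meta.parse returns an Expr with
# head :error or :incomplete instead of throwing on malformed input.
function check_syntax(s::String)
    ex = Meta.parse(s; raise = false)
    if ex isa Expr && ex.head in (:error, :incomplete)
        return string(ex)  # the parser's diagnostic, as a string
    end
    return ""  # no syntax error detected
end
```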

ConstraintsTranslator.edit_in_editor Method
```julia
edit_in_editor(s::String)
```

Edits the input string s in a temporary file using the Vim editor. Returns the modified string after the editor is closed.

ConstraintsTranslator.extract_structure Method
```julia
extract_structure(model::AbstractLLM, description::AbstractString)
```

Extracts the parameters, decision variables, and constraints of an optimization problem given a natural-language description. Returns Markdown-formatted text containing the above information.

ConstraintsTranslator.fix_syntax_errors Method
```julia
fix_syntax_errors(model::AbstractLLM, code::AbstractString, error::AbstractString)
```

Fixes syntax errors in the code by querying the Large Language Model `model`, based on an error produced by the Julia parser. Returns Markdown-formatted text containing the corrected code in a Julia code block.

ConstraintsTranslator.format_template Method
```julia
format_template(template::PromptTemplate; kwargs...)::Prompt
```

Formats a PromptTemplate by substituting all variables in the system and user messages with user-provided values.

Arguments

  • template::PromptTemplate: The prompt template containing metadata, system, and user messages.

  • kwargs...: A variable number of keyword arguments where keys are variable names and values are the corresponding replacements.

Returns

  • Prompt: A Prompt struct with the system and user messages containing the substituted values.

Raises

  • ErrorException: If any variables specified in the system or user templates are not present in the kwargs.

  • Warning: If there are extra kwargs that are not used in the templates.
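The substitution step can be illustrated with a minimal sketch. The function name fill_placeholders and its behavior are simplifications for illustration, not the package's actual code (in particular, it silently ignores missing or extra variables rather than raising):

```julia
# Hypothetical sketch of {{variable}} substitution (not the package's
# actual implementation). Each keyword argument replaces the matching
# double-curly-brace placeholder in the template text.
function fill_placeholders(template::String; kwargs...)
    out = template
    for (key, value) in kwargs
        out = replace(out, "{{" * String(key) * "}}" => string(value))
    end
    return out
end
```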

ConstraintsTranslator.get_completion Method
```julia
get_completion(llm::AbstractLLM, prompt::AbstractPrompt)
```

Returns a completion for a prompt using the llm model API.

ConstraintsTranslator.get_completion Method
```julia
get_completion(llm::OpenAILLM, prompt::Prompt)
```

Returns a completion for the given prompt using an OpenAI API compatible LLM.

ConstraintsTranslator.get_completion Method
```julia
get_completion(llm::GoogleLLM, prompt::Prompt)
```

Returns a completion for the given prompt using the Google Gemini LLM API.

ConstraintsTranslator.get_package_path Method
```julia
get_package_path()
```

Returns the absolute path of the root directory of ConstraintsTranslator.jl.

ConstraintsTranslator.jumpify_model Method
```julia
jumpify_model(model::AbstractLLM, description::AbstractString, examples::AbstractString)
```

Translates the natural-language description of an optimization problem into a JuMP constraint programming model to be solved with CBL by querying the Large Language Model `model`. The `examples` are snippets from ConstraintModels.jl used as in-context examples for the LLM. To work optimally, the model expects the description to be a structured, Markdown-formatted description such as those generated by extract_structure. Returns Markdown-formatted text containing Julia code in a code block.

ConstraintsTranslator.parse_code Method
```julia
parse_code(s::String)
```

Parses the code blocks in the input string s delimited by triple backticks and an optional language annotation. Returns a dictionary keyed by language. Code blocks of the same language are concatenated.
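A minimal sketch of this kind of extraction is shown below. It is illustrative only: the name extract_code_blocks, the regex, and the "plain" key for unannotated blocks are assumptions, not the package's actual code. The fence string is built programmatically to avoid embedding literal triple backticks in this example:

```julia
# Hypothetical sketch of fenced-code-block extraction (not the
# package's actual implementation).
const FENCE = "`"^3

function extract_code_blocks(s::String)
    blocks = Dict{String, String}()
    # Match a fence, an optional language annotation, a newline, then
    # the (non-greedy) block body up to the closing fence. The "s" flag
    # lets "." match newlines inside the block body.
    pat = Regex(FENCE * raw"([a-zA-Z]*)\n(.*?)" * FENCE, "s")
    for m in eachmatch(pat, s)
        lang = isempty(m.captures[1]) ? "plain" : String(m.captures[1])
        blocks[lang] = get(blocks, lang, "") * m.captures[2]
    end
    return blocks
end
```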

ConstraintsTranslator.read_template Method
```julia
read_template(data_path::String)
```

Reads a prompt template from a JSON file specified by data_path. The function parses the JSON data and constructs a PromptTemplate object containing metadata, system, and user messages. TODO: validate the JSON data against a schema to ensure it is valid before parsing.

Arguments

  • data_path::String: The path to the JSON file containing the prompt template.

Returns

  • PromptTemplate: A PromptTemplate structure encapsulating the metadata, system, and user messages.

Raises

  • ErrorException: if the template does not match the specification format.
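For orientation, a template file of this kind might look roughly as follows. This layout is an unverified assumption inferred from the message types above and from PromptingTools.jl conventions; consult the package's bundled templates for the authoritative format:

```json
[
    {
        "content": "Templates for translating optimization problems.",
        "description": "An example metadata entry.",
        "version": "1.0",
        "_type": "metadatamessage"
    },
    {
        "content": "You are an expert in optimization. {{instructions}}",
        "variables": ["instructions"],
        "_type": "systemmessage"
    },
    {
        "content": "Translate the following description: {{description}}",
        "variables": ["description"],
        "_type": "usermessage"
    }
]
```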

ConstraintsTranslator.stream_completion Method
```julia
stream_completion(llm::AbstractLLM, prompt::AbstractPrompt)
```

Returns a completion for a prompt using the llm model API. The completion is streamed to the terminal as it is generated.

ConstraintsTranslator.stream_completion Method
```julia
stream_completion(llm::OpenAILLM, prompt::Prompt)
```

Returns a completion for the given prompt using an OpenAI API compatible model. The completion is streamed to the terminal as it is generated.

ConstraintsTranslator.stream_completion Method
```julia
stream_completion(llm::GoogleLLM, prompt::Prompt)
```

Returns a completion for the given prompt using the Google Gemini LLM API. The completion is streamed to the terminal as it is generated.

ConstraintsTranslator.translate Method
```julia
translate(model::AbstractLLM, description::AbstractString; interactive::Bool = false)
```

Translates the natural-language description of an optimization problem into a Constraint Programming model by querying the Large Language Model `model`. If `interactive` is true, the user is prompted via the command line to inspect, and possibly modify, the intermediate outputs of the LLM.