CompositionalNetworks.jl

CompositionalNetworks.jl is a Julia package for Interpretable Compositional Networks (ICN), a variant of neural networks that, unlike regular artificial neural networks, gives the user interpretable results.

The current state of our ICN focuses on composing error functions for LocalSearchSolvers.jl, but it produces results independently of that package and can export them as Julia functions, human-readable descriptions, or both.

How does it work?

The package comes with a basic ICN for learning global constraints. The ICN is composed of 4 layers: transformation, arithmetic, aggregation, and comparison. Each contains several operations that can be composed in various ways. Given a concept (a predicate over the variables' domains), a metric (hamming by default), and the variables' domains, we learn the binary weights of the ICN.
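
To make the pipeline concrete, here is a hand-written illustration that composes one operation from each layer into an error function for the allunique concept. The helper functions below are stand-ins named after the interpretation shown in the quickstart below; they are not the operations actually defined by the package.

# illustrative stand-ins for one operation per ICN layer
count_eq_left(x) = [count(==(x[i]), x[1:i-1]) for i in eachindex(x)]  # transformation
ar_sum(layers...) = reduce((a, b) -> a .+ b, layers)                  # arithmetic
count_positive(t) = count(>(0), t)                                    # aggregation
identity_comparison(a; dom_size, nvars) = a                           # comparison

# composition: comparison ∘ aggregation ∘ arithmetic ∘ transformation
err_sketch(x; dom_size) =
    identity_comparison(count_positive(ar_sum(count_eq_left(x)));
                        dom_size, nvars = length(x))

err_sketch([1, 2, 3, 3]; dom_size = 4)  # 1: two variables share a value, so the error is positive
err_sketch([1, 2, 3, 4]; dom_size = 4)  # 0: a solution of all_different has no error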

Installation

] add CompositionalNetworks

As the package is in a beta version, some changes in the syntax and features are likely to occur. However, those changes should be minimal between minor versions. Please update with caution.
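
If you prefer not to use the Pkg REPL mode, the equivalent programmatic installation is:

using Pkg
Pkg.add("CompositionalNetworks")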

Quickstart

# 4 variables in 1:4
doms = [domain([1,2,3,4]) for i in 1:4]

# allunique concept (that is used to define the :all_different constraint)
err = explore_learn_compose(allunique, domains=doms)
# > interpretation: identity ∘ count_positive ∘ sum ∘ count_eq_left

# test our new error function
@assert err([1,2,3,3], dom_size = 4) > 0.0

# export an all_different function to file "current/path/test_dummy.jl"
compose_to_file!(allunique, "all_different", "test_dummy.jl"; domains = doms)

The output file should produce a function that can be used as follows (assuming the maximum domain size is 7):

import CompositionalNetworks

# load the generated composition (written above to "test_dummy.jl")
include("test_dummy.jl")

all_different([1,2,3,4,5,6,7]; dom_size = 7)
# > 0.0 (which means true, no errors)

Please see JuliaConstraints/Constraints.jl/learn.jl for an extensive example of ICN learning and compositions.

Public interface

CompositionalNetworks.ICN (Type)
ICN(; nvars, dom_size, param, transformation, arithmetic, aggregation, comparison)

Construct an Interpretable Compositional Network with the following arguments:

  • nvars: number of variables in the constraint
  • dom_size: maximum domain size of any variable in the constraint
  • param: optional parameter (defaults to nothing)
  • transformation: a transformation layer (optional)
  • arithmetic: an arithmetic layer (optional)
  • aggregation: an aggregation layer (optional)
  • comparison: a comparison layer (optional)
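
As a minimal sketch, assuming the layer keywords fall back to default layers when omitted, an ICN for 4 variables with domains of size at most 4 would be built as:

using CompositionalNetworks

# assumes default layers are generated when the layer keywords are omitted
icn = ICN(nvars = 4, dom_size = 4)
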
CompositionalNetworks.arithmetic_layer (Method)
arithmetic_layer()

Generate the layer of arithmetic operations of the ICN. The operations are mutually exclusive, that is, only one will be selected.

CompositionalNetworks.comparison_layer (Function)
comparison_layer(param = false)

Generate the layer of comparison functions of the ICN. If a param value is set, also includes all the parametric comparisons with that value. The operations are mutually exclusive, that is, only one will be selected.

CompositionalNetworks.compose (Method)
compose(icn)
compose(icn, weights)

Return a function composed from some of the operations of a given ICN. It can be applied to any vector of variables. If weights are given, they are assigned to icn first.
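
A minimal usage sketch, assuming icn is an ICN whose weights have already been learned (for instance with optimize! or learn_compose below):

# `icn` is assumed to be a trained ICN (see optimize! and learn_compose below)
err = compose(icn)
err([1,2,3,3]; dom_size = 4)  # > 0 when the configuration violates the learned concept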

CompositionalNetworks.compose_to_file! (Function)
compose_to_file!(icn::ICN, name, path, language = :Julia)

Compose a string that describes mathematically the composition of an ICN and write it to a file.

Arguments:

  • icn: a given compositional network with a learned composition
  • name: name of the composition
  • path: path of the output file
  • language: targeted programming language
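
For instance, assuming icn holds a learned composition, the following would export it as a Julia function named all_different (the output path is hypothetical):

# `icn` is assumed to be a trained ICN
compose_to_file!(icn, "all_different", "all_different.jl")
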
CompositionalNetworks.compose_to_file! (Method)
compose_to_file!(concept, name, path; domains, param = nothing, language = :Julia, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200)

Explore, learn and compose a function and write it to a file.

Arguments:

  • concept: the concept to learn
  • name: the name to give to the constraint
  • path: path of the output file

Keyword arguments:

  • domains: domains that define the search space
  • param: an optional parameter of the constraint
  • language: the language to export to (defaults to :Julia)
  • search: either :partial or :complete search
  • global_iter: number of learning iterations
  • local_iter: number of generations in the genetic algorithm
  • metric: the metric to measure the distance between a configuration and known solutions
  • popSize: size of the population in the genetic algorithm
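
For example, to learn the allunique concept and export the resulting composition as a Julia function named all_different (building the training domains with domain as in the quickstart; the output path is hypothetical):

doms = [domain([1,2,3,4]) for _ in 1:4]
compose_to_file!(allunique, "all_different", "all_different.jl"; domains = doms)
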
CompositionalNetworks.explore_learn_compose (Method)
explore_learn_compose(concept; domains, param = nothing, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200, action = :composition)

Explore a search space, learn a composition from an ICN, and compose an error function.

Arguments:

  • concept: the concept of the targeted constraint
  • domains: domains of the variables that define the training space
  • param: an optional parameter of the constraint
  • search: either :partial or :complete search
  • global_iter: number of learning iterations
  • local_iter: number of generations in the genetic algorithm
  • metric: the metric to measure the distance between a configuration and known solutions
  • popSize: size of the population in the genetic algorithm
  • action: either :symbols to have a description of the composition or :composition to have the composed function itself
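
For example, reusing the quickstart setting:

doms = [domain([1,2,3,4]) for _ in 1:4]

# the default action = :composition returns the composed error function
err = explore_learn_compose(allunique, domains = doms)

# action = :symbols returns a description of the learned composition instead
symbols = explore_learn_compose(allunique, domains = doms, action = :symbols)
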
CompositionalNetworks.hamming (Method)
hamming(x, X)

Compute the Hamming distance of x over a collection of solutions X, i.e. the minimal number of variables to switch in x to reach a solution.
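
For instance, with a toy collection of solutions (the container used here is only illustrative):

x = [1,2,3,3]
X = [[1,2,3,4], [4,3,2,1]]  # toy collection of solutions
hamming(x, X)  # 1: switching the last variable of x yields the solution [1,2,3,4]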

CompositionalNetworks.lazy (Method)
lazy(funcs::Function...)

Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V; param = nothing).

CompositionalNetworks.lazy_param (Method)
lazy_param(funcs::Function...)

Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V; param).

CompositionalNetworks.learn_compose (Function)
learn_compose(;
    nvars, dom_size, param=nothing, icn=ICN(nvars, dom_size, param),
    X, X_sols, global_iter=100, local_iter=100, metric=hamming, popSize=200
)

Create an ICN, optimize it, and return its composition.
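
A self-contained sketch, under the assumption that X is a collection of configurations and X_sols the subset of X satisfying the target concept:

using CompositionalNetworks

# hypothetical training data for allunique over 4 variables in 1:4
X = vec([[i, j, k, l] for i in 1:4, j in 1:4, k in 1:4, l in 1:4])
X_sols = filter(allunique, X)

compo = learn_compose(nvars = 4, dom_size = 4, X = X, X_sols = X_sols)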

CompositionalNetworks.optimize! (Function)
optimize!(icn, X, X_sols, global_iter, local_iter; metric=hamming, popSize=100)

Optimize and set the weights of an ICN with a given set of configurations X and solutions X_sols. The best weights among global_iter runs will be set.
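
A lower-level sketch, under the same data assumptions as the learn_compose example above:

using CompositionalNetworks

# hypothetical training data: configurations X and the solutions X_sols among them
X = vec([[i, j, k, l] for i in 1:4, j in 1:4, k in 1:4, l in 1:4])
X_sols = filter(allunique, X)

icn = ICN(nvars = 4, dom_size = 4)
optimize!(icn, X, X_sols, 10, 100; metric = hamming, popSize = 100)
err = compose(icn)  # retrieve the learned error function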

CompositionalNetworks.regularization (Method)
regularization(icn)

Return the regularization value of an ICN's weights, which is proportional to the normalized number of operations selected in the ICN layers.

CompositionalNetworks.transformation_layer (Function)
transformation_layer(param = false)

Generate the layer of transformation functions of the ICN. If param is true, also includes all the parametric transformations.
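
For example:

transformation_layer()      # non-parametric transformations only
transformation_layer(true)  # also include the parametric transformations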