CompositionalNetworks.jl
CompositionalNetworks.jl is a Julia package for Interpretable Compositional Networks (ICN), a variant of neural networks that, unlike regular artificial neural networks, produces interpretable results.
The current state of our ICN focuses on the composition of error functions for LocalSearchSolvers.jl, but it produces results independently of that package and can export them as Julia functions, human-readable output, or both.
How does it work?
The package comes with a basic ICN for learning global constraints. The ICN is composed of four layers: transformation, arithmetic, aggregation, and comparison. Each layer contains several operations that can be composed in various ways. Given a concept (a predicate over the variables' domains), a metric (hamming by default), and the variables' domains, we learn the binary weights of the ICN.
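To give an intuition of what a learned composition looks like, here is a minimal, purely illustrative sketch (not the package API): one hypothetical operation per layer, chained to map a configuration to a non-negative error.

# Illustrative sketch only: hypothetical stand-ins for one operation per layer.
tr_count_eq(x)            = [count(==(v), x) - 1 for v in x]  # transformation: count equal values
ar_identity(t)            = t                                 # arithmetic: identity here
ag_sum(t)                 = sum(t)                            # aggregation: reduce to a scalar
co_normalize(s; dom_size) = s / dom_size                      # comparison: normalize the error

icn_error(x; dom_size) = co_normalize(ag_sum(ar_identity(tr_count_eq(x))); dom_size = dom_size)

icn_error([1, 2, 3, 3]; dom_size = 4)  # > 0.0, the values are not all different
icn_error([1, 2, 3, 4]; dom_size = 4)  # 0.0, no error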
Installation
] add CompositionalNetworks
As the package is still in beta, some changes to the syntax and features are likely to occur. However, those changes should be minimal between minor versions. Please update with caution.
Quickstart
# 4 variables in 1:4
doms = [domain([1,2,3,4]) for i in 1:4]
# allunique concept (that is used to define the :all_different constraint)
err = explore_learn_compose(allunique, domains=doms)
# > interpretation: identity ∘ count_positive ∘ sum ∘ count_eq_left
# test our new error function
@assert err([1,2,3,3], dom_size = 4) > 0.0
# export an all_different function to file "current/path/test_dummy.jl"
compose_to_file!(allunique, "all_different", "test_dummy.jl"; domains=doms)
The output file should produce a function that can be used as follows (assuming the maximum domain size is 7):
import CompositionalNetworks
all_different([1,2,3,4,5,6,7]; dom_size = 7)
# > 0.0 (which means true, no errors)
Please see learn.jl in JuliaConstraints/Constraints.jl for an extensive example of ICN learning and composition.
Public interface
CompositionalNetworks.ICN — Type
ICN(; nvars, dom_size, param, transformation, arithmetic, aggregation, comparison)
Construct an Interpretable Compositional Network with the following arguments:
nvars: number of variables in the constraint
dom_size: maximum domain size of any variable in the constraint
param: optional parameter (defaults to nothing)
transformation: a transformation layer (optional)
arithmetic: an arithmetic layer (optional)
aggregation: an aggregation layer (optional)
comparison: a comparison layer (optional)
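A minimal sketch of constructing an ICN with the default layers, assuming the keyword constructor documented above:

# Sketch: an ICN for a constraint over 4 variables whose domains have at most 4 values.
using CompositionalNetworks
icn = ICN(nvars = 4, dom_size = 4)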
CompositionalNetworks.aggregation_layer — Method
aggregation_layer()
Generate the layer of aggregations of the ICN. The operations are mutually exclusive, that is, only one will be selected.
CompositionalNetworks.arithmetic_layer — Method
arithmetic_layer()
Generate the layer of arithmetic operations of the ICN. The operations are mutually exclusive, that is, only one will be selected.
CompositionalNetworks.comparison_layer — Function
comparison_layer(param = false)
Generate the layer of comparison functions of the ICN. If a param value is set, also include all the parametric comparisons with that value. The operations are mutually exclusive, that is, only one will be selected.
CompositionalNetworks.compose — Method
compose(icn)
compose(icn, weights)
Return a function composed of some of the operations of a given ICN. It can be applied to any vector of variables. If weights are given, they will be assigned to icn.
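A small sketch of composing and evaluating the error function of an already-optimized ICN; the dom_size keyword mirrors the Quickstart usage:

# Sketch: `icn` is assumed to hold learned weights (e.g. after optimize!).
err = compose(icn)
err([1, 2, 3, 3]; dom_size = 4)  # evaluate the composed error on a configuration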
CompositionalNetworks.compose_to_file! — Function
compose_to_file!(icn::ICN, name, path, language = :Julia)
Compose a string that mathematically describes the composition of an ICN and write it to a file.
Arguments:
icn: a given compositional network with a learned composition
name: name of the composition
path: path of the output file
language: targeted programming language
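A sketch of exporting a learned composition directly from an ICN object; the file name is illustrative and icn is assumed to hold learned weights:

# Sketch: export the learned composition of `icn` as a Julia function.
compose_to_file!(icn, "all_different", "all_different.jl", :Julia)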
CompositionalNetworks.compose_to_file! — Method
compose_to_file!(concept, name, path; domains, param = nothing, language = :Julia, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200)
Explore, learn, and compose a function and write it to a file.
Arguments:
concept: the concept to learn
name: the name to give to the constraint
path: path of the output file
Keyword arguments:
domains: domains that define the search space
param: an optional parameter of the constraint
language: the language to export to, defaults to :Julia
search: either :partial or :complete search
global_iter: number of learning iterations
local_iter: number of generations in the genetic algorithm
metric: the metric used to measure the distance between a configuration and known solutions
popSize: size of the population in the genetic algorithm
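A sketch of the concept-based method with a few keyword arguments tuned, reusing the domains from the Quickstart (file name and values are illustrative):

# Sketch: learn the allunique concept with a partial search and export it.
compose_to_file!(allunique, "all_different", "all_different.jl";
                 domains = doms, search = :partial, local_iter = 50)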
CompositionalNetworks.explore_learn_compose — Method
explore_learn_compose(concept; domains, param = nothing, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200, action = :composition)
Explore a search space, learn a composition from an ICN, and compose an error function.
Arguments:
concept: the concept of the targeted constraint
domains: domains of the variables that define the training space
param: an optional parameter of the constraint
search: either :partial or :complete search
global_iter: number of learning iterations
local_iter: number of generations in the genetic algorithm
metric: the metric used to measure the distance between a configuration and known solutions
popSize: size of the population in the genetic algorithm
action: either :symbols to get a description of the composition or :composition to get the composed function itself
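A sketch contrasting the two action values documented above, reusing the domains from the Quickstart:

# Sketch: get the composed error function (default) or just the operation symbols.
err  = explore_learn_compose(allunique; domains = doms)
syms = explore_learn_compose(allunique; domains = doms, action = :symbols)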
CompositionalNetworks.hamming — Method
hamming(x, X)
Compute the Hamming distance of x over a collection of solutions X, i.e. the minimal number of variables to switch in x to reach a solution.
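A small worked example of the definition above, with an arbitrarily chosen set of solutions:

# Sketch: [1,2,3,3] differs from the closest solution [1,2,3,4] by one variable.
X_sols = [[1, 2, 3, 4], [4, 3, 2, 1]]
hamming([1, 2, 3, 3], X_sols)  # 1
hamming([1, 2, 3, 4], X_sols)  # 0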
CompositionalNetworks.lazy — Method
lazy(funcs::Function...)
Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V; param = nothing).
CompositionalNetworks.lazy_param — Method
lazy_param(funcs::Function...)
Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V; param).
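A tentative sketch of the expected component-wise signature; the name co_abs_diff is made up, and the exact vector-wide method that lazy generates is not shown here:

# Sketch only: a component-wise function matching the documented signature.
co_abs_diff(i::Int, x::AbstractVector; param = nothing) = abs(x[i] - i)
lazy(co_abs_diff)  # extend it to operate on a whole vector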
CompositionalNetworks.learn_compose — Function
learn_compose(;
    nvars, dom_size, param=nothing, icn=ICN(nvars, dom_size, param),
    X, X_sols, global_iter=100, local_iter=100, metric=hamming, popSize=200
)
Create an ICN, optimize it, and return its composition.
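A sketch under the assumption that X is a set of sampled configurations and X_sols the subset satisfying the concept; the data here is made up and tiny:

# Sketch: a toy training set for the allunique concept over 4 variables in 1:4.
X      = [[1, 2, 3, 4], [4, 3, 2, 1], [1, 1, 2, 3], [2, 2, 2, 2]]
X_sols = filter(allunique, X)
composition = learn_compose(; nvars = 4, dom_size = 4, X = X, X_sols = X_sols)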
CompositionalNetworks.manhattan — Method
manhattan(x, X)
Compute the Manhattan distance of x over a collection of solutions X.
CompositionalNetworks.minkowski — Method
minkowski(x, X, p)
Compute the Minkowski distance of order p of x over a collection of solutions X.
CompositionalNetworks.optimize! — Function
optimize!(icn, X, X_sols, global_iter, local_iter; metric=hamming, popSize=100)
Optimize and set the weights of an ICN with a given set of configurations X and solutions X_sols. The best weights among global_iter runs will be set.
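A sketch reusing the toy X and X_sols from the learn_compose example above, with small iteration counts:

# Sketch: optimize the weights of a fresh ICN on the toy data above.
icn = ICN(nvars = 4, dom_size = 4)
optimize!(icn, X, X_sols, 10, 100; metric = hamming)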
CompositionalNetworks.regularization — Method
regularization(icn)
Return the regularization value of an ICN's weights, which is proportional to the normalized number of operations selected in the ICN layers.
CompositionalNetworks.show_composition — Method
show_composition(icn)
Return the composition (weights) of an ICN.
CompositionalNetworks.show_layers — Method
show_layers(icn)
Return a formatted string with each layer in the ICN.
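A small sketch of inspecting a learned ICN with the two helpers above:

# Sketch: print the selected operations and the layer structure of `icn`.
println(show_composition(icn))
println(show_layers(icn))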
CompositionalNetworks.transformation_layer — Function
transformation_layer(param = false)
Generate the layer of transformation functions of the ICN. If the param value is true, also include all the parametric transformations.
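Finally, a sketch of generating the four layers explicitly and passing them to the ICN constructor, assuming the keyword constructor documented above:

# Sketch: build an ICN from explicitly generated layers (no parametric operations).
icn = ICN(
    nvars = 4,
    dom_size = 4,
    transformation = transformation_layer(),
    arithmetic = arithmetic_layer(),
    aggregation = aggregation_layer(),
    comparison = comparison_layer(),
)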