CompositionalNetworks.jl
CompositionalNetworks.jl is a Julia package for Interpretable Compositional Networks (ICN), a variant of neural networks that gives the user interpretable results, unlike regular artificial neural networks.
The current state of our ICN focuses on the composition of error functions for LocalSearchSolvers.jl, but it produces results independently of that package and can export them as Julia functions, as human-readable output, or both.
How does it work?
The package comes with a basic ICN for learning global constraints. The ICN is composed of 4 layers: transformation, arithmetic, aggregation, and comparison. Each layer contains several operations that can be composed in various ways. Given a concept (a predicate over the variables' domains), a metric (hamming by default), and the variables' domains, we learn the binary weights of the ICN.
Installation
```julia
] add CompositionalNetworks
```
As the package is in a beta version, some changes in the syntax and features are likely to occur. However, those changes should be minimal between minor versions. Please update with caution.
Quickstart
```julia
using CompositionalNetworks

# 4 variables in 1:4
doms = [domain([1, 2, 3, 4]) for i in 1:4]

# allunique concept (used to define the :all_different constraint)
err = explore_learn_compose(allunique; domains = doms)
# > interpretation: identity ∘ count_positive ∘ sum ∘ count_eq_left

# test our new error function
@assert err([1, 2, 3, 3]; dom_size = 4) > 0.0

# export an all_different function to the file "current/path/test_dummy.jl"
compose_to_file!(allunique, "all_different", "test_dummy.jl"; domains = doms)
```
The output file should produce a function that can be used as follows (assuming a maximum domain size of 7):
```julia
import CompositionalNetworks

# Load the exported function, then evaluate it on a configuration.
include("test_dummy.jl")
all_different([1, 2, 3, 4, 5, 6, 7]; dom_size = 7)
# > 0.0 (which means true, no errors)
```
Please see JuliaConstraints/Constraints.jl/learn.jl for an extensive example of ICN learning and composition.
Public interface
CompositionalNetworks.Composition — Type
struct Composition{F<:Function}
Store all the information of a composition learned by an ICN.
CompositionalNetworks.Composition — Method
Composition(f::F, symbols) where {F<:Function}
Construct a Composition.
CompositionalNetworks.ICN — Type
ICN(; nvars, dom_size, param, transformation, arithmetic, aggregation, comparison)
Construct an Interpretable Compositional Network with the following arguments:
- nvars: number of variables in the constraint
- dom_size: maximum domain size of any variable in the constraint
- param: optional parameter (defaults to nothing)
- transformation: a transformation layer (optional)
- arithmetic: an arithmetic layer (optional)
- aggregation: an aggregation layer (optional)
- comparison: a comparison layer (optional)
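As a minimal sketch (the variable count and domain size below are illustrative), an ICN can be built from the layer generators documented in this section and inspected with show_layers and nbits:

```julia
using CompositionalNetworks

# Build an ICN for a constraint over 4 variables with domains of size at most 4.
# The layer keywords are optional; they are passed explicitly here for clarity.
icn = ICN(;
    nvars = 4,
    dom_size = 4,
    transformation = transformation_layer(),
    arithmetic = arithmetic_layer(),
    aggregation = aggregation_layer(),
    comparison = comparison_layer(),
)

# Inspect the generated layers and the number of binary weights to learn.
println(show_layers(icn))
println(nbits(icn))
```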
CompositionalNetworks.aggregation_layer — Method
aggregation_layer()
Generate the layer of aggregations of the ICN. The operations are mutually exclusive, that is, only one will be selected.
CompositionalNetworks.arithmetic_layer — Method
arithmetic_layer()
Generate the layer of arithmetic operations of the ICN. The operations are mutually exclusive, that is, only one will be selected.
CompositionalNetworks.code — Function
code(c::Composition, lang=:maths; name="composition")
Access the code of a composition c in a given language lang. The name of the generated method is optional.
CompositionalNetworks.comparison_layer — Function
comparison_layer(param = false)
Generate the layer of comparison functions of the ICN. If the param value is set, also includes all the parametric comparisons with that value. The operations are mutually exclusive, that is, only one will be selected.
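As a hedged sketch of the param switch described above (the parameter value 3 is purely illustrative):

```julia
using CompositionalNetworks

# Default layers: no parametric operations included.
transfo = transformation_layer()
compar = comparison_layer()

# Also include parametric operations; the transformation layer takes a Boolean
# switch, while the comparison layer takes the parameter value itself.
transfo_p = transformation_layer(true)
compar_p = comparison_layer(3)
```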
CompositionalNetworks.compose — Function
compose(icn, weigths=nothing)
Return a function composed of some of the operations of a given ICN. It can be applied to any vector of variables. If weigths is given, it is assigned to icn.
CompositionalNetworks.compose_to_file! — Method
compose_to_file!(concept, name, path; domains, param = nothing, language = :Julia, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200)
Explore, learn, and compose a function, then write it to a file.
Arguments:
- concept: the concept to learn
- name: the name to give to the constraint
- path: path of the output file
Keyword arguments:
- domains: domains that define the search space
- param: an optional parameter of the constraint
- language: the language to export to, defaults to :Julia
- search: either :partial or :complete search
- global_iter: number of learning iterations
- local_iter: number of generations in the genetic algorithm
- metric: the metric to measure the distance between a configuration and known solutions
- popSize: size of the population in the genetic algorithm
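A sketch of a call matching the signature above; the concept, constraint name, and output path are illustrative:

```julia
using CompositionalNetworks

# Domains as in the Quickstart.
domains = [domain([1, 2, 3, 4]) for _ in 1:4]

# Learn the allunique concept on these domains and write the resulting
# error function, named all_different, to a file in the current directory.
compose_to_file!(
    allunique,
    "all_different",
    joinpath(pwd(), "all_different.jl");
    domains = domains,
)
```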
CompositionalNetworks.composition — Method
composition(c::Composition)
Access the actual method of an ICN composition c.
CompositionalNetworks.composition_to_file! — Function
composition_to_file!(c::Composition, path, name, language=:Julia)
Write the composition code in a given language into a file at path.
CompositionalNetworks.explore_learn_compose — Method
explore_learn_compose(concept; domains, param = nothing, search = :complete, global_iter = 10, local_iter = 100, metric = hamming, popSize = 200, action = :composition)
Explore a search space, learn a composition from an ICN, and compose an error function.
Arguments:
- concept: the concept of the targeted constraint
- domains: domains of the variables that define the training space
- param: an optional parameter of the constraint
- search: either :flexible, :partial, or :complete search. A flexible search uses search_limit and solutions_limit to determine whether the search space needs to be partially or completely explored
- global_iter: number of learning iterations
- local_iter: number of generations in the genetic algorithm
- metric: the metric to measure the distance between a configuration and known solutions
- popSize: size of the population in the genetic algorithm
- action: either :symbols to get a description of the composition or :composition to get the composed function itself
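A sketch of the action keyword, reusing the Quickstart setup (with action = :symbols the return value is a description of the composition rather than a callable):

```julia
using CompositionalNetworks

domains = [domain([1, 2, 3, 4]) for _ in 1:4]

# Composed error function (the default action).
err = explore_learn_compose(allunique; domains = domains)

# Symbolic description of the learned composition instead.
description = explore_learn_compose(allunique; domains = domains, action = :symbols)
```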
CompositionalNetworks.hamming — Method
hamming(x, X)
Compute the Hamming distance of x over a collection of solutions X, i.e. the minimal number of variables to switch in x to reach a solution.
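For instance, a configuration that differs from a known solution in a single position has a Hamming distance of 1 (a small illustrative check):

```julia
using CompositionalNetworks

# [1, 2, 3, 3] can reach the solution [1, 2, 3, 4] by switching one variable.
CompositionalNetworks.hamming([1, 2, 3, 3], [[1, 2, 3, 4]])  # expected: 1
```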
CompositionalNetworks.lazy — Method
lazy(funcs::Function...)
Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V).
CompositionalNetworks.lazy_param — Method
lazy_param(funcs::Function...)
Generate methods extended to a vector instead of one of its components. A function f should have the following signature: f(i::Int, x::V; param).
CompositionalNetworks.learn_compose — Method
learn_compose(; nvars, dom_size, param=nothing, icn=ICN(nvars, dom_size, param), X, X_sols, global_iter=100, local_iter=100, metric=hamming, popSize=200)
Create an ICN, optimize it, and return its composition.
CompositionalNetworks.manhattan — Method
manhattan(x, X)
Compute the Manhattan distance of x over a collection of solutions X.
CompositionalNetworks.minkowski — Method
minkowski(x, X, p)
Compute the Minkowski distance of order p of x over a collection of solutions X.
CompositionalNetworks.nbits — Method
nbits(icn)
Return the expected number of bits of a viable weight vector of an ICN.
CompositionalNetworks.regularization — Method
regularization(icn)
Return the regularization value of an ICN's weights, which is proportional to the normalized number of operations selected in the ICN layers.
CompositionalNetworks.show_layers — Method
show_layers(icn)
Return a formatted string with each layer in the ICN.
CompositionalNetworks.symbols — Method
symbols(c::Composition)
Output the composition as a layered collection of Symbols.
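A hedged sketch of the Composition accessors; it assumes the value returned by explore_learn_compose (with the default action) can be passed to them as a Composition, which may differ between versions:

```julia
using CompositionalNetworks

domains = [domain([1, 2, 3, 4]) for _ in 1:4]
compo = explore_learn_compose(allunique; domains = domains)

symbols(compo)                                # layered collection of Symbols
code(compo, :Julia; name = "all_different")   # generated source code
composition(compo)                            # the underlying Julia function
```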
CompositionalNetworks.transformation_layer — Function
transformation_layer(param = false)
Generate the layer of transformation functions of the ICN. If the param value is true, also includes all the parametric transformations.
CompositionalNetworks.weigths! — Method
weigths!(icn, weigths)
Set the weights of an ICN with a BitVector.
CompositionalNetworks.weigths — Method
weigths(icn)
Access the current set of weights of an ICN.
CompositionalNetworks.weigths_bias — Method
weigths_bias(x)
A metric that biases x towards operations with a lower bit. It does not affect the main metric.