Implementing a Modeler
This guide explains how to implement an optimization modeler in CTSolvers. Modelers are strategies that convert AbstractOptimizationProblem instances into NLP backend models and convert NLP solver results back into problem-specific solutions. We use Modelers.ADNLP and Modelers.Exa as reference examples.
Read Architecture and Implementing a Strategy first. A modeler is a strategy with two additional callable contracts.
The AbstractNLPModeler Contract
A modeler must satisfy three contracts:
- Strategy contract — `id`, `metadata`, `options` (inherited from `AbstractStrategy`)
- Model building callable — `(modeler)(prob, initial_guess) → NLP model`
- Solution building callable — `(modeler)(prob, nlp_stats) → Solution`
Both callables have default implementations that throw `NotImplemented`.
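A minimal sketch of what those fallbacks could look like (the exact error type and dispatch in CTSolvers may differ; this is an assumption, not the package's actual source):

```julia
# Hedged sketch: abstract-level fallbacks so that forgetting either
# callable fails loudly. `NotImplemented` is assumed to be the
# package's error type for missing strategy methods.
function (modeler::AbstractNLPModeler)(
    prob::AbstractOptimizationProblem,
    initial_guess,
)
    throw(NotImplemented("$(typeof(modeler)) does not implement the model building callable"))
end

function (modeler::AbstractNLPModeler)(
    prob::AbstractOptimizationProblem,
    nlp_stats::SolverCore.AbstractExecutionStats,
)
    throw(NotImplemented("$(typeof(modeler)) does not implement the solution building callable"))
end
```

The two callables share an arity, so dispatch on the second argument's type is what separates model building from solution building.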
```julia
using CTSolvers
```

The id is available directly:

```julia
CTSolvers.Strategies.id(CTSolvers.Modelers.ADNLP)  # :adnlp
CTSolvers.Strategies.id(CTSolvers.Modelers.Exa)    # :exa
```

Step-by-Step Implementation
We walk through the Modelers.ADNLP implementation as a reference.
Step 1 — Define the struct
```julia
# Defined inside the Modelers module, so users refer to it as Modelers.ADNLP
struct ADNLP <: AbstractNLPModeler
    options::Strategies.StrategyOptions
end
```

Step 2 — Implement id
```julia
CTSolvers.Strategies.id(CTSolvers.Modelers.ADNLP)  # :adnlp
```

Step 3 — Define defaults and metadata
The metadata defines all configurable options with types, defaults, and validators:
```julia
CTSolvers.Strategies.metadata(CTSolvers.Modelers.ADNLP)
```

```
StrategyMetadata with 11 options:
├─ show_time :: Bool (default: NotProvided)
│    description: Whether to show timing information while building the ADNLP model
├─ backend (adnlp_backend) :: Symbol (default: optimized)
│    description: Automatic differentiation backend used by ADNLPModels
├─ matrix_free :: Bool (default: NotProvided)
│    description: Enable matrix-free mode (avoids explicit Hessian/Jacobian matrices)
├─ name :: String (default: NotProvided)
│    description: Name of the optimization model for identification
├─ gradient_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for gradient computation (advanced users only)
├─ hprod_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for Hessian-vector product (advanced users only)
├─ jprod_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for Jacobian-vector product (advanced users only)
├─ jtprod_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for transpose Jacobian-vector product (advanced users only)
├─ jacobian_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for Jacobian matrix computation (advanced users only)
├─ hessian_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
│    description: Override backend for Hessian matrix computation (advanced users only)
└─ ghjvprod_backend :: Union{Nothing, ADNLPModels.ADBackend, Type{<:ADNLPModels.ADBackend}} (default: NotProvided)
     description: Override backend for g^T ∇²c(x)v computation (advanced users only)
```
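For comparison with the Modelers.Exa example later in this guide, the metadata above is presumably assembled from `Options.OptionDefinition` entries along these lines (an abbreviated sketch, not the package's actual source; details such as the alias mechanism are assumptions):

```julia
# Hedged, abbreviated sketch of the ADNLP metadata definition.
function Strategies.metadata(::Type{<:Modelers.ADNLP})
    return Strategies.StrategyMetadata(
        Options.OptionDefinition(
            name = :show_time,
            type = Bool,
            description = "Whether to show timing information while building the ADNLP model",
        ),
        Options.OptionDefinition(
            name = :backend,  # also reachable via the adnlp_backend alias
            type = Symbol,
            default = :optimized,
            description = "Automatic differentiation backend used by ADNLPModels",
        ),
        # ... the nine remaining options (matrix_free, name, and the
        # per-derivative backend overrides) follow the same pattern.
    )
end
```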
Step 4 — Constructor and options accessor
The constructor validates options and stores them:
```julia
modeler = CTSolvers.Modelers.ADNLP(backend = :optimized)
```

```
ADNLP (instance, id: :adnlp)
└─ backend = optimized [user]

Tip: use describe(ADNLP) to see all available options.
```

```julia
CTSolvers.Strategies.options(modeler)
```

```
StrategyOptions with 1 option:
└─ backend = optimized [user]
```
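The guide does not show the constructor body itself; given the contracts above, it plausibly looks like the following sketch (a validating `StrategyOptions` constructor is an assumption):

```julia
# Hedged sketch: keyword constructor that validates user options
# against the metadata before storing them.
function ADNLP(; kwargs...)
    meta = Strategies.metadata(ADNLP)
    opts = Strategies.StrategyOptions(meta; kwargs...)  # assumed to validate kwargs
    return ADNLP(opts)
end

# The options accessor simply exposes the stored field:
Strategies.options(m::ADNLP) = m.options
```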
Step 5 — Model building callable
This is the core of the modeler. It retrieves the appropriate builder from the problem and invokes it:
```julia
function (modeler::Modelers.ADNLP)(
    prob::AbstractOptimizationProblem,
    initial_guess,
)::ADNLPModels.ADNLPModel
    # Get the builder registered for this problem type
    builder = get_adnlp_model_builder(prob)

    # Extract modeler options as a Dict
    options = Strategies.options_dict(modeler)

    # Build the NLP model, passing all options to the builder
    return builder(initial_guess; options...)
end
```

The key interaction is with the Builder pattern: the modeler doesn't know how to build the model itself — it asks the problem for a builder, then calls it. See Implementing an Optimization Problem for how builders work.
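To make that handshake concrete, here is a hedged sketch of the problem side (`MyProblem` is hypothetical; the real accessor contract lives in the Optimization module):

```julia
# Hypothetical problem type that carries its own ADNLP builder.
struct MyProblem <: AbstractOptimizationProblem
    adnlp_builder::Function  # (initial_guess; kwargs...) -> ADNLPModels.ADNLPModel
end

# Contract method the modeler calls to retrieve the builder.
get_adnlp_model_builder(prob::MyProblem) = prob.adnlp_builder
```

The modeler stays generic: any problem that answers `get_adnlp_model_builder` works with it unchanged.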
Step 6 — Solution building callable
Same pattern, but for converting NLP results back into a problem-specific solution:
```julia
function (modeler::Modelers.ADNLP)(
    prob::AbstractOptimizationProblem,
    nlp_solution::SolverCore.AbstractExecutionStats,
)
    builder = get_adnlp_solution_builder(prob)
    return builder(nlp_solution)
end
```

Modelers.Exa: A Second Example
Modelers.Exa follows the same pattern with different options and a slightly different callable signature:
```julia
# Defined inside the Modelers module, so users refer to it as Modelers.Exa
struct Exa <: AbstractNLPModeler
    options::Strategies.StrategyOptions
end

Strategies.id(::Type{<:Modelers.Exa}) = :exa

function Strategies.metadata(::Type{<:Modelers.Exa})
    return Strategies.StrategyMetadata(
        Options.OptionDefinition(
            name = :base_type,
            type = DataType,
            default = Float64,
            description = "Base floating-point type used by ExaModels",
            validator = validate_exa_base_type,
        ),
        Options.OptionDefinition(
            name = :backend,
            type = Union{Nothing, KernelAbstractions.Backend},
            default = nothing,
            description = "Execution backend for ExaModels (CPU, GPU, etc.)",
        ),
    )
end
```

The model building callable extracts base_type as a positional argument:
```julia
function (modeler::Modelers.Exa)(
    prob::AbstractOptimizationProblem,
    initial_guess,
)::ExaModels.ExaModel
    builder = get_exa_model_builder(prob)
    options = Strategies.options_dict(modeler)

    # ExaModels requires BaseType as the first positional argument
    BaseType = options[:base_type]
    delete!(options, :base_type)

    return builder(BaseType, initial_guess; options...)
end
```

`ADNLPModelBuilder` takes `(initial_guess; kwargs...)` while `ExaModelBuilder` takes `(BaseType, initial_guess; kwargs...)`. Each modeler adapts the call to its builder's expected signature.
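From the caller's point of view the adaptation is invisible; a usage sketch (`prob` and `x0` are assumed to exist):

```julia
# Hedged sketch: the base_type option becomes the builder's first
# positional argument, everything else stays a keyword.
modeler = CTSolvers.Modelers.Exa(base_type = Float32)
nlp = modeler(prob, x0)  # internally: builder(Float32, x0; backend = nothing, ...)
```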
Integration with build_model / build_solution
The Optimization module provides two generic functions that delegate to the modeler's callables:
```julia
# In src/Optimization/building.jl
function build_model(prob, initial_guess, modeler)
    return modeler(prob, initial_guess)
end

function build_solution(prob, model_solution, modeler)
    return modeler(prob, model_solution)
end
```

These are used by the high-level CommonSolve.solve entry point.
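The high-level flow can be sketched as follows (a simplification of the real CommonSolve.solve plumbing; `nlp_solver` is an assumed backend-specific callable):

```julia
# Hedged sketch of the solve pipeline:
# problem -> NLP model -> solver stats -> problem-specific solution.
function solve_pipeline(prob, initial_guess, modeler, nlp_solver)
    nlp   = build_model(prob, initial_guess, modeler)   # (modeler)(prob, x0)
    stats = nlp_solver(nlp)                             # backend-specific solve
    return build_solution(prob, stats, modeler)         # (modeler)(prob, stats)
end
```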
Validation
Use validate_strategy_contract to verify the strategy contract (but not the callables — those require a real problem):
```julia
julia> Strategies.validate_strategy_contract(Modelers.ADNLP)
true

julia> Strategies.validate_strategy_contract(Modelers.Exa)
true
```

validate_strategy_contract requires that the default constructor produces options matching the metadata exactly. For modelers with NotProvided defaults or advanced option handling, run validation after loading all required extensions.
For the callables, test with a fake or real problem:
```julia
# Create a fake problem with builders
prob = FakeOptimizationProblem(adnlp_builder, adnlp_solution_builder)

# Test model building
modeler = Modelers.ADNLP(backend = :optimized)
nlp = modeler(prob, x0)
@test nlp isa ADNLPModels.ADNLPModel

# Test solution building
stats = solve(nlp, solver)
solution = modeler(prob, stats)
@test solution isa ExpectedSolutionType
```

Summary: Adding a New Modeler
To add a new modeler (e.g., MyModeler for a new NLP backend):
- Define `MyModeler <: AbstractNLPModeler` with `options::StrategyOptions`
- Implement `Strategies.id(::Type{<:MyModeler}) = :my_backend`
- Implement `Strategies.metadata(::Type{<:MyModeler})` with option definitions
- Write constructor: `MyModeler(; mode, kwargs...)`
- Implement `Strategies.options(m::MyModeler) = m.options`
- Implement model building callable: `(modeler::MyModeler)(prob, x0) → NLP`
- Implement solution building callable: `(modeler::MyModeler)(prob, stats) → Solution`
- Add corresponding builder types in `Optimization` if needed (`MyModelBuilder`, `MySolutionBuilder`)
- Add contract methods in `Optimization`: `get_my_model_builder`, `get_my_solution_builder`
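Put together, a new modeler is only a few dozen lines. A hedged skeleton under the conventions above (`MyModeler`, its option, and the builder accessors are placeholders, not real CTSolvers names):

```julia
# Hedged skeleton for a hypothetical MyModeler backend.
struct MyModeler <: AbstractNLPModeler
    options::Strategies.StrategyOptions
end

Strategies.id(::Type{<:MyModeler}) = :my_backend
Strategies.options(m::MyModeler) = m.options

function Strategies.metadata(::Type{<:MyModeler})
    return Strategies.StrategyMetadata(
        Options.OptionDefinition(
            name = :verbose,  # placeholder option
            type = Bool,
            default = false,
            description = "Print progress while building the model",
        ),
    )
end

# Keyword constructor: validate kwargs against the metadata, then store.
MyModeler(; kwargs...) =
    MyModeler(Strategies.StrategyOptions(Strategies.metadata(MyModeler); kwargs...))

# Model building callable: ask the problem for its builder, then call it.
function (modeler::MyModeler)(prob::AbstractOptimizationProblem, initial_guess)
    builder = get_my_model_builder(prob)
    return builder(initial_guess; Strategies.options_dict(modeler)...)
end

# Solution building callable: convert solver stats back into a solution.
function (modeler::MyModeler)(
    prob::AbstractOptimizationProblem,
    stats::SolverCore.AbstractExecutionStats,
)
    builder = get_my_solution_builder(prob)
    return builder(stats)
end
```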