Mimi Framework

Optimization of two parameters

Hi all,
I’m trying to optimize MIU and S (mitigation and savings) for a small variation of MimiDICE2013. But I only know how to optimize one parameter, in the following way:

using BlackBoxOptim
m_opt = GreenDICE
function eval_dice(x)
    set_param!(m_opt,:emissions,:MIU,x)
    
    run(m_opt)
    
    return -m_opt[:welfare, :UTILITY]
end
res = bboptimize(eval_dice; SearchRange=(0., 1.), NumDimensions=60, Method=:adaptive_de_rand_1_bin_radiuslimited, MaxSteps=99999)
best_candidate(res) # optimal vector of miu emissions trajectories

When I try to add the second parameter bboptimize won’t work:

function eval_dice(x)
    m = x[:,1]
    s = x[:,2]
    set_param!(m_opt,:emissions,:MIU,m)
    set_param!(m_opt,:neteconomy,:S,s)
    run(m_opt)
    
    return -m_opt[:welfare, :UTILITY]
end
res = bboptimize(eval_dice;SearchRange=(0.,1.), NumDimensions=[2,60], Method=:adaptive_de_rand_1_bin_radiuslimited,MaxSteps=99999)

I get this error:

ERROR: MethodError: no method matching fill(::Tuple{Float64,Float64}, ::Array{Int64,1})
Closest candidates are:
  fill(::Any, ::Union{Integer, AbstractUnitRange}...) at array.jl:401
  fill(::Any, ::Tuple{}) at array.jl:404
  fill(::Any, ::Tuple{Vararg{Integer,N}}) where N at array.jl:403
  ...
Stacktrace:
 [1] symmetric_search_space(::Array{Int64,1}, ::Tuple{Float64,Float64}) at C:\Users\bastien\.julia\packages\BlackBoxOptim\RgNEa\src\search_space.jl:109
 [2] check_and_create_search_space(::DictChain{Symbol,Any}) at C:\Users\bastien\.julia\packages\BlackBoxOptim\RgNEa\src\default_parameters.jl:64
 [3] setup_problem(::Function, ::DictChain{Symbol,Any}) at C:\Users\bastien\.julia\packages\BlackBoxOptim\RgNEa\src\bboptimize.jl:27
 [4] #bbsetup#73(::Base.Iterators.Pairs{Symbol,Any,NTuple{4,Symbol},NamedTuple{(:SearchRange, :NumDimensions, :Method, :MaxSteps),Tuple{Tuple{Float64,Float64},Array{Int64,1},Symbol,Int64}}}, ::Function, ::Function, ::Dict{Symbol,Any}) at C:\Users\bastien\.julia\packages\BlackBoxOptim\RgNEa\src\bboptimize.jl:86
 [5] #bbsetup at .\none:0 [inlined]
 [6] #bboptimize#72(::Base.Iterators.Pairs{Symbol,Any,NTuple{4,Symbol},NamedTuple{(:SearchRange, :NumDimensions, :Method, :MaxSteps),Tuple{Tuple{Float64,Float64},Array{Int64,1},Symbol,Int64}}}, ::Function, ::Function, ::Dict{Symbol,Any}) at C:\Users\bastien\.julia\packages\BlackBoxOptim\RgNEa\src\bboptimize.jl:69
 [7] #bboptimize at .\none:0 [inlined] (repeats 2 times)
 [8] top-level scope at none:0

Thanks in advance for your help!

Hi there,

I’m not sure if this is the only solution, but I recall that BlackBoxOptim only optimizes over a single vector of choice values, i.e. your NumDimensions can only be an integer, not an array. So the fix is to concatenate both parameters into one 120-element vector and split it apart inside the objective function. The following should work; I tried it out and it runs! @davidanthoff @FrankErrickson is this response correct, or is there a nuance I’m missing?

using Mimi
using MimiDICE2013
using BlackBoxOptim

m_opt = MimiDICE2013.get_model()
run(m_opt)

function eval_dice(x)
    m = x[1:60]
    s = x[61:end]
    set_param!(m_opt,:emissions,:MIU,m)
    set_param!(m_opt,:neteconomy,:S,s)
    run(m_opt)
    
    return -m_opt[:welfare, :UTILITY]
end
res = bboptimize(eval_dice; SearchRange=(0., 1.), NumDimensions=120, Method=:adaptive_de_rand_1_bin_radiuslimited, MaxSteps=99999)

Yes, that is exactly right. One small twist: you can avoid a bunch of unnecessary allocations by using the following code instead of plain slicing:

m = @view x[1:60]
s = @view x[61:end]

@lrennels @BerBastien I’ve used NLopt for all of my standard IAM optimizations, and only used BlackBoxOptim for more complicated multi-objective stuff that doesn’t really apply here… so take my advice with a grain of salt.

At least with NLopt, you need to pass in a single vector for your choice variables and then split them up into the different parameters within your objective function whenever you’re optimizing more than one parameter. So the approach @lrennels took looks correct to me.

A few other minor points. (1) If you end up wanting to optimize RICE, it’d be easier to optimize the carbon tax rather than the mitigation rates (assuming a globally harmonized tax). Then you only need to optimize a single tax for each period, rather than regional mitigation rates for every period. You’ll have to adjust the upper bounds of your search range to correspond to the backstop prices. Happy to share some code if you end up taking this approach. (2) I’ve generally found that the savings rates have very little impact on the results, so including them may just be making it harder for the optimization to converge to a solution. (3) @davidanthoff I have no clue what the @view code is doing. Can you provide some intuition for why this is better? Is this a Mimi thing or a Julia thing?

@view is a Julia thing; it’s an optimization that prevents unnecessary allocation of new arrays. I’ll let @davidanthoff elaborate.

Thanks for your help!

Here is the story about @view. If you have an array A = [1,2,3,4] and you write x = A[2:3], then Julia will, under the hood, create a new array, copy into it the subset of data from A that you referenced in the brackets, and make x point to that new array. At that point, A and x point to two different arrays that don’t have anything to do with each other.

x = @view A[2:3] (or equivalently x = view(A, 2:3)), on the other hand, does not create a new array. Instead it returns something that behaves like an array but is actually an alternative “view” into the original array A. No data is copied; x is just a way to access a subset of A with slightly different indexing offsets.

There are two major differences between these two approaches: 1) views allocate no new memory, so generally speaking you put a lot less pressure on Julia’s memory management, which can help a lot with performance; 2) if you edit a value in the view (e.g. you write x[1] = 10), you are actually modifying that value in A, so if you then look at the content of A, you will see that edit there as well. In the first example, which didn’t use views, any edit you make to x won’t show up in A, because those are just two different arrays.
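To make the copy-vs-view distinction concrete, here is a minimal REPL-style sketch (standalone, not tied to the DICE model above):

```julia
A = [1, 2, 3, 4]

# Plain indexing copies: x is a brand-new array.
x = A[2:3]
x[1] = 10
A == [1, 2, 3, 4]   # true: A is untouched

# @view aliases the same memory: no copy is made.
y = @view A[2:3]
y[1] = 10
A == [1, 10, 3, 4]  # true: the edit shows up in A
```

In the optimization loop above, the views are only read, never written, so the payoff there is purely the avoided allocations on every objective-function evaluation.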

@davidanthoff Thanks, this is super useful to know!