Original link:tecdat.cn/?p=22853

Original source: Tuo Duan Data Tribe WeChat public account

This article introduces different solvers in R that can be used for portfolio optimization.

General-purpose solvers

A general-purpose solver can handle any nonlinear optimization problem, but possibly at the cost of slow convergence.

The default package

The stats package (a base R package installed by default) provides several general-purpose optimizers.

  • optimize(): one-dimensional unconstrained function optimization over an interval (for one-dimensional root finding, use uniroot()).

```r
f <- function(x) exp(0.5 * x) * sin(10 * pi * x)
f(0.5)

result <- optimize(f, interval = c(0, 1), tol = 0.0001)
result

curve(f, 0, 1, n = 200)
```

  • optim(): general-purpose optimization, with six different methods.

    • Nelder-Mead: a relatively robust method (the default) that does not require derivatives.
    • CG: a low-memory conjugate-gradient method for high-dimensional unconstrained problems.
    • BFGS: a simple unconstrained quasi-Newton method.
    • L-BFGS-B: for bound-constrained problems.
    • SANN: a simulated-annealing method.
    • Brent: for one-dimensional problems (it actually calls optimize()).
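As a quick sketch of how two of these methods compare in practice, here is a smooth toy objective (an assumption for illustration, not from the original) minimized with the default Nelder-Mead and with BFGS:

```r
# Toy smooth 2-D objective (assumed for illustration): minimum at (1, -3)
f <- function(x) (x[1] - 1)^2 + 2 * (x[2] + 3)^2

r1 <- optim(c(0, 0), f)                   # Nelder-Mead (default), derivative-free
r2 <- optim(c(0, 0), f, method = "BFGS")  # quasi-Newton, uses numerical gradients here

r1$par
r2$par
```

Both methods recover the minimizer on this easy problem; the differences show up on ill-conditioned or noisy objectives.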

This example does a least-squares fit: minimize ∑ᵢ (yᵢ − (β₀ + β₁xᵢ))² over (β₀, β₁).

```r
# call the solver (initial value c(0, 1), default method "Nelder-Mead")
optim(par = c(0, 1), fn = f, data = dat)

# compare with R's built-in linear regression: lm(y ~ x, data = dat)
```
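A self-contained version of this fit is sketched below; `dat` and the objective `f` are not defined in the article, so we construct them here as assumptions:

```r
# Assumed data-generating step: a noisy line y = 2 + 0.5 x
set.seed(123)
dat <- data.frame(x = 1:30)
dat$y <- 2 + 0.5 * dat$x + rnorm(30, sd = 0.3)

# Residual sum of squares for the line y = par[1] + par[2] * x
f <- function(par, data) sum((data$y - (par[1] + par[2] * data$x))^2)

res <- optim(par = c(0, 1), fn = f, data = dat)
fit <- lm(y ~ x, data = dat)   # built-in linear regression for comparison

res$par
coef(fit)
```

The estimates from optim() and lm() agree closely; for linear LS, lm() is of course the preferred tool.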

The next example illustrates the use of gradients with the famous Rosenbrock banana function

f(x) = 100 (x₂ − x₁²)² + (1 − x₁)²,

whose gradient is

∇f(x) = ( −400 x₁ (x₂ − x₁²) − 2 (1 − x₁), 200 (x₂ − x₁²) ),

an unconstrained minimization problem.

```r
f_banana <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
gr_banana <- function(x) c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
                           200 * (x[2] - x[1]^2))
optim(c(-1.2, 1), f_banana)

optim(c(-1.2, 1), f_banana, gr_banana, method = "BFGS")
```

The following example uses bound constraints: minimize the 25-dimensional function below subject to 2 ≤ xᵢ ≤ 4.

```r
f <- function(x) {
  p <- length(x)
  sum(c(1, rep(4, p - 1)) * (x - c(1, x[-p])^2)^2)
}

# 25-dimensional bound-constrained optimization
optim(rep(3, 25), f, method = "L-BFGS-B", lower = rep(2, 25), upper = rep(4, 25))
```

This example uses simulated annealing (for global optimization), then refines the result with a local method (here f is a one-dimensional function with many local minima; its definition is elided in the original):

```r
res <- optim(50, f, method = "SANN")

optim(res$par, f, method = "BFGS")
```

 

  • constrOptim(): minimizes a function (by calling optim()) subject to linear inequality constraints, using an adaptive barrier algorithm.

```r
# inequality constraints (ui %*% theta >= ci): x <= 0.9, y - x > 0.1
constrOptim(c(0.5, 0), f, grad = NULL, ui = ui, ci = ci)  # ui, ci encode the constraints above
```
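A fully runnable sketch of constrOptim() with these same constraints follows; the quadratic objective and the feasible starting point are assumptions for illustration, not from the original:

```r
# Assumed objective: minimize (x - 1)^2 + (y - 2)^2
fr <- function(theta) (theta[1] - 1)^2 + (theta[2] - 2)^2

# Encode ui %*% theta >= ci for: x <= 0.9  and  y - x >= 0.1
ui <- rbind(c(-1, 0),   # -x >= -0.9      <=>  x <= 0.9
            c(-1, 1))   # -x + y >= 0.1   <=>  y - x >= 0.1
ci <- c(-0.9, 0.1)

# The starting point must be strictly feasible for the barrier method
res <- constrOptim(c(0.5, 0.7), fr, grad = NULL, ui = ui, ci = ci)
res$par
```

The unconstrained minimum (1, 2) violates x ≤ 0.9, so the solution lands on that boundary, at approximately (0.9, 2).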

  • nlm(): minimizes the objective function using a Newton-type algorithm.

```r
nlm(f, c(10, 10))
```

  • nlminb(): optimization with optional box constraints.

```r
nlminb(c(-1.2, 1), f)

nlminb(c(-1.2, 1), f, gr)
```

optimx

The optimx package wraps base optim() and many other solvers behind a unified interface, making them easy to use and compare.

```r
library(optimx)
opm(par, fn = f, method = c("Nelder-Mead", "BFGS"))
```

Global optimization

The idea behind global optimization is quite different from that of local optimization: global solvers (often stochastic) try to avoid getting trapped in local optima.
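A cheap way to approximate this behavior with base R alone is multi-start local optimization: run a local solver from many random starting points and keep the best result. A minimal sketch (the multimodal test function is an assumption, not from the original):

```r
# Assumed multimodal 1-D test function: global minimum near x = -1.31
f <- function(x) x^2 + 10 * sin(x)

set.seed(1)
starts <- runif(20, -10, 10)           # random starting points

# Run a local solver from each start and keep the best objective value
vals <- sapply(starts, function(s) optim(s, f, method = "BFGS")$value)
best <- min(vals)
best
```

Dedicated stochastic global solvers (simulated annealing, differential evolution, etc.) are usually more sample-efficient than naive multi-start, but the principle is the same.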

Solvers for specific classes of problems

If the problem to be solved belongs to a certain class of problems, such as LS, LP, MILP, QP, SOCP, or SDP, it is better to use a dedicated solver for that class of problems.

Least squares (LS)

The linear least squares (LS) problem is to minimize ‖Ax − b‖₂², possibly subject to bounds or linear constraints.
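For the unconstrained case, base R already solves this directly via the QR decomposition; a minimal sketch (A and b are assumed data for illustration):

```r
# Assumed overdetermined system: 100 equations, 5 unknowns
set.seed(42)
A <- matrix(rnorm(100 * 5), nrow = 100)
x_true <- c(1, 2, 3, 4, 5)
b <- A %*% x_true + rnorm(100, sd = 0.01)

# qr.solve returns the least-squares solution for rectangular A
x_ls <- qr.solve(A, b)
x_ls
```

With bounds or linear constraints, a QP solver (see the quadprog section below) is needed instead.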

Linear programming (LP)

The function solveLP() in the linprog package conveniently solves LPs of the form:

minimize cᵀx

subject to Ax ≤ b, x ≥ 0.

```r
library(linprog)
cvec <- c(1800, 600, 600)  # objective coefficients
bvec <- c(40, 90, 2500)    # right-hand sides of the constraints
# Amat: the constraint matrix (problem-specific, elided in the original)
res <- solveLP(cvec, bvec, Amat, maximum = TRUE)
res
```

Mixed Integer Linear Programming (MILP)

lpSolve (much faster than linprog because it is implemented in C) can solve mixed-integer linear programs (LPs in which some variables are constrained to be integers).

```r
library(lpSolve)
# Set up problem: maximize an objective f.obj (elided in the original)
#   subject to    x1 + 2 x2 + 3 x3 <=  9
#               3 x1 + 2 x2 + 2 x3 <= 15
f.con <- rbind(c(1, 2, 3),
               c(3, 2, 2))
f.dir <- c("<=", "<=")
f.rhs <- c(9, 15)

res <- lp("max", f.obj, f.con, f.dir, f.rhs, int.vec = 1:3)
res$solution
```

Quadratic programming (QP)

The function solve.QP() in the quadprog package conveniently solves QPs of the form:

minimize −dᵀx + ½ xᵀDx

subject to AᵀX ≥ b₀.

```r
library(quadprog)
# Set up problem: minimize -(0 5 0) %*% x + 1/2 x^T x
# subject to A^T x >= b0, with
#   b0 = (-8, 2, 0)^T
#        (-4  2  0)
#   A = (-3  1 -2)
#        ( 0  0  1)
Dmat <- diag(3)
dvec <- c(0, 5, 0)
Amat <- matrix(c(-4, -3, 0, 2, 1, 0, 0, -2, 1), 3, 3)
bvec <- c(-8, 2, 0)
solve.QP(Dmat, dvec, Amat, bvec = bvec)
```

Quadratic programs with absolute-value constraints and absolute values in the objective function can also be solved.
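The standard trick behind this (sketched here, since the original does not spell it out) is to introduce auxiliary variables u so that the absolute values become linear constraints:

```latex
% |x_i| can be replaced by an auxiliary variable u_i with linear constraints:
%   |x_i| \le u_i \quad\Longleftrightarrow\quad -u_i \le x_i \le u_i
% so a QP with |x| terms in the objective becomes a plain QP in (x, u):
\min_{x,\,u}\; \tfrac{1}{2}\, x^\top D x - d^\top x + \lambda\, \mathbf{1}^\top u
\quad \text{s.t.} \quad -u \le x \le u, \;\; A^\top x \ge b_0 .
```

At the optimum each uᵢ equals |xᵢ|, so the reformulated problem is equivalent to the original and can be handed to solve.QP() directly.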

Second order cone programming (SOCP)

There are several packages:

  • ECOSolveR provides an interface to the Embedded Conic Solver (ECOS), a well-known, efficient, and robust C library for convex problems.
  • CLSOCP provides an implementation of a one-step smoothing Newton method for SOCP problems.

Optimization infrastructure

We have already seen packages that act as unified wrappers around many other solvers; the following frameworks take this idea further.

For convex problems, MIP, and non-convex problems

The ROI package provides a framework for handling optimization problems in R. It uses an object-oriented approach to define and solve various optimization tasks, which can come from different problem classes (for example, linear, quadratic, or nonlinear programming problems).

LP – Consider the LP:

maximize 3x1 + 7x2 − 12x3

subject to linear constraints (elided in the original).

```r
library(ROI)
#> Default solver: auto.

prob <- OP(objective = L_objective(c(3, 7, -12)),
           ...,   # constraints elided in the original
           maximum = TRUE)

# solve it
res <- ROI_solve(prob)
res
```

MILP – Consider the previous LP and turn it into a MILP by adding the constraints x2, x3 ∈ ℤ:

```r
types(prob) <- c("C", "I", "I")
prob
```

BLP – Consider a binary linear program (BLP):

minimize a linear objective

subject to linear constraints and x ∈ {0, 1}⁵ (the problem data are elided in the original).

```r
prob <- OP(objective = L_objective(...),   # data elided in the original
           ...,
           types = rep("B", 5))
ROI_solve(prob)
#> Optimal solution found.
#> The objective value is: -1.01e+02
```
SOCP – Consider an SOCP:

maximize a linear objective

subject to a second-order cone (SOC) constraint (the problem data are elided in the original).

Note that an SOC constraint of the form ‖Ax + b‖ ≤ cᵀx + d states that the vector (cᵀx + d, Ax + b) lies in the second-order cone, which is how it is expressed in the code via ROI's conic constraints.

```r
prob <- OP(objective = L_objective(...),
           ...,   # SOC constraint, e.g. C_constraint(..., cones = K_soc(...))
           maximum = TRUE)
```

SDP – Consider an SDP:

minimize a linear objective

subject to a linear matrix inequality (the problem data are elided in the original).

Note that the SDP constraint can be passed in vectorized form: the size is 3 because the matrix in our problem is 2×2, and vech() extracts the 3 independent entries of a symmetric matrix.

```r
prob <- OP(objective = L_objective(...),
           ...)   # PSD cone constraint built from the vech()-ed matrices and rhs
```

NLP – Consider a nonlinear program (NLP):

maximize a nonlinear objective

subject to constraints and variable bounds (the problem data are elided in the original).

```r
prob <- OP(objective = F_objective(...),   # nonlinear objective (elided)
           ...,
           bounds = bounds,                # variable bounds (elided)
           maximum = TRUE)
```

Convex optimization

The CVXR package provides an object-oriented modeling language for convex optimization in R. It allows users to formulate convex optimization problems using natural mathematical syntax, rather than the restrictive standard form required by most solvers. Objectives and constraints are specified by combining constants, variables, and parameters with a library of functions with known mathematical properties. Now let's look at some examples.

Least squares – Let's start with a simple LS example: minimize ‖y − Xβ‖₂².

Of course, we can use R's basic linear-model fitting function lm():

```r
# generate data
m <- 100
n <- 10
beta_true <- c(-4:5)
X <- matrix(rnorm(m * n), nrow = m, ncol = n)   # assumed design (not in the original)
y <- X %*% beta_true + rnorm(m)                 # assumed response (not in the original)

res <- lm(y ~ 0 + X)  # "0" indicates that there is no intercept in our model
```

Now let's do it with CVXR:

```r
library(CVXR)
beta <- Variable(n)
obj <- sum_squares(y - X %*% beta)   # objective (partially elided in the original)
prob <- Problem(Minimize(obj))

result <- solve(prob)
str(result)
```

   

We can now easily add a constraint to solve a non-negative LS problem:

```r
prob <- Problem(Minimize(obj), constraints = list(beta >= 0))
solve(prob)
```

Robust Huber regression – Let's consider a simple example of robust regression: minimize

∑ᵢ φ(yᵢ − xᵢᵀβ),

where φ is the Huber loss: φ(u) = u² for |u| ≤ M and 2M|u| − M² otherwise.

```r
obj <- sum(huber(y - X %*% beta, M))
prob <- Problem(Minimize(obj))
solve(prob)
```
Elastic-net regularization – We now solve the problem of minimizing

‖y − Xβ‖₂² + λ( α‖β‖₁ + (1 − α)‖β‖₂² ).

```r
elastic <- function(beta, lambda = 0, alpha = 0) {
  ridge <- (1 - alpha) * sum(beta^2)
  lasso <- alpha * p_norm(beta, 1)
  lambda * (lasso + ridge)
}

obj <- sum((y - X %*% beta)^2) + elastic(beta, lambda, alpha)
prob <- Problem(Minimize(obj))
solve(prob)
```

Sparse inverse covariance matrix – Consider the matrix-valued convex problem: maximize log det X − tr(XS) subject to ∑ᵢⱼ |Xᵢⱼ| ≤ α.

```r
obj <- Maximize(log_det(X) - matrix_trace(X %*% S))
constr <- list(sum(abs(X)) <= alpha)
prob <- Problem(obj, constr)
solve(prob)
```

Covariance – Consider the matrix-valued convex problem of maximizing an objective in Σ (elided in the original) subject to element-wise conditions on its entries:

```r
constr <- list(Sigma[1, 1] == 0.2, Sigma[1, 2] >= 0, Sigma[1, 3] >= 0,
               Sigma[2, 2] == 0.1, Sigma[2, 3] <= 0, Sigma[2, 4] <= 0,
               Sigma[3, 3] == 0.3, Sigma[3, 4] >= 0, Sigma[4, 4] == 0.1)
```

Portfolio optimization – Consider the Markowitz portfolio design: maximize the standard mean-variance trade-off μᵀw − γ·wᵀΣw subject to w ≥ 0, 1ᵀw = 1.

```r
prob <- Problem(Maximize(obj), constr)
solve(prob)
```

Conclusion

The number of solvers available in R is large. The following steps are recommended:

  • If the problem is a convex optimization problem, start with preliminary prototyping (for example with CVXR).
  • If that is not fast enough, use ROI.
  • If still more speed is required and the problem falls into one of the well-defined classes, use a solver specialized for that class (for example, lpSolve is recommended for LP and quadprog for QP).
  • If the problem does not belong to any such class, a general nonlinear-optimization solver must be used. If a local solution is sufficient, wrapper packages give access to many solvers. If a global solver is needed, a good choice is the gloptim package, which wraps many global solvers.
