# Pass Extra Parameters in Problem-Based Approach

In an optimization problem, the objective or constraint functions sometimes have parameters in addition to the independent variable. The extra parameters can be data, or can represent variables that do not change during the optimization.

To include these parameters in the problem-based approach, simply refer to workspace variables in your objective or constraint functions.
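For instance, a scalar workspace variable can appear directly in an objective expression. The following is a minimal sketch (the variable names are illustrative, not part of the example that follows):

```matlab
% a is an ordinary workspace variable, not an optimization variable
a = 3;
x = optimvar('x');
prob = optimproblem;
% The current value of a is captured when the expression is formed
prob.Objective = (x - a)^2;
sol = solve(prob);    % sol.x is approximately 3
```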

### Least-Squares Problem with Passed Data

For example, suppose that you have matrices `C` and `d` in the `particle.mat` file, and these matrices represent data for your problem. Load the data into your workspace.

```matlab
load particle
```

View the sizes of the matrices.

```matlab
disp(size(C))
```
```
        2000         400
```
```matlab
disp(size(d))
```
```
        2000           1
```

Create an optimization variable `x` of a size that is suitable for forming the vector `C*x`.

```matlab
x = optimvar('x',size(C,2));
```

Create an optimization problem to minimize the sum of squares of the terms in `C*x - d` subject to the constraint that `x` is nonnegative.

```matlab
x.LowerBound = 0;
prob = optimproblem;
expr = sum((C*x - d).^2);
prob.Objective = expr;
```

You include the data `C` and `d` in the problem simply by referring to them in the objective function expression. Solve the problem.

```matlab
[sol,fval,exitflag,output] = solve(prob)
```
```
Solving problem using lsqlin.

Minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in
feasible directions, to within the value of the optimality tolerance, and
constraints are satisfied to within the value of the constraint tolerance.

sol = struct with fields:
    x: [400x1 double]

fval = 22.5795

exitflag = 
    OptimalSolution

output = struct with fields:
            message: '...'
          algorithm: 'interior-point'
      firstorderopt: 9.9673e-07
    constrviolation: 0
         iterations: 9
       linearsolver: 'sparse'
       cgiterations: []
             solver: 'lsqlin'
```
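Keep in mind that the objective expression captures the values of `C` and `d` at the time you create it; changing the workspace data afterward does not update the problem. To solve with new data, rebuild the expression. A sketch, assuming the variables from this example are still in the workspace:

```matlab
% Modify the data, then re-create the objective so the new values are captured
d = d + 0.1;                          % modified data (illustrative change)
prob.Objective = sum((C*x - d).^2);   % rebuild with the current C and d
[sol,fval] = solve(prob);
```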

### Nonlinear Problem with Extra Parameters

Use the same approach for nonlinear problems. For example, suppose that you have an objective function of several variables, some of which are fixed data for the optimization.

```matlab
type parameterfun
```
```matlab
function y = parameterfun(x,a,b,c)
y = (a - b*x(1)^2 + x(1)^4/3)*x(1)^2 + x(1)*x(2) + (-c + c*x(2)^2)*x(2)^2;
```

For this objective function, `x` is a 2-element vector, and `a`, `b`, and `c` are scalar parameters. Create the optimization variable and assign the parameter values in your workspace.

```matlab
a = 4;
b = 2.1;
c = 4;
x = optimvar('x',2);
```

Create an optimization problem. Because this objective function is a polynomial in `x`, a supported operation on optimization variables, you can specify the objective directly in terms of the optimization variable. Solve the problem starting from the point `x0.x = [1/2;1/2]`.

```matlab
prob = optimproblem;
prob.Objective = parameterfun(x,a,b,c);
x0.x = [1/2;1/2];
[sol,fval] = solve(prob,x0)
```
```
Solving problem using fminunc.

Local minimum found.

Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.

sol = struct with fields:
    x: [2x1 double]

fval = -1.0316
```

If `parameterfun` were not composed of supported functions, you would convert `parameterfun` to an optimization expression and set the converted expression as the objective. See Supported Operations on Optimization Variables and Expressions and Convert Nonlinear Function to Optimization Expression.

```matlab
expr = fcn2optimexpr(@parameterfun,x,a,b,c);
prob.Objective = expr;
[sol,fval] = solve(prob,x0)
```
```
Solving problem using fminunc.

Local minimum found.

Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.

sol = struct with fields:
    x: [2x1 double]

fval = -1.0316
```