# partialcorri

Partial correlation coefficients adjusted for internal variables

## Description

rho = partialcorri(y,x) returns the sample linear partial correlation coefficients between pairs of variables in y and x, adjusting for the remaining variables in x.

rho = partialcorri(y,x,z) returns the sample linear partial correlation coefficients between pairs of variables in y and x, adjusting for the remaining variables in x, after first controlling both x and y for the variables in z.

rho = partialcorri(___,Name,Value) returns the sample linear partial correlation coefficients with additional options specified by one or more name-value pair arguments, using input arguments from any of the previous syntaxes. For example, you can specify whether to use Pearson or Spearman partial correlations, or specify how to treat missing values.

[rho,pval] = partialcorri(___) also returns a matrix pval of p-values for testing the hypothesis of no partial correlation against the one- or two-sided alternative that there is a nonzero partial correlation.
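Conceptually, each coefficient is the correlation between a column of y and a column of x after both have been residualized against the remaining columns of x. The following is a minimal sketch of that residual-based definition, written in Python with NumPy purely for illustration (partialcorri itself is the MATLAB implementation, and the helper name here is hypothetical):

```python
import numpy as np

def partial_corr_i(y, x):
    """Residual-based sketch: partial correlation of each column of y with
    each column of x, adjusting for the remaining columns of x."""
    n, px = x.shape
    py = y.shape[1]
    rho = np.empty((py, px))
    for j in range(px):
        # Controlling regressors: an intercept plus every column of x except j.
        A = np.column_stack([np.ones(n), np.delete(x, j, axis=1)])
        # Residualize x[:, j] and all columns of y against A.
        rx = x[:, j] - A @ np.linalg.lstsq(A, x[:, j], rcond=None)[0]
        ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        for i in range(py):
            rho[i, j] = (rx @ ry[:, i]) / np.sqrt((rx @ rx) * (ry[:, i] @ ry[:, i]))
    return rho
```

When x has a single column there is nothing left to adjust for, and the result reduces to the ordinary Pearson correlation.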

## Examples

Compute partial correlation coefficients for each pair of variables in the x and y input matrices, while controlling for the effects of the remaining variables in x.

The data contains measurements from cars manufactured in 1970, 1976, and 1982. It includes MPG and Acceleration as performance measures, and Displacement, Horsepower, and Weight as design variables. Acceleration is the time required to accelerate from 0 to 60 miles per hour, so a high value for Acceleration corresponds to a vehicle with low acceleration.

Load the sample data, then define the input matrices. The y matrix includes the performance measures, and the x matrix includes the design variables.

load carsmall
y = [MPG,Acceleration];
x = [Displacement,Horsepower,Weight];

Compute the partial correlation coefficients. Include only rows with no missing values in the computation.

rho = partialcorri(y,x,'Rows','complete')
rho = 2×3

-0.0537   -0.1520   -0.4856
-0.3994   -0.4008    0.4912

The results suggest, for example, a partial correlation of 0.4912 between weight and acceleration after controlling for the effects of displacement and horsepower. You can return the p-values as a second output and examine them to confirm whether these correlations are statistically significant.

For a clearer display, create a table with appropriate variable and row labels.

rho = array2table(rho, ...
'VariableNames',{'Displacement','Horsepower','Weight'}, ...
'RowNames',{'MPG','Acceleration'});

disp('Partial Correlation Coefficients')
Partial Correlation Coefficients
disp(rho)
Displacement    Horsepower     Weight
____________    __________    ________

MPG              -0.053684       -0.15199     -0.48563
Acceleration      -0.39941       -0.40075      0.49123

Test for partial correlation between pairs of variables in the x and y input matrices, while controlling for the effects of the remaining variables in x plus additional variables in matrix z.

As in the previous example, the data contains measurements from cars manufactured in 1970, 1976, and 1982, with MPG and Acceleration as performance measures and Displacement, Horsepower, and Weight as design variables.

Create a new variable Headwind, and randomly generate data to represent the notion of an average headwind along the performance measurement route.

rng('default');  % For reproducibility
Headwind = 10 + 5*randn(size(MPG));  % Simulated headwind values (illustrative)

Since headwind can affect the performance measures, control for its effects when testing for partial correlation between the remaining variables.

Define the input matrices. The y matrix includes the performance measures, and the x matrix includes the design variables. The z matrix contains the additional variable, Headwind, to control for when computing the partial correlations.

y = [MPG,Acceleration];
x = [Displacement,Horsepower,Weight];
z = Headwind;

Compute the partial correlation coefficients. Include only rows with no missing values in the computation.

[rho,pval] = partialcorri(y,x,z,'Rows','complete')
rho = 2×3

 0.0572   -0.1055   -0.5736
-0.3845   -0.3966    0.4674

pval = 2×3

0.5923    0.3221    0.0000
0.0002    0.0001    0.0000

The small returned p-value of 0.0001 in pval indicates, for example, a significant negative partial correlation between horsepower and acceleration, after controlling for displacement, weight, and headwind.

For a clearer display, create tables with appropriate variable and row labels.

rho = array2table(rho, ...
'VariableNames',{'Displacement','Horsepower','Weight'}, ...
'RowNames',{'MPG','Acceleration'});

pval = array2table(pval, ...
'VariableNames',{'Displacement','Horsepower','Weight'}, ...
'RowNames',{'MPG','Acceleration'});

disp('Partial Correlation Coefficients, Accounting for Headwind')
Partial Correlation Coefficients, Accounting for Headwind
disp(rho)
Displacement    Horsepower     Weight
____________    __________    ________

MPG               0.057197       -0.10555     -0.57358
Acceleration      -0.38452       -0.39658       0.4674
disp(pval)
Displacement    Horsepower      Weight
____________    __________    __________

MPG                 0.59233        0.32212    3.4401e-09
Acceleration     0.00018272     0.00010902    3.4091e-06

## Input Arguments

x — Data matrix, specified as an n-by-px matrix. The rows of x correspond to observations, and the columns correspond to variables.

Data Types: single | double

y — Data matrix, specified as an n-by-py matrix. The rows of y correspond to observations, and the columns correspond to variables.

Data Types: single | double

z — Data matrix, specified as an n-by-pz matrix. The rows of z correspond to observations, and the columns correspond to variables.

Data Types: single | double

### Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

Example: 'Type','Spearman','Rows','complete' computes Spearman partial correlations using only the data in rows that contain no missing values.

Type of partial correlations to compute, specified as the comma-separated pair consisting of 'Type' and either 'Pearson' (the default) or 'Spearman'. 'Pearson' computes the Pearson (linear) partial correlations. 'Spearman' computes the Spearman (rank) partial correlations.

Example: 'Type','Spearman'
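As an aside, the rank-based variant can be sketched outside MATLAB: a Spearman correlation is the Pearson correlation of the rank-transformed data. A small Python/SciPy illustration of that relationship (an analogy, not the partialcorri source):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.standard_normal(30)
y = 2 * x + rng.standard_normal(30)

# Spearman correlation equals the Pearson correlation of the ranks.
r_spearman = stats.spearmanr(x, y)[0]
r_pearson_on_ranks = np.corrcoef(stats.rankdata(x), stats.rankdata(y))[0, 1]
print(r_spearman, r_pearson_on_ranks)  # the two values agree
```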

Rows to use in computation, specified as the comma-separated pair consisting of 'Rows' and one of the following.

• 'all' — Use all rows regardless of missing (NaN) values.

• 'complete' — Use only rows with no missing values.

• 'pairwise' — Use all available values in each column of y when computing the partial correlation coefficients and p-values corresponding to that column. For each column of y, rows are dropped corresponding to missing values in x (and/or z, if supplied). However, remaining rows with valid values in that column of y are used, even if there are missing values in other columns of y.

Example: 'Rows','complete'
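As a hedged illustration (in Python rather than MATLAB) of the 'pairwise' rule above: rows with a NaN in x are always dropped, while a NaN in one column of y removes rows only for that column's coefficients. This sketch assumes equivalent masking logic; it is not the partialcorri source.

```python
import numpy as np

x = np.array([[1.0, 2.0],
              [2.0, np.nan],   # NaN in x: row dropped for every column of y
              [3.0, 1.0],
              [4.0, 0.0]])
y = np.array([[1.0, 5.0],
              [2.0, 6.0],
              [np.nan, 7.0],   # NaN in y[:, 0]: row dropped only for column 0
              [4.0, 8.0]])

x_ok = ~np.isnan(x).any(axis=1)  # rows valid in x (and z, if supplied)
rows_used = [x_ok & ~np.isnan(y[:, i]) for i in range(y.shape[1])]
for i, mask in enumerate(rows_used):
    print(f"y column {i} uses rows: {np.flatnonzero(mask)}")
```

Here column 0 of y uses rows 0 and 3, while column 1 additionally keeps row 2, even though that row is missing a value in the other column of y.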

Alternative hypothesis to test against, specified as the comma-separated pair consisting of 'Tail' and one of the following.

• 'both' — Test the alternative hypothesis that the correlation is not zero.

• 'right' — Test the alternative hypothesis that the correlation is greater than zero.

• 'left' — Test the alternative hypothesis that the correlation is less than zero.

Example: 'Tail','right'

## Output Arguments

Sample linear partial correlation coefficients, returned as a py-by-px matrix.

• If you input x and y matrices, the (i,j)th entry is the sample linear partial correlation between the ith column in y and the jth column in x, controlled for all the columns of x except column j.

• If you input x, y, and z matrices, the (i,j)th entry is the sample linear partial correlation between the ith column in y and the jth column in x, adjusted for all the columns of x except column j, after first controlling both x and y for the variables in z.

p-values, returned as a matrix. Each element of pval is the p-value for the corresponding element of rho. If pval(i,j) is small, then the corresponding partial correlation rho(i,j) is statistically significantly different from zero.

partialcorri computes p-values for linear and rank partial correlations using a Student's t distribution for a transformation of the correlation. This is exact for linear partial correlation when x and z are normal, but is a large-sample approximation otherwise.
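The transformation referred to above is the usual Student's t statistic for a partial correlation, t = rho*sqrt(df/(1 - rho^2)) with df = n - k - 2, where n is the number of observations and k is the number of variables controlled for. A Python/SciPy sketch (the function name and the exact degrees-of-freedom bookkeeping are assumptions for illustration, not taken from the partialcorri source):

```python
import numpy as np
from scipy import stats

def partial_corr_pvalue(rho, n, k, tail='both'):
    """p-value for a partial correlation rho from n observations,
    controlling for k variables, via the Student's t transformation."""
    df = n - k - 2
    t = rho * np.sqrt(df / (1.0 - rho**2))
    if tail == 'both':
        return 2 * stats.t.sf(abs(t), df)   # two-sided
    if tail == 'right':
        return stats.t.sf(t, df)            # P(T > t)
    return stats.t.cdf(t, df)               # 'left': P(T < t)
```

For example, a partial correlation of exactly zero gives a two-sided p-value of 1, and the 'left' and 'right' p-values for any rho sum to 1.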

## References

[1] Stuart, Alan, K. Ord, and S. Arnold. Kendall's Advanced Theory of Statistics. 6th edition, Volume 2A, Chapter 28, Wiley, 2004.

[2] Fisher, Ronald A. "The Distribution of the Partial Correlation Coefficient." Metron 3 (1924): 329-332.