The difference is simply which side of the matrix A they multiply in the equation that defines them.
A right eigenvector is a column vector X_R satisfying:
A * X_R = lambda_R * X_R
A left eigenvector is a row vector X_L satisfying:

X_L * A = lambda_L * X_L

The left and right eigenvalues of A are the same set, since A and its transpose share the same characteristic polynomial; only the eigenvectors differ (unless A is symmetric).
From Wikipedia: "Many disciplines traditionally represent vectors as matrices with a single column rather than as matrices with a single row. For that reason, the word 'eigenvector' in the context of matrices almost always refers to a right eigenvector."
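The two definitions can be checked numerically. A minimal sketch with NumPy, using a hypothetical 2x2 matrix: a left eigenvector of A is just a right eigenvector of A transposed, so `np.linalg.eig(A.T)` recovers the left eigenvectors.

```python
import numpy as np

# Hypothetical example matrix (non-symmetric, so left and right
# eigenvectors genuinely differ).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Right eigenvectors: columns of V satisfy A @ V[:, i] = w[i] * V[:, i].
w, V = np.linalg.eig(A)
x_R = V[:, 0]
assert np.allclose(A @ x_R, w[0] * x_R)

# Left eigenvectors: a row vector x_L with x_L @ A = lam * x_L is the
# transpose of a right eigenvector of A.T (with the same eigenvalue).
w_left, U = np.linalg.eig(A.T)
x_L = U[:, 0]  # stored as a column; it acts as a row vector on the left
assert np.allclose(x_L @ A, w_left[0] * x_L)

# Both routes yield the same eigenvalues, possibly in a different order.
assert np.allclose(np.sort(w), np.sort(w_left))
```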