Definition MVP. Matrix-Vector Product.
Suppose $A$ is an $m\times n$ matrix with columns $\mathbf{A}_1,\mathbf{A}_2,\dots,\mathbf{A}_n$ and $\mathbf{u}$ is a vector of size $n$. Then the matrix-vector product of $A$ with $\mathbf{u}$ is the linear combination
$$A\mathbf{u} = [\mathbf{u}]_1\mathbf{A}_1 + [\mathbf{u}]_2\mathbf{A}_2 + \cdots + [\mathbf{u}]_n\mathbf{A}_n.$$
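For instance, with a small matrix and vector chosen only to illustrate the definition,
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix},\qquad \mathbf{u} = \begin{bmatrix} 5 \\ 6 \end{bmatrix},\qquad A\mathbf{u} = 5\begin{bmatrix} 1 \\ 3 \end{bmatrix} + 6\begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix}.$$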
City        | Temp | Colleges | Superfund | Crime
Los Angeles |   77 |       28 |        93 |   254
Chicago     |   84 |       38 |        85 |   363
New York    |   84 |       99 |         1 |   193
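The numerical portion of a table like this can be stored as a matrix, one row per city, and Definition MVP then gives a way to extract a single column: multiply by a standard unit vector. Here is a small Sage illustration, with the entries copied from the table above (the name cities is chosen just for this sketch):
cities = matrix(QQ, [[77, 28, 93, 254],
                     [84, 38, 85, 363],
                     [84, 99,  1, 193]])
cities*vector(QQ, [1, 0, 0, 0])   # 1*(first column) + 0 + 0 + 0, the Temp column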
A = matrix(QQ, [[1, -3, 4, 5],
[2, 3, -2, 0],
[5, 6, 8, -2]])
v = vector(QQ, [2, -2, 1, 3])
A*v
sum([v[i]*A.column(i) for i in range(len(v))])  # A*v as a linear combination of the columns of A
B = matrix(QQ, [[ 1, -3, 4, 5],
[ 2, 3, -2, 0],
[ 5, 6, 8, -2],
[-4, 1, 1, 2]])
w = vector(QQ, [1, 2, -3, 2])
B*w
w*B  # the vector-matrix product: a linear combination of the rows of B
B*w == w*B
coeff = matrix(QQ, [[-1, 3, -1, -1, 0, 2],
[ 2, -6, 1, -2, -5, -8],
[ 1, -3, 2, 5, 4, 1],
[ 2, -6, 2, 2, 1, -3]])
const = vector(QQ, [13, -25, -17, -23])
solution1 = coeff.solve_right(const)
coeff*solution1
nsp = coeff.right_kernel(basis='pivot')  # the null space of coeff, with the pivot basis
nsp
nspb = nsp.basis()
solution2 = solution1 + 5*nspb[0]+(-4)*nspb[1]+2*nspb[2]
coeff*solution2
nonnullspace = vector(QQ, [5, 0, 0, 0, 0, 0])
nonnullspace in nsp
nonsolution = solution1 + nonnullspace
coeff*nonsolution
The difference between A.solve_right(v) and A.solve_left(v) is that the former asks for a vector x such that A*x == v, while the latter asks for a vector x such that x*A == v. Given Sage’s preference for rows, a direction-neutral version of a command, if it exists, will be the “left” version. For example, there is a .right_kernel() matrix method, while the .left_kernel() and .kernel() methods are identical: the names are synonyms for the exact same routine.
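A quick check of both claims, with a small invertible matrix and names chosen here just for illustration:
M = matrix(QQ, [[1, 2], [3, 5]])
b = vector(QQ, [7, 8])
xr = M.solve_right(b)   # a vector with M*xr == b
xl = M.solve_left(b)    # a vector with xl*M == b
M*xr == b, xl*M == b, M.kernel() == M.left_kernel()
All three comparisons should return True.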
A = matrix(QQ, [[3, -1, 2, 5],
[9, 1, 2, -4]])
B = matrix(QQ, [[1, 6, 1],
[0, -1, 2],
[5, 2, 3],
[1, 1, 1]])
A*B
sum([A[0,k]*B[k,2] for k in range(A.ncols())])  # the entry of A*B in row 0, column 2
Note that in the final statement we could replace A.ncols() by B.nrows(), since these two quantities must be identical. You can experiment with the last statement by editing it to compute any of the five other entries of the matrix product; a check of every entry at once is sketched below.
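Using A and B from the cell above, the following comparison builds the product entry by entry and should return True (the name entrywise is just for this sketch):
entrywise = matrix(QQ, A.nrows(), B.ncols(),
                   lambda i, j: sum(A[i,k]*B[k,j] for k in range(A.ncols())))
entrywise == A*B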
Square matrices of the same size can be multiplied in either order, but the two products will almost always differ. Execute the following cell repeatedly; if it ever produces True, it will be a minor miracle.
A = random_matrix(QQ,4,4)
B = random_matrix(QQ,4,4)
A*B == B*A # random, sort of
The transpose of a product is the product of the transposes in the reverse order, so the comparison in the next cell should always yield True. Repeated experimental evidence does not make a proof, but certainly gives us confidence.
A = random_matrix(QQ, 3, 7)
B = random_matrix(QQ, 7, 5)
(A*B).transpose() == B.transpose()*A.transpose()
A = matrix(QQbar, [[ 45, -5-12*I, -1-15*I, -56-8*I],
[-5+12*I, 42, 32*I, -14-8*I],
[-1+15*I, -32*I, 57, 12+I],
[-56+8*I, -14+8*I, 12-I, 93]])
A.is_hermitian()  # True: A equals its conjugate transpose
The vectors x and y below are random, but according to Theorem HMIP the final command should produce True for any possible values of these two vectors. (You would be right to think that using random vectors over QQbar would be a better idea, but at this writing, these vectors are not as “random” as one would like, and are insufficient to perform an accurate test here.)
x = random_vector(QQ, 4) + QQbar(I)*random_vector(QQ, 4)
y = random_vector(QQ, 4) + QQbar(I)*random_vector(QQ, 4)
(A*x).hermitian_inner_product(y) == x.hermitian_inner_product(A*y)