Hello,
The product of two matrices can be implemented naively with something like `(a[:,None,:]*b.T[None,:,:]).sum(-1)`. Sadly, this allocates a large 3d temporary array in the middle. For this particular operation, I should of course use `a @ b`. However, I am interested in various similar operations, the simplest example being a tropical matrix multiplication, `(a[:,None,:]+b.T[None,:,:]).min(-1)`, and then I do not have access to a magically super-optimized version like `@` anymore.
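For concreteness, here are both versions as runnable NumPy, just restating the expressions above with the shapes spelled out:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.random((100, 100))
b = rng.random((100, 100))

# Ordinary product: the broadcast builds a (100, 100, 100) temporary
# whose [i, j, k] entry is a[i, k] * b[k, j], then sums it away.
c_naive = (a[:, None, :] * b.T[None, :, :]).sum(-1)
assert np.allclose(c_naive, a @ b)

# Tropical (min-plus) product: same memory blowup, but no `@` to fall back on.
c_tropical = (a[:, None, :] + b.T[None, :, :]).min(-1)
```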
KeOps is all about avoiding big temporary arrays when we only use them in a reduction. However, it tends to concentrate on pairwise operations (indexed by i and j) where the reduction is along either axis i or axis j, whereas here my reduction (`sum(-1)` or `min(-1)`) is along some other axis k. I wonder if KeOps can still help here (I don't need optimal performance, just something reasonably fast that does not fill the whole memory), or if I should look elsewhere (there are options, but nothing super convenient; the easiest is probably to split into blocks myself, as sketched at the end of this post 😞). Some possibilities:
- Lazytensor to tensor #271 mentions a possible future conversion from LazyTensor to tensor. I can easily define my result as a LazyTensor, so this looks interesting, although I don't know how well the conversion would manage to take advantage of the final `sum`/`min`, since it is not officially presented to KeOps as a reduction.
- Inventing a trivial reduction (identity / concatenation), so that reducing along the j axis simply does nothing. The non-constant output size may be an issue.
- Somehow make (i, j) appear as one index and k as the other, so that the usual reduction could then be used to kill k. I guess I could create an integer array of shape (n², 2) that contains the pairs (i, j), an array of shape (n,) with the indices k, and see if KeOps is OK using those values as indexes into some other array (I've used a similar trick a few times to work around restrictions in sklearn), but "LazyTensors only support indexing with integers and vanilla python slices" does not make me optimistic. Even if that worked, it is not very pretty and I expect performance might not be great. A plain NumPy rendering of the intended index layout follows below.
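To make that last layout concrete, here is how the flattened indexing would read in plain NumPy. This version still materializes the (n², n) intermediate, so it only illustrates what a hypothetical KeOps formula would have to compute for a pair p and an index k:

```python
import numpy as np

n = 50
rng = np.random.default_rng(0)
a = rng.random((n, n))
b = rng.random((n, n))

# One "sample" axis p enumerates all (i, j) pairs; the other axis enumerates k.
pairs = np.stack(
    np.meshgrid(np.arange(n), np.arange(n), indexing="ij"), axis=-1
).reshape(-1, 2)  # shape (n², 2), row p holds (i, j)

# For pair p and index k, the formula is a[pairs[p, 0], k] + b[k, pairs[p, 1]],
# and the min-reduction kills k.
c_flat = (a[pairs[:, 0], :] + b[:, pairs[:, 1]].T).min(-1)  # shape (n²,)
c = c_flat.reshape(n, n)
assert np.allclose(c, (a[:, None, :] + b.T[None, :, :]).min(-1))
```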
And whatever the approach, KeOps tends to like low feature dimensions (≤ 100 according to the docs), whereas here all dimensions are n. I am probably knocking on the wrong door by trying to use KeOps here, but who knows...
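For reference, the do-it-myself blocking fallback mentioned above would look something like this (the block size is just a tuning knob; only a (block, n, p)-sized temporary is ever alive at once):

```python
import numpy as np

def tropical_matmul_blocked(a, b, block=64):
    """Min-plus product of a (m, p) and b (p, n), computed in row blocks of `a`
    so the temporary never exceeds (block, n, p)."""
    m, n = a.shape[0], b.shape[1]
    out = np.empty((m, n), dtype=np.result_type(a, b))
    bt = b.T  # (n, p), so the broadcast below matches the naive formula
    for start in range(0, m, block):
        stop = min(start + block, m)
        # (block, 1, p) + (1, n, p) -> (block, n, p), reduced immediately.
        out[start:stop] = (a[start:stop, None, :] + bt[None, :, :]).min(-1)
    return out

rng = np.random.default_rng(0)
a = rng.random((300, 200))
b = rng.random((200, 150))
assert np.allclose(tropical_matmul_blocked(a, b),
                   (a[:, None, :] + b.T[None, :, :]).min(-1))
```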
Any advice?