Bayesian linear regression - avoid matrix inversion? #667
tom-metherell started this conversation in Ideas · Replies: 0
First of all, thank you for all that you do!
I was looking at the internal function `estimice`, and I see that the default implementation uses QR decomposition to compute the least squares estimates. Then, the covariance matrix of $\mathbf{\hat{\beta}}$ is computed by inverting $\mathbf{R^{T}R}$. However, using the identity $\mathbf{(R^{T}R)^{-1}} \equiv \mathbf{R^{-1}(R^{T})^{-1}} \equiv \mathbf{R^{-1}(R^{-1})^{T}}$, I think you could instead find $\mathbf{R^{-1}}$ by back substitution (as $\mathbf{R}$ is upper triangular) and multiply it by its own transpose, which is (generally, at least) faster and more numerically stable, so it could perhaps reduce the number of cases where the ridge penalty has to be applied.
So instead of something like the following (a self-contained sketch with made-up data, paraphrasing rather than quoting the exact `estimice` code):
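```r
# Sketch of the current approach: form R^T R and invert it directly.
# (Hypothetical data; only the last two lines mirror the idea in estimice.)
set.seed(1)
x <- cbind(1, matrix(rnorm(100 * 3), nrow = 100))  # design matrix with intercept
y <- rnorm(100)
fit <- lm.fit(x = x, y = y)  # least squares via QR decomposition
R <- qr.R(fit$qr)            # upper-triangular factor R
v <- solve(crossprod(R))     # (R^T R)^{-1}: inverts the full p x p matrix
```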
you could have (same setup; `backsolve` and `tcrossprod` are base R):
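```r
Rinv <- backsolve(R, diag(ncol(R)))  # R^{-1} by back substitution (R is upper triangular)
v <- tcrossprod(Rinv)                # R^{-1} (R^{-1})^T = (R^T R)^{-1}

# Quick check that the two routes agree up to floating-point error:
all.equal(solve(crossprod(R)), tcrossprod(backsolve(R, diag(ncol(R)))))
```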
It's a minor point, and maybe there's a good reason this shouldn't be done that I haven't thought of, but just thought I'd say in case it can be useful!