Hello (mostly to me),
Here’s a list of blog posts I want to write, probably in the order they will be written. This is mostly for me as a reminder, but feel free to get on my case if I’ve missed out on something.
Inverse Problems as Bayesian Inference: This is an important one! Describes how I think about basically everything. Will start with the maths leading up to the classic regularised functional. Describe how this naturally leads to uncertainty quantification, as well as other useful interpretation tricks like drawing samples to understand the regularisation and pre-estimating parameters. This also makes data synthesis, design of experiments, and inter/extrapolation a breeze, but those will get proper dedicated discussions later. Include the classic, all-linear case that can be solved by stacking.
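As a teaser for the stacking trick, a minimal sketch (sizes, noise level and prior here are all made up for illustration): with a linear model y = Ax + e, Gaussian noise and a Gaussian prior, the MAP estimate is just an ordinary least-squares solve on stacked blocks, and the posterior covariance comes from the same stacked matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: y = A x + e, with more unknowns than data so the prior matters.
n, m = 20, 15
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)

# Noise e ~ N(0, sigma^2 I), prior x ~ N(0, delta^-2 I).
sigma, delta = 0.1, 1.0
y = A @ x_true + sigma * rng.standard_normal(m)

# The MAP estimate minimises ||(y - A x) / sigma||^2 + ||delta x||^2,
# i.e. an ordinary least-squares problem on stacked blocks.
A_stk = np.vstack([A / sigma, delta * np.eye(n)])
y_stk = np.concatenate([y / sigma, np.zeros(n)])
x_map, *_ = np.linalg.lstsq(A_stk, y_stk, rcond=None)

# The posterior covariance falls out of the very same stacked matrix.
post_cov = np.linalg.inv(A_stk.T @ A_stk)
```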
Linear Algebra Megapost: Why and how to use matrices. Inversion and pseudoinversion. SVD and QR. How everything can be understood in terms of the SVD. Iterative solving/Landweber.
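Landweber in particular is only a few lines; a sketch on a made-up noiseless system, just to show the shape of the iteration:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
y = A @ x_true  # noiseless data, so iterating to convergence is safe

# Landweber: x_{k+1} = x_k + omega * A^T (y - A x_k), i.e. plain gradient
# descent on the data misfit ||y - A x||^2.  It converges for
# 0 < omega < 2 / sigma_max(A)^2; with noisy data, stopping early acts
# as regularisation.
omega = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(10)
for _ in range(10_000):
    x = x + omega * A.T @ (y - A @ x)
```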
Bayesian Approximation Error: A really handy trick, and another advantage of the Bayesian view as this basically falls right out.
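A rough sketch of how it falls out (both forward models below are made up for illustration): sample the prior, tabulate the discrepancy between an accurate and a cheap forward model, and fold those statistics into the noise model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two forward models -- both invented for illustration: an "accurate" one
# and a cheap approximation that drops a nonlinear term.
def forward_accurate(x):
    return x[::2] + 0.05 * x[1::2] ** 2

def forward_approx(x):
    return x[::2]

# BAE in a nutshell: draw samples from the prior, tabulate the modelling
# error eps = accurate - approx, and treat it as extra correlated Gaussian
# noise.  The inversion can then use the cheap model with the noise model
# N(e_mean + eps_mean, Gamma_e + eps_cov) at essentially no online cost.
samples = rng.standard_normal((1000, 10))
eps = np.array([forward_accurate(s) - forward_approx(s) for s in samples])
eps_mean = eps.mean(axis=0)
eps_cov = np.cov(eps, rowvar=False)
```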
Online QR: The most useful thing in my thesis, extra handy for BAE.
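As a taste, one version of the row-update trick (a sketch, not the thesis implementation): because orthogonal factors compose, the QR of the small stacked matrix [R; a_new] is enough to update the triangular factor when a new measurement row arrives.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 4))
R = np.linalg.qr(A, mode='r')  # keep only the 4x4 triangular factor

# A new measurement row arrives.  Because orthogonal factors compose, the
# QR of the small (5 x 4) stacked matrix [R; a_new] yields the triangular
# factor of the full updated system -- the old rows are never touched again.
a_new = rng.standard_normal(4)
R = np.linalg.qr(np.vstack([R, a_new[None, :]]), mode='r')
```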
Inverse Crimes: What they are, why they matter, and how to avoid committing them. This also relates to BAE, as the whole inverse crime concept becomes obvious once BAE is established. This is an important idea, and explains a lot of why experimental results often disappoint compared to synthetic ones, even though “everything” was modelled “properly”. I think this will become increasingly significant as more people try doing neural net stuff.
Discretisation Invariance: A key factor in determining the practical robustness of a prior/regulariser. I will talk about TV and smoothness priors here rather than in a dedicated blog, but that might change.
Hybrid Imaging: Or joint inversion, multiphysics, data synthesis and so on. In the Bayesian setting this is easy to do, while a lot of what is currently done seems ad hoc and messy. I don’t know if I should talk about a particular CBCT modality that would benefit from this in a blog, as I’d rather get a paper out of it first!
Design of Experiments: The Bayesian view lets us quantify uncertainty – but what about minimising uncertainty? Typical experiments might take e.g. equispaced measurements, but those might not contain the most useful information given what is already known. DoE lets us pick exactly which measurements will give the most information.
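A toy greedy design loop, to make the idea concrete (the candidate set, noise level and prior are all made up): at each step, pick the candidate measurement with the largest predicted variance under the current posterior, then apply the rank-one posterior update and repeat.

```python
import numpy as np

rng = np.random.default_rng(4)

# Candidate measurements: each row of H is one measurement we could take.
n_cand, n = 50, 8
H = rng.standard_normal((n_cand, n))
sigma = 0.1              # measurement noise std (assumed known)
post_cov = np.eye(n)     # prior covariance, updated after each pick

# Greedy (D-optimal) design: the information gain of a single row h is
# log(1 + h^T P h / sigma^2), so picking the candidate with the largest
# predicted variance h^T P h maximises the gain at each step.
chosen = []
for _ in range(5):
    gains = np.einsum('ij,jk,ik->i', H, post_cov, H)  # h_i^T P h_i for all i
    k = int(np.argmax(gains))
    chosen.append(k)
    Ph = post_cov @ H[k]
    post_cov -= np.outer(Ph, Ph) / (sigma**2 + H[k] @ Ph)  # rank-one update
```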
Synergy: This was inspired by a chat with Paul Keall, so is all in cancer treatment terms. Let A and B be treatments (e.g. A is a 5 Gy radiotherapy fraction, B is 10 ml of chemotherapy) and r(A), r(B) be the responses (e.g. tumour size reduction). There is a lot of interest in whether A and B are “synergistic” – but no one knows what that means. Maybe it means r(A+B)>r(A)+r(B). Or maybe just r(A+B)>r(A) and r(A+B)>r(B), so the combination is still worth doing. Or perhaps there is a combination of A and B better than the best A or B? How would you find this experimentally?
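To make the ambiguity concrete, here is an entirely made-up smooth response surface with a positive interaction term. Even then the strongest reading of "synergy" fails while the weaker one holds, and a simple grid search finds a mixture beating either pure agent:

```python
import numpy as np

# A made-up response surface over doses of A and B, with a positive
# interaction term (0.6 * a * b) but a saturating overall response.
def response(a, b):
    return 1 - np.exp(-(0.8 * a + 0.5 * b + 0.6 * a * b))

a0, b0 = 1.0, 1.0
r_a, r_b, r_ab = response(a0, 0), response(0, b0), response(a0, b0)

superadditive = r_ab > r_a + r_b         # the strongest reading of "synergy"
worth_combining = r_ab > max(r_a, r_b)   # the weakest reading

# Third question: at a fixed dose budget, does some mixture t*A + (1-t)*B
# beat either pure agent?  A grid search over t answers it here.
ts = np.linspace(0, 1, 101)
best_mix = ts[np.argmax(response(a0 * ts, b0 * (1 - ts)))]
```

With these numbers the combination is worth doing but not superadditive, and the best mixture is strictly interior, so the three criteria genuinely come apart.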
X-ray CT: And FBCT and CBCT because they look cooler. It’s hard to find a clear explanation of filtered back projection, ART and other iterative methods, so I’ll write my own. This is low priority though, as plenty of info is out there; I just find it more muddled than necessary.
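ART in particular fits in a few lines. A sketch on a generic toy system standing in for the real CT geometry: sweep through the rays, projecting the current image onto the hyperplane consistent with each single measurement in turn (Kaczmarz's method).

```python
import numpy as np

rng = np.random.default_rng(5)

# Generic toy system standing in for CT: each row of A is one "ray".
A = rng.standard_normal((40, 16))
x_true = rng.random(16)
y = A @ x_true  # consistent, noiseless data

# ART (Kaczmarz): cycle through the rays, projecting the current image onto
# the hyperplane of images consistent with that single measurement.
x = np.zeros(16)
for _ in range(200):                 # full sweeps over all rays
    for i in range(A.shape[0]):
        a = A[i]
        x = x + (y[i] - a @ x) / (a @ a) * a
```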
CT Ventilation Imaging: What it is and how it works. A good chance to talk about numerical approximations to derivatives (here the Jacobian, with O(1) error) and why the determinant of the Jacobian relates to the volume change. Also discuss why the smoothed estimates work better (sub-O(1) error when a correlation structure is introduced).
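Underneath the registration detail, the volume-change computation itself is small. A sketch on a smooth made-up deformation, where finite differences behave nicely (unlike the noisy displacement fields the post will get into):

```python
import numpy as np

# A toy deformation phi(x) = 1.1 x: uniform 10% expansion in each direction.
def phi(p):
    return 1.1 * p

# Central differences give each Jacobian column; det J is the local ratio
# of deformed volume to original volume.
def jacobian_det(f, p, h=1e-5):
    n = p.size
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(p + e) - f(p - e)) / (2 * h)
    return np.linalg.det(J)

# For a uniform 10% expansion, det J = 1.1^3: a 33.1% volume increase.
vol_change = jacobian_det(phi, np.array([0.2, 0.5, 0.7]))
```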