Verifying image simulation with measurements (like MTF) is not really trivial, either. Perhaps the most interesting test cases are aliased systems, which can violate assumptions made in modeling. Modeling aliased systems is not really a special case, but it is very difficult. In my experience, the best algorithm can depend on the data, so defaults must be chosen based on a criterion like “usually among the best” or “rarely among the worst.” Placing qualifiers like that on the quality of the result does little to instill confidence in its accuracy. This is not to say it can’t be done. If you read the docstring for the conv method, it gives a terse outline of how the algorithm handles gridding to maintain accuracy. Beyond this, the only thing you really have to do if you want the last word in accuracy is to do things outside the natural order, deferring interaction with aliased data until the last possible moment. This prevents the aliases from corrupting the lower frequencies during intermediate steps.
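To make that idea concrete, here is a minimal numpy sketch: stay on an oversampled grid, apply the blur in the frequency domain there, and only resample to the detector grid (where the aliasing lives) as the very last step. The oversampling factor, the Gaussian MTF, and the block-averaging detector model are illustrative assumptions, not prysm's actual algorithm.

```python
# Minimal sketch of "deferring interaction with aliased data":
# do all intermediate work on an oversampled grid, and only bin down to
# the (aliased) detector grid at the end.  The 8x oversampling and
# Gaussian blur are assumptions for illustration only.
import numpy as np

oversample = 8                    # work 8x finer than the detector grid
n_det = 128                       # detector pixels across
n = n_det * oversample

# toy target: a vertical edge on the oversampled grid
target = np.zeros((n, n))
target[:, n // 2:] = 1.0

# toy optical blur: radially symmetric Gaussian MTF applied in the frequency domain
fx = np.fft.fftfreq(n)            # cycles per oversampled sample
FX, FY = np.meshgrid(fx, fx)
mtf = np.exp(-(FX**2 + FY**2) / (2 * 0.05**2))

blurred = np.fft.ifft2(np.fft.fft2(target) * mtf).real

# only now touch the aliased grid: average oversample x oversample blocks
# to emulate integration onto detector pixels
detector_img = blurred.reshape(n_det, oversample, n_det, oversample).mean(axis=(1, 3))
```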
On the measurement side, the algorithms are complicated, too. It would be foolish to think one can implement a suitably robust piece of code quickly, since improvements are usually made in response to its failure on some data, and exposure to that data comes with time. This is also not to say it is impossible to develop your own high-quality measurement algorithms.
I’ve cross-compared prysm to MTF Mapper, simulating a few different targets which all have different aliasing properties. Frans and I have shown that the image simulation tool in MTF Mapper itself [1] and prysm [2] agree on those cases to the 1e-6 level of accuracy without special user intervention, which is a few digits better than, say, a Trioptics or Optikos MTF bench will even report its measurements to.
prysm’s API is designed to express ideas more or less in English, with modeling/simulation parameters available as arguments. The default arguments are designed with a number of clever tricks to maintain accuracy. This means some of its internal algorithms are more complicated than the naive approach, and may have an unexpected time complexity, for better or worse.
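As an example of that English-like style, a rough sketch of a pupil-to-MTF chain follows; the class and argument names are taken from an older (v0.x) prysm release and may not match the current API, so treat them as assumptions rather than a reference.

```python
# Rough sketch of a pupil -> PSF -> MTF chain in the style of prysm's
# object-oriented (v0.x) API.  Class names, argument names, and units are
# assumptions and may differ between versions.
from prysm import NollZernike, PSF, MTF

# a pupil with a quarter wave of spherical aberration (Noll Z11)
pupil = NollZernike(Z11=0.25, dia=10, wavelength=0.55)

# propagate to the image plane; the sampling/oversampling defaults are
# where the accuracy-preserving tricks live
psf = PSF.from_pupil(pupil, efl=50)

# take the MTF from the PSF
mtf = MTF.from_psf(psf)
```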
[1] MTF Mapper uses a discrete-point, non-gridded method which is more akin to Rayleigh-Sommerfeld, with some differences.
[2] prysm uses a classical numerical-methods approach.
Testing your code in MTF measurement scenarios is probably more punishing than with typical scenes, since the targets are all going to be aliased, whereas a natural scene may have little to no appreciable aliasing. The benefit is that you know the truth in the analytic cases. I suppose a natural scene has a concept of truth as well, since you can compare the Fourier spectrum, but simply acquiring an accurate Fourier spectrum of an image is harder than fft(im), which is something probably not well appreciated by most engineers and scientists.
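To illustrate what “harder than fft(im)” means, here is a minimal numpy sketch of estimating an image’s spectrum that accounts for sample spacing, leakage from the finite window, and normalization; the Hann window and the particular scaling used are common choices, not the only correct ones.

```python
# Minimal sketch of why an accurate spectrum takes more than fft(im):
# the raw FFT ignores sample spacing, spectral leakage from the finite
# (non-periodic) window, and normalization.
import numpy as np

def image_spectrum(im, dx):
    """Approximate the continuous 2D Fourier spectrum of an image sampled at pitch dx."""
    ny, nx = im.shape
    # window to suppress leakage from the non-periodic image boundary
    window = np.outer(np.hanning(ny), np.hanning(nx))
    # scale by the sample area so the DFT approximates the continuous transform
    spectrum = np.fft.fftshift(np.fft.fft2(im * window)) * dx * dx
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))  # cycles per unit length
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
    return fx, fy, spectrum
```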