I agree, which is why I wrote "if you are viewing the image at a more realistic 50% crop where four pixels get binned down to one then f/11 is easily good enough" and also when I wrote, again referring to f/11, that "one might argue that you actually don't need the 7D's 18MP, though I think there's a case for having them available during post-processing". Don't forget that I enjoy 21 MP so I'm hardly a megapixel sceptic.
So far as stopping down significantly into the diffraction-limited zone in order to maximise depth of field is concerned, I really don't believe you gain anything: all you do is add blur to the image, blur which you can either see (if the print or screen image is large enough) or can't (if it's small enough). I'll try to explain why I take that view.
I'll start with the Diffraction Limit Calculator at the bottom of CambridgeInColour's Pixel Size, Aperture and Airy Disks tutorial. We already know from the table in my earlier post that any lens stopped down to f/8 or narrower is diffraction limited so far as the 7D sensor is concerned, at least when viewing at a 100% crop. But the Diffraction Limit Calculator shows that if you leave the initial defaults (print size, viewing distance etc.) in place, enter 18 MP and f/13, and click Calculate, the resultant print is not diffraction limited. Double the print size to 20" and recalculate, though, and the print is so limited. The fact that f/13 wasn't diffraction limiting in the 10" case is directly due to how much detail the observer can see when printing at that size and viewing from that distance. You can check that by changing the "Eyesight" field to 20/20 and recalculating: the 10" case then becomes diffraction limited too.
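The calculator's behaviour can be reproduced roughly from first principles. The Python sketch below uses the usual Airy disk formula (diameter ≈ 2.44 × λ × f-number, with λ = 550nm green light) and a circle of confusion derived from the print size, assuming a 5 lp/mm detail standard for viewing at 25cm and the 7D's 22.3mm-wide sensor. The lp/mm figure and viewing model are my assumptions, not necessarily what CambridgeInColour uses internally:

```python
# Sketch: is a print "diffraction limited"? Assumed model: the print looks
# diffraction limited once the Airy disk (projected back onto the sensor)
# exceeds the circle of confusion allowed by the viewing conditions.

WAVELENGTH_MM = 0.00055      # green light, 550 nm
SENSOR_WIDTH_MM = 22.3       # Canon 7D (APS-C) sensor width
PRINT_LPMM = 5.0             # assumed resolvable detail at 25 cm viewing

def airy_disk_mm(f_number):
    """Airy disk diameter on the sensor, in mm."""
    return 2.44 * WAVELENGTH_MM * f_number

def sensor_coc_mm(print_width_in):
    """Circle of confusion on the sensor for a given print width."""
    enlargement = (print_width_in * 25.4) / SENSOR_WIDTH_MM
    return (1.0 / PRINT_LPMM) / enlargement   # 0.2 mm on the print, scaled back

def diffraction_limited(f_number, print_width_in):
    return airy_disk_mm(f_number) > sensor_coc_mm(print_width_in)

print(diffraction_limited(13, 10))   # False: 10" print at f/13 is OK
print(diffraction_limited(13, 20))   # True: 20" print at f/13 is limited
```

Note that with these assumptions f/13 squeaks under the 10" threshold by less than 1%, which is consistent with the calculator flipping to "limited" as soon as sharper eyesight (i.e. a higher lp/mm figure) is assumed.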
Diffraction softening has a direct effect on the "circle of confusion", from which depth of field is derived. You can play the same game as with the Diffraction Limit Calculator using the Depth of Field Calculator at the top of this page. Accept the defaults already in place, plug in a 35mm lens at f/11 focussed at 3m, and click Calculate: the depth of field is 4.459m. Leave everything the same except the print size, which once more goes up to 20". Recalculate and the depth of field drops to 1.719m, less than half what one had at 10". The hidden assumption in the calculation is, of course, that the sensor plus lens can provide enough resolution to sensibly fill the print, but the principle stands: bigger prints produce shallower depths of field.
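Those two numbers can be reproduced with the standard thin-lens depth-of-field formulae. In the sketch below I've assumed the calculator's default circle of confusion works out to 0.020mm for the 10" case, so doubling the print size halves it to 0.010mm; the 0.020mm figure is my inference from the results, not a documented default:

```python
# Standard depth-of-field calculation from focal length, f-number,
# subject distance and circle of confusion (all lengths in mm).

def depth_of_field_mm(focal_mm, f_number, subject_mm, coc_mm):
    """Total DoF; returns infinity when the far limit reaches infinity."""
    f2 = focal_mm ** 2
    spread = f_number * coc_mm * (subject_mm - focal_mm)
    near = subject_mm * f2 / (f2 + spread)
    if f2 <= spread:              # far limit at or beyond infinity
        return float('inf')
    far = subject_mm * f2 / (f2 - spread)
    return far - near

# 35 mm lens, f/11, focussed at 3 m:
print(depth_of_field_mm(35, 11, 3000, 0.020) / 1000)  # ~4.459 m (10" print)
print(depth_of_field_mm(35, 11, 3000, 0.010) / 1000)  # ~1.719 m (20" print)
```

Halving the permissible circle of confusion more than halves the total depth of field at these settings, which matches the "less than half" result from the calculator.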
The effect of that hidden assumption in the Depth of Field Calculator can be seen if you plug f/64 into the aperture as you'll get an "infinite" depth of field. That's technically correct but with an Airy disk diameter of 85.9µm at f/64, some 20 times the width of the 7D's photosites, the resultant image will be a horrible blur at any reasonable print size, roughly equivalent to the image from a 43,000 pixel (0.043 MP) sensor!
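The 0.043 MP figure can be sanity-checked by tiling the sensor with Airy-disk-sized cells. A deliberately crude sketch, assuming 550nm light and the 7D's 22.3 x 14.9mm sensor with roughly 4.3µm photosites:

```python
# Rough sketch: treat each Airy disk as one "effective pixel" and count
# how many fit on the sensor at f/64.

WAVELENGTH_UM = 0.55                     # green light, 550 nm
SENSOR_W_MM, SENSOR_H_MM = 22.3, 14.9    # Canon 7D sensor
PIXEL_PITCH_UM = 4.3                     # approx. 7D photosite width

airy_um = 2.44 * WAVELENGTH_UM * 64              # Airy disk diameter at f/64
ratio = airy_um / PIXEL_PITCH_UM                 # disk width vs photosite width
effective_px = (SENSOR_W_MM * 1000 / airy_um) * (SENSOR_H_MM * 1000 / airy_um)

print(round(airy_um, 1))       # 85.9 (µm)
print(round(ratio))            # 20 (times the photosite width)
print(round(effective_px))     # roughly 45,000 "pixels"
```

The crude tiling gives roughly 45,000 effective pixels rather than 43,000; the exact count depends on how you model overlapping disks, but the order of magnitude is the point.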
That's an extreme case but I think the argument stands: stopping down significantly beyond the diffraction-limiting aperture for a particular sensor doesn't actually get you anything but a dimmer image. The theoretical increase in depth of field is bought at the expense of blurring the whole image, which means the image has to be viewed at a smaller size for the sharp bits to look sharp. But if you have to view the image at such a reduced size then you might just as well have left the aperture at the diffraction limit, since the perceived depth of field will have increased anyway because of the smaller image. Complicated, innit!
Just to reiterate, I'm a megapixel junkie, but I think it's important to understand the limits physics imposes if you want to make the best use of all those pixels in any particular image.
P.S. My apologies to those, including popo, who undoubtedly know a lot of what I've just written. There was certainly no intention to try and "teach Granny to suck eggs" but I thought it important to try and justify my argument. Helps me think things through as well.