Just place a multihole screen in front of a two- or three-lens array; the software combines the multiple images from the through-grid views while the grid absorbs smudges.
One effect of this is that you can't see where on the iPad or iPhone the lenses are, giving the iThing an inconspicuous, nonstartling way to make images through what might look like fabric or mesh.
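A minimal sketch of how the combining step might work, assuming each lens sits at a slightly different offset behind the screen so a pixel blocked (or smudged) in one view is clear in at least one other; the function, the mask representation, and the simple weighted average are illustrative, not any actual iThing pipeline:

    import numpy as np

    def combine_through_grid(views, grid_masks):
        """Fuse several camera views taken through a multihole screen.

        views:      list of HxW (or HxWx3) float arrays, one per lens.
        grid_masks: list of HxW boolean arrays, True where that lens's
                    pixel is blocked by the screen or a smudge.
        """
        views = [np.asarray(v, dtype=float) for v in views]
        # weight 0 for blocked pixels, 1 for clear pixels
        weights = [np.where(m, 0.0, 1.0) for m in grid_masks]
        if views[0].ndim == 3:                     # broadcast over color channels
            weights = [w[..., None] for w in weights]

        num = sum(v * w for v, w in zip(views, weights))
        den = sum(weights)
        # average the unblocked samples; pixels no lens saw stay 0
        return np.divide(num, den, out=np.zeros_like(num), where=den > 0)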
Although there is some resolution loss from the software reconstruction, 20 megapixels is current, so three 80-megapixel sensors, screened, then interpolated to 40 megapixels would still be pretty functional at 2014 to 2016 ish. -- beanangel, Oct 24 2012
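To put rough numbers on that annotation, treating "interpolated to 40 megapixels" as plain upscaling of the fused result; the target_mp figure and the bilinear-style resize are stand-ins for whatever reconstruction the software would really do:

    import numpy as np
    from scipy.ndimage import zoom

    def interpolate_to_megapixels(fused, target_mp=40e6):
        """Upscale a fused (screened) image toward a target pixel count."""
        h, w = fused.shape[:2]
        scale = np.sqrt(target_mp / (h * w))        # same factor on both axes
        factors = (scale, scale) + (1,) * (fused.ndim - 2)
        return zoom(fused, factors, order=1)        # linear interpolation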