This is a nice TED talk about a 3D camera technology. A bright femtosecond laser pulse illuminates the whole scene at once, so there is no need to slowly scan the scene with an ordinary laser, as current LIDARs do.
It can also see through some dust and liquid water drops, which could be useful for navigation systems, among a million other uses.
Also, an idea struck me today. The nights are already dark and long, and when looking at blue-light ads on some office buildings, the usual chromatic aberration in the human eye makes them look very blurry compared to the red or green ads. That's because the eye focuses the red and green wavelengths much better (they are close to each other), at the cost of having the blue somewhat out of focus. There are also far fewer blue-detecting cone cells in the retina than red or green ones. (Greens are the most plentiful.) This is especially true in the fovea, where a yellow pigment layer filters out the blue light. So I'm not sure how much of the effect is due to chromatic aberration in the eye's lens and how much is due to sparse blue cones and the pigment layer. Otherwise I have pretty good vision and don't use eyeglasses.
But in theory you could correct the eye system into an achromatic one by pre-distorting the blue light in the opposite direction before it enters the eye.
There are achromatic lenses (which bring two wavelengths to the same focus) and even apochromatic ones (three wavelengths) out there for cameras and telescopes. The eye has not evolved such things.
So you could have an achromatic eyeglass–eye system, or even an apochromatic one with very fancy eyeglasses, though that would probably need multiple lens elements.
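Just to sketch the idea: a classic thin-lens achromat cancels the chromatic focal shift by pairing a positive crown-glass element with a negative flint-glass element so that the sum of power/Abbe-number terms is zero. Here is a minimal, illustrative calculation, assuming typical Abbe numbers for BK7 crown (~64.2) and SF5 flint (~32.2) glass and an arbitrary +1.0 diopter target power; this is a textbook toy, not how corrective eyewear is actually designed.

```python
def achromat_powers(total_power, v_crown, v_flint):
    """Split total_power (diopters) between two thin elements in contact
    so that phi1/V1 + phi2/V2 = 0, i.e. the chromatic focal shift of the
    two elements cancels while their powers sum to total_power."""
    phi1 = total_power * v_crown / (v_crown - v_flint)
    phi2 = -total_power * v_flint / (v_crown - v_flint)
    return phi1, phi2

# Example: a weak +1.0 diopter achromatic doublet from BK7 + SF5
p1, p2 = achromat_powers(1.0, 64.2, 32.2)
print(p1, p2)   # a stronger positive crown element plus a negative flint element
print(p1 + p2)  # sums back to the requested 1.0 diopter total
```

The takeaway is that even a mild achromatic correction forces each element to be considerably stronger than the combined lens, which is one reason eyeglasses don't normally bother with it.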
The inspiration for this post was a post by Ambivalent Engineer from over a year ago.