1. The real world of both audio and visual fields consists of ever-changing, naturally-created complex spectra extending beyond the limits of perception in both directions.
2a. Audio theory recognises that sound spectra are effectively infinitely variable within the bounds of audibility (low and high limits of perception). Digitised reproduction of sound-fields uses FFT algorithms to analyse and recreate spectral attributes, approximating an actual complex waveform by a composite of superimposed sine waves of as many different frequencies as required.
2b. Visual theory recognises three primary colours, and uses 3-axis colour space mapping to analyse and recreate a given light spectrum by mixing the output of three fixed-wavelength emitters at different intensities.
3. Obviously the synthesised visual spectrum is grossly simplified and much inferior to the synthesised audio spectrum.
10. Proposed is a display consisting of individual pixels. Each pixel generates synthesised light produced by an FFT process to synthesise a spectral profile (a rough numerical sketch follows this list).
11. Result: true-to-life full-colour displays.
12. There is no point twelve.
13. Thirteen is my lucky number.
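As a rough illustration of point 10, the sketch below (Python; the target spectral profile and the number of retained components are invented for illustration) approximates a complex spectral profile by keeping only a handful of FFT components, in the same spirit as the audio synthesis described in 2a:

```python
# Minimal sketch: approximate a complex spectral profile with a small number of
# sinusoidal components recovered by an FFT, as audio synthesis does for a
# pressure waveform. The target profile below is invented; a real pixel's
# profile would come from a camera or spectrometer.
import numpy as np

wavelengths = np.linspace(380, 780, 400)              # visible range, nm

# Invented target: a broad hump with a couple of absorption dips.
target = (np.exp(-((wavelengths - 550) / 120) ** 2)
          * (1 - 0.5 * np.exp(-((wavelengths - 430) / 20) ** 2)
               - 0.5 * np.exp(-((wavelengths - 662) / 15) ** 2)))

K = 12                                                # components to keep
coeffs = np.fft.rfft(target)
strongest = np.argsort(np.abs(coeffs))[::-1][:K]      # indices of the K largest
kept = np.zeros_like(coeffs)
kept[strongest] = coeffs[strongest]
approx = np.fft.irfft(kept, n=len(target))            # resynthesised profile

err = np.sqrt(np.mean((target - approx) ** 2)) / target.max()
print(f"{K} components give a relative RMS error of about {err:.3f}")
```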
Tunable laser for each pixel of true waveform color; better even than FFT
https://www.rpmclas...gth/tunable-lasers/ [beanangel, Dec 26 2020]
Spectral monitor
[xaviergisz, Dec 26 2020]
This chip is a radio that works at 670 GHz, and it's from 2012, eight years ago.
https://web.ece.ucs...12_6_17_Seo_IMS.pdf [beanangel, Dec 27 2020]
survey of optical bench on a chip technology from 24 years ago (1996)
https://www.laserfo...cal-bench-on-a-chip [beanangel, Dec 28 2020]
|
|
This would be a considerable technical challenge, use huge amounts of bandwidth, and deliver no practical benefit. |
|
|
[+] please proceed immediately. |
|
|
Your eyes only have 3 types of colour sensors, so 3 emitters (tuned more-or-less to those 3; exact response peaks vary among people...) are enough. (Except for tetrachromats, of course, but they are a rare anomaly.) It's getting good resolution in the AMPLITUDE that's important, not having lots of frequencies. Your ears have a large number of sensory cells, able to detect many frequencies, so sound needs the more "full" spectrum. |
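To put toy numbers on the annotation above, here is a sketch of trichromatic matching. The cone curves, target spectrum and primary wavelengths are invented stand-ins (not real colorimetric data), but the 3x3 linear algebra is the standard argument for why three amplitudes suffice to match any spectrum's cone response:

```python
# Sketch: three broadband "cone" curves reduce any spectrum to three numbers,
# and three narrow-band primaries can be driven to reproduce those numbers.
# All curves here are rough Gaussian stand-ins, not measured data.
import numpy as np

wl = np.linspace(380, 780, 401)                       # nm

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

cones = np.stack([gauss(565, 60),                     # "L" cone, ballpark peak
                  gauss(535, 55),                     # "M" cone
                  gauss(445, 35)])                    # "S" cone

# Invented complex target spectrum.
target = gauss(520, 90) * (1 - 0.3 * gauss(460, 25) - 0.3 * gauss(640, 30))
target_response = cones @ target                      # the three numbers the eye reports

# Three narrow-band primaries (assumed LED-like wavelengths).
primaries = np.stack([gauss(630, 10), gauss(532, 10), gauss(465, 10)])
M = cones @ primaries.T                               # cone response of each primary
weights = np.linalg.solve(M, target_response)         # drive levels for the primaries
mix_response = cones @ (primaries.T @ weights)

print("cone response to target spectrum:", np.round(target_response, 2))
print("cone response to 3-primary mix  :", np.round(mix_response, 2))
print("primary drive levels            :", np.round(weights, 2),
      "(a negative level would mean the target is out of gamut)")
```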
|
|
Yes, but that doesn't mean that an expensive, over-engineered yet standard performance solution isn't appropriate. There are plenty of military contractors who will do a superb job of producing something heavier, bulkier, higher power demand and much more costly, but no more effective than the regular version. |
|
|
If every pixel were "full range", then for the same number of emitter elements you'd get 3x the resolution. Of course "How is this to be accomplished" is the question. |
|
|
I want something with bremsstrahlung radiation. |
|
|
<obligatory pedantry> the phrase "whiter than white" came from an old laundry detergent commercial. It was actually true - one of the chemical additives to the soap downshifts UV into the visual range, so - outdoors, where the Sun provides plenty of UV - a white t-shirt washed in <whatever> would shine brighter. I think snow does the same thing. </op> |
|
|
"Each pixel generates synthesised light which is produced by a FFT process to synthesis a spectral profile." All the other organisms on the planet, which have different higher numbers of color sensors in their eyes would then see what we perceive on a computer monitor screen. |
|
|
The light source you mention is a tunable laser [link], and if they can be placed in an array then you could get screenfuls of pixels of just one true ~~~ wavelength. That is better even than an FFT composite. |
|
|
The FFT composite might only work in math space rather than retina space! I hear magenta and salmon, and maybe cyan, do not exist as direct single roygbiv waveforms, but only as the human eye's response to combined frequencies. So you might FFT-decompose and assemble a nice purple, but a human sees magenta, unless it is non-additive pure roygbiv 414 nm single-sine-wave purple. |
|
|
One different fun thing people here might like is two-photon excitation of the rhodopsin protein in the eye. When photons arrive so close together in time that two (more?) of them meet up with a rhodopsin molecule, there is sufficient electrical charge motion at the protein to adjust the rhodopsin, and the person can see IR or UV. I do not know what the perceived color is. |
|
|
That suggests very fast nanosecond or picosecond light pulse trains from a laser could have a different perceived color than their actual sinusoidal wavelength, from "overloading" rhodopsin. I don't know if it's bad for people or not, but stacking up rhodopsin with 2-24 photons really fast might work, and each of the 2-24 photon-electron events might look like a different color. |
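Back-of-envelope support for the two-photon annotations above: two photons of a long infrared wavelength together carry the energy of one photon at half that wavelength, which is why two-photon absorption in a visual pigment can register like visible light (what colour is actually perceived is a separate question):

```python
# Photon energy E = h*c/lambda. Two photons at 1000 nm add up to the energy of
# one photon at 500 nm (green), which is the usual explanation for "seeing" IR
# via two-photon excitation.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per electron-volt

def photon_energy_eV(wavelength_nm):
    return h * c / (wavelength_nm * 1e-9) / eV

ir = 1000.0          # near-infrared wavelength, nm
print(f"one {ir:.0f} nm photon  : {photon_energy_eV(ir):.2f} eV")
print(f"two {ir:.0f} nm photons : {2 * photon_energy_eV(ir):.2f} eV")
print(f"one {ir / 2:.0f} nm photon   : {photon_energy_eV(ir / 2):.2f} eV (green light)")
```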
|
|
//Your eyes only have 3 types of colour sensors// This is true, but they are not very narrow-band, and so are sensitive to all wavelengths to some extent. The conclusion is that a given light spectrum gives a certain subjective colour (qualia in philosophese) which cannot be accurately rendered by a handful of single-wavelength sources combined. |
|
|
Sound doesn't need CD-quality or higher resolution or bit-rate; old-fashioned gramophones and telephones managed to reproduce sound waveforms plenty well enough to understand and listen to. But it is generally accepted that there is a perceptible difference between Vivaldi played down the telephone line while you wait to complain about your electric bill and the same Vivaldi played from CD on a good system; so why not have similar ambitions for colour reproduction? |
|
|
Spectral light sources, such as a prism dispersing white light, give single-wavelength light at any desired wavelength. But a spot colour is not a single wavelength; it is a complex spectrum caused by the spectral profile of the light source, modified by the spectral absorbance of the material. |
|
|
Consider a photograph of an oak tree. One pixel shows one part of one leaf. The clear blue sky overhead produces not-quite black-body radiation, and the not-quiteness of its spectral profile is what gives it its distinctive appearance. Then the chemicals in the leaf remove certain wavelengths at different intensities giving a resulting spectral profile that is far far from a single wavelength of light. |
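For concreteness, here is a small sketch of that composition: a black-body curve (Planck's law) as a crude stand-in for the daylight/sky illuminant, multiplied by an invented leaf reflectance with dips near the chlorophyll absorption bands (around 430 nm and 662 nm for chlorophyll a). The temperature, dip widths and depths are illustrative guesses:

```python
# Sketch: source spectrum (Planck black body) times a leaf reflectance with
# chlorophyll-like absorption dips gives a broad, structured spectral profile,
# nothing like a single wavelength. All parameters are illustrative.
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23
wl_nm = np.linspace(380, 780, 400)
wl = wl_nm * 1e-9                                     # wavelength in metres

def planck(wavelength, T):
    """Black-body spectral radiance at temperature T (Planck's law)."""
    return (2 * h * c ** 2 / wavelength ** 5) / np.expm1(h * c / (wavelength * k * T))

source = planck(wl, 5800.0)                           # daylight-ish stand-in
source /= source.max()

# Invented leaf reflectance: dips near the chlorophyll absorption bands.
reflectance = (1.0
               - 0.85 * np.exp(-((wl_nm - 430) / 25) ** 2)
               - 0.80 * np.exp(-((wl_nm - 662) / 18) ** 2))
leaf_light = source * np.clip(reflectance, 0, 1)

peak = wl_nm[np.argmax(leaf_light)]
print(f"the light off the leaf peaks near {peak:.0f} nm, yet spans the whole visible range")
```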
|
|
The FFT attempts to approximate this complex waveform of the near-black-body-minus-chlorophyll-absorption-profile light coming off that part of the leaf. |
|
|
Any objection about the small size of the signal generator for each pixel can be resolved by making the display very large and very far away. |
|
|
//not very narrow-band//
No, they aren't, but that's the point. They all detect all visible light, but output differently depending on the frequency each cone cell is more sensitive to. What gets sent to your brain isn't "this is a 427nm photon"; it's "red cell is 20%, green cell is 37%, blue cell is 89%, etc" (across the whole retina; there is some weird pre-processing and other stuff in there too...). Your eyes can't measure light frequency in the way your ears CAN (sort of) measure sound frequency. (I don't know as much about ears as I do eyes.) |
|
|
// Your eyes can't measure light frequency in the way your ears CAN // |
|
|
Well, that's just sloppy engineering, isn't it ? Have you complained to the manufacturers ? |
|
|
There must be an upgrade available by now, shirley ? After all, we have this ... |
|
|
// can be resolved by making the display very large and very far away. // |
|
|
That's quality thinking, well beyond your usual standards. |
|
|
Vivaldi jerks the handset away from his ear: "Sapristi; I'm being played!" |
|
|
Somebody has to mention wavelets; it might as well be me. I'm very ignorant of math, but perhaps there is a halfbaker who can do actual FFT equations, and feels like looking up wavelets on wikipedia and weighing in on whether wavelets have a place when it comes to making monitor displays and lighting for humans. |
|
|
Just as the Fourier transform can make any waveform from a composite of sine waves, wavelets can make any waveform from a composite of other waves; but the thing is, far fewer waves are needed in the composite. |
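A small numerical illustration of that claim, using the simple Haar wavelet (implemented by hand) and an invented signal consisting of one narrow "emission line" on a flat background. For signals with sharp, localised features like this, far fewer wavelet coefficients are needed than Fourier coefficients; for very smooth signals the comparison can go the other way:

```python
# Compare how well K retained coefficients reconstruct a signal in the Fourier
# basis versus the Haar wavelet basis. The signal is invented: one narrow
# "emission line" on a flat background.
import numpy as np

def haar_forward(x):
    """Full orthonormal Haar decomposition of a length-2^k signal."""
    out = np.asarray(x, dtype=float).copy()
    length = len(out)
    while length > 1:
        a = out[:length].copy()
        out[:length // 2] = (a[0::2] + a[1::2]) / np.sqrt(2)          # averages
        out[length // 2:length] = (a[0::2] - a[1::2]) / np.sqrt(2)    # details
        length //= 2
    return out

def haar_inverse(coeffs):
    """Inverse of haar_forward."""
    out = np.asarray(coeffs, dtype=float).copy()
    length = 2
    while length <= len(out):
        avg = out[:length // 2].copy()
        det = out[length // 2:length].copy()
        out[0:length:2] = (avg + det) / np.sqrt(2)
        out[1:length:2] = (avg - det) / np.sqrt(2)
        length *= 2
    return out

def keep_top_k(coeffs, k):
    kept = np.zeros_like(coeffs)
    idx = np.argsort(np.abs(coeffs))[::-1][:k]
    kept[idx] = coeffs[idx]
    return kept

n, K = 256, 10
sig = np.zeros(n)
sig[100:108] = 1.0                                    # the narrow "emission line"

fft_approx = np.fft.irfft(keep_top_k(np.fft.rfft(sig), K), n=n)
haar_approx = haar_inverse(keep_top_k(haar_forward(sig), K))

rms = lambda e: np.sqrt(np.mean(e ** 2))
print(f"RMS error with {K} coefficients:  Fourier {rms(sig - fft_approx):.4f},"
      f"  Haar wavelet {rms(sig - haar_approx):.4f}")
```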
|
|
If a wavelet minimal description of human visual apparatus sensitivity is made, then what? I do not know. Perhaps exciting exotic things like analog integrated circuits doing pure analog wavelets, physically, on a chip, stacked under LEDs or phosphors, drive analog color. |
|
|
Perhaps wavelet-based image sensor chips produce true analog color, and the sampling rate is whatever the phone/ car/ camera/ computer/ medical scanner/ monitor is able to do. So if you are a video color purist you might spring for an indium phosphide 670 GHz [link] dedicated video computer, where the distance light travels in one 670 GHz clock period is something like 1/300th the length of a human neuron. It's still digital, but it changes faster than your retinal rods and cones can tell the nearest connected neuron what they see. |
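Quick arithmetic on that 670 GHz figure (just the speed of light divided by the clock rate; the neuron comparison is left to the annotation above):

```python
# How far light travels in one clock period at 670 GHz.
c = 2.998e8                      # speed of light, m/s
f = 670e9                        # clock rate, Hz
period_s = 1 / f
distance_mm = c * period_s * 1e3
print(f"one period = {period_s * 1e12:.2f} ps; light travels about {distance_mm:.2f} mm in that time")
```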
|
|
Anyway, does anybody here know enough to do wavelets on humans seeing light? |
|
|
Watching the panoramic vistas of the sun in winter and melting ice tundra in summer will be a whole new experience. |
|
|
Here's a very halfbaked idea, but it might benefit schoolchildren using educational software on monitors, who have varied (nearsighted, farsighted, astigmatic) vision, and people who watch recreational media on living-room monitors (TVs): |
|
|
During PRK (photorefractive keratectomy; laser eye surgery) diagnosis, and live during the laser sculpting process, the actual physical shape of your eye is processed by a computer, likely at micrometer resolution. It is possible to know the exact shape of the cornea and do things with it. |
|
|
Using the same look-at-the-physical-eye-and-compute-contours technology, the TV or computer monitor could compute exactly the right laser beam diameter, beamspread (sort of like a focus pre-shape), and XYZ raster-scan path for a laser directed into the eye so that *the back-of-eye retinal focus is always a perfect dot*. |
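A toy sketch of that compensation step, in one dimension: given a measured slice of corneal surface irregularity, estimate the local surface tilt and the small-angle beam deviation it would cause, then pre-steer the beam by the opposite amount. The bump profile, the thin-prism approximation, and the neglect of the eye's intended overall curvature are all simplifying assumptions, not an optical model of a real eye:

```python
# Toy pre-compensation table: local corneal surface tilt -> small-angle beam
# deviation -> opposite steering angle. Purely illustrative geometry.
import numpy as np

n_cornea = 1.376                        # approximate refractive index of the cornea
x = np.linspace(-3e-3, 3e-3, 601)       # 6 mm slice across the cornea, metres

# Invented irregularity profile (micrometre-scale departures from the ideal shape).
height = 1e-6 * (2.0 * np.sin(x / 1.5e-3) + 0.8 * np.cos(x / 0.4e-3))

slope = np.gradient(height, x)          # local surface tilt, radians (small angles)
deviation = (n_cornea - 1.0) * slope    # thin-prism deviation of a narrow beam
pre_steer = -deviation                  # steer each beam position to cancel it

worst_arcsec = np.degrees(np.max(np.abs(deviation))) * 3600
print(f"worst-case deviation about {worst_arcsec:.0f} arcseconds; "
      f"pre-steer table holds {len(pre_steer)} entries across the slice")
```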
|
|
The laser has been calculated to get through the peculiarities of the cornea to the person's retina as an optimum, just-right, pixel-sized spot. The laser system is capable of projecting more laser-precise, cornea-passing dots than the resolution of the human eye (I'm reminded of Apple's Retina display [link], which may or may not have higher than 20/20 human vision resolution). |
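A rough check on the Retina-display aside: 20/20 vision is usually taken as resolving about 1 arcminute, and the viewing distance then sets the pixel density beyond which extra pixels stop being individually resolvable (the distances below are assumptions for typical devices):

```python
# Pixel density at which a 1-arcminute eye stops resolving individual pixels,
# for a few assumed viewing distances.
import math

acuity_rad = math.radians(1 / 60)                 # 1 arcminute
for distance_cm in (30, 60, 250):                 # handheld, desktop, living-room TV
    pitch_mm = distance_cm * 10 * math.tan(acuity_rad)
    ppi = 25.4 / pitch_mm
    print(f"{distance_cm:>4} cm viewing distance: about {ppi:.0f} ppi is the 1-arcminute limit")
```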
|
|
So, it's perfectly in focus. When you watch entertainment or compute on a monitor you have like 20/5 vision of something. All the microbulges and wiggles on your cornea are completely compensated for. |
|
|
Combine with continuously tunable lasers [link] for true spectral color at perfect cornea-to-retina focus, for an always-in-focus "full spectrum display". |
|
|
What is sometimes called optical bench on a chip [link] technology optically adjusts to compensate for the individual cornea of each of several possible monitor watchers in a room, so everybody, at every angle, gets their perfect-vision, cornea-compensated view of the computer, educational software, or entertainment monitor. |
|