I help design spectrometers. Every once in a while, a new idea for a mini spectrometer floats across my desk. Many of the ideas are based on silicon IC fabrication technologies or micromachining.
The drawback is the same in every case: The sheer amount of light that can pass through the optics of a spectrometer depends on its size. This in turn affects what signal-to-noise ratio can be realized.
These devices are certainly interesting, and performance is relative to requirements, so a novel application could make use of whatever sensitivity is made available by a particular device. But a tiny spectrometer isn't likely to be a drop-in replacement for a big one unless the big one is overkill for its application.
Still, I read these articles with interest, because my "size matters" rule is somewhat ad hoc, and there might be a factor that I'm overlooking.
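For what it's worth, the rule can be made semi-quantitative. Throughput is set by étendue (collecting area × solid angle), and in the shot-noise limit SNR goes as the square root of the collected light. A rough sketch, with made-up but typical slit dimensions and f-number:

```python
import math

def etendue(slit_area_mm2, solid_angle_sr):
    """Optical throughput: G = A * Omega (mm^2 * sr)."""
    return slit_area_mm2 * solid_angle_sr

# Solid angle of an f/4 collection cone: Omega ~ pi / (4 * N^2).
omega = math.pi / (4 * 4**2)

# Bench instrument: 0.1 mm x 3 mm entrance slit.
g_big = etendue(0.1 * 3.0, omega)

# Shrink every linear dimension 10x: slit area falls 100x, while
# the f-number (and hence Omega) stays the same.
g_small = etendue(0.01 * 0.3, omega)

ratio = g_big / g_small           # 100x less light collected
snr_penalty = math.sqrt(ratio)    # shot-noise-limited SNR drops 10x
```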
Maybe you can help me with this... I want to measure the quality of light in my home and office. CRI seems to be the right metric, and it seems that the cheapest option is a $2000 (Sekonic) hand-held device. Are there cheaper options? I don't need much precision, just enough info to understand that these lightbulbs over here are much better than those other lightbulbs over there.
I am puzzled as to why it has to be so expensive. I would think one could use a prism to project the light across the surface of a black-and-white CCD sensor and measure the intensity along its length, giving a reasonably good idea of how much energy lands in each wavelength range. I feel like one of these could be made for $20 in parts.
You can get a plastic DSLR camera body cap for less than 50 cents each (eBay, from China). This cap goes where the lens would attach to the camera, for when you don't have a lens on. Sometimes I cut a hole in the center of a cap and make really hacky camera "lenses" for fun. You could put a prism on the outside of a cap.
You would then need to calibrate your new spectra-cam against something with a known emission spectrum (usually just find a good blackbody radiator). You can't just convert to black and white without calibration, because the RGB sensors on the camera will have their own sensitivities to different wavelengths. This would be an awesome project!
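A minimal sketch of that intensity-calibration step, assuming a tungsten filament running near a known temperature as the "blackbody" (the raw counts below are invented):

```python
import math

# Physical constants (SI).
H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance (Planck's law), W * sr^-1 * m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

# Invented raw camera counts at a few wavelengths, pointed at a
# ~2700 K tungsten filament.
raw_counts = {450e-9: 120.0, 550e-9: 480.0, 650e-9: 900.0}

# Per-wavelength gain: known radiance / measured counts. Multiply
# future raw spectra by these gains to get calibrated intensities.
gain = {wl: planck(wl, 2700.0) / c for wl, c in raw_counts.items()}
```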
--
I write firmware for high-end professional LED lights and have a $3K spectrometer on my desk. Here are some things I've learned:
1. CRI is simply a bad formula for judging overall light quality, and it gives almost meaningless results at lower color temperatures.
2. There are lots of fantastic python modules for working with spectrograph and light data.
3. One awful thing cheap LED lights do is blink slowly to control their voltage. This is at least as bad as poor CRI for aggravating your eyes. (Sometimes you can see this just by taking slow-motion video on your phone.)
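On point 1: part of the trouble is that CRI compares against a reference illuminant at the measured correlated color temperature (CCT). For orientation, CCT itself can be estimated from CIE 1931 chromaticity with McCamy's approximation (the cubic coefficients below are the standard published ones):

```python
def cct_mccamy(x, y):
    """Approximate CCT in kelvin from CIE 1931 (x, y) chromaticity
    using McCamy's cubic formula (rough, but fine near daylight)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 daylight (x = 0.3127, y = 0.3290) should land near 6500 K.
print(round(cct_mccamy(0.3127, 0.3290)))  # ~6505
```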
Light quality has been something I've been interested in for a while so I've invested in making my whole environment use LIFX and Yeelight bulbs.
This has been great and I get bright neutral lighting during the day, warmer light in the evening and variations of colour gradients and themes when friends are over to chill.
However, I have no way of knowing if something is missing from the light quality or how I'd even go about optimising things both for my health/comfort and for indoor plants. Does anyone have recommendations?
(Sorry, I meant to say that I hate cheap LED lights that blink slowly to control their brightness.)
Blinking the LED on/off with PWM to control its brightness is definitely not the only way to control output. And if you are controlling by blinking, PWM is still the bottom of the pile among ways to blink, from a quality-of-light point of view.
PWM can be combined with a capacitor to smooth the voltage out. Or a sufficiently high PWM frequency (tens of kHz) can be used to avoid eye strain.
The former approach might negatively affect the efficiency, though, depending on where the efficiency peak is.
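A back-of-envelope for that trade-off: during the PWM off-time the capacitor alone carries the LED current, so the voltage droops by roughly dV = I * t_off / C. Numbers below are illustrative:

```python
def ripple_v(led_current_a, pwm_hz, duty, cap_f):
    """Approximate droop across the smoothing capacitor while the
    PWM switch is off: dV = I * t_off / C."""
    t_off = (1.0 - duty) / pwm_hz
    return led_current_a * t_off / cap_f

# 700 mA LED at 50% duty cycle:
lo = ripple_v(0.7, 1_000, 0.5, 1000e-6)  # 1 kHz needs 1000 uF -> 0.35 V
hi = ripple_v(0.7, 50_000, 0.5, 47e-6)   # 50 kHz gets by with 47 uF -> ~0.15 V
```

So raising the switching frequency lets a far smaller (cheaper) capacitor hold the same ripple, which is the point made above.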
A constant-current source does not have to be linear; one can use a switch-mode power supply with current regulation. Then the current through the LED flows continuously.
Sure, they pulse/switch, but that is into/from the inductor/capacitor used as intermediate energy storage, not through the load (the LED). At the output a constant voltage is seen (plus some ripple).
A capacitor on a PWM output will give a decent approximation, _if_ it is big enough. For large LEDs this can be very costly (relative to other parts in the system).
Though the inductor in a switching converter is also a big cost driver.
Increasing the switching frequency helps on all three fronts: capacitor size, inductor size, and human perception. So that is what modern designs normally focus on.
A challenging aspect of LEDs is their nonlinear voltage versus current characteristic. A small change in voltage gives a very large change in current, and thus in output power. The characteristic is temperature dependent and has per device variation.
Hence LED drivers are usually constant-current sources, i.e. they measure and attempt to regulate the current.
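A toy illustration of that steep characteristic, using the Shockley diode model with made-up parameter values (not from any datasheet):

```python
import math

def led_current(v, i_s=1e-12, n=2.0, v_t=0.02585):
    """Shockley diode model: I = Is * (exp(V / (n * Vt)) - 1).
    i_s and n are illustrative; v_t is the thermal voltage at ~300 K."""
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

i1 = led_current(3.00)
i2 = led_current(3.06)   # only 2% more voltage...
print(i2 / i1)           # ...roughly 3.2x the current
```

Which is why drivers regulate current rather than voltage.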
A diffraction spectrometer from used CDs is very easy to make, and it costs ~$0.
It has enough resolution to distinguish individual lines of fluorescent lamps (with a narrow aperture). Highly recommended (very nice as an educational tool too, to impress folks with the hidden spectra of the light around us ;) ).
Here are some brief thoughts. It turns out that monochrome CCDs can be a bit pricey because they are sold in lower volumes than color chips. Where you might find one is in the video security market, since they are often used with illumination from infrared LEDs to do covert monitoring.
On the other hand, you could achieve pretty decent "monochrome" behavior by just summing the three color channels of an RGB camera, at least across the visible wavelengths. With that issue settled, you could produce a rainbow spectrum by hook or by crook, such as with a cheap prism from Surplus Shed (if they have one in stock; they are a great source of bargain optics) and some lenses. For intensity calibration, a plain tungsten lamp would suffice for home experimentation; they operate very close to their rated color temperature. For wavelength calibration, one idea is to see if there are useful visible lines in the mercury emission spectrum (I don't remember) emitted by a regular fluorescent lamp. Also, colored LEDs at room temperature run pretty close to their rated peak wavelengths.
Summing RGB sensors doesn't really work the way you would think because each sensor is actually sensitive to a very broad range of wavelengths, some of which affect it more, and some of which affect it less. Unless you know what your red sensor reads with X amount of illumination at 635nm vs 670nm, you are going to get very bumpy results on even a blackbody.
There's still some variation in colored LEDs. (And unless you are paying big bucks, a ton of variation in white LEDs. What most home LED bulb companies do is buy a bunch of off-color white LEDs and sort them per light so that the badness sort of evens out to something close to okay.)
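A toy numerical version of the broad-channel problem: model each color channel as a Gaussian sensitivity curve (purely illustrative; real color-filter responses are broader and lumpier) and note that even a spectrally flat input reads back unevenly when the channels are summed:

```python
import math

def gauss(wl_nm, center_nm, width_nm):
    return math.exp(-((wl_nm - center_nm) / width_nm) ** 2)

def summed_rgb_response(wl_nm):
    """Sum of three made-up channel sensitivities (B, G, R)."""
    return (gauss(wl_nm, 460, 40)
            + gauss(wl_nm, 540, 45)
            + gauss(wl_nm, 610, 45))

# Equal-energy input at several wavelengths -> very unequal readings,
# so the summed signal is not a flat "monochrome" response.
samples = {wl: round(summed_rgb_response(wl), 2)
           for wl in (450, 500, 550, 600, 650)}
```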
Indeed, once the sensors are summed, the intensity scale of the spectrometer still needs to be calibrated using something like a blackbody source.
But even monochrome sensors have lumpy response.
I'm only suggesting using colored LEDs as crude wavelength references and a blackbody as an intensity-versus-wavelength reference, for something that's good enough for home experimentation within reason.
This spectrometer is inexpensive:
https://publiclab.org/wiki/desktop-spectrometry-kit-3-0
The problem is calibration: you would need to calibrate with sources like lasers or fluorescent lights with known spectral peaks. Then you could use the spectrum to calculate CRI. You are just calculating CRI (which has its own issues), so even with a crude device some spectral error is OK.
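A minimal sketch of that wavelength-calibration step: fit a line from pixel position to wavelength through known emission peaks. The pixel positions below are invented; the reference wavelengths are real mercury lines visible in fluorescent lamps.

```python
# (pixel_position, wavelength_nm) for two identified peaks:
# Hg 435.8 nm (blue) and Hg 546.1 nm (green).
known = [(212, 435.8), (391, 546.1)]

(p1, w1), (p2, w2) = known
slope = (w2 - w1) / (p2 - p1)      # nm per pixel (linear dispersion)
intercept = w1 - slope * p1

def pixel_to_nm(p):
    return slope * p + intercept

# Sanity check: the fit reproduces the calibration points.
print(round(pixel_to_nm(391), 1))  # 546.1
```

With more identified lines you can check how linear the dispersion really is and move to a quadratic fit if needed.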
This is cool, a while back when I used to do fruit sorting machines, we used a spectrometer for working out the sugar content of fruit.
Nothing quite like setting up the machine to siphon off the biggest sweetest peach you can find at one of the largest peach farms in the world. best. peach. ever. :)
If you want an inexpensive spectrometer, Public Lab has instructions on how to build one using an old DVD as a diffraction grating: https://publiclab.org/wiki/spectrometry
It depends on what you want this spectrometer to do. The Public Lab spectrometer is interesting and cheap, but its performance is very, very limited.
If you need something better, you can search eBay for used (scanning) monochromators. The next step up is to check the Ocean Optics or Hamamatsu catalogs: they have good lines of products, but those are in the $1000+ range.
Here we demonstrate a transformative on-chip digital Fourier transform spectrometer that acquires high-resolution spectra via time-domain modulation of a reconfigurable Mach-Zehnder interferometer. The device, fabricated and packaged using industry-standard silicon photonics technology, claims the multiplex advantage to dramatically boost the signal-to-noise ratio and unprecedented scalability capable of addressing exponentially increasing numbers of spectral channels.
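The "multiplex advantage" refers to measuring all wavelengths at once as an interferogram and Fourier-transforming it into a spectrum. A toy illustration with a two-line source (arbitrary units, and a plain O(N^2) DFT for clarity):

```python
import math

N = 256
f1, f2 = 20, 45   # two "spectral lines" in DFT-bin units

# Interferogram: intensity vs. path delay is a sum of cosines,
# one per spectral line, weighted by line strength.
interferogram = [math.cos(2 * math.pi * f1 * t / N)
                 + 0.5 * math.cos(2 * math.pi * f2 * t / N)
                 for t in range(N)]

def dft_magnitude(x):
    """Magnitude spectrum of a real signal (first half of the bins)."""
    n = len(x)
    mags = []
    for k in range(n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

spectrum = dft_magnitude(interferogram)
peaks = sorted(sorted(range(len(spectrum)), key=spectrum.__getitem__)[-2:])
print(peaks)   # [20, 45] -- the two lines are recovered
```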
FTIR devices seem to cost around $50,000 right now; would this be something that would lower that cost?
FTIRs are definitely in demand with harm reduction groups both in cities and for drug testing at music festivals. If we could get these at lower costs it would honestly save lives.
A very simple refractive IR spectrometer is possible for under $1000 if you can live with a 2–10 µm range. I don't have a full write-up yet, but the basic design is an MLX90614 thermopile scanning infrared light generated by nichrome wire, focused with a zinc selenide CO2-laser lens (the ones without an AR coating are better, but harder to find), and separated by a silicon prism, which is essentially just a chunk of a broken silicon boule. It's very difficult to find transparent materials in this wavelength range, especially when you're limited to cheap goods mass-produced for some other application. Signal-to-noise and speed will be much worse than a modern FTIR, but it's usable when calibrated.
It might work for qualitative measurement (what is in the solution you are measuring), but this kind of instrument will not be performant enough for quantitative analysis. Once you want to measure amounts, an FTIR is probably the only way to go, especially with more complex solutions.
There is a company, Neospectra [1], that produces FTIR-on-chip modules. Performance is not that great compared to a research-grade FTIR, but the price is a lot less.
The peaks of each chemical are unique, so you can scan a database of known chemicals and compare.
You can even measure the amounts of various chemicals in solution, but that requires complex calibration. In this case you are basically building an ML system that tries to fit various peak heights/areas to the amount of each compound.
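A stripped-down version of that fitting idea, assuming the mixture spectrum is a linear combination of known pure-compound spectra (Beer–Lambert), with two components solved via the 2x2 normal equations. All spectra here are made up:

```python
# Reference spectra of two pure compounds (4 made-up "peak" channels).
a = [1.0, 0.5, 0.0, 0.2]
b = [0.0, 0.3, 1.0, 0.4]

# A mixture containing 2.0 units of A and 0.5 units of B.
mix = [2.0 * x + 0.5 * y for x, y in zip(a, b)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Least-squares fit of mix = ca*a + cb*b via the normal equations.
aa, bb, ab = dot(a, a), dot(b, b), dot(a, b)
am, bm = dot(a, mix), dot(b, mix)
det = aa * bb - ab * ab
ca = (am * bb - ab * bm) / det   # recovered amount of A
cb = (aa * bm - ab * am) / det   # recovered amount of B
print(round(ca, 3), round(cb, 3))   # 2.0 0.5
```

Real data adds baseline drift, overlapping peaks, and nonlinearity, which is where the "complex calibration" comes in.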