On Mon, Nov 21, 2011 at 1:47 PM, Simon Quellen Field <sfield@scitoys.com> wrote:
> The focus problem is due to the use of high dispersion gratings.
> They spread the spectrum over a wide angle, so the middle of the
> spectrum focuses farther back than the two ends do.
> There are several ways to get around this.
> First, you could simply eliminate focusing altogether, by having the
> optics focus at infinity. The slit then looks like a point source at
> infinity, and is always in focus.
> Second, you could use a low dispersion grating, and simply back away
> until the spectrum fills the imager. The angle between red and violet is
> now so small that everything is in focus. If you don't want a meter-long
> device, fold the optics with mirrors.
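To put numbers on the focus problem, the angular spread Simon describes falls straight out of the grating equation; a quick sketch (the groove densities are just illustrative examples, not recommendations for this build):

```python
import math

def diffraction_angle_deg(wavelength_nm, lines_per_mm, order=1):
    """First-order diffraction angle at normal incidence: sin(theta) = m * lambda / d."""
    d_nm = 1e6 / lines_per_mm  # groove spacing in nm
    s = order * wavelength_nm / d_nm
    if abs(s) > 1:
        raise ValueError("this order does not propagate for that grating")
    return math.degrees(math.asin(s))

def spectral_spread_deg(lines_per_mm):
    """Angle between violet (400 nm) and red (700 nm) in first order."""
    return diffraction_angle_deg(700, lines_per_mm) - diffraction_angle_deg(400, lines_per_mm)

# High-dispersion grating (1200 l/mm) vs. a cheap low-dispersion one (300 l/mm):
print(spectral_spread_deg(1200))  # tens of degrees: the ends focus far from the middle
print(spectral_spread_deg(300))   # a few degrees: the focal field is nearly flat
```

The ~28 degrees of spread from the 1200 l/mm grating is why the middle and ends of the spectrum can't share one focal plane, while the ~5 degrees from the 300 l/mm grating is what makes the "back away until it fills the imager" trick work.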
Well, the point of using this all-in-one grating that I keep tending
toward is that I want this to be small and rugged.
> Third, you can convert a high dispersion grating to a low dispersion one
> using a lens.
> A cute fourth trick is probably not DIY, but a lens can be made that focuses
> green light closer than red or violet.
I think this is really the point of the aberration-corrected gratings:
they keep things small, but require serious engineering and the right
math to produce the correct shapes and holograms.
> Likewise, a lens or mirror can be made that focuses the center closer
> than the edges (this is basically what the curved grating is trying to
> do). But the grating itself does not need to be the active optical
> element -- if you have a lens do it, you can use a cheap grating.
> It is easy to have software accommodate various tilts and zooms. The
> software I wrote for the spectrograph I did for MAKE magazine does
> that. But it cannot fix focus problems without taking two or three
> shots at different foci and combining them in something like CombineZP.
> Making the spectrograph self-calibrating using a fluorescent light
> saves all kinds of time and hassle, and allows for a lot of slop in
> assembly. It also gives you a good feeling that your data is actually
> accurate between runs.
Agreed, the calibration is a solved problem.
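For anyone following along, the fluorescent-lamp trick boils down to locating a few known mercury emission peaks on the sensor and fitting a pixel-to-wavelength map; a minimal sketch (the detected peak pixel positions are made up for illustration):

```python
import numpy as np

# Strong mercury lines visible in a common fluorescent lamp (nm)
HG_LINES_NM = np.array([404.7, 435.8, 546.1])

def calibrate(peak_pixels, known_nm=HG_LINES_NM):
    """Fit a linear pixel -> wavelength map through the known lamp lines."""
    slope, intercept = np.polyfit(peak_pixels, known_nm, 1)
    return lambda px: slope * np.asarray(px) + intercept

# Hypothetical peak positions detected in one frame of the lamp spectrum:
px_to_nm = calibrate(np.array([120.0, 228.0, 610.0]))
```

With more lines a quadratic fit can absorb some geometric distortion too; either way, the tilt and zoom slop from assembly drops out of the fit, which is exactly why calibration "allows for a lot of slop."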
> I would not worry about UV absorption. When you calibrate in the
> amplitude domain, any absorption is accounted for. And it is
> completely silly to worry about absorption in thin plastic films. If
> UV filtering were that easy, sunscreen SPFs would be in the billions.
But we also don't worry about covering our bodies from the tiny amount
of UV that comes through a slit. Low-light conditions can really start
to affect things, especially in the UV, because the sensors (CCD or
CMOS) already aren't very good down there. From what I've been seeing
in papers and spec sheets, they generally get coated with a UV-excited
fluorophore that re-emits around 400nm... and that conversion is lossy.
> As with all optics, you will have to consider empty magnification. It
> won't matter how many pixels you have if your optics can only resolve
> the spectrum to five or ten nanometers.
> With that in mind, consider very narrow slits, and compensate for the
> dimming by integrating over time. You can also do HDR by taking
> several images and combining them in software. If you jiggle the
> detector or the optics between shots, you can also compensate for
> pixel sensitivity variations in the sensor. The software will do
> correlation to match up the various images, and any pixel variation
> will average out.
With longer integration times I feel that dark current would start to
blur the image to the point where it couldn't be ignored. Maybe not;
I'll ask my girlfriend (who is working on CCD noise for her thesis)
what types of noise long integration times cause, and what the limits
of software correction are.
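A minimal sketch of the exposure-bracketing merge described above (the frame values and exposure times are synthetic; a real pipeline would also do dark subtraction and the correlation-based alignment between jiggled shots):

```python
import numpy as np

def hdr_merge(frames, exposures, saturation=0.95):
    """Merge bracketed exposures of the same spectrum into one linear signal.
    Each frame is divided by its exposure time to get a count rate; pixels at
    or above the saturation level are masked out, and the surviving rate
    estimates are averaged per pixel."""
    frames = np.asarray(frames, dtype=float)
    rates = frames / np.asarray(exposures, dtype=float)[:, None]
    ok = frames < saturation                      # unsaturated pixels only
    weight = ok.sum(axis=0)
    return np.where(weight > 0,
                    (rates * ok).sum(axis=0) / np.maximum(weight, 1),
                    0.0)

# Two synthetic exposures of the same 4-pixel "spectrum" (values 0..1):
short = np.array([0.10, 0.40, 0.90, 0.05])
long_ = np.array([0.40, 0.99, 0.99, 0.20])  # bright pixels saturate here
signal = hdr_merge([short, long_], exposures=[1.0, 4.0])
```

The long exposure recovers the dim pixels that a narrow slit starves of light, while the short one supplies the bright pixels the long exposure clips.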
> Newport is a great company, and having local access to their experts
> is a big win. If you are worried about quality control with any
> company, Chinese or not, just make sure your contract allows returns
> of items that don't meet your specifications, for cash or full credit.
> Quality control is your job; you can't outsource that to your vendors.
Hmm, good point.
> Find out what your needs really are, and don't engineer a device to
> be more expensive than it needs to be. Astrophysics might need
> extremely high resolutions, but most biological applications might be
> satisfied with nanometer or even five-nanometer resolution in the
> frequency domain, and care more about dynamic range and
> reproducibility in the amplitude domain. The time stability or
> calibration accuracy of your light source might be much more
> important than resolving Doppler-shifted spectral lines from one side
> of a star to the other to determine rotation velocities (the
> astrophysics case for high resolution).
Well, my thought here is that the over-engineered grating also
simplifies the optics to a single component... so there's less
physical stuff to get banged around if some ecologist wanted to take
samples in the field. I'm also concerned with the ease of assembling
these devices so that they're at least within limits of calibrating to
each other. Example: we got the CNC laser cutter here unaligned, and
it took us three or four nights (2-3 hours each) to get it aligned.
Granted, that beam is IR and invisible, but it only had two mirrors to
adjust, along with the laser itself. A spectrometer might be aligned
with visible light, but it's going to be dim (as I've seen testing
networking optical fiber and TOSLINK cables with 500W halogen lamps
and red and green laser pointers).
It might not seem like it, because I may seem to jump around with my
thoughts, but I'm really all for KISS (Keep It Simple, Stupid).
> Amplitude calibration will probably be done by comparing an empty
> cuvette to the one with the sample. If the light source changes in
> that time, you have a problem. A design that uses a tall slit, with
> the sample blocking only half of it, allows you to do the compare in
> one shot. But then you would need a two-dimensional sensor like a
> camera, or two linear ones.
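The blank-versus-sample compare above is just per-wavelength Beer-Lambert absorbance; a minimal sketch (the counts are made-up values, and a real instrument would subtract a measured dark frame rather than a constant):

```python
import numpy as np

def absorbance(sample_counts, blank_counts, dark_counts=0.0):
    """A = -log10(I_sample / I_blank), per pixel/wavelength,
    after subtracting the detector's dark signal from both."""
    i_s = np.asarray(sample_counts, dtype=float) - dark_counts
    i_b = np.asarray(blank_counts, dtype=float) - dark_counts
    # clip to keep the log finite if a pixel reads at or below dark
    return -np.log10(np.clip(i_s, 1e-12, None) / np.clip(i_b, 1e-12, None))

# A sample that passes 10% of the blank's light reads A = 1.0:
a = absorbance([100.0], [1000.0])
```

This is also why light-source drift between the two shots is fatal: it lands directly in A as a wavelength-dependent error. The tall-slit design measures sample and blank in the same frame, so the drift cancels.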
> For protocols that specify growing a culture until it blocks n% of
> light at some wavelength, you can go really sloppy. The bugs will
> keep growing after you measure, and ten percent accuracy is probably
> overkill. A light meter and a few different colored LEDs will
> probably do just fine, once you have set up some conversion tables
> between a lab spectrometer and the DIY LED gadget.
Sure, but you want tighter measurements for something like DNA
quantification... so again, it's application-specific.
On that note, I've been looking at the high end of the bioscience
world, but haven't really asked about the educational/amateur
non-science world. What performance would realistically be useful to
people, other than showing them the difference in color temperature?
Are we no longer talking chemical fingerprinting at this point because
of smearing, or just less accuracy overall (soda vs. milk works, but
milk vs. milk+melamine doesn't... or vice versa)?
--
Nathan McCorkle
Rochester Institute of Technology
College of Science, Biotechnology/Bioinformatics
--
You received this message because you are subscribed to the Google Groups "DIYbio" group.
To post to this group, send email to diybio@googlegroups.com.
To unsubscribe from this group, send email to diybio+unsubscribe@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/diybio?hl=en.