
I am sourcing a new spectrometer for my optics lab and am wondering how to choose the slit size. The vendor offers 10um, 25um, and 50um slits, with resolution decreasing as the slit gets wider. I want to go with the 10um slit for the highest resolution, but I am worried about not getting enough light throughput.

My current setup characterizes 10mW lasers, with the light path going laser diode -> 4-in integrating sphere -> fiber patch cable -> spectrometer.

I would just use the same slit size as our existing spectrometer, but it was bought a long time ago and the slit size that was ordered isn't documented anywhere.

I couldn't find a spec for the minimum light the spectrometer expects, or for how much light is lost as the slit gets smaller. I could just crank the integration time way up, but I don't want the measurement to be too slow. Any readings, guidelines, and/or advice appreciated!
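
For what it's worth, here is the rough back-of-the-envelope estimate I've been working from (all the numbers below are placeholders I made up, not from any datasheet; I'm assuming the fiber core image overfills even the 50um slit, so throughput scales roughly linearly with slit width, and that integration time has to scale inversely with throughput to reach the same signal level):

```python
# Rough back-of-the-envelope: how throughput and integration time might scale with slit width.
# Assumptions (mine, not from the manufacturer):
#  - the fiber core image at the slit is wider than even the 50 um slit, so the slit clips
#    the light and throughput scales roughly linearly with slit width
#  - everything else (fiber NA, grating efficiency, detector QE) stays the same
#  - integration time scales inversely with throughput to reach the same signal level

slit_widths_um = [10, 25, 50]
reference_um = 50            # treat the widest slit as the baseline
base_integration_ms = 10     # hypothetical integration time that works at 50 um

for w in slit_widths_um:
    rel_throughput = w / reference_um          # ~linear with slit width under the assumptions above
    est_integration_ms = base_integration_ms / rel_throughput
    print(f"{w:>3} um slit: ~{rel_throughput:.0%} of the light, "
          f"~{est_integration_ms:.0f} ms integration to match the 50 um signal")
```

Is that kind of linear scaling a reasonable rule of thumb for a fiber-fed slit, or does the geometry make it worse than that?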

  • Most companies selling instruments have sales engineers who can help you with selecting their products. Have you tried contacting the manufacturer and specifically asking for a sales engineer? I've had good success with this in the past. – Eric S Feb 08 '24 at 17:14
  • They just gave me a bunch of slit size options, so I thought it might be somewhat common knowledge or that a reference table might be available. I'll check with the manufacturer though. Thanks! – ImpressionableEE Feb 08 '24 at 17:33

0 Answers