I am sourcing a new spectrometer for my optics lab and I am wondering how to choose the slit size. They offer 10um, 25um, and 50um slits, with resolution decreasing as the slit gets larger. I want to go with the 10um slit for the highest resolution, but I am worried about not getting enough light throughput.
My current setup is characterizing 10mW lasers with the light path going from laser diode -> 4-in integrating sphere -> fiber patch cable -> spectrometer.
I would just match the slit size of our existing spectrometer, but it was ordered a long time ago and the slit size that was chosen isn't documented anywhere.
I couldn't find a spec for the minimum light the spectrometer expects, or for how much light is lost as the slit gets narrower. I could just crank the integration time way up, but I don't want the measurement to get too slow. Any readings, guidelines, and/or advice appreciated!
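For what it's worth, here is the kind of back-of-envelope I've been doing to think about the tradeoff (Python sketch). It assumes throughput scales roughly linearly with slit width, which only holds if the fiber output overfills the slit, and the 20 ms baseline integration time is just a made-up reference number, not a spec:

```python
# Rough slit-width vs. integration-time estimate.
# Assumption: the fiber image overfills the slit, so collected light
# scales ~linearly with slit width (a simplification, not a datasheet value).

slit_widths_um = [10, 25, 50]        # slit options offered by the vendor
baseline_um = 50                      # treat the widest slit as the reference
baseline_integration_ms = 20          # hypothetical integration time at 50 um

for w in slit_widths_um:
    relative_throughput = w / baseline_um
    # To collect the same number of counts, integration time scales
    # inversely with throughput (ignoring dark and readout noise).
    integration_ms = baseline_integration_ms / relative_throughput
    print(f"{w:>3} um slit: ~{relative_throughput:.0%} throughput, "
          f"~{integration_ms:.0f} ms to match counts")
```

By that logic the 10um slit would cost me roughly 5x the integration time compared to the 50um slit, but I don't know if the linear assumption or the baseline number are anywhere close to reality for this setup, which is really what I'm asking about.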