Introduction To Lens Design With Practical Zemax Examples Pdf Download

Tilo Chopin
Jun 10, 2024

Lens design used to be a skill reserved for a very few professionals. They used company proprietary optical design and analysis software which was resident on large and expensive mainframe computers. Today, with reasonably priced commercially-available optical design software and powerful personal (and portable) computers, lens design tools are accessible to the general optical engineering community. Optical design is therefore a strong component of a well-rounded education in optics, and a skill valued by industries employing optical engineers.

Design principles of lens and mirror optical systems; evaluation of designs using computer techniques. The lectures include an introduction to optical system design, methods of lens design, optimization, paraxial layout, achromatization methods, Petzval curvature, 3rd- and higher-order optical aberrations, and image quality metrics.




The design principles covered in the lectures include: lens bending, stop shifts, element splitting, color correction, aberration balancing, field flattening, aspherics, and the proper use and construction of the merit function.

This series of blogs is based on many years of learning and using Zemax. I want to share them here in the hope that they help readers learn ZPL more quickly and easily. Some of the examples and plots are based on older versions of Zemax, and some on more recent versions; however, the main ideas remain the same.

In Sequential Ray Tracing mode, Zemax defines an optical system as being made up of various surfaces, and assumes that a light ray starts from the object surface, goes through the various surfaces of the system in a pre-defined order, and finally reaches the image surface. Figure 1.1-1 shows an example of Sequential Ray Tracing:

As we can see, Ray 1 starts from the Object surface, goes through Surface 1 and Surface 2, and then arrives at the Image surface. Ray 2 follows a different path, but the sequence of surfaces it passes through is exactly the same as for Ray 1. In other words, every ray traverses the optical surfaces in the same order, so the behavior of each ray in the optical system is predictable. By tracing the path of each ray, Zemax determines the performance of the whole system.
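Because every ray visits the surfaces in the same fixed order, a sequential trace is just a loop over the surface list. The following is a minimal sketch using a paraxial (y-u) ray model, not Zemax's actual engine; the surface powers and spacings are hypothetical example values.

```python
def trace_paraxial(y, u, surfaces):
    """Trace a paraxial ray (height y, angle u) through surfaces in a fixed order.

    Each surface is (power, thickness_after): refraction u' = u - y*power,
    then transfer y' = y + thickness*u'. Refractive indices are folded into
    the reduced angle for simplicity.
    """
    for power, thickness in surfaces:
        u = u - y * power          # paraxial refraction at the surface
        y = y + thickness * u      # transfer to the next surface
    return y, u

# A thin lens of focal length 100 mm (power = 1/100 mm^-1) followed by
# 100 mm of air: an axis-parallel ray at height 10 mm crosses the axis
# at the focal plane, as expected.
y, u = trace_paraxial(y=10.0, u=0.0, surfaces=[(0.01, 100.0)])
print(y, u)   # y is ~0 at the focal plane
```

The key point mirrored here is that the surface order is baked into the data structure, so the same loop applies to every ray.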

In Non-Sequential Ray Tracing mode, Zemax defines the optical system as being made up of many components (or solid modules), and each component is called an object. For example, a lens is an object with not only two optical surfaces, but also an edge that might scatter or absorb light, and even flattened outer faces for mounting. Other common objects supported in Non-Sequential Ray Tracing include prisms, light pipes, lens arrays, light sources, detectors, TIR reflectors, partially transmissive and partially reflective components, etc.

As can be seen, Ray 1 starts from the Source, passes through Surface 1 and Surface 3, and then reaches the Detector. Ray 2 also starts from the Source; however, it is reflected by Surface 1 and never reaches the Detector. Ray 3 starts from the Source, passes through Surface 1, is reflected by Surface 4, passes through Surface 3, and finally reaches the Detector. This shows that in a non-sequential system, different rays may follow different paths and interact with some or all of the surfaces, in different orders. Therefore, Zemax must trace each ray individually to determine its optical path and obtain the overall performance of the system.

Besides the full sequential ray tracing mode and the full non-sequential ray tracing mode, Zemax also provides a mixed ray tracing mode (NSC with port). In this mode, part of the optical system is treated as a sequential system, and part is treated as a non-sequential system.

The user interface of Zemax is made up of different types of windows, each of which serves a different purpose. When Zemax starts, a default window called the Main window appears, as shown in figure 1.1-3.

In general, Zemax has very powerful optical design capabilities. It can accurately calculate light paths, refraction and reflection, phase and optical path difference, optical images and distortion, polarization, transmission and absorption in thin-film coatings, scattering, etc. However, despite its power, Zemax cannot teach you the basic principles of optical design. A good optical designer should use Zemax as an effective tool to aid his or her design, but should not rely on the tool alone. When needed, the designer should know how to extend the tool to make it more capable.

This course is a comprehensive introduction to the optical design of lenses and imaging systems. This course begins with a review of basic optics, including geometrical optics and Fourier optics. A discussion of how different system specifications influence the choice of design form, achievable performance, and cost will be presented. Aberration theory, stop shift theory, and induced aberrations are examined in detail. Factors that affect aberrations and the principles of aberration correction are discussed. Demonstrations of computer-aided lens design are given accompanied by a discussion of optimization theory, variables and constraints, and local vs. global optimization. Techniques for improving an optical design are illustrated with easy-to-understand Zemax examples.

When optical designers attempt to compare the performance of optical systems, a commonly used measure is the modulation transfer function (MTF). MTF is used for components as simple as a spherical singlet lens to those as complex as a multi-element telecentric imaging lens assembly. In order to understand the significance of MTF, consider some general principles and practical examples for defining MTF including its components, importance, and characterization.

Resolution is an imaging system's ability to distinguish object detail. It is often expressed in terms of line-pairs per millimeter (where a line-pair is a sequence of one black line and one white line). This measure of line-pairs per millimeter $\small\left(\tfrac{\text{lp}}{\text{mm}}\right)$ is also known as frequency. The inverse of the frequency yields the spacing in millimeters between two resolved lines. Bar targets with a series of equally spaced, alternating white and black bars (i.e. a 1951 USAF target or a Ronchi ruling) are ideal for testing system performance. For a more detailed explanation of test targets, view Choosing the Correct Test Target. For all imaging optics, when imaging such a pattern, perfect line edges become blurred to a degree (Figure 1). High-resolution images are those which exhibit a large amount of detail as a result of minimal blurring. Conversely, low-resolution images lack fine detail.
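The frequency/spacing relation above is a one-line computation; the sketch below just makes it explicit, with 50 lp/mm as a hypothetical example frequency.

```python
def line_pair_spacing_mm(frequency_lp_per_mm):
    """Spacing in mm of one line-pair: the inverse of the frequency in lp/mm."""
    return 1.0 / frequency_lp_per_mm

# At 50 lp/mm, one black/white line-pair spans 1/50 = 0.02 mm,
# i.e. each individual line is 0.01 mm wide.
print(line_pair_spacing_mm(50.0))
```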

A practical way of understanding line-pairs is to think of them as pixels on a camera sensor, where a single line-pair corresponds to two pixels (Figure 2). Two camera sensor pixels are needed for each line-pair of resolution: one pixel is dedicated to the red line and the other to the blank space between lines. Using this analogy, the limiting resolution of the camera can be specified as a line-pair spacing equal to twice its pixel size.

Consider normalizing the intensity of a bar target by assigning a maximum value to the white bars and zero value to the black bars. Plotting these values results in a square wave, from which the notion of contrast can be more easily seen (Figure 3). Mathematically, contrast is calculated with Equation 3:

$$ \%\,\text{Contrast} = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}} \times 100\% $$
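The standard contrast (modulation) definition referred to above can be sketched directly; the intensity values below are hypothetical examples.

```python
def contrast(i_max, i_min):
    """Modulation contrast: (I_max - I_min) / (I_max + I_min)."""
    return (i_max - i_min) / (i_max + i_min)

# A perfect normalized bar target (white = 1, black = 0) has 100% contrast;
# after imaging, blurring raises the minimum and lowers the maximum.
print(contrast(1.0, 0.0))    # perfect object
print(contrast(0.9, 0.1))    # blurred image: ~80% contrast
```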

When this same principle is applied to the imaging example in Figure 1, the intensity pattern before and after imaging can be seen (Figure 4). Contrast or modulation can then be defined as how faithfully the minimum and maximum intensity values are transferred from object plane to image plane.

To understand the relation between contrast and image quality, consider an imaging lens with the same resolution as the one in Figure 1 and Figure 4, but used to image an object with a greater line-pair frequency. Figure 5 illustrates that as the spatial frequency of the lines increases, the contrast of the image decreases. This effect is always present when working with imaging lenses of the same resolution. For the image to appear defined, black must be truly black and white truly white, with a minimal amount of grayscale between.

In imaging applications, the imaging lens, camera sensor, and illumination play key roles in determining the resulting image contrast. The lens contrast is typically defined in terms of the percentage of the object contrast that is reproduced. The sensor's ability to reproduce contrast is usually specified in terms of decibels (dB) in analog cameras and bits in digital cameras.

Now that the components of the modulation transfer function (MTF), resolution and contrast/modulation, are defined, consider MTF itself. The MTF of a lens, as the name implies, is a measurement of its ability to transfer contrast at a particular resolution from the object to the image. In other words, MTF is a way to incorporate resolution and contrast into a single specification. As line spacing decreases (i.e. the frequency increases) on the test target, it becomes increasingly difficult for the lens to transfer contrast efficiently; as a result, the MTF decreases (Figure 6).

Figure 6 plots the MTF of an aberration-free image with a rectangular pupil. As can be expected, the MTF decreases as the spatial resolution increases. It is important to note that these cases are idealized and that no actual system is completely aberration-free.

In traditional system integration (and in less critical applications), the system's performance is roughly estimated using the principle of the weakest link, which proposes that a system's resolution is limited solely by the component with the lowest resolution. Although this approach is useful for quick estimations, it is flawed: every component within the system contributes error to the image, yielding poorer image quality than the weakest link alone would suggest.
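The point above follows from the standard rule that the MTF of a chain of components is the product of the individual MTFs, so the cascade is always worse than its weakest member. A sketch with hypothetical component values at one spatial frequency:

```python
# Contrast transfer of each component at the same spatial frequency
# (hypothetical example values).
lens_mtf = 0.7
sensor_mtf = 0.6

weakest_link = min(lens_mtf, sensor_mtf)   # naive "weakest link" estimate
system_mtf = lens_mtf * sensor_mtf         # actual cascaded system MTF

# The product (0.42) is lower than the weakest component (0.6),
# illustrating why the weakest-link estimate is optimistic.
print(weakest_link, system_mtf)
```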
