Understanding Field of View (FOV) and Angular Field of View (AFOV) in Camera Lenses

Key Takeaways

The Field of View (FOV) of a camera lens, determined by the focal length and the sensor size, defines how much of a scene is captured. The Angular Field of View (AFOV) is expressed in degrees, while the FOV can also be stated as a length; both are commonly determined through optical tests. Shorter focal lengths converge light more strongly, which widens the angular FOV.

Exploring the Field of View (FOV) in Camera Lenses

The Field of View (FOV) is the extent of the observable area a camera lens captures in a single shot, the same notion as what one can see through the eyes or any optical device. In photography, the FOV determines what we are seeing in our image and how much of the scene we are seeing. As light passes through the camera lens, the lens converges it to bring the subject into focus; shorter focal lengths converge the light more strongly.

Focal Lengths and Field of View (FOV) Dynamics

How strongly the lens must converge light to focus the subject determines the focal length. The focal length, in turn, sets the angle over which the lens accepts light, referred to as the angular field of view (AFOV), which is needed to determine the overall FOV. The FOV can be expressed either angularly, as the full angle in degrees, or as a size, a length in millimeters or meters.

Influences on Field of View (FOV)

The lens focal length and the sensor size both influence the FOV: with the focal length fixed, a larger sensor yields a wider FOV. Because sensors are rectangular, the FOV is typically measured horizontally and is usually expressed in millimeters. Optical tests are commonly used to determine the FOV of UV, visible, and infrared cameras.
These tests involve focusing light from a black body (an object that absorbs all light that falls on it) onto a test target at the focal plane. During the test, a set of mirrors creates a virtual image at an effectively infinite distance.

Camera FOV (or Camera Coverage)

Field of View Formula

Note: f is the lens focal length.

Camera FOV vs. Lens FOV

The image below shows the difference between the camera FOV and the lens FOV. Note that the maximum image (circle) diameter of the lens should be equal to or larger than the sensor diagonal size.
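The field-of-view formula referenced above did not survive extraction, so here is a minimal sketch of the standard thin-lens geometric relations (a common textbook form, not reproduced from this article): the full angular FOV is 2·arctan(h / 2f) for sensor dimension h and focal length f, and the linear FOV at a given working distance follows from that angle. The function names and the working-distance form are illustrative assumptions.

```python
import math

def angular_fov_deg(sensor_size_mm: float, focal_length_mm: float) -> float:
    """Full angular field of view (degrees) for one sensor dimension.
    Standard thin-lens relation: AFOV = 2 * arctan(h / (2 f))."""
    return 2 * math.degrees(math.atan(sensor_size_mm / (2 * focal_length_mm)))

def linear_fov_mm(sensor_size_mm: float, focal_length_mm: float,
                  working_distance_mm: float) -> float:
    """Linear field of view at the working distance (thin-lens approximation)."""
    afov_rad = 2 * math.atan(sensor_size_mm / (2 * focal_length_mm))
    return 2 * working_distance_mm * math.tan(afov_rad / 2)

# Example: 36 mm sensor width with a 50 mm lens gives a horizontal AFOV
# of about 39.6 degrees.
print(round(angular_fov_deg(36, 50), 1))
```

This makes the trade-off in the text concrete: holding f fixed and enlarging the sensor dimension h widens the AFOV, while lengthening f narrows it.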

Optical Filters for AR/MR/VR – Part 5

Key Takeaways

Avantier specializes in advanced optical filters for AR/VR/MR applications, offering anti-reflective coatings to reduce glare, color filters for vibrant visuals, polarizing filters to enhance contrast, and neutral density filters. Its high-quality neutral density filters optimize light entry and help prevent eye strain.

Optical Filters for AR/VR/MR

Optical filters are used in AR/VR/MR (Augmented Reality, Virtual Reality, and Mixed Reality) to enhance the visual experience and make it more realistic. Here are a few examples of how filters are used in these fields.

Anti-reflective coatings

AR/VR/MR devices often have multiple lenses and screens, which can cause reflections and glare that distract from the virtual experience. Anti-reflective coatings on these surfaces help reduce the reflections and improve the clarity of the images. The coating techniques Avantier uses fall mainly into two categories: physical vapor deposition (PVD) and chemical vapor deposition (CVD). PVD is a family of techniques that deposit a thin film of material onto a substrate by physical means, such as thermal evaporation and electron beam evaporation. CVD deposits a thin film of material onto a substrate by chemical means, such as plasma-enhanced CVD and low-pressure CVD.

Color Optical Filters

Color filters can be used in AR/VR/MR to enhance the colors of the virtual environment and make them more vivid; for example, a red filter can be used to intensify red in a virtual scene. Avantier colored glass filters are high-quality absorption filters made of colored glass: they allow certain wavelengths of light to pass unimpeded while blocking other wavelength ranges to a designated extent. Rather than using thin-film coatings to achieve the filtering effect, these filters rely on the absorption and transmission properties of the colored glass.
Precision can be achieved through careful control of the thickness of the material as well as the concentration of colorant used. Colored glass filters are often categorized as longpass, shortpass, or bandpass.

Polarizing Optical Filters

Polarizing filters can be used to reduce glare and improve contrast in the virtual environment, which is particularly useful in outdoor settings or bright environments where glare is a problem. At Avantier, we also specialize in polarizing coatings, which can be formed from a very thin film of a birefringent material or, alternately, by means of interference effects in a multi-layer dielectric coating. If desired, polarizers can be designed to work at a 45-degree angle of incidence, so that the reflected beam leaves at a 90-degree angle to the incident beam. Under certain circumstances, a polarizing coating on a lens or optical window can replace polarizing prisms in an optical assembly.

Neutral Density Optical Filters

Neutral density filters can be used to reduce the amount of light entering the AR/VR/MR device, which helps prevent eye strain and improves the overall comfort of the user. At Avantier, we produce high-quality neutral density filters for visible light as well as for ultraviolet and infrared applications. Our neutral density filter kit provides a set of filters with varying optical densities that can be used separately or in stacked configurations. Stepped optical filters, also known as stepped neutral density filters, are another option where imaging over a wide range of light transmission is required; they are designed to provide a discrete range of optical densities on a single filter.

These are just a few examples of how filters are used in AR/VR/MR. There are many other types of filters and applications in these fields, depending on the specific device and the requirements of the user. Please contact us if you'd like to schedule a consultation or request a quote on your next project.
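The stacked ND-filter configurations mentioned above follow a standard relationship (general optics convention, not stated in this article): a filter of optical density OD transmits 10^(-OD) of the incident light, and optical densities simply add when filters are stacked. A minimal sketch:

```python
def transmission(od: float) -> float:
    """Fractional transmittance of a neutral density filter of optical density od."""
    return 10 ** (-od)

def stacked_od(*ods: float) -> float:
    """Optical densities add when ND filters are stacked in series."""
    return sum(ods)

# An OD 0.3 filter passes about 50% of the light; stacking OD 0.3 + OD 0.6
# behaves like a single OD 0.9 filter (about 12.6% transmission).
print(round(transmission(0.3), 3))                    # 0.501
print(round(transmission(stacked_od(0.3, 0.6)), 3))   # 0.126
```

This additivity is why a small kit of filters with varying densities can cover a wide, finely stepped range of light levels.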

Innovative AR/MR/VR Filter Applications – Part 3

Holographic filters in AR/MR/VR enhance realism by manipulating light, addressing optical challenges, and improving image quality. Notable applications include expanding the field of view and reducing device components, exemplified by Microsoft HoloLens 2. This article explores unique applications and offers consultations for holographic filter projects in AR/MR/VR.

Image Recovery or Image Reconstruction of an Imaging System

Blurring is a significant source of image degradation in an imperfect imaging system. The optical system's point spread function (PSF) describes the amount of blur in a given imaging system and is often used in image reconstruction or image recovery algorithms. Below is an example of using an inverse PSF to eliminate barcode image degradation.

Barcodes are found on many everyday consumer products. A typical 1-D (one-dimensional) barcode is a series of varying-width vertical lines (called bars) and spaces. An example of the popular GS1-128 symbology barcode is shown here:

The signal amplitude of the barcode image changes only in the horizontal direction (the X-direction), so for an imaging system used to capture and decode the barcode it is sufficient to look at a one-dimensional intensity profile along the X-direction. Under good conditions the profile may look like this:

With such a good scan, it is trivial to recover the initial binary (pure black and pure white) barcode: one sets a threshold midway between the maxima and minima of the received signal and assigns whatever is above the threshold to white and whatever is below it to black. However, when the point spread function (PSF) of the imaging system is poor, it may be difficult or impossible to set a proper threshold. See the example below:

The PSF is the impulse response of an imaging system; it carries information about the image formation and the system's aberrations and imperfections. To correctly decode a barcode in such situations, one may use inverse PSF information to improve the received signal. The idea is to deduce the inverse PSF from the signals obtained from many scans of different barcodes of the same symbology. All barcodes of a given symbology, such as GS1-128, share common features defined by the symbology standard. This permits calculating the inverse PSF coefficients by minimizing the deviation of the received signals from the ideal barcode profile signals.
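The midpoint-threshold recovery described above for a clean scan can be sketched in a few lines (the function name and sample data are illustrative, not from the article):

```python
import numpy as np

def binarize_profile(profile: np.ndarray) -> np.ndarray:
    """Recover a binary bar/space pattern from a clean 1-D intensity profile
    by thresholding at the midpoint between the signal's extremes."""
    threshold = (profile.max() + profile.min()) / 2
    return profile > threshold   # True = white (space), False = black (bar)

# A clean scan: high values are spaces (white), low values are bars (black).
scan = np.array([0.9, 0.85, 0.1, 0.15, 0.9, 0.1, 0.88])
print(binarize_profile(scan).astype(int))  # [1 1 0 0 1 0 1]
```

With a badly blurred profile, bars and spaces overlap around the midpoint and no single threshold separates them, which is exactly the failure case the inverse-PSF correction addresses.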
A small number of inverse PSF coefficients, such as 15, may be used to correct the received signals and make them as close to the ideal barcode signals as possible in the least-squares sense. The inverse PSF coefficients were found and used to convert the poor received signal shown previously into the better signal shown in red in the next picture. While the recovered red signal is not ideal, it does permit setting a threshold and correctly recovering the scanned barcode.
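The least-squares fit of inverse PSF coefficients can be sketched as an FIR-filter fit: solve for the tap weights that best map sliding windows of the received (blurred) signal onto the ideal binary profile. This is a minimal illustration of the approach described above, not the article's actual implementation; the 15-tap count matches the text, while the edge padding, the demo blur kernel, and the synthetic profile are assumptions.

```python
import numpy as np

def fit_inverse_psf(received: np.ndarray, ideal: np.ndarray,
                    n_taps: int = 15) -> np.ndarray:
    """Fit FIR coefficients h so that filtering the received signal with h
    approximates the ideal profile in the least-squares sense."""
    pad = n_taps // 2
    padded = np.pad(received, pad, mode="edge")
    # Each row holds the n_taps samples centered on one output position.
    windows = np.array([padded[i:i + n_taps] for i in range(len(received))])
    h, *_ = np.linalg.lstsq(windows, ideal, rcond=None)
    return h

def apply_inverse_psf(received: np.ndarray, h: np.ndarray) -> np.ndarray:
    pad = len(h) // 2
    padded = np.pad(received, pad, mode="edge")
    windows = np.array([padded[i:i + len(h)] for i in range(len(received))])
    return windows @ h

# Demo: blur a synthetic binary profile with a known PSF, then correct it.
rng = np.random.default_rng(0)
ideal = (rng.random(400) > 0.5).astype(float)
psf = np.array([0.1, 0.2, 0.4, 0.2, 0.1])          # symmetric blur kernel
received = np.convolve(ideal, psf, mode="same")
h = fit_inverse_psf(received, ideal)
recovered = apply_inverse_psf(received, h)
# The corrected signal sits closer to the binary profile than the blurred one.
print(np.linalg.norm(recovered - ideal) < np.linalg.norm(received - ideal))
```

In practice the fit would pool windows from many scans of different barcodes of the same symbology, as the article describes, rather than fitting against a single known profile.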

How to Read an Optical Drawing

An optical drawing is a detailed plan that allows us to manufacture optical components according to a design and given specifications. When optical designers and engineers come up with a design, they condense it into an optical drawing that can be understood by manufacturers anywhere. ISO 10110 is the most popular standard for optical drawings; it describes all optical parts in terms of tolerances and geometric dimensions. The image below shows the standard format of an optical drawing. Notice the three main fields: the upper third, shown here in blue, is called the drawing field; under this, the green area is known as the table field; and below this is the title field or, alternately, the title block (shown here in yellow).

Once an optical drawing is completed, it will look something like this:

Notice the three fields: the drawing field, the table field, and the title field. We'll look at each of them in turn.

Field 1: Drawing Field

The drawing field contains a sketch or schematic of the optical component or assembly. In the drawing here, we see key information on surface texture, lens thickness, and lens diameter.

P3 means level 3 polished and describes the surface texture. Surface texture tells us how close our surface is to a perfectly flat ideal plane and how extensive the deviations are.

63 refers to the lens diameter, the physical measurement of the diameter of the front-most part of the lens.

12 refers to the lens thickness, the distance along the optical axis between the two surfaces of the lens.

After reviewing the drawing field we know this is a polished bi-convex lens, and we know exactly how large and how thick it is. But there is more we need to know before we begin production. To find this additional information, we look at the table field.

Field 2: Table Field

In our example, the optical component has two optical surfaces, and the table field is broken into three subfields.
The left subfield gives the specifications of the left surface, the right subfield gives the specifications of the right surface, and the middle subfield gives the specifications of the material.

Surface Specifications

Sometimes designers will indicate "CC" or "CX" after the radius of curvature: CC means concave, CX means convex.

Material Specifications

1/ : Bubbles and Inclusions. Usually written as 1/AxB, where A is the number of allowed bubbles or inclusions in the lens and B is the side length of a reference square in millimeters.

2/ : Homogeneity and Striae. Usually written as 2/A;B, where A is the class number for homogeneity and B is the class number for striae.

Field 3: Title Field

The last field on an optical drawing is called the title field, and it is here that all the bookkeeping happens. The author of the drawing, the date it was drawn, and the project title will be listed here, along with the applicable standards. Often there will also be room for an approval, a revision count, and the project company. A final crucial piece of information is the scale: is the drawing done at 1:1, or some other scale?

Now you know how to read an optical drawing and where to find the information you're looking for. If you have any other questions, feel free to contact us!
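The two material indications described above have a regular enough shape to parse mechanically. The sketch below follows the article's reading of the 1/AxB and 2/A;B codes; it is an illustrative helper, not part of ISO 10110, and the example values are made up. Real drawings carry additional indication types (such as 0/ for stress birefringence) that this sketch ignores.

```python
import re

def parse_material_spec(spec: str) -> dict:
    """Parse the two ISO 10110 material indications discussed above:
    1/AxB -> A allowed bubbles/inclusions, B = reference square side (mm)
    2/A;B -> homogeneity class A, striae class B
    Illustrative sketch only; other indication types are not handled."""
    m = re.fullmatch(r"1/(\d+)x([\d.]+)", spec)
    if m:
        return {"type": "bubbles/inclusions",
                "count": int(m.group(1)),
                "square_side_mm": float(m.group(2))}
    m = re.fullmatch(r"2/(\d+);(\d+)", spec)
    if m:
        return {"type": "homogeneity/striae",
                "homogeneity_class": int(m.group(1)),
                "striae_class": int(m.group(2))}
    raise ValueError(f"unrecognized material indication: {spec}")

print(parse_material_spec("1/3x0.16"))
print(parse_material_spec("2/2;1"))
```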
