The global food inspection industry needs newer and more precise tools to meet stringent government regulations. From specialty crops to seafood, meat, and poultry, the food safety testing market alone was valued at $19.5 billion USD in 2021 and is projected to reach $28.6 billion by 2026 [1]. Hyperspectral Imaging (HSI) is one tool that can alleviate tedious, labor-intensive inspection tasks and bring a new level of consistency to historically subjective grading applications.
Developed originally for remote-sensing applications involving imagery from aircraft and satellites, HSI has since become a commercially viable technique for advanced machine vision applications. HSI sensors act like thousands or millions of spectrometers, providing the chemical signatures of the reflected light at each pixel of an image. Headwall’s sensors can be tuned to wavelength ranges beyond the ability of the human eye to discern, from the ultraviolet and visible (UV and VIS) through the near-infrared and shortwave-infrared (VNIR, NIR, and SWIR) ranges.
Hyperspectral-imaging sensors can distinguish spectral features that elude the human eye and conventional color imaging. These features make it possible to detect potentially harmful foreign matter and to sort and grade material such as food products, whose value is often tied to characteristics that an HSI system measures more consistently than a human inspector, who is subject to fatigue or even the effects of something as simple as a varying amount of coffee each day [2].
Systems utilizing HSI have faced significant hurdles in industrial deployment because of the need to handle comparatively vast amounts of raw data and the relative complexity of spectral-classification model development. However, newer HSI platforms such as Headwall’s Hyperspec® MV.X imaging system combine a high-performance imaging spectrometer with powerful embedded computing and software to rapidly create spectral-classification models, extract actionable results in real time, and send instructions over the local network to take action or to collect monitoring and control data.
The human eye, as capable as it is, can only detect images that fall into the visible-light spectrum between 400 nm and 700 nm. Within this range, color perception is built from just three broad regions of red, green, and blue (RGB), and each person’s color sensitivity and perception varies widely. Nevertheless, the food-inspection industry has depended for centuries on human inspectors, and more recently on RGB sensors, to detect problems and grade products. These problems include foreign objects missed earlier in the harvesting process and even hard-to-detect disease conditions that may be largely invisible to either of these traditional methods. The stakes are high: consumer preference, the ability to meet new governmental regulations, and corporate shareholder value can all hinge on the precision and effectiveness of how inspection is implemented across all facets of the food industry.
‘Spectral imaging’ sensors can be subdivided into two categories. Multispectral sensors comprise a handful of spectral bands, anywhere from four to dozens, whereas hyperspectral sensors provide a much more granular (i.e., high spectral resolution) look, since they can capture literally hundreds of spectral bands at a time. Both provide a much more complete picture of foods under inspection because they go well beyond the simple RGB paradigm so commonly and traditionally used.
Humans see color as combinations of red, green, and blue within a very small region of the electromagnetic spectrum. A multispectral sensor adds a few more bands, capturing and analyzing more than three areas of the spectrum, while hyperspectral imaging utilizes hundreds of bands. This delivers the performance of a point-measurement spectrometer at each pixel of an image that may consist of millions of pixels.
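To make the difference concrete, the sketch below (Python with NumPy) contrasts a conventional RGB image with a hyperspectral data cube; the array dimensions and the 300-band, 400–1000 nm layout are illustrative assumptions rather than the specification of any particular sensor.

```python
import numpy as np

# A conventional color image: height x width x 3 values (R, G, B).
rgb_image = np.zeros((480, 640, 3), dtype=np.uint8)

# A hyperspectral data cube: height x width x bands.
# Here we assume 300 bands spanning 400-1000 nm (a VNIR-like range).
n_bands = 300
cube = np.zeros((480, 640, n_bands), dtype=np.uint16)
wavelengths_nm = np.linspace(400.0, 1000.0, n_bands)   # one center wavelength per band

# Each pixel carries a full spectrum rather than three color values,
# so a single pixel behaves like a point-measurement spectrometer.
spectrum_at_pixel = cube[240, 320, :]    # shape: (300,)
print(rgb_image.shape, cube.shape, spectrum_at_pixel.shape)
```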
An ‘advanced’ machine-vision system, for the purposes of this discussion, may consist of one or more spectral imaging sensors, a suitable illumination source, and a computer that collects the image data while communicating with downstream robotics. The sensor presents image data to the computer in real time, and the results are then sent onward to the robotics system, which acts immediately based on algorithms and instructions. In some cases, it may simply pick and discard a piece of foreign material (pass/fail). In other cases, it may direct certain colorations of a product to another line for further processing (product ‘grading’). For recycling applications, it can classify and separate different yet similar-looking types of plastics along high-speed lines.
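A minimal sketch of that decision flow is shown below; the class, labels, and actions are hypothetical placeholders used only to illustrate how a classification result might map to pass/fail or grading behavior, not part of any real system’s interface.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """One item found on the conveyor, with its classified material label."""
    x_mm: float   # position across the belt
    y_mm: float   # position along the belt
    label: str    # e.g. "product_grade_a", "product_grade_b", "foreign_material"

def decide_action(obj: DetectedObject) -> str:
    """Map a classification result to a downstream robotic action."""
    if obj.label == "foreign_material":
        return "reject"          # pass/fail: remove the item from the line
    if obj.label == "product_grade_b":
        return "divert_line_2"   # grading: route to a secondary processing line
    return "pass"                # grade-A product continues down the main line

# Example: objects reported by a (hypothetical) spectral classifier
for obj in [DetectedObject(120.0, 45.0, "foreign_material"),
            DetectedObject(310.0, 47.5, "product_grade_b")]:
    print(obj.label, "->", decide_action(obj))
```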
The hyperspectral sensor is not a standalone device but rather an important and very accurate part of an entire advanced machine vision system. By one estimate [3], machine vision has been employed in less than 20% of the applications for which it is potentially useful. It is therefore sensible to discuss ways in which this powerful imaging technology can make inspection processes better and more economically efficient.
The HSI sensor can be thought of as a ‘new set of eyes’ acting as a sentinel standing watch over inspection lines for however long the typical production cycle lasts. Its ability to ‘talk’ to other elements of the system is a crucial reason hyperspectral sensing is favored as a new tool for the industry, with capabilities that far surpass those of RGB units.
A “pushbroom” hyperspectral sensor captures images by scanning line by line through a slit (left), each line containing pixels that record the spectral characteristics of the subject (center). As the sensor moves relative to the area or object being scanned, a dataset is built up (right). The resulting dataset can be thought of as a stack in which each layer represents a particular “band” (a small range of wavelengths). Pixels are stitched together to form an image in which each pixel contains not just conventional RGB values but hundreds of values across the wavelength range of the sensor.
The basic function of a hyperspectral sensor is to capture individual slices of an incoming scene, through a physical slit in the case of a ‘pushbroom’ design, and to break each slice into discrete wavelength components that are then presented to a focal plane array (FPA). A diffraction grating manages the task of dispersing each image slice into its discrete wavelength components. The grating is engineered with a precise groove profile to maintain spatial coherence along one dimension (the length of the image slit) while dispersing the light across the other (the width of the slit, measured in microns). This dispersion directs the spectral content onto known wavelength channels on the sensor.
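The practical consequence is that each pixel position along the spectral axis of the focal plane array corresponds to a known wavelength. The sketch below shows the idea with a simple linear mapping; the pixel count and calibration coefficients are made-up values, and real instruments ship with a measured (often polynomial) calibration.

```python
import numpy as np

# Assume 640 pixels along the spectral axis of the focal plane array and a
# linear wavelength calibration (illustrative coefficients only).
n_spectral_pixels = 640
wl_start_nm = 400.0    # wavelength landing on the first spectral pixel
wl_step_nm = 0.94      # nanometers of spectrum per detector pixel

spectral_pixel_index = np.arange(n_spectral_pixels)
wavelength_nm = wl_start_nm + wl_step_nm * spectral_pixel_index

print(wavelength_nm[0], wavelength_nm[-1])   # roughly 400 nm to 1000 nm
```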
The all-reflective pushbroom spectral line-scanning technology used by Headwall captures a spectral line (X spatial and Z spectral) in each ‘frame’; sequential frames build up the Y spatial dimension. The pushbroom design is preferred for its low distortion at very high spatial and spectral resolution. High throughput translates to a high signal-to-noise ratio and very low stray light, and because the design is all-reflective, chromatic aberration is eliminated.
When viewed through the slit of the hyperspectral sensor, all we see is the spatial strip that the slit lets through, equivalent to one column of pixels. The spatial detail of the image is still visible, but only one strip at a time. Within that strip, each spatial pixel contains many colors, and the HSI system separates the light at each pixel into its spectral components. Each time the camera images the slit, it captures a full frame of spectral data for every spatial pixel. As the sensor moves across the scene, these slit images are stacked, and advanced hyperspectral processing software stitches them together into a full ‘data cube.’
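A minimal sketch of how that cube is assembled, frame by frame, is shown below; the frame dimensions are assumptions for illustration, and the read_next_frame function is a stand-in for whatever acquisition API a given system provides.

```python
import numpy as np

n_spatial = 640   # pixels along the slit (X, across the belt)
n_bands = 300     # spectral channels (Z)
n_lines = 200     # frames captured as the scene moves past (Y)

def read_next_frame():
    """Stand-in for grabbing one frame from the sensor: a single line of the
    scene, with a full spectrum for every spatial pixel."""
    return np.random.randint(0, 4096, size=(n_spatial, n_bands), dtype=np.uint16)

# Stack successive line frames along the motion axis to build the data cube.
cube = np.empty((n_lines, n_spatial, n_bands), dtype=np.uint16)
for y in range(n_lines):
    cube[y] = read_next_frame()

print(cube.shape)   # (lines, spatial pixels, spectral bands)
```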
One characteristic of spectral imaging that makes it well suited to advanced machine vision applications is its reliance on movement. Because pushbroom sensors capture image data frame by frame, they depend on relative motion: either the sensor moves over the field of view (as it would if attached to a drone or aircraft in remote-sensing applications), or the field of view moves beneath the sensor (as it does in an advanced machine vision deployment).
The precision agriculture community has adopted both hyperspectral and multispectral sensors as payloads for drones and aircraft that fly above crop fields. These sensors capture a wealth of vital agricultural data, expressed through indices such as NDVI (Normalized Difference Vegetation Index), PRI (Photochemical Reflectance Index), WBI (Water Band Index), the Red Edge Ratio, and many more. Crop vitality, fertilization and irrigation effectiveness, and early signs of invasive species and diseases can all be seen within the hundreds of bands of a Visible-Near-Infrared (VNIR) sensor that ‘sees’ between 400 nm and 1000 nm.
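As one example of how such an index is derived, NDVI compares reflectance in a near-infrared band with reflectance in a red band: NDVI = (NIR − Red) / (NIR + Red). The sketch below assumes a reflectance cube with a known wavelength axis; the band centers of roughly 670 nm and 800 nm are conventional choices, not values tied to any particular sensor.

```python
import numpy as np

def band_index(wavelengths_nm, target_nm):
    """Index of the spectral band closest to the requested wavelength."""
    return int(np.argmin(np.abs(wavelengths_nm - target_nm)))

def ndvi(reflectance_cube, wavelengths_nm, red_nm=670.0, nir_nm=800.0):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    red = reflectance_cube[:, :, band_index(wavelengths_nm, red_nm)]
    nir = reflectance_cube[:, :, band_index(wavelengths_nm, nir_nm)]
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

# Example with synthetic data: 100 x 100 pixels, 300 bands from 400-1000 nm
wavelengths = np.linspace(400.0, 1000.0, 300)
cube = np.random.rand(100, 100, 300)
print(ndvi(cube, wavelengths).shape)   # (100, 100)
```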
Hyperspectral sensors can be placed along the production line and connected to robots that take appropriate action based upon real-time analysis performed in the embedded processors in the sensor systems themselves.
Along a high-speed conveyor, the same level of meaningful data can be collected to positively impact the inspection process. Frame-rate and field-of-view characteristics are such that the sensors are more than capable of monitoring wide lines operating at high speeds. The high level of discrimination afforded by HSI means that even hard-to-distinguish anomalies are seen and managed. A blueberry in a field of strawberries is easy to spot, but what about minute color or chemical differences within the same crop, or within similar-appearing recycled materials? Only hyperspectral imaging can distinguish differences that are effectively invisible to the eye or to conventional RGB cameras.
Light rays that enter the slit of a pushbroom hyperspectral sensor are separated into a spectrum of colors, much as in a rainbow, in this case by a holographic grating with very fine grooves. The spectrum falls onto a 2D photosensor, and software converts the signal level at each photosensitive pixel into a spectral curve at each pixel of the image as the sensor moves relative to the object being scanned.
Although HSI sensors are sometimes referred to as ‘cameras,’ they are in truth a marriage of spectrometers and cameras. Headwall’s sensors are based on an all-reflective design with no moving parts or potentially problematic transmissive optics. This is accomplished by using holographic diffraction gratings to manage the incoming light passing through the image slit. The gratings are not only exceptionally precise but also small and light, which allows the instruments themselves to be small and light for easy deployment anywhere.
Headwall is the only spectral sensor manufacturer that also makes its own diffraction gratings. Each grating is ‘master quality,’ meaning identical groove profiles from one to the next for a given application. Since the fundamental optical performance of the sensor is a function of the grating, this capability represents true differentiation. Hyperspectral sensors are designed and ‘tuned’ to specific spectral ranges. Within each range, literally hundreds of spectral bands are collected, giving a very precise and highly resolved view of everything moving along the inspection line, both spectrally and spatially.
The Visible-Near-Infrared (VNIR) range covers 400 nm to 1000 nm, and the Extended VNIR range covers 550 nm to 1700 nm. The Near-Infrared (NIR) range collects image data from 900 nm to 1700 nm, while the Shortwave-Infrared (SWIR) range covers 900 nm to 2500 nm. Since materials reflect light distinctively at certain wavelengths within these ranges, it is important to first define the signatures themselves. Then, through algorithms, the sensor can characterize material or detect anything not precisely defined as ‘good,’ not only foreign material but also hard-to-distinguish ‘grading’ differences from one berry or one nut to another. This is a very valuable characteristic of hyperspectral imaging, since its specificity goes far beyond that of more traditional RGB sensors.
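One common way to use these per-pixel signatures is to compare each measured spectrum against a reference ‘good’ signature and flag pixels that deviate too much, for instance using the spectral angle between the two. The sketch below illustrates that idea; the reference spectrum, threshold, and synthetic data are assumptions for demonstration and do not describe any vendor’s specific algorithm.

```python
import numpy as np

def spectral_angle(spectrum, reference):
    """Angle (radians) between a measured spectrum and a reference signature;
    smaller angles mean a closer spectral match."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def flag_anomalies(cube, reference, threshold_rad=0.10):
    """Return a boolean mask of pixels that do not match the 'good' signature."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.array([spectral_angle(s, reference) for s in flat])
    return (angles > threshold_rad).reshape(h, w)

# Synthetic example: a cube of 'good' spectra with a patch of foreign material
wavelengths = np.linspace(400.0, 1000.0, 300)
good = np.exp(-((wavelengths - 650.0) / 120.0) ** 2)        # made-up signature
cube = np.tile(good, (50, 50, 1)) * np.random.uniform(0.9, 1.1, (50, 50, 1))
cube[20:25, 20:25, :] = np.exp(-((wavelengths - 900.0) / 80.0) ** 2)
print(flag_anomalies(cube, good).sum(), "pixels flagged")
```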
Since HSI sensors measure and analyze reflected light, illumination is an important consideration. The overall objective is to provide the sensor’s field of view with an extremely uniform, consistent form of illumination that is simultaneously robust and economical.
For the VNIR spectral range, Quartz Tungsten Halogen (QTH) represents one such illumination technology, while newer LED light sources may be seen as another, albeit less mature, alternative. Bundled optical fiber also presents a uniform light source. Much of what interests the food-inspection industry ‘reflects’ at ranges beyond the visible, which cuts off at around 700 nm, so having a light source covering either side of this point is vital.
Beyond running as cool as possible, being robust, and being uniform, the light source needs to fully traverse the width of the inspection line. This edge-to-edge capability takes advantage of the wide field of view of the sensor, allowing inspected product to be seen not only directly beneath the sensor itself but also off to the edges. There is no regimentation in a high-speed food inspection line: product could be anywhere, along the edges or bunched together on the conveyor belt. Longevity of the light source is also important, since many food-inspection lines run around the clock.
Since the sensor builds a ‘cube’ of image data one slice at a time and the illumination itself is a very thin strip, the region of interest (the ‘slit image’) is what needs to be illuminated. A white reflectance target is used to calibrate the sensor prior to actual operation. This is a crucial step because the sensor is collecting image data that a downstream robotics system (e.g., vacuum, air knives, picking claw) will use to segregate ‘good’ from ‘bad.’ The objective is always to present the right kind of light at the right intensity, exactly where it is needed. It is also important that documentation exists detailing the wavelengths and intensity of the light across the field of projection, the uniformity of the light, and its degradation over standard distances. This way, the exact positioning of the sensor relative to the line can be determined should any adjustment to the architecture of the line become necessary.
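The standard correction derived from these reference measurements converts raw sensor counts to relative reflectance, typically as reflectance = (raw − dark) / (white − dark). The sketch below shows that step; the frame shape and the use of a dark reference are assumptions for illustration.

```python
import numpy as np

def to_reflectance(raw_frame, white_frame, dark_frame):
    """Convert raw counts for one slit frame (spatial x bands) to relative
    reflectance using white and dark reference measurements."""
    numerator = raw_frame.astype(np.float32) - dark_frame
    denominator = np.clip(white_frame.astype(np.float32) - dark_frame, 1.0, None)
    return np.clip(numerator / denominator, 0.0, None)

# Synthetic example: one slit frame of 640 spatial pixels x 300 bands
raw = np.random.randint(200, 3000, (640, 300)).astype(np.float32)
white = np.full((640, 300), 3500.0, dtype=np.float32)
dark = np.full((640, 300), 100.0, dtype=np.float32)
print(to_reflectance(raw, white, dark).mean())
```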
Robotic subsystems are a natural element of many advanced machine vision processing-line applications. The ability to discriminate and eliminate depends on the ability of the sensor and robotic system to communicate rapidly and faithfully, in real time. Hyperspec® sensors can run hundreds of frames per second, meaning they are well-suited from both an operational and economic standpoint to work with high-speed lines and the robotic systems embedded in them.
The machine vision industry understands that integrating a wide range of subsystems into a seamless, continuously running line demands communication protocols that are industry-standard and fast. Gigabit Ethernet is often used to tie everything together from a data-flow perspective. The HSI systems and the computers that manage the incoming data all work with Gigabit Ethernet, as well as with other very fast communication links such as Camera Link.
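As a rough illustration of how a result might travel over such a link, the sketch below sends one classification outcome as a small JSON message over TCP. The host, port, and message fields are hypothetical; production lines typically rely on whatever industrial protocol the integrator specifies rather than ad-hoc JSON.

```python
import json
import socket

def send_result(host, port, result):
    """Send one classification result as a newline-terminated JSON message."""
    payload = (json.dumps(result) + "\n").encode("utf-8")
    with socket.create_connection((host, port), timeout=1.0) as sock:
        sock.sendall(payload)

# Hypothetical message describing an item to reject and where it sits on the belt
result = {"frame": 18452, "label": "foreign_material",
          "x_mm": 120.4, "y_mm": 45.0, "action": "reject"}
# send_result("192.168.1.50", 5000, result)   # address and port are placeholders
```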
Inspecting specialty crops such as nuts and berries involves looking at largely similar-looking items with small degrees of variability. Dividing the signal or image into hundreds of hyperspectral ‘channels’ is therefore a benefit to the industry.
Since ease of use is paramount and users must come up to speed quickly, Headwall’s software is intuitive and contains functions that allow users to modify and adapt their inspection processes based on what the sensors see. The algorithm-based process pinpoints the spectral characteristics users might encounter. For example, almonds with insect damage are nearly indistinguishable from ‘good’ almonds under RGB analysis, but the same scene classified using HSI calls attention to the damaged ones, which can then be eliminated by the downstream robotic system.
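While the internals of any vendor’s software differ, the general workflow is to collect labeled example spectra (e.g., ‘good’ versus insect-damaged almonds) offline, fit a classification model, and then apply that model to every pixel in real time. The sketch below illustrates that workflow with scikit-learn on synthetic data; the model choice, labels, and data are illustrative assumptions, not a description of Headwall’s algorithms.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Labeled training spectra gathered offline: rows are pixel spectra (300 bands),
# labels are 0 = good almond, 1 = insect-damaged (synthetic data here).
rng = np.random.default_rng(0)
good = rng.normal(0.6, 0.05, (500, 300))
damaged = rng.normal(0.5, 0.05, (500, 300)) + np.linspace(0, 0.1, 300)
X = np.vstack([good, damaged])
y = np.array([0] * 500 + [1] * 500)

model = LogisticRegression(max_iter=1000).fit(X, y)

# At run time, classify every pixel spectrum in an incoming slit frame
frame = rng.normal(0.6, 0.05, (640, 300))   # 640 spatial pixels x 300 bands
labels = model.predict(frame)               # 0/1 per spatial pixel
print(int(labels.sum()), "pixels flagged as damaged")
```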
The combination of innovative sensors, software, and workflow gives a growing number of users access to HSI through true solutions that use spectral data not only to detect contamination but also to ‘grade’ products so that less is wasted and more is converted to revenue. Hyperspectral imaging unlocks that possibility within the machine vision industry.
This article was written by Christian Felsheim, Director, Headwall Photonics EMEA, and Dr. Will Rock, Senior Application Engineer, Headwall Photonics (Bolton, MA).
This article first appeared in the March 2022 issue of Photonics & Imaging Technology Magazine.