Scientists have long used hyperspectral imaging – essentially, photography that captures many narrow bands of light, including wavelengths invisible to the human eye – to reveal information about objects that ordinary cameras miss. Imaging beyond the visible spectrum is how night-vision cameras work, and analysing light wavelength by wavelength is how we know what distant stars are made of. One startup, ImpactVision, wants to use these superhuman visual powers to reform the food supply chain.
How Can Hyperspectral Imaging Help the Food Industry?
Different molecules absorb and reflect light at distinctive wavelengths, allowing hyperspectral imaging to detect the presence or absence of many different chemicals. This is a pretty wide-ranging ability – so why is it any use to the food industry?
Many practical characteristics of food, such as ripeness, nutritional value and bacterial contamination, can be deduced from the presence of particular molecules. Starch and sugar levels in potatoes, for example, determine how each one is best used. High-starch potatoes are crumblier and perfect for mashing, while low-starch potatoes are firmer and better suited to salads. Potatoes low in sugar are better for frying.
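To make the idea concrete, here is a minimal, purely illustrative sketch of how the presence of a molecule might be inferred from a single hyperspectral pixel: compare the reflectance at a wavelength where that molecule absorbs strongly against a nearby baseline wavelength. The band layout, wavelengths and threshold logic below are hypothetical and not taken from ImpactVision or any real camera.

```python
import numpy as np

# A hyperspectral "pixel" is a vector of reflectance values, one per wavelength band.
# The band layout and the example numbers below are illustrative only, not real
# calibration data for any specific molecule or camera.
WAVELENGTHS_NM = np.linspace(400, 1000, 121)   # 121 bands from 400 nm to 1000 nm

def band_index(target_nm: float) -> int:
    """Index of the band closest to a target wavelength."""
    return int(np.argmin(np.abs(WAVELENGTHS_NM - target_nm)))

def absorption_score(spectrum: np.ndarray, feature_nm: float, baseline_nm: float) -> float:
    """
    Compare reflectance at a (hypothetical) absorption wavelength for the molecule
    of interest against a nearby baseline wavelength. A lower ratio suggests
    stronger absorption, i.e. more of that molecule present.
    """
    return spectrum[band_index(feature_nm)] / spectrum[band_index(baseline_nm)]

# Example: a synthetic spectrum with a dip around 970 nm (a region associated
# with water absorption; used here purely for illustration).
spectrum = np.ones_like(WAVELENGTHS_NM)
spectrum[band_index(970)] = 0.6

score = absorption_score(spectrum, feature_nm=970, baseline_nm=900)
print(f"absorption score: {score:.2f}")   # below 1.0 indicates a dip at the feature wavelength
```

Real systems calibrate these spectral signatures against laboratory measurements and typically combine many bands rather than just two.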
Currently, food businesses have to assess food quality and other characteristics either by eye (which can be unreliable) or by destroying samples in lengthy laboratory tests. Hyperspectral cameras offer a faster alternative that can inspect food without damaging it.

ImpactVision’s Approach
ImpactVision was founded by Abi Ramanan while she attended Singularity University’s Global Solutions Programme. The startup is building software that uses machine learning to rapidly and reliably translate hyperspectral images into useful information for the food industry and consumers.
Ramanan’s goal is to reduce the astonishing amount of food that is currently wasted throughout the supply chain. Did you know that around one third of the food the world produces is wasted?
Initially, ImpactVision is focusing on industrial applications. Their “FishCam” software package can reduce food fraud by telling the difference between fresh fillets and those that have been frozen and thawed, and also between different species of fish.
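As a rough illustration of how machine learning can turn spectra into a fresh-versus-frozen call, the sketch below trains a generic classifier on synthetic fillet spectra. It is not ImpactVision’s FishCam; the data, band count and model choice are all assumptions made purely for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: each row is the mean spectrum of one fillet
# (reflectance across 121 wavelength bands), each label is 0 = fresh,
# 1 = previously frozen. A real system would use labelled measurements
# from actual fillets; this is only a shape-compatible illustration.
rng = np.random.default_rng(0)
n_samples, n_bands = 200, 121
X = rng.normal(loc=0.5, scale=0.05, size=(n_samples, n_bands))
y = rng.integers(0, 2, size=n_samples)
# Give the "frozen then thawed" class a slightly different spectral shape
# so the toy problem is learnable.
X[y == 1, 60:80] += 0.1

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The same pattern – labelled spectra in, a trained model out – extends naturally to other tasks, such as telling fish species apart or predicting properties of meat.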
“MeatCam” can predict the shelf life and tenderness of beef. This information could be used to set prices and to reduce waste by helping suppliers schedule deliveries, which is particularly important given the high carbon footprint of beef compared with other meats and meat alternatives.
ImpactVision is collecting hyperspectral data on many different foods, which should allow them to expand the range of products their software can analyse in the future.
Superpowers in Your Pocket
ImpactVision hopes that hyperspectral cameras will soon be small and cheap enough to be integrated into smartphones, giving consumers quick and easy access to the powers of hyperspectral imaging. ImpactVision suggests that their “FruitCam” software could be adapted into an app that tells you in real time which pieces of fruit are ripest and should be used up first.
With hyperspectral cameras for smartphones being developed by several organisations, including the VTT Technical Research Centre of Finland and Israeli startup Unispectral, it may not be long before this vision becomes a reality.

I can’t help but wonder if an app like this could make us forget how to use our own senses to judge ripeness from subtle variations in texture, colour and scent – much like I rely on weighing scales to measure out baking ingredients instead of being able to estimate by eye.
On the other hand, hyperspectral imaging can detect a far wider range of wavelengths than our eyes can. ImpactVision even describes their technique as giving people “superpowers”.
Have you got a bright idea for how hyperspectral technology could be used to make the food industry more sustainable?
Do you know a business that could benefit from the software?
ImpactVision is looking for business partners and can be contacted through their website.