Before the arrival of Google Glass, few people seemed to understand what exactly Augmented Reality (AR) is. And those who did probably kept wondering how and where in day-to-day life they might use it.
If you haven’t heard about Google Glass or Augmented Reality, this video is a good place to start.
With things becoming clearer now, and possibilities unfolding, the BIG question is whether you need a Google Glass to do all these awesome things. The short answer is no.
Augmented reality is not dependent on Google Glass (and despite the occasional confusion, it is distinct from virtual reality: AR overlays information on the real world rather than replacing it). Although Glass is an amazing gadget to have, one can achieve the same level of interaction through a smartphone application. Most smartphones these days have a good camera, internet connectivity and a microphone.
These are the three basic things a smartphone application needs to provide users a complete experience. However, it’s possible to build applications that require only one or two of them, such as “Camera + Internet Connectivity” or “Mic + Internet Connectivity”, or just the mic or the camera; it depends on the type of application a developer wants to build. Siri on iOS, for example, uses voice recognition to turn your smartphone into a personal assistant.
With that background, let’s move on to our main topic: how we can use AR to determine the freshness of vegetables and fruits.
Imagine you are visiting your nearby supermarket and you have a smartphone application to help you select the pieces to buy.
One might wonder what the big deal is, since he or she can literally see which vegetables are fresh. Well, not exactly – while our eyes can tell us whether a vegetable is spoiled, they can’t accurately determine how fresh those vegetables really are. And they definitely can’t warn us about the chances of food poisoning, or the presence of pesticide residue, if we buy a particular vegetable or fruit.
Beyond this, the possibilities are endless. If the application can detect which vegetable a customer is buying, it can suggest recipes using that vegetable and recommend other ingredients the customer may want to buy. Has it ever happened to you that you bought some vegetables from a supermarket and found out after reaching home that you had forgotten a few other ingredients needed for your favorite dish? This is similar to what Heinz has done with their recipe book application: the detection there is simple, as the app recognizes the Heinz logo and displays a recipe book in an AR view.
- Detection Parameters and Standards – How to make this a Reality
Vegetables and fruits are graded based on their size, color, cleanness, texture and damage. Below is a list of such specifications for various vegetables and fruits:
- Fruit grade standards: http://www.ams.usda.gov/AMSv1.0/freshmarketfruitstandards
- Vegetable grade standards: http://www.ams.usda.gov/AMSv1.0/freshmarketvegetablestandards
Let’s take cauliflower as an example and see what information needs to be captured to determine its freshness and grade: http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELPRDC5050259
Using any object detection algorithm, it’s possible to detect the size or diameter of a vegetable. This is generally done by detecting the edges, lines or curves of an object, and possibly comparing it against a predefined database of images.
AR camera-view applications capture continuous frames of an object. Frame capturing can be stopped either manually, by a user interaction such as a tap or button press, or automatically, when the system detects the object. The color of an object can be determined by comparing sample frames captured over a few seconds, or simply by tracking the hue, saturation and value (HSV) of the object.
An example that detects a red apple using OpenCV is shown here: http://opencv-srf.blogspot.sg/2010/09/object-detection-using-color-seperation.html
Cleanness differs from vegetable to vegetable and mostly depends on the vegetable’s original color: a clean cauliflower should be white, whereas a clean apple might be red or green. The expected color can be predefined or captured from user input, and cleanness can then be detected in the same way as color above.
- Texture Damage Detection
This needs to be calculated through a combination of texture detection, smoothness detection, or by detecting visible black or white spots on the vegetable. It can also be calculated by detecting changes in texture or changes in color patterns. For example, a fresh tomato typically has a uniform red-orange color, while a spoiled tomato may show patches of other colors (such as black spots).
Along with the grading standards above, this whitepaper highlights the key features of fresh vegetables: http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELDEV3103623
- Detection & Classification Process
From the standards above we have a reference index, and after detecting the object’s attributes we get its current index. The comparison can take place in the mobile application without any online processing if the reference index is stored in a local database. However, if we want to update some values dynamically, we either need to roll out an updated version or depend on an offline-caching and online-synchronization mechanism. We can use SAP Mobile Platform for this.
Alternatively, we can do all the heavy lifting on a backend powered by SAP HANA.
SAP HANA is SAP’s implementation of in-memory database technology. HANA DB takes advantage of the low cost of main memory (RAM), data processing abilities of multi-core processors and the fast data access of solid-state drives relative to traditional hard drives to deliver better performance of analytical and transactional applications.
The output of the above gives each piece of produce a final freshness grade. Based on this analysis and on multi-object detection, our application will help shoppers decide which vegetables and fruits they should buy.
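The comparison step can be sketched as a lookup against the locally stored reference index described above. The attribute names and thresholds below are illustrative assumptions, not the actual USDA values from the linked standards.

```python
# Sketch: combining measured attributes into a final freshness grade
# against a locally stored reference index.
REFERENCE_INDEX = {
    "cauliflower": {
        "min_diameter_cm": 10.0,    # illustrative thresholds only
        "max_blemish_spots": 2,
        "min_color_fraction": 0.8,
    },
}

def grade(product: str, diameter_cm: float, blemish_spots: int,
          color_fraction: float) -> str:
    """Score each attribute against the reference index and map to a grade."""
    ref = REFERENCE_INDEX[product]
    score = sum([
        diameter_cm >= ref["min_diameter_cm"],
        blemish_spots <= ref["max_blemish_spots"],
        color_fraction >= ref["min_color_fraction"],
    ])
    return {3: "Grade A", 2: "Grade B"}.get(score, "Reject")

print(grade("cauliflower", 12.0, 1, 0.9))   # Grade A
print(grade("cauliflower", 12.0, 5, 0.6))   # Reject
```

With the reference index bundled locally, this lookup runs entirely on the device; the same logic could instead run on the backend when the index is updated dynamically.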
Recommendation of recipes and other ingredients can be made when certain vegetables or fruits are being purchased.
There are three important steps in the above architecture:
- Step 1: Backend object storage, analysis and recommendation.
- Step 2: An admin uploads sample images or specifications of target objects (vegetables, fruits). This is part of system learning.
- Step 3: Client-side algorithms for object and specification detection.
Augmented reality provides a personalized shopping experience. With the above approach, meaningful recommendations and diet suggestions can be given to customers, helping them live an easier and healthier life. Retail centers can adopt such solutions to ensure the products they sell are of the highest quality.
Customers can select a list of nutrients they want to include and those they want to avoid; our solution can detect them in real time and make suggestions accordingly.
Want to learn what other members of our team are doing?
Visit www.sap.com/mobileinnovation and learn how to achieve your business objectives.