Whenever I go to the store to buy food, I always have to stare at the fruit and vegetables to figure out whether they're ripe enough. I'll poke them, squeeze them gently, and sometimes even sniff them, and I still don't always get it right. I figured there had to be a better way, and it came down to science. I contacted an app development team in Singapore to help me build a project I had conceived: an app that tells you how ripe your produce is.
My method for determining ripeness depended on two factors. The first was how the fruit looked. I wanted to use the technology already built into the phone to help people judge their produce, and the best tools for the job were the camera and the flash. The camera takes pictures of the produce and compares them against images of ripe produce, while the flash provides a pseudo X-ray effect by shining bright light through the flesh. The light passing through the flesh feeds data back to the camera, which produces what I called a produce density rating.
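To make the two camera-based readings concrete, here is a minimal sketch of how they might work. Everything in it is an assumption for illustration: the function names, the reference colour, the 0-10 density scale, and the idea of averaging a photo down to one RGB value are all invented, not the app's actual code.

```python
# Hypothetical sketch of the two camera-based readings described above.
# All names, scales, and reference values are illustrative assumptions.

def color_similarity(sample_rgb, reference_rgb):
    """Compare a sample's average colour to a reference 'ripe' colour.
    Returns 1.0 for an exact match, falling toward 0.0 as they diverge."""
    dist = sum((s - r) ** 2 for s, r in zip(sample_rgb, reference_rgb)) ** 0.5
    max_dist = (3 * 255 ** 2) ** 0.5  # largest possible RGB distance
    return 1.0 - dist / max_dist

def density_rating(flash_brightness):
    """Map how much flash light passes through the flesh (0-255, as seen
    by the camera) to a 0-10 'produce density rating'.
    Less light getting through means denser flesh."""
    return round(10 * (1 - flash_brightness / 255), 1)

ripe_banana = (225, 200, 80)   # assumed reference colour for a ripe banana
sample = (210, 190, 95)        # assumed average colour from the user's photo
print(color_similarity(sample, ripe_banana))  # close to 1.0: a near match
print(density_rating(40))                     # little light through: dense
```

In practice the phone would average the colour over the photo's pixels rather than use a single RGB triple, but the comparison step would look much the same.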
The second way of determining ripeness involved attaching a small probe to the phone. This was something I would sell on my website as an optional accessory. Poking the produce with the probe sends data back to the phone that measures resilience, which factors into the ripeness calculation. Creating an app that could do all this took some time, but once it was done, it gave accurate readings that people loved.
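One way the probe's resilience reading could be folded in with the camera readings is a simple weighted blend. The weights, scales, and the assumption that softer, less dense flesh means riper fruit are all invented for this example; the actual app's formula isn't described in the post.

```python
# Illustrative sketch of combining the three readings into one score.
# The 0-100 scale, the weights, and the direction of each factor are
# assumptions made for this example only.

def ripeness_score(color_sim, density, resilience):
    """Blend the three readings into a 0-100 ripeness score.
    color_sim:  0-1 match against a ripe reference colour (camera)
    density:    0-10 produce density rating (flash)
    resilience: 0-10 firmness from the optional probe (10 = very firm)
    Softer flesh and lower density both push the score toward 'ripe'."""
    visual = 100 * color_sim
    softness = 100 * (1 - resilience / 10)
    translucency = 100 * (1 - density / 10)
    # Weight the camera reading most heavily, since the probe is optional.
    return round(0.5 * visual + 0.3 * softness + 0.2 * translucency)

print(ripeness_score(0.95, 8.4, 3.0))  # firm, dense, good colour match
```

If the optional probe isn't attached, the app would presumably redistribute that weight across the two camera readings rather than treat the missing value as zero.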