Point your cell phone camera at an object and click. A search result interface appears as an overlay on your screen. This is the future of internet searches, and the future is here! This is Google Lens. As Google CEO Sundar Pichai says, “It’s a set of vision-based computing abilities that can understand what you’re looking at and help you take action based upon that information.”
A limited version of this can be found at Pinterest and on Samsung’s new Galaxy S8 cell phone, whose visual search Pinterest also powers. But neither Pinterest nor Samsung has Google’s power: access to, and the ability to learn from, billions of searches every day, a seemingly never-ending supply of photos on its servers, satellite imagery of the world, and maps of its streets.
Google Lens runs on image recognition algorithms and artificial intelligence (AI), a field we spoke about recently here at Comanche Marketing. According to Mathworks, “Image recognition is the process of identifying and detecting an object or a feature in a digital image or video.”
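To make that definition concrete, here is a deliberately tiny sketch of recognition as “find the closest known example.” It assumes each image has already been boiled down to a small feature vector; the labels and numbers are made up for illustration, and real systems like Lens learn far richer features with deep neural networks.

```python
import math

# Toy illustration of image recognition as nearest-neighbor matching.
# Each "image" is a hand-made 3-number feature vector; all data is hypothetical.
LABELED_EXAMPLES = {
    "thermostat": (0.9, 0.1, 0.2),
    "compressor": (0.2, 0.8, 0.7),
    "furnace":    (0.1, 0.3, 0.9),
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(features):
    """Return the label whose stored example is closest to the input features."""
    return min(LABELED_EXAMPLES, key=lambda label: distance(features, LABELED_EXAMPLES[label]))

print(recognize((0.85, 0.15, 0.25)))  # closest to the "thermostat" example
```

The point of the toy is the shape of the problem, not the method: recognition is matching what the camera sees against what the system has already been shown, which is why the data fed to the algorithm matters so much later in this article.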
Facebook uses image recognition when you tag someone in a picture. The difference with Google Lens is that, when pointed at an object, it not only produces (or attempts to produce) meaningful search results; it also learns who you are over time and produces results relevant to you.
For instance, let’s say you’re a food critic standing in front of a restaurant with your friend, a bank teller. Your friend points her camera at the building, and an overlay with a website link to restaurant reviews appears. The overlay on your camera, however, focuses on the interests of a food critic, results that aren’t necessarily relevant to the average consumer.
We are limited only by our imaginations as to the applications for image recognition in our businesses today. Ordering parts from the field might be one. Hold the camera up to a compressor and boom, price and availability from your favorite distributor appear. Or hold the camera up to a furnace installation and boom, similar installations from your own picture database come up.
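The parts-ordering idea above boils down to two steps: recognize the part, then look it up in a distributor’s catalog. The sketch below fakes the recognition step (a stand-in function that always “sees” a compressor) so the catalog-lookup flow is clear; every name, part number, price, and distributor here is hypothetical.

```python
# Hypothetical distributor catalog keyed by the label a recognizer would return.
PARTS_CATALOG = {
    "compressor-xl90": {"price": 1450.00, "in_stock": True, "distributor": "Acme Supply"},
    "furnace-gm200":   {"price": 2200.00, "in_stock": False, "distributor": "Acme Supply"},
}

def recognize(photo_bytes: bytes) -> str:
    """Stand-in for a real image-recognition model; always 'sees' a compressor."""
    return "compressor-xl90"

def lookup_part(photo_bytes: bytes) -> dict:
    """Camera-to-catalog flow: recognize the part, then fetch price and availability."""
    label = recognize(photo_bytes)
    entry = PARTS_CATALOG.get(label)
    if entry is None:
        return {"label": label, "found": False}
    return {"label": label, "found": True, **entry}

result = lookup_part(b"...jpeg bytes from the field...")
print(result["label"], result["price"], result["in_stock"])
```

In a real deployment the `recognize` step would call a vision model or service, and the catalog would be a distributor’s live inventory API rather than a dictionary; the structure of the flow stays the same.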
If recognition algorithms work from the data fed into them, is it possible to influence that data? And if so, how might we go about it? Or if what the algorithms produce isn’t relevant, do we have access to the data that has been fed to the algorithm?
Unless someone from the HVAC industry is willing to work with developers like ex-Googler and Nest founder Tony Fadell, it will probably be some time before our industries can influence a technology like AI beyond its original design. For now, it’s enough to learn the operation and function of a device like the Nest thermostat and put it to use.
Recognition algorithms, however, might be different. If we’re able to influence the data set that goes into an algorithm, then what activities should we be pursuing now? The act of branding offers an analogy: if you want to be known as an honest company of integrity, then you must behave with honesty and integrity.
By no means are recognition algorithms a fad. We are at the advent of a technology that will be ingrained and embedded in our personal and business lives. Google’s involvement only accelerates its forward movement. I am merely speculating about how our industry might influence data sets. Consult with your technology advisers and be prepared should the opportunity manifest.
Our country’s railroad system would have gone nowhere without the tracks. Be ready to lay tracks should influencing data sets become doable.
Contemplate a future where your prospective customer points her camera at her old furnace and your company’s info comes up as an overlay. What are the tracks that you should be laying today to make that a reality tomorrow?