Julie Weiss, Front Rush
Yesterday, while standing in a long, snaking line at Bed Bath & Beyond, I caught myself in a brief state of holiday hypnosis as I gazed into the eyes of a dancing Santa figurine perched on the top shelf of the aisle endcap. Sigh, ‘tis the season for magic. As I came out of my haze I wondered if this familiar shopping experience was soon to change.
Earlier this month Amazon, a company that continues to push the magical envelope when it comes to catering to the consumer, unveiled its newest trick: an even more convenient convenience store they are calling Amazon Go. You may have seen the commercial. Customers walk into the store, are free to put whatever they want in their bag, then simply walk out. No checkout, no lines. It’s all handled through your phone. The Amazon Go app recognizes what you have put in your bag (blows my mind) and then charges you accordingly. So how exactly do they do that?
Amazon has figured out a way to read our minds. Upon entering the store, a drone greets you with a bag full of everything on your shopping list. Okay, okay, so maybe it’s not quite like this (yet).
The concept of the store is made possible through a combination of machine learning, sensors in the form of cameras and microphones, and artificial intelligence. USA Today outlines the flow as follows…
- Customer walks in, taps phone on sensor in an area Amazon is calling the “transition area”
- Surveillance identifies the customer
- Cameras placed throughout the store capture items shoppers pick up and can determine whether the item stays with them or is placed back on the shelf
- Microphones are used to detect where customers are by the noises they make
- Infrared pressure and load sensors are used on the shelves to help note when an item is picked up or put back
- The sensors also tell the store where everything and everyone is at any moment
- Upon exit, items are totaled up and charged to the user’s Amazon account where they will receive a receipt for their purchases.
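The flow above amounts to a running tally that sensors update as you shop. Here is a minimal sketch of that idea as a "virtual cart" in Python — purely a hypothetical illustration of the described flow, with invented class and event names; Amazon’s actual system fuses camera, microphone, and shelf-sensor data in far more sophisticated ways.

```python
# Hypothetical "virtual cart" sketch of the Amazon Go flow described above.
# Class, method, and item names are invented for illustration only.

class VirtualCart:
    def __init__(self, customer_id):
        self.customer_id = customer_id  # identified at the "transition area"
        self.items = {}                 # item name -> (quantity, unit price)

    def pick_up(self, item, price):
        # Cameras and shelf load sensors detect an item leaving the shelf.
        qty, _ = self.items.get(item, (0, price))
        self.items[item] = (qty + 1, price)

    def put_back(self, item):
        # Sensors detect the item returning to the shelf; remove it from the cart.
        qty, price = self.items.get(item, (0, 0))
        if qty > 1:
            self.items[item] = (qty - 1, price)
        elif qty == 1:
            del self.items[item]

    def checkout(self):
        # On exit, the cart is totaled and charged to the customer's account.
        return sum(qty * price for qty, price in self.items.values())

cart = VirtualCart("customer-42")
cart.pick_up("milk", 2.49)
cart.pick_up("meal kit", 8.99)
cart.pick_up("bread", 3.25)
cart.put_back("bread")              # changed their mind mid-aisle
print(f"${cart.checkout():.2f}")    # prints "$11.48"
```

The interesting engineering problem hides in the comments: deciding reliably whether `pick_up` or `put_back` actually happened, and for which shopper, is what the cameras, microphones, and load sensors are for.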
The blueprint is intriguing. The system feels thought out and well planned (so far?). The patent filed by Amazon in 2014 gives us an even closer (and somewhat creepier) look at what is involved.
“The use of cameras can even go as far as to determine your skin color. The patent says this is used to identify the shopper’s hand to see whether they actually pick up anything off of a shelf, but combine that with the fact that Amazon knows what you’re buying and who you are and this is pretty next-level market research data.” (verge.com 12/6/16)
Yes, it does sound a little creepy, but if they make us aware of the creepiness from the get-go while making the shopping experience more efficient in the process, do we give them a pass?
What about the human variable? What if an item is put back in the wrong place? How will the sensors react? Are the microphones able to differentiate between customers? Will quiet shoppers go undetected when drowned out by crying babies? How will Amazon account for multiple cell phones when families, friends (and teams) shop together? Will an army of drones be released on potential shoplifters? It remains to be seen how such variables will be taken into account.
The beta store, located in Seattle, is currently being tested on Amazon employees and is slated to open to the public in early 2017. The inventory consists of basic grocery needs and pre-made meal kits. With Amazon’s hold on e-commerce, it is not hard to imagine how this concept will branch out if proven successful.
Imagine running to the store to stock up on food for your next away game without lines.
Looking past retail, the possibilities that this technology creates are endless.