Interview Transcript


Regarding the specific hardware you were interfacing with, you mentioned a couple. I'm curious about some of the edge cases that slow down the integration of automation. You mentioned a cylindrical item. I've talked with some former Amazon folks, maybe people you interfaced with, about specific challenges. For example, can the robotic hand determine whether it's a blue iPhone case or a red one? Integrating vision into the system and machine learning—what were some of the bottlenecks or challenges your team faced? Were there areas you felt were solvable problems but kept slowing down the progress of your work?

Some of the biggest problems stemmed from legacy choices in the systems we were interfacing with, which we didn't control. For example, the Kiva floor-automation solution decided long ago to use cubbies with elastic bands to hold products. That works well for humans, who are adept at manipulating elastic bands and reaching into cubbies, but for a fully automated interface a better solution is tubs. I've seen some public information about this approach. Using bins for storage on the floor simplifies the whole process: instead of finding free space in a cubby, you retrieve a bin, unpack it as necessary, repack it, and send it back. In my opinion, that's the future of automated floors.
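To make the contrast concrete, here is a minimal sketch of that bin-centric flow. Everything in it is hypothetical and illustrative; these class and function names are not an Amazon or Kiva API. The point it captures is that the whole bin travels to the workstation, so there is no per-cubby free-space search or elastic-band manipulation step.

```python
from dataclasses import dataclass, field

@dataclass
class Bin:
    """A tub of products, addressed as a unit (illustrative only)."""
    bin_id: str
    items: list = field(default_factory=list)

class Floor:
    """Toy model of a storage floor keyed by bin id."""
    def __init__(self):
        self._bins = {}

    def store(self, b: Bin) -> None:
        self._bins[b.bin_id] = b

    def retrieve(self, bin_id: str) -> Bin:
        # The entire bin is checked out and sent to the workstation;
        # nothing reaches into a cubby in place.
        return self._bins.pop(bin_id)

def repack(b: Bin, remove=(), add=()) -> Bin:
    """Unpack (pick) and repack (stow) at the workstation."""
    for item in remove:
        b.items.remove(item)   # pick
    b.items.extend(add)        # stow
    return b

floor = Floor()
floor.store(Bin("B1", ["phone_case_red", "phone_case_blue"]))

b = floor.retrieve("B1")                                   # bin comes to the station
b = repack(b, remove=["phone_case_red"], add=["cable"])    # pick and stow
floor.store(b)                                             # bin goes back to the floor
```

The design choice the speaker is pointing at is visible in `retrieve`: the unit of automation is the bin, not the slot, which is what makes the interface tractable for robots.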
