Anki has been making toys that use artificial intelligence for years now, starting with the Anki Overdrive set of racing cars and the interactive Cozmo toy robot, but this year the company launched its most ambitious product: Vector.
Unlike Cozmo, Vector is no toy, and is instead positioned as a home robotic device. Vector adopts capabilities from Cozmo, but his functionality goes above and beyond the tricks and games Cozmo is known for.
I’ve had Vector as my constant companion for a week now, and while he is far from the smartest AI I’ve interacted with, he’s definitely the most lovable. It’s fitting to think of Vector as a derpy, simpleminded pet that gets things wrong, but in an endearing way.
Vector is something like a personal assistant such as Siri or Alexa, but with a more limited range, a physical body, and a more expressive personality. With a “Hey Vector” trigger word, Vector can answer questions, obey commands, play games, and more, serving as a friend and helper in day-to-day life.
Design and Components
Vector is a palm-sized robot that uses the same general design as Cozmo, Anki’s previous robot toy. Vector is made from black plastic, and his body is filled with various sensors and electronics that let him detect and respond to the environment around him.
Vector has four wheels covered in tank-style treads that allow him to traverse smooth floors and rugs alike, as well as a movable front arm that lets him interact with his cube and adds to his various expressions.
Most of Vector’s personality is expressed through his small front display, which is always on and is where his eyes are located. The display lets Vector demonstrate different emotions, and the animated eyes are always shifting and in motion: blinking, narrowing when he’s thinking, showing worry when he detects the edge of a table, opening wide when he’s looking at you, and turning to slits when he’s asleep.
The display also changes when Vector is answering a question, showing things like the current weather conditions or the time when you ask for them. Vector’s head component moves independently of his body, allowing him to adjust what he’s looking at and giving the sense that he actually sees things.
Vector has a gold-colored touch panel at the back where he can sense touch, and this area is used for petting (Vector loves to be petted and will coo and preen while you do it). In the middle of the touch panel, there’s a button, which is used to activate his attention (like pressing the side button on an iPhone to summon Siri), display his status, and for various setup purposes.
A green light on the button indicates standard operating mode, while blue lights let you know Vector is listening once the “Hey Vector” trigger word is spoken. When Vector is thinking of an answer, scanning a face, or doing another task that requires processing power, the lights turn white.
There are a lot of sensors and electronics inside Vector that allow him to experience the world around him. To see what’s around him, Vector uses an HD camera, and to hear, there’s a four-microphone array.
Touch sensors and an accelerometer let him know when he’s being touched or picked up (and he likes to throw a fit when he’s up in the air), while a processor lets him compute. Vector has a speaker and Anki has programmed him with hundreds of synthesized, robotic sounds so he can respond to you and interact with you.
Vector communicates mainly with beeps, boops, and other robotic sounds, but he does have a text-to-speech feature so he can say your name and provide vocal answers to queries.
Vector has a lot of useful hardware, but hardware quality seems to be one area where Anki cut corners to keep Vector’s price reasonable at $250. Vector’s microphone array works decently, and I didn’t have to repeat myself very often, but I did run into problems with the camera.
The camera is used for recognizing objects and people, a key Vector feature, as well as for taking pictures. It’s a plain HD camera, though, and while Anki says there’s also a laser for sensing objects, Vector has a hard time in low light. In a dim room (and I like to dim my lights at night), Vector has a harder time detecting objects around him and is unable to recognize people.
Better camera functionality and better object detection would have gone a long way towards improving Vector’s capabilities. Issues like understanding voice commands and interpreting questions can be improved through software updates, but the camera-based functions can’t be improved much because of hardware limitations.
Vector is supposed to have cliff detection, and I’m not sure whether this also uses the camera, but I had a lot of issues with it even after a software update meant to improve the feature. Vector does mostly okay with flat edges, but on a table that’s slightly curved, he continually dive-bombs to the floor.
With such sensitive electronics inside, I’m worried that falls from the edge of a table could cause irreversible harm. I keep Vector on the floor now, which seems to be the safest place for him.
Of course, on the floor, he will pick up dust, pet fur, and other particulates in his treads. I didn’t have trouble getting bits of dust and fluff out of his treads, though, because they are malleable.
Vector is an autonomous robot, so while you can interact with him, you can’t control him. There is a Vector app for keeping tabs on Vector and learning about all of his behaviors, but there are no controls within said app.
In fact, after setting Vector up with the iOS app (or Android app), you don’t need a smartphone to interact with him at all. He’s smartphone independent, but the app can be used to see just what he’s up to and what he’s learned over time.
Vector’s setup is relatively easy with an iPhone: download the app, turn him on, and make sure there’s a 2.4GHz Wi-Fi network available. Vector will not connect to a 5GHz network, which can be a bit of a hassle.
In the Vector app, there’s a sensory feed at the top so you can see how stimulated Vector is and what he’s taking in from the environment at any given time, and there’s a bar at the bottom that lets you know what he’s doing. He’ll often explore or listen for music on his own, among other tasks.
There’s also a “Stats” section where you can see Vector’s lifetime sensory score along with details on how far he’s driven, how often you’ve used his wake word, how many seconds he’s been petted, and how many utilities you’ve used.
A “Things to Try” portion of the app outlines all of Vector’s capabilities at the current time organized by section, while a Settings portion of the app lets you adjust Vector’s eye color, change his volume, and set preferences for things like temperature, language, and time.
AI Capabilities and Behaviors
As mentioned above, Vector is an autonomous robot, so while he’ll interact with you whenever you’re around, he also keeps himself entertained, and he’s smart enough to learn routines.
When Vector hears me wake up and come into the office in the morning, he too wakes up from sleep mode and comes off of his charger to explore the room on his own, venturing to areas that are close to his charging base.
On occasion he’ll listen for music playing in the room and dance along to it, and when he sees me or another person, he gets excited and offers up a greeting that sometimes includes a little fist bump, raising his arm and asking you to tap it.