Articles & News

Keep up to date with what's happening at Companion, where tech, animal behavior, and product development meet.

AI & Tech Inflection Point: A potential step-change in animal health and awareness

By John Honchariw, Marty Becker, & Mike McFarland · May 06, 2023

Animal care is hard for every loving, thoughtful clinician. They typically get to see their patients once a year and can’t ask the dog how they are feeling. With the advent of a new class of devices, this is about to change for both the clinician and the pet parent. 

Companion is the first in a new class of devices that includes an AI chip powerful enough to enable autonomous interactions with animals. Specifically, these devices can use computer vision at superhuman speeds to understand how a dog moves and acts in front of them (sitting, looking anxious, etc.). Dogs communicate primarily through movement and posture – exactly what this type of product is designed to understand and record with its camera and AI processor. The device can interact back in ways the dog naturally understands, using lights, sounds (e.g., the pet parent's voice), and food.

Because the device can regularly “request” motions and postures from the dog (sit, stay, down, etc.), it essentially functions as a self-diagnostic. Every day, for hours at a time, it asks the dog to engage its muscles, joints, tendons, and senses, so that the device, with its perfect memory, is positioned to notice the instant something changes – such as the dog not performing a familiar motion, or performing it differently. A dog can’t say “I’m in pain”; it “tells” you by subtly changing how it moves. These devices can spot this with the perfect memory and consistency of a machine.
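
To make the idea concrete, the variance-spotting described above can be sketched as a simple per-dog baseline check. This is purely illustrative – the function name, the response-time input, and the 3-sigma threshold are our assumptions for the sketch, not Companion's actual method:

```python
import statistics

def flag_movement_change(baseline_times, todays_time, threshold=3.0):
    """Flag when today's response time deviates sharply from this dog's own history.

    baseline_times: past response times (seconds) for one requested motion, e.g. "sit".
    todays_time: today's response time for the same motion.
    Returns True when today's value is more than `threshold` standard
    deviations away from the dog's historical mean.
    """
    mean = statistics.mean(baseline_times)
    stdev = statistics.stdev(baseline_times)
    if stdev == 0:
        # A perfectly consistent history: any change at all is notable.
        return todays_time != mean
    z_score = abs(todays_time - mean) / stdev
    return z_score > threshold
```

For example, a dog that has always taken about one second to sit, but today takes three, would be flagged for a closer look, while normal day-to-day variation would not.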

This superhuman ability to spot variance in movement is just one of a number of new possibilities these devices open up. The others include:

  • Candid Observation: watching dogs at ease in the home
  • Daily Data: rich data collected for hours each day vs. once a year
  • Direct Observation: watching directly for physical changes
  • Unknown Unknowns: new patterns we can’t currently imagine

Candid Observation // Dogs change their behavior and movements when anxious, such as when they visit veterinary clinics. The ability for a clinician to view an animal's candid movements – in-home – will radically change the clarity and certainty of the information they can gather about the animal's state. This ability alone would change the field.

Daily Data // Each day, a Companion can generate hours of video, or hundreds of thousands of still pictures, for analysis and later comparison. This not only makes it much easier to catch slowly evolving conditions but naturally saves everything for posterity. You can imagine very simple uses for this, such as observing slight coat color changes over time, as well as much more complex ones, such as watching for minute changes over months in average appetite and gusto for food (potentially indicating a deeper condition), simply by observing, with a perfectly consistent eye, how quickly an animal eats the first handful of treats dispensed.
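
The months-long appetite comparison above is a different kind of check from spotting a sudden change: it looks for slow drift. A minimal sketch, assuming one daily eating-speed measurement per dog – the function name, the 30-day window, and the 25% drift cutoff are all illustrative assumptions:

```python
def detect_slow_drift(daily_eating_secs, window=30, drift_ratio=0.25):
    """Compare a recent window of daily measurements against the long-term baseline.

    daily_eating_secs: seconds to finish the first handful of treats, one value per day.
    Returns True when the recent average has drifted more than `drift_ratio`
    (25% by default) away from the earlier baseline average.
    """
    if len(daily_eating_secs) < 2 * window:
        return False  # not enough history to establish a baseline yet
    baseline = sum(daily_eating_secs[:-window]) / (len(daily_eating_secs) - window)
    recent = sum(daily_eating_secs[-window:]) / window
    return abs(recent - baseline) / baseline > drift_ratio
```

A dog that gradually slows from ten seconds to fourteen seconds per handful would trip this check even though no single day looks abnormal – exactly the kind of signal a once-a-year visit cannot catch.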

Direct Observation // As my colleague Mike McFarland put it, almost all diagnosable conditions have some form of visual element. In other words, it is hard to imagine a future where a system like this, with a camera, wouldn’t have some ability to catch signals of a majority of the most common health conditions:

  • Changes in body shape (e.g., daily BCS measurement)
  • Changes in posture (e.g., neck/back changes can signal a neurological condition)
  • Changes in movement (e.g., favored joint(s) in early OA)
  • Changes in behavior (e.g., infrequent nips at a paw)

With infinite patience and perfect memory, these devices can spend thousands of hours collecting data and then spot things we would never have had the time, skill, or training to see.

Unknown Unknowns // We do not know the full scope of what these devices make possible, because we’ve never had this type of rich, longitudinal data on millions of dogs. There may be only a handful of novel patterns that emerge, or there may be a vast number of patterns and signals that we have never had the breadth of data and “aperture” to see. For example, observing patterns of eating, movement, and energy might provide insight into animal mental health (e.g., depression) or surface deeply hidden signals of longer-term conditions (e.g., cancer). By definition, we don’t know, but the possibilities are quite exciting.

There have never been devices that can have a “conversation” with a dog for hours a day, in an environment the dog is comfortable in, and perfectly remember the substance of all those conversations. Given that dogs “converse” and signal health issues in the same way (motion, posture, etc.), we believe these devices are going to radically change what it is possible to know about dogs’ health and wellness.


John Honchariw, Marty Becker, DVM, & Mike McFarland, DVM