Our Patent Portfolio
In the global health and wellness market, computer vision is now central to improving exercise, rehabilitation, and general fitness experiences. With more people relying on digital, at-home solutions, the key challenge is ensuring both accurate body tracking and intuitive, accessible interfaces. This blend of precision and thoughtful UI/UX can transform basic motion detection into a truly user-centric platform, making it much easier for individuals to achieve their fitness or therapeutic goals.
FitCam Health’s approach rests on three patented innovations that together define its AI Motion Guide. First, a “gauge” user interface calibrates camera alignment and user positioning, verifying details such as distance, angle, and environment before any motion analysis begins; this calibration step is what makes reliable engagement and high-fidelity tracking possible from the start. Second, a two-stage convolutional neural network (CNN) pipeline refines an initial pose estimate into more precise body information. Finally, analytics compare the captured movements against target poses, generating personalized feedback and progress benchmarks. By pairing the “gauge” UI/UX with this computer vision pipeline, FitCam Health offers a user experience that is both approachable and precise, supporting a defensible leadership position in AI-powered health and wellness.
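To make the three pieces concrete, the sketch below shows how they might compose into a single flow: a gauge check gates capture, a coarse pose estimate is refined, and the refined pose is compared against target positions to produce feedback. It is an illustrative sketch only; every function name, threshold, and joint value is a hypothetical stand-in, not the patented implementation.

```python
# Illustrative sketch: gauge calibration -> two-stage pose estimation -> feedback.
# All names, thresholds, and values are hypothetical stand-ins.
from typing import Dict, List, Tuple

Pose = Dict[str, Tuple[float, float]]  # joint name -> (x, y) in normalized image coordinates

def gauge_ready(distance_m: float, tilt_deg: float) -> bool:
    """Gauge step: confirm camera distance and angle before analysis begins."""
    return 1.5 <= distance_m <= 3.5 and abs(tilt_deg) <= 10.0

def estimate_pose_coarse(frame) -> Pose:
    """Stage 1 stand-in: the first CNN would infer basic pose information here."""
    return {"hip": (0.50, 0.55), "knee": (0.50, 0.75)}

def refine_pose(frame, coarse: Pose) -> Pose:
    """Stage 2 stand-in: a second CNN would refine the coarse estimate."""
    return {joint: (x, y - 0.01) for joint, (x, y) in coarse.items()}

def feedback(measured: Pose, target: Pose, tol: float = 0.05) -> List[str]:
    """Compare measured joints against target joints and flag large deviations."""
    tips = []
    for joint, (tx, ty) in target.items():
        mx, my = measured.get(joint, (tx, ty))
        if abs(mx - tx) > tol or abs(my - ty) > tol:
            tips.append(f"Adjust your {joint} position.")
    return tips or ["Good form."]

def run_motion_guide(frames, target: Pose, distance_m: float, tilt_deg: float) -> List[str]:
    """Gauge check gates capture; each frame flows through both stages, then analytics."""
    if not gauge_ready(distance_m, tilt_deg):
        return ["Reposition: keep the camera 1.5-3.5 m away and roughly level."]
    tips: List[str] = []
    for frame in frames:
        coarse = estimate_pose_coarse(frame)
        precise = refine_pose(frame, coarse)
        tips.extend(feedback(precise, target))
    return tips
```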

United States Patent / Dec. 6, 2022 / US 11,521,326:
Systems and Methods for Monitoring and Evaluating Body Movement
This patent discloses a system and method for automatically analyzing a user’s body position and motion using a two-stage convolutional neural network (CNN) pipeline. An initial CNN infers basic pose information from one or more images captured while the user performs a physical movement, and a subsequent CNN refines that pose data into more precise body information. The refined results are then used to generate and deliver personalized recommendations or feedback (such as posture correction) to the user, in real time or after the movement, enabling more accurate exercise monitoring, improved guidance, and potential integration with other coaching or authentication tools.
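As a rough illustration of the two-stage structure (not the claimed networks), the sketch below uses PyTorch to pair a first network that regresses rough joint coordinates from an image with a second network that refines them using both the image and the stage-1 output. The layer sizes, the 17-joint count, and the residual-correction design are assumptions made for the example.

```python
# Illustrative two-stage keypoint pipeline: stage 1 regresses rough joint
# coordinates from the image; stage 2 refines them using image features plus
# the rough estimate. Architecture details are assumptions for this sketch.
import torch
import torch.nn as nn

NUM_JOINTS = 17  # assumed joint count

class CoarsePoseNet(nn.Module):
    """Stage 1: infer basic pose information from the captured image."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, NUM_JOINTS * 2)  # (x, y) per joint

    def forward(self, image):
        return self.head(self.backbone(image)).view(-1, NUM_JOINTS, 2)

class RefinePoseNet(nn.Module):
    """Stage 2: refine the rough pose into more precise body information."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.refine = nn.Sequential(
            nn.Linear(32 + NUM_JOINTS * 2, 128), nn.ReLU(),
            nn.Linear(128, NUM_JOINTS * 2),
        )

    def forward(self, image, rough_pose):
        feats = self.features(image)
        joined = torch.cat([feats, rough_pose.flatten(1)], dim=1)
        delta = self.refine(joined).view(-1, NUM_JOINTS, 2)
        return rough_pose + delta  # refined pose = rough pose + learned correction

# Usage: rough = CoarsePoseNet()(frames); precise = RefinePoseNet()(frames, rough)
```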
United States Patent / Oct. 3, 2023 / US 11,776,421:
Systems and Methods for Monitoring and Evaluating Body Movement
This continuation patent builds on the prior motion-analysis technology by adding a calibration step that introduces a “gauge” to help users position themselves correctly for more accurate feedback. After capturing an initial set of images for calibration, the system presents a visual and/or audio gauge that tells the user whether camera distance, angle, and environment are set up properly; it then records further images or video of the user’s movement, computes a model of the user’s body, and compares the user’s positions and orientations against target poses. This calibration-first approach yields more precise recommendations and guidance, improving the overall body-movement evaluation and user feedback.
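One way to read the gauge concept is as a set of setup checks driven by the initial calibration images, each mapped to a user-facing prompt, with motion analysis proceeding only once every reading is in range. The sketch below is hypothetical: the field names, thresholds, and messages are invented for illustration and are not taken from the patent.

```python
# Illustrative calibration "gauge": each reading from the initial calibration
# frames maps to a pass/fail status plus a visual or audio prompt.
# All thresholds and messages are assumptions, not taken from the patent.
from dataclasses import dataclass
from typing import List

@dataclass
class GaugeReading:
    distance_m: float   # estimated user-to-camera distance
    tilt_deg: float     # estimated camera tilt from level
    brightness: float   # mean frame brightness, 0-255

def gauge_prompts(reading: GaugeReading) -> List[str]:
    """Translate out-of-range readings into setup prompts for the user."""
    prompts = []
    if not 1.5 <= reading.distance_m <= 3.5:
        prompts.append("Step back so your whole body is in frame.")
    if abs(reading.tilt_deg) > 10.0:
        prompts.append("Level the camera or prop up your phone.")
    if reading.brightness < 60.0:
        prompts.append("Add more light to the room.")
    return prompts

def calibrated(reading: GaugeReading) -> bool:
    """Motion analysis begins only once every gauge check passes."""
    return not gauge_prompts(reading)

# Example: gauge_prompts(GaugeReading(1.0, 2.0, 40.0))
# -> ["Step back so your whole body is in frame.", "Add more light to the room."]
```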
United States Patent / Nov. 19, 2024 / US 12,148,317:
Systems and Methods for Monitoring and Evaluating Body Movement
This continuation patent extends the body-movement analysis approach by emphasizing a two-stage CNN pipeline: images of the user’s body are captured, a first network infers pose information, a second network refines it, and the system generates specific recommendations based on the improved pose data. It covers both local and server-based embodiments (e.g., smartphones or remote computing devices) and further elaborates on integrating the refined pose estimates into a full-body model, ultimately helping users receive more precise, personalized feedback on how to correct or optimize their physical movements.
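The local and server-based embodiments can be pictured as two implementations of the same pose-refinement interface, one running on the smartphone and one delegating to a remote computing device. The class names and endpoint below are hypothetical placeholders, not the patent’s architecture.

```python
# Illustrative sketch of the local vs. server-based embodiments: the same
# pose-refinement interface can be satisfied on the device or by a remote
# computing device. Class names and the endpoint are hypothetical.
from abc import ABC, abstractmethod
from typing import List, Tuple

Pose = List[Tuple[float, float]]  # (x, y) per joint

class PoseRefiner(ABC):
    @abstractmethod
    def refine(self, frame: bytes, rough: Pose) -> Pose: ...

class OnDeviceRefiner(PoseRefiner):
    """Runs the second-stage refinement locally on the phone (stand-in logic)."""
    def refine(self, frame: bytes, rough: Pose) -> Pose:
        return [(x, y - 0.01) for x, y in rough]  # placeholder refinement

class RemoteRefiner(PoseRefiner):
    """Sends the frame and rough pose to a remote computing device."""
    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # hypothetical, e.g. https://api.example.com/refine

    def refine(self, frame: bytes, rough: Pose) -> Pose:
        # An HTTP or gRPC call would go here; omitted to keep the sketch local.
        raise NotImplementedError("Server-side refinement not shown in this sketch.")

def build_refiner(prefer_local: bool) -> PoseRefiner:
    """Choose an embodiment based on device capability or configuration."""
    return OnDeviceRefiner() if prefer_local else RemoteRefiner("https://api.example.com/refine")
```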