Computer Vision-Based Focus Tracking PyQt Application

Introduction

In today’s fast-paced digital world, maintaining focus at work is increasingly challenging due to the constant barrage of distractions, particularly from mobile phones. To address this issue, a computer vision-based application can be developed to monitor and measure the percentage of a user’s focus on work. The application uses the YOLOv4-tiny model for mobile phone detection and Mediapipe for eye detection to determine whether the user is focused on their work environment.

Project Overview

This PyQt application measures the user’s focus and calculates distraction time. The core functionalities are detecting whether the user is holding a mobile phone and checking whether the user is looking at their work environment. By combining these two signals, the application provides real-time feedback on the user’s focus level.
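The focus percentage described above can be computed by accumulating focused versus distracted time across frames. A minimal sketch (the class name and the rule "focused = no phone and eyes on screen" are assumptions based on the overview, not the project's actual code):

```python
class FocusTracker:
    """Accumulates focused vs. distracted time from per-frame observations."""

    def __init__(self):
        self.focused_seconds = 0.0
        self.distracted_seconds = 0.0

    def update(self, phone_detected, eyes_on_screen, frame_duration):
        # A frame counts as focused only when no phone is held
        # AND the eyes are on the work environment.
        if eyes_on_screen and not phone_detected:
            self.focused_seconds += frame_duration
        else:
            self.distracted_seconds += frame_duration

    def focus_percentage(self):
        total = self.focused_seconds + self.distracted_seconds
        return 100.0 * self.focused_seconds / total if total else 0.0
```

In the PyQt application, `update` would be called once per captured frame (e.g. from a `QTimer` callback), and the percentage shown in the UI.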

Methodology

  1. Mobile Phone Detection:
  • The application uses the YOLOv4-tiny model, a lightweight yet powerful object detection model, to identify whether the user is holding a mobile phone. YOLOv4-tiny is chosen for its balance between accuracy and speed, making it suitable for real-time applications.
  2. Eye Detection:
  • Mediapipe, a robust library for real-time face and body landmark detection, is employed to locate the user’s eyes and estimate whether their gaze is directed at the work environment.
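Step 1 above can be sketched with OpenCV's DNN module, which loads Darknet-format YOLOv4-tiny weights. The file names are assumptions (point them at your local model files), and the helper below parses raw YOLO output rows, where "cell phone" is class 67 in the COCO label set:

```python
import numpy as np

CELL_PHONE_CLASS_ID = 67  # "cell phone" in the 80-class COCO label set


def load_phone_detector(cfg_path="yolov4-tiny.cfg",
                        weights_path="yolov4-tiny.weights"):
    """Load the YOLOv4-tiny network (paths are assumptions)."""
    import cv2  # imported lazily so the parsing helper below has no cv2 dependency
    return cv2.dnn.readNetFromDarknet(cfg_path, weights_path)


def phone_in_outputs(layer_outputs, conf_threshold=0.5):
    """Scan raw YOLO output rows ([cx, cy, w, h, objectness, 80 class
    scores]) for a confident 'cell phone' detection."""
    for output in layer_outputs:
        for row in output:
            scores = row[5:]
            class_id = int(np.argmax(scores))
            if class_id == CELL_PHONE_CLASS_ID and scores[class_id] > conf_threshold:
                return True
    return False
```

At runtime, each camera frame would be converted to a blob with `cv2.dnn.blobFromImage`, passed through the network, and the resulting output layers fed to `phone_in_outputs`.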
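Step 2 can be sketched with Mediapipe's Face Mesh solution: with `refine_landmarks=True` it also reports iris landmarks, so a simple horizontal gaze ratio between the eye corners can decide whether the user is looking roughly at the screen. The landmark indices follow the Face Mesh topology (33/133 for the left-eye corners, 468 for the left iris center); the thresholds are assumptions to tune for your camera setup:

```python
# Face Mesh landmark indices (left eye); iris points require refine_landmarks=True.
LEFT_EYE_OUTER, LEFT_EYE_INNER, LEFT_IRIS_CENTER = 33, 133, 468


def gaze_ratio(iris_x, corner_a_x, corner_b_x):
    """0.0 = iris at the first corner, 1.0 = at the second corner."""
    span = corner_b_x - corner_a_x
    return (iris_x - corner_a_x) / span if span else 0.5


def looking_at_screen(ratio, low=0.35, high=0.65):
    # Threshold band is an assumption; widen/narrow for your setup.
    return low <= ratio <= high


def eyes_on_screen(frame_rgb):
    """Run Face Mesh on one RGB frame; False when no face is found."""
    import mediapipe as mp  # lazy import keeps the heavy dependency optional
    with mp.solutions.face_mesh.FaceMesh(refine_landmarks=True) as mesh:
        result = mesh.process(frame_rgb)
        if not result.multi_face_landmarks:
            return False
        lm = result.multi_face_landmarks[0].landmark
        ratio = gaze_ratio(lm[LEFT_IRIS_CENTER].x,
                           lm[LEFT_EYE_OUTER].x,
                           lm[LEFT_EYE_INNER].x)
        return looking_at_screen(ratio)
```

For a real application, the `FaceMesh` object should be created once and reused across frames rather than per call; it is recreated here only to keep the sketch self-contained.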

By leveraging computer vision techniques with YOLOv4-tiny and Mediapipe, this PyQt application provides a practical solution for monitoring and improving user focus at work. The integration of these technologies allows for accurate and real-time detection of distractions, offering users valuable insights into their work habits and helping them stay more focused and productive.

Check the GitHub repo

Demo Video