Title: Image-based activity monitoring of pigs
Authors: Jan-Hendrik Witte, Jorge Marx Gómez
Date: 2024-04-08
Type: Text/Conference Paper
Language: English
ISBN: 978-3-88579-738-8
ISSN: 1617-5468
URL: https://dl.gi.de/handle/20.500.12116/43868
Keywords: precision livestock farming; deep learning; computer vision; activity monitoring

Abstract: In modern pig livestock farming, animal well-being is of paramount importance. Monitoring activity is crucial for early detection of potential health or behavioral anomalies. Traditional object tracking methods such as DeepSort often falter due to the pigs' similar appearances, frequent overlaps, and close-proximity movements, making consistent long-term tracking challenging. To address this, our study presents a novel methodology that eliminates the need for conventional tracking to capture activity at the pen level. Instead, we segment video frames into predefined sectors, where pig postures are determined using YOLOv8 for pig detection and EfficientNetV2 for posture classification. Activity levels are then assessed by comparing sector counts between consecutive frames. Preliminary results indicate discernible variations in pig activity throughout the day, highlighting the efficacy of our method in capturing activity patterns. While promising, this approach remains a proof of concept, and its practical implications for real-world agricultural settings warrant further investigation.
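
The pen-level activity measure described in the abstract (sector-wise counts compared across consecutive frames) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid dimensions, the posture labels, and the use of the sum of absolute per-sector count differences as the activity proxy are assumptions made here for clarity, and the upstream YOLOv8 detection and EfficientNetV2 posture classification are represented only by their assumed outputs (bounding-box centres with posture labels).

from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Detection:
    """One detected pig in a frame: bounding-box centre (pixels) and posture label."""
    cx: float
    cy: float
    posture: str  # e.g. "lying" or "standing"; label set is an assumption, not from the paper

def sector_of(det: Detection, frame_w: int, frame_h: int,
              cols: int = 4, rows: int = 2) -> Tuple[int, int]:
    """Map a detection to one sector of a cols x rows grid laid over the frame.
    The 4x2 grid is an illustrative assumption; the paper only states that frames
    are segmented into predefined sectors."""
    col = min(int(det.cx / frame_w * cols), cols - 1)
    row = min(int(det.cy / frame_h * rows), rows - 1)
    return (col, row)

def sector_counts(dets: List[Detection], frame_w: int, frame_h: int) -> Dict[Tuple[int, int], int]:
    """Count detected pigs per sector for a single frame."""
    counts: Dict[Tuple[int, int], int] = {}
    for det in dets:
        key = sector_of(det, frame_w, frame_h)
        counts[key] = counts.get(key, 0) + 1
    return counts

def activity_score(prev: Dict[Tuple[int, int], int],
                   curr: Dict[Tuple[int, int], int]) -> int:
    """Compare sector counts of two consecutive frames. The sum of absolute
    per-sector differences is used here as the activity proxy; this is one
    plausible choice, not necessarily the metric used by the authors."""
    sectors = set(prev) | set(curr)
    return sum(abs(curr.get(s, 0) - prev.get(s, 0)) for s in sectors)

Per-frame scores obtained this way could then be aggregated over longer windows (e.g. hourly) to expose the variation in activity over the course of the day that the abstract reports.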