Alright, let me tell you about this little project I tackled recently – building a foot traffic counter that also showed heatmap data. It sounded cool, and I figured it might be useful to see where people actually walk and stop in a space, not just how many walk by.
So, first things first, I needed some hardware. I had an old Raspberry Pi 3 lying around collecting dust, so that was my starting point. Saved me some cash right there. For the camera, I just grabbed a basic USB webcam I had in a drawer. Nothing fancy, figured I’d start simple and see if it even worked before splashing out on anything better.
Getting the Pi set up wasn’t too bad. Flashed the latest Raspberry Pi OS onto an SD card, booted it up, ran the updates. Standard stuff. Plugged in the webcam, and thankfully, the OS recognized it straight away. Phew, dodged a bullet there; peripherals can be a real pain on these little boards.
Getting the Software Side Working
Now for the brains. I knew I’d need something to process the video feed and actually spot people. OpenCV is the usual go-to for this kind of image stuff, so I decided to wrestle with that. Took a while to get it installed on the Pi, lots of dependencies and compiling, you know how it is. Grabbed a coffee while that churned away.
My first goal was just counting people. I looked around at different ways to detect humans. Found some complex models, but decided to try the built-in HOG descriptor in OpenCV first. It’s older, maybe not the absolute best, but seemed simpler to get going. I wrote a basic Python script: grab a frame from the webcam, run the HOG detector, see if it finds anything that looks like a person.
Initially, it was kind of messy. It detected my office chair, a coat hanging on the door, sometimes even shadows. Lots of false alarms. And sometimes it missed me walking right past! I spent a good chunk of time tweaking the HOG parameters: the window stride (winStride in OpenCV), the padding, the scale factor. Felt like tuning an old radio, just fiddling until the static cleared a bit. Eventually I got it to a point where it was mostly detecting actual people.
To make it a counter, I drew an imaginary line across the middle of the video frame in my script. Then, I needed to track detected people briefly to see if they crossed that line. This part was tricky. I didn’t want to implement super complex tracking, so I did something basic: check whether the center of a detected person-box had crossed the line compared to the previous frame. It wasn’t perfect: it sometimes counted someone twice if they lingered near the line, or missed them entirely if they moved too fast. But hey, progress.
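The crossing check itself was about this simple. A sketch, with `crossed_line` as a hypothetical helper name and 240 assuming a 480-pixel-tall frame:

```python
LINE_Y = 240  # imaginary horizontal line across the middle of a 480px-tall frame

def crossed_line(prev_center, center, line_y=LINE_Y):
    """True when a tracked point moved from one side of the line to the other.

    prev_center is None on the first frame a person is seen, so they can't
    be counted until we have two positions to compare.
    """
    if prev_center is None:
        return False
    (_, prev_y), (_, cur_y) = prev_center, center
    return (prev_y < line_y) != (cur_y < line_y)
```

The double-counting problem comes straight out of this logic: someone hovering near `LINE_Y` can flip sides on consecutive frames, and each flip increments the counter.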
Adding the Heatmap Spice
The counter was okay-ish, but I really wanted that heatmap. The idea was to show where people spent the most time. So, every time the script detected a person, I grabbed the coordinates (specifically the bottom-center point) of the bounding box around them. I stored these coordinates in a big list.
After collecting points for a while, I needed to visualize them. I created a blank, black image the same size as my video frame. Then, for every recorded coordinate, I drew a small, semi-transparent circle or dot on this black image. Where lots of dots overlapped, it would get brighter. This sort of worked, but it wasn’t very colorful.
To make it look more like a proper heatmap, I used OpenCV’s colormap functions. I took the grayscale image created by the dots and applied a ‘jet’ or ‘hot’ colormap, so that areas with lots of traffic went from blue/green to yellow/red. Looked much better!
Finally, I overlaid this semi-transparent heatmap image onto the original live video feed. So you could see the camera view with this colorful glow showing the high-traffic areas. Looked pretty neat, actually.
Putting it All Together and Testing
I combined the counting logic and the heatmap generation into one script. Let it run pointed at my doorway for an afternoon.
- The counter: It was approximate. Definitely not security-grade accuracy, but it gave a rough idea of how many times someone passed through. Good enough for my initial goal.
- The heatmap: This was the cooler part. It clearly showed the path people took and where they tended to pause just inside the room. That visual was quite insightful.
Of course, it had limitations. Changes in lighting sometimes threw off the people detector. The simple tracking meant fast movements or crowded scenes would confuse it. And the Pi 3, while capable, was definitely working hard; the video feed wasn’t super smooth.
But overall, I got it working. It counts, it maps heat, and it was a fun exercise using tools I already had. It’s not a commercial product by any means, just a home-brewed attempt. Learned quite a bit getting the detection and visualization dialed in. Definitely a worthwhile weekend project.