Okay, let’s talk about this little project I did on tracking people moving around in public spots. It wasn’t for anything official, just me messing around to see if I could get it working.
Getting Started – The Idea
So, I was sitting watching people walk through a plaza one day, and the thought just popped into my head: how hard would it be to track their paths automatically using a camera? Seemed like an interesting challenge to tackle. I wasn’t aiming for super high-tech stuff, just something basic to see the flow of foot traffic.
Setting Up the Gear
First things first, I needed a camera. Didn’t want to spend much, so I dug out an old webcam I had lying around. Found a spot overlooking a moderately busy walkway near my place – not too crowded, not too empty. Getting the right angle was a bit fiddly. I propped the webcam up high, trying to get a wide view without too much distortion. Plugged it into my laptop, which I set up nearby, trying to be inconspicuous.
Recording Some Footage
Once the camera was positioned, I started recording. Just let it run for a good hour or so during the afternoon. Needed enough raw material, you know, actual people walking back and forth, sometimes stopping, sometimes changing direction. Captured a bunch of video files.
Trying to Make Sense of It – The Tracking Part
Alright, this was the core bit. How to actually track? I didn’t jump straight into complex software. My first thought was basic motion.
- Step 1: Spotting Movement. I started by trying to figure out what was moving compared to the background – basically comparing video frames to see what changed (the rough idea behind what’s usually called frame differencing or background subtraction). This helped isolate the people from the static stuff like benches and trees.
- Step 2: Identifying ‘Blobs’. Once I could see the moving parts, I tried to group pixels together that belonged to the same person. Basically, drawing rough shapes or ‘blobs’ around the moving things.
- Step 3: Following the Blobs. Then, the tricky part: trying to follow these blobs from one frame to the next. Giving each blob some kind of temporary ID and trying to match it in the next frame based on position and maybe size.
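To make the three steps a bit more concrete, here’s a minimal sketch of the Step 3 part – following blobs from frame to frame by matching each one to the nearest blob in the previous frame. The blob centroids below are made-up inputs standing in for whatever the motion-detection step would produce, and the distance threshold is just a guess:

```python
import math

def match_blobs(tracks, detections, max_dist=50.0):
    """Match detected blob centroids to existing tracks by nearest
    distance; unmatched detections become new tracks.

    tracks:     dict of {track_id: (x, y)} from the previous frame
    detections: list of (x, y) centroids found in the current frame
    max_dist:   how far a blob may move between frames and still count
                as the same person (a placeholder; tune per setup)
    """
    next_id = max(tracks, default=-1) + 1
    updated = {}
    unclaimed = list(detections)
    # Greedy nearest-neighbour matching: each existing track grabs
    # the closest unclaimed detection within max_dist.
    for tid, (tx, ty) in tracks.items():
        if not unclaimed:
            break
        best = min(unclaimed, key=lambda d: math.dist((tx, ty), d))
        if math.dist((tx, ty), best) <= max_dist:
            updated[tid] = best
            unclaimed.remove(best)
    # Anything left over is treated as a newly appeared person.
    for det in unclaimed:
        updated[next_id] = det
        next_id += 1
    return updated

# Two people tracked across two frames; a third person appears.
frame1 = match_blobs({}, [(10, 10), (100, 100)])
frame2 = match_blobs(frame1, [(14, 12), (103, 98), (200, 50)])
```

This greedy nearest-match idea is also exactly where the ID-swapping trouble comes from later: when two people get close, the “nearest blob” for one track can easily be the other person.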
Hitting Roadblocks – The Challenges
Man, this wasn’t as straightforward as it sounded in my head. Lots of things went wrong.
Lighting was a pain. Clouds moving, shadows shifting… it really messed with telling the background from the foreground. Sometimes a shadow moving looked like a person!
People crossing paths. When two people walked close or crossed, the system got totally confused. It would merge their ‘blobs’ or swap their IDs. Untangling that was a nightmare.
Losing track. If someone stopped for too long, the system sometimes decided they were part of the background and just forgot about them. Or I’d lose them if they moved too fast near the edge of the view.
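That “becoming part of the background” behaviour falls out naturally if the background model is a running average of recent frames, which is one common basic approach. A sketch of the idea using a single pixel value rather than full frames (the learning rate here is invented, not anything I measured):

```python
def update_background(bg, frame, alpha=0.05):
    """Running-average background model for one pixel value.
    alpha is the learning rate: higher means the background adapts
    faster, which also means a stationary person gets absorbed sooner.
    """
    return (1 - alpha) * bg + alpha * frame

# A person (bright pixel, 200) stands still over empty pavement (50).
bg = 50.0
for _ in range(100):  # roughly 100 frames of them not moving
    bg = update_background(bg, 200.0)
# The background estimate creeps toward the person's brightness, so
# the "motion" signal (frame minus background) shrinks until the
# tracker effectively stops seeing them.
```

The flip side of the same trade-off: a very low alpha keeps stationary people visible longer, but then shifting shadows and lighting changes linger in the difference image instead.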
Stuff that wasn’t people. Bikes, dogs, even blowing trash sometimes got flagged as moving objects. Telling a person apart from a large dog wasn’t always easy for my simple setup.
Tweaking and Adjusting
I spent quite a bit of time fiddling with settings. Tried adjusting the sensitivity for motion detection. Played around with the size filters to ignore really small moving things (like that blowing trash). Looked up some of the simpler approaches people online were using for basic tracking and tried to adapt their ideas. It was mostly trial and error. Changed some parameters, ran the video, saw what broke, changed something else.
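The size filter itself is about as simple as it sounds: throw away blobs whose pixel area falls outside a plausible person range. A minimal sketch – the thresholds are placeholders, since the right values depend entirely on camera height and resolution:

```python
def filter_blobs(blobs, min_area=800, max_area=20000):
    """Keep only blobs whose pixel area could plausibly be a person.
    blobs: list of (width, height) bounding boxes in pixels.
    Thresholds are placeholders; tune them for your camera setup.
    """
    return [(w, h) for (w, h) in blobs
            if min_area <= w * h <= max_area]

blobs = [(5, 8),      # blowing trash: 40 px, dropped
         (40, 90),    # person-sized: 3600 px, kept
         (300, 200)]  # scene-wide shadow: 60000 px, dropped
kept = filter_blobs(blobs)
```

Of course, this is also why the large dog was hard to reject – its blob area sits comfortably inside the “person” range, so area alone can’t tell them apart.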
The End Result – Kinda Working!
After all that tinkering, I got to a point where it sort of worked. On the screen, I could see rough outlines moving around, generally following the people in the video. It wasn’t perfect by any means. Lots of mistakes, lost tracks, and weird jumps. But, you could definitely see the basic paths people were taking across the area. For a homebrew experiment with basic gear, I felt pretty okay about it. It showed the concept, even with all its flaws. It was a good learning exercise, mostly showing me how complex even seemingly simple tasks can get in the real world.