Not sure if this is the best forum, but perhaps someone can help me out.
I am looking for information about frame rates when using standard consumer video cameras.
I understand that a normal camera will operate at 30 fps.
The application I am inquiring about is track and field timing. There are manufacturers of timing software that use standard video cameras, in conjunction with other hardware, to produce a video of the finish line with a time stamp. The way it works is that the clock is started electronically when the starter fires the gun. The video camera then records the finish of the race, and the signal is run through an analog-to-digital converter and into a computer, where a time stamp is overlaid on the footage. An operator steps through the finish frame by frame and notes the finish time for each competitor as they cross the finish line.
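For context, here is a minimal sketch of the frame-to-time arithmetic I assume such a system performs; the 30 fps figure and the function name are my own assumptions, not anything taken from the vendor's documentation:

# Hypothetical sketch: map a frame index to elapsed race time,
# assuming the clock starts at the gun and the camera runs at a fixed rate.

FRAMES_PER_SECOND = 30.0  # assumed rate of a standard consumer camera

def race_time_for_frame(frame_index: int, fps: float = FRAMES_PER_SECOND) -> float:
    """Elapsed seconds since the gun for the given frame index."""
    return frame_index / fps

# Example: a runner first appears across the line on frame 300
print(f"{race_time_for_frame(300):.3f}")  # 10.000 s at 30 fps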
The companies that promote these systems claim that they can produce 60 frames per second. I came across a note from one of them claiming, "An Eagle Eye digital timing system uses specialized deinterlaced software to capture and replay finish line video at 60 frames per second."
Here is some (sales) information about the technology in question (probably not necessary to read it):
http://www.eagleeyetrack.com/TimingCompare.htm
I do not completely understand the difference between interlaced and non-interlaced video and how that applies to this sort of technology.
What I need to know is: if software is able to split the video from 30 frames into 60 frames per second, does time actually elapse between frame 1 and frame 2, or do frames 1 and 2 both show different "parts" of the exact same moment in time?
For example, "Timmy" crosses the finish line on frame XXX and his time is displayed as 10.000 seconds. Then, "Jack" crosses the finish line on frame XXX+1...is his time actually 10.017 (1/60th of a second later), or did he actually finish in 10.033 (1/30th of a second later) but had the image cut up and split between two frames?
I am comparing these timing systems for a major presentation (several hundred people), and I want to make sure I understand the limitations of the video-based systems. The opposing technology CAN accurately produce 100-2000 fps, but I still need to know whether the video can produce 60 distinct moments per second or only 30 (i.e., how large the margin of error is). These companies boast the ability to time to 1/1000th of a second, rounded up to the next 1/100th of a second. Clearly, neither of those claims can be exactly true, but the difference between 0.017 accuracy and 0.033 accuracy is pretty significant.
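For my own comparison, here is the back-of-the-envelope worst-case resolution at each frame rate (my arithmetic, not the vendors' figures):

# Worst-case timing resolution: a runner can finish at most one frame interval
# before the frame on which they are first seen across the line.

for fps in (30, 60, 100, 2000):
    print(f"{fps:>5} fps -> worst-case resolution {1 / fps:.4f} s")

# 30 fps -> 0.0333 s, 60 fps -> 0.0167 s, 100 fps -> 0.0100 s, 2000 fps -> 0.0005 s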
Any thoughts? THANKS!