Thread: Recording TV with camera

  1. #1
    Citizen of Awesometown the_jackass's Avatar
    Join Date
    Oct 2014
    Location
    Awesometown
    Posts
    269

    Recording TV with camera

    This question might be a bit out of place here, but anyways... I recorded a Pokemon episode from TV today with my ancient PSP (640x480 at 15 fps, the best video camera I have... I know this sucks), just for the fun of it, ya know. Both TVs I have are of the "box" variety, i.e. the picture-tube type. The recorded video had a lot of flicker, and sometimes large parts of the screen went in phase with the "dark" time of the TV (= a big swathe of the video went black for a few seconds occasionally). All this crap vanished when I reduced the brightness and contrast on my TV, but the video still looks crappy. Also, I'd built a stand so the camera stayed fixed in place without any "shaking".

    Now I'm asking these mostly out of curiosity... Would this effect, by any chance, be gone if the TV's "framerate" is made much higher? I've got a new LCD monitor at home. If I somehow play TV on this monitor, would it create better video recordings? Oh, and what's the framerate for new box-type TVs? If I'm able to record decent quality videos from TV with the stuff I have, it'll be a cool thing lol.
    "Highbrow philosophical truth: Everybody is an ape in monkeytown" --Oscar Wilde

  2. #2
    Registered User
    Join Date
    Sep 2004
    Location
    California
    Posts
    3,268
    Quote Originally Posted by the_jackass View Post
    Would this effect, by any chance, be gone if the TV's "framerate" is made much higher?
    No, it would not. Most TVs and monitors run at a refresh rate of 30 or 60 Hz. This is already faster than your camera's capture rate. The effect is compounded by the fact that your box TV is almost certainly interlaced (it redraws only half the image per cycle). If you were to somehow increase the TV's refresh rate to something like 200 Hz, that would increase the number of redraws per second, and thus increase the chance that the camera captures an image during a redraw. If you want to minimize this effect, you should use a TV that does progressive scan (instead of interlaced) and also runs at a lower refresh rate.
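    To put very rough numbers on it (a simplified sketch: 15 fps is the PSP rate from your post, 59.94 Hz is the standard NTSC field rate, and exposure time is ignored entirely), the small mismatch between the two rates produces a slow "beat", which is the dark band drifting through your recording:

    Code:
    /* Rough beat-frequency estimate between a ~15 fps camera and a
       ~59.94 Hz CRT field rate.  Purely illustrative. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double camera_fps = 15.0;    /* PSP capture rate (from the post)    */
        double tv_fields  = 59.94;   /* NTSC field rate                     */

        /* Fold the field rate onto the nearest multiple of the capture
           rate; the leftover shows up as a slow rolling brightness beat. */
        double nearest = camera_fps * floor(tv_fields / camera_fps + 0.5);
        double beat    = fabs(tv_fields - nearest);

        printf("beat: %.2f Hz -> the dark band drifts through roughly "
               "every %.0f seconds\n", beat, 1.0 / beat);
        return 0;
    }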

    Of course, your best course of action would be to use a capture device that can take in the TV's analog outputs. Something like this.
    bit∙hub [bit-huhb] n. A source and destination for information.

  3. #3
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    There are three things going on.

    The first is the synchronization between the TV frame rate and the camera capture frame rate, as you already suspected. Think of this in terms of Nyquist's sampling theorem: you need to sample the signal (the TV image) at least twice as fast as its fastest variation. In other words, you have to capture at the TV frame rate or higher.

    If you capture at precisely the TV frame rate, you are "critically sampling", and phenomena can occur such as the complete vanishing of signal components that sit right at the Nyquist frequency. As you sample at higher frequencies this effect disappears. If you sample at a LOWER frequency than the TV frame rate, you get "temporal aliasing", which causes higher-frequency variations in the image to alias into lower frequencies. This is the source of the "backward turning wheel spokes" in some footage. It doesn't sound like that is what is happening here, though.
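    If you want to see the folding numerically, here is a tiny sketch (the 25 Hz component is an invented example; 15 fps and 59.94 Hz are the camera and NTSC field rates mentioned earlier in the thread):

    Code:
    /* Temporal aliasing: a component at f, sampled at fs, shows up at the
       folded frequency in [0, fs/2].  Example numbers only. */
    #include <stdio.h>
    #include <math.h>

    /* Apparent frequency after sampling at fs. */
    static double aliased(double f, double fs)
    {
        double r = fmod(f, fs);
        return (r > fs / 2.0) ? fs - r : r;
    }

    int main(void)
    {
        double fs = 15.0;                                    /* capture rate */
        printf("25 Hz component -> %.2f Hz\n", aliased(25.0, fs));  /* 5.00 */
        printf("59.94 Hz fields -> %.2f Hz\n", aliased(59.94, fs)); /* 0.06 */
        return 0;
    }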

    The second factor is the alignment of the pixel grid of the TV with the pixel grid of the camera. This is another kind of aliasing: not temporal, but spatial. It can produce wave-like interference patterns in the captured image, or Moire-style patterns when the orientations of the two grids do not line up.
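    A rough illustration (both pitches are invented numbers): two grids whose pitches differ only slightly beat against each other with a period much longer than either pitch, and that long-period beat is the visible band:

    Code:
    /* Spatial beat (Moire) between two nearly-matched pixel grids.
       Pitches are invented, in arbitrary units. */
    #include <stdio.h>

    int main(void)
    {
        double tv_pitch  = 1.00;   /* TV pixel/phosphor pitch              */
        double cam_pitch = 0.93;   /* camera pixel pitch projected onto TV */

        /* Beat period = p1 * p2 / |p1 - p2| */
        double beat = tv_pitch * cam_pitch / (tv_pitch - cam_pitch);

        printf("Moire bands repeat about every %.1f TV pixels\n",
               beat / tv_pitch);
        return 0;
    }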

    The third is the closed-loop gain control inside the camera. This is a circuit that tries to estimate the overall scene brightness and adjusts things like aperture and exposure time to maintain a visible scene. This closed-loop control process has a fundamental rate of operation, so it can interact with the frame rate of the TV in strange ways. For instance, if the camera capture aligns closely but not exactly with vblank, there will be periods of time when the camera "sees nothing" and the closed-loop control drives the brightness to maximum. At other moments the camera sees the TV image and the controller drives the brightness down. So you may see an overall brightness modulation in the scene, which can be quite drastic. This is most likely what you are seeing. Turning down the brightness on the TV caused the closed-loop controller to behave itself a little better.
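    A toy model of that loop (every constant here is invented; it only shows the mechanism, not the PSP's actual controller):

    Code:
    /* Toy automatic-gain-control loop fighting the TV refresh: whenever a
       capture lands near blanking the camera "sees nothing" and the loop
       cranks the gain up, so the recorded brightness swings. */
    #include <stdio.h>

    int main(void)
    {
        double gain   = 1.0;   /* camera gain/exposure setting           */
        double target = 0.5;   /* brightness the controller aims for     */
        double k      = 0.4;   /* loop gain (invented)                   */

        for (int frame = 0; frame < 20; frame++) {
            /* Pretend every 4th capture lands in vertical blanking and
               sees an almost black screen. */
            double scene    = (frame % 4 == 3) ? 0.05 : 0.5;
            double recorded = scene * gain;

            /* Simple proportional control toward the target brightness. */
            gain += k * (target - recorded);
            if (gain < 0.1) gain = 0.1;

            printf("frame %2d: scene %.2f  recorded %.2f  gain -> %.2f\n",
                   frame, scene, recorded, gain);
        }
        return 0;
    }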
    Code:
    //try
    //{
    	if (a) do { f( b); } while(1);
    	else   do { f(!b); } while(1);
    //}

  4. #4
    Citizen of Awesometown the_jackass's Avatar
    Join Date
    Oct 2014
    Location
    Awesometown
    Posts
    269
    Ok thank you both for your ideas.

    So all this theory means I need to make the camera frame rate higher than the TV frame rate. That certainly isn't possible with the equipment I have.

    But I've recorded a gameplay video from a computer monitor (CRT) and it looked quite normal. Will I get progressive scanning if I'm somehow able to watch TV through my computer monitor? I know TV through a computer monitor is possible with a certain electronic box that I might be able to borrow from a friend.

    edit:

    Here's a still from that video. A Moire pattern is certainly visible.

    [Attachment: Recording TV with camera-vlcsnap-144229.jpg]
    Last edited by the_jackass; 12-16-2014 at 02:20 PM.
    "Highbrow philosophical truth: Everybody is an ape in monkeytown" --Oscar Wilde

  5. #5
    Citizen of Awesometown the_jackass's Avatar
    Join Date
    Oct 2014
    Location
    Awesometown
    Posts
    269
    Experimented a bit and found out that reducing just the contrast removes those phase issues. I didn't change the color and brightness this time, and the results are kinda better:

    [Attachment: Recording TV with camera-charizard2.jpg]
    "Highbrow philosophical truth: Everybody is an ape in monkeytown" --Oscar Wilde

  6. #6
    Citizen of Awesometown the_jackass's Avatar
    Join Date
    Oct 2014
    Location
    Awesometown
    Posts
    269
    I noticed another problem with my PSP. It corrupts videos that are longer than a certain length of time, so I'm forced to use DivFix++.
    Also, copying to the computer sometimes takes waaay longer than necessary, and when I copy the same file twice, the two copies have different MD5 checksums. wtf! These videos also freeze on a frame after seeking past a certain point, most probably because of the bad copying.

    This is making me really ........ing angry.

    Edit:
    Unrelated, I just found out about Meowth's song by searching for Meowth on this site. Cool!
    Last edited by the_jackass; 12-17-2014 at 01:03 PM.
    "Highbrow philosophical truth: Everybody is an ape in monkeytown" --Oscar Wilde

  7. #7
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    As far as I know, no computer monitors are interlaced. They're typically placed around a meter or so from the computer, so the signal only has to travel the length of a wire instead of a large distribution network spanning the entire country, so bandwidth (the primary reason behind interlacing, AFAIK) has never been an issue. I'm not an expert on the subject, but I do suppose it would help, since it gets rid of the interlacing issue.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  8. #8
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    Quote Originally Posted by Elysia View Post
    As far as I know, no computer monitors are interlaced. They're typically placed around a meter or so from the computer, so the signal only has to travel the length of a wire instead of a large distribution network spanning the entire country, so bandwidth (the primary reason behind interlacing, AFAIK) has never been an issue. I'm not an expert on the subject, but I do suppose it would help, since it gets rid of the interlacing issue.
    I believe you are correct. Computer monitors are all progressive scan as far as I know, also.

    If I'm not mistaken, there was at least one interlaced monitor very early on, and the flicker was very objectionable. Unlike analog TV signals, which tended to blend the interlaced images together, the bit-mapped characters and simple graphics had an annoying flicker as the odd and even lines alternated. It didn't survive very long.

    -

  9. #9
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    the primary reason behind interlacing, AFAIK
    O_o

    The bandwidth question wasn't really the issue.

    Sending 60 half-frames per second isn't meaningfully different than sending 30 full-frames per second.

    Soma
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  10. #10
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    I am going to disagree with that statement ("Sending 60 half-frames per second isn't meaningfully different than sending 30 full-frames per second."), but if we assume it was true back then, why send only half-frames and go to the trouble of inventing the whole interlacing system instead of just sending all 60 full frames?
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  11. #11
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    Quote Originally Posted by Elysia View Post
    I am going to disagree with that statement ("Sending 60 half-frames per second isn't meaningfully different than sending 30 full-frames per second."), but if we assume it was true back then, why send only half-frames and go to the trouble of inventing the whole interlacing system instead of just sending all 60 full frames?
    O_o

    What nonsense are you talking about?

    1): The production is 30 frames* per second one way or the other.

    2): The production does not send half the number of frames per second.

    Soma

    *) I don't want to discuss 29.97, PAL, or other standards specifically, so I'm just keeping it general.
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  12. #12
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    I don't know how it was back then, when interlacing was introduced, but I mean, today, 30 fps is much different from 60, depending on the source.
    I take it interlacing has something to do with the frequency of the voltage from the outlet. Therefore, if the source is 30, they would have to somehow "extend" it to 50/60. It still doesn't make sense, though, as they could just duplicate frames. So that leaves my question unanswered: if the source is 30 fps, then why bother with interlacing if bandwidth was not an issue?
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  13. #13
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    Quote Originally Posted by Elysia View Post
    So that leaves my question unanswered: if the source is 30 fps, then why bother with interlacing if bandwidth was not an issue?
    Quote Originally Posted by phantomotap View Post
    I find it strange that you can apparently conceive of the costs of overcoming bandwidth limitations in the existing infrastructure as being a factor, yet you can imagine no other limitations of the time that would have informed the decisions made during development.
    Quote Originally Posted by Elysia View Post
    Whatever then >_<
    From the Wikipedia entry on Interlaced Video (first sentence): "Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth."

    And this is why Wikipedia is not always a good source. Or, when it is, it pays to actually read the whole article.

    Interlaced video was introduced back then not really because of bandwidth limitations. Those limitations actually didn't exist; you could produce high frame rate video as far back as the 80s. The problem was that bandwidth was (and still is, just to a lesser extent) directly tied to the price of the equipment in the production chain. That is, video bandwidth isn't just a matter of buying fatter cables. It actually affects everything in the production chain, from recorders to displays.

    Instead, the real reason interlaced video was introduced was that there were no known methods for compressing the video signal, transporting the compressed signal through the analogue pipeline, and decompressing it at the customer end. These methods already existed for digital video.

    Digital video signals go as far back as the 70s, but they only became commercially viable in the 90s. The nature of the signal permits compression, and this is why interlaced video is no longer a viable option.
    Last edited by Mario F.; 12-18-2014 at 06:14 AM.
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  14. #14
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    Quote Originally Posted by Mario F. View Post
    From the Wikipedia entry on Interlaced Video (first sentence): "Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth."
    That's basically true. For the same reason, movie film frame rate is 24 frames per second, but each frame is shown twice, for a flicker rate of 48 Hz.

    Bandwidth for progressive 30 fps video would be the same as for interlaced 30 fps; both are 525 lines in 1/30th of a second. But the flicker would be noticeable if progressive. I'm not sure, but the slower scanning from screen top to bottom might also have been a problem.
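    To put numbers on "525 lines in 1/30th of a second" (nothing assumed beyond the figures already quoted):

    Code:
    /* Line rate implied by 525 lines per frame. */
    #include <stdio.h>

    int main(void)
    {
        double lines_per_frame = 525.0;
        printf("at 30 fps:    %.0f lines per second\n", lines_per_frame * 30.0);
        printf("at 29.97 fps: %.0f lines per second\n", lines_per_frame * 29.97);
        return 0;
    }

    which prints 15750 and 15734, the familiar NTSC horizontal line rates.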

    -

  15. #15
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    Quote Originally Posted by Elysia View Post
    I don't know how it was back then, when interlacing was introduced, but I mean, today, 30 fps is much different from 60, depending on the source.
    I take it interlacing has something to do with the frequency of the voltage from the outlet. Therefore, if the source is 30, they would have to somehow "extend" it to 50/60. It still doesn't make sense, though, as they could just duplicate frames. So that leaves my question unanswered: if the source is 30 fps, then why bother with interlacing if bandwidth was not an issue?
    Your question is good - bandwidth was an issue.

    The NTSC standard for video allowed 4.5 MHz of bandwidth (analog). Displaying the video at full resolution with progressive scanning would mean 30 progressive frames per second, and the amount of flicker present would be unacceptable. Without increasing the bandwidth, the frame rate could not be increased. So the frame rate was kept the same, but each frame was divided into two fields, one containing the even-numbered scan lines and one containing the odd. This doubles the flicker rate from 30 Hz to 60 Hz (see my last post about movie film projection flicker). The interlacing process itself did not change the bandwidth requirements; for a given frame rate they are the same, whether progressive or interlaced. What interlacing did was allow a higher (and unnoticeable) flicker rate.
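    If it helps to picture the even/odd split, here is a minimal sketch (the frame size is made up; real NTSC frames have 525 lines) showing that the two fields together carry exactly the lines of one frame, so nothing extra is sent, yet the screen gets refreshed twice per frame period:

    Code:
    /* Sketch: split one progressive frame into two interlaced fields.
       The frame size is made up; real NTSC uses 525 lines per frame. */
    #include <stdio.h>

    #define LINES 12   /* tiny frame, for illustration only */

    int main(void)
    {
        int frame[LINES];                    /* one entry per scan line */
        int even[LINES / 2], odd[LINES / 2];

        for (int i = 0; i < LINES; i++)
            frame[i] = i;

        /* Field 1 carries the even-numbered lines, field 2 the odd ones.
           Same total number of lines per frame, so the bandwidth is
           unchanged -- but the display is refreshed twice per frame
           period, doubling the flicker rate. */
        for (int i = 0; i < LINES / 2; i++) {
            even[i] = frame[2 * i];
            odd[i]  = frame[2 * i + 1];
        }

        printf("field 1 (even lines): ");
        for (int i = 0; i < LINES / 2; i++) printf("%d ", even[i]);
        printf("\nfield 2 (odd lines):  ");
        for (int i = 0; i < LINES / 2; i++) printf("%d ", odd[i]);
        printf("\n");
        return 0;
    }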

    The bandwidth of NTSC transmissions was based on the limitations of receiving circuitry and the need to fit a certain number of channels into the commercial TV transmission band. Interlacing was adopted instead of a higher frame rate.

    I just recalled an early attempt at adding information digitally to the bottom of TV images. This wasn't a digital transmission; the information was added digitally to the picture. The horizontal lines in the graphics, especially, had an annoying flicker. Interlacing does not work well when components of the image reside exclusively in only one field of the frame: they are there for one field, gone in the next, then back again in the one after; this repeats 30 times a second and is quite noticeable. This is also why the early interlaced computer monitors never caught on. A line of pixels would reside on only an even scan line or an odd one, and would noticeably flicker as the even and odd fields alternated.

    The 60 Hz power line frequency was also a factor in selecting the field rate. It was very difficult to remove power supply "ripple" from the TV receiver circuitry. This ripple showed up in the picture, but was unnoticeable as long as it was motionless. Matching the field frequency to the power line frequency removed the motion. The ripple manifests as a very slight warping of the picture. It can be seen in color broadcasts on CRT sets, since the field rate there is 59.94 Hz; the ripple passes through the picture approximately once every 16 seconds.
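    That 16-second figure is just the beat between the mains frequency and the colour field rate; a trivial check (nothing assumed beyond the two rates above):

    Code:
    /* Drift period of the mains ripple across an NTSC colour picture:
       the beat between 60 Hz mains hum and the 59.94 Hz field rate. */
    #include <stdio.h>

    int main(void)
    {
        double mains = 60.0, field = 59.94;
        printf("ripple drifts through the picture every %.1f s\n",
               1.0 / (mains - field));   /* ~16.7 s */
        return 0;
    }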

    -
    Last edited by megafiddle; 12-19-2014 at 02:39 AM.
