Difference between television and data

Why is it that we can view television in real time, which is video, but when it comes to viewing streaming video on the Internet or across a network, bandwidth becomes such a factor? I guess what I don't understand is how a television signal differs from data packets in terms of bandwidth. For instance, cable television versus high-speed cable Internet.

Why is it that the data rate can potentially fluctuate, but we don't really have the same problem with television signals?

Just a little confused. Thanks for any help.
Asked by andreacadia

5 Solutions
public Commented:
NTSC uses channels about 6 MHz wide. The rate does not fluctuate; it is fixed by design.
High-speed Internet in the US is not very high speed compared to Asia. Entrenched providers have thus far obstructed any meaningful broadband deployment.
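For a rough sense of scale, here is a back-of-envelope sketch (my own assumed figures, not exact specs) comparing one analog TV channel with the same 6 MHz of cable spectrum carrying data:

# Rough comparison (assumed typical values): one 6 MHz cable channel carrying
# analog NTSC vs. the same 6 MHz carrying data with 64-QAM modulation,
# roughly what an early DOCSIS downstream does.

CHANNEL_WIDTH_HZ = 6_000_000   # one NTSC / cable-modem downstream channel
BITS_PER_SYMBOL = 6            # 64-QAM encodes 6 bits per symbol
SYMBOL_RATE = 5_056_941        # approx. DOCSIS 64-QAM symbol rate (symbols/s)

raw_data_rate = SYMBOL_RATE * BITS_PER_SYMBOL   # ~30 Mbit/s raw, before overhead
print(f"channel width        : {CHANNEL_WIDTH_HZ / 1e6:.0f} MHz either way")
print("analog NTSC channel  : exactly 1 TV program, rate fixed by the standard")
print(f"64-QAM data channel  : ~{raw_data_rate / 1e6:.0f} Mbit/s, shared by every modem on the segment")

Either way the channel itself is a fixed-size pipe; the fluctuation you see on the Internet side comes from sharing that pipe and from everything upstream of it, not from the cable.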
 
Fatal_Exception (Systems Engineer) Commented:
And, you must realize that cable TV signals originate locally, at a headend where the satellite dishes are located. From that point the signal is pushed to your TV across high-speed optical lines and heavy-gauge copper, and amplified (on the copper runs) about every 1,000 feet.

Your streaming Internet video must pass through many routers to get from the original server to your location, sometimes around the world. Packets become fragmented and sometimes lost, and those packets must be resent. This will certainly cause choppy video on your end.
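Here is a toy sketch of that effect (the numbers are made up purely for illustration): a second of video split into packets, sent over a path that drops a small fraction of them, where every drop costs an extra round trip before the resend arrives.

import random

# Toy model (hypothetical numbers): one second of video split into packets,
# sent over a path that loses a small fraction of them. Each loss has to be
# noticed and resent, which costs roughly one extra round trip.

random.seed(1)
PACKETS_PER_SECOND = 300   # packets needed for one second of video
LOSS_RATE = 0.02           # 2% of packets lost somewhere along the path
ROUND_TRIP_SEC = 0.15      # time to detect the loss and receive the resend

resends = sum(1 for _ in range(PACKETS_PER_SECOND) if random.random() < LOSS_RATE)
extra_delay = resends * ROUND_TRIP_SEC

print(f"{resends} packets had to be resent")
print(f"that second of video arrived roughly {extra_delay:.2f} s late")

The cable TV signal never does any of this: a damaged bit of picture is simply shown damaged and forgotten.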

Anyway, this is the easiest explanation I can offer without getting too technical.

FE
 
JonSh Commented:
I'll also add that there is a lot more raw data being shoved at a computer monitor than at a TV screen, image for image. This is why video hardware for PCs is expensive relative to video for game consoles (the PlayStation is a good example).

Raw data=bandwidth commitment :)
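To put some rough numbers on that (assumed resolutions, everything uncompressed):

# Rough comparison of uncompressed video rates (assumed typical resolutions):
# a TV-sized picture vs. a common PC desktop mode.

def raw_bitrate_mbps(width, height, bits_per_pixel, fps):
    # uncompressed video bit rate in Mbit/s
    return width * height * bits_per_pixel * fps / 1e6

tv = raw_bitrate_mbps(640, 480, 24, 30)    # roughly what an NTSC frame digitises to
pc = raw_bitrate_mbps(1024, 768, 32, 60)   # a typical PC display mode

print(f"TV-sized picture : ~{tv:.0f} Mbit/s uncompressed")
print(f"PC desktop       : ~{pc:.0f} Mbit/s uncompressed")

Real streams are heavily compressed, of course, but the starting point is several times larger for the PC picture.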

 
mazzl Commented:
Television uses different frequencies than your Internet signal, which gives it more bandwidth. Also, the television signal is not digital, which means that if you "miss" a bit of data you just see a small flicker or similar glitch, while for the Internet you need to receive all of the information.
Technically, if you used all of the frequencies for Internet access it would be a lot faster, but in fact radio and TV signals ride on the same cable and use up most of the frequencies.
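A rough way to see how much of the cable the TV channels occupy (assumed figures for a typical coax plant):

# Rough sketch (assumed figures): downstream spectrum on a typical coax plant
# runs from about 54 MHz to about 750 MHz, carved into 6 MHz slots. Most slots
# carry TV channels; only a few are handed to the cable-modem system.

PLANT_SPECTRUM_MHZ = 750 - 54        # usable downstream spectrum, assumed
CHANNEL_WIDTH_MHZ = 6
DATA_RATE_PER_SLOT_MBPS = 30         # ~64-QAM per 6 MHz slot, as above

total_slots = PLANT_SPECTRUM_MHZ // CHANNEL_WIDTH_MHZ
print(f"{total_slots} six-MHz slots fit on the plant")
print(f"if every slot carried data: ~{total_slots * DATA_RATE_PER_SLOT_MBPS} Mbit/s")
print("in practice most slots are TV channels, so the modems share only a handful")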
 
cagri Commented:
OK, let me try to explain as far as my knowledge goes:

1. The TV broadcast on the cable is carried as an analog signal: the video/audio signal is modulated onto a base frequency and carried down to your TV set. When you want to carry Internet traffic through the same cable, you still have to use an analog signal to carry the data, so a conversion comes into play: your digital signal is converted to analog, put on the cable, and the reverse is done at your cable modem. This limits the available capacity while adding a considerable amount of overhead traffic (see the rough sketch after these two points).

2. Even more, beyond the cable technology, as others mentioned, you are generally trying to carry video traffic between continents (or at least between cities), while the TV signal is only carried along your street.
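To give a feel for the overhead side of point 1 (assumed typical header sizes, ignoring the modulation itself):

# Rough feel for per-packet protocol overhead (assumed typical header sizes):
# every payload byte that crosses the cable drags Ethernet + IP + TCP headers
# with it, and acknowledgements flow back the other way on top of that.

MTU = 1500           # typical Ethernet payload size in bytes
IP_HEADER = 20
TCP_HEADER = 20
ETH_OVERHEAD = 18    # Ethernet header plus frame check sequence

payload = MTU - IP_HEADER - TCP_HEADER
wire_bytes = MTU + ETH_OVERHEAD
overhead_pct = 100 * (wire_bytes - payload) / wire_bytes

print(f"{payload} useful bytes ride inside every {wire_bytes}-byte frame")
print(f"roughly {overhead_pct:.0f}% of each frame is headers, before any resends or acks")

An analog TV channel carries none of this framing; the picture itself is the signal.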

My 2 cents.
 
Fatal_Exception (Systems Engineer) Commented:
Split em up...
 
JonSh Commented:
I'd recommend it be closed with no points awarded.