What is HD on TV? The Digital HD (HDV) Format

What is it?

High Definition Video (abbreviated HD) is a video standard that gives the viewer a higher-quality, that is, sharper, picture by increasing the resolution (the number of pixels) of the image on the playback device (TV, monitor, plasma or LCD panel). Hence its more common names: "high definition" or "high resolution" video.

What is High Definition Video?

In principle, any video content with a resolution of at least 1280x720 pixels can already be classified as high-definition video. At the same time, the format has its own varieties, resolutions and frame rates, described below.

What forms of HD are there?

The HD format is developing in two directions: HDV (High Definition Video), intended for playback from various media, and HDTV (High Definition Television), intended for broadcasting over cable, satellite and terrestrial television channels.

What resolutions are there?

Today the two main resolutions are HD1080 (1920x1080) and HD720 (1280x720).
Both have a frame (screen) aspect ratio of 16:9.
Video with a resolution of 1920x1080 pixels can use either progressive scan or interlaced alternation of the frame's fields, while video with a resolution of 1280x720 uses progressive (line-by-line) scan only.
The formats are labeled accordingly: for example, HD1080i, where the letter "i" indicates interlaced alternation of fields, or 720p, where "p" indicates progressive (line-by-line) scan.
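
As an illustration only (not part of any standard), here is a small Python sketch that decodes labels such as "1080i" or "720p" in the way just described: the number is the frame height, the letter is the scan type, and the width follows from the 16:9 aspect ratio.

```python
# Illustrative sketch: decoding labels such as "1080i" or "720p".
def decode_hd_label(label: str) -> dict:
    """Split an HD label like '1080i' or '720p' into its parts."""
    height = int(label[:-1])                          # e.g. 1080
    scan = "interlaced" if label[-1].lower() == "i" else "progressive"
    width = height * 16 // 9                          # 16:9 frame: 1080 -> 1920, 720 -> 1280
    return {"width": width, "height": height, "scan": scan}

for label in ("1080i", "1080p", "720p"):
    print(label, decode_hd_label(label))
# 1080i {'width': 1920, 'height': 1080, 'scan': 'interlaced'}
# 1080p {'width': 1920, 'height': 1080, 'scan': 'progressive'}
# 720p {'width': 1280, 'height': 720, 'scan': 'progressive'}
```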

How many frames per second should there be in HD?

In this respect high-definition video differs little from DVD video: with progressive scan there are 25 frames/second in the PAL system and 30 (29.97) frames/second in NTSC; with interlaced fields there are 50 fields (half-frames)/second in PAL and 60 fields/second in NTSC.
Film material, restored by inverse telecine (IVTC), runs at 24 (23.976) frames/second, the standard rate for cinema, which keeps playback at the original film speed.
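
For reference, a short Python sketch listing the rates mentioned above; the NTSC-family values are assumed to be the usual exact fractions (30000/1001 and so on), which the article only gives in rounded form.

```python
# Frame/field rates mentioned above, kept as exact fractions.
from fractions import Fraction

rates = {
    "PAL progressive":  Fraction(25),            # 25 frames/s
    "PAL interlaced":   Fraction(50),            # 50 fields/s
    "NTSC progressive": Fraction(30000, 1001),   # ~29.97 frames/s
    "NTSC interlaced":  Fraction(60000, 1001),   # ~59.94 fields/s
    "Film (24p)":       Fraction(24000, 1001),   # ~23.976 frames/s
}

for name, rate in rates.items():
    print(f"{name}: {float(rate):.3f}")
```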

What are the advantages of HD video over standard video (SD)?

An HD picture looks noticeably better than SD video, and on large screens HD video is significantly sharper.
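To put rough numbers on that difference, here is an illustrative calculation; the 720x576 figure used for SD (PAL) is an assumption, since the article does not state the SD resolution.

```python
# Rough pixel-count comparison (720x576 assumed for SD PAL).
formats = {
    "SD (PAL)": (720, 576),
    "HD720":    (1280, 720),
    "HD1080":   (1920, 1080),
}

sd_pixels = 720 * 576
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / sd_pixels:.1f}x SD")
# HD720 carries roughly 2.2x and HD1080 roughly 5x the pixels of SD,
# which is why the HD picture stays sharp on large screens.
```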

How to play High Definition Video?

You can see HD video in all its glory only on playback devices (monitors, TV panels, projectors and so on) that support a screen resolution of at least 1280x720 pixels.

From what media can I watch high-definition video?

The main carriers of HD content are high-density discs such as Blu-ray; alternatively, you can watch via a TV receiver that receives and reproduces programs broadcast as an HDTV signal.

What codec is the HD video signal encoded with?

HD video is encoded with various codecs:
MPEG-2 HD is the high-definition extension of the MPEG-2 codec familiar from DVD. It gives fairly high image quality, but its compression algorithm is dated and does not compress the stream very efficiently. Nevertheless, it was chosen as one of the main formats for consumer high-definition video, and not by accident: it is well known and well mastered, and it does not require complex decoders in the playback chain.
H.264 AVC (MPEG-4 Part 10) is a newer, promising codec based on the latest-generation MPEG-4/H.264 standard; AVC stands for Advanced Video Coding.
Unlike its older MPEG-4 relatives (DivX, XviD), it delivers noticeably better image quality at higher compression of the stream, thanks to substantially improved video-compression techniques. Encoding and decoding such a stream, however, demands considerable processing power, so leading video-card manufacturers have added hardware HD video decoding to their newer models to ease the load on the CPU.
VC-1 was developed by Microsoft and is used to encode HD streams on HD DVD discs; it is related to Microsoft's Windows Media Video 9 codec.
DivX HD and WMV-HD are slightly modified, HD-capable versions of their older MPEG-4-family counterparts.

What is Full-HD?

Typically, the Full-HD label indicates that the TV supports the full 1920x1080-pixel resolution.

What is HD-Ready?

The HD-Ready label indicates that the TV supports a resolution lower than 1920x1080 pixels. For example, if the TV's panel is 1024x768 pixels, it can still accept an incoming HD signal, but it will convert (downscale) the picture to 1024x768, reducing the sharpness of the incoming signal.
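
A tiny illustrative calculation of how much detail is lost in that example (the 1024x768 panel is the one mentioned above):

```python
# Illustrative only: detail discarded when a 1024x768 HD-Ready panel
# receives a 1920x1080 signal.
src_w, src_h = 1920, 1080      # incoming Full HD signal
panel_w, panel_h = 1024, 768   # the panel from the example above

scale_w = panel_w / src_w      # ~0.53: roughly every second column is lost
scale_h = panel_h / src_h      # ~0.71 of the lines survive
kept = (panel_w * panel_h) / (src_w * src_h)
print(f"horizontal scale {scale_w:.2f}, vertical scale {scale_h:.2f}")
print(f"only about {kept:.0%} of the original pixels remain")  # ~38%
```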

When buying a new TV or a monitor for a personal computer, the question often arises: "Which is better, 1080i or 1080p?" It would seem simple: the resolution is 1920 pixels wide and 1080 pixels high, and that's that. This is the HD standard, and it displays images in high quality. But it is not so simple. It turns out there are two ways to put the image on the screen: the first is interlaced, the second is progressive. By comparing their strengths and weaknesses we can draw a conclusion about which is better: 1080i or 1080p.

Brief background

The certification program called HD Ready was officially launched in Western Europe back in early 2005. Even then it was conventionally divided into two types. The first carries the index "i" and is tied to analog standards, which today have been almost completely replaced by digital technology. The second (with the index "p") is focused as much as possible on supporting the latter. From this alone one can begin to guess which of 1080i and 1080p is better.

Standard 1080i

Let's describe the technical characteristics of each standard; this will let us give a reasoned answer to the question of which is better, 1080i or 1080p. We start with the first of them. It is one of the possible ways to output an image with a resolution of 1920 by 1080 pixels, and a device using it plays video in Full HD format. The refresh rate of the image on such a screen can be 50 or 60 Hz, set in the software settings. The picture is built up in two passes: first the even lines are drawn, then the odd ones. The human eye does not notice this, but the image itself is not of very high quality; you could even call it blurry. On the other hand, the cost of such a TV or monitor is lower. The main advantage in this case is the reduced video data stream: with this technology it is halved. The circuitry is correspondingly simpler, which lowers the cost of the device.
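
A minimal sketch, purely for illustration, of how an interlaced frame is split into two fields and why each pass carries only half the data:

```python
# A minimal sketch: one 1080-line frame is split into two fields,
# so each interlaced pass carries only half of the lines (and half the data).
frame_lines = list(range(1080))         # line numbers 0..1079 of one frame

even_field = frame_lines[0::2]          # first pass: even lines
odd_field = frame_lines[1::2]           # second pass: odd lines

print(len(even_field), len(odd_field))  # 540 540 -> each field is 1920x540
```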


Standard 1080p

To decide which is better to choose, 1080i or 1080p, let's look at the characteristics of the second standard. The matrix resolution is the same, 1920x1080. In addition to the 50 and 60 Hz refresh rates there is also support for 24 Hz. The image, however, is drawn in a single pass, so the picture turns out much better. The price, in turn, is higher.

Let's compare

Having established the technical features and compared them, we can decide which of the two standards is the better purchase: 1080i or 1080p? The difference between the formats does not seem significant. The image resolution is the same: 1920 pixels wide and 1080 high. The refresh rate is also identical in both cases, 50 or 60 Hz depending on the software settings; in addition, the 1080p standard supports output at 24 Hz, intended for playing films in certain formats. The key difference is how the image is formed. In the format with the "i" index the picture is built up by drawing the lines in alternating passes; a similar principle was used in old analog TVs and monitors. That output method used to have a very negative effect on eyesight; with modern technology the effect is minimized, but the real resolution processed per pass is still reduced: each field is effectively 1920 by 540. 1080p, by contrast, updates the whole picture at once, so the "real" resolution is 1920 by 1080; in fact, this is full Full HD. These are the main features of the 1080i and 1080p formats. The differences between them are as follows:

  • The standard with the index "p" supports video output at 24 Hz, which lets you watch some non-standard files in proper quality.
  • The same index "p" updates the whole image on the screen in a single pass, whereas 1080i needs two passes to do the same.

Don't forget the cost of the device: in the case of 1080p it will be 10-20 percent higher.


Which is better

Now let's sum up and decide which is better, 1080i or 1080p. The cost of the second will, of course, be higher in any case, but in terms of quality it wins: the picture is sharper, it is refreshed in full, and it strains the eyes less; these advantages predetermine the choice. Full support for non-standard video files is useful as well. Don't forget that a 1080p set can also work in "i" mode; it is enough to adjust the device's software settings. The reverse is not possible: technically a half-frame cannot be turned into a whole frame, and that is exactly how the image is formed in that case. One more important nuance: so far 1080p has not become widespread in broadcast transmission, but it is the future, and it is only a matter of time before most television channels broadcast in this format. Then you will be able to experience all its benefits in full: the image in the broadcast signal and on the TV screen will be in the same format, and the picture quality really will be at its best.


Summary

This material has analyzed the strengths and weaknesses of the two main standards for forming the image on a monitor or TV screen. Knowing these nuances, it is easy to choose between HD 1080p and HD 1080i. At the same time, do not forget about cost, which plays an important role here. In any case, the recommendations in this article will help you choose the device that best meets your needs.

Watching a good TV is good not only for perception but also for eye health. Let's figure out what Full HD is. So:

What is Full HD?

Full HD means support for all high-definition video formats, the maximum being 1920x1080. When choosing a TV, be sure to look at its labeling. Many manufacturers of modern TVs resort to a little deception and play on the ignorance of the average consumer: a set may carry an HD marking even though its panel resolution is lower than 1920x1080. Such a panel, whatever its exact resolution, cannot display a Full HD signal in full detail; the TV simply scales (interpolates) the picture to fit its own matrix. If you want a TV with maximum definition, you must buy a device labeled Full HD. This marking goes on the most expensive sets, which is not surprising: image quality costs money.
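
Purely as an illustration of what "scaling the picture to the panel" means, here is a crude nearest-neighbor rescale of a single line; real TVs use far more sophisticated scaling filters, and the 1024-pixel panel width is borrowed from the HD-Ready example earlier.

```python
# Crude nearest-neighbor rescale of one line of pixels (illustration only).
def rescale_row(row, new_width):
    """Resample one row of pixels to a new width by nearest neighbor."""
    old_width = len(row)
    return [row[x * old_width // new_width] for x in range(new_width)]

row_1080 = list(range(1920))            # one line of a Full HD frame
row_panel = rescale_row(row_1080, 1024) # squeezed onto a 1024-wide panel
print(len(row_panel))                   # 1024: fine detail is irretrievably lost
```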

A Full HD TV is needed to watch a high-definition television signal. In other words, a regular TV will not correctly display channels that broadcast in the highest quality, and some models will not receive the signal at all. Only a TV with Full HD resolution can work with the signal of the newest television format. That is what Full HD is. Incidentally, in North America, Europe and other developed parts of the world such television is the norm, while in Russia the norm is still television of lower image quality.

1080p or 1080i - which is better?

When choosing an HDTV, you will be faced with the choice between 1080p and 1080i. This article is meant to explain the difference, help you decide which is better, 1080i or 1080p, and help you decide whether to buy a new TV. Let's say it straight away: if you buy, buy only Full HD 1080p.

The most common HD formats are 720p, 1080i and 1080p. The number indicates the vertical resolution, and the letter shows how the image is displayed on the screen ('i' means interlaced, 'p' means progressive). 1080p is the most advanced form of high-definition technology to date.

To understand whether 1080p or 1080i is better, you first need to know the differences between the two technologies. When choosing a TV, what do you pay attention to first? The main criteria are quality and cost, and image quality is determined by the resolution and the scanning mode. Let's see how these two video modes differ in how the image is composed.

Which is better - 1080i or 1080p?

First, let's figure out what the number 1080 means. It is shorthand for 1920 x 1080, the screen resolution: 1920 is the number of pixels horizontally, 1080 the number vertically. A pixel is a small element of the on-screen image, and the more pixels there are, the better the image quality. 1080i and 1080p are equal in resolution, but they differ in the scanning technology used. Let's take a closer look at that difference.

The difference is in scanning technology.

The difference between the suffixes "i" and "p" comes down to a difference in imaging technology: "i" denotes interlaced scanning and "p" denotes progressive scanning. Scanning is the method by which the image is drawn on the screen; in CRT monitors it is carried out by a beam of electrons emitted from the cathode-ray tube. Interlaced scanning is a much older method than progressive scanning. With interlaced scanning the image is built up line by alternating line: first the even lines, then the odd ones. With progressive scanning all lines are drawn sequentially in one pass. This difference leads to noticeable differences in image quality, especially on CRT monitors.
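
As a toy illustration (not how a real TV pipeline works), the sketch below "weaves" the two interlaced fields back into one full frame, which is what has to happen before a complete 1080-line picture exists:

```python
# Toy "weave": interleave the even-line field and the odd-line field
# back into one complete frame. If the two fields were captured at
# different moments, moving objects show the flicker/comb artifacts
# that interlaced video is known for.
def weave(even_field, odd_field):
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)   # lines 0, 2, 4, ...
        frame.append(odd_line)    # lines 1, 3, 5, ...
    return frame

even = [f"even line {n}" for n in range(540)]
odd = [f"odd line {n}" for n in range(540)]
print(len(weave(even, odd)))      # 1080 lines in the reassembled frame
```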

Image quality.

Let's compare 1080i and 1080p in terms of image quality. One of the problems with 1080i is picture flicker, which results in blurred images. Progressive-scan TVs do not suffer from this drawback, although on modern TVs the difference is not as significant.

Videos and games.

The superiority of progressive scanning becomes even more obvious when you use a 1080p HDTV for gaming or other video applications. Fast-moving graphics in games are displayed better on 1080p TVs thanks to the advantages inherent in progressive scanning.

1080p or 1080i - which is better? 1080p!
