
Why Our World Would End If All Displays Disappeared

Today we will discuss display technology: resolution, PPI, DPI, HD, Full HD, 4K and so on. Before starting, we need to be clear about some basic concepts that are very important in display technology.


Pixel:
The pixel (a word coined from 'picture element') is the basic unit of programmable color on a computer or mobile display. The physical size of a pixel depends on how you've set the resolution of the display. If you have set the display to its maximum resolution, the physical size of a pixel will equal the physical size of the display's dot pitch. If, however, you've set the resolution to something less than the maximum, a pixel will be larger than the physical size of the screen's dots (i.e. a single pixel will use more than one dot of the display).

The specific color that a pixel describes is a blend of three components of the color spectrum: red, green and blue (RGB). Depending on the value of each component, the color output of a pixel can range anywhere from white to black. Up to three bytes of data are allocated for specifying a pixel's color, one byte for each major color component (R, G, B). A true color or 24-bit color system uses all three bytes. However, many color display systems use only one byte (limiting the display to 256 different colors). A true color (24-bit) system gives us about 16.7 million colors.
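To make the byte arithmetic concrete, here is a minimal Python sketch (the helper name pack_rgb is just for illustration, not any particular library's API) that packs one byte per channel into a 24-bit value and counts the colors this allows:

```python
def pack_rgb(r: int, g: int, b: int) -> int:
    """Pack three 8-bit channel values (0-255) into one 24-bit color."""
    return (r << 16) | (g << 8) | b

white = pack_rgb(255, 255, 255)   # 0xFFFFFF
black = pack_rgb(0, 0, 0)         # 0x000000

# One byte per channel -> 256 levels each -> 256**3 distinct colors.
print(f"total 24-bit colors: {256 ** 3:,}")           # 16,777,216 (~16.7 million)
print(f"white = #{white:06X}, black = #{black:06X}")
```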

Magnification:
Magnification is the ability to make small objects appear larger, such as making a microscopic organism visible. Normal human eyes cannot resolve objects beyond a certain point, so magnification is necessary in order to visualize such objects. With the help of magnification, an object can be made to appear larger, depending on the magnifying equipment used.
Resolution:

Resolution is the ability to distinguish two objects from each other. We often say that a mobile has a camera resolution of 5 MP, 8 MP or 16 MP. What does that mean? Resolution refers to the number of pixels present in the display or the computer image.

Now, take an example. I have a 15.6" display with a resolution of 1366 x 768 pixels (1,049,088 pixels, i.e. about 1 MP). The screen dimensions are 15.6" diagonally, or 13.66" wide x 7.68" tall (conventionally given as W x H). That means my display density is around 100 PPI.

How can it be calculated?


I have 15.6" screen with 13.6" x 7.68" (W x H) for resolution of 1366 x 768 pixels, that means I have 1366 pixels/13.66"=100 pixels per inch on horizontal lines as well as 768 pixels/7.68"=100 pixels per inch on vertical lines of the display. And this all is shortened as PPI (pixels per inch) and when the pixel density is measured then this term is generally used. 


If I have 1366 x 768 pixels (1,049,088 pixels) for the whole display (13.66" x 7.68" = 104.9088 sq in), then I have 1,049,088 pixels / 104.9088 sq in = 10,000 pixels per sq in, i.e. 100 pixels per inch (since 100² = 10,000).
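If you want to verify the arithmetic yourself, here is a small Python sketch of both calculations (the variable names are mine, chosen for readability):

```python
import math

width_px, height_px = 1366, 768      # native resolution
width_in, height_in = 13.66, 7.68    # physical size in inches

ppi_horizontal = width_px / width_in   # 1366 / 13.66 = 100.0
ppi_vertical = height_px / height_in   # 768 / 7.68   = 100.0

# Equivalent diagonal method: diagonal pixels over diagonal inches.
ppi_diagonal = math.hypot(width_px, height_px) / math.hypot(width_in, height_in)

print(ppi_horizontal, ppi_vertical, round(ppi_diagonal, 2))   # 100.0 100.0 100.0
```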

Another important point about displays: the terms dots per inch (DPI) and pixels per inch (PPI) are commonly used to describe the resolution of an image. However, the terms do not mean the same thing, and there are distinct differences between the two:

  • DPI refers to the number of printed dots contained within one inch of an image printed by a printer.

  • PPI refers to the number of pixels contained within one inch of an image displayed on a computer monitor.

Both PPI and DPI are units describing the output resolution of an image, whether displayed on a screen (PPI) or printed on paper (DPI). If you divide the image size (the total number of pixels of the digital image) by this output resolution, you can determine the size at which the image is shown on screen or printed.


For example: an image of 1366 x 768 pixels, printed at an output resolution of 100 DPI, will have a print size of 13.66" x 7.68" (1366 x 768 pixels / 100 DPI).
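As a quick sketch of that division (print_size_inches is a hypothetical helper, not a standard function):

```python
def print_size_inches(width_px: int, height_px: int, dpi: float):
    """Print size = pixel dimensions divided by the output resolution (DPI)."""
    return width_px / dpi, height_px / dpi

print(print_size_inches(1366, 768, 100))   # (13.66, 7.68)
```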

Much of the confusion between these two terms happens for a couple of reasons. First, even though PPI refers to the resolution of an on-screen digital image, it also affects the quality of the final printed picture. Second, some professional print services request that pictures be at a certain DPI level before they can be printed; what they normally mean is PPI, not DPI, which adds to the confusion.

DPI is a way to determine the print size of an image on paper relative to its digital image size, i.e. the total number of pixels in the horizontal and vertical directions. Although some printing applications still use DPI, many newer printing applications instead have a setting that lets you select exactly what size you want to print a photo (11" x 8.5" for Letter, 11.69" x 8.27" for A4, or other). For printing applications that use DPI to determine the print size, increasing the DPI will make the printed image smaller, while decreasing the DPI will make it larger.
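A short illustration of this inverse relationship, again using the 1366-pixel-wide example image:

```python
width_px = 1366   # fixed pixel count of the example image

# Raising the DPI shrinks the print; lowering it enlarges the print.
for dpi in (72, 100, 150, 300):
    print(f"{dpi:3d} dpi -> {width_px / dpi:6.2f} inches wide")

# Going the other way: the DPI needed to fill an 11-inch Letter width.
print(f"needed for 11 inches wide: {width_px / 11:.0f} dpi")   # ~124 dpi
```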

PPI represents the resolution (pixel density) of a digital image displayed on screen. Because it is correlated with the image size (total number of pixels), it also contributes to the quality of an image. If a digital image contains too few pixels (low resolution), the picture will show less detail and will appear pixelated. Digital images with more pixels (high resolution) show better detail and appear sharper and crisper. The PPI for a given on-screen size is determined by the image size of the photo.

I am sure the concept is clear now.

Let's move on to display resolutions. We all know about HD displays, Full HD displays and the recently introduced 4K and 8K displays, but what do they actually mean, and how do they differ from each other?

Two or three decades ago, there were only standard definition televisions, i.e. 640 x 480 pixels. Nowadays new technologies are being developed every day. If you want to see the difference, switch on a standard definition TV and an HD TV side by side; you will be able to spot it.

Display resolutions vary from 480p to 8K, and for mobiles even lower resolutions exist in feature phones, from 144p to 320p.

Here are the different resolutions for mobiles and televisions.

Common resolutions for displays are:

Standard    Aspect ratio    Width (pixels)    Height (pixels)
SVGA        4:3             800               600
WSVGA       ~17:10          1024              600
XGA         4:3             1024              768
XGA+        4:3             1152              864
WXGA        16:9            1280              720
WXGA        5:3             1280              768
WXGA        16:10           1280              800
SXGA        5:4             1280              1024
HD          ~16:9           1360              768
HD          ~16:9           1366              768
WXGA+       16:10           1440              900
HD+         16:9            1600              900
UXGA        4:3             1600              1200
WSXGA+      16:10           1680              1050
FHD         16:9            1920              1080
WUXGA       16:10           1920              1200
WQHD        16:9            2560              1440
WQXGA       16:10           2560              1600
4K UHD      16:9            3840              2160
8K UHD      16:9            7680              4320
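As a quick cross-check on the table, a few lines of Python can derive the aspect ratio from the pixel dimensions (the aspect_ratio helper is just for illustration):

```python
from math import gcd

def aspect_ratio(w: int, h: int) -> str:
    """Reduce width:height by their greatest common divisor."""
    d = gcd(w, h)
    return f"{w // d}:{h // d}"

print(aspect_ratio(1920, 1080))   # 16:9 (FHD)
print(aspect_ratio(2560, 1600))   # 8:5, i.e. 16:10 (WQXGA)
print(aspect_ratio(1366, 768))    # 683:384, close to 16:9 -- hence the "~16:9"
```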


Now, what is qHD? 
The term qHD stands for Quarter High Definition, and refers to any display packing a resolution of 960 x 540 pixels. That's precisely one quarter of the number of pixels found in a Full HD screen (1920 x 1080 pixels). Smartphones like the Moto E, Xolo Q600 S and Sony Xperia M2 have qHD screens. Thanks to fast-growing technology, though, even basic budget phones now sport HD screens (1280 x 720 pixels), which are sharper and clearer.
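The 'quarter' really is exact, as two lines of arithmetic show:

```python
qhd = 960 * 540      # 518,400 pixels
fhd = 1920 * 1080    # 2,073,600 pixels
print(fhd // qhd)    # 4 -- a qHD panel has exactly one quarter the pixels
```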

What is HD and Full HD?
So what actually is HD? HD, or High Definition, refers to the detail level of a screen; or to put it another way, the number of pixels that are packed into a display. A pixel is the smallest visible element on a display, the ‘dots’ that combine to make up the overall picture.

HD follows on from standard definition, cramming in even more pixels in order to produce sharper, cleaner and smoother images when playing video.

Confusingly, there are three different types of 'HD' resolution out there, so it's worth knowing a bit more when shopping around for HD television sets and related gear such as video disc players. If you have an HD television but only a standard definition video player, it will play video only in standard definition. So let's take a closer look at 720p, 1080i and 1080p, and what you need to know.


720p, 1080i and 1080p HD - What’s the difference?
720p, 1080i and 1080p are all versions of HD, but they're all different. Let's begin with 720p and 1080p and the differences between them.
720p vs 1080p

A 720p screen is 1280 pixels (wide) x 720 pixels (tall). That's more than twice the detail of standard definition, which makes for reasonably sharp video playback on a standard TV. However, 1080p (Full HD or True HD) goes even further, upping the pixel dimensions to 1920 x 1080, more than five times the detail of SD.
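Here is that comparison worked out against the 640 x 480 SD figure mentioned earlier (PAL SD, at 720 x 576, gives slightly different ratios):

```python
sd = 640 * 480            # 307,200 pixels
hd_720p = 1280 * 720      # 921,600 pixels
fhd_1080p = 1920 * 1080   # 2,073,600 pixels

print(hd_720p / sd)       # 3.0  -> comfortably "more than twice" SD
print(fhd_1080p / sd)     # 6.75 -> "more than five times" SD
```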

An HD television with a resolution of 720p will only be able to display video at this resolution and no higher. So if you’re planning on playing HD games or Full HD videos on your 720p television set, then you just won’t be able to get the absolute best performance. 

Most 720p TV sets you’ll see in shops these days tend to be toward the cheaper end of the price spectrum and will be marketed as being ‘HD Ready’. This is because 720p is the absolute minimum required to meet this standard. 

Most 1080p sets you'll see will be marketed as being 'Full HD' or 'True HD', as they give you a richer, better-defined viewing experience with a more detailed, smoother image.
What is 1080i and how is it different from 1080p?

Now, we are going to talk about the difference between 1080i and 1080p displays. Both display panels show images at the same pixel count as each other i.e. 1920 x 1080 pixels. The big difference is in how images are made up on your TV.

The lowercase ‘i’ in 1080i stands for the interlaced scan. The lowercase ‘p’ in 1080p stands for progressive scan.

Interlaced scan renders images in horizontal lines, breaking the picture down into individual rows and displaying every other line at a very high rate, each field drawn in 1/60th of a second (this rate is the refresh rate, measured in cycles per second, or Hz). Odd-numbered lines are displayed on the screen first, then even-numbered lines. While this is incredibly fast and impossible for the human eye to detect, it can create a ghostly flickering effect on live TV broadcasts, particularly live sporting events. At low refresh rates this is much more likely to happen.

Progressive scan, by contrast, renders the lines of an image sequentially in a single pass. This makes for a much smoother image overall that doesn't suffer from this ghosting effect. What's more, flat-panel displays like LCDs, LEDs and plasmas will automatically convert any incoming 1080i signal to 1080p. Good quality television sets (generally expensive ones from top brands) use clever processing (an onboard chip/microprocessor) to fill in the missing lines, but cheaper television sets won't do this as well.
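To picture what a deinterlacer does, here is a toy Python sketch of the simplest approach, weaving two fields back into one progressive frame (real TV processing is far more sophisticated; this only shows the idea):

```python
def weave(odd_field, even_field):
    """Interleave odd-numbered and even-numbered scan lines into one frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)    # scan lines 1, 3, 5, ...
        frame.append(even_line)   # scan lines 2, 4, 6, ...
    return frame

# Toy 4-line image split into two interlaced fields:
odd = ["line 1", "line 3"]
even = ["line 2", "line 4"]
print(weave(odd, even))   # ['line 1', 'line 2', 'line 3', 'line 4']
```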

What is Quad HD?
Quad HD is another name for WQHD, which stands for 'Wide Quad High Definition'.

Quad HD screens have a 2560 x 1440 pixel resolution. This is four times the number of pixels you get on a standard HD panel (720p), hence the name. Quad HD screens aren't as sharp as 4K Ultra HD displays, however, as we'll see later.


WQHD displays also have a 16:9 widescreen aspect ratio, which explains where the ‘W’ in WQHD comes from. This is becoming an increasingly common aspect ratio in mobile phones, which are now often used to enjoy video on the move.

But it's all too easy for people to mistake it for 4K (UHD), which is totally different from a QHD display.

What is 4K, or UHD?
4K, or 4K Ultra HD to give it its full title, refers to screens with a resolution of 3840 x 2160 pixels. This is four times as many pixels as you'll find in a Full HD (1920x1080 pixels) display panel. 
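The same quadrupling continues up the ladder, as a quick check shows:

```python
fhd = 1920 * 1080        #  2,073,600 pixels
uhd_4k = 3840 * 2160     #  8,294,400 pixels
uhd_8k = 7680 * 4320     # 33,177,600 pixels

print(uhd_4k // fhd)     # 4 -- 4K UHD has four Full HD panels' worth of pixels
print(uhd_8k // uhd_4k)  # 4 -- and 8K quadruples it again
```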

4K Ultra HD is in the news a lot at the moment because the next generation of HD TV broadcasts will be shown at 4K resolution.


While a lot of phones are capable of recording video in 4K UHD, including the Samsung Galaxy S7, Galaxy S6 Edge, iPhone 6 and Xperia Z5, today there is only one mobile phone out there that boasts a 4K screen: the Sony Xperia Z5 Premium. However, the actual visual difference between the Z5 Premium and the standard Z5 is difficult to spot with the naked eye, as mobile phone screens are so small.


Can a 720p TV play back 1080p Full HD / 4K video?
Every television, mobile or computer screen has a native resolution, i.e. its specified, physical resolution. A 720p HD television has a native resolution of 1280 x 720 pixels; if Full HD video is played on it, the video is downscaled to fit, and the quality will still be great for an HD panel. Even 4K videos can be played on an HD panel, although the bit rate is quite high. A 1080p television set, on the other hand, can handle Full HD natively. If SD video is played, it is upscaled to fit, but it doesn't look as clear and smooth on an HD panel.
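A rough sketch of the fitting arithmetic (scale_factor is a hypothetical helper; real scalers also filter and sharpen the picture):

```python
def scale_factor(src_w, src_h, panel_w=1280, panel_h=720):
    """Uniform scale that fits the source video inside a 720p panel."""
    return min(panel_w / src_w, panel_h / src_h)

print(scale_factor(1920, 1080))   # ~0.67 -> Full HD is downscaled to fit
print(scale_factor(3840, 2160))   # ~0.33 -> 4K is downscaled even further
print(scale_factor(640, 480))     # 1.5   -> SD must be upscaled (looks softer)
```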

The real plus point of having a 1080p television set over a 720p one is when it comes to watching Blu-ray movies. Blu-ray is a native 1080p format, so they look their absolute best on the 1080p television set. If you have got a Blu-ray collection then the choice is clear - go for a 1080p television set.

In terms of picture quality on a 21-inch or 27-inch screen, the difference between HD and Full HD televisions is negligible unless you're sitting up close. Only on bigger screens (32" and above) can you start to appreciate the benefits of 1080p.

NOTES: 
  • Most digital cameras will have an image size setting in the camera menu. For the best picture quality, use the highest image size setting available on the camera when taking pictures. Refer to the operating instructions provided with your camera for information about possible image size settings.
  • While buying a television, never compromise on quality to save a few bucks; otherwise, you may regret it in the future.
  • A smaller panel with a higher resolution is better than a larger panel with a lower one. Always choose appropriately according to your preferences.
  • Check for other features like viewing angle, contrast ratio, 3D readiness, and connectivity such as HDMI, Bluetooth and Wi-Fi; the most important considerations are reliability and after-sales service.
