4K Monitors - Written by RyanZ

The way we display our content is constantly changing. Whether the work is graphics and web design, video editing, video gaming, animation, or even professional photography, display technology is always in flux. Throughout history we have presented our content in a number of different ways, and innovation keeps improving how computers put it on screen. It is apparent that not only has image quality increased, but monitors have grown slimmer even as resolution has improved.

We have moved away from the older analogue CRT monitors. A CRT requires an analogue signal, in which the picture information is carried as a continuously varying electrical signal rather than as a digital stream of ones and zeros (i.e. 10101010100010010). In the CRT era, the video adapter had to convert the computer's digital output into that analogue signal. In this day and age we use LCD (or LED-backlit) monitors with DVI and HDMI connections, which keep the signal digital all the way from computer to monitor. A single-link DVI cable and connection supports a 1920x1080 image, and a dual-link cable and connection supports up to a 2048x1536 image.
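To get a rough feel for where the single-link limit sits, here is a small back-of-the-envelope sketch (not part of the original article). The ~20% blanking overhead and the 165/330 MHz single-/dual-link pixel-clock ceilings are the commonly quoted DVI figures; the helper names are our own.

```python
# Rough check of which DVI link a given video mode needs.
# Assumes ~20% extra pixels for blanking intervals and the commonly quoted
# 165 MHz (single-link) and 330 MHz (dual-link) pixel-clock ceilings.
# Real modes often use reduced blanking, so these estimates are conservative.

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330
BLANKING_OVERHEAD = 1.20

def required_pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock (MHz) needed to drive a mode at the given refresh."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for width, height in [(1920, 1080), (2048, 1536), (2560, 1600)]:
    clock = required_pixel_clock_mhz(width, height)
    if clock <= SINGLE_LINK_MHZ:
        link = "fits single-link DVI"
    elif clock <= DUAL_LINK_MHZ:
        link = "needs dual-link DVI"
    else:
        link = "exceeds dual-link DVI"
    print(f"{width}x{height} @ 60 Hz: ~{clock:.0f} MHz pixel clock -> {link}")
```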
 
Before 2K and 4K resolutions were available, high definition displays came in 1080i, 1080p, and 720p formats, where the number refers to the lines of pixels on the display. The 'i' and 'p' stand for interlaced or progressive scan, which describes how those pixels are drawn on the screen. 1080i offers more spatial detail but effectively delivers only 30 full frames per second, while 720p has fewer pixels but a smoother 60 progressive frames per second. In addition, 1080i content must be "de-interlaced" by the display, which reconstructs complete frames from the alternating fields before they are shown.
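As a rough illustration of the interlaced-versus-progressive trade-off, the short sketch below (an added example, using the standard mode definitions) compares how many pixels per second each format actually delivers; 1080i sends only half of its 1,080 lines in each 60 Hz field, so its raw throughput ends up close to 720p even though it covers a full 1920x1080 grid.

```python
# Pixels delivered per second for common HD broadcast formats.
# 1080i transmits alternating half-frames (fields): 540 of its 1,080 lines,
# 60 times per second, so its raw throughput sits close to 720p.

formats = {
    "720p60":  (1280, 720, 60, 1.0),   # full frame on every refresh
    "1080i60": (1920, 1080, 60, 0.5),  # only half the lines per field
    "1080p60": (1920, 1080, 60, 1.0),
}

for name, (width, height, rate, fraction) in formats.items():
    pixels_per_second = width * height * rate * fraction
    print(f"{name}: {pixels_per_second / 1e6:.1f} million pixels per second")
```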

What is 4K resolution and why should you care about it? 4K is the next step up in monitors and TV displays, named for its roughly 4,000 horizontal pixels; the movie industry's current gold standard is 4096 x 2160. What most consumers don't know is that the 4K UHDTV you buy in a store does not use that same cinema standard (4096 x 2160). The television industry has instead adopted 3840 x 2160 as its standard, which isn't "pure" 4K like the movie industry's format.
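To make the gap between the two "4K" standards concrete, here is a quick pixel-count comparison (an illustrative snippet, not from the original article):

```python
# Total pixel counts for cinema 4K, consumer UHD "4K", and 1080p,
# expressed relative to 1080p.

resolutions = {
    "DCI 4K (cinema)":     (4096, 2160),
    "UHD 4K (TV/monitor)": (3840, 2160),
    "1080p":               (1920, 1080),
}

base = 1920 * 1080
for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name}: {width}x{height} = {total / 1e6:.2f} megapixels "
          f"({total / base:.2f}x 1080p)")
```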


How does this change affect the video cards that translate the signal into an image? It demands more power. Graphics cards that support 4K resolution have already emerged, but they require more graphics processing cores and therefore draw more power. The video card industry has been able to keep these cards cool, but prices right now are high compared to 2K and 1080p-class cards. By the end of Q2 2014, however, we may see a significant price drop in graphics hardware. Video gaming consoles, on the other hand, are still struggling to keep up with 1080p and cannot display at 4K resolution yet because their video hardware doesn't support it.
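One way to see why 4K demands beefier hardware is to estimate the raw memory traffic just for sending the finished image to the display. The sketch below is our own rough estimate, assuming 4 bytes per pixel and a 60 Hz refresh; it ignores rendering entirely, so the real workload on the card is far higher.

```python
# Minimum display scan-out traffic: bytes per frame times refresh rate.
# Assumes 4 bytes per pixel (32-bit colour) and a 60 Hz refresh; ignores
# all rendering work, so the real load on the graphics card is far higher.

BYTES_PER_PIXEL = 4
REFRESH_HZ = 60

def scanout_gb_per_second(width, height):
    bytes_per_frame = width * height * BYTES_PER_PIXEL
    return bytes_per_frame * REFRESH_HZ / 1e9

for name, (width, height) in [("1080p", (1920, 1080)), ("UHD 4K", (3840, 2160))]:
    print(f"{name}: ~{scanout_gb_per_second(width, height):.1f} GB/s "
          "just to refresh the screen")
```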

Starting in 2013 we began to see prices fall for 4K monitors and TVs. While most 4K monitors still cost $1000 or more, Monoprice in particular is releasing a 4K monitor in July 2014 priced between $500 and $600, running at the television industry's standard 4K resolution. If you thought it couldn't get better, Intel is partnering with Samsung and ViewSonic to cut the price of 4K monitors to $400. So if you are planning on purchasing a 4K monitor and upgrading the graphics card in your computer tower, wait for the price drop at the end of Q2 2014.
 

