HDMI vs DVI: What’s the best AV input?

A graphic contrasting HDMI and DVI cables

Whenever you purchase a new monitor, a fresh cable will usually be bundled in the box, ensuring the average user can get up and running immediately. HDMI and DVI are two of the most common cable types you will encounter, and in some cases both cables will be provided, with HDMI vs DVI often boiling down to whatever is easiest or most familiar.

However, the default cable is not always the best tool for the job, especially for those users looking to get the most from their new device. So, which is better, HDMI or DVI?

What are DVI and HDMI?

High-Definition Multimedia Interface, or HDMI, first appeared in the consumer market back in 2003 on early HD-ready TVs. Since then, it has become the standard cable for HD or UHD televisions, as well as video recording hardware. It's usually available for computer monitors too, and is by far the simplest type to use, given it carries both video and audio feeds.

Digital Visual Interface, or DVI, is a slightly older technology, first appearing in 1999. It was regarded as one of the first cable types to try to create a video standard across the industry, and as such is available in a number of formats, including one that supports the now obsolete VGA connection. However, unlike HDMI, DVI only carries a video signal and does not support audio, so its use usually requires external speakers.

Both cable types use the CEA-861 standard, meaning signals carried by HDMI are compatible with DVI. As a result, an adapter can be used to convert an HDMI connection into a DVI one, or vice versa, without any loss of signal or quality.

HDMI vs DVI: Layout

The most obvious difference between DVI and HDMI ports is their appearance. HDMI is a much smaller connection, more like an oversized USB than anything else, while DVI is larger and significantly more complex.

There are various types of DVI cables, but the most common layout is a 24-pin setup that looks like a SCART lead. Because it's available in different iterations, you must make sure you've got the right one for your display.

DVI is available in three primary variants: DVI-I (analogue and digital), DVI-A (purely analogue), and DVI-D (digital), the last of these being by far the most common on the market today. Confusingly, DVI cables also come in single-link and dual-link formats, which support different maximum resolutions and refresh rates (which we will come onto later).

While different HDMI versions are available, they're relatively simple, with new releases following a numbered system. Right now we have HDMI 2.0 and 2.1, the latter of which adds support for resolutions up to 10K, refresh rates up to 120Hz at 4K, and dynamic HDR.

A significant difference between the two is that HDMI supports up to 32-channel audio, whereas DVI supports video only. If you have a DVI cable, you will need to either switch to HDMI or use a separate audio cable to get any sound to or from the monitor.

HDMI vs DVI: Compatibility

Naturally, the biggest question when shopping for cables is compatibility. Nobody likes buying a fancy monitor and learning their hardware doesn't have the right ports to connect it.

Ultimately, this comes down to what inputs your existing tech has, but it's worth noting that HDMI is by far the most common connection source. An HDMI cable is likely to fit most modern PCs, Windows-based laptops, monitors, and gaming consoles – although it's being phased out in favour of USB-C, especially on devices like Chromebooks and Macs.

If your laptop has an HDMI output, but your monitor only has a DVI input, it's not the end of the world. Given they both use the same CEA-861 standard, adaptors can convert HDMI to DVI and vice versa without sacrificing quality. These are also very cheap to buy from most electronics retailers.

HDMI vs DVI: Quality & refresh rates

Given that 4K has become a familiar resolution in many offices, the differences between the cables have become more obvious. With HDMI 2.0, support was added for 60Hz (which roughly corresponds to 60 frames per second) at 4K resolution, whereas earlier versions were limited to just 24Hz at 4K. HDMI also benefits from HDR, a technology that is becoming increasingly available on top-end devices, which essentially allows a greater range of detail across varying light levels.
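To see why 4K at 60Hz needed a new HDMI version, a rough back-of-envelope calculation helps. The sketch below is a simplified illustration only: it multiplies raw pixel counts by refresh rate and colour depth, and ignores blanking intervals and link-encoding overhead, which push real-world cable requirements somewhat higher.

```python
# Back-of-envelope uncompressed video bandwidth: pixels x refresh x bits/pixel.
# Simplified illustration -- ignores blanking intervals and link encoding
# overhead, so actual cable requirements are somewhat higher in practice.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second (24-bit colour assumed)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = video_bandwidth_gbps(3840, 2160, 60)   # 4K at 60Hz
uhd_24 = video_bandwidth_gbps(3840, 2160, 24)   # 4K at 24Hz

print(f"4K @ 60Hz: {uhd_60:.1f} Gbit/s")  # ~11.9 Gbit/s of raw pixel data
print(f"4K @ 24Hz: {uhd_24:.1f} Gbit/s")  # ~4.8 Gbit/s of raw pixel data
```

Even this simplified figure shows that 4K at 60Hz demands roughly 12 Gbit/s of pixel data alone, which fits within HDMI 2.0's 18 Gbit/s link budget but not within the roughly 10.2 Gbit/s offered by HDMI 1.4.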

Another difference is that HDMI supports HDCP copy protection as standard, whereas DVI doesn't. This is a system that prevents HD content such as Blu-ray videos from being played on unauthorised devices.

Refresh rate has become an important consideration for those in the market for a new monitor. Higher rates can offer a smoother PC experience that is much easier on the eye, reducing the strain and headaches associated with long days in the office. Refresh rate can simply be thought of as the number of frames per second a monitor is capable of displaying, with 144Hz offering a potential 144 frames per second.

In its most recent versions, HDMI has caught up to DVI, as both can now output 144Hz at 1080p. In fact, HDMI is now considered the better option, as the cable simply supports more advanced technology than its dated counterpart.
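The same sort of arithmetic explains the single-link versus dual-link distinction mentioned earlier. The sketch below is a rough illustration, not a spec-accurate calculation: it compares raw pixel rates against the DVI pixel-clock ceilings (165MHz for single-link, doubled for dual-link) and ignores blanking intervals, which add roughly 10 to 20% in practice.

```python
# Pixel-clock check: can a DVI link drive 1080p at a given refresh rate?
# Simplified -- uses raw pixel counts and ignores blanking intervals,
# which add roughly 10-20% on top in real display timings.

SINGLE_LINK_MHZ = 165   # single-link DVI pixel clock ceiling
DUAL_LINK_MHZ = 330     # dual-link uses two TMDS links, doubling the ceiling

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz / 1e6

clk = pixel_clock_mhz(1920, 1080, 144)
print(f"1080p @ 144Hz needs ~{clk:.0f} MHz")      # ~299 MHz
print("single-link OK:", clk <= SINGLE_LINK_MHZ)  # False
print("dual-link OK:  ", clk <= DUAL_LINK_MHZ)    # True
```

This is why 1080p at 144Hz historically required a dual-link DVI cable: the mode needs roughly 299MHz of pixel clock, well beyond what a single link can carry.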

HDMI vs DVI: Price

Cables have long been an easy target for scammers, who inflate prices by claiming their products deliver better quality than cheaper alternatives. In reality, the only cables where price meaningfully reflects picture quality are analogue ones: with an analogue signal, a poorly built cable translates directly into worse image clarity. Digital cables, by contrast, either deliver the signal intact or not at all, so even a cheaply made one transmits exactly the same picture.

The same applies to HDMI and DVI: you can expect the same image no matter the price difference between cables. It's worth pointing out, however, that gold-plated connectors may be more durable, since gold resists corrosion, and a better-built connector can also be more resistant to physical damage.

With this in mind, rest easy knowing it's perfectly fine to go online and choose the cheapest HDMI or DVI cable you can find. The image on your device won't look any different from one delivered by a cable that's £5 more expensive.

HDMI vs DVI: Is DVI better than HDMI?


When you're looking for the right connector for your AV gear, the most important thing is to base your judgement on whichever cable fits your hardware.

In the past, DVI was considered a good option largely due to its support for higher refresh rates, and its compatibility with older connection types, such as VGA.

However, HDMI has largely made DVI obsolete, especially given that hardware advancements have slowly eroded any advantage DVI had. DVI is also a more complicated type of connection, given the variety of formats that customers need to contend with. Finally, the fact that HDMI also supports audio means there's very little reason to look at DVI these days, unless your hardware requires it.

Bobby Hellard

Bobby Hellard is IT Pro's reviews editor and has worked on Cloud Pro and Channel Pro since 2018.

In his time at IT Pro, Bobby has covered stories for all the major technology companies, such as Apple, Microsoft, Amazon and Facebook, and regularly attends industry-leading events such as AWS Re:Invent and Google Cloud Next.

Bobby mainly covers hardware reviews, but you will also recognise him as the face of many of our video reviews of laptops and smartphones.

He has been a journalist for ten years, originally covering sports, before moving into business technology with IT Pro. He has bylines in The Independent, Vice and The Business Briefing.

Contact him at bobby.hellard@futurenet.com or find him on Twitter: @bobbyhellard