Solved

Vizio 26" LCD TV As monitor - ASUS EAH3450 Video Card - HDMI Display mode looks bad

Posted on 2009-05-14
10
3,694 Views
Last Modified: 2013-12-01
I have a Vizio 26" TV/Monitor (VA26LHDTV10T) connected to the HDMI port using an ASUS EAH3450 video card. The system is a Dell 530s, Intel Core2, 4gb RAM, Windows XP Professional SP2. The display looks okay, but not very good. It's not crisp like my other LCD display, the text seems to "glow" with a white border and it all looks pixelated. I know this is not a great description, but what I'm wondering is do I need a specific monitor driver for this setup? The monitor type shows up as "Plug and Play Monitor" and I've tried to use other drivers such as sony, samsung LCD drivers. I've tried all HDTV modes for the display, i.e. 720p60 NTSC, 1080i25 PAL, I've also tried all possible resolutions from 640x40 looks like lego land, to 1920 x 1080. I've also tried all refresh rates, from 35 to 120 Hertz. Is there a trick, driver, or setting I can use to dial in the appearance for this monitor/video card using the HDMI port?
0
Comment
Question by:itbalance
10 Comments
 
LVL 16

Expert Comment

by:Brian Pringle
ID: 24390056
The problem is that you are trying to convert the display signal to something that a TV can recognize.  Does your TV have a VGA (standard monitor) connector?  If so, buy a VGA cable and use that instead.  You will get a much better picture and will be able to read text.
0
 

Author Comment

by:itbalance
ID: 24390517
But the video card has an HDMI port. Doesn't the card provide a clean signal for an HDMI monitor? If not, why does the graphics card have an HDMI port at all?
0
 
LVL 16

Expert Comment

by:Brian Pringle
ID: 24390706
It is a clean signal, but it is not designed for small text.  It is still a lower resolution (1080 max) than what you get out of the analog VGA connector.
0
 
LVL 34

Expert Comment

by:jamietoner
ID: 24391571
Did you try 1366x768? That is the TV's native resolution.
0
 

Author Comment

by:itbalance
ID: 24391666
1366x768 is not an available option for the monitor, which is why I was thinking I needed a different driver than "Default" or "Plug and Play Monitor".
0
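
A quick way to confirm what the driver is actually offering to Windows, rather than trusting the Display Properties list, is to enumerate the modes directly. Below is a minimal Win32 C sketch; it assumes Windows XP and a C compiler with user32.lib linked, and the file name and build command are only illustrative.

/* modes.c - list every display mode the video driver reports to Windows,
 * so you can see whether 1366x768 or 1360x768 appears at all.
 * Build: cl modes.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* iModeNum counts up from 0 until the driver runs out of modes. */
    while (EnumDisplaySettings(NULL, i, &dm)) {
        printf("%4lu x %-4lu  %2lu-bit  %3lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
        i++;
    }
    return 0;
}

If 1366x768 never shows up in that list, swapping in a different monitor INF won't add it; the mode has to come from the video driver itself, as discussed below.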
 
LVL 16

Expert Comment

by:Brian Pringle
ID: 24392000
It isn't the monitor, it's the video card. Your video card senses what modes the monitor reports it supports and only offers those. I have an ATI video card in my computer, and it is connected to my 32" TV via a VGA cable.

The ATI driver has the ability to configure "Custom Timings", which means I can manipulate the resolution and frequency the card outputs. I had to set mine to 1360x768 (yes, 1366 is native, but it comes out distorted). I lose 3 pixels on each side, but there is no distortion.

I have also set this up on other people's computers, but the video card has to support it.
0
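
To see whether a given card and driver will actually take such a mode, the same Win32 API can test it before switching. This is a minimal sketch under the same assumptions as the listing above; 1360x768 @ 60 Hz is just the example mode from this thread.

/* testmode.c - ask the driver whether it will accept 1360x768 before
 * switching to it.  CDS_TEST only validates; nothing changes on screen.
 * Build: cl testmode.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    LONG result;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth        = 1360;  /* 1366 native minus 3 px per side */
    dm.dmPelsHeight       = 768;
    dm.dmDisplayFrequency = 60;    /* 60 Hz is the safe choice for an HDTV */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    result = ChangeDisplaySettings(&dm, CDS_TEST);
    if (result == DISP_CHANGE_SUCCESSFUL) {
        printf("Driver accepts 1360x768 @ 60 Hz; switching now.\n");
        ChangeDisplaySettings(&dm, 0);  /* 0 = change for this session only */
    } else {
        printf("Driver rejected the mode (code %ld).\n", result);
    }
    return 0;
}

If the CDS_TEST call fails, the mode has to be added at the driver level first, for example through the vendor's custom-timings feature.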
 
LVL 16

Expert Comment

by:Brian Pringle
ID: 24392001
And, mine still shows up as "Plug and Play Monitor".
0
 
LVL 69

Expert Comment

by:Callandor
ID: 24395372
Try using PowerStrip - it has a 1366x768 custom mode.
0
 
LVL 17

Accepted Solution

by:
xema earned 250 total points
ID: 24435167
itbalance
I have an Nvidia 9600GT connected to a Vizio 37", and the best approach is to use the VGA connector.
Your TV is a 1080i set, which means it draws the 540 odd lines first and then the 540 even lines; and if you set it to 720p, the panel has to scale 720 lines up to its 768 native lines, so you lose 48 lines of detail.
So use the VGA connector at your TV's native resolution, 1366x768.
Even then you won't get a perfectly sharp image; see the following questions:
http://www.experts-exchange.com/Hardware/Displays_Monitors/LCD_and_Plasma/Q_23876397.html
http://www.experts-exchange.com/Hardware/Displays_Monitors/LCD_and_Plasma/Q_23694415.html
http://www.experts-exchange.com/Hardware/Components/Motherboards/Q_22901491.html
0
 

Author Closing Comment

by:itbalance
ID: 31581710
I understand now. I guess I expected the HDMI port on the video card to deliver high-definition quality at 1920x1080, just like a regular PC monitor. Live and learn :) Thanks!
0
