I'm planning on buying a new laptop soon and have been going through some of the threads regarding the HDMI port and I am a bit confused.
I would like to be able to take a 1080p/5.1 BD or .mkv on my laptop and connect it to my AV receiver via HDMI which then outputs 1080p/5.1 to my HDTV/speakers. I have no idea what minimum video card/sound card/processor is required for this.
My finger is currently hovering over the BUY button of a great 1080p monitor at a great price. There is only ONE left in stock and I have a dilemma! My specs are shown below.
Intel® Core™ 2 Duo Processor T8300 (2.4 GHz, 800 MHz FSB, 3 MB L2 Cache)
4096 MB 667 MHz Dual Channel DDR2 SDRAM [2x2048]
256 MB ATI Mobility Radeon HD 3450
320 GB (5,400 rpm) SATA Hard Drive
Genuine Windows Vista® Home Premium 64-bit
Fixed Internal 8X DVD+/-RW Slot Load Drive including Software
Primary 9-cell 85 WHr Lithium Ion battery
15.4" Wide Screen WXGA (1280 x 800) Display with TrueLife™
The other day I played a 720p mkv file via HDMI output to a tv, with no problems, picture/sound was fine.
I then tried 1080p output of another video file, via HDMI to the same TV. This TV has a native resolution of 1680 x 1050, which I know is not full HD (1080p), the resolution of the video I was trying to watch, but if I recall correctly the display settings on my laptop were set to 1080p... The video was unwatchable: choppy, pixelated, and it skipped a lot.
I then tried the same video via VGA with slightly better results, running this time at the native resolution of the TV (1680 x 1050). The video was less pixelated but still skipped enough to make it unwatchable.
I have ruled out the cheap (£3.50!) HDMI cable as the cause, since the VGA cable was of similar quality and gave similar results.
My question is this:
Is this poor quality simply due to the fact that my GPU cannot handle pumping out 1080p via HDMI/VGA output to a TV? (It plays fine at my laptop's native resolution of 1280 x 800, although clearly that is not 1080p, so it seems the GPU can handle it?) If the GPU cannot handle it, then I am disappointed, as I assumed when purchasing a laptop with an HDMI port that I could output full HD video to my television.
On the other hand, if I purchase this new monitor that supports full 1080p HD, will the video play OK? In other words, is it my television, and not my laptop, that is the problem? Surely the GPU works just as hard to play the video regardless of what the display output is?
So I was still confused about whether or not the SXPS Samsung SSDs would be able to accept the future firmware update (and thus be able to support TRIM). I contacted Dell's tech support via email and was told that the drives will support TRIM (though on two occasions he called it TRIP). I am not inspired by his lack of knowledge of the proper term and don't think that he is right. If I save the email, would it help my case of getting an SSD from Dell in the future that does actually support TRIM?
I got my dv6500t with an 8400GS card to output to my 1080p HDTV, but the problem is that the image displayed on the TV is only 90% of the full image shown on my laptop.
This is via HDMI.
I'd like the images on both laptop and HDTV to match perfectly and I've tried every combination of resolution between the TV and the laptop yet I always get the same result: a fully displayed desktop on my laptop and a cropped desktop on my HDTV.
I have a Studio XPS 16, and whenever I plug it into my TV via HDMI (the TV has a 1080p resolution) there are black bars all around the image, meaning the image is smaller than the screen. I've set the resolution of the TV (from Catalyst Control Center) to 1920x1080 and it still won't do full screen...
I decided to try playing some 720p video on my Sanyo 42" LCD HDTV (native res 1366x768). I assumed it would be as easy as just connecting an HDMI cable and setting my m1530 to clone the desktop. I couldn't have been more wrong.
After doing some research it appears that LCD HDTVs aren't as flexible as regular LCD monitors with respect to HDMI input. No matter what I do, the screen always appears cropped around the edges on my TV.
What I've tried so far:
1) Setting a custom 1366x768 resolution in the nVidia control panel.
2) Setting my HDTV as the primary monitor and setting the resolution there.
3) Using Powerstrip (program) in an attempt to force a 1366x768 resolution (not sure if I'm doing this entirely correctly; I'm a little confused over what the correct advanced timings should be).
4) I hear this issue is called "overscan", so I looked through my TV menu for any overscan or zoom/image scaling options but couldn't find any.
I'm using the 179.28 Nvidia driver.
I'm running out of things to try, does anyone know how to make this thing work?
I have heard numerous comments stating that individuals are only getting stereo sound from their HDMI port, but then other comments explaining that yes, indeed, it supports Dolby Digital 5.1 audio, etc.
The HDMI port is version 1.2, which allows for an 8-channel PCM audio stream.
I have a receiver that allows for internal 7.1/5.1 Audio streams (all of the latest codecs can be decoded and processed) with HDMI output/input connections.
Will I be able to send a 5.1 PCM audio stream to my AV receiver and thereby utilize the full range of audio?
I've been using a 40-inch HDTV ever since I received my Z about a month ago. I had to move recently, and now when I attempt to use the display it looks very strange, almost as if someone turned the digital vibrance or contrast setting up all the way. The text looks strange, and it is overall unpleasant to look at. I checked all of the settings and everything is the same as before, and I tried varying the resolution and color settings. I can't figure it out!
I can take a picture if that would help explain better. I also checked the TV's settings and I do not believe that is the issue.
I also tried reinstalling the NVIDIA drivers from Sony's website, to no avail.
I have a Sony Vaio FW laptop with an HDMI output. When I had Vista, I could plug my HDMI output into my TV's HDMI input and immediately see my computer display on the TV. As soon as I upgraded to Win 7 Ultimate 64-bit, nothing happens when I plug in the HDMI cable; I am unable to get my TV and laptop to notice each other, and Win 7 doesn't seem to notice at all.
I got Windows 7 Ultimate installed on my laptop (did a clean install). I am trying to get output from my HDMI port to the TV, but when I plug the cable in, my laptop screen goes black and there is nothing showing on the TV.
When I looked at the "Overview" of the Inspiron 1720 at Dell's site, I noticed that there were no DVI/HDMI video output ports. Can someone please confirm this? Is the only monitor output a VGA port? If so, I think this royally sucks! My e1705 comes with both VGA & DVI...
I'm having a problem with HDMI output from my dv7-3079wm to my 37" LG 1080p LCD. When I view the input on my TV, there is a black border around the image (like it is overscanned?). This happens at most resolutions, except for 1360x768, and I would like to watch Blu-rays in 1080p. So I'm not sure why it is doing this. I also tried this on a 42" Hitachi 720p TV with the same result (at all resolutions), so I don't think that it is the TV. I couldn't find any relevant settings in Catalyst Control Center either. My graphics card is an ATI Mobility Radeon 4650.
I can silence the noise by lowering the volume on the Stereo Mix High Definition Audio Codec. However, my AMD HDMI Output is greyed out and static, or, as I understand from other complainers, NOT PLUGGED IN. What can I do to solve this problem? It is a problem because when I open the Stereo Mix High Definition Audio Codec, a shrieking noise leaves me deaf, and it eats the volume all up.
My system: Windows 7 HP laptop - A6-3400M APU with Quad-Core Radeon HD Graphics - 6.0 GB RAM - AMD Radeon HD 6520