Dell :: Does The Inspiron 1720 Come With A DVI Or HDMI Video Output Option
Aug 11, 2007
When I looked at the "Overview" on the Inspiron 1720 at Dell's site I noticed that there were no DVI/HDMI video output ports. Can someone please confirm this? Is the only monitor output a VGA port? If so I think this royally sucks! My e1705 comes with both VGA & DVI...
I've got an Acer V5-571P-6866 that won't output video to either the LCD or HDMI. It powers on and doesn't seem to have any issue booting. My first thought was the motherboard, but I've tried a second one and it still has the same issue.
HP ENVY 15-j165 HDMI output problem - TV recognised in video and audio setup BUT external screen blank. It even mentions the SONY-TV in the output settings (audio and video) but there is nothing on my TV, just a blank screen. I have tried re-installing drivers, getting frustrated.
I have games such as COD4, Crysis, and Assassin's Creed. What are the best drivers I can get for this laptop in order to play these games with good quality? I've done some reading and saw that it's possible to overclock the GPU.
Is this safe? Is it really worth it? I'd like to try to increase the quality. I've never really played Crysis or Assassin's Creed because they were a bit choppy.
I have since installed the T9300 processor and haven't tried Crysis since, as I also replaced my HDD with a 7200 RPM drive vs the 5400 RPM one that was in it.
I would like to upgrade my video card, if it is possible and if it is worth it. My 1720 has the Mobile Intel 965 Express chipset family. I've seen sigs (dr650se and someformofhuman) stating that they have the NVIDIA 8600M GT and was wondering if this is compatible with my 1720.
Also, I found out through this forum that I can upgrade my CPU (which is the measly T5450 at 1.66 GHz) to the beastly T9300, and add a 2nd hard drive (which is my first step in upgrading this Dell because I only have a 160 GB drive).
I was looking for a new laptop until I came across this forum and found out the possibilities of my Dell. I decided to stick with it and just upgrade! Thanks for the insight, and thanks in advance for all the help that I will be asking for!
I have a Studio XPS 16, and whenever I plug it via HDMI into my TV (which has a 1080p resolution) there are black bars all around the image, meaning the image is smaller than the screen. I've set the resolution of the TV (from Catalyst Control Center) to 1920x1080 and it still won't do full screen...
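For what it's worth, black bars on all four sides usually mean the driver is underscanning the image rather than the TV cropping it; ATI/AMD Catalyst drivers have historically defaulted HDMI outputs to a scaled-down picture. A rough sketch of the arithmetic, assuming an illustrative 10% underscan (the exact default varies by driver version):

```python
# Sketch: how large the picture actually is when the driver underscans.
# The 10% figure below is an assumption for illustration, not a spec.

def underscanned_size(width, height, underscan_pct):
    """Return the on-screen image size after shrinking by underscan_pct."""
    scale = 1 - underscan_pct / 100
    return round(width * scale), round(height * scale)

w, h = underscanned_size(1920, 1080, 10)
print(w, h)                # the image the TV actually shows: 1728 972
print(1920 - w, 1080 - h)  # total width/height taken up by black bars
```

If this is the cause, dragging the Catalyst scaling slider to 0% (no underscan) should remove the bars.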
I decided to try playing some 720p video on my Sanyo 42" LCD HDTV (native res 1366x768). I assumed it would be as easy as just connecting an HDMI cable and setting my m1530 to clone desktop.. I couldn't have been more wrong.
After doing some research it appears that LCD HDTVs aren't as flexible as regular LCD monitors with respect to HDMI input. No matter what I do, the screen always appears cropped around the edges on my TV.
What I've tried so far: 1) Setting a custom 1366x768 resolution in the nVidia control panel. 2) Setting my HDTV as the primary monitor and then setting the res. 3) Using Powerstrip (program) in an attempt to force a 1366x768 res (not sure if I'm doing this entirely correctly; I'm a little confused over what the correct advanced timings should be). 4) I hear this issue is called "overscan", so I looked through my TV's menu for any overscan or zoom/image scaling options but couldn't find any.
I'm using the 179.28 Nvidia driver.
I'm running out of things to try, does anyone know how to make this thing work?
I have heard numerous comments stating that individuals are only getting stereo sound from their HDMI port, but then other comments explaining that yes, indeed, it supports Dolby Digital, 5.1 audio, etc.
The HDMI port is 1.2 which allows for a PCM 8 audio stream signal.
I have a receiver that allows for internal 7.1/5.1 Audio streams (all of the latest codecs can be decoded and processed) with HDMI output/input connections.
Will I be able to send a 5.1 PCM audio stream to my AV receiver and thereby utilize the full range of audio?
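On the bandwidth side at least, a 5.1 LPCM stream fits comfortably within the 8-channel PCM capability the post attributes to HDMI 1.2. A back-of-the-envelope check, using nominal figures (typical 48 kHz/24-bit movie audio against an 8-channel 192 kHz/24-bit ceiling):

```python
# Back-of-the-envelope check that a 5.1 LPCM stream fits easily inside
# HDMI's 8-channel PCM support. Figures are nominal, for illustration.

def pcm_bitrate(channels, sample_rate_hz, bits_per_sample):
    """Raw bitrate of an uncompressed PCM stream in bits per second."""
    return channels * sample_rate_hz * bits_per_sample

stream = pcm_bitrate(6, 48_000, 24)    # typical 5.1 movie soundtrack
ceiling = pcm_bitrate(8, 192_000, 24)  # 8-channel PCM maximum
print(stream, ceiling, stream < ceiling)
```

Whether a given laptop actually outputs multichannel PCM depends on its audio driver exposing it, which is what most of the conflicting forum reports come down to.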
I've been using a 40-inch HDTV ever since I received my Z about a month ago. I had to move recently, and now when I attempt to use the display it looks very strange, almost as if someone turned the digital vibrance or contrast setting up all the way. The text looks strange, and it is overall unpleasant to look at. I checked all of the settings and everything is the same as before, and tried varying the resolution and color settings. I can't figure it out!
I can take a picture if that would help explain better. I also checked the TV's settings and I do not believe that is the issue.
I also tried reinstalling the NVIDIA drivers from Sony's website, to no avail.
I have a Sony Vaio FW laptop with an HDMI output. When I had Vista I could plug my HDMI output into my TV's HDMI input and immediately see my computer display on the TV. As soon as I upgraded to Win 7 Ultimate 64-bit, nothing happens when I plug in the HDMI cable; I am unable to get my TV and laptop to notice each other, and Win 7 doesn't seem to notice at all.
I got Windows 7 Ultimate installed on my laptop (did a clean install). I am trying to get output from my HDMI port to the TV, but when I plug the cable in, my laptop screen goes black and there is nothing showing on the TV.
I'm planning on buying a new laptop soon and have been going through some of the threads regarding the HDMI port and I am a bit confused.
I would like to be able to take a 1080p/5.1 BD or .mkv on my laptop and connect it to my AV receiver via HDMI which then outputs 1080p/5.1 to my HDTV/speakers. I have no idea what minimum video card/sound card/processor is required for this.
I'm having a problem with HDMI output from my dv7-3079wm to my 37" LG 1080p LCD. When I view the input on my TV, there is a black border around the image (like it is overscanned?). This happens at most resolutions, except for 1360x768, and I would like to watch Blu-rays in 1080p, so I'm not sure why it is doing this. I also tried this on a 42" Hitachi 720p TV with the same result (all resolutions), so I don't think that it is the TV. I couldn't find any settings in Catalyst Control Center either. My graphics card is an ATI Mobility Radeon 4650.
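A side note on why 1360x768 shows up in the driver instead of the 1366x768 native resolution of many 720p panels: a commonly cited explanation is that many GPUs require the horizontal resolution to be a multiple of 8, and 1366 is not. A quick sketch of that arithmetic (offered as a common explanation, not a claim about this exact GPU):

```python
# Why drivers often list 1360x768 rather than a panel's native 1366x768:
# a frequently cited constraint is that horizontal resolutions must be a
# multiple of 8, and 1360 is the nearest such width below 1366.
# (Stated here as an assumption, not a claim about the Radeon 4650.)

def nearest_mod8_width(width):
    """Largest width <= the given one that is divisible by 8."""
    return width - width % 8

print(1366 % 8)                  # non-zero, so 1366 itself is unusable
print(nearest_mod8_width(1366))  # the width the driver offers instead
```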
I can silence the noise by lowering the volume on the Stereo Mix High Definition Audio Codec. However, my AMD HDMI Output is greyed out and static, or, as I understand from other complainers, NOT PLUGGED IN. What can I do to solve this problem? It is a problem because when I open the Stereo Mix High Definition Audio Codec, a shrieking noise leaves me deaf, and it eats the volume all up.
My system: Windows 7 HP laptop - A6-3400M APU with quad-core Radeon HD graphics - 6.0 GB RAM - AMD Radeon HD 6520
I have an HP Notebook 2000 with Windows 8.1. At some point, the HDMI output to the monitor stopped sending an audio signal. The video signal is fine.
It is labeled in Device Manager: 1-STD HDMI TV (AMD High Definition Audio). The monitor and HDMI cable work fine with other computers. I have disabled/enabled the HDMI device, and also updated drivers (Windows says my drivers are up to date).
My U330P is running Windows 7 with the latest drivers for both the Intel HD video module and the Realtek audio chip. I would like to connect my machine to a TV via HDMI and have it act both as the display and as the audio playback device (i.e. the TV speakers should blare, not the ones in my IdeaPad). The thing is, I don't have said TV yet, and if what I want to do is not possible, I will not bother buying it. I already looked around in Control Panel -> Sound -> Playback to find out whether there's an option to set the HDMI port as a playback device, but found nothing. I also have the playback devices menu displaying disconnected and deactivated devices, but the only thing showing is the built-in stereo speakers. The listings in my Device Manager don't suggest something like a "digital sound whatnot" either. So my question is, in general, whether the audio signal can be rerouted to the HDMI output connector on a U330P.