To start, I should remind everyone that I'm not a Mac user. I have never owned a Mac or used OS X properly before today. As such, my opinions come from someone with plenty of PC knowledge and experience, but who is a total Mac n00b.
When I first walked over to the MacBooks I was quite taken aback. The build quality is out of this world. In fact, I walked straight over to the latest PC laptops to compare and immediately thought "manufacturing FAIL". To use the typical car analogy: the MacBooks are Italian sports cars, and PCs are 30-year-old farm tractors.
I was really only interested in the MacBook when I first went to the Apple stand, but came away intending to buy a MacBook Pro. Let me explain:
The MacBook's build quality is awesome. It's solid: no creaking plastic, no flex. Completely and utterly solid. Awesome.
I have a Dell Precision M90 whose original video card died.
I managed to get an FX 3600 for a good price.
The card does fit in the notebook with a small case mod (break off one plastic tab), and you must install the LED cable first, as a few pins actually sit underneath the card.
The card itself works great; I installed the drivers for the M6300 and it works just fine.
The only problem is that the M90 BIOS will not recognize the card; it just shows up as an unknown video card in the system BIOS. Because of this, the LCD dimming controls do not function and the LCD sits at only about 60% brightness, which is a bit of a pain.
On external monitors, however, it works just fine.
Does anyone know a way to get the M90 system BIOS to see the card properly? I'd sure like to be able to turn up the brightness on my LCD.
I have an 8600GT with the 176.44 driver and got 4802 marks in 3DMark06. I was wondering whether that is better than other drivers such as 175.97. (I was using the Zalman NC-1000 laptop cooler while running 3DMark06.)
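For what it's worth, 3DMark06 scores drift a little between runs even on the same driver, so a small gap may just be noise. Here's a quick sanity check; the 2% run-to-run variance is my own assumption, not a measured figure:

```python
def meaningful_difference(score_a, score_b, run_variance=0.02):
    """Return True if the gap between two benchmark scores exceeds
    the assumed run-to-run variance (default 2%, an assumption)."""
    baseline = max(score_a, score_b)
    return abs(score_a - score_b) / baseline > run_variance

# Example: 4802 on 176.44 vs a hypothetical 4750 on 175.97
print(meaningful_difference(4802, 4750))  # about 1% apart -> False
```

In other words, unless another driver moves the score by more than the noise floor, the comparison doesn't tell you much.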
Even though this computer is hot to the touch and my wife is having a few issues with it, I figured I'd put it through its paces for you guys. I'd just like to mention beforehand that we have the WLED screen (1366x768).
Out of the box, running Dell's 9.4 drivers, it scored 5060 in 3DMark06. I ended up having to downgrade to Dell's 9.2 drivers because I was unable to overclock with the 9.4s. Even with 9.2, though, the only program that would work is AMD GPU Clock Tool. I was able to test it up to 749 MHz core / 945 MHz memory, and it works like a charm: no artifacting or anything. A single run through 3DMark06 brought the temperature up to 76C. After 45 minutes of looping the test, the temps peaked and leveled out at 83C. It idles around 60-63C at these clocks (stock idle is about 8C cooler).
I have yet to test whether it's stable at higher clocks than these. I feel as though it still has a small amount of room left, maybe another 20 MHz on either, but I haven't had time to test that yet. I'm probably going to try undervolting the CPU to see if that brings the temps down any. I'll report back with the results from both the CPU and GPU when I get a chance to do it all.
This is a spiffy little computer; I'm sure that once we resolve the screen and power supply issues with Dell, my wife will be happy with the unit.
If you own an Intel X25-M SSD, please post your benchmark score(s). I am having an issue with my 4K reads/writes; they seem to be 50% lower than they should be. Nevertheless, post your benchmarks. I want to know whether any other Dell owners are experiencing the same problem as I am.
I would have posted this in the Hardware SSD thread, but that thread is just out of control. I am mostly interested in the random 4K reads/writes, which are probably the most important attribute in everyday computing for regular users like us.
On a side note, can someone tell me why my 4K reads/writes are so much slower than they are supposed to be?
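If you want a tool-independent second opinion on the 4K numbers, a few lines of Python can time random 4 KiB reads against a scratch file. This is just a rough sketch under my own assumptions: without direct I/O the OS cache will inflate the result, so treat it as a relative check between machines, not an absolute benchmark.

```python
import os
import random
import time

def random_4k_read_mbps(path, duration=1.0, block=4096):
    """Time random 4 KiB reads from an existing file and return MB/s.
    Rough sketch only: the OS page cache inflates the numbers unless
    the file is much larger than RAM or direct I/O is used."""
    size = os.path.getsize(path)
    blocks = size // block
    done = 0
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        while time.perf_counter() - start < duration:
            os.lseek(fd, random.randrange(blocks) * block, os.SEEK_SET)
            os.read(fd, block)
            done += 1
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return done * block / elapsed / 1_000_000

# Create a small scratch file and measure (cached, so optimistic):
with open("scratch.bin", "wb") as f:
    f.write(os.urandom(16 * 1024 * 1024))
print(f"{random_4k_read_mbps('scratch.bin'):.1f} MB/s (cached)")
```

On a cached run you'll see absurdly high numbers; the interesting comparison is against the same script on a drive that behaves normally.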
After spending a great deal of time reading through various forums I've arrived here in hopes of getting Fallout 3 to run properly on my M6300.
Whenever I run the game, it winds up with many graphical bugs (improper texture mapping, etc.).
I believe the issue lies in the fact that the M6300 uses a Quadro card, which isn't optimized for gaming, though I would appreciate being able to play the game at any level, even the lowest possible settings.
I'm selling my Inspiron 9400 on eBay and considering buying an M6300 instead. It seems to be a powerful laptop and a worthy upgrade.
1. Is it worth buying 3 GB of RAM instead of 2 GB? I mean, is there ANY performance boost (for gaming)? Or are upcoming titles likely to make use of more than 2 GB? I will be using Win XP, btw...
2. Is the C2D T8300 (2.4 GHz, 3 MB L2 cache, 800 MHz FSB) a good choice? Or is the T9300 (2.5 GHz) THAT much better (in terms of gaming performance)?
3. Is the 1440x900 non-glossy screen any good? Maybe someone can compare it to the one I used before (sharp/1920x1200/glossy)?
4. My configuration of choice will cost around 1550 euros (without VAT) and be something like:
- T8300 (2.4 GHz) - FX 3600 - 200 GB HDD (7200 rpm) - 1440x900 non-glossy - 3-year warranty
Is there any 15"/17" laptop from Dell with similar graphics performance at a similar (or cheaper) price?
5. Is it really worth paying 350 euros more to get the FX 3600 instead of the FX 1600?
As far as I can see, the only option with both docking and decent graphics performance is the M90/M6300 (regardless of manufacturer). Has anyone heard news of an 8800-equivalent card for the M6300? I am due to replace my M90 and find docking very convenient. Why doesn't the 5793 or M15x come with a docking option?
I recently saw some cheap mobile Penryn processors on eBay, so I'm thinking about upgrading my CPU in the future (they get cheaper from day to day).
But I have the following questions before considering it further:
1. Does changing the CPU on your own void your Dell warranty in any way?
2. Is it difficult to change (I've never opened my laptop before)?
3. Can I use ALL of the processors listed at the bottom of this post? (The X9000 is 44 W instead of the others' 35 W.)
4. Is it just plug & play: stick the new CPU inside and it works?
How do I interpret these benchmarks, and what do they mean? Why do the first and second benchmark graphs differ so much? HP Pavilion G60-120US: AMD dual-core 2 GHz, 3 GB RAM, 250 GB HD, GeForce 8200M, Vista Home Premium. Running Firefox with multiple tabs open, magicJack (internet phone), and antivirus/antispyware software.
What's a good program on OS X for benchmarking an SSD? Also, does anyone know of a program similar to Intel Matrix Storage Manager that will let me view all of an SSD's properties (make/model/firmware/etc.) in OS X?
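Lacking a dedicated tool, the built-in command line gets you part of the way. The sketch below is a rough starting point under my own assumptions: dd gives only crude, cache-inflated sequential numbers, and the diskutil / system_profiler lines are macOS-only (guarded so they're skipped where those tools don't exist).

```shell
# Crude sequential write, then read, timed by dd; a plain byte-count
# block size works on both BSD/macOS and GNU dd. dd prints throughput
# statistics on stderr when it finishes.
dd if=/dev/zero of=/tmp/ssdtest.bin bs=1048576 count=64
dd if=/tmp/ssdtest.bin of=/dev/null bs=1048576
rm /tmp/ssdtest.bin

# macOS-only drive identification: model, serial, firmware revision.
command -v diskutil >/dev/null && diskutil info disk0 || true
command -v system_profiler >/dev/null && system_profiler SPSerialATADataType || true
```

For a real run you'd want the read to hit a file larger than RAM (or reboot between the write and the read) so the cache doesn't hand you fantasy numbers.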
I'm unable to run Boot Camp, or any XP/Vista/7 install, on my friend's computer.
I'm not looking to debate GeForce vs. Quadro performance, nor do I want people to tell me to wait for the 9M-series cards; just a couple of quick questions.
Does the M1730 have a docking port? From what I can find, I'm thinking it does not. For that matter, does the M6300 have one? I think it does...
Can an 8800 GTX from an M1730 be put into an M6300? If not, I'll be fine with the 3600M, but I prefer GeForce cards.
If yes, where can you get an 8800 GTX and how much do they cost? I've seen people quote anything from over $2000 down to $700, which is quite a difference.
While I'm at it, are there any other huge issues I may have overlooked? I'm thinking of things like the widespread LCD light-leakage problems everyone experienced on the M170/9300 series, which even spilled over into the next-gen stuff, iirc.
I have installed an FX 2500M (rev A02) in an M6300 (system board with A10 BIOS). The card works, but its clock is fixed at 100 MHz (100 MHz core + 100 MHz memory).
The FX 2500M doesn't switch to 3D Low mode (200/300) or 3D Performance mode (500/600); it always remains in 2D mode (100/100).
Of course, the same card works correctly on an M90 system board.
I have tried many drivers, both Dell versions and generic versions from laptopvideo2go, with the same result (always fixed at 100 MHz).
I have tried RivaTuner to force the clock switch: same result, no switch.
I have even installed a new OS (Vista 64), and with it I see a strange situation.
With Vista 64 (no SP1) and the original driver supplied by MS (based, I suppose, on ForceWare 96.xx), the card seems to work at high clocks.
No information tools can be used with the MS driver, so I can't see which clocks are active, but 3D performance (games and benchmarks) is close to what I achieved on the M90.
But once I update the driver by installing a recent version supplied by Nvidia (Dell or generic, it's the same), the card locks to 100 MHz even under Vista 64.
Unfortunately, with the MS driver there are many graphics errors, so it isn't really usable with 3D applications (i.e., games); furthermore, I want to use the card under XP32 (Vista 64 was installed only for testing).
Well, I can usually fix my laptop problems, but this one has me very surprised, and I wanted to ask for opinions before doing something drastic (like changing the screen).
Half my laptop screen goes dim. If this were the only symptom, the answer would be obvious (broken screen, replace it), but the problem fixes itself when the laptop switches to battery and slowly comes back when it's connected to power again.
It has virtually the same brightness on battery as when connected, so the only real effect of connecting it is dimming half the screen.
Why would it work on battery and not connected? And why only half the screen?
What tools can we use for benchmarking Apple/Mac systems?
So far, OpenMark is pretty much dead. I use Xbench at the moment, but its results are presented very much in text mode. It would be nice to have something like the PassMark PerformanceTest tool on Windows.