Photos Part 3: The Card in Detail
First up, off with the cooler!
Everything is making good contact, and I'm very happy to see the MOSFETs getting some serious cooling. Even the MOSFET drivers get active cooling.
Almost a reference card, but not quite. The reference design uses smaller inductors for the memory power, and it doesn't extend beyond the white line on the right side there. It looks like Gigabyte used that extra area mostly for R&D bits.
What you’re supposed to do with all those is beyond me. The stuff on the right controls the fan speed.
For core power we have this controller. Four phases, with a datasheet available; this thing is begging for a voltmod.
Here on the other side of the PCB we have the four core phases, each with one high side MOSFET and two low side MOSFETs. The memory power is on the right; the small square chip ringed with pins and SMD bits is the memory power controller. It has two built in drivers that run the two 2-in-1 MOSFET packages to the north there. Each of those packages has two MOSFETs in it, making for a very compact power stage.
Here’s an angled shot that just barely shows the uP1605 controller chip’s markings. No datasheet is available for this one. None of these chips have very clear markings.
Eight of these GDDR5 chips give us the 2GB of RAM. They're rated for the stock speed of 1500MHz (6000MHz effective). I'm hoping they'll overclock a bit.
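As a quick sanity check on those numbers: GDDR5 transfers four data words per memory clock, and eight 2Gbit chips make up the 2GB. A minimal sketch of the arithmetic (the 4x multiplier is standard GDDR5 signalling, not something read off this card):

```python
# GDDR5 is quad-pumped: effective data rate = 4x the base memory clock.
base_clock_mhz = 1500
effective_mhz = base_clock_mhz * 4
print(effective_mhz)  # 6000

# Eight 2Gbit (256MB) chips add up to the card's 2GB.
chips = 8
capacity_mb = chips * 256
print(capacity_mb)  # 2048
```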
Lastly we have the heart of the card, the GK104 core.
It is worth noting that this is the exact same core that the GTX670 uses; the only meaningful differences are the default core speed and the memory bus. The GTX670 has a 256bit bus compared to the GTX660's 192bit. Nvidia seems to have arranged the memory quite strangely to force the 192bit bus with 2GB of ram. Four chips each have their own 32bit channel, while the other four chips are split into two pairs that each share a 32bit channel. This is a pretty artificial limitation, as the PCB still has two more 32bit channels that could be populated for less than a dollar worth of SMD components.
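The asymmetric layout described above can be sketched as follows. The chip-to-channel assignments here are an illustration of the scheme, not a trace-level map of this particular PCB:

```python
# Hypothetical sketch of the 192bit / 2GB arrangement: six 32bit
# channels, four chips with a dedicated channel, and four chips
# sharing the remaining two channels in pairs.
channels = {
    "ch0": ["chip0"],           # dedicated 32bit channel
    "ch1": ["chip1"],
    "ch2": ["chip2"],
    "ch3": ["chip3"],
    "ch4": ["chip4", "chip5"],  # two chips share this channel
    "ch5": ["chip6", "chip7"],  # and this one
}

bus_width = 32 * len(channels)                        # 192 bits
total_chips = sum(len(c) for c in channels.values())  # 8 chips
capacity_mb = total_chips * 256                       # 2048 MB with 2Gbit chips

print(bus_width, total_chips, capacity_mb)  # 192 8 2048

# Populating the two unused channels the PCB provides would make
# eight dedicated channels: a 256bit bus, as on the GTX670.
full_bus_width = 32 * 8  # 256
```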
On the other hand if Nvidia wants to save $1 in parts and cut $100 off the price I am certainly not going to argue!