Test Setup

abit Fatality F-I90HD / ASRock 4Core1333-FullHD Testbed
Processor: Intel Pentium (Core 2 based) E2160 - dual core, 1.8GHz, 1MB unified cache, 9x multiplier, 800FSB
CPU Voltage: 1.3250V
Cooling: Scythe Ninja Mini
Power Supply: SeaSonic S-12 II 480W
Memory: OCZ HPC Reaper PC2-6400 (4x1GB)
Memory Settings: 4-4-4-12 (2.00V abit / 2.04V ASRock)
Video Cards: On-board X1250, Gigabyte HD 2600XT, Galaxy 8600GTS HDMI
Video Drivers: AMD 7.8, NVIDIA 163.44
Hard Drives: Seagate DB35.3 7200RPM 750GB SATA 3Gb/s 16MB buffer; Western Digital 74GB 10,000RPM SATA 16MB buffer (2 for RAID testing)
Optical Drives: Plextor PX-B900A, Toshiba SD-H802A, Pioneer BDC-S02BK
Audio Cards: Realtek ALC-888, ASUS Xonar D2
Audio Drivers: Realtek 1.73, ASUS 5.12.01.0008.17.19
Audio Test Equipment: Swans M10 (2.1), Swans D1080 (2.0), Acculine A2 (5.1), Onkyo TX-SR605 A/V receiver
Case: Zalman HD160XT
BIOS: abit 1.4, ASRock 1.30C
Operating System: Windows Vista Home Premium 32-bit

MSI G33M Testbed
Processor: Intel Pentium (Core 2 based) E2160 - dual core, 1.8GHz, 1MB unified cache, 9x multiplier, 800FSB
CPU Voltage: 1.3250V
Cooling: Scythe Ninja Mini
Power Supply: SeaSonic S-12 II 480W
Memory: OCZ HPC Reaper PC2-6400 (4x1GB)
Memory Settings: 5-5-5-12 (2.0V)
Video Cards: On-board GMA 3100, Gigabyte HD 2600XT, Galaxy 8600GTS HDMI
Video Drivers: Intel 15.4, AMD 7.8, NVIDIA 163.44
Hard Drive: Seagate DB35.3 7200RPM 750GB SATA 3Gb/s 16MB buffer
Optical Drives: Plextor PX-B900A, Toshiba SD-H802A, Pioneer BDC-S02BK
Audio Cards: Realtek ALC-888, ASUS Xonar D2
Audio Drivers: Realtek 1.73, ASUS 5.12.01.0008.17.19
Audio Test Equipment: Swans M10 (2.1), Swans D1080 (2.0), Acculine A2 (5.1), Onkyo TX-SR605 A/V receiver
Case: Zalman HD160XT
BIOS: v1.00
Operating System: Windows Vista Home Premium 32-bit

We selected the Intel Pentium E2160 (Core 2 based) processor as our main choice for the Intel platform boards, since it offers an excellent price/performance ratio at the low end of the market, where we will concentrate our uATX review efforts. We also switched to Microsoft Vista Home Premium 32-bit as our operating system of choice for this category; after speaking with several of the larger OEMs, we found this is the OS most widely offered to consumers. With memory prices falling rapidly, it was natural to pair Vista Home Premium with a 4GB memory configuration. Even though 32-bit Vista cannot take advantage of the entire 4GB of memory address space, we found that the additional 1.2GB (on average) of available memory improved performance during multitasking and gaming. We would not recommend anything less than 2GB with Vista Home Premium.

Our hard drive choice is a little out of the norm, but since we will be testing the multimedia capabilities of our boards in an HTPC article, we felt the PVR-designed drive would be a natural fit. Our OCZ memory choice was based on a combination of price and the performance levels required during overclocking testing with the higher-end G33 boards. We also tested each board with a wide variety of budget DDR2-800 memory from several suppliers; these modules will be listed in the compatibility charts at the end of this article series. Our boards were set to allocate 256MB of memory to the IGP solution. The ASRock board supports up to 512MB of shared memory, but since all of the test results were identical, we left the setting at 256MB.

We will also present GPU comparison testing using external video cards from AMD and NVIDIA. Our results today include video and gaming performance with the AMD HD 2600 XT from Gigabyte and the NVIDIA 8600 GTS from Galaxy. All other components in our test configurations are identical, with the boards set up in their default configurations except for memory settings, which were optimized to ensure maximum throughput on each board. We will cover image quality analysis, audio, installation, and peripheral components in detail in separate articles.

Our choice of software applications to test is based on programs that enjoy widespread use and produce repeatable, consistent results during testing. Microsoft Vista has thrown a monkey wrench into testing: the operating system aggressively and constantly optimizes application loading and retrieval from memory and the storage system, which presents some interesting obstacles. This, along with the lack of driver maturity, will continue to complicate benchmark selection in the near future.

Our normal process is to change the power settings to High Performance, delete the contents of the prefetch folder, and then reboot after each benchmark run. This is a lengthy process, but it results in consistency over the course of benchmark testing. All applications are run with administrator privileges.
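For readers who want to replicate this between-run cleanup, it can be scripted. The sketch below is a minimal illustration (not our actual tooling), assuming the default Vista prefetch location and the stock High Performance power scheme GUID; run it elevated, then reboot before the next benchmark.

```python
import os
import subprocess

# Default prefetch location on Windows Vista (assumption; adjust if Windows
# is installed elsewhere).
PREFETCH_DIR = r"C:\Windows\Prefetch"

def clear_prefetch(prefetch_dir=PREFETCH_DIR):
    """Delete prefetch files so each benchmark run starts from a cold state.

    Returns the list of file names removed.
    """
    removed = []
    for name in os.listdir(prefetch_dir):
        path = os.path.join(prefetch_dir, name)
        if os.path.isfile(path):
            os.remove(path)
            removed.append(name)
    return removed

def set_high_performance():
    """Activate the stock High Performance power scheme via powercfg."""
    # GUID of the built-in High Performance scheme on Vista and later.
    subprocess.run(
        ["powercfg", "-setactive", "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"],
        check=True,
    )

if __name__ == "__main__":
    set_high_performance()
    print("Removed %d prefetch files" % len(clear_prefetch()))
```

After this, a reboot (e.g. `shutdown /r /t 0`) completes the reset before the next benchmark pass.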

Comments (22)

  • Sargo - Tuesday, August 28, 2007 - link

    Nice review, but there's no X3100 on the Intel G33. The GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007 - link

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and the IGP is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all have only SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and the GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe Anandtech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review with the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 and previous 14.29 drivers that had no hardware support would also show how much things have increased.
  • JarredWalton - Tuesday, August 28, 2007 - link

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    quote:

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review. However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers at XP.
  • JarredWalton - Wednesday, August 29, 2007 - link

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    I get 35-45 fps in the single-player demo for the first 5 minutes at 800x600 minimum settings. Didn't check more, as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007 - link

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can provide equal testing as you have done on the Pt.1 article.

    Like:
    -What version of BF2 used
    -What demos are supposed to be used
    -How do I load up the demos
    -etc
  • R101 - Tuesday, August 28, 2007 - link

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007 - link

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007 - link

    quote:

    This is not to say any of the AMD and NVIDIA IGP solutions are that much better; they are in many ways, but without earnest competition from Intel these solutions do just enough to stay ahead of Intel. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as Bioshock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
