Overclockaholics Forums

Overclockaholics Forums (http://www.overclockaholics.com/forums/index.php)
-   General Discussion (http://www.overclockaholics.com/forums/forumdisplay.php?f=2)
-   -   classified mobo (http://www.overclockaholics.com/forums/showthread.php?t=161)

skarface 03-18-2009 05:25 AM

classified mobo
 
Did everyone take a glance at the new mobo by EVGA? It's pretty interesting, they have a ginormous NB heatsink on it. The price is very high tho, EVGA has it priced at $459.99. If the wife doesn't kill me I might try to switch to i7 and the works! I just hope the 1 grand and some to switch will be worth it.

Chuchnit 03-18-2009 05:30 AM

Honestly bro, you'd be better off with the regular EVGA X58 or your choice of another brand. With you running 295s, it has been stated that you will take a performance hit in games because of the increased latency with the NF200 chip.

Kal-EL 03-18-2009 05:33 AM

I dunno 'bout the performance hit there chuckos, seems like you'll actually get 1 fps better than non-nForce chip mobos.

Far as the standard X58, well, that mobo had a host of problems, and you can tell from the step-up queue that X58 standard owners weren't too happy. I'd wait and see what kinda numbers and complaints churn outta the X58 Classified before plopping down the big dough. If things look great and you can afford it, then launch away :D

skarface 03-18-2009 05:37 AM

Quote:

Originally Posted by Chuchnit (Post 691)
Honestly bro, you'd be better off with the regular EVGA X58 or your choice of another brand. With you running 295s, it has been stated that you will take a performance hit in games because of the increased latency with the NF200 chip.

So go with the standard X58? I like EVGA, but I really like that X58 Extreme Gigabyte board. Know what's funny, I just spent like 100 bucks on more water-cooling stuff for my 780i FTW board. Damn technology, they should slow down some, I don't have the bank account to rebuild every 6 months!

Chuchnit 03-18-2009 05:37 AM

Honestly supes, I haven't even looked at a review of the board yet. I'm kinda going off how the P6T6 Revolution did and what Andre Yang and even Shamino stated themselves. I do remember seeing Andre and Sham say that you will get a small hit if running Quad SLI/XFIRE. For Tri SLI/XFIRE it's supposed to give the advantage. :confused:

skarface 03-18-2009 05:39 AM

Quote:

Originally Posted by Chuchnit (Post 695)
Honestly supes, I haven't even looked at a review of the board yet. I'm kinda going off how the P6T6 Revolution did and what Andre Yang and even Shamino stated themselves. I do remember seeing Andre and Sham say that you will get a small hit if running Quad SLI/XFIRE. For Tri SLI/XFIRE it's supposed to give the advantage. :confused:

More advantage with 3 GPUs vs 4? So are the vids outpacing the technology that's supposed to back them?

Kal-EL 03-18-2009 05:45 AM

Quote:

Originally Posted by skarface (Post 697)
More advantage with 3 GPUs vs 4? So are the vids outpacing the technology that's supposed to back them?

Quad SLI has been a flop since the first iteration back in the 7-series GPUs, I think it was. It's a driver problem that they've never really solved.

Gotta remember how SLI works: SLI splits the on-screen duties in half, tri-SLI splits it three ways, and quad SLI, well, four ways. That's a lot of parallel synchronized processing going on. I'm not an expert, but I'd reckon this is where the driver struggles with its duties.

Honestly, the performance advantages between SLI and tri-SLI vary from gen to gen. It all depends on whether you're overclocking and how much. For gaming, it depends heavily on the game playing nice with the drivers as well.
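For what it's worth, the frame-splitting described above can be sketched as a toy in Python. This is a rough illustration of round-robin alternate-frame rendering (one common SLI mode), not NVIDIA's actual driver logic; the function name is made up for the example:

```python
def assign_frames(num_frames, num_gpus):
    """Round-robin AFR: frame i is rendered by GPU (i % num_gpus)."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

# With quad SLI, four frames are in flight at once, so the driver has
# four times as much synchronization work to present frames in order.
# Any stall on one GPU delays the whole presentation queue.
for gpus in (2, 3, 4):
    print(gpus, "GPUs:", assign_frames(8, gpus))
```

More in-flight frames means more driver-side bookkeeping, which lines up with the drivers being the weak link.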

skarface 03-18-2009 05:49 AM

Quote:

Originally Posted by Kal-EL (Post 699)
Quad SLI has been a flop since the first iteration back in the 7-series GPUs, I think it was. It's a driver problem that they've never really solved.

Gotta remember how SLI works: SLI splits the on-screen duties in half, tri-SLI splits it three ways, and quad SLI, well, four ways. That's a lot of parallel synchronized processing going on. I'm not an expert, but I'd reckon this is where the driver struggles with its duties.

Honestly, the performance advantages between SLI and tri-SLI vary from gen to gen. It all depends on whether you're overclocking and how much. For gaming, it depends heavily on the game playing nice with the drivers as well.

Makes me wonder if they should be working on a firmware update vs newer cards and drivers!

Kal-EL 03-18-2009 05:51 AM

Quote:

Originally Posted by skarface (Post 700)
Makes me wonder if they should be working on a firmware update vs newer cards and drivers!

Well, then they would stop making money altogether, as the 8800 Ultras would still be the top performers of the day ;)

Nah, they're gonna stick it to us every 3-6 months like usual and play catch-up with the drivers as they prepare for the next launch.

skarface 03-18-2009 05:58 AM

Quote:

Originally Posted by Kal-EL (Post 702)
Well, then they would stop making money altogether, as the 8800 Ultras would still be the top performers of the day ;)

Nah, they're gonna stick it to us every 3-6 months like usual and play catch-up with the drivers as they prepare for the next launch.

Yeah, I'm beginning to wonder if I made a big mistake switching from the tri 280s to the 295s. So far the only difference is I've lost some frames in some games and picked up in others. But now I can't play Crysis at all! Freakin' Crysis :argh::argh::argh:

Kal-EL 03-18-2009 06:13 AM

Quote:

Originally Posted by skarface (Post 706)
Yeah, I'm beginning to wonder if I made a big mistake switching from the tri 280s to the 295s. So far the only difference is I've lost some frames in some games and picked up in others. But now I can't play Crysis at all! Freakin' Crysis :argh::argh::argh:

Usually the single-card "dual PCB" setups perform really well, but when paired with a second Nvidia X2 card, things get squirrely. I'd overclock one of the cards and see about gaming performance. With my 9800 GX2s I got a nice overclock and better game performance from running just a single card as opposed to quad slo.

You could always sell off the second card to pay for a nice waterblock and WC loop.

Incidentally, the ATI X2 cards don't seem to suffer the quad-SLI blues.

skarface 03-18-2009 06:26 AM

Quote:

Originally Posted by Kal-EL (Post 708)
Usually the single-card "dual PCB" setups perform really well, but when paired with a second Nvidia X2 card, things get squirrely. I'd overclock one of the cards and see about gaming performance. With my 9800 GX2s I got a nice overclock and better game performance from running just a single card as opposed to quad slo.

You could always sell off the second card to pay for a nice waterblock and WC loop.

Incidentally, the ATI X2 cards don't seem to suffer the quad-SLI blues.

I have a spare system for the vid, I just needed the blocks. I have one and am waiting on the second, then I'll put both blocks on the 295s at the same time. The only issue I have is I got the same performance out of Crysis on one card as with both enabled?? :confused:

Kal-EL 03-18-2009 06:31 AM

Quote:

Originally Posted by skarface (Post 709)
I have a spare system for the vid, I just needed the blocks. I have one and am waiting on the second, then I'll put both blocks on the 295s at the same time. The only issue I have is I got the same performance out of Crysis on one card as with both enabled?? :confused:

Are you sure quad SLI was enabled? You would think performance would be a little better, but then again if the drivers suck, you get the big suck. :rofl

DrNip 03-18-2009 06:33 AM

I personally stepped outside the EVGA box and went with the DFI LanParty UT X58-T3EH8, as that was the mobo most used to put up big numbers at HWBot.

skarface 03-18-2009 06:41 AM

Quote:

Originally Posted by Kal-EL (Post 710)
Are you sure quad SLI was enabled? You would think performance would be a little better, but then again if the drivers suck, you get the big suck. :rofl

:rofl:rofl:rofl

hellcamino 03-18-2009 08:22 AM

Quote:

Originally Posted by Kal-EL (Post 710)
Are you sure quad SLI was enabled? You would think performance would be a little better, but then again if the drivers suck, you get the big suck. :rofl


Crysis doesn't scale past 2 GPUs, last I knew. It's an extremely poorly coded game, and since EA was the vendor I never considered buying it.

Kal-EL is right on CrossfireX scaling well to 4 GPUs though; in games that are multi-threaded, such as the COD series, it scales very well with no latency hit as far as I can tell.

Chuchnit 03-18-2009 08:31 AM

Quote:

Originally Posted by hellcamino (Post 719)
Crysis doesn't scale past 2 GPUs, last I knew. It's an extremely poorly coded game, and since EA was the vendor I never considered buying it.

Kal-EL is right on CrossfireX scaling well to 4 GPUs though; in games that are multi-threaded, such as the COD series, it scales very well with no latency hit as far as I can tell.

Well, I only have 2 GPUs, but I am amazed by the scaling I have seen in COD WaW. I haven't tried XP to do a real comparison, but the framerates go far beyond my expectations in Vista.

skarface 03-18-2009 09:07 AM

Quote:

Originally Posted by hellcamino (Post 719)
Crysis doesn't scale past 2 GPUs, last I knew. It's an extremely poorly coded game, and since EA was the vendor I never considered buying it.

Kal-EL is right on CrossfireX scaling well to 4 GPUs though; in games that are multi-threaded, such as the COD series, it scales very well with no latency hit as far as I can tell.

Well, with my tri 280s I was getting 71 fps and high-50s average! Chuchnit and I played a lot with the settings, and anything past 2x AA is a killer for my rig. I'm thinking RAM issues or drivers.

hellcamino 03-18-2009 11:16 AM

Quote:

Originally Posted by skarface (Post 723)
Well, with my tri 280s I was getting 71 fps and high-50s average! Chuchnit and I played a lot with the settings, and anything past 2x AA is a killer for my rig. I'm thinking RAM issues or drivers.

If you're speaking of Crysis, that sounds about normal, but if you're talking about COD 4-5, then something is either screwed up or SLI is just bad in general and not just limited to dual-card SLI. I play at max settings with driver settings at max quality using 1920x1200 res. I limited my fps to 125, but was previously hitting the limit I had set at 250 and averaging around 200 fps.

skarface 03-18-2009 11:45 AM

Quote:

Originally Posted by hellcamino (Post 733)
If you're speaking of Crysis, that sounds about normal, but if you're talking about COD 4-5, then something is either screwed up or SLI is just bad in general and not just limited to dual-card SLI. I play at max settings with driver settings at max quality using 1920x1200 res. I limited my fps to 125, but was previously hitting the limit I had set at 250 and averaging around 200 fps.

Lord no, not COD, lol. We're talking about Crysis. The game is unplayable maxed out on my SLI 295s, but it ran like butter on tri 280s! :confused:

hellcamino 03-18-2009 12:12 PM

Crysis < cat poop

skarface 03-19-2009 07:08 AM

Quote:

Originally Posted by hellcamino (Post 737)
Crysis < cat poop

I like :poop:

Russianhaxor 03-19-2009 07:19 AM

Crysis was a good game; it was AWFULLY coded though. Otherwise, that game made me cream my pants when I first played the beta, and it brought my entire computer to its knees.


All times are GMT -10. The time now is 02:14 PM.


Copyright ©2009 Overclockaholics.com