
DirectX 10

gokuss4

Meh...
I've heard good (and bad) things about Microsoft's new API. What really kinda sucks is that it won't come out for Windows XP, forcing anyone who wants DirectX 10 support in upcoming games such as Crysis, Flight Simulator X (which will ship with DirectX 9 and be ported to DirectX 10 later), Hellgate: London, and Age of Conan (same plan as Flight Simulator X) to get Windows Vista. They made it Vista-only because otherwise they'd have had to cut features from DirectX 10. It looks like the only reasons anyone would get Windows Vista now are DirectX 10 and Halo 2.

Basically, DirectX 10 makes programming easier for game developers and allows more features to be added in. DirectX 10 effects could have been implemented in DX9, but they would have caused a huge performance hit; the geometry shader and whatnot makes things a hell of a lot faster. The bad thing about all this is more power usage and hotter cards.

Damn, DX10 video cards are expected to draw 300 watts on their own, and now PSU makers are making 1 kW power supplies! I wonder how hot your system will get... jeez. I hope they come up with a better way to fix this issue.
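For what it's worth, the panic math can be sketched in a few lines. Every wattage below is a rough assumption for an imaginary DX10-era build, not a measurement:

```python
# Rough power-budget sketch for a hypothetical DX10-era rig.
# All component figures are ballpark assumptions, not measurements.
components = {
    "GPU (rumored DX10 card)": 300,
    "CPU under load": 115,
    "motherboard + RAM": 50,
    "two hard drives": 20,
    "optical drive, fans, misc": 25,
}

total_draw = sum(components.values())  # 510 W peak, by these guesses

# Common rule of thumb: keep peak draw around 80% of the PSU rating
# to leave headroom for capacitor aging and load spikes.
recommended_psu = total_draw / 0.8

print(f"Estimated peak draw:  {total_draw} W")
print(f"Suggested PSU rating: {recommended_psu:.0f} W")
```

Even granting the rumored 300 W card, these guesses land well under 1 kW, so the kilowatt supplies look more like marketing headroom than necessity.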

Discuss

Here's a link to some more info. http://www.extremetech.com/article2/0,1697,1989806,00.asp
 

BlueFalcon7

New member
300 watt GPUs... Well, if there's one thing I can be certain of when I have a 300 watt GPU, it's that heat, power, and cooling will all be issues. I don't see how a cooling system would take over 20 watts, so I'll be waiting to see what I can get with a 300 watt GPU...
 

Hexidecimal

Emutalk Bounty Hunter.
Um... 300 watts isn't really that big of a deal. Most current nVidia GPUs already recommend you have at least a 400 watt supply, because under load the 6800 - 7900 GPUs suck up about 285 watts, depending on what game you're playing. A 15 watt jump is acceptable in my book for the sheer amount of stuff DX10 does, not to mention that the next generation of both Intel and AMD chips (Conroe and K8L) are down to 65nm construction, so they produce less heat and draw less power. It balances out.
 

t0rek

Wilson's Friend
Hexidecimal said:
Um... 300 watts isn't really that big of a deal. Most current nVidia GPUs already recommend you have at least a 400 watt supply, because under load the 6800 - 7900 GPUs suck up about 285 watts, depending on what game you're playing. A 15 watt jump is acceptable in my book for the sheer amount of stuff DX10 does, not to mention that the next generation of both Intel and AMD chips (Conroe and K8L) are down to 65nm construction, so they produce less heat and draw less power. It balances out.

gokuss4 meant 300W drawn by the video card alone, not including the whole rig.
 

BlueFalcon7

New member
My computer was built by some yahoo at Dell, before I knew how to build computers, so I don't know exactly what wattage my power supply is. But on average, how much power do the CPU and GPU use?
 

t0rek

Wilson's Friend
There's no average; it depends on which video card and CPU you're using. For example, that NetBurst P4 you're using drains a lot of power indeed, but your 5200 doesn't draw too much.
 

Hexidecimal

Emutalk Bounty Hunter.
t0rek said:
gokuss4 meant 300W drawn by the video card alone, not including the whole rig.

So did I: at full load, a GeForce 6800GT will pull a good 285 watts of power, and anything after the 6800 line pulls a lot of juice. The fact that the GeForce 8 will pull 300 doesn't bother me, since the next line of processors will use less.
 

TerraPhantm

New member
That can't be right... the X1900s are known to draw the most power, and the highest anyone has recorded for a single card is ~180W of consumption from the card alone.
 

Doomulation

?????????????????????????
The Athlon 64 FX consumes about 115 watts under load, I think, and it's the most power-hungry CPU.
 

TerraPhantm

New member
I was taking into account overclocking and voltage adjustments on those cards, which can drastically affect the power draw.
 

gokuss4

Meh...
TerraPhantm said:
I was taking into account overclocking and voltage adjustments on those cards, which can drastically affect the power draw.

Well, I'm talking about factory defaults. The new cards will draw 300W at factory defaults. Still a huge difference, don't you think?
 

Cyberman

Moderator
Well, I guess a lot of game companies are going to have poor sales if they require Windows Vista to run their DirectX 10 software. Normally game companies are conservative about what they bet their life on. This might be the beginning of a few failures.
As for the cards, ehhh. What is the deal with the power usage? This is not exactly going for 'green' computing by any means.
Although it may let people abuse the card to do some interesting things, such as large amounts of number crunching. Might be interesting to use it for all sorts of things. HMMM.

Cyb
 

gokuss4

Meh...
Cyberman said:
Well, I guess a lot of game companies are going to have poor sales if they require Windows Vista to run their DirectX 10 software. Normally game companies are conservative about what they bet their life on. This might be the beginning of a few failures.
As for the cards, ehhh. What is the deal with the power usage? This is not exactly going for 'green' computing by any means.
Although it may let people abuse the card to do some interesting things, such as large amounts of number crunching. Might be interesting to use it for all sorts of things. HMMM.

Cyb

The problem is that M$ is forcing everyone to do things by their rules and build around whatever they release. Who knows, Apple could catch up during this time. At this moment I'm not really demanding more graphically intensive games; I'm demanding games that have a good fucking storyline and plot. I don't need better graphics, so they can take their time on DirectX 10 and on making DirectX 10 cards. What they need to do is make video cards that demand less power and put out less heat. I'm actually really happy with the CPU race at the moment: Intel is finally getting smart, and I can't wait to see what AMD has in store for the future. For the first time, I'm more interested in the CPU industry than the GPU industry.
 

ShizZy

Emulator Developer
Hexidecimal said:
So did I, at full load, a GeForce 6800GT will pull a good 285watts of power, anything after the 6800 line pulls a lot of juice. The fact that the Geforce 8 will pull 300 doesn't bother me, since the next line of processors will use less.
It's not quite 285, quite a bit less actually. If that were the case, my two 120GB hard drives, mobo, Athlon 64, fans, DVD burner, sound card, etc. would all be running on under 75 watts, assuming I was completely maxing my PSU, which I'm not :) (Though I probably should be running off something a bit bigger than 350.)
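That sanity check is just subtraction. A quick sketch of the argument, where the CPU figure is a ballpark assumption:

```python
# If the disputed claim were true, how much power would be left
# for everything else in a 350 W system?
psu_rating = 350         # watts, the stated PSU
claimed_gpu_draw = 285   # the disputed 6800GT figure

leftover = psu_rating - claimed_gpu_draw
print(f"Left for CPU, drives, mobo, fans, etc.: {leftover} W")

# An Athlon 64 alone can pull on the order of 90 W under load
# (ballpark assumption), so the budget is blown before counting
# the drives, burner, sound card, and fans.
assumed_cpu_draw = 90
print("Claim plausible?", leftover >= assumed_cpu_draw)
```

With those numbers the leftover is 65 W, which can't cover the CPU by itself, let alone the rest of the system.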
 
