Topic: General / Use the GPU
- - By Maddin Date 2010-02-02 17:45 Edited 2010-02-02 20:04
As some may know, the processing power of modern video cards is really impressive, and it has been usable for non-visual work for a while now - so why shouldn't we make use of that?

Nvidia shows it with their PhysX physics engine (which, of course, only works on Nvidia GPUs ;-) ), and I think there are some other examples as well. GPU-processed physics has been talked about for quite a while now, so here's my question: couldn't OpenClonk have GPU-processed physics? Weren't you annoyed by the lag that occurred when you set huge amounts of water free? Could we have more advanced physics in Clonk?
Here is a tech demo from Nvidia which shows GPU-based realistic water rendering: http://www.youtube.com/watch?v=UYIPg8TEMmU
Note that according to Nvidia this tech demo runs on their new Fermi GPU, which is a shader monster (the shader units are used for this processing). But also note that those are huge amounts of 3D particles, and we only have 2D.

Of course, such a little GPU water physics engine would require some work, and I have no skills at all in that area, but I think it's worth a try?

Edit: Another disadvantage I want to cover really quickly: not everyone today has a graphics card with good shaders, and their CPU might even do better than their GPU. To solve this problem, the old CPU water physics could be kept for those people, who might not be a minority. Cool feature anyway. :p
Parent - - By Clonk-Karl [de] Date 2010-02-02 17:53
This is "nice to have" but in no way essential. I think there much more important things to do at the moment. Of course I won't stop anyone implementing this if they want to.

Note that we would need to make sure the computation is exactly identical to the CPU implementation and to the one on other graphics cards, to preserve synchronization in network games and records.
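
To illustrate why that is hard (a minimal standalone example, not engine code): floating-point addition is not associative, so a GPU or shader compiler that merely reorders operations already breaks bit-identity, even though both results look fine on screen.

#include <cstdio>

int main() {
    float a = 1e8f, b = -1e8f, c = 1.0f;
    float x = (a + b) + c;  // left-to-right order: (0) + 1  ->  1
    float y = a + (b + c);  // reordered: b + c rounds back to -1e8f  ->  0
    std::printf("x=%g y=%g\n", x, y);  // prints x=1 y=0
    return 0;  // in lockstep networking, a difference like this is a desync
}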
Parent - - By Maddin Date 2010-02-02 18:00
Well, there's OpenCL, and that's what would preferably be used, I think. No one would have to write a library that lets Clonk communicate with the GPU.

But I agree, there are more important things to do.

I'm just saying. ;)
Parent - By Maddin Date 2010-02-03 10:07
OpenCL isn't the way to go. GPU-Z says that my GPU doesn't support it (GF 9800 GT). But there's still the DirectCompute library. And that's supported.
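
For what it's worth, probing at runtime for a usable device is cheap. Here is a minimal sketch (mine, using the plain OpenCL C API; it assumes the OpenCL headers and ICD loader are installed, link with -lOpenCL) - if the probe fails, the game would just keep the CPU water physics:

#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_uint numPlatforms = 0;
    // Count OpenCL platforms (i.e. installed vendor drivers).
    if (clGetPlatformIDs(0, NULL, &numPlatforms) != CL_SUCCESS || numPlatforms == 0) {
        std::puts("No OpenCL platform - use the CPU water physics.");
        return 1;
    }
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, NULL);
    cl_uint numGpus = 0;
    // Ask the first platform for GPU devices only.
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, NULL, &numGpus) != CL_SUCCESS
        || numGpus == 0) {
        std::puts("No OpenCL-capable GPU - use the CPU water physics.");
        return 1;
    }
    std::printf("Found %u OpenCL GPU device(s).\n", numGpus);
    return 0;
}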
Parent - - By Newton [de] Date 2010-02-02 17:58
Quite impressive demo
Parent - - By Maddin Date 2010-02-02 18:02
Well, and it runs at more than 100 FPS, I think. That leaves plenty of room for all the other graphics processing.
Parent - - By AlteredARMOR [ua] Date 2010-02-02 18:10 Edited 2010-02-02 18:14
If you implement this kind of stuff for vital parts of the game, you will probably lose a whole bunch of potential players who do not have access to current technology (what about ATI users, for instance?)
Parent - - By Maddin Date 2010-02-02 18:17
I wasn't talking about implementing PhysX (anyone got $10,000 for the license?), but about using open GPU compute libraries to do the water physics processing on the GPU.
Parent - - By AlteredARMOR [ua] Date 2010-02-02 18:20
Yes, I know. But will older graphics cards support GPU processing?
Parent - - By Maddin Date 2010-02-02 18:24 Edited 2010-02-02 18:29
No. But a graphics card from 2006/07 should do(?).
But as I said, the CPU could and must also be kept as a physics processing unit (PPU, hooray! o.o ). And don't start complaining about lame CPUs - there is no other alternative. x-)

I really don't know how to implement this since I'm not a programmer, so I'm asking the people who probably do know how, and making a suggestion. :)
Parent - - By AlteredARMOR [ua] Date 2010-02-02 18:33
Complaining is not an option in any case :-). I agree that this is a promising feature (which by itself is quite debatable), but since it is not a core aspect of game development (and no one in particular is going to implement it), there is nothing to talk about yet. Maybe a bit later...
Parent - - By Maddin Date 2010-02-02 18:36
Maybe I'll do it. Can't be that hard. Okay, I guess it is. :D
Parent - By AlteredARMOR [ua] Date 2010-02-02 18:41
Well, I can only admire you if you do :-). Personally, I am a complete n00b at 3D development.
Parent - - By knight_k [us] Date 2010-02-02 18:43

>(anyone got $10,000 for the license?)


The PhysX SDK is free. What are you talking about - getting access to the PhysX source code? Would that be necessary?

see http://developer.nvidia.com/object/physx_downloads.html:

>How to access the Binary PhysX SDK
>
>The NVIDIA binary PhysX SDK is 100% free for both commercial and non-commercial use and is available for immediate download by registered PhysX developers. To become a registered PhysX Developer please complete the registration form (steps provided below) on NVIDIA's PhysX Developers Website.

Parent - By Maddin Date 2010-02-02 18:51
I didn't know that. But it makes sense - it gets more people to use it. Implementing PhysX would be complete bullshit anyway, so there you go. :)
Parent - By Carli [de] Date 2010-02-03 09:15
…would not be synchronous
Parent - By Caesar [de] Date 2010-02-02 20:19
Nice to play around with a bit, but not that impressive anymore once you have it around.
Parent - - By Sven2 [de] Date 2010-02-02 18:52
The problem with this is synchronization between network clients.
Parent - - By Maddin Date 2010-02-02 18:55
Could you explain that a little bit?
Parent - - By Sven2 [de] Date 2010-02-02 19:16
Game synchronization works by exchanging player commands only and then calculating the whole game identically on all clients. This means that as soon as the calculation runs differently on one client, the game runs out of sync between the network clients, i.e. players in a network game no longer see the same game.

If we start implementing acceleration on the GPU, we need to make sure that a fallback CPU implementation delivers the exact same output. That might be hard to achieve.
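
To sketch it (a toy example of mine with made-up names, not the actual engine code): only the inputs travel over the network, so the simulation step must be bit-for-bit deterministic on every machine.

#include <cstdint>
#include <vector>

struct Command { int player; int action; };  // made-up input record

// Stub: in a real engine this collects the tick's commands from all
// players over the network - the same list arrives on every client.
static std::vector<Command> ReceiveCommandsForTick(uint32_t) { return {}; }

// Stub: advances the entire game state by one tick. Every client must
// compute a bit-identical result - exactly what different GPU hardware,
// drivers and settings make hard to guarantee.
static void Simulate(const std::vector<Command>&) {}

int main() {
    for (uint32_t tick = 0; tick < 1000; ++tick)
        Simulate(ReceiveCommandsForTick(tick));  // same inputs, same order, everywhere
}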
Parent - - By Maddin Date 2010-02-02 19:19
Maybe one approach would be to force all players joining a GPU-accelerated game to turn the option on, and to make sure they all have the same shader model or so.
Parent - - By AlteredARMOR [ua] Date 2010-02-02 19:28
Too complicated. It would be sort of sad if you could not join a game just because your video card is somewhat older than everyone else's.
Parent - - By Maddin Date 2010-02-02 19:59
Well, I am not sad when I cannot join a game because a password is required or the player list is full. I'm sure there would still be enough games without this option, because the host wouldn't have it either.
Parent - - By AlteredARMOR [ua] Date 2010-02-02 20:10
Then it leads us back to the same issue we talked about before: some Super-Cool-Game-Tweak is not worth much discussion as long as other, more important stuff is still unimplemented. As for me, I like the current water behavior (though I admit there is a lot of room for various improvements). Ideas for tweaks like this one (GPU) will come up from time to time, and it is great that anyone can implement new features (hail to the open-source concept). And the best way of doing things could be: "Hey guys! I've implemented a SuperMegaAwesome algorithm of water calculation which is available in branch #blahblahblah" - "Yeah man, it is totally amazing!". Only then do we have something to talk about... :-)
But I admit that in most cases it is better to first ask the community than to waste time on a feature that won't be popular.
Parent - - By Luchs [de] Date 2010-02-02 20:26

>I like the current water behavior


It's not only the behavior, but also the speed of the current implementation, which would probably be better if calculated on a GPU.

>But I admit that in most cases it is better to first ask the community than to waste time on a feature that won't be popular.


I'm sure it would be popular - most games need good graphics cards, which support these features.
Parent - - By Maddin Date 2010-02-02 21:21
Yes, even the Radeon HD 2000 series already has Shader Model 4.0. And the GeForce GT series has that too.
Parent - - By Carli [de] Date 2010-02-03 09:19
Even if that were so, you could have two clients with different anti-aliasing settings, and the sync is gone.
Parent - - By AlteredARMOR [ua] Date 2010-02-03 09:20
The speed of the game would probably have to stick to the lowest-performance computer.
Parent - By Carli [de] Date 2010-02-03 10:03
That's not what I was talking about.

I was talking about getting different calculation results from different GPU hardware, GPU drivers or GPU settings.

A GPU is a black box: the results are not bit-for-bit identical, even though you see the same picture. And that breaks the network sync mode.
Parent - - By Maddin Date 2010-02-03 09:48
Anti-aliasing? What are you talking about? AA is just a visual thing for smoothing jagged edges.
Parent - - By Carli [de] Date 2010-02-03 10:02
Yes, and there are more settings that can change the GPU's computation behaviour.

We have already discussed this before: with a GPU implementation you would kill any network sync mode.
Parent - - By Maddin Date 2010-02-03 10:42
At least I will personally try such a thing in a local game. I might need some help, though.
Parent - By Maddin Date 2010-02-03 14:46
I just dug through the landscape source code a bit. Some things I don't understand because of the C++ syntax. I will work through that over the next days and weeks and see what I can do there. JFF.
Parent - - By Luchs [de] Date 2010-02-03 22:08
There are games with PhysX and network mode - so how do they work?
Parent - - By Clonk-Karl [de] Date 2010-02-03 22:38
Their network mode works in a different way. They do not require each client to do the exact same computations.
Parent - - By Maddin Date 2010-02-04 11:22
Couldn't the host do the computation and then send the results to the clients? Would that be too slow?
Parent - - By Kanibal [de] Date 2010-02-04 17:18
Well, what if the host only has an onboard graphics card?
Parent - By Maddin Date 2010-02-05 12:43
Then he wouldn't enable the GPU option.
Parent - By Ringwaul [ca] Date 2010-02-04 17:19
I wonder if that would cause host/client lag problems, like in Gears of War (i.e. the host has the advantage of no lag while the clients all lag).
Parent - By Günther [de] Date 2010-02-05 21:21
The control data alone can saturate slow connections if enough clients are in the game. Sending the game state only works for typical 3D games because they make heavy use of interpolation and prediction on the client side, and the host only sends the data necessary for the client to render what its player sees at the moment. That's typically a lot less dynamic stuff than is on screen in a Clonk game: a few enemies, weapons and projectiles instead of hundreds of landscape pixels. So while it might very well be possible to come up with a way to make this work for Clonk, it would be very, very difficult.
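
To put rough numbers on that (invented purely for scale): if a flood changes 100,000 landscape pixels in one tick at 3 bytes each, then at roughly 35 ticks per second the host would have to upload 100,000 * 3 * 35 = ~10 MB/s to every client - hopeless on a typical DSL upload - while a tick's worth of control data is a few dozen bytes per player.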
Parent - - By PeterW [de] Date 2010-02-03 14:03 Edited 2010-02-03 14:12
I would be very careful about concluding anything from such demos. If I'm not mistaken, this demo solves differential equations for all voxel points in the water - a horribly calculation-intensive thing to do. Which is, obviously, why it was chosen as a test to showcase this feature.

But applied to Clonk, this means you would waste a lot of power on water that's essentially standing still. Some waves can be emulated at lower cost, and might even end up looking more realistic than actual waves, because we can tweak them more easily. So do we really need a whole new submodule just to make the rare big floods look more realistic? Couldn't we do something more useful with that calculation power (scaling the landscape up *hint hint*)?

And yes, the whole network synchronization thing is a killer argument here. But that has been explained enough elsewhere in this thread.

Addendum

Looking at the demo more closely, it seems they actually use a lot of little spheres to model the water, just like that 2D physics simulation knight_k posted once. I'm pretty sure that approach has huge performance problems once really large amounts of liquid are involved (see above: floods).
Parent - - By Maddin Date 2010-02-03 14:32
First of all, I was talking about moving the mass-moving/water processing to the GPU, not about programming a water physics engine. Although even that doesn't seem so difficult, since there's already some kind of realistic physics in Clonk.

>scaling the landscape up *hint hint*


What do you mean?
Parent - By AlteredARMOR [ua] Date 2010-02-03 14:34
He means giving the landscape a more clear-cut look at high zoom levels.
Parent - - By PeterW [de] Date 2010-02-03 16:24

> First of all, I was talking about moving the mass-moving/water processing to the GPU, not about programming a water physics engine.


The mass-moving/water processing is our water physics engine, as far as I'm concerned.

Well, in case you just intend to "port" the current mass mover to a GPU: I'm pretty sure it would actually run slower there, because as far as I know, sequential loops over thousands of pixels aren't really a GPU's strong point (as opposed to local calculations exploiting massive parallelism). You would need to change the approach.
Parent - - By Maddin Date 2010-02-03 17:41

>You would need to change the approach.


In which direction would you point?
Parent - - By PeterW [de] Date 2010-02-03 18:02
Well, something more "local". Like having a pressure and a velocity value for each pixel (group?) and letting them propagate with each iteration. Sven proposed something like that once - I think with pixel shaders it could become borderline doable.
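
Roughly like this (a toy sketch of mine, not Sven's actual proposal - just one float of state per cell instead of full pressure plus velocity): every cell reads only its four neighbours, so all cells can be updated independently, which is exactly the access pattern pixel shaders are good at. The same double-buffered loop also serves as the CPU fallback:

#include <vector>

// One float of state per landscape pixel (group); a real version would
// also carry velocity, as discussed above.
void Step(const std::vector<float>& in, std::vector<float>& out, int w, int h) {
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            int i = y * w + x;
            // Relax towards the neighbourhood average: differences in
            // mass/pressure propagate one cell per iteration.
            float avg = 0.25f * (in[i - 1] + in[i + 1] + in[i - w] + in[i + w]);
            out[i] = in[i] + 0.5f * (avg - in[i]);
        }
    }
    // On the GPU, the loop body becomes one shader invocation per cell,
    // reading from the "in" texture and writing to the "out" texture.
}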
Parent - By Maddin Date 2010-02-03 18:31
Yes, I already thought about that. It isn't so hard to do, but it would increase the memory footprint per pixel.
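
For a rough sense of scale (sizes invented for illustration): one float of pressure plus two floats of velocity is 12 bytes per pixel, so a 2000x1000 landscape would need 2,000,000 * 12 = ~24 MB of extra state - and twice that with the double buffering the propagation scheme needs.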
Parent - - By knight_k [us] Date 2010-02-03 16:36

> just like that 2D physics simulation knight_k posted once


probably refers to: http://www.phunland.com
Parent - - By Carli [de] Date 2010-02-03 16:52
But Phun doesn't have such a high water pixel resolution, and it uses vector objects, not pixels.