Recent Posts

Pages: 1 2 [3] 4 5 ... 10
21
@ninekorn, if you can use ShaderResourceView (Direct3D 11, right?) to render the texture (taken from the framebuffer), then you can also write an HLSL effect (shader) for this ShaderResourceView which will do the required operation per pixel. And it will be almost instant, as the job is done entirely on the GPU.

And if you can avoid the byte arrays entirely and apply the shader directly to the frame buffer, that will be the best solution performance-wise. It should be possible to avoid ScreenCapture and simply hook into the application itself.
22
A jumpgate can have a pass condition. That's used to ensure that the player doesn't leave the starting system before finishing the first quest, and also that the player cannot enter Alien systems early. Perhaps that will suffice for your idea?
Of course, the discovery of a completely new star system sounds much more interesting than simply unlocking a jumpgate to another system, but considering that the game architecture was written for one-time world generation, it's much easier to achieve by using the already proven game API...
23
I was thinking this more as a part of a mission chain, not as the universe continuously expanding. Would it work with a jumpgate then?
24
Never mind, I think I failed.
25
@ninekorn,
usually anything dealing with framebuffer data should operate as a GPU shader, otherwise you're forcing the GPU to flush all the graphics commands to access the framebuffer data, then modify it and push it back. No wonder it's ultra expensive.

With a GPU shader - you can call it a GPU post-processing shader - you can perform the same operation per pixel very effectively. And the code of such a shader will be super simple, as you basically need only the framebuffer as input and a very short pixel shader to modify the color of each pixel.
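To illustrate the idea, here is a minimal HLSL post-processing pixel shader sketch that does the same near-white-to-transparent test as the CPU loop discussed later in the thread. All names (FrameTexture, FrameSampler, PS_Main) and register slots are illustrative assumptions, not part of any engine's actual API:

```hlsl
// The captured frame bound as a ShaderResourceView (slot is an assumption).
Texture2D    FrameTexture : register(t0);
SamplerState FrameSampler : register(s0);

float4 PS_Main(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float4 color = FrameTexture.Sample(FrameSampler, uv);

    // Same test as the CPU version: if R + G + B is near white
    // (>= 750 out of 765 in byte terms), output a fully transparent pixel.
    if (color.r + color.g + color.b >= 750.0 / 255.0)
        return float4(0.0, 0.0, 0.0, 0.0);

    return color;
}
```

Rendered as a full-screen pass, this touches every pixel in one GPU dispatch instead of iterating 8294400 bytes on the CPU.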

Unfortunately, I don't have any knowledge of how you can hook into the graphics API with Oculus, but there should definitely be a pretty simple way - as you're already doing something very similar, just in a different way (by copying the framebuffer from GPU to RAM, modifying it on the CPU, and then pushing it back to the GPU).

Regards!

I am using the byte array created by the screencapture from XooF (Alexandre Mutel). Once I get this byte array, I am creating a ShaderResourceView with the Ab4D engine (Ab3d.DirectX.TextureLoader.CreateShaderResourceView) and then I assign that to the texture material shader. That was the fastest way in the Ab4D engine. It's super fast. But processing the white background of 1920 * 1080 * 4 bytes = 8294400 bytes on the CPU is crazy expensive.

But I succeeded... 10 minutes ago... hell yeah ;), well, at least with 736000 bytes for the moment out of 8294400 bytes. This is ultra fast. I am down to 0 milliseconds. And of course nothing interferes with any game whatsoever. I am really pleased with the results. I'm gonna try to map the rest of the 1920*1080 screen and see where it goes.
26
@ninekorn,
usually anything dealing with framebuffer data should operate as a GPU shader, otherwise you're forcing the GPU to flush all the graphics commands to access the framebuffer data, then modify it and push it back. No wonder it's ultra expensive.

With a GPU shader - you can call it a GPU post-processing shader - you can perform the same operation per pixel very effectively. And the code of such a shader will be super simple, as you basically need only the framebuffer as input and a very short pixel shader to modify the color of each pixel.

Unfortunately, I don't have any knowledge of how you can hook into the graphics API with Oculus, but there should definitely be a pretty simple way - as you're already doing something very similar, just in a different way (by copying the framebuffer from GPU to RAM, modifying it on the CPU, and then pushing it back to the GPU).

Regards!
27
Just wanted to clarify that it's never going to change the gameplay... Only the background visuals, and maybe the foreground. I am not into "hacking" Void Expanse ;). I succeeded in making the background completely transparent in virtual reality so we see only "decorations/objects/ships/stations", but I am somewhat lacking knowledge in speeding things up with C# pointers. I am working on that right now... I just wish to be able to do another tutorial this weekend, but I am caught up in making this work.

I am trying MemoryStreams and pointers (I have a big knowledge gap in accessing MemoryStreams byte by byte)... Right now I am at 2 milliseconds of overhead just making the player area transparent. I can't seem to bring it down to 0, but at 1920*1080 it's still 736000 bytes to check every frame for the player ship area, which starts at 720 on the X and 360 on the Y coordinates.

I have been on this junk for a week and I am starting to lose confidence.

By the way, if any Void Expanse players have an Oculus Rift, please write here. I'd love to know how many we are.

Here's a code snippet that I can't make any better for the moment.

Code: [Select]
fixed (byte* textureByteArray = _textureByteArray,
             previousTextureByteArray = _previousTextureByteArray,
             currentByteArray = _currentByteArray)
{
    byte* buffer0;
    byte* buffer1;
    byte* buffer2;

    for (int x = xxPlayerShip; x < xxPlayerShip + widthOfRectanglePlayerShip; x++)
    {
        for (int y = yyPlayerShip; y < yyPlayerShip + heightOfRectanglePlayerShip; y++)
        {
            // 4 bytes per pixel, 1920 pixels per row
            bytePos = ((y * 1920) + x) * 4;

            buffer0 = &textureByteArray[bytePos + 0];
            buffer1 = &textureByteArray[bytePos + 1];
            buffer2 = &textureByteArray[bytePos + 2];

            // sum of the three color channels; near-white pixels become transparent
            // (original summed *buffer1 twice instead of *buffer2)
            if (*buffer0 + *buffer1 + *buffer2 >= 750)
            {
                byte* buffer00 = &currentByteArray[bytePos + 0];
                byte* buffer11 = &currentByteArray[bytePos + 1];
                byte* buffer22 = &currentByteArray[bytePos + 2];
                byte* buffer33 = &currentByteArray[bytePos + 3]; // alpha (original pointed at + 2 again)

                *buffer00 = 0;
                *buffer11 = 0;
                *buffer22 = 0;
                *buffer33 = 0;
            }
        }
    }
}

28
It might be possible as the generation scripting API allows spawning new objects at any time.
However, I'm afraid that a new jumpgate will not work, as the routes are cached after galaxy generation or after savegame loading. If a new jumpgate is spawned, the cache should be invalidated, but as we didn't have such a need during the game's development, I can't say for sure.
Also, it's impossible to remove a star system, so it might become a problem as more and more systems are spawned during gameplay.
29
Modding info / Dynamic adding of NPCs and other objects
« Last post by Wilmore on July 27, 2018, 02:36:40 pm »
Is it possible to add NPCs and other things post world creation? Or is everything created with the world?
30
Modding info / Creating new systems on the go (or activating existing jumpgates)
« Last post by Wilmore on July 27, 2018, 02:35:35 pm »
Would it be possible as a part of a quest chain, to generate a new station or system (like "fixing" a jumpgate or something like that). That would be epic!