I miss being able to interact with the game scene while the game is running, like in Unity. Being able to just zoom out and grab and move objects is a big productivity boost. In Godot you have to open the remote tree and edit values manually, which can be a pain for debugging.
New nodes/interfaces for adaptive music/sound design.
“FilePath”: a resource with PackedScene’s behavior, where the path to the file is updated when the file is moved in the editor.

I would use it in place of a string path to the next level being loaded. I tend to add subdirectories as I create new worlds. Manually updating strings isn’t so bad, but when the functionality seems like it’s right there it sucks. Can’t use PackedScene, as each level would preload the next, which could cause cyclic references.
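For the cyclic-reference half of this, a minimal sketch of a current workaround (it does not solve the move-tracking problem; the node and property names here are my own invention):

```gdscript
extends Node

# @export_file gives a file picker in the inspector but stores a plain
# String, so exporting it does not preload the target scene.
@export_file("*.tscn") var next_level_path: String

func go_to_next_level() -> void:
    # load() resolves at runtime, so Level1 -> Level2 -> Level1 chains
    # never become cyclic preload dependencies.
    var packed: PackedScene = load(next_level_path)
    get_tree().change_scene_to_packed(packed)
```

The trade-off is exactly the one complained about above: the string survives a scene cycle but goes stale if the file is moved.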
Ez ray tracing, because ray tracing is cool.
Good answer. Especially if it could degrade gracefully for low performance, without temporal artifacts. E.g., have ray-surface hits act like point projectors with approximate visibility, so indirect lighting splashes brightness on a soft region instead of creating a hotspot.
I think there’s a general solution to noise that’s gone unused.
Okay: Metropolis light transport jiggles steps in each bright path to find more low-probability, high-intensity paths. Great for caustics. But it’s only used on individual pixels. It samples a lot of light coming into one spot. We want the inverse of that.
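The “jiggle a bright path” idea is ordinary Metropolis sampling: mutate the current sample, accept with probability min(1, f(new)/f(old)), so samples concentrate where the integrand is large. A minimal 1D sketch, where the target function is a stand-in for path brightness (a narrow caustic-like spike), not a real light transport integrand:

```python
import random

def metropolis(f, x0, steps, sigma=0.01):
    """Metropolis sampler: mutate x, accept with prob min(1, f(y)/f(x))."""
    x, fx = x0, f(x0)
    samples = []
    for _ in range(steps):
        y = x + random.gauss(0.0, sigma)   # small "jiggle" of the path
        fy = f(y)
        if fy >= fx or random.random() < fy / fx:
            x, fx = y, fy                  # keep the brighter mutation
        samples.append(x)
    return samples

# Stand-in "brightness": a narrow bright spike around 0.5, near-zero elsewhere.
def spike(x):
    d = abs(x - 0.5)
    return max(1e-12, 1.0 - 200.0 * d) if d < 0.005 else 1e-12

random.seed(1)
xs = metropolis(spike, 0.5, 5000)
inside = sum(1 for x in xs if abs(x - 0.5) < 0.005) / len(xs)
# Nearly all samples cluster inside the narrow bright region.
```

Uniform sampling would land in that spike about 1% of the time; the mutation chain stays in it almost always, which is why it’s so good at caustics.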
When light hits a point, it can scatter off in any direction, with its brightness adjusted according to probability. So… every point is a light source. It’s not uniform. But it is real light. You could test visibility from every hit along any light path, to every point onscreen, and it would remain a true unbiased render that would eventually converge.
The sensible reduction of that is to test visibility in a cone from the first bounce offscreen. Like if you’re looking at a wall lit by the moon, it goes eye, wall, moon, sun. Metropolis would jitter to different points on the moon to light the bejeezus out of that one spot on the wall. I’m proposing to instead check moon-to-wall visibility, for that exact point on the moon, but nearby spots on that wall. (Deferred rendering can skip testing between the wall and your eye. Pick visible spots.)
One spot on the moon would not accurately recreate soft moonlight - but Blender’s newer Eevee renderer proves that a dozen can suffice.
One sample per pixel is already about two million points at 1080p.
You don’t need to go from all of them, to every point onscreen, to get some pretty smooth results. Basically it’s “instant radiosity” but with anisotropic virtual point sources. It’s just a shame shadowing still needs to be done (or faked); otherwise applying them would be doable in screen space.
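The instant-radiosity core is small: trace a few light paths, leave a virtual point light (VPL) at each bounce, then shade each visible surface point by summing cosine-weighted VPL contributions, gated by a visibility (shadow) test. A toy sketch with hand-placed points; the geometry, the VPL set, and the visibility function are all invented for illustration, and a real version would also apply the anisotropic emission lobe described above:

```python
import math

def shade(point, normal, vpls, visible):
    """Sum VPL contributions: intensity * cos(theta) / r^2, if visible."""
    total = 0.0
    for vpl_pos, vpl_intensity in vpls:
        d = [v - p for v, p in zip(vpl_pos, point)]
        r2 = sum(c * c for c in d)
        r = math.sqrt(r2)
        cos_t = max(0.0, sum(n * c for n, c in zip(normal, d)) / r)
        if visible(point, vpl_pos):        # the shadow test that stops
            total += vpl_intensity * cos_t / r2   # a pure screen-space pass
    return total

# Two VPLs left by a light path bouncing through a toy scene.
vpls = [((0.0, 2.0, 0.0), 4.0), ((1.0, 1.0, 0.0), 1.0)]
always_visible = lambda a, b: True

# Shade a floor point whose normal points straight up.
L = shade((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), vpls, always_visible)
```

Everything except the `visible()` call is per-pixel arithmetic on data you already have in a deferred renderer, which is the point being made above: shadowing is the only part that keeps this out of screen space.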
Direct blender file import.
Godot supports .blend files, check the documentation.
That is kinda half direct.
Well dang, didn’t know Godot 4 supported this.