Orisphere aims to be an editor and an embeddable renderer for LÖVE, dedicated to the algorithmic generation of 3D shapes, materials and animations using analytic Signed Distance Functions (SDFs, 2D and 3D) and related techniques.
The primary purpose of this project is to develop a simple methodology that I can use to easily experiment with 3D modeling and explore simple, complex, weird or eerie 3D aesthetics.
Beyond the interactive rendering of the editor, I have two major use cases for the project.
The first is the creation of isometric games (making tiles, etc.) with IPR. In this case, it would only render shapes and materials for further processing instead of producing a full render. This method should be very resilient in terms of modeling freedom and performance: for example, costly objects can be baked into lower-bound SDFs, and small SDFs are evaluated for each tile instead of one big world SDF.
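The lower-bound idea can be illustrated with a small sketch (Python, with hypothetical names and thresholds): a cheap bounding-sphere distance is always less than or equal to the true distance, so sphere tracing stays correct while skipping the costly evaluation far from the object.

```python
import math

def sd_sphere(p, center, radius):
    # Exact signed distance to a sphere.
    return math.dist(p, center) - radius

def costly_sdf(p):
    # Stand-in for an expensive object (e.g. many blended primitives).
    return min(sd_sphere(p, (0.0, 0.0, 0.0), 1.0),
               sd_sphere(p, (0.5, 0.0, 0.0), 0.8))

def bounded_sdf(p):
    # Baked conservative bound: a sphere known to enclose the object.
    # Its distance never exceeds the true distance, so sphere tracing
    # never overshoots; the real SDF is only evaluated near the surface.
    bound = sd_sphere(p, (0.25, 0.0, 0.0), 1.3)
    if bound > 0.1:           # far away: the cheap bound is enough
        return bound
    return costly_sdf(p)      # close: evaluate the expensive object
```

The specific bound shape and switch-over threshold are illustrative; the point is only that a baked lower bound keeps the march correct while cutting cost.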
The second is Orisphere Playground, a game built on this project that extends the same editing methodology for gameplay purposes. Each level would be an interactively rendered world SDF, akin to what Shadertoy users do, making it much less resilient than the first case. The hope is that the editor will provide the tools to design a scene intelligently, keeping the number of primitives, operations and overall cost low enough for a real-time experience (in addition to offering rendering quality options).
The core ideas and constraints for the project are as follows:
- Analytic / algorithmic: The editing process is non-destructive using abstract constructs. There should be no concrete external data involved to produce the shapes, materials or final rendering. Also, for simplicity and quality, the process should avoid baking things as much as possible.
- Global SDF: The editor generates a single SDF for what is edited, whether a model or a map. The full scene is described by the SDF, which allows the use of elegant SDF rendering techniques such as "fake" ambient occlusion or soft shadows.
- Surface only: The solid opaque surface of the SDF is the goal. No blending, no subsurface or volumetric effects, no refraction, etc.
- Scene graph: The editor works on a scene graph and other structures to manage objects and other properties (e.g. animations, scene documentation). The entire scene or "project", or parts of it, can be exported to and imported from Lua objects (as a serialization format).
- Abstractions: To simplify the editing process and to be more amenable to automatic optimizations or other transformations, it should not involve writing GLSL directly. For low-level computations, which can be especially useful to control various properties from global parameters (e.g. animations), a simple graphical dataflow language can be created. It also makes the output, a model or a level in the Playground case, more portable and safer to share.
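Because the whole scene is one SDF, the shadow and occlusion tricks mentioned above translate directly. A minimal sketch of soft shadows in the spirit of Inigo Quilez's well-known formulation (Python; the toy scene and constants are illustrative assumptions):

```python
import math

def sd_scene(p):
    # Toy global SDF: a unit sphere resting on the ground plane y = 0.
    sphere = math.dist(p, (0.0, 1.0, 0.0)) - 1.0
    ground = p[1]
    return min(sphere, ground)

def soft_shadow(origin, direction, k=8.0, t_max=10.0):
    # March from `origin` toward the light; the closest near-miss of the
    # SDF along the ray darkens the result (k controls penumbra size).
    res, t = 1.0, 0.02
    while t < t_max:
        d = sd_scene([origin[i] + direction[i] * t for i in range(3)])
        if d < 1e-4:
            return 0.0          # fully occluded
        res = min(res, k * d / t)
        t += d
    return res
```

The same global-SDF march can be reused for "fake" ambient occlusion by sampling a few distances along the surface normal instead of toward a light.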
Considered rendering techniques:
- Physically based rendering, which should be heavily based on the work done on the Filament renderer. Materials with normal, roughness, metallic, reflectance and emission components.
- Direct lighting (directional, point, spot, etc.) with soft shadows. For punctual lights, also try an "analytic" approach, that is, using the same SDF editing capabilities to represent non-overlapping lights (domain repetition of lights).
- Simple ambient lighting with "fake" ambient occlusion.
- Simple volumetric lighting / shadows (maybe too costly).
- Bloom.
- Dithering.
- Generate multiple functions, especially for interactive rendering. One could compute only the distance, used to find the intersection point as well as for normals, shadows and ambient occlusion. Another could compute only the materials and would be evaluated afterwards, without raymarching.
- SDF bounding volumes.
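The domain repetition mentioned for analytic lights can be sketched as follows (Python, with hypothetical spacing and radius): space is folded into identical cells, so a single light primitive stands for an infinite, non-overlapping grid of lights.

```python
import math

def repeated_light_sdf(p, spacing=4.0, radius=0.2):
    # Fold each coordinate into a cell of size `spacing` centered on the
    # origin, then evaluate one light sphere; this represents an infinite
    # grid of lights at the cost of a single primitive.
    q = [((c + spacing * 0.5) % spacing) - spacing * 0.5 for c in p]
    return math.sqrt(sum(c * c for c in q)) - radius
```

The lights stay non-overlapping as long as the primitive fits inside its cell, which is exactly the constraint the editing tools would have to enforce.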
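Splitting the distance-only function from the material function, as suggested in the list above, could look like this sketch (Python; the names and material layout are assumptions): the march touches only distances, and materials are evaluated once at the hit point.

```python
import math

def march(origin, direction, sd, t_max=20.0):
    # Sphere-trace using the distance-only function; materials are never
    # evaluated during the march.
    t = 0.0
    for _ in range(128):
        p = [origin[i] + direction[i] * t for i in range(3)]
        d = sd(p)
        if d < 1e-4:
            return p            # hit: materials are looked up afterwards
        t += d
        if t > t_max:
            break
    return None                 # miss

def material_at(p):
    # Evaluated once per pixel after the march (hypothetical layout:
    # albedo, roughness, metallic).
    return ((0.8, 0.2, 0.1), 0.5, 0.0)
```

Shadow and ambient-occlusion rays reuse `march`'s distance function, so the expensive material logic is paid exactly once per visible point.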
The editor will be under the GPL and the embeddable renderer under the MIT license.
The work of Inigo Quilez and the Shadertoy community are important learning resources for this project.