GPUs want to draw triangles, and in fact only know how to draw triangles[0]. Pretty much all graphics API innovation has been around either feeding more triangles to the GPU faster, letting the GPU create more triangles after they've been sent, or finding cool new ways to draw things on the surface of those triangles.
2D/UI breaks down into drawing curves, either as filled shapes or strokes. The preferred representation is a Bezier spline: a series of degree-three[1] polynomial curves that GPUs have zero support for rasterizing. Furthermore, the offset curves you get by stroking those splines are not polynomials, but an even more bizarre curve type called an algebraic curve. You cannot just offset the control points to derive a stroke curve; you either have to approximate the stroke itself with Beziers, or actually draw the line sequentially in a way that GPUs are really not capable of doing.
There are four things you can do to render 2D/UI on a GPU:
- Tessellate the Bezier spline into a series of triangles. Lyon does this. Bezier curves make this rather cheap to do, but it requires foreknowledge of what scale the Bezier will be rendered at, and you cannot adjust stroke sizes at all without retessellating.
- Send the control points to the GPU and use hardware tessellation to do the above per-frame. No clue if anyone does this.
- Don't tessellate at all, but send the control points to the GPU as a polygonal mesh, and draw the actual Beziers in the fragment shaders for each polygon. For degree-two/quadratics there is a series of coordinate transforms you can do which conveniently maps all curves to one UV coordinate space; degree-three/cubics require a lot more attention in order to render correctly. If I remember correctly, Mozilla Pathfinder does this[2].
- Send a signed distance field and have the GPU march it to render curves. I don't know much about this, but I remember hearing about it a while back.
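To illustrate the first approach: flattening a quadratic Bezier into line segments is the core of tessellation, and a fixed subdivision count is exactly the "foreknowledge of scale" problem. A minimal Python sketch (hypothetical helpers, nothing like what lyon actually does, which is adaptive and far more careful):

```python
def eval_quadratic(p0, p1, p2, t):
    """De Casteljau/Bernstein evaluation of a quadratic Bezier at parameter t."""
    mt = 1.0 - t
    x = mt*mt*p0[0] + 2*mt*t*p1[0] + t*t*p2[0]
    y = mt*mt*p0[1] + 2*mt*t*p1[1] + t*t*p2[1]
    return (x, y)

def flatten_quadratic(p0, p1, p2, segments=16):
    """Approximate the curve with a polyline of `segments` line segments.
    The GPU can happily draw the result as triangles, but the segment
    count is baked in up front -- zoom in far enough and corners show."""
    return [eval_quadratic(p0, p1, p2, i / segments) for i in range(segments + 1)]

pts = flatten_quadratic((0, 0), (1, 2), (2, 0))
```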
All of these approaches have downsides. Tessellation is the approach I'm most familiar with because it's used heavily in Ruffle, so I'll just explain its downsides to give you a good idea of why this is a huge problem:
- We can't support some of Flash's weirder rendering hacks, like hairline strokes. Once we have a tessellated stroke, it will always be that width regardless of how we scale the shape. But hairlines require that the stroke get proportionally bigger as the shape gets smaller. Flash rendered on the CPU, so it was just a matter of saying "strokes are always at least 1px".
- We have to sort of guess what scale we want to render at and hope we have enough detail that the curves look like curves. There's one particular Flash optimization trick that consistently breaks our detail estimation and causes us to generate really lo-fi polygons.
- Tessellation requires the curve shape to actually make sense as a sealed hull. We've exposed numerous underlying bugs in lyon purely by throwing really complicated or badly-specified Flash art at it.
- All of this is expensive, especially for complicated shapes. For example, pretty much any Homestuck SWF will lock up your browser for multiple minutes as lyon tries to make sense of all of Hussie's art. This also precludes varying strokes by retessellating per-frame, which would otherwise fix the hairline stroke problem I mentioned above.
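The hairline rule mentioned above amounts to clamping the stroke width in screen space each time you draw, which is trivial on a CPU rasterizer but unavailable once the stroke geometry is baked into a tessellation. A one-line sketch (hypothetical helper, not Ruffle's or Flash's actual code):

```python
def hairline_width(logical_width, view_scale, min_px=1.0):
    """Screen-space stroke width under Flash's hairline rule: scale the
    logical width with the view, but never let it drop below one pixel."""
    return max(logical_width * view_scale, min_px)
```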
[0] AFAIK quads are emulated on most modern GPUs, but they are just as useless for 2D/UI as triangles are.
[1] Some earlier 2D systems used degree-two Bezier splines, including most of Adobe Flash.
[2] We have half a PR to use this in Ruffle, but it was abandoned a while back.
> For degree-two/quadratics there are a series of coordinate transforms that you can do which conveniently map all curves to one UV coordinate space; degree-three/cubics require a lot more attention in order to render correctly. If I remember correctly Mozilla Pathfinder does this[2].
That's interesting! Do you have any references (or links to sample fragment shader code) for that quadratic case coordinate transformation?
Basically, all quadratic Bezier curves are just linear transformations[0] of the curve u^2 - v = 0. The fragment shader just evaluates that one equation to draw the curve, and texture mapping does all the rest. As long as you're careful to ensure that your fill surface polygon actually makes sense, you get back out perfectly-rendered Beziers at any zoom factor or angle.
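Concretely (a CPU sketch of the idea, not actual shader code): assign the three control points the texture coordinates (0, 0), (1/2, 0) and (1, 1). The rasterizer's built-in linear interpolation across the triangle then hands the fragment shader a (u, v) for which u^2 - v is exactly zero on the curve, negative on one side, and positive on the other:

```python
def curve_uv(t):
    """The (u, v) the rasterizer would interpolate at the curve point B(t):
    per-vertex UVs (0,0), (0.5,0), (1,1) weighted by the Bernstein basis
    (1-t)^2, 2t(1-t), t^2. Works out to u = t, v = t^2."""
    b0, b1, b2 = (1 - t)**2, 2*t*(1 - t), t*t
    u = 0.0*b0 + 0.5*b1 + 1.0*b2
    v = 0.0*b0 + 0.0*b1 + 1.0*b2
    return (u, v)

def inside(u, v):
    """The per-fragment test: one multiply, one subtract."""
    return u*u - v < 0.0
```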
[0] Scale/shear/rotate - all the things you can do by matrix multiplication against a vector. Notably, not including translation; though GPUs just so happen to use a coordinate system that lets you express translations as a matrix multiply if you follow some conventions.
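The convention in that footnote is presumably homogeneous coordinates: tack a constant 1 onto each 2D point, and a 3x3 matrix multiply can then express translation alongside the linear transforms. A tiny sketch (hypothetical helpers for illustration):

```python
def apply3(m, p):
    """Multiply a row-major 3x3 matrix by (x, y, 1): an affine transform
    of a 2D point. The third row is the usual (0, 0, 1) and is dropped."""
    x, y = p
    return (m[0][0]*x + m[0][1]*y + m[0][2],
            m[1][0]*x + m[1][1]*y + m[1][2])

def translate(tx, ty):
    """Translation expressed as a plain matrix, thanks to the trailing 1."""
    return [[1, 0, tx],
            [0, 1, ty],
            [0, 0, 1]]
```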
I found a reference on HN for anyone else following along:
> In 2005 Loop & Blinn [0] found a method to decide if a sample / pixel is inside or outside a bezier curve (independently of other samples, thus possible in a fragment shader) using only a few multiplications and one subtraction per sample.
- Integral quadratic curve: One multiplication
- Rational quadratic curve: Two multiplications
- Integral cubic curve: Three multiplications
- Rational cubic curve: Four multiplications