Sunday, November 17, 2019

Rendering in Low Level Emulation mode. Part III

Hi!

In the previous article I described a new approach to processing N64 low-level polygons. This approach helped me to solve several serious issues with LLE rendering in general. However, the main practical benefit was expected in Golden Eye, Perfect Dark and Killer Instinct Gold - the only games that cannot work properly without LLE support. Does the new method help to fix the LLE-related issues in these games? Unfortunately, no. For example, KI with the old method looks like this:
It looks nearly the same with the new method.

So, the problem is somewhere else. Its source was discovered quite quickly. The sky in the screenshot above is rendered by one low-level triangle command. As I explained before, low-level triangle commands render not triangles but trapezoids. In this case one triangle command renders the sky rectangle. I noticed that the lower vertices of the rectangle have a negative W coordinate. Normally, W cannot be negative. A polygon with a negative W vertex coordinate must be clipped, and the microcode running on the RSP performs that clipping. However, the sky polygons in KI and the sky and water polygons in GE and PD are exceptions: the crafty Rare programmers sent raw low-level polygon data directly to the RDP, bypassing RSP processing. That is why these games need LLE support even in HLE mode. The code that generates this low-level data is probably buggy and sometimes produces incorrect results. You may run KI and see that sometimes the sky is correct, and a few seconds later it is wrong again.

The AL RDP plugin has no problems with such polygons. After long debugging I found that it implements an interesting feature: texture coordinate clamping. It is not the standard tile clamp explained in the N64 manuals. It is rather a sanity test: if a texture coordinate cannot be calculated correctly, it is force-clamped to a special value. Negative W is one of the cases that trigger this clamping. I dumped the texture coordinates calculated by AL RDP for the KI case. Look at this diagram:
It shows how the S coordinate changes from the top to the bottom of the rectangle. It wraps several times, but at the bottom it becomes constant. That is where the W coordinate turns negative. The sky polygon alone looks like this (click to see full size):
As you can see, the very bottom part of the polygon is filled with a constant color. This part is usually covered by other geometry; I hacked the AL RDP sources to get that picture.

The AL RDP software plugin emulates the work of the RDP and renders that polygon line by line. When W becomes negative on some line, the RDP clamps the texture coordinate to a constant. That constant coordinate points to a texel inside the texture, and this texel is used for all pixels in the line.
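A minimal sketch of this sanity clamp, with illustrative names and a made-up clamp value (this is not the actual AL RDP source):

```cpp
#include <cstdint>

// Hypothetical sketch of the per-pixel sanity test described above.
// Perspective-corrected texture coordinates are computed as s/w; when w
// is not positive, the division is meaningless, so the coordinate is
// force-clamped to a constant instead of being interpolated further.
int32_t perspectiveTexCoord(int32_t s, int32_t w, int32_t clampValue) {
    if (w <= 0)            // negative (or zero) W: result is undefined
        return clampValue; // force-clamp to a constant coordinate
    return s / w;          // regular perspective division
}
```

With the coordinate pinned to one constant, every pixel of the affected scanline samples the same texel, which produces the solid-color band at the bottom of the sky polygon.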

A hardware renderer cannot work this way. Color, depth and texture coordinates are provided per vertex and interpolated for each pixel inside the polygon. Interpolation is a smooth function; here the texture coordinates do not behave smoothly, so interpolation does not work as it should.

I found a solution. All coordinates work properly while W is positive. If W becomes negative for some vertex (or vertices), the algorithm searches for the Y coordinate where W changes its sign. The part of the polygon from the top to that Y is rendered as usual. The part below Y is rendered too, but all vertices of that part are given the same texture coordinate, so it is filled with a constant color fetched from the texture. The result:

That fix also solved the water texture issue in the Golden Eye Frigate level. However, I met another issue there: the colors of the water and sky were somehow wrong, not as dark as they should be:
The color combiners for the water and sky mix the texture color with the shading color. Since the texturing looks correct, the problem had to be in the shading color. I compared the vertex colors calculated by GLideN64 with the colors calculated by AL RDP at the same points. The results were very close. I decided to hack the color combiner for the water: remove the texturing and draw only the shading color:
This result puzzled me at first. The input data is nearly the same, but the output is dramatically different. The color of the top vertices is dark and rather blue, so the result should look like the right screenshot, from AL RDP. Then I noticed that the value of W is very high for the top vertices but very low at the bottom:

This explains the problem. The N64 hardware is not powerful enough to perform perspective correction for colors. It uses plain Gouraud shading, that is, simple interpolation of vertex colors. GLideN64 is powered by OpenGL. Modern OpenGL applies perspective correction to all outputs of the vertex shader by default, including, of course, the shading color. Perspective correction makes the shading color almost constant in this case, because the differences in vertex color intensity are compensated by the differences in vertex W. Luckily, OpenGL allows disabling perspective correction for any parameter: in GLSL this is the noperspective interpolation qualifier. I disabled perspective correction for the shading color and finally got the correct result:
Thus, the LLE-specific issues in KI, GE and PD have been fixed. GLideN64 LLE rendering still has the unsolved issues mentioned in the previous article, so this work remains WIP. Alpha builds are available to the project's patrons on Patreon.com.


4 comments:

  1. Amazing job and thanks for the fascinating writing!

    ReplyDelete
  2. What's left that doesn't render correctly? The decals with z-fighting? Can you elaborate on the trapezoids? Why is the hardware generating these? Are you sure it's splitting them into two triangles? It could be doing a two-fold linear interpolation, which is fairly trivial to do in software since you just interpolate left to right along the scanlines, but it will produce potentially different interpolation (and depth values) for 4-sided primitives. Also depending on whether these primitives are actually co-planar.

    ReplyDelete
  3. > What's left that doesn't render correctly? The decals with z fighting?

    Yes, it is the main problem atm.

    > Can you elaborate on the trapezoids? Why is the hardware generating these.

    Do you mean the N64 hardware? The N64 low-level triangle command fills the area between three edges, see the first article. Two of the edges can be non-intersecting, even parallel, so the area between the edges is a polygon.

    > Are you sure it's splitting them into two triangles?

    My algorithm turns a triangle into a triangle, a rectangle into two triangles, and so on.

    > It could be doing a two fold linear interpolation which is fairly trivial to do in software since you just interpolate left to right along the scanlines, but it will produce potentially different interpolation (and depth values) for 4 sided primitives.

    If you calculate the vertices correctly, the resulting polygons must also be correct.

    > Also depending on whether these primitives are actually co-planar.

    This is not easy to test. I don't know the original data, so I can't say for sure whether my algorithm extracts that data correctly.

    ReplyDelete
  4. If you split the quad into 2 triangles, just calculate the normal for each and compare them. If the normals are identical, then the faces are planar, and the problem lies elsewhere.

    ReplyDelete
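The coplanarity check suggested in the last comment can be sketched as follows (illustrative code, not from GLideN64): split the quad into two triangles, compute each triangle's normal with a cross product, and compare the normalized results.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

Vec3 sub(const Vec3& a, const Vec3& b) {
    return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]};
}
Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return {v[0]/len, v[1]/len, v[2]/len};
}

// Quad vertices v0..v3 in order; triangles (v0,v1,v2) and (v0,v2,v3).
// The quad is planar iff the two triangle normals coincide.
bool quadIsPlanar(const Vec3& v0, const Vec3& v1,
                  const Vec3& v2, const Vec3& v3, float eps = 1e-5f) {
    Vec3 n0 = normalize(cross(sub(v1, v0), sub(v2, v0)));
    Vec3 n1 = normalize(cross(sub(v2, v0), sub(v3, v0)));
    return std::fabs(n0[0] - n1[0]) < eps &&
           std::fabs(n0[1] - n1[1]) < eps &&
           std::fabs(n0[2] - n1[2]) < eps;
}
```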