
Sunday, April 2, 2017

Resident Evil 2

Resident Evil 2 for Nintendo 64 is a hard game to emulate. While the game uses the standard ucode (or a slight modification of the standard one), it relies on a few non-standard tricks which are hard to reproduce on PC hardware. I spent a lot of time on this game when I worked on the Glide64 plugin. The capabilities of 3dfx graphics cards allowed me to obtain a pretty good result: the game was fully playable on Voodoo4/5 with some minor glitches. Later the necessary functionality was added to the glide wrapper, so you can run the game on any modern PC card.

What makes the game hard to emulate? As you know, the game consists of static 2D backgrounds with 3D models moving over them. The background size varies from place to place: in one place it is 436x384, in another 448x328, and so on. The frame buffer size corresponds to the background size. The video interface stretches the image to the TV resolution of 640x480.

The first problem a hardware plugin faces in this game is the way the background is loaded into the frame buffer. To optimize background load and rendering on the N64 hardware, the background is loaded as an image with width 512. That is, a 448x328 image is loaded as 512x287: both contain the same 146944 texels, only the stride differs. The game allocates a color buffer with width 512 and renders the background into it with the BgCopy command. In fact, BgCopy works as a memcpy, copying the background content from one address in RDRAM to another. When the buffer copy is complete, the game allocates a buffer with the same origin but with width 448. Now the buffer has the correct proportions, and 3D models can be rendered over it.
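
A minimal sketch of what BgCopy amounts to (the function and names here are mine, not from any plugin):

    #include <stdint.h>
    #include <string.h>

    /* BgCopy is effectively a memcpy inside RDRAM: 448*328 and 512*287
       are both 146944 texels, so the same memory block can be declared
       first as a 512-wide image and later as a 448-wide one; only the
       stride used to interpret it changes. */
    void bg_copy(uint8_t *rdram, uint32_t dst, uint32_t src,
                 uint32_t width, uint32_t height, uint32_t texel_size)
    {
        memcpy(rdram + dst, rdram + src, width * height * texel_size);
    }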

Why is this a problem for a hardware graphics plugin? The plugin executes the BgCopy command, which loads a 512x287 image. It is no problem to create a 512x287 texture and render it into the frame buffer. The result will look like this:


If the background was rendered directly to the frame buffer, that result can't be fixed. If a frame buffer object is used for rendering, you may try to change the size of the buffer texture the same way the N64 changes the size of the color buffer. I did not find a way to change the size of an existing texture in OpenGL without losing its content. glTexImage2D can change the size/format of an existing texture object, but it discards all previous pixel data. Of course, it is possible to copy the texture data to conventional memory, resize the texture and write the data back, but that is slow. If you know a better method, please share.
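
For reference, a sketch of that slow path, assuming an RGBA8 buffer texture and a desktop GL context (the function name is mine):

    #include <GL/gl.h>
    #include <stdlib.h>

    /* Read the 512x287 content back to conventional memory, then
       reallocate the texture as 448x328 with the same bytes. The texel
       count is identical (512*287 == 448*328), so the data is simply
       reinterpreted with the new stride. Correct, but the GPU-to-CPU
       readback stalls the pipeline. */
    void resize_buffer_texture(GLuint tex)
    {
        void *pixels = malloc(512 * 287 * 4);
        glBindTexture(GL_TEXTURE_2D, tex);
        glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 448, 328, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        free(pixels);
    }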

There is a fast solution to the problem: a hack. The value of the video interface register VI_WIDTH is the same as the actual width of the background image. Thus, we can recalculate the background image dimensions and load it properly:


I used that hack in Glide64, and I still don't know a better solution. Unfortunately, it works only for HLE, because BgCopy is a high-level command. For LLE we still need to resize the buffer texture somehow.
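
The hack itself is tiny. A sketch, with my own names for the plugin-side variables:

    #include <stdint.h>

    /* When a width-512 background load is detected, take the real width
       from VI_WIDTH and recompute the height so that the texel count
       stays the same: 512*287/448 == 328. */
    void fix_background_size(uint32_t vi_width,
                             uint32_t *bg_width, uint32_t *bg_height)
    {
        if (*bg_width == 512 && vi_width != 0 && vi_width != 512) {
            *bg_height = (*bg_width * *bg_height) / vi_width;
            *bg_width  = vi_width;
        }
    }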

The next problem is depth compare. I have already described the problem here and here, so I will quote myself:
"A few games use scenes consisting of 3D models moving over a 2D background. Some of the objects in the background can be visually "closer" to the user than a 3D model, that is, part of the 3D model is "behind" that object and that part must not be drawn. For a fully 3D scene the "object behind another object" problem is usually solved by the depth buffer. A 2D background has no depth, so the depth buffer by itself can't help. Zelda OOT solves that problem by rendering an auxiliary 3D scene with flat-shaded polygonal objects corresponding to the objects in the background. Thus the scene gets a correct depth buffer. Then the background covers this scene, and the 3D models rendered over the background are cut by the depth buffer where the models are behind the original polygonal objects.
In Resident Evil 2 all screens are 3D models over 2D backgrounds. But the game does not render auxiliary 3D geometry to build the depth buffer. Instead, the game ROM contains pre-rendered depth buffer data for each background. That depth buffer data is copied into RDRAM, and each frame it is rendered as a 16bit texture into a color buffer, which is then used as the depth buffer. To emulate this on PC hardware, the depth buffer data must be converted into the format of the PC depth buffer and copied into the PC card's depth buffer."

Glide64 was the first plugin where this problem was solved. Copying values into the depth buffer was relatively easy with the glide3x API: the glide3x depth buffer format is 16bit integer, as on the N64. I could load the depth image as a 16bit RGB texture, render it to a texture buffer and then use that buffer as the depth buffer, exactly as the N64 does. OpenGL could not do that, but the glide wrapper authors also managed to solve the problem. It is a bit hackish, but it works.

GLideN64 uses another solution, which I invented for the TV monitor effect in NFL Quarterback Club 98. It is described in detail in my Depth buffer emulation II article. The depth image is loaded as a one-component (RED) texture with texel format GL_UNSIGNED_SHORT. When the texture is rendered, the fragment shader stores the fetched texel as its depth value. The depth value from the fragment shader is passed to the depth buffer, exactly as we need.
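
The core of the trick fits in a few shader lines. This is only a sketch of the idea, not the actual GLideN64 shader:

    /* GLSL source kept as a C string. The depth image is bound as a
       GL_R16 / GL_UNSIGNED_SHORT texture, so its texels arrive in the
       shader already normalized to [0, 1], which is exactly the range
       gl_FragDepth expects. */
    static const char *depth_image_fragment_shader =
        "#version 330 core\n"
        "uniform sampler2D uDepthImage;\n"
        "in vec2 vTexCoord;\n"
        "out vec4 fragColor;\n"
        "void main() {\n"
        "    gl_FragDepth = texture(uDepthImage, vTexCoord).r;\n"
        "    fragColor = vec4(0.0); // color output is unused\n"
        "}\n";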

So, we have the color background and the depth buffer correctly rendered. Victory? Not yet. The depth compare works, but not always. Here it works ok:


but if I step back, it looks like this:


Where is the problem? It is in the way the N64 depth buffer works. An N64 vertex carries an 18bit fixed-point depth value, while the N64 depth buffer stores 16bit elements. The N64 applies a non-linear transformation to the 18bit vertex depth to get the 16bit value which is used for the depth compare and then kept in the depth buffer. OpenGL uses floats for vertex depth and for the depth buffer, so it is incorrect to compare the GL depth component directly with a value from the N64 depth image. First, the same transformation must be applied to the vertex depth. Fortunately, the necessary shader code had already been written for the depth-based fog used by Beetle Adventure Racing. I reused that code and finally got a perfect result:







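For the curious, here is what that non-linear transformation amounts to, sketched in C after the publicly documented RDP depth format (3-bit exponent plus 11-bit mantissa; the low 2 bits of the 16bit element hold dz and are ignored here). The real code does the equivalent in a shader:

    #include <stdint.h>

    static const struct { uint32_t shift, add; } z_format[8] = {
        {6, 0x00000}, {5, 0x20000}, {4, 0x30000}, {3, 0x38000},
        {2, 0x3c000}, {1, 0x3e000}, {0, 0x3f000}, {0, 0x3f800},
    };

    /* Compress an 18bit vertex depth into a 16bit buffer element. The
       exponent is the number of leading 1 bits (at most 7); the mantissa
       keeps 11 bits of the remainder, so depth precision is highest near
       the viewer and drops with distance. */
    uint16_t compress_z(uint32_t z18)   /* z18 in [0, 0x3ffff] */
    {
        uint32_t exp = 0;
        while (exp < 7 && (z18 & (0x20000u >> exp)))
            ++exp;
        uint32_t mantissa =
            ((z18 - z_format[exp].add) >> z_format[exp].shift) & 0x7ff;
        return (uint16_t)((exp << 13) | (mantissa << 2));
    }
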
If you want to support my work:


13 comments:

  1. With z64gl I get this result in the main menu: https://cloud.githubusercontent.com/assets/7283660/21733350/4b190d00-d45e-11e6-9915-d69c06af8bcb.png

    The background and the text are rendered to different buffers. The plugin detects that these buffers share the same space in RDRAM and copies them together.

    I think it can be fixed with shaders. Before copying the texture it might be possible to recalculate it with a shader (from 512x287 to 448x328, for example). But I'm not sure yet if OpenGL allows this.

    Replies
    1. > The background and the text are rendered to different buffers. The plugin detects that these buffers share the same space in RDRAM and copies them together.

      Yes, the first SetColorImage command declares a color buffer with width 512, and the background is copied into it. The next SetColorImage command declares a color buffer with the same origin, but with width equal to the real width of the background image. Other objects are rendered over it.

      > I think it can be fixed with shaders. Before copying the texture it might be possible to recalculate it with a shader (from 512x287 to 448x328 for example).

      That would also be a hack. Besides, a shader modification is not enough. You will not get a correct result copying a 448x328 image to a 512x287 frame buffer.

    2. I think my code is very general. I noticed my code for Super Bowling's fb effects makes RE2's backgrounds appear. But it needs to be recalculated, like
      if width(src) > width(dst)
          recalculate(src)
      instead of cutting it off. Right now it's just cut off on the right side.
      width(src) is always 512 in RE2

      >You will not get a correct result copying a 448x328 image to a 512x287 frame buffer.

      Why would I need that?

    3. Could you point me to your code for Super Bowling?

      > Why would I need that?

      Because the background image is rendered to a buffer with width 512.

    4. >Could you point me to your code for Super Bowling?

      https://github.com/purplemarshmallow/z64/blob/a0a6dc88811b039b9b2e503e742f7eb93a321ec3/src/rgl.cpp#L631

      >Because the background image is rendered to a buffer with width 512.

      I don't copy to a buffer with width 512, only from that buffer.

    5. The idea of copying buffers and recalculating the texel position in the copy shader really may work. I had not thought about that possibility.

  2. Gonetz, have you thought about making a Patreon page for GLideN64? Emulators such as CEMU have one and they are very successful (right now raising $42,000 USD monthly from patrons. Yeah, you heard that right. Look it up).

    Replies
    1. Yes, I thought about it:
      http://gliden64.blogspot.ru/2015/06/gliden64-funding.html

  3. I'm with Kimberly J. Ortega, Patreon is a good site to reward your effort. You only need more publicity.

    Replies
    1. I have very limited time for my hobbies. PR activity needs time. All I can do is react to GitHub tickets and sometimes write articles there. Also, I'm helpless at web design and can't create a nice project page to attract patrons.

    2. Of course, family and hobbies first. No doubt.

      Btw, the CEMU page is very, very simple. I think this blog is enough to attract patrons. Keep doing this awesome work and I'm sure you'll get the people's gratitude.

  4. This comment has been removed by the author.
