You’re right, there are multiple ways to implement normal maps. Which way is right or optimal depends entirely on what you need, how you generate the normal maps, and how your shaders consume them. OptiX gives you access to your textures, of course, but it can’t support normal maps directly, for several reasons, including that there isn’t one right way to implement them. One way to think about it: OptiX provides a framework for building many different kinds of ray tracing renderers, but specific rendering & shading features such as normal maps are outside the scope of OptiX.
A couple of things to consider:
Normal maps are popular in rasterization pipelines, but when you combine them with ray tracing (reflections and global illumination in particular) you’re more likely to run into shading problems. Specifically, you need to make sure diffuse & reflection rays don’t accidentally trace into your surface, and that your shading math accounts for the rays that would have gone inside. This paper has a nice demonstration of the problem, a proposed solution, and plenty of good references and further reading: https://jo.dreggn.org/home/2017_normalmap.pdf
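To make the problem concrete, here’s a minimal host-side C++ sketch of one cheap way to keep reflection rays above the geometric surface when the shading normal (from a normal map) disagrees with the geometric normal. This is just an illustrative clamp heuristic, not the stochastic solution the linked paper proposes; the vector type and function names are made up for this sketch and are not OptiX API.

```cpp
#include <cassert>
#include <cmath>

// Tiny vector helpers, just for this sketch.
struct Vec3 { float x, y, z; };

static Vec3  operator*(float s, const Vec3& v) { return {s * v.x, s * v.y, s * v.z}; }
static Vec3  operator+(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Mirror-reflect direction wi about unit normal n (wi points away from the surface).
static Vec3 reflect(const Vec3& wi, const Vec3& n) {
    return 2.0f * dot(wi, n) * n - wi;
}

// Reflect about the shading normal Ns, then, if the result dipped below the
// geometric surface (negative dot with the geometric normal Ng), project out
// the below-surface component so the ray stays just above the surface.
// A cheap heuristic; it biases the reflection lobe near grazing angles.
static Vec3 reflectWithClamp(const Vec3& wi, const Vec3& Ns, const Vec3& Ng) {
    Vec3 wo = reflect(wi, Ns);
    float below = dot(wo, Ng);
    if (below < 0.0f) {
        wo = normalize(wo - (below - 1e-4f) * Ng);
    }
    return wo;
}
```

With a strongly tilted shading normal, the plain `reflect()` result points into the surface, while `reflectWithClamp()` returns a unit direction on the correct side of the geometric plane; you’d call something like this from your closest-hit program before spawning the secondary ray.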
You may want to evaluate whether displacement maps are a realistic alternative for you. Tessellating will typically consume more memory, but it can simplify your shader and improve its performance, while removing the need for normal maps entirely. Displacement is also more “physically correct”, if that matters to you. Depending on your scenes, trace times with displaced meshes can be nearly as fast as, or as fast as, trace times with un-displaced meshes, especially on RTX hardware, since larger meshes tend to utilize the hardware more efficiently than small ones.
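For what that might look like in practice, here’s a minimal sketch of the core of displacement mapping: offsetting each (already tessellated) vertex along its normal by a height sampled from the displacement map, on the host, before building the OptiX acceleration structure. The struct layout, the mid-gray-means-zero remapping, and the function name are all assumptions for illustration, not anything OptiX prescribes.

```cpp
#include <cassert>
#include <cmath>

// Illustrative vertex layout: position plus unit normal.
struct Vertex {
    float px, py, pz;  // position
    float nx, ny, nz;  // unit normal
};

// height: the displacement-map value sampled at this vertex's UV, in [0, 1],
//         remapped so a mid-gray texel (0.5) means "no offset".
// scale:  artist-controlled displacement amplitude in scene units.
static Vertex displace(const Vertex& v, float height, float scale) {
    float h = (height - 0.5f) * scale;
    Vertex out = v;
    out.px += h * v.nx;
    out.py += h * v.ny;
    out.pz += h * v.nz;
    return out;
}
```

You’d run this over the tessellated mesh once (or per frame, if the map animates) and feed the displaced positions to the geometry acceleration structure build; after that, hits land on the real displaced surface and the shader needs no normal-map lookup for the coarse detail.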