The Method to Order Point Clouds for Visualization on the Ray Tracing Pipeline

Abstract

Currently, the digitization of environment objects (vegetation, terrain, architectural structures, etc.) in the form of point clouds is actively developing. Integrating such digitized objects into virtual environment systems improves the quality of the modeled environment, but requires efficient methods and algorithms for real-time visualization of large volumes of points. This paper investigates solving this task on modern multi-core GPUs that support hardware-accelerated ray tracing. A modified method is proposed in which the original unordered point cloud is split into point groups whose visualization is efficiently parallelized on ray tracing cores. The paper describes an algorithm for constructing such groups using swapping arrays of point indices; it runs faster than alternative solutions based on linked lists and also has lower memory overhead. The proposed method and algorithm were implemented in a point cloud visualization software complex and tested on a number of digitized environment objects. The results confirmed the efficiency of the proposed solutions and their applicability to virtual environment systems, video simulators, geoinformation systems, virtual laboratories, etc.
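
The abstract does not spell out the grouping algorithm, so below is a minimal illustrative C++ sketch of how an unordered point cloud can be split into contiguous groups using two swapping arrays of point indices rather than per-point linked-list nodes. It assumes that Kmax is the maximum number of points per group (as suggested by Fig. 1) and that each resulting group is later wrapped in an axis-aligned bounding box for the ray tracing pipeline; the names Point3, Group, splitRange, and buildGroups are hypothetical and not taken from the paper.

```cpp
// Illustrative sketch (not the authors' exact code): partition an unordered
// point cloud into groups of at most Kmax points by recursively splitting
// each range along the longest axis of its bounding box. Two index arrays
// swap roles between recursion levels, so no per-point list nodes are needed.
#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

struct Point3 { float x, y, z; };         // hypothetical point record
struct Group  { uint32_t first, count; }; // contiguous span in the ordered index array

static float axisValue(const Point3& p, int axis) {
    return axis == 0 ? p.x : (axis == 1 ? p.y : p.z);
}

// Splits src[first, first + count) into groups of at most Kmax points.
// 'src' and 'dst' are the two swapping index arrays; finished groups are
// appended to 'ordered' and described in 'groups'.
static void splitRange(const std::vector<Point3>& pts,
                       std::vector<uint32_t>& src, std::vector<uint32_t>& dst,
                       uint32_t first, uint32_t count, uint32_t Kmax,
                       std::vector<uint32_t>& ordered, std::vector<Group>& groups)
{
    if (count <= Kmax) {
        groups.push_back({static_cast<uint32_t>(ordered.size()), count});
        ordered.insert(ordered.end(), src.begin() + first, src.begin() + first + count);
        return;
    }
    // Bounding box of the current range and its longest axis.
    Point3 lo = pts[src[first]], hi = lo;
    for (uint32_t i = first; i < first + count; ++i) {
        const Point3& p = pts[src[i]];
        lo.x = std::min(lo.x, p.x); hi.x = std::max(hi.x, p.x);
        lo.y = std::min(lo.y, p.y); hi.y = std::max(hi.y, p.y);
        lo.z = std::min(lo.z, p.z); hi.z = std::max(hi.z, p.z);
    }
    std::array<float, 3> ext = { hi.x - lo.x, hi.y - lo.y, hi.z - lo.z };
    int axis = static_cast<int>(std::max_element(ext.begin(), ext.end()) - ext.begin());
    float mid = 0.5f * (axisValue(lo, axis) + axisValue(hi, axis));

    // Distribute indices into the destination array: the "left" half grows from
    // the front of the range, the "right" half from the back.
    uint32_t l = first, r = first + count;
    for (uint32_t i = first; i < first + count; ++i) {
        uint32_t idx = src[i];
        if (axisValue(pts[idx], axis) < mid) dst[l++] = idx;
        else                                 dst[--r] = idx;
    }
    if (l == first || l == first + count) {   // degenerate split: cut the range in half
        std::copy(src.begin() + first, src.begin() + first + count, dst.begin() + first);
        l = first + count / 2;
    }
    // Recurse with the roles of the two index arrays swapped.
    splitRange(pts, dst, src, first, l - first, Kmax, ordered, groups);
    splitRange(pts, dst, src, l, first + count - l, Kmax, ordered, groups);
}

// Returns the groups; 'ordered' receives point indices rearranged so that each
// group occupies a contiguous span (convenient for building per-group AABBs).
std::vector<Group> buildGroups(const std::vector<Point3>& pts, uint32_t Kmax,
                               std::vector<uint32_t>& ordered)
{
    std::vector<uint32_t> a(pts.size()), b(pts.size());
    for (uint32_t i = 0; i < a.size(); ++i) a[i] = i;
    ordered.clear();
    ordered.reserve(pts.size());
    std::vector<Group> groups;
    if (!pts.empty())
        splitRange(pts, a, b, 0, static_cast<uint32_t>(pts.size()),
                   std::max<uint32_t>(Kmax, 1), ordered, groups);
    return groups;
}
```

Each returned group is a contiguous span of the ordered index array, so a per-group bounding box can be computed in a single pass and registered as a custom intersection primitive, which is one way such groups could be handed to ray tracing cores.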

About the authors

P. Timokhin

Scientific Research Institute for System Analysis of the Russian Academy of Sciences

Author for correspondence.
Email: p_tim@bk.ru
Russia, Moscow

M. Mikhailyuk

Scientific Research Institute for System Analysis of the Russian Academy of Sciences

Email: mix@niisi.ras.ru
Russia, Moscow

Supplementary files

1. JATS XML
2. Fig. 1. Influence of the Kmax parameter on the average image synthesis rate in frames per second (left) and on the number NAABB of bounding boxes (right). In both plots, a logarithmic scale is used for the Kmax and NAABB axes, and curves a, b, c, d, e correspond to point clouds Nos. 2, 4, 7, 8, 9 from Table 1. In the left plot, the dashed horizontal line marks the 25 frames per second threshold corresponding to real-time visualization

Download (266KB)
3. Fig. 2. Examples of visualization frames produced by the modified point cloud visualizer. Images (a)-(d) correspond to point clouds Nos. 2, 1, 6, 4 from Table 1

Download (1MB)

© Russian Academy of Sciences, 2024