Project Non-Photorealistic Rendering for Virtual Reality Applications

M.Sc. Adrian Kreskowski
M.Sc. Gareth Rendle
M.Sc. Sebastian Stickert
Prof. Dr. Bernd Fröhlich

Degree   Study Programme                      Examination Regulations   ECTS
B.Sc.    Informatik                           all                       15
B.Sc.    Medieninformatik                     all                       15
M.Sc.    Computer Science for Digital Media   PV18 and lower            15
M.Sc.    Computer Science for Digital Media   PV20                      12
M.Sc.    Computer Science and Media           all                       15
M.Sc.    Human-Computer Interaction           PV17 and lower            15
M.Sc.    Human-Computer Interaction           PV19                      12/18

Motivation

The term Non-Photorealistic Rendering (NPR) refers to a family of rendering techniques that produce stylized, simplified or abstract images based on 3D geometry. Unlike photorealistic rendering, their purpose is not the simulation of the real world in as much detail as possible, but the abstraction of objects within a specific context. Examples of this are visual simplification of architecture and highlighting of geometric features such as edges and corners [1,2], the creation of blueprints [6] or exploded view diagrams [4] for models of complex mechanical parts, and the transformation of 3D worlds into distinct works of art with a visually consistent style [5,8,9,10].

One of the main challenges in creating NPR effects is to design the algorithms such that the stylizations are consistent between different viewing perspectives to allow for stereoscopic perception [9] and temporal plausibility, for example when models are animated [3,7].

Figure: Exemplary artistic stylization of a VR scene in our distributed virtual reality framework.

Although these techniques (edge detection and enhancement, color quantization and halftoning) have a certain visual appeal when they are applied to static, monoscopic images, applying them naively in image space breaks the stereo and temporal perception of the scene, and with it the user's sense of presence. In this project, we are going to research, analyze, understand and extend the state of the art in virtual-reality-compatible non-photorealistic rendering, with the aim of creating general-purpose illustrative rendering techniques that fulfil artistic visions, as well as techniques that emphasize important geometric features (silhouette enhancement, blueprint rendering), in a cross-platform manner within Unity.
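To make the image-space nature of these effects concrete, here is a minimal NumPy sketch of two of the techniques named above, color quantization and Sobel-based edge darkening. This is an illustrative toy, not our Unity shader code; the function name, parameters and thresholds are assumptions for demonstration. Because it operates purely on a single rendered frame, running it independently per eye or per frame is exactly the naive approach that breaks stereo and temporal coherence.

```python
import numpy as np

def stylize(image, levels=4, edge_threshold=0.3):
    """Naive image-space stylization: color quantization plus Sobel
    edge darkening, applied to one RGB frame with values in [0, 1]."""
    # Color quantization: snap each channel to a few discrete levels.
    quantized = np.floor(image * levels) / levels

    # Luminance for edge detection (Rec. 709 weights).
    lum = image @ np.array([0.2126, 0.7152, 0.0722])

    # Sobel gradients via kernel-weighted shifted slices.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = lum.shape
    pad = np.pad(lum, 1, mode="edge")
    gx = sum(kx[i, j] * pad[i:i + h, j:j + w]
             for i in range(3) for j in range(3))
    gy = sum(ky[i, j] * pad[i:i + h, j:j + w]
             for i in range(3) for j in range(3))
    magnitude = np.hypot(gx, gy)

    # Darken strong edges on top of the quantized image.
    edges = (magnitude > edge_threshold)[..., None]
    return np.where(edges, 0.0, quantized)

# Toy 4x4 RGB frame; in VR this would be one eye's rendered image.
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))
out = stylize(img)
```

Note that nothing in this per-frame computation ties the edge or quantization decisions of one view to those of the other eye or the next frame, which is why view- and time-consistent formulations are needed in VR.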

Description

In this project, we will review the NPR literature, with a focus on whether existing algorithms are suitable in the context of a variety of virtual reality applications. In order to support a wide range of operating systems, such as Windows, Linux and Android, as well as different display devices such as head-mounted displays and projection-based systems, the techniques will be implemented entirely with Unity, using hand-crafted shader pipelines and Unity's abstract graphics API. In this way, the same code base can be used to create NPR rendering assets that work on any device without major changes. Depending on the size of the project group, the results of the initial research phase, and your individual interests, we will explore illustrative rendering techniques in different scenarios using a combination of architectural models, volumetric datasets, realistic avatar representations, procedural geometry, and skeletally animated models.

Do you want to learn about and create compelling NPR effects for virtual reality applications? Do you want to dive into illustrative rendering and graphics programming within Unity? Do you have at least a coarse understanding of a rasterization-based rendering pipeline?

If you answered "yes" to the questions above, we would look forward to welcoming you in our project!

State-of-the-Art VR Hardware

So that you can test reference NPR implementations and your own work on a variety of platforms, whether you are working with your colleagues in our new lab or quickly trying out an idea at home, we will provide each project participant with an Oculus Quest 2 HMD for the entire duration of the project.

Requirements

As well as willingness to work in a team, and enthusiasm for learning about and developing rendering techniques on cutting edge hardware, you should have the following competencies:

  • Solid programming skills in C# or C++ are required
  • Experience in the field of real-time computer graphics and shader programming is helpful
  • Previous experience with graphics APIs such as OpenGL, DirectX or Vulkan is helpful

If you are in doubt as to whether you fulfil the requirements, or if you have any further questions regarding the project, we are happy to have a discussion with you during the project fair on 4th of April, or after that via email to adrian.kreskowski[at]uni-weimar.de.

Assessment

The final assessment of your work will be conducted based on the project contributions of every team member, including:

  • Active participation in the project during and in between weekly meetings
  • Design, implementation and evaluation of spatially and temporally consistent non-photorealistic rendering effects usable in a cross-platform context within Unity
  • Intermediate talks
  • Intermediate and final project presentations
  • Documentation in the form of a short paper

If you are excited about real-time graphics programming, non-photorealistic rendering and virtual reality, we would be happy to see you at the project fair!

Literature

[1] Gooch, A., Gooch, B., Shirley, P., Cohen, E. (1998):
A non-photorealistic lighting model for automatic technical illustration. SIGGRAPH '98: Proceedings of the 25th annual conference on Computer Graphics and interactive techniques, pp. 447-452.

[2] Klein, A. W., Li, W., Kazhdan, M. M., Corrêa, W. T., Finkelstein, A., Funkhouser, T. A. (2000):
Non-photorealistic virtual environments. SIGGRAPH ’00: Proceedings of the 27th annual conference on Computer Graphics and interactive techniques, pp. 527-534.

[3] Kalnins, R. D., Davidson, P. L., Markosian, L., Finkelstein, A. (2003): Coherent Stylized Silhouettes, ACM Transactions on Graphics (Proc. SIGGRAPH), 22, pp. 856-861.

[4] Li, W., Agrawala, M., Salesin, D. (2004): Interactive Image-Based Exploded View Diagrams, Proceedings of Graphics Interface 2004, pp. 203-212.

[5] Santella, A. (2005): The Art of Seeing: Visual Perception In Design and Evaluation of Non-Photorealistic Rendering, Dissertation, New Brunswick, New Jersey.

[6] Nienhaus, M. (2005): Real-Time Non-Photorealistic Rendering Techniques For Illustrating 3D Scenes And Their Dynamics, Dissertation, Potsdam.

[7] Chen, J., Turk, G., MacIntyre, B. (2012): A Non-Photorealistic Rendering Framework with Temporal Coherence for Augmented Reality, 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Atlanta, GA, pp. 151-160.

[8] Hegde, S., Gatzidis, C., Tian, F. (2012): Painterly rendering techniques: a state-of-the-art review of current approaches, Computer Animation & Virtual Worlds, 24: pp. 43-64.

[9] Lee, Y., Kim, Y., Kang, H., Lee, S. (2013): Binocular Depth Perception of Stereoscopic 3D Line Drawings, Proceedings of the ACM Symposium on Applied Perception, Dublin, Ireland, pp. 31-34.

[10] Chiu, C.-C., Lo, Y.-H., Lee, R.-R., Chu, H.-K. (2015): Tone- and Feature-Aware Circular Scribble Art, Computer Graphics Forum 34.