
Rendered or real? Real-time ray tracing makes cars look convincingly realistic.


Which is the real car?

Image: Unity Technologies

The hyper-realistic reflection of light on the ground in front of the car should’ve given it away, but I was so distracted by the realness of everything else that I didn’t pick up on the clues that the BMW 8 series coupe I was looking at wasn’t the real thing.

San Francisco-based Unity Technologies general manager Tim McDonough had given me this challenge via video chat. On the screen, he showed me three sets of side-by-side images of a BMW car on a stage. Some of the images had been created virtually, using the 3D imaging tools on the Unity Technologies platform with Nvidia software processing, so that a digital model of the car looked like it was actually sitting on a stage. I had to guess which was the real car and which was a Unity ray-traced rendering.

Real-time ray tracing, a rendering method that builds a virtual image by simulating how light falls on and interacts with the objects in a scene, produces photorealistic images that are almost indistinguishable from the real thing. Light bounces off the rearview mirror and glimmers across the grille.
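
Conceptually, the technique is straightforward: for each pixel of the final image, a ray is fired from the camera into the scene, the renderer finds the first surface it hits, and it then follows the light from that point, including reflected bounces. The toy Python sketch below only illustrates that idea and has nothing to do with Unity's or Nvidia's actual code; the single reflective sphere, the light direction, and the one-bounce reflection are all assumptions made for the example.

```python
# A minimal sketch of the idea behind ray tracing: for every pixel, fire a ray
# into the scene, find what it hits, and follow the light. The "scene" here is
# a single reflective sphere lit from one side, printed as ASCII shading.
import math

WIDTH, HEIGHT = 80, 40
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 1.0, 5.0), 1.0
LIGHT_DIR = (-0.5, 1.0, -0.5)   # direction *toward* the light

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction):
    """Return the distance along the ray to the sphere, or None if it misses."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(origin, direction, depth=0):
    """Follow a ray: shade the sphere, mirror-bounce once, else return 'sky'."""
    t = hit_sphere(origin, direction)
    if t is None:
        return 0.2  # background brightness
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(p - c for p, c in zip(point, SPHERE_CENTER)))
    diffuse = max(0.0, dot(normal, normalize(LIGHT_DIR)))
    if depth < 1:  # one reflection bounce, like light glinting off a mirror
        reflect = tuple(d - 2 * dot(direction, normal) * n
                        for d, n in zip(direction, normal))
        return 0.7 * diffuse + 0.3 * trace(point, normalize(reflect), depth + 1)
    return diffuse

chars = " .:-=+*#%@"
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a ray leaving the camera at the origin.
        u = (x / WIDTH - 0.5) * 2.0
        v = (0.5 - y / HEIGHT) * 2.0
        brightness = trace((0.0, 1.0, 0.0), normalize((u, v, 1.0)))
        row += chars[min(len(chars) - 1, int(brightness * len(chars)))]
    print(row)
```

Scaled up to millions of rays per frame on dedicated GPU hardware, that same basic loop is what makes the reflections off a rendered car's mirrors and grille look right.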

And I was easily fooled — all of the images looked real enough to me.

Unity announced the new ray-tracing tool at the Game Developers Conference in San Francisco on Monday afternoon.

It’s a tool that can be used to render cars for design or marketing purposes. Outside of the automotive industry, the technique is popular in media, entertainment, and gaming to create realistic virtual scenes, characters, and more.

For the car industry, this means being able to create a virtual version of a prototype that isn’t as far along as the real thing but still looks complete, finished, and, most importantly, real. If a designer wants to see what the car would look like parked in front of the Sydney Opera House at sunset, for instance, they can do that, without heavy equipment or the long re-rendering times needed to process the images.

Here were two of the challenges I took to see if I could tell which was a photo of a real car:

Which is the real car?

Image: Unity Technologies

Which is the real car?

Image: Unity Technologies

In the end, I got every single one wrong: 0/3. A fail for me, however, is a win for the company. The images fooled my untrained eye, which means that if a car company used the rendered versions on its marketing website, most of us wouldn’t know the difference. Designers don’t need to work with expensive tools and heavy-duty equipment when a simplified VR tool running on a standard design setup can produce the same thing.

The one on the right was the rendering in both of those challenges (did you guess correctly?), and Unity had made its point.

As McDonough said, “Ray tracing lets designers and developers create content that looks super real … You can’t tell the difference anymore.”

