Optical Laser Systems for Autonomous Driving
The OSRAM Podcast: Episode #4 with Gunnar Moos
Hello, and welcome to the Photonstudio of the OSRAM Podcast. My name is Dieter Schierer and I am an OSRAM employee in the digital communication department. I'm looking forward to a new episode on the wonders of light. Today it is about autonomous driving and the key role that light plays in it. We know about autonomous driving from numerous science fiction films. It always works very simply there. The actor gets into a futuristic vehicle, makes himself comfortable, tells the on-board computer the destination and off he goes. And indeed, this vision of the future is already taking shape today. But time will pass before we can sit back in our vehicles everywhere and read the newspaper. Optical laser systems that record the environment play a central role on the way to this goal. Today I would like to find out what technical challenges still have to be overcome and what contribution OSRAM is making to ensuring that cars can see better and better.
I have invited Gunnar Moos to the Photonstudio today. He is head of the Autonomous Driving division in OSRAM's Automotive business unit. With his teams in Munich and Berlin he is working on the development of high-resolution LiDAR modules.
Dieter: Gunnar, I am very pleased that you are with us in the Photonstudio today!
Gunnar: Yes, I would like to thank you for giving me the opportunity to tell you something about our work. I am very happy to be here.
Dieter: Thank you, thank you! Let's move on to my first question. You are at OSRAM in the Automotive Business Unit and head of the autonomous driving department. What does it have to do with OSRAM now that everyone still knows us as lighting experts? How does autonomous driving fit in with OSRAM?
Gunnar: That's exactly the point, it's not about lamps, it's about light. It's about what we can do with light when we drive autonomously. That can mean several things: what driving functions we can enable, which points toward sensor technology, or what we can do in the area of comfort. The key point is that light can do more than just illuminate. That's why we are looking at precisely this application in the area of autonomous driving.
Dieter: Can you tell us a little more about what your working day looks like?
Gunnar: I'll take two more examples of the topics we deal with. One is the topic of LiDAR sensor technology. It's about using this sensor to scan the environment of an autonomous vehicle. In other words, an invisible, infrared laser scans the environment and can thus detect objects in the surrounding area. A pulse is emitted, it is reflected by another car, a pedestrian or a dog, and it can be received again by the LiDAR sensor. Then, from the time it takes for the laser pulse to return, one can calculate how far away the object is. And if you do this point by point, you can put together a complete environmental image of a vehicle. Something like a 3D map of the vehicle's environment. And if I know exactly what the vehicle's surroundings look like, a driver assistance system or an autonomous driving system can of course also draw conclusions and say turning left is possible, or no, it is not possible, instead go straight for another 10 meters and then you can turn left. We are developing such a LiDAR sensor in our team in Berlin and in Garching.
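The time-of-flight principle Gunnar describes can be sketched in a few lines of Python. This is a simplified illustration of the physics, not OSRAM's actual signal processing, and the example round-trip time is invented:

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, from the pulse's round-trip time."""
    # The pulse travels to the object and back, hence the division by two.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 1 microsecond came from an object
# roughly 150 metres away.
print(round(distance_from_round_trip(1e-6)))  # → 150
```

Repeating this measurement point by point across many directions is what yields the 3D map of the surroundings that Gunnar mentions.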
Another example is the interior of the vehicle. The driver's cab is increasingly becoming the living room, study or even the children's room. For example, work is being done on how to use the windows in a vehicle in order to watch videos or edit e-mails. As a large television or monitor, so to speak. And here comes the light again. That means that you can use LEDs or lasers to project these images onto windows. These are topics that not only OSRAM, but many companies in industry are working on. But our role is of course to develop these light systems and to see where we as OSRAM can create added value and a solution for our customers.
Dieter: You talked about LiDAR. I've heard it a thousand times and I think our listeners have heard it too, but what is LiDAR anyway?
Gunnar: LiDAR is an abbreviation and stands for light detection and ranging. The funny thing is that it is not a new thing but something very old that has been used for a long time, for example in atmospheric physics, to find out what gas concentration you have at a height of a hundred kilometres. In the field of autonomous driving this is used to measure distances around the vehicle. In other words, when you talk about the LiDAR sensor, you are talking about a sensor that is able to measure in a very specific direction how far away an object is. And the fact that the LiDAR sensor can make a direct distance measurement distinguishes it from other sensors.
Dieter: But why develop this when radar already exists? Everybody knows radar from ships or airplanes, why not simply place a radar on the roof of a car?
Gunnar: It is not an either/or. Autonomous vehicles require a combination of sensors, usually camera, radar and LiDAR, as each of these sensors has certain advantages. For example, a LiDAR sensor can take pictures with very good resolution at long distances. I can see whether there is a person standing at 200 meters or whether there is no person standing there. A radar, for example, cannot measure with such high resolution. On the other hand, a lot is done with cameras, which can see colours and can, for example, read traffic signs. With cameras you can also see what facial expression a person at the side of the road has, and whether he looks like he is about to cross the road or not. But what the camera cannot do is measure distance. Or when a shadow is cast on the road, the camera only sees a dark line. Whether that is just a shadow or whether there is a pole or even a tree trunk lying there, the camera cannot always distinguish, or only with uncertainty. And if I can measure the distance directly with a LiDAR, then I can see whether there is a larger obstacle or only a shadow. That means you combine the individual advantages of these sensors, such as distance measurement, resolution, weather suitability, colour vision and night vision, so that you have a complete environment model of the vehicle that is as reliable as possible.
Dieter: Then all three systems, i.e. camera, radar and LiDAR, will be installed in every car or will one technology eventually prevail?
Gunnar: Today we assume that these three sensors, i.e. camera, radar and LiDAR, will be used in parallel in autonomous vehicles. On the one hand, these sensor systems complement each other with their advantages and disadvantages. And on the other hand, using several sensor systems gives you a certain redundancy. If, for example, one sensor cannot provide certainty, then you look at what the other sensors say. So you can rely on several systems, and only when a consistent picture emerges do you derive a corresponding decision. We are familiar with such concepts from the aviation industry, for example, where this kind of thing is naturally done a lot. Safety is the big issue, so you have to know exactly what's going on around the vehicle.
Dieter: To get back to the physics of the process, how does that work exactly? A LiDAR module sends out light pulses to scan the surroundings. In which range is that, which wavelength is crucial? And the second question: I was at CES in Las Vegas with OSRAM two years ago, and there we also showed various LiDAR modules. So far I have this picture of the LiDAR module where the car has a round thing on the roof that rotates super fast and captures the surroundings. But aren't the LiDAR modules supposed to be installed somewhere you don't see them, for example in a headlight or in the front area?
Gunnar: Well, I will take both questions one after the other. How exactly does it work with the laser and the wavelength? We use a wavelength of about 900 nanometres for our LiDAR modules. This means the light is not visible, because of course you don't want to disturb or dazzle anyone. And on the other hand, the components, i.e. the laser diodes, but also the detectors that receive the pulse again, are technologically readily available and so far developed that they can be installed in the vehicle at a correspondingly low cost. The question of what I can install in a vehicle and what it costs leads straight to the second part of your question. Technically, these LiDAR systems are already super good today. I can have a perfect LiDAR image around my vehicle if I have such a sensor mounted on the roof. But a normal production vehicle probably won't have that, once cost and design come into play. In other words, it's quite clear that these LiDAR sensors, like cameras or radar sensors, will either be built into the headlight or behind the windscreen, where cameras are often found today. Or, as you know it from distance radar, in the radiator grille. In other words, the LiDAR that is still known today as the can on the roof is what we want to integrate into the vehicle.
Dieter: We talk all the time now about cars that drive autonomously or partially autonomously. We have known them for a long time from blockbusters or science fiction films. In some films the cars will even be flying in 2020, even if that is still a great dream of the future. But where are the autonomously driving cars? We are in the year 2020 and somehow I don't see very many cars that drive completely autonomously. Can you say something about that?
Gunnar: I think you really have to split that in two. On the one hand, there are these highly autonomous vehicles, which are already available as test vehicles. That means that in the near future we'll be seeing fully autonomous vehicles in certain use cases. Be it the shuttle that takes guests from one exhibition hall to another or a shuttle that travels a fixed route from the airport to the city centre. These are limited use cases with limited autonomy. They already exist today, and that's why more and more people are seeing them. I expect it will take well over five years before this really becomes widespread. We are concentrating our work on driver assistance systems, because we expect these to become more and more important in the next few years. And, of course, you don't necessarily see this on the road, because on the one hand, cars no longer have the cans on their roofs, and on the other hand, they still have a driver. But the assistance systems are a great relief in my view. If I can hand over long trips on the motorway, or use the time in a traffic jam to do something else, then of course I have gained a lot.
Dieter: So far I only know this with cameras, where software processes the camera images to make this partial autonomy possible. Or is LiDAR already involved?
Gunnar: There are already the first vehicles in which LiDAR is installed, but this level 3 partial autonomy does not exist on German roads at the moment. So you can see that it is a very complex topic. It's not just a matter of developing the technology to make it possible at all, because the technology is relatively advanced. It's also a matter of testing, to make sure the vehicle has been taught all the situations that can happen out there. Last but not least, it is also about the legislation that allows me to hand over responsibility to the vehicle. That is where we are today, in 2020, on the subject of partial autonomy. Then there is highly automated driving, levels four and five, where the vehicle can do everything and you can either intervene or not intervene at all. And there, I think, it will be some time before we have that on a broad scale in our garages.
Dieter: What challenge do you see ahead of you as chief developer at OSRAM for LiDAR systems? What are you working on at full speed right now or what problems still need to be solved?
Gunnar: The crux of the matter from my point of view is to make the whole thing suitable for series production. That means, of course, that the functionality has to be provided completely, so the LiDAR sensor has the right detection quality, range, resolution and so on. But the whole thing must also meet the qualitative requirements, so that the entire technology functions reliably for many years, both in winter and at the height of summer. That means suitability for series production in a completely normal vehicle. And the entire package has to land at a cost and price point where we ourselves would buy it, so that it doesn't remain something for the top segment only.
Dieter: Stupid question, but what exactly is the difference between summer and winter for LiDAR? What is the problem here?
Gunnar: Summer and winter are first of all a huge difference in temperature. The system has to work at minus 20 degrees, and when the sun shines into the car and it gets 80 degrees in the wrong places, the system has to survive and still work properly. That's one thing, but the system also has to work just as well in snowfall. It has to handle the reflections and optical effects created by snow cover just as well as a normal road situation in summer.
Dieter: You say that with LiDAR the car can see up to 250 meters. How does this number come about? A laser can probably radiate over kilometers if the road is straight, so why does the car only see 250 meters?
Gunnar: That depends on the application. How far does the car have to see at all, so that the car can still derive a driving function and react? If I drive fast on the motorway, the question is: when can I still brake safely from 130 kilometres per hour or faster if something unexpected happens? That's where the figure comes from, and then of course it's also an economic consideration. What can and will I afford? Do I have to install a LiDAR that can look 500 meters ahead, which of course will be bigger and more expensive, or is 200 or 250 meters enough on the highway?
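The back-of-the-envelope calculation behind such a range figure can be sketched like this. The reaction time and deceleration values are illustrative assumptions, not figures from the conversation:

```python
def required_sight_distance(speed_kmh: float,
                            reaction_time_s: float = 1.5,
                            deceleration_ms2: float = 6.0) -> float:
    """Distance travelled while reacting plus braking to a full stop."""
    v = speed_kmh / 3.6                     # convert km/h to m/s
    reaction = v * reaction_time_s          # distance before brakes engage
    braking = v ** 2 / (2 * deceleration_ms2)  # classic v²/(2a) braking distance
    return reaction + braking

# At 130 km/h this lands at roughly 160 metres, so a 250-metre sensor
# range leaves a margin for first detecting and classifying the object.
print(round(required_sight_distance(130)))  # → 163
```

The margin matters because, as Gunnar notes, the system also needs time to decide with certainty what it is seeing before a driving function can react.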
Dieter: Here is a question: a LiDAR module emits light, the light is reflected by objects, and a pixel image is created where many small dots form a 3D image. You probably also need very powerful software that can derive commands for the car from this huge amount of data. How does the cooperation with software developers work, or do we develop the software ourselves?
Gunnar: Both. At OSRAM we work on software for the LiDAR sensor ourselves. But of course we also work together with partners who specialize in software and last but not least with our customers. A lot has to happen here. On the one hand, I have a point image that has to be interpreted. Is it a car, pedestrian, tree, traffic sign or is it some kind of disturbance? With what certainty do I see something? Another step is that the images and information derived from a LiDAR are then merged and compared with the information from the camera and the radar. Only from this overall image are driving functions derived in the software.
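The redundancy idea from the conversation, accepting a detection only when the overall picture is consistent, could look like this in a highly simplified form. The sensor names and the two-out-of-three rule here are illustrative assumptions, not OSRAM's actual fusion logic:

```python
def confirmed_detections(camera: set, radar: set, lidar: set,
                         min_sensors: int = 2) -> set:
    """Keep only objects reported by at least `min_sensors` of the sensors."""
    counts = {}
    for sensor_report in (camera, radar, lidar):
        for obj in sensor_report:
            counts[obj] = counts.get(obj, 0) + 1
    return {obj for obj, n in counts.items() if n >= min_sensors}

# The shadow is only "seen" by the camera, so it is discarded;
# the pedestrian and the car are each confirmed by two sensors.
result = confirmed_detections({"pedestrian", "shadow"},
                              {"car"},
                              {"pedestrian", "car"})
print(sorted(result))  # → ['car', 'pedestrian']
```

Real fusion works on positions, velocities and confidence values rather than labels, but the principle is the same: only a consistent overall picture feeds into the driving function.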
Dieter: If you were now looking for reinforcements for your team, what would you want them to bring with them so that they could work in the area of autonomous driving at OSRAM?
Gunnar: A very important point for me is the desire to build and achieve something new. We are working on the solution for autonomous driving in the future, and anyone who wants to be part of that and contribute their specific expertise is exciting for us. In other words, I'm looking for committed people with the right expertise. Be it hardware development engineers who are particularly strong in optics development or automotive electronics development. What is also a great advantage for LiDAR are software developers who can develop automotive software or have already gained experience with it, or who can help us build up the very extensive area of testing.
Dieter: And most importantly, you are working on an area that is in constant development. That must be a great feeling. What do you enjoy most about your work?
Gunnar: For one thing, it is exactly what you just said. We are building a new technical solution, which we will hopefully see on the road in X years. And at the same time we are building new business opportunities for OSRAM. That's the abstract concept behind it. In everyday life, it is the little things. For example, when we hear that customers find our LiDAR convincing, that's great feedback. I really enjoy the creative work in the team, when solutions come up in technical discussions at our laptops or by the coffee machine. When a technical problem eats up an afternoon and one or two days later ideas and solutions suddenly emerge from the many small discussions. And sometimes we can make something even better out of the solution to the problem, because a special, positive sensor characteristic suddenly develops out of it. That is of course also a great feeling in everyday life. For example, we were at CES and presented our LiDAR sensor. Of course, it's also great when the work of the last few months is tangible and vivid on the table, and not just on the computer and in PowerPoint. I think that's really cool.
Dieter: Gunnar, thank you very much for your time. You explained the area of autonomous driving to me again and showed me how many possibilities we will have and how strongly OSRAM is involved. I thank you for your time and wish you every success with further developments and discoveries.
Gunnar: Thank you, the conversation was fun and thanks to all the listeners!
In this episode of The Photonstudio, Gunnar explained to me what is behind the abbreviation “LiDAR” and why these modules are so important for the autonomous driving car of the future. You can listen to this and all the other episodes of our podcast in German on iTunes, Spotify, Soundcloud and Google Podcast. If you want to learn more about laser systems for environment detection and their importance for autonomous driving, read the online version of our innovation magazine ON. At www.osram-group.com/innovation you will also find many other exciting articles from the world of photonics. Have fun and see you in the next episode of The Photonstudio!