The autonomous vehicle shares the road with many other participants: bicycles, motorbikes, cars, buses, trucks with trailers, Segways, a police officer on horseback, or anything else.
Anything that is allowed to travel on the road should be included, and each of these participants may have its own way of interacting with the rest of the traffic.
For example, a motorcycle may split lanes during a traffic jam, a large truck can easily get stuck in traffic because of its slow acceleration, and a cyclist might move from the sidewalk to the middle of the road to make a left turn. It is important that all of these traffic participants be captured with their unique ways of maneuvering.
Pedestrians and their behaviors also need to be modeled, especially the way they interact with oncoming traffic.
The engineers need to reproduce pedestrian gestures, for instance whether they are watching the traffic or are distracted by texting on a phone while crossing the street. Animals' behaviors can be even more unpredictable: jumping in front of the vehicle erratically, getting stuck in the middle of the road, or staring at the car as it approaches.
Configurable pedestrian view direction
With this new feature, pedestrians can look in any direction, or they can be configured to look at a specific point; in the previous version, pedestrians always looked straight ahead while walking. This provides a more realistic picture of a pedestrian looking toward passing cars, which matters for testing the vehicle's environmental perception.
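The "look at a specific point" behavior boils down to computing a view direction from the pedestrian's position to a target. A minimal sketch of such a configuration is shown below; the `Pedestrian` class, its fields, and the `look_at` method are hypothetical illustrations, not the simulator's actual API:

```python
import math
from dataclasses import dataclass


@dataclass
class Pedestrian:
    """Hypothetical pedestrian model with a configurable view direction."""
    x: float                # world position in metres
    y: float
    view_yaw: float = 0.0   # radians; 0.0 means looking along the +x axis

    def look_at(self, target_x: float, target_y: float) -> None:
        """Orient the pedestrian's view toward a specific world point,
        e.g. an approaching car, instead of straight ahead."""
        self.view_yaw = math.atan2(target_y - self.y, target_x - self.x)


# A pedestrian at the kerb turns to look at a car approaching
# from the north-east (10 m ahead, 10 m to the left).
ped = Pedestrian(x=0.0, y=0.0)
ped.look_at(10.0, 10.0)
print(ped.view_yaw)  # pi/4, i.e. 45 degrees
```

In a real simulator the same idea would extend to a pitch angle and to smooth head-turn animation, but the core of the feature is this position-to-target direction computation.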
When simulating the real environment, along with pedestrians it is equally essential to model bicyclists as a configurable traffic participant.