A Tesla Model S was involved in a minor accident in Dallas recently, striking road barriers and skidding against concrete dividers while Autopilot was engaged.

The news first surfaced on Reddit, with the driver initially blaming Tesla's Autopilot for hitting the barrier without any warning. According to Auto Week, Autopilot was engaged when the road suddenly curved because of ongoing construction.

The current version of Autopilot cannot safely navigate unclear road markings, so the car kept going straight. The Tesla deflected off the temporary barrier, which deployed the airbags, and suffered flat tires upon hitting the concrete divider. Fortunately, no one was injured in the accident.

A debate erupted in the comments section of the Reddit post over whether the driver was at fault. The general contention is that even with Autopilot engaged, the driver should have kept his eyes on the road and made the necessary adjustments before the car hit the barrier.

Dashcam footage that surfaced online revealed that the accident occurred because the driver failed to regain control of the vehicle in time, Auto Blog reported. The road curved right but the car continued straight, clearly showing that the car's Autopilot sensors did not detect the curve because of the construction.

Calling the feature Autopilot may have misled the owner into thinking the car has the AI to recognize unmarked roadblocks; Tesla, however, has always maintained that it is up to the driver to keep their eyes on the road.

Ever since Tesla activated Autopilot in the Model S and Model X electric vehicles, there have been numerous instances where the semi-autonomous feature has helped drivers and passengers avoid accidents.

There have also been reports of nasty accidents in which Autopilot was blamed for the collision; however, the fault was later found to lie with the driver, not the car or its Autopilot feature.

This is a story worth pondering: what do incidents like this mean for vehicles equipped with an autopilot? Do drivers need more education on the safe use of driver-assistance systems like Tesla's Autopilot? The dashcam footage can be seen below.