Tesla wants videos: drivers asked to consent to footage sharing in the event of a Full Self-Driving crash

With the latest version of Tesla’s autonomous driving system, Full Self-Driving (FSD), now in beta, the company is asking drivers to consent to the collection and sharing of video captured by the car’s external and cabin cameras in the event of an accident or serious safety risk.
This will mark the first time Tesla attaches footage to a specific vehicle and driver, according to a report from Electrek. Tesla has already collected footage as part of the FSD program, but that data has only been used to train and improve its autonomous driving AI. Under the new agreement, Tesla will be able to associate the videos with specific vehicles – and therefore with drivers.
“By enabling FSD Beta, I consent to Tesla collecting VIN-associated image data from the vehicle’s external cameras and cabin camera in the event of a serious safety risk or a safety event such as a collision.”
This wording suggests Tesla wants to make sure it has video evidence in case its autonomous driving system is accused of causing an accident. The footage could also help the company detect and resolve serious problems more quickly.