Toyota’s Woven Planet will use a camera-only approach to develop its self-driving project, joining Tesla in adopting a vision-based strategy in the race toward autonomous driving.
Woven Planet said it could use low-cost cameras to collect data and train its self-driving system with a neural network, much like Tesla does. The company said the “breakthrough” technology could help cut costs and expand its self-driving efforts, Reuters reported.
Collecting data from cameras mounted on customer vehicles is a reliable and accurate way to evaluate the strengths and weaknesses of a self-driving system. Tesla has used this strategy to amass a vast pool of real-world data and improve the performance of its driver-assistance and autonomous driving features. Woven Planet wants to take the same approach, but gathering data is expensive and hard to scale when vehicles must be equipped with costly sensor suites.
“We need a lot of data. It is not enough just to have a small amount of data that can be collected from a small fleet of expensive self-driving vehicles,” Beinisch said. “Instead, we are trying to prove that we can unlock the advantage that Toyota and a big automaker would have, which is access to a huge set of data, but with a much lower resolution.”
Woven Planet’s cameras are 90 percent less expensive than the sensors the company used previously, and they are easy to install, which could make scaling up the project relatively straightforward.
Tesla has long taken the position that cameras can serve as a car’s eyes, and that radar or LiDAR hardware is not necessary for developing a successful autonomous driving system. Musk once called LiDAR a “crutch,” and last year, during an earnings call, Tesla said it would move to a camera-only approach to self-driving.
“When your vision works, it works better than the best human being because it’s like having eight cameras, it’s like having eyes in the back of your head, next to your head, and having three eyes with different focal distances looking ahead. That’s — and it’s processed at breakneck speed,” Musk said during the call. “There is no doubt that with a pure vision solution, we can make a car significantly safer than the average person.”
Beinisch explained that Toyota still plans to use sensors such as LiDAR and radar for its robotaxi and other autonomous vehicle projects. This is the “best and safest approach” to developing robotaxis, the report said.
“But in many years, it’s entirely possible that camera-type technology will be able to catch up and overtake some of the more advanced sensors,” Beinisch added.
During the company’s latest earnings call, Musk said he would be shocked if Tesla didn’t complete its full self-driving suite by the end of the year.
I’d love to hear from you! If you have any comments, concerns or questions, please email me at [email protected]. You can also reach me on Twitter, or if you have news tips, you can email us at [email protected].