Market Report
Product Code
1518884

Analysis on DJI Automotive's Autonomous Driving Business, 2024

Published: | Research firm: ResearchInChina | Pages: 140 pages (English) | Delivery: 1-2 business days

    
    
    



※ This report is in English. Where the Korean and English tables of contents differ, the English takes precedence; please refer to the English table of contents for accurate review.


This report investigates and analyzes DJI Automotive's autonomous driving business, providing information on its core technologies, solutions, and development trends.


Research on DJI Automotive: leading the NOA market by virtue of a unique technology route.

In 2016, DJI Automotive's internal technicians installed a stereo-sensor-plus-vision fusion positioning system in a car and ran it successfully. Technologies that DJI had accumulated in the drone field, such as perception, positioning, decision-making, and planning, were thus successfully transferred to the intelligent driving field.

Almost all founding and management team members of DJI Automotive came from DJI's drone projects. DJI Automotive had only about 10 members at the beginning, mainly staff temporarily transferred from DJI's Flight Control Department and Vision Department at that time.

DJI claims that it is a company specializing in the research of intelligent robots, and that drones and autonomous vehicles are simply different forms of intelligent robot. Relying on its unique technology route, DJI holds the lead in the mass production and application of NOA. By DJI Automotive's estimates, around 2 million passenger cars on the road will be equipped with its intelligent driving systems in 2025.

Continuously optimize stereo vision sensors

One of the core technologies of DJI Automotive is stereo vision. Even when other sensors such as GPS fail, a drone can still hover, avoid obstacles, and measure speed based on the visual perception of its stereo camera.

After applying stereo vision technology to autonomous vehicles, DJI Automotive continues to optimize stereo vision sensors according to requirements of different autonomous driving levels.

In 2023, to meet the needs of NOA, DJI Automotive launched the second-generation inertial navigation stereo vision system, which eliminates the overall lens hood by adding a customized optical polarizer and cancels the rigid connecting rod using a better self-calibration algorithm. This makes it easier to install the sensor, and the distance between two cameras can be flexibly configured from 180 mm to 400 mm. Elimination of the rigid connecting rod is a huge progress in stereo vision sensors, allowing stereo cameras to be applied in much more scenarios.
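The depth-from-disparity relation behind such a configurable-baseline stereo rig can be sketched as follows. The focal length and the 50 m target range are illustrative assumptions, not DJI's actual specifications; only the 180 mm and 400 mm baselines come from the report.

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A wider baseline produces a larger disparity for the same object,
# i.e. finer depth resolution at long range.
f = 1000.0                        # focal length in pixels (assumed)
for baseline in (0.18, 0.40):     # 180 mm and 400 mm, per the report
    d = f * baseline / 50.0       # disparity of an object 50 m away
    print(f"baseline {baseline:.2f} m -> disparity at 50 m: {d:.1f} px")
```

This is why a flexible baseline matters: doubling it from 180 mm to 400 mm roughly doubles the disparity signal available at any given range.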

Based on the needs of L3 autonomous driving, in 2024 DJI Automotive introduced a LiDAR-vision system, which combines LiDAR, a stereo sensor, a long-focus mono camera, and inertial navigation. Compared with the "LiDAR + front camera" solution currently common on the market, the system can reduce costs by 30% to 40% while fully replacing its functions at equal performance. Thanks to the integrated design, the "LiDAR-vision" solution can also be built into the cabin as a whole, reducing overall installation costs.

The "LiDAR-vision" solution can further enhance safety in vehicle longitudinal control. Thanks to LiDAR's precise ranging capability and robustness to illumination, it can further improve the safety and comfort of the intelligent driving system in scenarios such as close-range cut-ins, complex urban traffic flow, response to vulnerable road users (VRU), arbitrary obstacle avoidance, detours, and VRUs at night.
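A minimal illustration of why precise ranging matters for longitudinal control: time-to-collision (TTC), a common input to braking decisions, inherits any relative error in the measured range. The numbers and the 2.4 s threshold below are hypothetical, not DJI's control logic.

```python
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """TTC = range / closing speed; infinite if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

# A vehicle cuts in 20 m ahead while we close at 8 m/s (~29 km/h faster).
ttc = time_to_collision_s(20.0, 8.0)
print(f"TTC: {ttc:.1f} s")

# A 10% ranging error shifts TTC by the same 10%, which can decide
# whether a hypothetical braking threshold (e.g. 2.4 s) is crossed.
ttc_err = time_to_collision_s(22.0, 8.0)
print(f"TTC with +10% range error: {ttc_err:.2f} s")
```

In the cut-in scenarios the report highlights, the range is small, so even modest absolute ranging errors translate into large relative TTC errors.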

Use drone technologies for data acquisition and simulation

Among the three autonomous driving data acquisition methods, acquisition by vehicles is the most common, but the proportion of effective data is low, the test vehicle easily interferes with the real behaviors of surrounding vehicles, and data in the sensors' blind spots cannot be recorded. Another method is field acquisition, which offers low flexibility and insufficient reliability as a result of angle skew and low image accuracy.

According to in-depth research by fka, the automotive technology research institute of RWTH Aachen University, and DJI Automotive's own practice over the past two years, aerial survey data acquisition by drones has obvious advantages. Drones can collect richer and more complete scenario data; they can directly capture unobstructed, objective aerial shots of all vehicles, including those in the target vehicle's blind spots; they reflect more realistic, interference-free human driving behavior; and they collect data more efficiently in specific road sections and special driving scenarios, for example on/off-ramps and frequent cut-ins.

Why does the implementation of vision-only autonomous driving suddenly accelerate?

Why has the pace of implementing vision-only technology solutions suddenly quickened since 2024? The answer is foundation models. Research shows that a truly autonomous driving system needs at least about 17 billion kilometers of road verification before it is production-ready. The reason is that even if existing technology can handle more than 95% of common driving scenarios, problems may still occur in the remaining 5% of corner cases.

Generally, learning a new corner case requires collecting more than 10,000 samples, and the entire cycle takes more than two weeks. Even if a team had 100 autonomous vehicles conducting road tests 24 hours a day, the time required to accumulate enough data would be measured in hundreds of years - which is obviously unrealistic.
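The "hundreds of years" figure follows from simple arithmetic. The average test speed below is our assumption, not the report's; the 17 billion km target and the 100-vehicle fleet come from the text.

```python
TARGET_KM = 17e9          # road verification mileage cited in the report
FLEET = 100               # test vehicles running around the clock
AVG_SPEED_KMH = 60.0      # assumed average test speed
HOURS_PER_YEAR = 24 * 365

km_per_year = FLEET * AVG_SPEED_KMH * HOURS_PER_YEAR  # ~52.6 million km/year
years = TARGET_KM / km_per_year
print(f"{years:.0f} years")   # roughly 323 years
```

Even doubling the fleet or the average speed only brings this down by a constant factor, which is why synthetic corner-case generation, rather than more road testing, closes the gap.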

Foundation models are used to quickly restore real scenarios and generate corner cases in various complex scenarios for model training. Foundation models (such as Pangu model) can shorten the closed-loop cycle of autonomous driving corner cases from more than two weeks to two days.

Currently, DJI Automotive, Baidu, PhiGent Robotics, GAC, Tesla, and Megvii, among others, have launched vision-only autonomous driving solutions. This report summarizes and analyzes vision-only autonomous driving routes.

Table of Contents

1 Overview

  • 1.1 Profile
  • 1.2 Team
  • 1.3 Development History
  • 1.4 All-scenario Intelligent Driving User Journey Map
  • 1.5 Products Layout (1)
  • 1.6 Products Layout (2)
  • 1.7 Products Layout (3)
  • 1.8 DJI's Judgment on Evolution Trends of High-level Intelligent Driving

2 Core Technologies

  • 2.1 Inertial Navigation Stereo Vision: Developed to the 2nd Generation
  • 2.2 Stereo Vision Perception Technology (1)
  • 2.3 Stereo Vision Perception Technology (2)
  • 2.4 Parameter Comparison between Vision Sensors
  • 2.5 LiDAR-vision System and Specifications
  • 2.6 Intelligent Driving Domain Controller
  • 2.7 Domain Controller Middleware
  • 2.8 Lingxi Intelligent Driving System 2.0
  • 2.9 BEV Perception Technology

3 Intelligent Driving Solutions

  • 3.1 Intelligent Driving Solutions
  • 3.2 Solution 1
  • 3.3 Solution 2
  • 3.4 Solution 3
  • 3.5 Solution 4
  • 3.6 Solution 5
  • 3.7 Solution 6
  • 3.8 Solution Comparison: Parking Functions and Sensor Configurations
  • 3.9 Solution Comparison: Driving Functions and Sensor Configurations
  • 3.10 Lingxi Intelligent Driving Technology Route

4 Mass Production and Application of DJI Automotive's Intelligent Driving Solutions

  • 4.1 Application Case 1
  • 4.2 Application Case 2
  • 4.3 Application Case 3
  • 4.4 Lingxi Intelligent Driving 2.0 for Baojun Yunduo
  • 4.5 Application Case 4
  • 4.6 Application Case 5

5 How Does DJI Automotive Develop Its NOA System?

  • 5.1 Introduction to DJI Automotive's NOA Solution
  • 5.2 How to Choose Perception Routes for Global Vision
  • 5.3 How to Establish Environment Perception and Prediction Capabilities
  • 5.4 How to Establish High-precision Local Pose Estimation Capabilities
  • 5.5 DJI Automotive's Algorithms and Models to Enable NOA
  • 5.6 How DJI Automotive Realizes NOA (1)
  • 5.7 How DJI Automotive Realizes NOA (2)
  • 5.8 How DJI Automotive Realizes NOA (3)
  • 5.9 How DJI Automotive Realizes NOA (4)
  • 5.10 How DJI Automotive Realizes NOA (5)
  • 5.11 How DJI Automotive Realizes NOA (6)
  • 5.12 How DJI Automotive Realizes NOA (7)
  • 5.13 How DJI Automotive Realizes NOA (8)
  • 5.14 How DJI Automotive Realizes NOA (9)
  • 5.15 How DJI Automotive Realizes NOA (10)
  • 5.16 How DJI Automotive Realizes NOA (11)
  • 5.17 How to Ensure Reliability