Object detection for emergency steering control
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 황원준 | - |
dc.contributor.author | 강상연 | - |
dc.date.accessioned | 2022-11-29T02:32:52Z | - |
dc.date.available | 2022-11-29T02:32:52Z | - |
dc.date.issued | 2021-08 | - |
dc.identifier.other | 31241 | - |
dc.identifier.uri | https://dspace.ajou.ac.kr/handle/2018.oak/20418 | - |
dc.description | Master's thesis -- The Graduate School, Ajou University: Department of Artificial Intelligence, 2021. 8 | - |
dc.description.tableofcontents | Ⅰ. Introduction 1 Ⅱ. Related Work 5 Ⅲ. Proposed Method 7 A. Baseline Model 7 B. Proposed Method 9 C. Loss Function 11 Ⅳ. Datasets & Experiments 13 A. Datasets 13 B. Implementation Details 14 Ⅴ. Experimental results 15 A. Experimental results(mAP) for each scenario 15 B. Object Detection 16 C. Performance evaluation in real dangerous situations 17 D. Focal loss 18 Ⅵ. Conclusion 19 Ⅶ. Reference 20 | - |
dc.language.iso | kor | - |
dc.publisher | The Graduate School, Ajou University | - |
dc.rights | Ajou University theses are protected by copyright. | - |
dc.title | Object detection for emergency steering control | - |
dc.type | Thesis | - |
dc.contributor.affiliation | The Graduate School, Ajou University | - |
dc.contributor.department | Department of Artificial Intelligence, The Graduate School | - |
dc.date.awarded | 2021. 8 | - |
dc.description.degree | Master | - |
dc.identifier.localId | 1227966 | - |
dc.identifier.uci | I804:41038-000000031241 | - |
dc.identifier.url | https://dcoll.ajou.ac.kr/dcollection/common/orgView/000000031241 | - |
dc.subject.keyword | deep learning | - |
dc.subject.keyword | focal loss | - |
dc.subject.keyword | object detection | - |
dc.subject.keyword | yolo | - |
dc.description.alternativeAbstract | Traffic accidents cause a great deal of damage every year due to the negligence of drivers or pedestrians. In a sudden collision situation, the vehicle activates the Autonomous Emergency Braking (AEB) system to avoid the collision. However, situations in which a collision cannot be avoided by braking alone occur frequently. In this paper, we present a surrounding-environment recognition methodology that enables such accidents to be avoided through Automatic Emergency Steering (AES). The road-environment recognition method presented here first defines scenarios of possible accidents. To learn the defined scenarios, we replace the conventional two-step pipeline of lane recognition [1] followed by vehicle recognition with a single detector based on YOLO v2 [2]. This reduces dependence on lane markings and yields a simpler method than the conventional two-step learning approach. For an autonomous driving system that requires high-speed situational judgment, we demonstrate that this approach is more efficient than conventional methods. In addition, focal loss [3] is used to address the class imbalance between foreground and background: we present a loss that applies a modulating factor to the standard cross-entropy loss [4], reducing the weight of easily predicted examples so that training focuses on hard examples. | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
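The focal loss described in the abstract can be sketched as follows. This is a minimal binary-classification sketch of the loss from Lin et al. [3], not the thesis's own implementation; `gamma` is the modulating exponent, and setting it to 0 recovers plain cross entropy [4]:

```python
import math

def focal_loss(p, y, gamma=2.0):
    """Binary focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    p: predicted probability of the positive class (0 < p < 1)
    y: ground-truth label, 1 (foreground) or 0 (background)
    gamma: modulating exponent; gamma = 0 gives cross entropy
    """
    p_t = p if y == 1 else 1.0 - p           # probability of the true class
    return -((1.0 - p_t) ** gamma) * math.log(p_t)

# The modulating factor (1 - p_t)^gamma down-weights easy examples,
# so abundant well-classified background barely contributes to training.
easy = focal_loss(0.9, 1)  # confident, correct prediction -> tiny loss
hard = focal_loss(0.1, 1)  # confident, wrong prediction -> large loss
```

With `gamma=2`, the easy example's loss is suppressed by a factor of `(1 - 0.9)**2 = 0.01` relative to cross entropy, which is how the detector keeps the huge number of easy background anchors from dominating the gradient.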