CarND-Capstone

Category: Autonomous driving
Development tool: Jupyter Notebook
Uploaded: 2022-11-22 00:24:15 by sh-1993
Description: Programming a real self-driving car

File list:
.idea/ (0, 2020-04-29)
.idea/CarND-Capstone.iml (621, 2020-04-29)
.idea/misc.xml (221, 2020-04-29)
.idea/modules.xml (280, 2020-04-29)
.idea/vcs.xml (167, 2020-04-29)
.idea/workspace.xml (49705, 2020-04-29)
Configs/ (0, 2020-04-29)
Configs/faster_rcnn_resnet101_bosch.config (3674, 2020-04-29)
Configs/faster_rcnn_resnet101_real.config (3685, 2020-04-29)
Configs/faster_rcnn_resnet101_real_1.config (3689, 2020-04-29)
Configs/faster_rcnn_resnet101_sim.config (3683, 2020-04-29)
Configs/faster_rcnn_resnet101_sim_1.config (3683, 2020-04-29)
Configs/label_map.pbtxt (550, 2020-04-29)
Dockerfile (1207, 2020-04-29)
classifierTest/ (0, 2020-04-29)
classifierTest/tl_classifier.ipynb (41412502, 2020-04-29)
data/ (0, 2020-04-29)
data/churchlot_with_cars.csv (1961, 2020-04-29)
data/grasshopper_calibration.yml (659, 2020-04-29)
data/maptf.launch (136, 2020-04-29)
data/sim_waypoints.csv (272679, 2020-04-29)
data/wp_yaw.txt (300881, 2020-04-29)
data/wp_yaw_const.csv (323740, 2020-04-29)
imgs/ (0, 2020-04-29)
imgs/autoware_computing.png (253134, 2020-04-29)
imgs/autoware_tf1.png (404590, 2020-04-29)
imgs/autoware_tf2.png (137945, 2020-04-29)
imgs/open_simulator.png (149817, 2020-04-29)
imgs/select_waypoint.png (561527, 2020-04-29)
imgs/unity.png (222015, 2020-04-29)
requirements.txt (160, 2020-04-29)
ros/ (0, 2020-04-29)
ros/.catkin_workspace (98, 2020-04-29)
ros/launch/ (0, 2020-04-29)
ros/launch/site.launch (1009, 2020-04-29)
ros/launch/styx.launch (924, 2020-04-29)
... ...

# Udacity Self-Driving Car Engineer Nanodegree

## System Integration - Capstone project

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction [here](https://classroom.udacity.com/nanodegrees/nd013/parts/6047fe34-d93c-4f50-8336-b70ef10cb4b2/modules/e1a23b06-329a-4684-a717-ad476f0d8dff/lessons/462c933d-9f24-42d3-8bdc-a08a5fc866e4/concepts/5ab4b122-83e6-436d-850f-9f4d26627fd9) (available to students).

![front-cover](https://i.imgur.com/2nIgpgQ.jpg)

## Introduction

This project consists of two parts:

- Program a self-driving car using [ROS](http://www.ros.org/ "ROS-link") to navigate autonomously on a highway, detecting traffic lights and accelerating/decelerating accordingly to maintain a smooth and safe driving experience.
- Test the code on a real self-driving car, 'Carla', a Lincoln MKZ running Ubuntu.

## Team members

**Team name: ADAS 2.0**
| Name | Email |
| --- | --- |
| Anastasios Stathopoulos | stathopoan@gmail.com |
| Aruul Mozhi Varman S | aruulmozhivarman@outlook.com |
| Francesco Fantauzzi (lead) | Francesco_Fantauzzi@yahoo.com |
## Overview

This project consists of several ROS nodes implementing functionality such as traffic light detection, control, and waypoint following. The overall architecture is displayed below and illustrates the three basic components: perception, planning, and control.

![ROS-architecture](https://i.imgur.com/76fgoSK.png)

A representative node has been implemented for every component, as illustrated below:

## Waypoint Updater Node

This node is responsible for navigating the car along the road, adjusting the velocity of every waypoint ahead based on the traffic light state. It subscribes to the topics:

- `/base_waypoints`, the complete list of waypoints the car will be following
- `/current_pose`, the vehicle's current position
- `/traffic_waypoint`, the locations to stop at for red traffic lights

and publishes to the topic `/final_waypoints`, a list of waypoints ahead of the car. The code is located in `/ros/src/waypoint_updater/waypoint_updater.py`.

![waypoint-updater-ros-graph](https://i.imgur.com/xnGRgax.png)

This node receives all the waypoint positions of the track and stores them. Every time it receives the current car position, it finds the next closest waypoint to localize itself. If there is no traffic light ahead, it adjusts the speed of the next `LOOKAHEAD_WPS` waypoints, making sure the speed limit is not exceeded while keeping acceleration and jerk below the maximum allowed values. If a yellow or red traffic light is near, it decelerates the car as smoothly as possible while respecting the speed, acceleration, and jerk limits. While the light is red, the car is kept stationary by sending an empty list to the `/final_waypoints` topic. When the light turns green, the car accelerates to the maximum allowed speed until the next red traffic light.

## Traffic Light Detection Node

This node is responsible for detecting a yellow or red traffic light ahead and publishing its state. It subscribes to the topics:

- `/base_waypoints`, the complete list of waypoints the car will be following
- `/current_pose`, the vehicle's current position
- `/image_color`, the image taken by a camera at the front of the car

and publishes to the topic `/traffic_waypoint`, the index of the waypoint closest to the red light's stop line if one is near the car, or -1 otherwise.
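The publish contract just described can be sketched as a small decision function. This is an illustrative sketch only; the names, the `DETECTION_RANGE` threshold, and the string light states are assumptions, not the repo's actual code.

```python
# Illustrative sketch of the /traffic_waypoint publish decision:
# publish the stop-line waypoint index when a red or yellow light
# is within range, otherwise -1.
RED, YELLOW, GREEN = "red", "yellow", "green"
DETECTION_RANGE = 80.0  # meters; hypothetical threshold

def traffic_waypoint(light_state, distance_to_light, stop_line_wp_index):
    """Return the value to publish on /traffic_waypoint."""
    if distance_to_light > DETECTION_RANGE:
        return -1                   # light too far away: nothing to report
    if light_state in (RED, YELLOW):
        return stop_line_wp_index   # car must stop at this waypoint
    return -1                       # green (or unknown): keep driving
```

The waypoint updater can then treat any non-negative value on `/traffic_waypoint` as a stop target and -1 as "road clear".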
The code is located in `/ros/src/tl_detector/tl_detector.py`.

![tl-detector-ros-graph](https://i.imgur.com/DAnxuOH.png)

The node initially receives all the traffic light positions and the waypoint positions of the track and stores them, mapping each traffic light position to its nearest waypoint. It then subscribes to the `/image_color` topic to receive images from a camera at the front of the car. If the next light is farther away than the specified distance, the node publishes -1 to the `/traffic_waypoint` topic. If the light is within the specified distance, the captured image is processed and its color is classified. If the color is yellow or red, the index of the waypoint closest to the traffic light is published, indicating that the car must come to a full stop at that waypoint; otherwise -1 is published.

## DBW Node

This node is responsible for controlling the car's throttle, brake, and steering. Carla is equipped with a drive-by-wire (DBW) system, meaning the throttle, brake, and steering are electronically controlled. It subscribes to the topics:

- `/current_velocity`, the current linear and angular velocities
- `/twist_cmd`, the target linear and angular velocities
- `/vehicle/dbw_enabled`, indicating whether the car is under DBW or driver control

and publishes to the topics:

- `/vehicle/throttle_cmd`, throttle
- `/vehicle/brake_cmd`, brake
- `/vehicle/steering_cmd`, steering

The code is located in:

- `/ros/src/twist_controller/twist_controller.py`
- `/ros/src/twist_controller/yaw_controller.py`

![dbw-ros-graph](https://i.imgur.com/TXEN4pI.png)

## Notes before running

This project requires a GPU; make sure an Nvidia GPU is available. Traffic light classification is very demanding and requires a lot of computational power, so following this recommendation ensures a smooth and proper simulation.
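To illustrate the kind of feedback loop a twist controller typically uses to track the target velocity, here is a minimal PID sketch. The gains, output clamp, and reset behavior are illustrative assumptions, not the repo's tuned implementation.

```python
# Minimal PID controller sketch for a throttle-style command in [0, 1].
# Gains and limits are illustrative, not the project's tuned values.
class PID:
    def __init__(self, kp, ki, kd, mn=0.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.min, self.max = mn, mx
        self.integral = 0.0
        self.last_error = 0.0

    def reset(self):
        # Called when dbw_enabled goes False, so the integral term
        # does not wind up while a safety driver has control.
        self.integral = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        # Accumulate the integral, estimate the derivative, and clamp
        # the combined output to the allowed command range.
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.min, min(self.max, out))
```

Resetting on every `/vehicle/dbw_enabled` transition is the important detail: without it, the integral accumulated during manual driving would produce a throttle jolt when autonomous control resumes.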
## Running Instructions

Run the ROS code and open the simulator (see the instructions in the Usage section). For the car to move autonomously, uncheck the "*Manual*" checkbox. To turn on the camera and enable traffic light recognition, make sure "Camera" is checked. After you first check "Camera" during a simulation, traffic light recognition takes a few seconds to start working; if the car reaches a traffic light before then, it is likely to ignore it.
## Installation Instructions

### Native Installation

* Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. [Ubuntu downloads can be found here](https://www.ubuntu.com/download/desktop).
* If using a Virtual Machine to install Ubuntu, use the following minimum configuration:
  * 2 CPUs
  * 2 GB system memory
  * 25 GB of free hard drive space

  The Udacity-provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using it.
* Follow these instructions to install ROS:
  * [ROS Kinetic](http://wiki.ros.org/kinetic/Installation/Ubuntu) if you have Ubuntu 16.04.
  * [ROS Indigo](http://wiki.ros.org/indigo/Installation/Ubuntu) if you have Ubuntu 14.04.
* [Dataspeed DBW](https://bitbucket.org/DataspeedInc/dbw_mkz_ros)
  * Use this option to install the SDK on a workstation that already has ROS installed: [One Line SDK Install (binary)](https://bitbucket.org/DataspeedInc/dbw_mkz_ros/src/81e63fcc335d7b64139d7482017d6a97b405e250/ROS_SETUP.md?fileviewer=file-view-default)
* Download the [Udacity Simulator](https://github.com/udacity/CarND-Capstone/releases).

### Docker Installation

[Install Docker](https://docs.docker.com/engine/installation/)

Build the docker container:
```bash
docker build . -t capstone
```

Run the docker file:
```bash
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
```

### Usage

1. Clone the project repository
```bash
git clone https://github.com/udacity/CarND-Capstone.git
```
2. Install python dependencies
```bash
cd CarND-Capstone
pip install -r requirements.txt
```
3. Make and run styx
```bash
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
```
4. Run the simulator

### Real world testing

1. 
Download the [training bag](https://drive.google.com/file/d/0B2_h37bMVw3iYkdJTlRSUlJIamM/view?usp=sharing) that was recorded on the Udacity self-driving car (a bag demonstrating the correct predictions in autonomous mode can be found [here](https://drive.google.com/open?id=0B2_h37bMVw3iT0ZEdlF4N01QbHc))
2. Unzip the file
```bash
unzip traffic_light_bag_files.zip
```
3. Play the bag file
```bash
rosbag play -l traffic_light_bag_files/loop_with_traffic_light.bag
```
4. Launch your project in site mode
```bash
cd CarND-Capstone/ros
roslaunch launch/site.launch
```
5. Confirm that traffic light detection works on real-life images
