Welcome aboard! Let us explain what this project is about.
This example targets achromatopsia, a condition in which the world is seen only in grayscale. There are many more types of color vision deficiency than achromatopsia.
We all love Berlin. The picture above shows Friedrichstraße, close to Berlin Hauptbahnhof (the central station). You can see the heavy traffic here. Now imagine that you are color-blind: if a car or a bicycle unexpectedly appears in front of you, can you react and brake at exactly the right time? Color vision deficiency is critical in driving situations and can endanger lives. Many people with color vision deficiency cannot drive, and in some countries they are not even allowed to obtain a driving license.
Our Color Saves Life program steps in at this point. Our goal is to attach a transparent display to the front window of the vehicle (thanks to LG), together with simple sensors (a normal camera in the colorblind case). Boom! Now you can see Augmented-Reality-based driver infotainment! For prototype development we used the Gazebo simulator. To explore our Gazebo world, please check here. You can find the full storyline about the simulator here.
Another important feature of our program is that it is easy to develop for. If you add your detection algorithm and drawing code (OpenCV) for each frame in the plugins folder, new features are very easy to build. You can even run multiple plugins at the same time, using plugin_master's features. Check here and find more interesting ideas for future development.
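As an illustration, a per-frame plugin might look like the following minimal sketch. The Plugin base class, the process method name, and the frame layout are assumptions inferred from the folder structure, not the actual interface in plugins/plugin.py; a frame is modeled as a NumPy image array and the "detection" simply draws a box where a traffic object was found.

```python
import numpy as np

class Plugin:
    """Hypothetical base class; the real interface lives in plugins/plugin.py."""
    def process(self, frame: np.ndarray) -> np.ndarray:
        raise NotImplementedError

class RedLightHighlighter(Plugin):
    """Toy example: outline a fixed region, standing in for a
    detection model plus OpenCV drawing calls."""
    def process(self, frame: np.ndarray) -> np.ndarray:
        out = frame.copy()
        x0, y0, x1, y1 = 10, 10, 50, 50               # would come from the detector
        out[y0:y1, x0], out[y0:y1, x1] = 255, 255      # vertical box edges
        out[y0, x0:x1], out[y1, x0:x1 + 1] = 255, 255  # horizontal box edges
        return out

frame = np.zeros((120, 160), dtype=np.uint8)  # stand-in for one camera frame
highlighted = RedLightHighlighter().process(frame)
```

A plugin written this way stays stateless per frame, which is what lets plugin_master chain several plugins over the same image.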
Architecture - Ideal
Architecture - Prototype
Folder Structure
./
├── ros2pkg/
│   ├── image_subscriber/
│   └── test_publisher/
│
├── srcs/
│   │ # Simulation Part
│   ├── simulation_ws/src/
│   │   ├── sim/       # ros2 pkg for the gazebo simulation world and vehicle model
│   │   ├── teleop/    # ros2 pkg for gazebo vehicle teleoperation
│   │   └── tracking/  # ros2 pkg for detecting the eye position
│   │
│   │ # Submodules
│   ├── yolov5/
│   │
│   │ # Python Client
│   ├── assets/        # test images, fonts
│   ├── plugins/       # You can deploy your own plugin here
│   │   ├── color_disability/
│   │   │   ├── model/ # trained using YOLOv5
│   │   │   ├── color_disability.py
│   │   │   └── traffic_object.py
│   │   ├── plugin.py
│   │   └── plugin_master.py
│   ├── disability_assistant.py
│   ├── image_subscriber.py
│   ├── main.py
│   └── requirement.txt
│
├── test_drive_data.tar.xz  # rosbag data of driving in gazebo to test the detection model
│
├── LICENSE
│
├── assets/            # asset folder for documentation
│   ├── imgs/          # image files
│   └── presentations/ # presentation materials
│
└── README.md          # your entrypoint!
Presentation Materials
Please visit here to check our final videos and presentation files (pptx)! They may render slightly wrong because of missing font files, but we hope they work fine on your machine (:
How to Use
1. Download the Docker image and unzip it
# Download Releases/ColorSavesLife/ColorSavesLife.tar.bz2
bunzip2 ColorSavesLife.tar.bz2
2. Load the Docker image
docker load --input ColorSavesLife.tar
3. Open three terminals
# First Terminal
docker run -it --env DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix csl:1.0 /bin/bash
# Second Terminal
docker ps # Check docker container ID
docker exec -it <container_ID> /bin/bash
# Third Terminal
docker ps
docker exec -it <container_ID> /bin/bash
4. Unzip the rosbag data
cd ~/ColorSavesLife
tar -xf test_drive_data.tar.xz
5. Run the application
# First Terminal
cd ~/ColorSavesLife/srcs
python3 main.py
# Second Terminal
cd ~/ColorSavesLife
ros2 bag play test_drive_data
# Third Terminal
rviz2 # Add Image_msg -> Set topic (/car/camera1/image_raw)
World of Simulation
We used ROS2 and Gazebo simulation to implement and test our idea. The following demonstrates how to build from source, run the simulation, and process the video. We used Ubuntu 20.04 with ROS2 Foxy and Gazebo 11, assuming you have a similar working environment and correct installations.
cd srcs/simulation_ws
colcon build
1. Test eye tracking simulator
This program perceives the position of your eyes through a webcam and moves the camera object on Gazebo accordingly. This allows you to obtain a first-person perspective-like view within the simulation, mimicking your movements as if you were moving within the simulation environment.
Specify the path to the model in the gui.ini file located in .gazebo, and set the gazebo_model_path in the package.xml file to fit your local environment; then you will be able to use the following features. Subscribe to the Image topic car/camera1/image_raw in RViz2 to monitor the first-person perspective.
# Terminal 1
source install/local_setup.bash
ros2 launch sim test.launch.py
# Terminal 2
source install/local_setup.bash
ros2 run sim camera_movement
# Terminal 3
source install/local_setup.bash
ros2 run tracking eye_tracking
# Terminal 4
rviz2
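The core idea of the eye-tracking node, mapping the detected eye position in the webcam frame to an offset for the Gazebo camera object, can be sketched as below. The normalization ranges, the gain, and the function name are illustrative assumptions; the real logic lives in the tracking package's eye_tracking node.

```python
def eye_to_camera_offset(eye_x: float, eye_y: float,
                         frame_w: int = 640, frame_h: int = 480,
                         gain: float = 0.5) -> tuple[float, float]:
    """Map a detected eye position (pixels in the webcam frame) to a
    lateral/vertical camera offset (meters) in the simulated world.
    Moving your head left should move the camera left, so both axes
    are mirrored relative to the webcam image."""
    nx = (eye_x / frame_w) * 2.0 - 1.0   # -1.0 .. 1.0 across the frame
    ny = (eye_y / frame_h) * 2.0 - 1.0
    return (-nx * gain, -ny * gain)      # mirrored, scaled to meters
```

With the eye in the center of the frame the offset is zero, so the simulated camera sits at its default pose.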
2. Test teleoperation of the vehicle
Now that you have the vision, it's time to drive. This time, we'll simulate driving in a world with simple roads, buildings, and traffic objects. By following the instructions below, you'll be able to move a vehicle forward, backward, and steer within the simulation. We'll use the pygame library for keyboard input with the WASD keys, assuming you have it installed.
Click on the empty pygame screen that pops up, then try using WASD keys to move the vehicle.
# Terminal 1
source install/local_setup.bash
ros2 launch sim sim.launch.py
# Terminal 2
source install/local_setup.bash
ros2 run teleop controller
# Terminal 3
rviz2
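The keyboard-to-velocity mapping inside the teleop controller can be sketched like this. The speed constants and the function name are illustrative assumptions; the actual node reads pygame key state and publishes the result to the vehicle (e.g. as a ROS 2 Twist message).

```python
def wasd_to_velocity(keys: set[str],
                     linear_speed: float = 2.0,
                     angular_speed: float = 1.0) -> tuple[float, float]:
    """Translate the currently pressed WASD keys into a
    (linear, angular) velocity pair.  W/S drive forward/backward,
    A/D steer left/right; opposite keys cancel each other out."""
    linear = linear_speed * (("w" in keys) - ("s" in keys))
    angular = angular_speed * (("a" in keys) - ("d" in keys))
    return (linear, angular)
```

Calling this once per frame with the set of pressed keys yields a command that is zero whenever opposing keys are held together.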
If you have successfully followed along up to this point, you should have an idea of how to run eye tracking and vehicle driving simultaneously. Now, use both functionalities to collect realistic visual data and provide it to the vision processing model.
DEMO!!!
Future Development Plan
Color disability is not the only disability that affects daily life. According to a report of the WHO (World Health Organization), one in five people has a hearing problem (who.int/health-topics/hearing-loss). It is an important fact that 80% of them live in low- or middle-income countries, and that hearing care interventions are cost-effective. If they can get help with daily life, including driving, the impact will be significant. With this solution, we can help them drive much more safely and make their lives easier and driving more enjoyable. As examples, here are some of our future development plans, for hearing disability and for dementia.
SoundVisualizer for Hearing Disability
The sense of hearing is also very important in driving situations. There are 450 million people with a hearing disability. Think about an emergency situation in which an ambulance is approaching from behind. If you have a hearing problem, this can lead to worse outcomes, such as a car accident, or a person in need not being rescued in time. The SoundVisualizer plugin would visualize the sounds around your car, and the direction they come from, on the front driver window, helping you react to the situation and make the right decision.
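One way such a plugin could estimate the direction a siren comes from is the arrival-time difference between two microphones mounted on the car. This is a standard far-field time-difference-of-arrival sketch under an assumed microphone geometry; it is not part of the current codebase.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def direction_from_delay(delay_s: float, mic_spacing_m: float = 0.5) -> float:
    """Estimate the bearing of a sound source, in degrees, from the time
    delay between two microphones spaced mic_spacing_m apart.
    0 degrees means the source is straight ahead; positive angles lean
    toward the microphone that heard the sound first."""
    path_diff = SPEED_OF_SOUND * delay_s            # extra distance to the far mic
    # Far-field assumption: sin(theta) = path_diff / mic_spacing
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing_m))
    return math.degrees(math.asin(ratio))
```

The estimated bearing could then be rendered as an arrow on the transparent display, the same way the color plugin overlays traffic objects.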
AI Driving Assistant for Dementia
Dementia is also a serious problem for driving. There are 55 million people living with dementia. Even when their condition does not affect their driving skills, they are not allowed to drive in some countries, because they tend to pay less attention to the road and may fail to make the right decision. Our AI Driving Assistant plugin for dementia would learn the driver's driving pattern and other driving data, then show a warning message or play a sound when the driver is not paying attention to the road or when an unusual driving pattern is detected.
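A minimal sketch of the "unusual driving pattern" check could be a rolling statistic over recent control inputs, flagging values that deviate strongly from the driver's own recent behavior. The window size, warm-up length, and threshold below are illustrative assumptions, not tuned values.

```python
from collections import deque
from statistics import mean, pstdev

class DrivingPatternMonitor:
    """Flag a control input (e.g. a steering angle) that deviates
    strongly from the driver's recent behaviour."""
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def is_unusual(self, value: float) -> bool:
        unusual = False
        if len(self.history) >= 10:               # need some history first
            mu, sigma = mean(self.history), pstdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                unusual = True                    # would trigger a warning
        self.history.append(value)
        return unusual
```

Feeding it one steering value per frame, a sudden jerk far outside the recent range is flagged while normal small corrections are not.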
So our platform is not only for color disability, but also for other disabilities. This has great potential to improve the lives of people with disabilities, making driving easier and more enjoyable for them.
Team Members
• Kwanho Kim: @KKWANH
• Hokyung Park: @Ho-mmd
• Sujong Ha: @lalywr2000
• Shuta Ogura: @Shuta-Syd
• Oscar Lopez