Group 11

Baby Mood Detection and Interaction Robot

Objective: Develop a robot that can detect a baby’s mood and take appropriate actions to keep the baby happy. The robot should also maintain a safe distance from the baby using an ultrasonic sensor.

Deep Learning:

  1. Mood Detection:

    • Task: Detect the baby’s mood (happy, sleepy, or unhappy).

    • Implementation: Train a deep learning model to recognize the baby’s mood from visual and audio inputs (a minimal image-classifier sketch follows this list).

    • Friday Deadline: The mood detection model must be functional by Friday.
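
A minimal sketch of what the image side of this classifier could look like, assuming a Keras/TensorFlow stack, three class labels (happy, sleepy, unhappy), and a pretrained MobileNetV2 backbone. The audio input mentioned above would need its own branch, and none of these choices are fixed by the brief.

```python
# Sketch of a three-class baby-mood image classifier (happy / sleepy / unhappy).
# The backbone, input size, and class count are assumptions, not requirements.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 3  # happy, sleepy, unhappy

# Pretrained MobileNetV2 as a frozen feature extractor.
base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")
base.trainable = False  # train only the small classification head at first

model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1.0,
                     input_shape=(160, 160, 3)),  # MobileNetV2 expects [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```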

Actions Based on Mood:

  1. Sleepy:

    • Task: Play soothing music.

    • Implementation: Integrate a music playback system that activates when the baby is detected as sleepy.

  2. Happy:

    • Task: Activate a baby mobile.

    • Implementation: Design a mechanism that turns on a baby mobile to entertain the baby when it is detected as happy.

  3. Unhappy:

    • Task: Perform actions to cheer up the baby.

    • Implementation: Add mechanical arms that can dance or perform gestures to cheer the baby up; a sketch of the mood-to-action dispatch follows this list.
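
The three behaviours above reduce to a simple dispatch from detected mood to action. The sketch below shows one way to wire that up; the helper function names are placeholders for the real music, mobile, and arm routines, not an existing API.

```python
# Sketch of the mood-to-action dispatch. The three helpers are placeholders
# for the actual music, baby-mobile, and arm routines on the robot.

def play_soothing_music():
    print("sleepy  -> playing soothing music")        # placeholder

def activate_baby_mobile():
    print("happy   -> turning on the baby mobile")    # placeholder

def dance_with_arms():
    print("unhappy -> dancing with mechanical arms")  # placeholder

MOOD_ACTIONS = {
    "sleepy": play_soothing_music,
    "happy": activate_baby_mobile,
    "unhappy": dance_with_arms,
}

def respond_to_mood(mood: str) -> None:
    """Run the action mapped to the detected mood, ignoring unknown labels."""
    action = MOOD_ACTIONS.get(mood)
    if action is not None:
        action()

respond_to_mood("sleepy")  # example: the model just reported "sleepy"
```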

Robotics and Hardware:

  1. Mechanical Arms:

    • Task: Design and implement mechanical arms that can dance or gesture (see the servo sketch after this list).

    • Friday Deadline: The mechanical arms must be functional by Friday.

  2. Safety Distance Maintenance:

    • Task: Maintain a safe distance from the baby using an ultrasonic sensor.

    • Implementation: Integrate ultrasonic sensors to continuously monitor the distance between the robot and the baby, adjusting the robot’s position as needed.
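
For the mechanical arms in item 1 above, the sketch below drives a single hobby servo through a repeating sweep as a stand-in "dance" gesture. It assumes a Raspberry Pi, RPi.GPIO software PWM on pin 12, and a standard 50 Hz servo; the real arms will need tuned duty cycles and likely a dedicated servo driver board.

```python
# Sketch of a simple "dance" sweep on one arm servo via RPi.GPIO software PWM.
# Pin number, 50 Hz signal, and duty-cycle-to-angle mapping are assumptions.
import time
import RPi.GPIO as GPIO

ARM_SERVO_PIN = 12  # assumed PWM-capable pin (BCM numbering)

GPIO.setmode(GPIO.BCM)
GPIO.setup(ARM_SERVO_PIN, GPIO.OUT)
servo = GPIO.PWM(ARM_SERVO_PIN, 50)  # standard 50 Hz hobby-servo signal
servo.start(7.5)                     # roughly centre position on many servos

def wave_arm(cycles: int = 3) -> None:
    """Sweep the arm back and forth a few times as a basic dance gesture."""
    for _ in range(cycles):
        servo.ChangeDutyCycle(5.0)   # roughly one end of travel
        time.sleep(0.5)
        servo.ChangeDutyCycle(10.0)  # roughly the other end
        time.sleep(0.5)
    servo.ChangeDutyCycle(7.5)       # return to centre
```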

Implementation Steps:

  1. Deep Learning Development:

    • Dataset Collection: Collect and annotate images and audio clips of babies in each target mood (a dataset-loading sketch follows this list).

    • Model Training: Train a deep learning model to accurately detect the baby’s mood.

    • Model Validation: Test the model to ensure it reliably detects moods.
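
One way the annotated images could be organised and loaded is sketched below, assuming a Keras/TensorFlow pipeline, one data/moods/<label>/ folder per mood, and an 80/20 train/validation split. The audio side of the dataset would need its own pipeline.

```python
# Sketch of loading an annotated mood-image dataset with a train/validation split.
# Folder layout, image size, and split ratio are assumptions for illustration.
import tensorflow as tf

IMG_SIZE = (160, 160)

# Assumed layout:
#   data/moods/happy/*.jpg
#   data/moods/sleepy/*.jpg
#   data/moods/unhappy/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/moods", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/moods", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

print("Classes:", train_ds.class_names)  # e.g. ['happy', 'sleepy', 'unhappy']

# Training then reuses the classifier sketched under "Mood Detection":
#   model.fit(train_ds, validation_data=val_ds, epochs=10)
#   model.save("mood_model.keras")
```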

  2. Robotics Development:

    • Mechanical Arms: Design, build, and test mechanical arms for dancing and gesturing.

    • Music Playback: Integrate a system for playing soothing music.

    • Baby Mobile Activation: Design a mechanism to activate the baby mobile (a sketch covering both the music and mobile hooks follows below).
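
A rough sketch of both hooks, assuming pygame for audio playback and an RPi.GPIO-driven relay for the mobile motor; the file name, pin number, and library choices are placeholders for whatever the final hardware uses.

```python
# Sketch of the music-playback and baby-mobile hooks. Audio file, relay pin,
# and the pygame / RPi.GPIO choices are assumptions, not fixed requirements.
import pygame
import RPi.GPIO as GPIO

MOBILE_PIN = 18  # assumed GPIO pin driving the baby-mobile motor relay

pygame.mixer.init()
GPIO.setmode(GPIO.BCM)
GPIO.setup(MOBILE_PIN, GPIO.OUT, initial=GPIO.LOW)

def play_soothing_music(path: str = "lullaby.mp3") -> None:
    """Loop a soothing track until stop_music() is called."""
    pygame.mixer.music.load(path)
    pygame.mixer.music.play(loops=-1)

def stop_music() -> None:
    pygame.mixer.music.stop()

def set_baby_mobile(on: bool) -> None:
    """Switch the baby-mobile relay on or off."""
    GPIO.output(MOBILE_PIN, GPIO.HIGH if on else GPIO.LOW)
```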

  3. Safety System:

    • Ultrasonic Sensors: Install and calibrate ultrasonic sensors to maintain a safe distance from the baby.

    • Integration: Use the sensor readings to adjust the robot’s movements so it never gets too close to the baby, as sketched below.
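
The sketch below combines an HC-SR04-style distance read with the adjustment loop, assuming a Raspberry Pi, RPi.GPIO, trigger/echo on pins 23/24, and a 50 cm safety threshold; back_away() and hold_position() stand in for the real drive commands.

```python
# Sketch of the safety loop: read an HC-SR04 ultrasonic sensor and back away
# whenever the robot is closer than the threshold. Pins, threshold, and the
# drive placeholders are assumptions.
import time
import RPi.GPIO as GPIO

TRIG_PIN = 23            # assumed trigger pin (BCM numbering)
ECHO_PIN = 24            # assumed echo pin (BCM numbering)
SAFE_DISTANCE_CM = 50.0  # assumed minimum distance to the baby

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

def read_distance_cm() -> float:
    """Trigger one measurement and return the distance in centimetres."""
    GPIO.output(TRIG_PIN, True)
    time.sleep(0.00001)            # 10 microsecond trigger pulse
    GPIO.output(TRIG_PIN, False)
    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 1:
        pulse_end = time.time()
    # Sound travels ~343 m/s and the echo covers the distance twice.
    return (pulse_end - pulse_start) * 34300.0 / 2.0

def back_away() -> None:
    print("Too close - backing away")        # placeholder drive command

def hold_position() -> None:
    print("Distance OK - holding position")  # placeholder drive command

def safety_loop(poll_interval_s: float = 0.2) -> None:
    """Poll the sensor and keep the robot outside the safety threshold."""
    while True:
        if read_distance_cm() < SAFE_DISTANCE_CM:
            back_away()
        else:
            hold_position()
        time.sleep(poll_interval_s)
```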

Next Steps for Students:

  1. Deep Learning Tasks:

    • Dataset Preparation: Collect and label data for different baby moods.

    • Model Training: Train the mood detection model and ensure it is accurate and reliable.

    • Validation: Validate the model with real-world tests (an evaluation sketch follows below).
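
A quick way to check reliability on held-out, real-world images is sketched below, assuming the model was saved as mood_model.keras and the test images live in data/moods_test/<label>/ folders; scikit-learn's classification report gives per-mood precision and recall.

```python
# Sketch of validating the saved mood model on a held-out test set.
# Model file name and test-folder layout are assumptions.
import numpy as np
import tensorflow as tf
from sklearn.metrics import classification_report

model = tf.keras.models.load_model("mood_model.keras")
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/moods_test", image_size=(160, 160), batch_size=32, shuffle=False)

y_true = np.concatenate([labels.numpy() for _, labels in test_ds])
y_pred = np.argmax(model.predict(test_ds), axis=1)

print(classification_report(y_true, y_pred, target_names=test_ds.class_names))
```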

  2. Robotics Tasks:

    • Mechanical Arms: Complete the design and functionality of the mechanical arms for dancing and gestures.

    • Music and Mobile Systems: Integrate and test the music playback and baby mobile activation mechanisms.

  3. Safety Tasks:

    • Sensor Integration: Ensure ultrasonic sensors are correctly installed and integrated with the robot’s control system.

    • Safety Testing: Test the robot’s ability to maintain a safe distance from the baby.

Final Considerations:

  • Safety First: Prioritize the baby’s safety by ensuring the robot maintains a safe distance at all times.

  • System Reliability: Focus on creating a reliable system for mood detection and appropriate responses.

  • User-Friendly Design: Ensure the system is easy to use and adjust as needed for different babies and environments.

Summary:

  • Develop a deep learning model to detect baby moods and integrate it with a robot.

  • Design mechanical arms for dancing and other gestures to cheer up the baby.

  • Implement systems to play soothing music and activate a baby mobile based on the baby’s mood.

  • Ensure the robot maintains a safe distance from the baby using ultrasonic sensors.

  • Meet the deadlines for functional mood detection and mechanical arms by Friday.
