Design and Implementation of Smart Cooking Based on Amazon Echo
Smart cooking based on Amazon Echo uses the Internet of Things and cloud computing to assist with cooking food. People can speak to Amazon Echo while cooking to check on the food's status and progress. Amazon Echo recognizes what is said, forwards the request to cloud services, and speaks back the results, which the cloud services produce by querying embedded cooking knowledge and retrieving information from intelligent kitchen devices online. An intelligent food thermometer and its mobile application are designed and implemented to monitor the temperature of the food being cooked.
💡 Research Summary
The paper presents a comprehensive design and implementation of a smart cooking system that leverages Amazon Echo, the Alexa Voice Service, AWS Lambda, a cloud‑based back‑end, and a custom Bluetooth‑enabled food thermometer with a companion mobile application. The authors begin by situating smart cooking within the broader smart‑home and Internet‑of‑Things (IoT) landscape, noting that modern kitchen appliances increasingly embed Wi‑Fi or Bluetooth connectivity, enabling remote monitoring and control via smartphones. They argue that cooking, especially in multi‑appliance environments common in East Asian kitchens, still suffers from a lack of real‑time feedback on food temperature and doneness, which can lead to over‑ or under‑cooking and a generally tedious user experience.
The system architecture (Figure 1) consists of four main layers:
- **Amazon Echo (Smart Speaker)** – Acts as the user‑facing voice interface. When a user utters a command such as “What’s the temperature of my food?”, the device captures the audio and sends it over Wi‑Fi to the Alexa Voice Service, which performs speech‑to‑text conversion and intent matching.
- **AWS Lambda (Serverless Compute)** – Implemented in Node.js (v8.10), the Lambda function receives the parsed intent and slot values, constructs a RESTful HTTP request, and forwards it to the cloud back‑end. It also formats the JSON response from the back‑end into a spoken string that Alexa returns to the user.
- **Cloud Services (Public Cloud – AWS, AliCloud, Tencent Cloud, etc.)** – Host a set of stateless REST APIs that manage the state of all connected kitchen devices. The cloud maintains a relational database (MySQL/SQLite) for persistent storage of temperature readings, user preferences, and a cooking ontology expressed in OWL. The ontology encodes food items, recipes, cooking notes, and USDA‑defined doneness levels for various meats, enabling semantic queries via SPARQL/Jena.
- **Intelligent Food Thermometer + Mobile App** – The hardware comprises a development board, an LCD display, three control buttons, a high‑temperature (up to 1200 °F) curved probe, a Bluetooth Low Energy (BLE) module, and an ADC. The probe can be clipped onto pots, elevated above grills, or protected by a fiberglass sleeve. The Android mobile app connects to the thermometer via BLE, visualizes temperature trends in real‑time scatter plots, stores a local SQLite copy of the cooking ontology, and synchronizes with the cloud over Wi‑Fi. Crucially, the app predicts the remaining time to reach a target doneness using a simple model based on the current temperature, the target temperature, and the observed heating rate.
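The app's remaining‑time prediction could be sketched as a simple linear extrapolation from recent readings. The paper does not publish its exact formula, so the function and field names below (`estimateRemainingMinutes`, `readings`) are illustrative assumptions:

```javascript
// Estimate minutes remaining until the probe reaches the target temperature,
// assuming the recent heating rate stays roughly constant (linear model).
// All names here are illustrative; the paper does not publish its exact formula.
function estimateRemainingMinutes(currentTempF, targetTempF, readings) {
  // readings: array of { minutes, tempF } samples, oldest first
  if (readings.length < 2 || currentTempF >= targetTempF) return 0;
  const first = readings[0];
  const last = readings[readings.length - 1];
  const rate = (last.tempF - first.tempF) / (last.minutes - first.minutes); // °F per minute
  if (rate <= 0) return Infinity; // food is not heating; no estimate possible
  return (targetTempF - currentTempF) / rate;
}
```

A model this simple works only while the heating rate is roughly steady; a real implementation would need to handle the slowdown as food approaches ambient cooker temperature.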
The authors define four Alexa intents to expose the thermometer’s capabilities:
- CurrentTempIntent – Queries the current temperature.
- SetTargetTempIntent – Sets a desired target temperature.
- CookTimeIntent – Returns an estimated time‑to‑doneness.
- SetTargetAlarmIntent – Schedules an audible alarm when the target temperature is reached.
Each intent is described with sample utterances in a JSON interaction model uploaded to the Alexa Skills Kit. The Lambda handler code demonstrates error handling for network failures and internal exceptions, returning appropriate spoken prompts.
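A minimal Node.js dispatcher for these intents might look like the sketch below. The four intent names come from the paper; the response strings, the `slots` shape, and the `getTemperature` stub are assumptions for illustration, standing in for the real call to the cloud back‑end:

```javascript
// Illustrative sketch of an Alexa intent dispatcher in Node.js.
// Intent names are from the paper; everything else here is assumed.
function getTemperature() {
  return 145; // stub: a real handler would query the cloud REST API
}

function handleIntent(intentName, slots) {
  switch (intentName) {
    case 'CurrentTempIntent':
      return `Your food is currently ${getTemperature()} degrees Fahrenheit.`;
    case 'SetTargetTempIntent':
      return `Okay, target temperature set to ${slots.targetTemp} degrees.`;
    case 'CookTimeIntent':
      return 'Your food should be done in about ten minutes.';
    case 'SetTargetAlarmIntent':
      return 'I will sound an alarm when the target temperature is reached.';
    default:
      // Fallback prompt, mirroring the paper's error handling for unrecognized requests
      return "Sorry, I didn't understand that cooking request.";
  }
}
```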
On the cloud side, all communication follows REST principles with JSON payloads. The back‑end can ingest data from BLE‑connected devices directly via Wi‑Fi bridges, or indirectly from Bluetooth‑only devices through the mobile app. The cooking ontology, stored in OWL, provides a structured knowledge base that can be queried for recipe steps, ingredient substitutions, or safety thresholds.
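As an illustration of this REST/JSON convention, a temperature reading ingested via the mobile‑app bridge could be packaged as below. The endpoint path, token parameter, and field names are hypothetical, since the paper only shows a GET endpoint:

```javascript
// Build a JSON request describing one temperature reading.
// The endpoint path and field names are hypothetical illustrations of the
// REST/JSON convention the paper describes, not its actual API.
function buildReadingRequest(serverBase, token, deviceId, tempF) {
  return {
    method: 'POST',
    url: `${serverBase}/NewHotStuff/Reading?token=${encodeURIComponent(token)}`,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ deviceId: deviceId, tempF: tempF, at: Date.now() }),
  };
}
```

Keeping the payload stateless JSON like this lets the same back‑end serve BLE‑to‑Wi‑Fi bridges and the mobile app identically.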
Implementation details include:
- Configuration of Alexa skill JSON files (intent names, sample utterances, slot types).
- Node.js Lambda functions that construct HTTP GET requests to the cloud (e.g., `http://<server>/NewHotStuff/Aimtemp?token=...`).
- Cloud API endpoints written in multiple languages (Java, PHP, C#.NET, Python) that perform CRUD operations on the database and execute SPARQL queries against the ontology.
- Firmware for the thermometer handling temperature acquisition, BLE advertising, and command parsing from the mobile app.
- Mobile app UI components for temperature display, target setting, alarm configuration, and real‑time charting.
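The Alexa interaction‑model configuration mentioned above might resemble the fragment below. The intent names come from the paper; the sample utterances and the slot type are invented for illustration:

```javascript
// Sketch of an Alexa Skills Kit interaction model fragment.
// Intent names are from the paper; utterances and slot types are assumed.
const interactionModel = {
  intents: [
    { name: 'CurrentTempIntent',
      samples: ["what's the temperature of my food", 'how hot is my food'] },
    { name: 'SetTargetTempIntent',
      slots: [{ name: 'targetTemp', type: 'AMAZON.NUMBER' }],
      samples: ['set the target temperature to {targetTemp} degrees'] },
    { name: 'CookTimeIntent',
      samples: ['how long until my food is done'] },
    { name: 'SetTargetAlarmIntent',
      samples: ['alert me when the food reaches the target temperature'] },
  ],
};
```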
The paper concludes that integrating Amazon Echo with a purpose‑built smart thermometer and cloud services creates a cohesive smart‑cooking experience: users can interact hands‑free, receive precise temperature feedback, and be notified when food reaches the desired doneness. However, the authors acknowledge several limitations: no quantitative performance evaluation (latency, accuracy, battery life), absence of user studies to assess usability, and limited discussion of security/privacy (authentication, data encryption, firmware update mechanisms). They suggest future work should address these gaps, explore multi‑device coordination (e.g., simultaneous monitoring of several dishes), and enhance the ontology to support automatic recipe recommendation based on available ingredients and user preferences.
Overall, the work demonstrates a viable prototype that showcases how mainstream voice assistants can be extended into the culinary domain, but further engineering and human‑centered validation are required before the system can be considered ready for widespread domestic deployment.