
We would like to share our experience building a prototype for a Smart Learning Room.

Prototype Challenge (Problem Statement):

Imagine you are the owner of an official Learning Room. The room is divided into two small areas: the Learning Lounge and the Social Learning Area. The Learning Lounge is a silent, peaceful room for individual learning. The Social Learning Area is a versatile team learning environment for groups of up to 8 colleagues. This learning space is intended to optimize learning group interaction.

The task is to come up with a prototype that uses different sensors to build the intelligence needed to answer questions such as: How many people visit the room each day? Are there peak and low times of usage? Are people really silent? How loud is it really?

[Going forward in this blog we will refer to the ‘Learning Lounge’ and ‘Social Learning Area’ collectively as the ‘Learning Room’.]

Our Approach to the Prototype:

For this prototype challenge, we came up with two personas:

  • Room in-charge

A Room in-charge should get real-time information on:
a) The flow and number of employees currently using the Learning Room
b) The current atmosphere of the Learning Room
c) The sound level of the Learning Room
d) Whether the room is being misused
A Room in-charge should also get the information listed below for further analysis, so that appropriate action can be taken to improve room utilization:
a) How frequently and for how long the rooms are utilized in a day
b) Average sound-level analysis
c) Room utilization analysis, e.g. under-, over-, or optimally utilized, peak hour, and lowest usage
d) Sustainability information, such as energy consumption relative to the number of employees using the room
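As a small illustration of the utilization analysis above, here is a minimal sketch of how peak-hour and lowest-usage figures could be derived. In the prototype this kind of analysis lives in the backend, not on the device; the input format of (hour-of-day, head-count) pairs is purely our assumption for illustration.

```python
# Sketch of the peak-/low-usage analysis described above.
# Assumption: occupancy samples arrive as (hour_of_day, head_count) pairs.
from collections import defaultdict


def hourly_average(samples):
    """Average the head count per hour of day."""
    totals = defaultdict(list)
    for hour, count in samples:
        totals[hour].append(count)
    return {hour: sum(c) / len(c) for hour, c in totals.items()}


def peak_and_low_hours(samples):
    """Return (peak_hour, lowest_hour) ranked by average occupancy."""
    averages = hourly_average(samples)
    peak = max(averages, key=averages.get)
    low = min(averages, key=averages.get)
    return peak, low
```

For example, with samples `[(9, 2), (9, 4), (13, 8), (13, 6), (17, 1)]`, the peak hour is 13 and the lowest-usage hour is 17.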

  • Employee using the room

Employees using the room should be notified if they are not silent enough or if the room is being misused.

Based on the availability of sensors, we decided to use the sensors listed below, along with a Raspberry Pi and a Grove board.

Sensor                                | Used For                            | Requirement Tackled
--------------------------------------|-------------------------------------|----------------------------------------------------------------------
Ultrasonic Ranger & PIR Motion Sensor | Motion detection                    | Get the number of employees and the flow
Sound Sensor                          | Sound detection                     | Get the sound level
Light Sensor                          | Light detection                     | Get the luminous condition
DHT Sensor                            | Temperature and humidity detection  | Get the temperature and humidity levels
LED                                   | Notification                        | Notify employees if the noise level is high or the room is overcrowded


Installation/Placement of Sensors:

At the entrance of the Learning Room, the Ultrasonic Ranger and the PIR Motion Sensor should be placed in series to detect an employee's inward or outward movement. When an employee enters the room, he is first detected by the Ultrasonic Ranger and then, as he moves forward, by the PIR Motion Sensor; this is registered as an inward movement. For an outward movement, the employee is detected by the PIR Motion Sensor first and then by the Ultrasonic Ranger.
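The direction logic above can be sketched as a small classifier. The event format — each detection arriving as the string "ultrasonic" or "pir", in the order the sensors fired — is our assumption for illustration; the actual prototype reads the sensors through the Grove board.

```python
# Sketch of the entry/exit direction logic described above.
# Assumption: each detection arrives as "ultrasonic" or "pir",
# in the order the two sensors fired.

def classify_movement(first_sensor, second_sensor):
    """Return 'in', 'out', or None for an inconclusive sensor pair."""
    if first_sensor == "ultrasonic" and second_sensor == "pir":
        return "in"    # Ultrasonic Ranger fired first: inward movement
    if first_sensor == "pir" and second_sensor == "ultrasonic":
        return "out"   # PIR fired first: outward movement
    return None        # same sensor twice, e.g. someone lingering at the door


def update_occupancy(count, first_sensor, second_sensor):
    """Apply one classified movement to the current head count."""
    direction = classify_movement(first_sensor, second_sensor)
    if direction == "in":
        return count + 1
    if direction == "out":
        return max(0, count - 1)  # never report a negative occupancy
    return count
```

Running the occupancy count over the stream of classified movements gives both the current head count and, over time, the flow of employees.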

Sound sensors should be placed at various spots on the desks/tables to get accurate sound data, which can be used for sound-level indications.

Depending on the room size, DHT sensors and light sensors need to be installed accordingly to get the temperature, humidity, and luminous levels of the room. The signal transmission of the DHT sensor reaches up to 20 meters, which would ideally be sufficient for a 65 x 65 feet room.

We created a miniature prototype Learning Room using chart paper and cardboard and installed the sensors accordingly. All the sensors are connected to the Grove board and the Raspberry Pi.

Technical Details


All the sensors are connected to the Grove board, which in turn is connected to the Raspberry Pi. The sensors and their data types are configured in the HCP IoT cockpit. The IoT MMS service is then used to configure the target table in which the sensor data is stored.

The Learning Room prototype has three major components/applications:

  • Edge Adapter application

The Edge Adapter is a server-side application built in Python and deployed on the Raspberry Pi. Whenever there is a signal from a sensor, the data is processed by the Edge Adapter and pushed to the cloud via the HCP IoT MMS service.
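As a hedged sketch of what the Edge Adapter's push step could look like: wrap a sensor reading in the MMS message envelope and POST it to the device's HTTP endpoint. The endpoint URL, device ID, message-type ID, and token below are placeholders, and the exact field names should be verified against your IoT cockpit configuration.

```python
# Sketch of pushing one sensor reading to the HCP IoT MMS HTTP endpoint.
# Assumptions: endpoint URL, message-type ID, and bearer token are placeholders.
import json
import urllib.request


def build_mms_payload(message_type_id, reading):
    """Wrap one sensor reading dict in the MMS 'sync' message envelope."""
    return {
        "mode": "sync",
        "messageType": message_type_id,
        "messages": [reading],
    }


def post_reading(endpoint_url, token, payload):
    """POST the JSON payload to MMS (needs network access and a valid token)."""
    req = urllib.request.Request(
        endpoint_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json;charset=utf-8",
            "Authorization": "Bearer " + token,
        },
    )
    return urllib.request.urlopen(req)
```

In the prototype, a call like `post_reading(url, token, build_mms_payload("t_sound", {"sound": 420, "timestamp": 1466500000}))` would run inside the sensor polling loop on the Raspberry Pi.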

  • Backend

HANA is the target database; it holds the relevant tables and calculation/analytic views. Master data such as room information and threshold values is preconfigured in the database. XSOData is the service layer, which exposes the analytic/calculation views from the HANA database.

  • User Interface

The Learning Room UI application (Fiori-based) was developed using the SAP HCP BUILD prototyping tool. The Room in-charge uses this application to monitor and analyze room usage.


Conclusion:

Given the limited availability of sensors, we used the GrovePi (Grove board + Raspberry Pi). In a real scenario, instead of the GrovePi, a combination of an Arduino board, XBee, and a Raspberry Pi can be used to achieve the same. We could also consider different edge-processing options, such as SQL Anywhere or Smart Data Stream Lite, on the edge side.

I would like to thank Ashutosh and Nitin, who helped me in building this prototype and blog.
