
Sense-based user interface platform for behavioral pattern analysis of young children

Abstract

This paper presents the system integration of a sense-based user interface (SUI) platform, comprising flexible pressure and humidity sensor arrays together with a commercial inertial measurement unit (IMU), to analyze the behavioral patterns of young children. The pressure sensors form a flexible, inkjet-printed array in which each element uses a piezoresistive sensing layer. The humidity sensors are interdigitated capacitive sensors based on a polyimide humidity-sensitive layer and are manufactured with the same flexible inkjet printing technique. To achieve a wide measurement area, the pressure and humidity sensors are expanded into 5 × 5 and 5 × 10 sensor arrays, respectively. In addition, a commercial IMU containing accelerometer and gyroscope sensors is employed. The SUI platform takes the form of a cuboidal block, with the IMU and circuits embedded within the block and the multilayered pressure and humidity sensor arrays installed on its external surface. Data collected from each sensor are visualized through heatmaps and 3D motion representations, creating a platform that integrates fine-grained as well as global behavioral information about young children. This research may provide a foundation for the development of SUI technology, especially for individuals who have difficulty with conventional input–output devices.

Introduction

Advances in digital devices such as computers and smartphones have led to the widespread use of input–output devices such as keyboards, pointing devices, and touchscreens. These input–output devices offer rapid response times, convenience, and ease of use [1]. In particular, the use of touch-based input–output devices has surged because they can be operated simply with finger touches and hand gestures. The onset of the era of social distancing during the COVID-19 pandemic further accelerated the utilization of various digital devices [2, 3], which in turn increased children's exposure to a multitude of digital devices from their early years of development [4, 5]. However, infants and young children who are not yet capable of reading have also not developed the visual acuity required to perceive rapid movements on a screen [6]. Such adult-centric input–output devices are therefore physically unsuitable during early childhood, a critical period for language cognition, visual perception, and sensory development. Moreover, digital devices operated by simple hand gestures are designed primarily for adult convenience, making them unsuitable for early childhood, a phase in which the development of the gross and fine motor skills essential for tasks such as object manipulation is crucial. Digital input–output devices tailored for adults can thus have negative implications for the physical development of young children [7]. Adult-centric input–output devices may also adversely affect their neural development: repeated exposure to digital devices blurs the distinction between the real and virtual worlds for young children. Psychological assessments for young children are therefore necessary, yet existing assessments remain shaped by adult-centric approaches. Conventional methods for assessing mental health involve expert consultations, surveys, and brain imaging within mouse- and keyboard-based environments [8]. Unfortunately, these methods are designed for adults and require a minimum level of physical and intellectual capability, which introduces subjectivity and bias and makes it challenging to accurately diagnose mental health conditions in children without specialized expertise [9]. The development of an input–output system focused on young children is therefore imperative.

Because of limitations in language communication, young children express their emotions and intentions through actions such as shaking or throwing objects, pressing objects, and mouthing behaviors. In the case of neurodevelopmental disorders such as autism, for instance, diagnosis can be predicated on the child’s behavior: seemingly purposeless rotations of objects or repetitive actions may appear trivial in infants but are indicative of early autism symptoms in toddlers [10]. Methods are therefore needed to understand the intent behind such expressions and to recognize behavioral patterns that may indicate early signs of a disorder. Contemporary techniques for analyzing children’s behavioral patterns rely on visual information, employing segmentation, object detection, and classification to capture aspects such as position and movement patterns [11]. Nevertheless, these conventional methods rely on videos recorded from a third-party perspective and can therefore discern only the gross behaviors of young children; information regarding their subtle actions and the patterns within those actions remains elusive. Consequently, capturing fine-grained behaviors requires sensors capable of detecting various behaviors and a platform that can serve as a child-friendly intermediary in the form of a familiar toy. Such a platform can help integrate task-based educational programs and collect data on the behavior, emotions, and intentions of young children, potentially facilitating early diagnosis of physical, neurological, and mental health disorders.

With the advancement of sensor technology, our research aims to integrate lightweight and compact physical sensors into digital interfaces accessible to young children. The goal is to develop a sense-based user interface (SUI) capable of collecting and analyzing data on specific behaviors of young children who may be difficult to analyze through direct communication. In the proposed SUI, we integrate commercial accelerometer/gyroscope sensors into a toy block to detect physical movements and include pressure and humidity sensors to detect fine-grained behaviors, such as finger pressing or trembling related to emotional states. This may facilitate an understanding of young children’s behaviors and thereby allow their emotions and intentions to be estimated. The data from these sensors could potentially be processed in real time using artificial intelligence (AI) algorithms. The platform may also enable the development of educational content that targets cognitive and play development through specific tasks. By integrating input–output devices, we propose a comprehensive system to analyze young children’s cognitive and play behaviors. The proposed SUI is expected to serve as an indirect communication tool benefiting young children as well as those with communication challenges, including the hearing-impaired, elderly, and disabled.

Results and discussion

Sense-based user interface (SUI) configuration

The sense-based user interface (SUI) comprises an inertial measurement unit (IMU), pressure sensors, and humidity sensors to measure fine-grained as well as global behavior. The platform takes the form of a cuboidal block, with the IMU and circuits embedded within the block and multilayered pressure and humidity sensor arrays installed on its external surface. In this configuration, the humidity sensors, which operate by direct contact with external water molecules, are positioned on the outermost layer of the block, and the pressure sensors are installed beneath them (Fig. 1). To detect actions involving pressing or touching with the hand or mouth, the pressure and humidity sensors were fabricated using a printed circuit board (PCB) printer (V-One, Voltera). For global motion detection, an IMU was used; no separate flexible sensor was needed because global motion does not involve direct contact such as mouthing or pressing. In this study, a commercial Arduino Nano BLE (Bluetooth Low Energy) board (Nano 33 BLE, Arduino) with a built-in accelerometer and gyroscope was employed to capture the global motion and impacts of the object.
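As a concrete illustration of how motion data from the block can be collected on a host PC, the following minimal Python sketch reads accelerometer samples streamed by the Nano 33 BLE over a USB serial connection. The port name, baud rate, and comma-separated "ax,ay,az" line format are assumptions made for illustration; they are not specified in the original setup.

```python
# Minimal sketch: read accelerometer samples streamed by the Arduino Nano 33 BLE
# over USB serial. Port name, baud rate, and the "ax,ay,az" CSV line format are
# assumptions made for illustration only.
import serial  # pyserial

PORT = "/dev/ttyACM0"   # hypothetical port; use "COMx" on Windows
BAUD = 115200           # assumed baud rate

def read_accel_samples(n_samples=100):
    """Read n_samples lines of 'ax,ay,az' (in g) from the serial port."""
    samples = []
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        while len(samples) < n_samples:
            line = ser.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue
            try:
                ax, ay, az = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            samples.append((ax, ay, az))
    return samples

if __name__ == "__main__":
    for ax, ay, az in read_accel_samples(10):
        print(f"ax={ax:+.2f} g, ay={ay:+.2f} g, az={az:+.2f} g")
```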

Fig. 1

Sense-based user interface (SUI) for behavioral analysis. Input behavior parameters and the design framework of the sensor-integrated platform

Pressure and humidity sensors and sensor array

Pressure sensors can be broadly categorized into two types based on their detection mechanism: capacitive and piezoresistive. In this study, we opted for the piezoresistive method because of its advantages, including high sensitivity at low pressures and ease of fabrication [12]. Piezoresistive pressure sensors operate on the principle of converting changes in external pressure into electrical signals by varying the resistance of a pressure-sensitive layer within the sensor [13]. Commercial Velostat (Velostat 1361, Adafruit) is utilized as the pressure-sensitive layer. Velostat is a pressure-sensitive material made electrically conductive by incorporating carbon black into a polymer layer [14, 15]; it is lightweight, flexible, and cost-effective, making it suitable for a wide range of applications. The pressure intensity is measured by exploiting the principle that the distance between conductive particles within the Velostat film decreases under applied pressure or bending, increasing the number of conductive pathways and thereby reducing the film’s resistance [13]. A single pressure sensor has a sandwich structure consisting of two conductive electrodes printed on a polyimide (PI) film using the PCB printer, with the pressure-sensitive Velostat film positioned in between [16]. To evaluate the fabricated single pressure sensor, we applied force to the sensor and measured resistance values using both a source meter (2400 standard, Keithley) and an Arduino. At the maximum applied force, the source meter measured a resistance of approximately 250 Ω, prompting us to set a reference resistance of 10 kΩ. When no force is applied, the sensor resistance approaches infinity, resulting in an Arduino reading of approximately 0 V; as force is applied, the sensor resistance decreases, and the voltage reading approaches the 5 V supply. Using the voltage-divider principle, we reverse-calculated the sensor’s resistance from this voltage and subsequently calculated the input pressure, outputting the pressure data via the serial monitor. Sequentially applying pressures ranging from 7 to 55 kPa to the single pressure sensor, we constructed a graph of the average resistance values output by the sensor at each applied pressure. The graph in Fig. 2a shows that the fabricated sensor exhibits an exponentially decreasing resistance with increasing pressure.
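The reverse calculation from the Arduino reading to sensor resistance follows directly from the voltage-divider relation. The short sketch below illustrates this step under the values stated above (5 V supply, 10 kΩ reference resistor, 10-bit ADC); the resistance-to-pressure calibration itself is not reproduced here, and the mid-scale example value is only a hypothetical reading.

```python
# Voltage-divider reverse calculation for the piezoresistive pressure sensor.
# The Arduino measures the voltage across the 10 kΩ reference resistor, which is
# in series with the sensor: V_out = Vcc * R_ref / (R_ref + R_sensor).
V_CC = 5.0          # supply voltage (V)
R_REF = 10_000.0    # reference resistor (Ω)
ADC_MAX = 1023      # 10-bit Arduino ADC

def adc_to_resistance(adc_value: int) -> float:
    """Convert a raw ADC reading to the sensor resistance in ohms."""
    v_out = adc_value / ADC_MAX * V_CC
    if v_out <= 0:
        return float("inf")  # no force applied: resistance approaches infinity
    # Rearranged divider equation: R_sensor = R_ref * (Vcc - V_out) / V_out
    return R_REF * (V_CC - v_out) / v_out

# Example: a mid-scale reading of 512 corresponds to roughly R_REF (≈10 kΩ)
print(f"{adc_to_resistance(512):.0f} ohm")
```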

Fig. 2

Single flexible resistive pressure sensor and capacitive humidity sensor. a Single resistive pressure sensor and pressure–resistance correlation; b single capacitive humidity sensor and humidity–capacitance correlation

Humidity sensors can be categorized into two types based on their measurement method: capacitive and resistive. The capacitive method relies on water molecules adsorbed onto the sensing layer, which change the dielectric constant and, consequently, the capacitance [17, 18]; humidity is measured by detecting this increase or decrease in capacitance. The advantages of this approach include high linearity, low sensitivity to external temperature, and the ability to measure low humidity levels [19, 20]. In this study, we adopted the capacitive method to fabricate the humidity sensor. The conventional parallel-plate capacitor design [21] is unsuitable for humidity measurement because the dielectric is enclosed between the top and bottom plates, leaving no path for water molecules to permeate. Therefore, we designed a humidity sensor with an interdigitated structure, as shown in Fig. 2b. Furthermore, unlike the traditional interdigitated electrode (IDE) structure, we printed the capacitor electrodes on both sides of the film to facilitate sensor integration. Although this IDE-based design has the drawback of an additional series capacitance, which lowers the initial capacitance, we adopted it because of its advantages for integration and sensor-array expansion. We then characterized the fabricated single humidity sensor using both an LCR meter and an Arduino. To evaluate the functionality of the completed sensor, we used a constant temperature and humidity chamber (TH3-ME, JEIO Tech) to gradually increase the humidity over approximately 15 min, measured the changes in capacitance, and verified that multiple sensors manufactured under the same conditions exhibited similar ranges of capacitance values. These results form the basis of the graph in Fig. 2b, which shows that the fabricated sensors exhibit an increase in capacitance with rising humidity, confirming the correct functioning of the humidity sensor.
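Because the capacitive response is reported to be highly linear with relative humidity, a simple linear calibration suffices to convert a measured capacitance back into an estimated humidity level. The sketch below illustrates such a fit; the calibration points are hypothetical placeholders, not measurements from the fabricated sensors.

```python
# Linear calibration of capacitance vs. relative humidity for a capacitive
# humidity sensor. The calibration points below are hypothetical placeholders.
import numpy as np

rh_points  = np.array([30.0, 50.0, 70.0, 90.0])   # relative humidity (%RH)
cap_points = np.array([42.0, 45.5, 49.0, 52.5])   # measured capacitance (pF)

# Fit capacitance = slope * RH + offset, then invert it to estimate RH.
slope, offset = np.polyfit(rh_points, cap_points, deg=1)

def capacitance_to_rh(capacitance_pf: float) -> float:
    """Estimate relative humidity (%RH) from a capacitance reading (pF)."""
    return (capacitance_pf - offset) / slope

print(f"{capacitance_to_rh(47.0):.1f} %RH")  # ≈ 58–59 %RH with these placeholder points
```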

To increase the measurement area of the designed single pressure and humidity sensors, we expanded them into 5 × 5 and 5 × 10 sensor array structures, respectively. For the pressure sensor array, we reduced noise from parasitic capacitance by printing the top-side lines horizontally and the bottom-side lines vertically, avoiding interference between them. For the humidity sensor array, we printed electrodes on both sides of the PI film to promote integration and minimize weight; to reduce interference noise between electrode lines, the top-side lines were printed vertically and the bottom-side lines horizontally. To simultaneously collect signals from the 25 and 50 sensors of the two arrays, we utilized a multiplexer (MUX, CD74HC4067, Texas Instruments) and connected flexible flat cables (FFC, AWM 20624, Fuxell) to each side accordingly. A Python heatmap tool was used to visualize the received signal data, as depicted in Fig. 3, allowing us to observe the activity of individual regions of the arrays; the overall arrays were then evaluated. In Fig. 3a, the regions where resistance decreased upon application of force to specific areas of the pressure sensor array are highlighted in blue, providing a clear indication of where the force was exerted. Similarly, for the humidity sensor array (Fig. 3b), placing a wet tissue on a portion of the sensor produced a red region in the heatmap, demonstrating that the fabricated pressure and humidity sensor arrays function effectively even when only specific sections are engaged.
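To make the visualization step concrete, the following sketch reshapes one frame of 25 pressure readings into a 5 × 5 grid and renders it as a heatmap with seaborn. The frame values, units, and color map are hypothetical placeholders; the study's own visualization used a Python heatmap tool in the same spirit.

```python
# Render one frame of 5 x 5 pressure-sensor-array readings as a heatmap.
# The frame values below are hypothetical placeholders (resistance in kΩ);
# the darker blue cells mark the region where force was applied.
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt

# One scan of the 25-element array (row-major order), as collected through the MUX.
frame = np.array([
    10.0, 10.0, 10.0, 10.0, 10.0,
    10.0,  2.1,  1.8, 10.0, 10.0,
    10.0,  1.9,  1.5, 10.0, 10.0,
    10.0, 10.0, 10.0, 10.0, 10.0,
    10.0, 10.0, 10.0, 10.0, 10.0,
]).reshape(5, 5)

ax = sns.heatmap(frame, cmap="Blues_r", annot=True, fmt=".1f",
                 cbar_kws={"label": "resistance (kΩ)"})
ax.set_title("Pressure sensor array (lower resistance = applied force)")
plt.show()
```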

Fig. 3

Flexible pressure and humidity sensor arrays. a Fabricated pressure sensor array and performance evaluation in the absence of pressure (left) versus with pressure applied to a specific area (right); b fabricated humidity sensor array in the absence of humidity (left) versus with a wet tissue placed on a specific area (right)

System integration for behavioral pattern detection

Figure 4a shows the system integration, with the pressure and humidity sensor arrays on the external surface of the block and the IMU inside the block. The BLE board with the IMU is installed inside the block and provides rotation-angle data (yaw, pitch, and roll) in real time as the block moves. The pressure sensor array is attached to the block’s surface, and the humidity sensor array is installed on top of it. The MUX and FFCs connected to the pressure and humidity sensor arrays are housed inside the block, and real-time data are acquired through the Arduino. Using the integrated block, Python heatmaps and a Processing program are used to verify that the sensors operate appropriately in specific environments.

In Fig. 4b, a wet tissue is placed on specific areas of the sensor and pressure is applied on top of it, visually confirming the simultaneous measurement of humidity and pressure in both the affected and unaffected regions. After integration, when pressure is applied without a wet tissue, the pressure response is clearly confirmed while only a weak humidity response is observed; conversely, when a wet tissue is used without applying pressure, the pressure heatmap does not respond and only humidity data are output to some extent.

The yaw, pitch, and roll angles, calculated from the accelerometer and gyroscope outputs of the BLE board using the Madgwick filter, are received by a Processing program that visualizes the board’s movement in 3D. Based on how far the BLE-equipped block is tilted, the program renders the 3D motion of the block from the yaw, pitch, and roll data of the three axes, as depicted in Fig. 4c.

As shown in Fig. 5a, b, behavioral actions were categorized into fine-grained behavior and global behavior, and visualization was carried out to analyze specific behavioral patterns using the integrated block. First, fine-grained behavior detection using the humidity and pressure sensors was divided into two cases: simply pressing with a finger and lightly trembling a finger. On the completed platform, the humidity and pressure sensor arrays clearly distinguish between a pressed and an unpressed finger, as confirmed by the heatmaps. Global behavior was tested using the IMU accelerometer data, divided into shaking and throwing actions. When the block is shaken, the accelerometer values on all three axes change, and when the motion stops, a constant acceleration is observed; this demonstrates that significant changes in the accelerometer values occur only during the shaking action. When the block is thrown, the acceleration in the direction of the fall shows the most significant change, and repeated throwing actions produce nearly identical patterns of accelerometer data on all three axes. Using the sensor data from the completed platform, different values are expected for each specific action, allowing various behaviors to be distinguished.
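For readers who wish to reproduce the orientation step on the host side, the sketch below shows one way to turn accelerometer and gyroscope samples into yaw, pitch, and roll using a Madgwick filter. It relies on the open-source `ahrs` Python package as a stand-in for the exact filter implementation used in this study, and the sample data, sample rate, and filter settings are hypothetical placeholders.

```python
# Estimate yaw/pitch/roll from accelerometer + gyroscope samples with a Madgwick
# filter. Uses the open-source `ahrs` package (pip install ahrs) as a stand-in for
# the filter implementation used in the study; the sample data are placeholders.
import numpy as np
from ahrs.filters import Madgwick

N = 200
gyr = np.zeros((N, 3))                      # angular rate in rad/s (placeholder: at rest)
acc = np.tile([0.0, 0.0, 9.81], (N, 1))     # specific force in m/s^2 (placeholder: lying flat)

madgwick = Madgwick(frequency=100.0)        # IMU stream sample rate (assumed 100 Hz)
q = np.zeros((N, 4))
q[0] = [1.0, 0.0, 0.0, 0.0]                 # initial orientation quaternion (w, x, y, z)
for t in range(1, N):
    q[t] = madgwick.updateIMU(q[t - 1], gyr=gyr[t], acc=acc[t])

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees."""
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return np.degrees([roll, pitch, yaw])

roll, pitch, yaw = quaternion_to_euler(*q[-1])
print(f"roll={roll:.1f} deg, pitch={pitch:.1f} deg, yaw={yaw:.1f} deg")
```

In the same spirit, the shaking and throwing actions described above could be distinguished from this stream by simple features such as the variance of the three-axis acceleration over a short window, although any specific threshold would have to be tuned on real data.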

Fig. 4

Flexible force and humidity sensor–IMU integration. a The comprehensive sensor platform and a cross-sectional view of the layered sensor structure. b Simultaneous detection of both pressure and humidity by the integrated sensor. c Visualization of accelerometer and gyroscope measurements from the IMU

Fig. 5

Behavior pattern analysis. a Detection of fine-grained behavior with integrated data visualization. b Global behavior detection by IMU accelerometer data

Conclusion

This study developed a sense-based user interface (SUI) for analyzing behavioral patterns, using flexible pressure and humidity sensors alongside a commercial inertial measurement unit (IMU). A pressure sensor with rapid response, high sensitivity at low pressures, and a straightforward fabrication process was engineered. For humidity sensing, a capacitive sensor with high linearity and minimal sensitivity to external temperature variations was fabricated. Each sensor was expanded into an array to enhance the measurement area and resolution, and device motion was detected using a commercial IMU. Input data from each sensor were preprocessed and then visualized using heatmaps and 3D motion representations. Conventional behavioral pattern analysis techniques based on visual information, which rely on videos recorded from a third-party perspective, are limited in their ability to analyze fine-grained behavioral patterns and are subject to observer subjectivity. By integrating the three sensor types into an SUI platform, however, it became possible to digitize, collect, and analyze subtle behavioral patterns of young children that may go unnoticed by experts or caregivers. This research may lay the foundation for the development of SUIs for young children as well as for individuals with communication difficulties.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding authors on reasonable request.

References

  1. Mushroor S, Haque S, Amir R (2020) The impact of smart phones and mobile devices on human health and life. Int J Community Med Public Health 7(1):9–15


  2. Bahkir F, Grandee S (2020) Impact of the COVID-19 lockdown on digital device-related ocular health. Indian J Ophthalmol 68(11):2378–2383


  3. De R, Pandey N, Pal A (2020) Impact of digital surge during COVID-19 pandemic: a viewpoint on research and practice. Int J Inf Manag 55:102171


  4. Campos L, Kcrmar M, Caldas Osório A (2023) Predictors of screen exposure among infants under 2 years of age during the COVID-19 pandemic. Infant Behav Dev 73:101885


  5. Sivrikova N, Ptashko T, Perebeynos A, Chernikova E, Gilyazeva N, Vasilyeva V (2020) Parental reports on digital devices use in infancy and early childhood. Educ Inform Technol 25:3957–3973


  6. Harris L, Davis N, Cunningham U, Vocht L, Macfarlane S, Gregory N, Aukuso S, Taleni T, Dobson J (2018) Exploring the opportunities and challenges of the digital world for early childhood services with vulnerable children. Int J Environ Res Public Health 15(11):2407


  7. Cadoret G, Bigras N, Lemay L, Lehrer J, Lemire J (2016) Relationship between screen-time and motor proficiency in children: a longitudinal study. Early Child Dev Care 188(22):231–239


  8. Carter A, Little C, Briggs-Gowan M, Kogan N (2000) The infant–toddler social and emotional assessment: comparing parent ratings to laboratory observations of task mastery, emotion regulation, coping behaviors, and attachment status. Infant Ment Health J 20(4):375–392


  9. Grimes D, Schulz K (2002) Bias and causal associations in observational research. Lancet 359(9302):248–252


  10. Webb S, Jones E (2009) Early identification of autism: early characteristics, onset of symptoms, and diagnostic stability. Infants Young Child 22(2):100–118


  11. Sun Y, Hu J, Wang W, He M, de With PHN (2021) Camera-based discomfort detection using multi-channel attention 3D-CNN for hospitalized infants. Quant Imaging Med Surg 11(7):3059–3069


  12. Chen W, Yan X (2020) Progress in achieving high-performance piezoresistive and capacitive flexible pressure sensors: a review. J Mater Sci Technol 43(15):175–188


  13. Kalantari M, Dargahi J, Kovecses J (2012) A new approach for modeling piezoresistive force sensors based on semiconductive polymer composites. IEEE/ASME Trans Mechatron 17(3):572–581


  14. Jeong E, Lee J, Kim D (2011) Finger-gesture recognition glove using velostat. In: International Conference on control, automation and systems (ICCAS), pp 206–210

  15. Barba R, Madrid A, Boticario J (2015) Development of an inexpensive sensor network for recognition of sitting posture. Int J Distrib Sens Netw 11(8):969237


  16. Giovanelli D, Farella E (2016) Force sensing resistor and evaluation of technology for wearable body pressure sensing. J Sens 2016(13):9391850


  17. Farahani H, Wagiran R, Hamidon M (2014) Humidity sensors principle, mechanism, and fabrication technologies: a comprehensive review. Sensors 14(5):7881–7939


  18. Najeeb M, Ahmed Z, Shakoor R (2018) Organic thin-film capacitive and resistive humidity sensors: a focus review. Adv Mater Interfaces 5(21):1800969


  19. Kim Y, Jung B, Lee H, Kim H, Lee K, Park H (2009) Capacitive humidity sensor design based on anodic aluminum oxide. Sens Actuators B 141(2):441–446


  20. Yoo K, Lim L, Min N (2010) Novel resistive-type humidity sensor based on multiwall carbon nanotube/polyimide composite films. Sens Actuators B 145(1):120–125


  21. Rivadeneyra A, Fernandez-Salmeron J, Banqueri J, Lopez-Villanueva J, Capitan-Vallvey L, Palma A (2014) A novel electrode structure compared with interdigitated electrodes as capacitive sensor. Sens Actuators B 204:552–560



Acknowledgements

This research was supported by the convergence R&D over Science and Technology Liberal Arts Program through the National Research Foundation of Korea funded by the Ministry of Science and ICT (Grant No. 2022M3C1B6081061) and INHA UNIVERSITY Research Grant.

Funding

This research was supported by the convergence R&D over Science and Technology Liberal Arts Program through the National Research Foundation of Korea funded by the Ministry of Science and ICT (Grant No. 2022M3C1B6081061) and INHA UNIVERSITY Research Grant.

Author information

Contributions

SL and JJ fabricated the devices, conducted experiments, and analyzed the data. SL and JJ drafted the figures and manuscript. YTL and MK supervised the experiments and revised the manuscript. All the authors have read and approved the manuscript.

Corresponding authors

Correspondence to Young Tack Lee or Min-gu Kim.

Ethics declarations

Ethics approval and consent to participate

The authors declare that they have no competing interests.

Consent for publication

The authors consent to the SpringerOpen license agreement to publish the article.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


Cite this article

Lee, S., Jang, J., Lee, Y.T. et al. Sense-based user interface platform for behavioral pattern analysis of young children. Micro and Nano Syst Lett 11, 21 (2023). https://doi.org/10.1186/s40486-023-00186-7
