Resonant Body Communication for Neurodivergent Contexts: A Multimodal, Temporally Elastic Blueprint for Inclusive Biometric Systems

Authors' Information:

Piper Hutson

Lindenwood University, USA 

https://orcid.org/0000-0002-1787-6143  

James Hutson

Lindenwood University, USA

https://orcid.org/0000-0002-0578-6052

Vol. 02, No. 11 (November 2025)

Page No.: 952-963

Abstract:

Prevailing emotion-sensing systems privilege a facial-and-gaze paradigm that encodes neurotypical tempo, channel priority, and expression classes, thereby mischaracterizing or erasing neurodivergent communication. This article advances a design-oriented framework for biometric sensing that centers resonant body communication: temporally extended, multimodal, and environmentally situated patterns of posture, gesture, rhythm, and interoception that carry affective meaning. Through an integrative methodology that synthesizes cognitive neuroscience, embodied arts practices, and human–computer interaction, the study formalizes a theoretical model with four pillars: temporal elasticity, multimodal sensory hierarchies, resonant gestures and rhythmic entrainment, and environmental attunement. Building on this model, the article specifies technical requirements for next-generation systems, including whole-body pose capture, wearable inertial and physiological sensing, ambient context instrumentation, and cross-channel fusion pipelines aligned to individual baselines. Temporal analytics—windowed sequence modeling, rhythm extraction, and state trajectory inference—are proposed to recover slow affective dynamics that escape frame-level classifiers. Illustrative design patterns are presented for clinical diagnostics, affective interfaces, extended reality environments, and learning technologies, emphasizing participatory co-design and neurodiversity-affirming outcomes. A parallel ethics program addresses consent, privacy, representational harm, and the risk of normative enforcement, recommending local control, transparent inference, and disability-led governance for deployment settings. The contribution is twofold: a unifying vocabulary for neurodivergent affect as embodied resonance, and a concrete technical blueprint for inclusive biometric architectures. 
By rebalancing attention from faces toward bodies in context, affective technology can move from narrow detection toward attuned interpretation, improving accuracy, dignity, and usefulness for a broader range of minds. Future work outlines validation protocols and cross-domain deployment benchmarks.
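The temporal analytics named above (windowed sequence modeling and rhythm extraction) can be sketched minimally as a sliding-window dominant-frequency estimate over a movement signal. This is a hypothetical illustration, not the authors' pipeline: the function name `dominant_rhythm_hz`, the window and hop lengths, and the synthetic "rocking" accelerometer trace are all assumptions introduced here; a deployed system would operate on real inertial data aligned to individual baselines.

```python
import numpy as np

def dominant_rhythm_hz(signal, fs, win_s=8.0, hop_s=2.0):
    """Slide a window over a 1-D movement signal and return the
    dominant oscillation frequency (Hz) in each window via FFT.

    A frame-level classifier sees isolated samples; this recovers
    slow rhythmic structure (e.g. rocking) that spans seconds.
    """
    win = int(win_s * fs)
    hop = int(hop_s * fs)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    taper = np.hanning(win)
    out = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        seg = seg - seg.mean()              # remove DC offset
        spectrum = np.abs(np.fft.rfft(seg * taper))
        spectrum[0] = 0.0                   # ignore residual DC bin
        out.append(freqs[np.argmax(spectrum)])
    return np.array(out)

# Synthetic example: 0.5 Hz rocking sampled at 50 Hz for 60 s.
fs = 50
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / fs)
accel = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(len(t))
rates = dominant_rhythm_hz(accel, fs)
print(rates.round(2))  # each window should sit near 0.5 Hz
```

With an 8 s window at 50 Hz the frequency resolution is 50/400 = 0.125 Hz, so a 0.5 Hz rhythm falls exactly on a bin; longer windows trade responsiveness for finer resolution, which is the core tension in recovering slow affective dynamics.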

Keywords:

Neurodiversity, Biometrics, Embodiment, Interoception, Inclusivity
