Music, Vision, and Artificial Intelligence: a window into cognitive and learning behavior.
Music involves multisensory processing, integration, and quasi-simultaneous decision making and action taking. Classical piano playing in particular requires fast visual recognition of notated patterns, comprehension of the corresponding motor task, and its execution, all in split-second succession.
Our founder and neuro-aficionado, Yayoi Sakaki, has created a cognitive model based on the mechanics of piano playing and established a method to quantitatively measure direct cognitive feedback.
Our products are based on this model and method, and we have three lines of offering:
1) Research data analysis and AI model development: we help customers build AI models based on our cognitive model, using a variety of visual inputs. Pattern recognition across different sensory inputs, especially visual recognition relaying into decision making and action initiation, is often modeled on neural architectures. We are confident that our model can accelerate the path from visual recognition to decision making, and can help monitor how associative memory is accumulated and retrieved in correlation with specific visual stimuli. Our unique use of musical notation helps us understand how we see, how we recognize visual input, and how we relay that information to action execution. We are therefore confident that our model can not only help decode the brain mechanisms of cognitive control via visual input, but also re-create visuo-motor processing in an AI capable of making its own decisions based on accumulated associative memory.
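To make this recognition-to-action loop concrete, here is a minimal sketch in Python (illustrative only; not our actual model, and all class and function names are hypothetical) of a visual stimulus being matched against an associative memory and relayed into an action:

```python
import numpy as np

class AssociativeMemory:
    """Toy associative memory mapping stimulus feature vectors to actions."""

    def __init__(self):
        self.keys = []       # stored stimulus feature vectors
        self.actions = []    # actions associated with each stimulus
        self.strengths = []  # association strengths (reinforceable)

    def store(self, features, action):
        self.keys.append(np.asarray(features, dtype=float))
        self.actions.append(action)
        self.strengths.append(1.0)

    def recall(self, features):
        """Return the action whose stored stimulus is most similar (cosine)."""
        if not self.keys:
            return None
        f = np.asarray(features, dtype=float)
        sims = [s * (f @ k) / (np.linalg.norm(f) * np.linalg.norm(k) + 1e-9)
                for k, s in zip(self.keys, self.strengths)]
        return self.actions[int(np.argmax(sims))]

def visuo_motor_step(stimulus_features, memory):
    """One recognition -> decision -> action step."""
    action = memory.recall(stimulus_features)
    return action if action is not None else "no-op"

# Usage: associate a notated pattern (as a feature vector) with a key press,
# then present a slightly different stimulus and retrieve the action.
memory = AssociativeMemory()
memory.store([1.0, 0.0, 0.5], "press C4")
print(visuo_motor_step([0.9, 0.1, 0.4], memory))  # -> "press C4"
```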
2) Quantitative cognitive control testing based on user biofeedback data.
Our model is ideal for measuring how a user controls motor action in response to visual stimuli after comprehending the command. Human visual recognition is actually fairly straightforward to train in adults, as 15+ years of piano teaching experience have shown (children, especially up to about 13 years of age, take more time to learn how to recognize patterns). However, a "glitch" seems to occur when recognition is combined with motor execution based on the comprehended information. These "glitches" often include impulsiveness in motor control and unclear understanding of the command, caused by motor initiation being prioritized over visual processing and/or decision making.
Current cognitive control testing methods rely on subjective, observation-based checklists, with no quantitative criteria for evaluating patients. Our approach, by contrast, enables monitoring of the test users' direct cognitive feedback through selective biofeedback responses, such as haptics, eye movements and/or qEEG, depending on the physician's choice for the targeted condition. Our testing helps track the users' cognitive state as often as the physician deems necessary, using VR (Virtual Reality) headset compatible smartphones, either at the convenience of home or at clinics in the presence of healthcare providers. Our target conditions currently include:
ADHD, Parkinson's Disease, mTBI (concussion), and cognitive decline and mild cognitive impairment due to the side effects of chemotherapy.
Our test product, a haptic response analyzer app to measure cognitive control, is currently in development.
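As an illustration of what quantitative cognitive control metrics could look like, the sketch below (hypothetical names; not our shipped analyzer) computes per-trial reaction latency, accuracy, and an impulsivity proxy (responses initiated before stimulus onset) from timestamped stimulus and haptic response events:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Trial:
    stimulus_t: float  # visual command onset, seconds
    response_t: float  # first haptic response (key press), seconds
    correct: bool      # whether the commanded key was pressed

def summarize(trials):
    """Session-level cognitive control metrics from timestamped trials."""
    latencies = [t.response_t - t.stimulus_t for t in trials
                 if t.response_t >= t.stimulus_t]
    premature = sum(1 for t in trials if t.response_t < t.stimulus_t)
    return {
        "mean_latency_s": mean(latencies),
        "latency_sd_s": stdev(latencies) if len(latencies) > 1 else 0.0,
        "accuracy": sum(t.correct for t in trials) / len(trials),
        "premature_responses": premature,  # impulsivity proxy
    }

# Usage with three mock trials (the third response precedes its stimulus):
session = [Trial(0.0, 0.42, True), Trial(2.0, 2.35, True), Trial(4.0, 3.95, False)]
print(summarize(session))
```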
3) Neuromodulation digital therapy app using the same system configuration (the same add-on sensors on a generic mobile VR headset).
Using the same system as in 2), we are exploring a therapeutic application that synchronizes sensory stimuli to specific rhythms to help induce and regulate particular brain oscillation frequencies (a minimal sketch follows the list below). We are looking into its positive impact on the following:
sleep cycle regulation, anxiety relief, emotion regulation, and induction of optimized visual attention during visual tasks
(the above covers a good number of the non-motor symptoms of Parkinson's Disease, which is our primary target condition).
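As a minimal sketch of what rhythm-synchronized stimulation can look like in code (illustrative only; the 10 Hz target is an example alpha-band rate, not our clinical protocol), the function below gates a carrier tone on and off at a target entrainment frequency:

```python
import numpy as np

def pulse_train(target_hz=10.0, carrier_hz=440.0, duration_s=5.0, sr=44100):
    """Amplitude-gate a carrier tone at the target entrainment frequency,
    producing an isochronous rhythmic stimulus (50% duty cycle)."""
    t = np.arange(int(duration_s * sr)) / sr
    gate = 0.5 * (1.0 + np.sign(np.sin(2 * np.pi * target_hz * t)))  # 0/1 square wave
    return gate * np.sin(2 * np.pi * carrier_hz * t)

samples = pulse_train()  # write to a WAV file or stream to the app's audio output
```

The same gating signal could drive a visual flicker or a haptic pulse instead of audio, which is how a single rhythm can be synchronized across sensory channels.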
[Ipsilon News updates]
June 7, 2017: We are pleased to announce a confirmed clinical collaboration with Prof. Gabriele Siciliano and Dr. Sigrid Baldanzi of the Neurology department, Hospital Santa Chiara, Pisa, Italy.
June 2017: Prototype updated for anticipated clinical trial use (8-point EEG sampling, haptic response capture on a 25-key MIDI keyboard controller, and computer vision algorithm software for the data analysis).
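For context on how haptic responses can be captured from such a controller, here is a minimal sketch (assuming the third-party mido library and a connected MIDI device; the port name and event count are illustrative) that timestamps key presses for later alignment with stimulus onsets and EEG samples:

```python
import time
import mido  # third-party MIDI library: pip install mido python-rtmidi

def capture_presses(port_name=None, max_events=10):
    """Timestamp note-on events from a MIDI keyboard for later alignment
    with stimulus onsets and EEG samples."""
    events = []
    with mido.open_input(port_name) as port:  # None opens the default port
        for msg in port:  # blocks, yielding incoming MIDI messages
            if msg.type == "note_on" and msg.velocity > 0:
                events.append({"t": time.monotonic(),
                               "note": msg.note,
                               "velocity": msg.velocity})
                if len(events) >= max_events:
                    break
    return events
```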
- We are seeking investment as a pre-seed startup.
- Our business deck is available upon request (or can be viewed on the SlideShare page of Yayoi Sakaki, Founder).
- We are preparing a patent filing covering our model and method for measuring direct cognitive feedback quantitatively, as well as its possible therapeutic use with specific sensory stimuli presented in the app.
- We are pleased to announce that our operational office is set up at the Erasmus Center for Entrepreneurship (ECE) campus (11th floor) in the Science Tower, Rotterdam, The Netherlands. We are in a transitional period from April to summer 2017, setting up the office for clinical testing purposes.
We have been mentored at the following incubators/accelerators:
ENDuRE Project (University of Pisa, www.endureproject.eu/), May 30, 2016 to June 11, 2016
Erasmus Center for Entrepreneurship (http://ece.nl/), May 6, 2016 to July 1, 2016
Key words: decoding visual stimuli (music notation/symbols) into spatial understanding and execution, eye fixation/saccades and comprehension/decision making, visual recognition and processing, implicit and explicit learning, problem-solving skills, object-related attention, spatial attention, auditory-related attention, metacognition, memory, personalities, learning tendencies, performance, algorithm, eye-hand coordination, fine motor skill, cerebellum and basal ganglia functions, music, piano.