I am working on a personal project that helps people learn to recognize Chinese characters. In addition to seeing the character and hearing its sound, I want the skin to feel the stroke order. Hopefully one can learn to recognize characters faster with this additional sensory input (the latest brain-plasticity research is encouraging). To achieve this I want to make a grid of vibration motors that can be activated by specific stroke coordinates. I would love help thinking through a v1 design, picking out components, and doing a first schematic in Eagle! Thank you for your time, and if you are interested we can set up a meeting time at the makerspace.
Thank you Walter for the reply; I like where your thoughts are going. @zmetzing gave some good advice: break out the key areas of functionality with some requirements to make this more digestible. This should make it easier for everyone to reply to specific areas. Please note this is still a work in progress, and any comments, links, and help with one or many sections are welcome!
Haptic Surface
The purpose of the haptic pad is to translate the position and motion of a character's individual strokes into vibrations the user can feel. The basis for this approach is found in recent haptic [vest products](http://news.rice.edu/2015/04/08/vest-helps-deaf-feel-understand-speech/), which show the most promising way to influence the brain. Here are more details:
- A pad-like surface is ideal so that it can be easily moved to different body parts
- Actuators should have the most direct contact with the skin possible
- Actuators should be at least 6cm apart to be distinctly felt
- Individual strokes need to be represented as vibrations and held in an on state in the stroke order for the character
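To make the last requirement concrete, here is a rough playback sketch in Python. Everything in it is an assumption: strokes arrive as lists of `(row, col)` grid cells, and `set_motor(row, col, on)` is a hypothetical driver call for one actuator. Each stroke's cells are switched on in order and stay on, so the character builds up on the skin stroke by stroke.

```python
import time

def play_character(strokes, set_motor, dwell=0.15, pause=0.5):
    """Play a character's strokes on the motor grid, in stroke order.

    strokes: list of strokes, each a list of (row, col) cells.
    set_motor(row, col, on): hypothetical driver call for one actuator.
    Cells stay on after their stroke finishes, so the whole
    character accumulates; everything is cleared at the end.
    """
    for stroke in strokes:
        for (r, c) in stroke:        # trace the stroke cell by cell
            set_motor(r, c, True)
            time.sleep(dwell)
        time.sleep(pause)            # brief gap between strokes
    time.sleep(pause)                # hold the full character briefly
    for stroke in strokes:
        for (r, c) in stroke:
            set_motor(r, c, False)
```

The timing constants (`dwell`, `pause`) are placeholders to tune once actuators are in hand.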
@wandrson The Nitinol approach is interesting, but how well would it represent individual strokes? A key learning step is seeing every stroke of a specific character drawn out.
Stroke Position to Motor Translation
This feature is where vector-graphic points are translated into specific actuators and the commands to drive them. Does anyone know a good algorithm for this?
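One common way to do this translation is to rasterize each straight stroke segment onto the actuator grid with Bresenham's line algorithm, the same technique used to draw lines on pixel displays. A sketch, assuming stroke points come in as normalized 0-1 coordinates and the grid is the 5x4 starting size discussed below (both assumptions):

```python
def to_grid(x, y, cols=5, rows=4):
    """Map a normalized stroke point (0..1) to a grid cell index."""
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return col, row

def bresenham(x0, y0, x1, y1):
    """Return the grid cells along a straight segment from
    (x0, y0) to (x1, y1), using Bresenham's line algorithm."""
    cells = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return cells
```

Curved strokes can be approximated by chaining short segments; the resulting cell list is exactly the "commands" the driver layer needs.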
Mobile Hanzi Drawing App
This is a mobile phone app that will draw each hanzi character for the user. It should respond to start and stop commands, and as the character is being drawn on screen the actuators must vibrate in parallel.
Mobile App to Haptic Surface Communication
This concerns the communication mechanism from the mobile app to the haptic surface mentioned above. My first thought is Bluetooth LE or USB. Would Bluetooth be easier? Are there other options?
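One way to keep both options open is to define a small transport-agnostic byte frame, so the same payload can be written to a BLE characteristic or sent down a USB serial link. A hypothetical sketch of such a frame (the layout, start byte, and opcodes are all invented for illustration):

```python
import struct

START = 0x7E            # frame delimiter (arbitrary choice)
OP_ON, OP_OFF = 0x01, 0x02   # hypothetical opcodes: motor on / off

def pack_frame(opcode, row, col):
    """Pack one actuator command into a 5-byte frame:
    start byte, opcode, row, col, simple additive checksum."""
    checksum = (opcode + row + col) & 0xFF
    return struct.pack("BBBBB", START, opcode, row, col, checksum)

def unpack_frame(frame):
    """Validate and decode a frame; raises ValueError if corrupt."""
    start, opcode, row, col, checksum = struct.unpack("BBBBB", frame)
    if start != START or checksum != (opcode + row + col) & 0xFF:
        raise ValueError("corrupt frame")
    return opcode, row, col
```

A 5-byte frame also fits comfortably inside the default BLE packet size, which matters if Bluetooth wins out.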
@wandrson
This seems an excessively coarse distance. If you look at Braille devices you'll find that people can distinguish much closer sensations.
Good point, and yes, fingertip sensation can resolve much closer spacing. I chose 6cm since I see the surface being used on someone's back or forearms. Various studies seem to agree that the back has around a 5-6cm two-point spatial discrimination threshold. One's fingers will be used to interact with the mobile app, so I want to keep those free.
@wandrson
What size grid are you going to need to represent these characters? All of the technologies I am aware of that can create physical movements would need to use a pixel representation of the strokes and the character.
Very good question, and I think you are spot on about the pixel representation of the strokes and character. In fact I see it as similar technology to LED display grids, but with actuators instead of LEDs. Is that what you mean as well? If so, can LED drivers be used to turn the actuators on and off? I still need to research the grid size, but starting small (e.g. 5x4) might be a good way to begin. Thoughts?
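The LED-display analogy suggests a "framebuffer" data model: each moment of the playback is a rows-by-cols boolean frame marking which actuators are on, and a character becomes a sequence of cumulative frames. A sketch assuming the 5x4 starting grid mentioned above:

```python
ROWS, COLS = 4, 5   # assumed starting grid size (5 wide x 4 tall)

def empty_frame():
    """A blank frame: nothing vibrating."""
    return [[False] * COLS for _ in range(ROWS)]

def add_stroke(frame, cells):
    """Return a copy of frame with the stroke's (row, col) cells on."""
    new = [row[:] for row in frame]
    for r, c in cells:
        new[r][c] = True
    return new

def frames_for(strokes):
    """Cumulative frames: frame i shows strokes 0..i, the way a
    character builds up on an LED grid as it is drawn."""
    frames, frame = [], empty_frame()
    for stroke in strokes:
        frame = add_stroke(frame, stroke)
        frames.append(frame)
    return frames
```

One caveat on reusing LED drivers directly: many LED matrix drivers rely on rapid row-scanning multiplexing, and whether a vibration motor responds usefully to that kind of pulsed drive is worth testing before committing.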
Interesting-sounding project. Given that your grid will be fairly coarse, you can avoid the types of actuators that are small and hence expensive. A straight solenoid would work, but if it were me I would use a mini or micro servo such as those used in RC cars. They contain all of the drivers, and all you need is a PWM signal to control them.
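For reference, here is the PWM math behind driving a hobby servo, assuming the conventional timing (a 50 Hz frame with a 1.0-2.0 ms pulse mapped to 0-180 degrees; check your servo's datasheet, as ranges vary):

```python
PERIOD_MS = 20.0                     # 50 Hz servo frame
MIN_PULSE_MS, MAX_PULSE_MS = 1.0, 2.0  # conventional 0 and 180 degree pulses

def pulse_width_ms(angle_deg):
    """Pulse width commanding the given servo angle (clamped to 0-180)."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return MIN_PULSE_MS + (angle_deg / 180.0) * (MAX_PULSE_MS - MIN_PULSE_MS)

def duty_cycle_pct(angle_deg):
    """Duty cycle percentage, the form most PWM peripherals want."""
    return 100.0 * pulse_width_ms(angle_deg) / PERIOD_MS
```

So a 90-degree command is a 1.5 ms pulse, i.e. a 7.5% duty cycle at 50 Hz; a microcontroller timer just has to hold that per channel.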
If you don’t have microcontroller experience, you may want to take @zmetzing’s next intro-to-ARM class, since driving a matrix with decent response times is a bit beyond your typical Arduino.