We present a neuro-inspired system for investigating fine dynamic haptic discrimination in neurorobotic and neuroprosthetic applications, adopting a Braille reading task as a case study. First, tactile inputs are encoded at the level of primary afferents mimicking human mechanoreceptors. A network of simulated second-order neurones then processes these primary signals before transmitting them to a downstream classifier, which estimates the likelihood distribution over all Braille characters and uses it to determine the letter being read. We also investigate how this distribution could be used to regulate the fingertip acceleration so as to maximise Braille-reading performance. We employ the spiking neural network paradigm to model first- and second-order neural responses, and apply an information-theoretic analysis to measure the neurotransmission reliability of the spiking patterns from peripheral to more “central” areas of the system. Our results show that the firing patterns of first- and second-order responses convey enough information to achieve perfect offline discrimination of the entire Braille alphabet as early as 250 ms after the first spike. Furthermore, 89% of the scanned characters are correctly recognised during an online Braille reading task at constant velocity. Finally, we show that the class probability distributions obtained during reading can be used to optimise the scanning velocity.