Late Blindness and Deafness are Associated with Decreased Tactile Sensitivity, But Early Blindness is Not

Perceptual experience is shaped by a complex interaction between our sensory systems in which each sense conveys information on specific properties of our surroundings. This multisensory processing of complementary information improves the accuracy of our perceptual judgments and leads to more precise and faster reactions. Sensory impairment or loss in one modality leads to information deficiency that can impact other senses in various ways. For early auditory or visual loss, impairment and/or compensatory increase of the sensitivity of other senses are equally well described. Investigating individuals with deafness (N = 73), early blindness (N = 51), late blindness (N = 49) and corresponding controls, we compared tactile sensitivity using the standard monofilament test on two locations, the finger and handback. Results indicate lower tactile sensitivity in people with deafness and late blindness but not in people with early blindness compared to respective controls, irrespective of stimulation location, gender, and age. Results indicate that neither sensory compensation nor simple use-dependency or a hindered development of the tactile sensory system is sufficient to explain changes in somatosensation after the sensory loss but that a complex interaction of effects is present.


INTRODUCTION
Our sensory systems (vision, audition, touch, smell, and taste) serve as highly precise tools to convey information on our environment, allowing us to function and interact with the world. In its simplest form, each system responds individually to a stimulus using its own specialized receptors to translate the stimulus into an electrical signal that neurons can further process to formulate insights into surrounding events and thereby provide a representation of the external world (Hendry and Hsiao, 2013). Each modality is suited to process specific properties of our surroundings (Amadeo et al., 2019). For instance, our visual system has the highest spatial resolution and therefore acquires most spatial information, enabling us to recognize objects or people (Eimer, 2004; Guttman et al., 2005). Audition is the most accurate sense for processing temporal information (Repp and Penel, 2002; Guttman et al., 2005; Barakat et al., 2015). Importantly, information from different senses complements each other: vision is limited to the binocular visual field and requires light, whereas audition can capture events behind the head and is not impeded by darkness (Voss et al., 2010; Bremner et al., 2012). Thus, the perceptual experience of our surroundings is not shaped by each modality separately but by a complex interaction between sensory systems (Eimer, 2004). This multisensory processing of complementary information leads to more precise and faster reactions (Spence, 2011) by improving the accuracy of our perceptual judgments, as uncertainties are minimized by combining different information (Dekker and Lisi, 2020). For example, Alais and Burr (2004) presented brief light ''blobs" (visual stimuli) or sound ''clicks" (auditory stimuli) for which healthy participants stated the relative localization to each other. When both modalities were presented simultaneously, vision dominated the relative localization.
However, when visual stimuli were profoundly blurred, estimation was based more on sound. Notwithstanding, the precision of relative localization based on information of both senses was higher than based on one modality alone. Thus, sensory impairment or loss in one modality leads to a severe information deficiency in an individual's perception and impedes localization abilities.
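The reliability-weighted integration described here can be made concrete with a minimal sketch of maximum-likelihood cue combination, the model underlying Alais and Burr (2004). The function name and the numerical values below are illustrative assumptions, not data from the study.

```python
# Minimal sketch of optimal (maximum-likelihood) cue combination:
# each cue's estimate is weighted by its reliability (inverse variance).
# Positions and variances are in arbitrary units for illustration only.

def combine_cues(x_vis, var_vis, x_aud, var_aud):
    """Reliability-weighted average of a visual and an auditory estimate."""
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_aud)
    w_aud = 1 - w_vis
    x_combined = w_vis * x_vis + w_aud * x_aud
    # The combined variance is always smaller than either single-cue
    # variance, which is why bimodal judgments are more precise.
    var_combined = 1 / (1 / var_vis + 1 / var_aud)
    return x_combined, var_combined

# Sharp visual blob: vision dominates the combined localization.
print(combine_cues(x_vis=0.0, var_vis=1.0, x_aud=5.0, var_aud=9.0))

# Heavily blurred blob: the weighting shifts toward audition.
print(combine_cues(x_vis=0.0, var_vis=25.0, x_aud=5.0, var_aud=9.0))
```

Blurring the visual stimulus raises its variance, which moves the combined estimate toward the auditory position while the combined variance still stays below that of either cue alone.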
Accordingly, the sensory compensation hypothesis (Whitton et al., 2021) postulates that the loss of one modality leads to an increase in physiological sensitivity of the intact senses, to compensate for the missing input and preserve the functioning of the individual in the world (Allman et al., 2009; Heimler and Pavani, 2014). At first sight, this seems plausible: imagine trying to find the door in the darkness of the night without bumping into furniture or people. Here, one typically tries to navigate the room by groping furniture and walls and by listening for the movement of others. In other words, one relies on the unhindered senses. This reliance on other senses is of course permanently necessary in people with sensory loss in one modality. Thus, the brain regions processing the respective information show higher activity than those deprived of sensory input, possibly leading to plastic structural changes. For instance, regarding hearing loss, several studies detected activation of the auditory cortex after presentation of tactile and visual stimuli (Kok et al., 2014; Campbell and Sharma, 2016), indicating cross-modal reorganization, meaning a takeover of an area that is no longer needed for its original task. However, the compensation mechanism has sometimes been explained as usage-dependent; namely, the compensation takes place on a behavioral level, as another sense is more frequently used to perform daily tasks, which leads to more experience and, therefore, higher ability of the sensory impaired individuals in comparison to those with intact senses. For instance, higher tactile acuity in people with blindness might result from Braille usage and, therefore, may not be a direct consequence of vision loss (Wong et al., 2011). This is supported by studies finding that differences in perceptual performance between sensory impaired and healthy controls can vanish with additional practice of the latter (Grant et al., 2000; Papagno et al., 2016).
Evidence on the direction of changes in sensitivity of the intact senses in individuals with blindness or deafness, the most often studied sensory loss groups, is mixed.

People with blindness
Research on sensory compensation in people with blindness has mainly focused on auditory tasks. As vision is crucial for spatial information processing, most research has focused on abilities to localize auditory stimuli. In line with the sensory compensation hypothesis, studies indicate that people with blindness excel in localization tasks in central (Muchnik et al., 1991; Lessard et al., 1998; Röder et al., 1999; Voss et al., 2004) and peripheral auditory space (Röder et al., 1999) as well as behind the inter-aural plane, even when sounds are presented monaurally (Lessard et al., 1998). As individuals with blindness sometimes use echo cues when navigating, it is not surprising that people with blindness show a higher sensitivity for echo cues (Dufour et al., 2005; Schenkman and Nilsson, 2010; Kolarik et al., 2013a). Moreover, they have been shown to outperform sighted individuals in pitch discrimination and in discrimination of speech material in noise (Muchnik et al., 1991), and to demonstrate higher ability in the perception of auditory motion irrespective of onset of sensory loss (Lewald, 2013) as well as in auditory temporal order judgment tasks (Stevens and Weaver, 2005).
These studies could indicate a robust link between superior auditory abilities and blindness and therefore support the assumption of sensory compensation. However, several studies found evidence for impaired auditory perception in people with blindness in certain conditions of auditory localization. For example, in monaural localization of sounds, higher precision in the horizontal plane seems to be concomitant with a deficit in the vertical plane, indicating a trade-off in localization proficiency (Voss et al., 2015). Moreover, the superiority in sound localization seems only to apply to the judgment of relative position, as absolute distance judgments are less precise in people with blindness than in sighted individuals (Wanet and Veraart, 1985; Lewald, 2002; Kolarik et al., 2013b). In the social context, people with blindness have been found to have no advantage over sighted controls in the use of vocal cues for trait attribution (Pisanski et al., 2016; Oleszkiewicz et al., 2017; Pisanski et al., 2017). Although most evidence from previous studies tends to support the sensory compensation hypothesis, there is a body of research offering conflicting conclusions or at least suggesting that compensation is limited to selected aspects of perception.
Differences in tactile sensitivity of people with blindness compared to sighted have been researched less often, and again, results are inconsistent. While some studies indicate lower grating orientation thresholds for people with blindness (van Boven et al., 2000;Goldreich and Kanics, 2003;Norman and Bartholomew, 2011), others did not find significant differences as compared to sighted controls (Grant et al., 2000;Alary et al., 2009;Pellegrino et al., 2020). Likewise, thresholds for vibrotactile discrimination were both reported to be better (Wan et al., 2010) and not significantly different compared to sighted controls (Alary et al., 2009). Similarly, texture and shape discrimination abilities were sometimes found to be better in people with blindness (Alary et al., 2009;Norman and Bartholomew, 2011) but other times evaluated as identical to sighted individuals irrespective of active or passive tasks (Heller, 1989).

People with deafness
Evidence on the sensitivity of the intact senses in people with deafness seems to be even more controversial than in people with blindness. Regarding space, auditory perception allows processing beyond the visual field. Compensation for this missing information in people with deafness will probably be realized via the visual system, as this sense has a more extended ''reach" than touch, smell, and taste. This compensation has been demonstrated by reports of deaf people's superiority in visual stimuli detection (Heimler and Pavani, 2014), especially in the peripheral field (Lore and Song, 1991), lower motion detection thresholds (Stevens and Neville, 2006; Shiell et al., 2014), faster and more accurate motion direction discrimination (Hauthal et al., 2013), as well as faster temporal order discrimination compared to hearing individuals (Nava et al., 2008). However, other studies indicate no difference in movement localization performance (Hauthal et al., 2013), motion change detection both in the peripheral and central visual field (Brozinsky and Bavelier, 2004), visual temporal pattern detection (Mills, 1985), or thresholds in the perception of temporal order (Poizner and Tallal, 1987; Nava et al., 2008). Some even reported higher visual temporal thresholds in people with deafness (Heming and Brown, 2005) and impaired performance in judgments of visual duration (Kowalska and Szelag, 2006).
As in people with blindness, research on tactile compensation in deafness is scarce, with often low-powered studies and no clear conclusions on the possible compensational mechanisms. Studies with deaf sample sizes between six and twelve individuals report no difference between people with deafness and hearing controls in response times for detection of tactile stimuli (Heimler and Pavani, 2014) and discrimination of spatial length of touches (Bolognini et al., 2012). Moreover, tactile temporal thresholds (Heming and Brown, 2005) and temporal duration discrimination (Bolognini et al., 2012) seem to be impaired compared to hearing controls. However, detection thresholds for vibratory stimuli have either been reported not to differ significantly between people with deafness and hearing individuals (Levänen and Hamdorf, 2001; Moallem et al., 2010) or to be higher in people with deafness (Moshourab et al., 2017), while detection of frequency changes in vibratory stimuli is even more accurate (Levänen and Hamdorf, 2001). Additionally, people with deafness seem to have improved haptic orientation processing when they are asked to actively align linear stimuli parallel to each other (van Dijk et al., 2013), while performance was worse compared to hearing controls in a passive gradient orientation task on the finger (but not on the tongue), though hearing devices reduced the impairment (Pellegrino et al., 2020). Only one of these studies (van Dijk et al., 2013) used a task in which participants had to use stimuli actively. All others only asked for a reaction after the application of stimuli by the experimenter, such that the sensory task on the part of the participants was of a purely passive nature. Thus, even on this low level of processing, results are contradictory.
Concluding, evidence concerning visual and tactile changes after auditory or visual loss does not consistently support the sensory compensation hypothesis and even partially contradicts it.

Goals of the present study
Most studies concerning sensory compensation have focused on the impact on vision and audition, while evidence concerning the tactile sense is scarce. Previous studies were often conducted with small samples, which might not be powerful enough to reveal small statistical effects. Studies often focused on only one sensory loss group, making it difficult to draw comparative conclusions. To address these limitations, the present study was conducted in a large pool of subjects with blindness or deafness and matched control groups of sighted and hearing subjects. A passive task was used to assess differences in tactile thresholds because the existing evidence for tactile sensory compensation in deafness regarding low-level processing is scarce and conflicting. Therefore, in our view, gathering conclusive evidence should start with more basic tasks. We chose to examine tactile sensitivity in two spots: (1) the distal phalange of the index finger and (2) the soft tissue at the handback at the junction of the index finger and the thumb, to test whether altered tactile sensitivity may result from training in daily tasks. If so, individuals with blindness would be more likely to excel in tactile sensitivity tasks performed at the index finger, but not necessarily at the handback, which is not involved in daily routines (e.g., Braille reading). Correspondingly, this effect should be higher in early compared to late blindness. However, if enhanced tactile sensitivity is present for both locations, this would indicate a general change in the tactile system. Additionally, if this increased sensitivity is also present in individuals with deafness, this could indicate a general effect of attention reallocation on the remaining senses.

EXPERIMENTAL PROCEDURES Subjects
Data from 375 subjects are included in the present study: 100 individuals with blindness (46 female), 98 sighted controls (52 female), 73 participants with deafness (36 female), and 97 hearing controls (50 female). Two control groups were needed because instructions for individuals with blindness and respective controls (further named ''sighted") were given orally, whereas subjects with deafness and hearing controls (further named ''hearing") received written information that a sign language translator could accompany. Both sighted and hearing controls had fully functional senses. None of the participants reported diabetes, neurological or psychiatric diseases, or other cognitive deficits.
Subjects with blindness were contacted through the Polish Association of the Blind, Social Support Services (a non-governmental foundation supporting individuals with blindness), and a contact database from previous projects. People with blindness were, in accordance with Rombaux et al. (2010), classified as people with early blindness (N = 51), with onset of blindness before 2 years of age, or people with late blindness (N = 49), with onset after 2 years of age. In the current sample, all but four of the people with early blindness were blind since birth; these four individuals lost their vision at 1, 1.5, or almost 2 years of age. Sensory loss in the people with late blindness occurred between 2 and 48 years of age (M = 21.56, SD = 13.76, compare Table 1). All individuals with early blindness and 35 with late blindness had knowledge of Braille reading.
Participants with deafness were contacted through the Polish Association of the Deaf, Social Support Services, and educational centers for people with deafness. Only subjects embedded in the deaf community with a medical certificate of hearing loss, who were not able to hear below 90 dB in the better ear, were invited to participate in the study. Auditory screening tests, i.e., vocal audiometry and a triplet test, were conducted to confirm profound deafness. In the former, subjects had to decide which of six similar verbal descriptors was presented via headphones. In the latter, three verbally presented digits with varying sound-to-noise ratios had to be repeated (van den Borre et al., 2021). Subjects achieving less than 50% intelligibility under any condition were included in the group with deafness, whereas only subjects reaching 100% intelligibility for all sound-to-noise ratios in both ears were classified as hearing controls. Accordingly, 25 self-declared deaf individuals had to be excluded from the study as they reached higher values; their hearing controls were retained in the study. Sensory loss in the remaining subjects with deafness occurred between 0 and 10 years of age (M = 1.00, SD = 2.21, compare Table 1). 41 individuals with deafness reported usage of hearing treatments, with 31 using hearing aids and 10 cochlear implants.
Controls were recruited via leaflets, press releases, and personal contacts of subjects. Healthy participants were recruited independently for both impaired groups to ensure age and gender adjusted distribution for each of the samples. Exclusion criteria were any reported health impairments.

Ethics
Each participant was informed about the course of the study before giving written informed consent. All received modest monetary compensation for their time and involvement in the study. All procedures were conducted in accordance with the Declaration of Helsinki on human experimentation and approved by the Institutional Review Board at the University of Wroclaw.

Tactile threshold estimation
Tactile thresholds were determined using monofilaments (sizes 1.65, corresponding to 0.008 g, to 6.65, corresponding to 300 g; UPC: 768627000482) at the distal phalange of the index finger or at the back of the hand on the soft tissue about 1 cm from the junction of the thumb with the index finger (Fig. 1), both on the dominant hand.
The respective hand was determined in an interview prior to the study. Both locations were marked to secure reliable stimulation at the same point. Test subjects were asked to sit comfortably but stably in a chair with their elbows and forearms resting on the table and their dominant hand perpendicular to the body and not to move during testing.
The test involved randomly either touching or not touching the marked location with the monofilament fiber. The participant was then asked to indicate whether they felt a touch or not. The monofilament was applied until bent to approximately 5 mm of curvature. The principal investigator trained all experimenters in the visual assessment of 5 mm bending beforehand. Threshold determination was carried out in a reverse staircase fashion beginning with the thinnest monofilament. If stimulation with a filament was not felt, or touch was indicated although no stimulation occurred, stimulation was continued with a thicker monofilament. If stimulation was then felt, the respective size of the used monofilament was noted, and stimulation was continued with a thinner monofilament until no stimulation was felt. This monofilament size was again noted before continuing to a thicker one. Again, sizes were increased until stimulation was felt, the respective monofilament size was noted, and sizes were decreased until no stimulation was felt. The overall threshold was calculated over all four noted monofilament sizes. As guessing in a 2-alternative response (touch felt or not) can lead to a correct response by chance (50%), three correct touch indications were needed for thin monofilaments (sizes 1.65-4.08) and two for thicker monofilaments (4.17-6.65) to be rated as ''felt", reducing the chance level to 12.5% and 25%, respectively.
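The staircase logic above can be sketched in code. The following is an illustrative simulation, not the authors' implementation: it assumes a deterministic simulated observer and uses the standard 20-piece Semmes-Weinstein filament sizes; function names are hypothetical.

```python
# Illustrative sketch of the reverse-staircase threshold procedure: ascend
# until the touch is felt, descend until it is no longer felt, repeat, and
# average the four noted filament sizes. Requiring three correct indications
# at a 50% guessing probability reduces the chance level to 0.5**3 = 12.5%
# (two correct: 0.5**2 = 25%).

FILAMENT_SIZES = [1.65, 2.36, 2.44, 2.83, 3.22, 3.61, 3.84, 4.08,
                  4.17, 4.31, 4.56, 4.74, 4.93, 5.07, 5.18, 5.46,
                  5.88, 6.10, 6.45, 6.65]  # standard Semmes-Weinstein set

def staircase_threshold(felt, sizes=FILAMENT_SIZES, n_reversals=4):
    """Average the filament sizes noted at `n_reversals` reversal points.

    `felt(size)` returns True if the (simulated) subject detects the filament.
    """
    reversals = []
    i, ascending = 0, True
    while len(reversals) < n_reversals:
        if ascending:
            # Increase size until the touch is felt; note that size.
            while i < len(sizes) - 1 and not felt(sizes[i]):
                i += 1
        else:
            # Decrease size until the touch is no longer felt; note that size.
            while i > 0 and felt(sizes[i]):
                i -= 1
        reversals.append(sizes[i])
        ascending = not ascending
    return sum(reversals) / len(reversals)

# Deterministic observer that feels every filament of size >= 3.61.
threshold = staircase_threshold(lambda s: s >= 3.61)
print(threshold)
```

With the deterministic observer the staircase oscillates between the first felt size (3.61) and the last not-felt size (3.22), so the estimate lands between the two; lower values indicate higher tactile sensitivity.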

Procedure
First, participants were interviewed in a quiet room on their health status, handedness, and sensory status. After that, individuals were seated comfortably at a table and received instruction on the procedure for tactile threshold estimation. After clarification of potential questions, the participant was blindfolded. Threshold estimations were performed separately for finger and handback, with a counter-balanced design indicating at which location to start. After completing measurements in the first location, a five-minute break was given before continuing to the second location. For the subjects with deafness and hearing controls, instructions were given in writing. For the interview, possible questions, and in case of need for additional assistance, a sign language interpreter was available for participants with deafness.

Statistical analysis
Differences in thresholds between sensory impaired subjects and respective controls were analyzed with linear mixed models using the package lme4 (Bates et al., 2015) in R Version 4.1.3 (R Core Team, 2022). Separate models were estimated for people with blindness and deafness and their respective controls (sighted and hearing). Sensory status (early blindness, late blindness, and sighted for visual loss; deafness and hearing for auditory loss), tactile location (finger vs. handback), and their interaction term were included as fixed factors. As random effects, intercepts for subjects were included. Age and gender were included as control variables. Post-hoc contrast analysis was estimated using the package emmeans (Lenth, 2022). To test the effect of duration since sensory loss, a separate model for people with late blindness and deafness was estimated with the fixed factors duration of sensory loss and location and their interaction, while controlling for gender and, in the case of deafness, for hearing treatment condition (none vs. hearing aid vs. cochlear implant). As duration since sensory loss and age are correlated, age was dropped as a control variable to avoid multicollinearity in estimation. To test a possible experience dependence of the sensory threshold, we additionally estimated a model including the fixed factors location and Braille knowledge and their interaction. As would be expected, duration of sensory loss is significantly longer (W = 121, p = 0.001) for people with knowledge of Braille (Mdn = 19.00, IQR = 19.00) than for those without (Mdn = 10.25, IQR = 9.00). Therefore, in addition to gender, we included duration of sensory loss as a control variable. Only subjects with late blindness were considered, as all subjects with early blindness had knowledge of Braille. Significance was estimated using Satterthwaite's method via the package lmerTest (Kuznetsova et al., 2017). Statistical significance was defined as p < 0.05.

Fig. 1. For tactile threshold estimation, subjects were stimulated with monofilaments of different sizes (1.65-6.65) on the dominant hand at the distal phalange of the index finger or the back of the hand. After randomly either touching or not touching the marked location, subjects had to indicate whether they felt the touch or not. In a reverse staircase fashion, stimulation began with the thinnest monofilament and proceeded to the filament whose touch was felt; the procedure was then continued in the reversed direction until the touch was no longer felt. This process was executed a second time. Sensory thresholds were calculated as the mean of these four filament sizes. Accordingly, lower values indicate higher tactile sensitivity.
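To make the fixed-effects specification concrete, the following stdlib-only Python sketch shows how a formula of the form threshold ~ status * location + age + gender expands into a design-matrix row under treatment (dummy) coding. The function name, reference levels, and example values are illustrative assumptions; the actual analysis was run in R with lme4, which additionally fits a random intercept per subject.

```python
# One fixed-effects design-matrix row for the model described above:
# intercept, two dummy-coded main effects, their interaction, and the
# control variables age and gender. Reference levels are assumptions.

def design_row(status, location, age, gender,
               status_ref="hearing", location_ref="finger", gender_ref="f"):
    """Return [intercept, status, location, status:location, age, gender]."""
    d_status = 1.0 if status != status_ref else 0.0        # e.g. deaf vs. hearing
    d_location = 1.0 if location != location_ref else 0.0  # handback vs. finger
    return [
        1.0,                    # intercept
        d_status,               # main effect: sensory status
        d_location,             # main effect: tactile location
        d_status * d_location,  # status x location interaction term
        float(age),             # control variable: age
        1.0 if gender != gender_ref else 0.0,  # control variable: gender
    ]

# A deaf subject measured at the handback activates both main effects and,
# because both dummies are 1, the interaction term as well.
print(design_row("deaf", "handback", age=35, gender="m"))
print(design_row("hearing", "finger", age=20, gender="f"))
```

A non-significant coefficient on the interaction column is what the authors report when stating that group differences held "irrespective of stimulation location".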

Deaf status
As in subjects with blindness, the overall model (compare supplementary Table S6) showed significant variance in intercepts across individuals (SD = 0.23, 95% CI: 0.15, 0.30, χ²(1) = 11.68, p = 0.001), pointing to individual variability in tactile sensitivity. Moreover, effects observed in individuals with deafness resembled those observed in participants with blindness. A significantly higher threshold for stimulation at the handback compared to the finger (b = 0.51, t(167.00) = 8.88, p < 0.001) and a significant decline of tactile sensitivity with age (b = 0.01, t(170.00) = 2.37, p = 0.019) were found. Additionally, subjects with deafness showed significantly higher thresholds compared to controls (b = 0.18, t(318.78) = 2.47, p = 0.014) in both locations (Fig. 4), as the interaction between sensory status and stimulus location was not significant (b = −0.06, t(170.00) = −0.71, p = 0.476, compare supplementary Table S7).

Fig. 2. Boxplots display tactile sensory thresholds across testing location (phalange of the index finger vs. back of the hand) for sighted controls (beige), subjects with early blindness (light blue) and late blindness (dark blue). Analysis indicates significantly higher thresholds and therefore lower tactile sensitivity in individuals with late blindness compared to both sighted controls and subjects with early blindness. Across all groups, significantly higher thresholds were found at the back of the hand compared to the phalange of the index finger on the dominant hand.

DISCUSSION
The current study focuses on possible changes in tactile sensitivity after visual or auditory loss. For this purpose, we assessed tactile sensory thresholds at the finger and handback in the most often studied sensory loss groups, namely individuals with blindness and deafness, and compared results with matched sighted or hearing controls. Our results indicate no differences between people with early blindness and sighted individuals but lower tactile sensitivity for people with late blindness and deafness compared to the respective controls, irrespective of stimulation location, gender, and age. Results show higher sensitivity at the finger compared to the handback in all groups, a decline of tactile sensitivity with age, and similar sensitivities in men and women.
Regarding differences between sensory loss groups, results concerning people with deafness compared to hearing controls contradict the sensory compensation hypothesis, as tactile sensitivity seems to be lower after hearing loss. As most individuals with deafness lost their hearing ability before the age of one year and none after the age of 10 years, it is possible that the deficit in the tactile sense is based on missing auditory information during the development of the sensory system (Voss et al., 2010; Pavani and Bottari, 2012; Voss, 2016). An early loss of auditory perception could have led to language impairment, which can result in deficits in the formation of the other sensory systems (Pavani and Bottari, 2012), including the tactile system. However, we found similar tactile sensitivity for individuals treated with hearing aids or cochlear implants and those without treatment, suggesting that these treatments do not compensate for the effect of early hearing loss on tactile sensitivity. Consequently, the timing of the start of therapy is of great importance: an effect would only be expected with early therapy in the phase of development of the sensory systems. However, as we did not assess the age at the beginning of therapy, our data cannot provide any information on this assumption.
If, per analogiam, early blindness hinders the development of the somatosensory system, we would expect lower sensitivity in people with early blindness, as the formation of sensory systems starts early in childhood. However, our data do not support this line of reasoning. In contrast, we find the reverse effect, in which late blindness is related to a deficit in tactile sensitivity, whereas no difference between early blindness and sighted controls was present. Thus, early visual loss does not seem to negatively influence the development of the tactile system. Nevertheless, it is possible that auditory information is more crucial for the development of touch than visual information. Therefore, missing auditory information might harm the development of tactile sensitivity, whereas missing visual information might not.

Fig. 4. Boxplots display tactile sensory thresholds across testing location (phalange of the index finger vs. back of the hand) for hearing controls (beige) and individuals with deafness (green). Results indicate significantly higher sensory thresholds and therefore lower tactile sensitivity in individuals with deafness compared to hearing controls. For both groups, thresholds were significantly higher at the back of the hand compared to the phalange of the index finger.
Previous research supports this idea, as more evidence indicates unchanged (Heller, 1989;Grant et al., 2000;Alary et al., 2009;Norman and Bartholomew, 2011;Pellegrino et al., 2020) or improved tactile sensation in people with visual loss (Alary et al., 2009;Wan et al., 2010) while results for people with deafness relate to decreased (Heming and Brown, 2005;Bolognini et al., 2012;Pellegrino et al., 2020), unchanged (Leva¨nen and Hamdorf, 2001;Bolognini et al., 2012;Heimler and Pavani, 2014;Pellegrino et al., 2020) or improved tactile sensation (Leva¨nen and Hamdorf, 2001;van Dijk et al., 2013;Moshourab et al., 2017) depending on the test and therefore the type of transported information. Thus, an impairment after hearing loss seems more likely than after visual loss, at least for some types of information transported by auditory stimuli. This in turn indicates that only certain information is relevant in the development of the tactile sense.
Notwithstanding, an influence of early visual loss on the development of the tactile sensory system must still be considered in terms of compensatory effects leading to similar tactile sensitivity in subjects with early blindness and sighted controls. As vision is often assumed to be the most critical sense for spatial information (Bremner et al., 2012), this spatial information must be received through the other sensory systems if visual information is missing. Individuals with blindness typically perceive near space via touch and far space through auditive spatial information via echo cues (Dufour et al., 2005; Kolarik et al., 2013a). However, as young children typically start by interacting with near space, missing visual spatial information early in life will probably be compensated by touch. The finding that children with visual deficits develop the ability to perceive space in an allocentric rather than in an egocentric frame of reference compared to sighted children (Martolini et al., 2020) indicates longer reliance on their haptic interaction with their near surroundings. This might support the sensory compensation hypothesis, as a greater extent of haptic interaction might induce plastic changes in the respective system. Hence, compensation via touch is plausible. As people with late blindness showed lower tactile sensitivity compared to sighted controls and subjects with early blindness, the timeframe for such compensatory developmental processes seems crucial. A higher usage of tactile information to compensate for missing near spatial information after vision loss at an older age could be too late to induce the relevant plastic changes in the tactile system, as most of the somatosensory development is completed. Likewise, sensory compensation in the tactile system regarding people with early deafness is unexpected, as visual information on near space is accessible in this group, such that the extent of haptic interaction at an early age should be comparable to hearing peers.
This potentially interesting avenue of research requires more empirical investigation.
It should be emphasized that our study focused on a non-complex, passive task to detect a certain type of stimulus. Therefore, lower detection ability for such tactile stimuli indicates deficits on a basic level of tactile perception and therefore supports the idea of deficits due to interactions in the development of the senses. However, several studies show that people with deafness may outperform hearing controls in more complex tasks. For instance, the ability to detect frequency changes in vibratory stimuli is higher in people with deafness (Levänen and Hamdorf, 2001). This finding might be coherent when considering the type of stimulus: the perception of vibration through the skin is more similar to hearing than a simple touch, as vibration and sound are both based on oscillatory patterns (Ruiz-Stovel et al., 2021). Therefore, individuals with deafness can gain access to information through vibration which they cannot perceive based on vision. For instance, people with deafness have the ability to discriminate between speakers of the same gender based on spectral cues in vibrotactile stimuli (Ammirante et al., 2013). Moreover, they can distinguish musical timbre (Russo et al., 2012) and even detect musical emotion through vibration (Schmitz et al., 2020). These findings could indicate different mechanisms for different kinds of stimuli, for example for the different types of information the respective stimuli transport. Consequently, if the basal mechanism is compensation, this type of information might be of higher importance than the modality of the stimulus, as the need for the perception of such missing information should induce plastic changes, whereas a change for the complete range of stimuli of one modality is unlikely and probably not necessary. For example, the auditory system is highly important for temporal processing (Papagno et al., 2016).
Therefore, stimuli from other modalities providing such information, like vibrotactile stimuli (Whitton et al., 2021), should be of higher importance for people with deafness than stimuli that do not. Moreover, auditory stimuli provide spatial information beyond the visual field, making them essential for processing the space behind a person (Voss et al., 2010). Accordingly, visual stimuli in the periphery should be more relevant for people with deafness than for healthy controls (Whitton et al., 2021), while visual stimuli without spatial information, such as those conveying information on light or contrast (Voss et al., 2010), should be of comparable importance for both groups. Likewise, as vision is crucial for spatial information processing, stimuli conveying corresponding information should be more relevant for people with blindness. For example, auditory stimuli presented in the horizontal plane indicate whether the noise source is to the left or right of the listener, and voices give information about a person's identity (Voss et al., 2010). Such stimuli can compensate, at least to some degree, for the missing visual input and are therefore of high importance. In contrast, the simple presentation of unnatural auditory stimuli like white noise conveys no relevant information, making it no more important for people with blindness to detect such sounds than for healthy individuals (Voss et al., 2010). Thus, future research should investigate sensory changes using stimuli from the same modality that convey either more or less relevant information with respect to the deprived modality.
Besides the approaches explaining sensory differences between sensory loss groups and healthy controls via compensatory mechanisms or deficits in the development of the senses, use-dependence of the respective part of the body is often mentioned as an explanation. According to this notion, a larger difference between sensitivity at the finger and at the handback would be expected for people with blindness compared to other individuals, due to possible training in Braille reading and, thus, more frequent use of the finger. However, our data showed no significant interaction between sensory status and stimulus location in the respective model, and we did not find a difference in thresholds between subjects with and without knowledge of Braille reading. This contradicts the assumption of use-dependency and rather points to the finger's high mechanoreceptor innervation as the determining factor, irrespective of the type of sensory loss. Regarding the included covariates, our results support previous studies showing a decline of sensitivity with age (Goldreich and Kanics, 2003;Moshourab et al., 2017). However, we did not replicate findings of higher sensitivity in females compared to males, which was previously reported for mechanical detection (Moshourab et al., 2017), a gradient orientation task (Goldreich and Kanics, 2003), detection of grooved surfaces (Goldreich and Kanics, 2006), and vibration (da Silva et al., 2014;Moshourab et al., 2017). However, some studies also indicated that such differences depend on stimulus properties. For instance, differences in thresholds vanish with higher stimulus force (Goldreich and Kanics, 2003;Goldreich and Kanics, 2006) or vibration frequency (Moshourab et al., 2017).
Other studies found no gender differences in two-point discrimination (Woodward, 1993), electrical stimulation (da Silva et al., 2014), and mechanical stimulation (da Silva et al., 2014), findings further supported by our study, in which thresholds did not significantly differ between women and men across the sensory groups. Future studies should evaluate whether differences in study design, e.g., the type and location of stimuli and therefore the types of activated cells, or sample characteristics like age and sensory status, explain the discrepancy in findings on gender differences in tactile sensitivity.
Two limitations of our sample of subjects with sensory loss must be considered when interpreting our results. First, age significantly differed between individuals with late blindness and the respective controls. Thus, differences in thresholds could be the result of an age effect. However, additional linear mixed models analyzing such an age effect on differences in tactile threshold estimation did not indicate any interactions with sensory status or stimulation location (see supplementary Tables S10 and S11). Still, an age-matched sighted group should be included in future studies. Secondly, we cannot draw any conclusions concerning changes in sensory thresholds after hearing loss in older individuals. A subdivision of our group of subjects with deafness into early and late sensory loss was not feasible: only 10 of our subjects lost their hearing after 2 years of age, and none were older than 10. Moreover, a cut-off for early versus late deafness is not as straightforward as for blindness. For example, Kim et al. (2021) referred to a hearing loss before the age of 2 years as prelingual, whereas a loss between the ages of 2 and 8 years was defined as postlingual; nevertheless, both are classified as early-onset hearing loss. Accordingly, future studies should include a group of subjects who can unambiguously be classified as individuals with late deafness.
In conclusion, our results suggest that both compensatory mechanisms and deficits in the development of touch play a role in tactile sensitivity after sensory loss. The higher tactile thresholds in subjects with deafness, irrespective of auditory loss treatment, speak in favor of a deficit in the development of simple touch perception after hearing loss. This effect might either be absent after visual loss or be reduced in people with blindness due to compensation effects. Such compensatory changes might not be present in people with deafness, as simple touch cannot compensate for missing auditory spatial information in far space and does not convey information, like vibration, that could compensate for other auditory cues.

DECLARATION OF COMPETING INTEREST
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

ACKNOWLEDGMENT
We thank the research assistants who helped in the data collection process and assisted subjects with blindness and deafness. We are grateful to the community of blind individuals in Poland, through which we were able to reach our target sample. Finally, we are indebted to the Polish Association of the Deaf and to the educational centers for people with deafness, which helped in subject recruitment and put us in contact with highly qualified sign language translators.

FUNDING
This work was supported by the Polish National Science Centre OPUS Grant (#2017/25/B/HS6/00561) awarded to AO.