RT Journal Article
SR Electronic
T1 Evidence for shared conceptual representations for sign and speech
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 623645
DO 10.1101/623645
A1 Samuel Evans
A1 Cathy Price
A1 Jörn Diedrichsen
A1 Eva Gutierrez-Sigut
A1 Mairéad MacSweeney
YR 2019
UL http://biorxiv.org/content/early/2019/05/02/623645.abstract
AB Do different languages evoke different conceptual representations? If so, greatest divergence might be expected between languages that differ most in structure, such as sign and speech. Unlike speech bilinguals, hearing sign-speech bilinguals use languages conveyed in different modalities. We used functional magnetic resonance imaging and representational similarity analysis (RSA) to quantify the similarity of semantic representations elicited by the same concepts presented in spoken British English and British Sign Language in hearing, early sign-speech bilinguals. We found shared representations for semantic categories in left posterior middle and inferior temporal cortex. Despite shared category representations, the same spoken words and signs did not elicit similar neural patterns. Thus, contrary to previous univariate activation-based analyses of speech and sign perception, we show that semantic representations evoked by speech and sign are only partially shared. This demonstrates the unique perspective that sign languages and RSA provide in understanding how language influences conceptual representation.
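For readers unfamiliar with RSA, the sketch below illustrates the core computation in schematic form only: a representational dissimilarity matrix (RDM) is built for each modality from condition-wise response patterns, and shared representational geometry is assessed by correlating the two RDMs. All variable names and data here are hypothetical; this is not the authors' analysis pipeline.

```python
# Minimal RSA sketch, assuming two (conditions x voxels) pattern matrices,
# one per language modality. Illustration only; data are randomly generated.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condition-by-condition dissimilarity (1 - Pearson r), as the vectorised upper triangle."""
    return pdist(patterns, metric="correlation")

rng = np.random.default_rng(0)
speech_patterns = rng.standard_normal((40, 200))  # e.g. 40 concepts x 200 voxels (spoken words)
sign_patterns = rng.standard_normal((40, 200))    # the same 40 concepts presented as signs

# Shared geometry is indexed by the rank correlation between the two RDMs.
rho, p = spearmanr(rdm(speech_patterns), rdm(sign_patterns))
print(f"RDM correlation (speech vs. sign): rho={rho:.3f}, p={p:.3f}")
```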