Luca Tommasi

a Department of Neuroscience and Imaging, 'G. d'Annunzio' University of Chieti-Pescara, Chieti, Italy
b Department of Psychological Science, Humanities and Territory, 'G. d'Annunzio' University of Chieti-Pescara, Chieti, Italy
c Department of Clinical and Experimental Medicine, Neuroscience and Cell Biology Section, Polytechnic University of Marche, Ancona, Italy
d Regional Epilepsy Centre, Neurological Clinic, "Ospedali Riuniti", Ancona, Italy
Highlights
• Chimeric and dichotic stimuli were presented to two split-brain patients (D.D.V. and A.P.).
• Stimuli conveyed happy/sad expressions and participants judged their emotion content.
• The total split-brain patient D.D.V. neglects visual stimuli in the left hemifield.
• A.P. shows a right-hemispheric dominance in unimodal analysis.
• The valence hypothesis is confirmed in bimodal conditions by A.P. and the control group.
Article info
Article history:
Received 2 June 2014
Received in revised form 16 June 2014
Accepted 18 June 2014
Available online 24 June 2014

Keywords: Audio-visual integration; Split-brain; Chimeric face; Dichotic stimulation; Hemispheric asymmetry
Abstract
Hemispheric asymmetries have been widely explored in both the visual and the auditory domain, but little is known about hemispheric asymmetries in audio–visual integration. We compared the performance of a partially callosotomized patient, a total split-brain patient and a control group during the evaluation of the emotional valence of chimeric faces and dichotic syllables (an emotional syllable in one ear and white noise in the other ear) presented unimodally (only faces or only syllables) or bimodally (faces and syllables presented simultaneously). Stimuli could convey happy and sad expressions and participants were asked to evaluate the emotional content of each presentation, using a 5-point Likert scale (from very sad to very happy). In unimodal presentations, the partially callosotomized patient's judgments depended on the emotional valence of the stimuli processed by the right hemisphere, whereas those of the total split-brain patient showed the opposite lateralization; in these conditions, the control group did not show asymmetries. Moreover, in bimodal presentations, results provided support for the valence hypothesis (i.e., left asymmetry for positive emotions and vice versa) in both the control group and the partially callosotomized patient, whereas the total split-brain patient showed a tendency to evaluate the emotional content of the right hemiface even when asked to focus on the acoustic modality. We conclude that partial and total hemispheric disconnections reveal opposite patterns of hemispheric asymmetry in auditory, visual and audio–visual emotion processing. These results are discussed in the light of the right-hemisphere hypothesis and the valence hypothesis.
© 2014 Elsevier B.V. All rights reserved.
1. Introduction
Studies on the cerebral lateralization of emotional processing have shown variable patterns of results, and the aspect of hemispheric asymmetry in this field still remains controversial [1].
∗ Corresponding author. Tel.: +398713554216; fax: +3987135542163. E-mail address: giulia.prete@unich.it (G. Prete).
According to the right-hemisphere hypothesis [2–4], a right-hemispheric dominance would underlie all aspects of emotional processing. On the other hand, according to the valence hypothesis [5–7], a right-hemispheric superiority would underlie the processing of negative emotions, whereas a left-hemispheric superiority would underlie the processing of positive emotions.
Hemispheric asymmetries in perception have been explored by means of several paradigms. In the auditory domain, the organization of cerebral auditory pathways makes it possible to evaluate the
http://dx.doi.org/10.1016/j.bbr.2014.06.034
0166-4328/© 2014 Elsevier B.V. All rights reserved.
contribution of each hemisphere by presenting different stimuli in the two ears (dichotic listening). Exploiting this fact, a number of studies have shown that the presentation of an acoustic stimulus in one ear leads to a higher activation in the contralateral hemisphere (e.g., right ear/left hemisphere) rather than in the ipsilateral hemisphere [8,9]. In particular, the dichotic listening paradigm reveals a right ear advantage (REA), indicating a left-hemispheric superiority in linguistic processing [10,11], which has also been shown in ecological contexts [12]. The REA seems to be attributable to the involvement of the corpus callosum (CC) and the auditory cortex, besides attentional factors (see [13] for a review). Moreover,
dichotic listening has been used to investigate the lateralization of emotional processing in the auditory verbal and nonverbal domains [14]. For example, Erhan and colleagues [15] asked participants to categorize the emotional prosody of dichotically presented nonsense syllables as having positive or negative intonation, and they found a left ear advantage (LEA), even if they underlined an opposite electrophysiological correlate, namely a higher amplitude of the N100 component in the left rather than in the right hemisphere. These authors proposed that such a pattern of event-related potentials reflects phonetic analysis but that emotional evaluation is processed in the right hemisphere, thus supporting the right-hemisphere hypothesis. Of note, however, Erhan and colleagues pointed out that the LEA was stronger for negative than for positive emotions ([15], see also [72]). On the other hand, Herrero and Hillix [16] found a generally lower recognition score for negative than positive emotional intonation sentences, and a significant interaction between the presentation ear and the emotional valence of stimuli, with a specific lowering of scores for the negative emotional sentences presented in the right ear. Schepman et al. [17] have recently confirmed this pattern of results, providing further support for the valence hypothesis.
Other controversial results were also obtained in EEG and neuroimaging studies in which paradigms other than dichotic listening were used. Recording event-related potentials and skin conductance, Meyers and Smith [18] did not find asymmetry in nonverbal affective stimuli processing. Nevertheless, in a functional magnetic resonance imaging study, Wildgruber and colleagues [19] found a stronger activation of the right hemisphere for acoustic stimuli with both positive and negative valence, thus supporting the right-hemisphere hypothesis.
Cerebral lateralization has also been largely studied in the domain of visual perception. Specifically, a paradigm used to investigate cerebral asymmetries in face processing is that of chimeric faces [2], which are created by juxtaposing the left and right halves of two distinct faces. Studies that made use of chimeric faces showed that the right hemisphere is more specialized for face processing than the left hemisphere [20]. Also in this context, however, contrasting results have been obtained as regards emotion processing. Presenting chimeric emotional faces, Drebing and colleagues [21] provided support for the right-hemisphere hypothesis. Killgore and Yurgelun-Todd [22], on the basis of an fMRI study, concluded that the valence and the right-hemisphere hypotheses are not mutually exclusive, but should be considered as different possibilities to understand facial emotion processing. A similarly unclear pattern of results has been achieved by exploiting bilateral presentations of two emotional faces or unilateral presentations of one face, providing evidence in support of both the right-hemisphere hypothesis [23,24] and the valence hypothesis [25–27].
With regard to audio–visual integration, an important question is whether the fusion of multisensory inputs leads to a combination of the processing activities involved by each of the two sensory modalities considered in isolation, or whether the information presented in one sensory modality predominates over that presented in the other modality. Of note, a number of studies showed that multisensory integration is automatic and unintentional, although other studies suggested a different point of view (see [28] for a review). Collignon and colleagues [29] asked participants to classify vocal and visual stimuli according to their emotion content, finding that categorization improved when vocal and visual expressions were congruent and that, in the incongruent conditions, participants based their evaluations mainly on the visual component (visual capture). On the other hand, Petrini et al. [30] found the opposite result, with the acoustic information prevailing during the evaluation of incongruent audio–visual emotional stimuli (auditory capture). Thus, it could be concluded that there is a reciprocal influence during auditory and visual processing of emotional information depending on congruence [31], but the prioritization of one sensory modality over the other does not seem to follow a clear rule when the two modalities convey conflicting contents.
Many studies focused on the cerebral correlates of audio–visual integration and showed that different cerebral areas are implicated in this process: for example, Kreifelts and colleagues [32] found that the behavioral improvement during the simultaneous presentation of emotionally congruent bimodal stimuli correlated with an increased activation in the bilateral posterior superior temporal gyrus ([32]; see also [33], for a review) and in the right thalamus; other studies highlighted a left-hemispheric activation during bimodal processing, in particular in the middle temporal gyrus [34,35] and in the posterior superior temporal sulcus [36]. Jeong and colleagues [37] found that emotionally congruent audio–visual presentations enhanced the activity in auditory areas, whereas emotionally incongruent presentations enhanced the activity in face recognition areas, such as the fusiform gyrus. Ethofer and colleagues [38] have recently shown the importance of the orbitofrontal cortex in the mechanism of habituation to facial expressions and prosody, which highlights the role of this region as an important component of a system for emotional processing of faces and voices and as a neural interface among sensory areas. Finally, electrophysiological data demonstrated an early audio–visual crosstalk following stimulus presentation [39,34].
Another source of information about audio–visual neural correlates comes from studies on neurological patients. Hayakawa and colleagues [40] showed that a patient with amygdala and hippocampus lesions could not recognize fear from facial expressions, but was able to recognize it from prosodic and verbal stimuli, whereas a patient with a lesion extending beyond the amygdala could not recognize fear expressions from prosodic and verbal stimuli, even though he was able to recognize them from facial stimuli [40].
Although audio–visual integration has been largely investigated, hemispheric lateralization in this field remains largely unexplored, even more so in the case of multisensory analysis in emotion perception. One of the possible sources of evidence consists in studying split-brain patients [41,42], individuals who have undergone the surgical resection of the corpus callosum (CC) as an extreme attempt to prevent the spread of epileptic seizures between the two hemispheres [43]. Importantly, comparisons among the performance of patients who have undergone different degrees of hemispheric disconnection (total or partial and, in the latter case, in different regions of the CC) provide us with a rare chance to study hemispheric competences [44]. Studies with callosotomized patients confirmed the right-hemispheric superiority in face processing [45], as well as the left-hemispheric superiority (REA) in dichotic listening ([46]; see [47], for a review). However, the hemispheric superiority in emotion analysis remains controversial even as regards split-brain patients' results: Stone et al. [48] showed that both disconnected hemispheres could match facial expressions to emotion words equally well, but the left
Fig. 1. Midsagittal MRI of the patients. Left panel: A.P.'s brain, showing the anterior section of the corpus callosum, which spares the splenium. Right panel: D.D.V.'s brain, showing the complete absence of callosal fibers.
et al. [49], based on the results of a split-brain patient who was capable of discriminating emotional from neutral stimuli presented in both the LVF and the RVF, concluded that emotional processing could take place in either disconnected hemisphere, but only the right hemisphere produced an autonomic response to emotional stimuli [49]. A recent study, in which a total split-brain patient and an anterior callosotomized patient were tested, provided support for the right-hemisphere hypothesis in the case of bilateral presentation of faces with a hidden emotional content [24]. As reviewed by Gainotti [50], the right-hemispheric superiority for subliminally presented emotional content could be explained by a subcortical asymmetry in emotion detection.
The aim of the present study was to investigate hemispheric asymmetries in audio–visual integration and the way in which lateralized visual and acoustic emotional stimuli are processed by the two hemispheres. According to previous evidence on emotional analysis in the visual and acoustic modalities, one might expect that either the right-hemisphere hypothesis or the valence hypothesis is applicable also to the audio–visual domain.
First of all, we expected a moderate hemispheric asymmetry in emotion evaluation by healthy participants. Second, we expected a more extreme hemispheric lateralization pattern in a total split-brain patient regardless of the nature of stimuli, due to the complete absence of reciprocal callosal connections between the hemispheres. Finally, we expected that in a patient with an anterior callosal section, lateralization would appear weaker compared to that of the total split-brain patient, due to intact connections between posterior cortical areas, the posterior part of the CC being spared, but stronger compared to that of healthy subjects, because of the complete segregation of the anterior areas. However, we could not predict the direction of any lateralization, because of the mixed results available in the literature in both the visual and acoustic domains, and the absence of clear evidence of cerebral asymmetry in multimodal integration of emotional stimuli.
2. Materials and methods
2.1. Participants
A patient with an anterior section of the CC, a patient with a total section of the CC, and 16 healthy participants took part in the experiment.
A.P. is a 47-year-old man, who underwent the surgical resection of a large anterior portion of the CC to reduce the spread of epileptic seizures. The surgery was carried out in 1993 and it left intact only the splenium (see Fig. 1a). A.P.'s postoperative IQ was 87, as measured by means of the Wechsler adult intelligence scale (WAIS), and his laterality quotient was +10, according to the Edinburgh Handedness Inventory [51].
D.D.V. is a 48-year-old man, who, in order to prevent the spread of epileptic foci, underwent the complete resection of the corpus callosum in different stages, the last of which occurred in 1994. The surgical resection left intact the anterior commissure (see Fig. 1b). D.D.V.'s postoperative IQ was 81, as assessed by the WAIS, and his laterality quotient was +100, according to the Edinburgh Handedness Inventory [51].
Both patients had no visual impairments or psychiatric symptoms (further details of the patients' status are provided by [52]). They were tested at the Epilepsy Center of the Polytechnic University of Marche (Torrette, Ancona), during a pause between routine neurological examinations.
The control group was composed of 16 male individuals (age: M = 24.44; SD = 1.35). According to self-report, all were right-handed, had normal or corrected-to-normal vision, and none of them had neurological or psychiatric histories. These participants were tested as volunteers at the Psychobiology Laboratory of the University of Chieti.
Informed consent was obtained from all participants prior to the beginning of the task and the experimental procedures were approved by the local ethical committee.
2.2. Stimuli
We used chimeric faces as visual stimuli and dichotic presentations as acoustic stimuli.

Facial stimuli were created from photographs contained in the Karolinska directed emotional faces [53], a database of female and male faces with both neutral and emotional expressions. Photographs of a female face in a happy pose (HA) and in a sad pose (SA) were selected and converted into gray-scale images, measuring 5.2° × 6.6° of visual angle (198 × 252 pixels) at a distance of about 72 cm. From these images, hemiface components were created: the two photographs were centrally divided into two halves, in order to obtain one left sad hemiface, one right sad hemiface, one left happy hemiface, and one right happy hemiface. Chimeric faces were thus created.
Fig. 2. An example of a chimeric face (HA/SA).
The final set of visual stimuli was composed of 4 chimeric faces: HA left hemiface/HA right hemiface, HA left hemiface/SA right hemiface, SA left hemiface/HA right hemiface, and SA left hemiface/SA right hemiface. In order to reduce the sense of strangeness possibly due to chimeric stimuli, the two half faces were separated by a thin white stripe (width: .02° of visual angle; see Fig. 2).
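The construction just described (the left half of one face joined to the right half of another, separated by a thin white stripe) can be sketched in a few lines of array code. This is an illustrative reconstruction, not the authors' stimulus-preparation script; the function name and the use of NumPy arrays as stand-ins for the gray-scale photographs are assumptions.

```python
import numpy as np

def make_chimeric(left_source, right_source, stripe_px=1, white=255):
    """Compose a chimeric face: the left half of one face and the right half
    of another, separated by a thin white stripe. Both inputs are 2-D
    grayscale arrays of identical shape (e.g. 252 rows x 198 columns)."""
    assert left_source.shape == right_source.shape
    h, w = left_source.shape
    half = w // 2
    chimera = np.empty((h, w + stripe_px), dtype=left_source.dtype)
    chimera[:, :half] = left_source[:, :half]               # left hemiface
    chimera[:, half:half + stripe_px] = white               # white stripe
    chimera[:, half + stripe_px:] = right_source[:, half:]  # right hemiface
    return chimera
```

Calling `make_chimeric(happy_img, sad_img)` yields the HA/SA stimulus, and the four combinations of the two source images give the full stimulus set.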
Acoustic stimuli were created by recording consonant–vowel syllables pronounced by a female actress with different emotional accents. The syllable /ba/ was recorded with two different emotional intonations: happy (HA) and sad (SA). Stimuli were then handled with GoldWave Software v5.25 (GoldWave Inc., St. John's, NL, Canada). They were digitized (16 bit, sampling rate 44,100 Hz) and presented in a dichotic paradigm, in which the emotional syllable was presented in one ear and white noise (WN) was simultaneously presented in the other ear. Acoustic stimuli lasted 350 ms, including 50 ms of linear fade-in at the beginning of each stimulus and 50 ms of linear fade-out at the end of each stimulus. Dichotic stimuli were delivered by means of headphones (Philips SHP5400): an emotional syllable and white noise were presented in opposite ears, with a common onset and a common duration of 350 ms. The final set of acoustic presentations was composed of 4 dichotic stimuli: HA syllable in the left ear/WN in the right ear, WN in the left ear/HA syllable in the right ear, SA syllable in the left ear/WN in the right ear, and WN in the left ear/SA syllable in the right ear. We chose to avoid the simultaneous presentation of different emotional syllables in the two ears in order to keep the complexity of conflicting information between visual and auditory input as manageable as possible for subsequent statistical analyses.
We presented happy and sad emotional expressions, on the basis of previous studies indicating that these two emotions are evaluated as highly opposite in valence (positive and negative, respectively), and are thus easy to differentiate from each other [54]. We chose to use the 350-ms presentation time to allow patients to effectively process all stimuli even in the case of bimodal (possibly conflicting) conditions, in which the cognitive load could be high. We also preferred to use a relatively long presentation time to make sure that bimodal integration occurred, given that some electrophysiological data demonstrated that audio–visual integration may take place only at about 180 ms after stimulus onset [39]. On the other hand, there is evidence of lateralized effects obtained with long presentation times in both the acoustic (e.g., [55]) and the visual domain (e.g., [56,57]).
2.3. Procedure
Participants were administered four different sessions in which they were asked to evaluate the emotional valence of stimuli on the computer keyboard, using a 5-point Likert scale: from 1 = very sad, to 5 = very happy (1 = very sad; 2 = sad; 3 = neutral; 4 = happy; 5 = very happy).

In the first session, only acoustic stimuli were presented. The 4 dichotic stimuli were presented in random order 5 times each, constituting a set of 20 acoustic stimuli. A white screen lasting 500 ms was followed by the presentation of a central fixation cross for 850 ms. After a first interval of 500 ms in which only the cross was shown, the dichotic stimulus was presented for a duration of 350 ms, so that the end of the acoustic input corresponded to the end of the cross presentation. Then, the screen went blank until the response was given, and the next trial started (see Fig. 3a).
In the second session, only visual stimuli were presented. The 4 chimeric faces were presented in random order 5 times each, constituting a set of 20 visual stimuli. A white screen lasting 500 ms was followed by the presentation of a black fixation cross in the center of the screen for 500 ms. The stimulus was then presented in the center of the screen for a duration of 350 ms. Then, the screen went blank until the response was given by the participant, and the next trial started (see Fig. 3b).
In the last two sessions, acoustic and visual stimuli were presented together. Each of these sessions was constituted by 80 stimuli, including all 16 possible combinations of the 4 chimeric faces and the 4 dichotic presentations, each combination being repeated 5 times in a random order. These two sessions were similar to each other: a white screen was presented for 500 ms, followed by the presentation of a black fixation cross in the center of the screen for 500 ms. The disappearance of the cross coincided with the simultaneous onset of both the chimeric face in the center of the screen and the dichotic stimulus in the earphones, both lasting 350 ms. Then, the screen went blank until the response was given, and the next trial started (see Fig. 3c). In the first of these two sessions, participants were asked to evaluate the emotional valence of the visual stimuli, ignoring the acoustic input, whereas, in the last session, participants were asked to evaluate the emotional valence of the acoustic stimuli, ignoring the visual input.
In order to specify which modality had to be attended to, instructions were presented on the computer screen at the beginning of each session. The instructions also specified that gaze had to be maintained at the center of the screen for the whole testing time. At the end of a session, participants were allowed to take a short break before starting the next one.
The two patients gave their responses using the right hand. As regards the control group, half of the participants used the right hand and the other half used the left hand. The whole task was controlled by E-Prime software (Psychology Software Tools, Inc., Pittsburgh, PA) and lasted about 30 min.
3. Results
Statistical analyses were carried out using the software Statistica 8.0.550 (StatSoft Inc., Tulsa, USA). As regards control group scores, the different conditions in each of the 4 sessions were compared by means of different analyses of variance (ANOVAs), using the emotion judgment as the dependent variable. Preliminary ANOVAs included response hand as a
Fig. 3. Schematic representation of a trial in session 1: dichotic stimuli (top panel, on the left); in session 2: chimeric stimuli (top panel, on the right); and in sessions 3 and 4: bimodal presentations, with both dichotic and chimeric stimuli (bottom panel).
between-subjects factor, but neither the main effect of this factor nor its interactions were significant. Thus, we carried out three repeated-measures ANOVAs (one for each unimodal session and one for the two bimodal sessions). Post-hoc comparisons were carried out by means of the Bonferroni test and corrected p-values are reported.
In the auditory session, the ANOVA was carried out using emotional tone (happy and sad) and presentation ear (left and right) as within-subject factors. Only the main effect of emotional tone was significant (F(1,15) = 53.06, p < .001, ηp² = .78), showing that happy syllables were evaluated as happier than sad syllables (happy: M = 3.94 ± .14; sad: M = 2.26 ± .10).
In the visual session, the ANOVA was carried out using face as a within-subject factor (happy/happy, sad/sad, happy/sad and sad/happy). This main effect was significant (F(3,45) = 157.49, p < .001, ηp² = .913): post-hoc comparisons showed that the happy/happy face was evaluated as happier than all of the other stimuli and that the sad/sad face was evaluated as sadder than all of the other stimuli (p < .001 for all comparisons; HA/HA: M = 4.74 ± .07; SA/SA: M = 1.69 ± .12; HA/SA: M = 2.90 ± .12; SA/HA: M = 2.72 ± .14).
As regards the two bimodal sessions, the ANOVA was carried out using target modality (visual and acoustic), face (happy/happy, sad/sad, happy/sad and sad/happy), emotional tone (happy and sad) and presentation ear (left and right) as within-subject factors. The main effect of target modality approached statistical significance (F(1,15) = 4.27, p = .056), visual stimuli being evaluated slightly more positively (M = 3.16 ± .07) than auditory stimuli (M = 3.03 ± .08). The main effect of face was significant (F(3,45) = 36.05, p < .001, ηp² = .71): post-hoc comparisons showed that the happy/happy face was evaluated as happier than all of the other stimuli and that the sad/sad face was evaluated as sadder than all of the other stimuli (p < .001 for all comparisons; HA/HA: M = 3.96 ± .1; SA/SA: M = 2.38 ± .1; HA/SA: M = 3.02 ± .09; SA/HA: M = 3.02 ± .08). The main effect of emotional tone (F(1,15) = 49.23, p < .001, ηp² = .77) showed that happy syllables were evaluated as happier than sad syllables (HA: M = 3.72 ± .07; SA: M = 2.47 ± .06). The main effect of presentation ear was not significant (F(1,15) = .55, p = .47; left: M = 3.09 ± .07; right: M = 3.11 ± .07).
As regards the significant interaction between target modality and face (F(3,45) = 20.23, p < .001, ηp² = .57), post-hoc comparisons showed that when visual judgments were required, the happy/happy face received a higher evaluation and the sad/sad face received a lower evaluation compared to all of the other conditions (p < .001 for all comparisons), whereas when auditory judgments were required, the happy/happy face was associated with higher evaluations than the sad/sad face (p < .001) and the sad/happy face (p = .044).
The interaction between target modality and presentation ear was significant (F(1,15) = 6.34, p = .024, ηp² = .30) and post-hoc comparisons showed that visual judgments were higher than auditory judgments when syllables were presented both to the left ear (visual modality: M = 3.17 ± .10; acoustic modality: M = 3.00 ± .11; p = .04) and to the right ear (visual modality: M = 3.15 ± .11; acoustic modality: M = 3.06 ± .11; p = .002). Moreover, in the acoustic modality, syllables presented to the right ear received higher evaluations than those presented to the left ear (p = .036), whereas evaluations did not differ according to auditory presentation side when the target modality was visual.
The three-way interaction among face, emotional tone and presentation ear (F(3,45) = 3.41, p = .025, ηp² = .18) was significant. To better clarify this interaction, two different ANOVAs were computed, by splitting the data according to presentation ear and using face and emotional tone as within-subject factors. As regards the right-ear presentation condition, a significant interaction between face and emotional tone (F(3,45) = 4.60, p = .007, ηp² = .23) showed that when the sad syllable was presented, the sad/happy face was judged as happier than the happy/sad face (p = .047). The interaction was not significant for the left-ear presentation condition.
In order to evaluate the effects of bimodal presentations, mean scores obtained in both the visual and the auditory unimodal sessions were compared with those recorded in the corresponding bimodal
Fig. 4. 95% confidence intervals of the control group (lines) and patients' means (A.P.: cross; D.D.V.: triangle) in the first session: 4 dichotic stimuli.
sessions, using a series of Bonferroni-corrected t-tests. In particular, scores recorded in the bimodal session in which participants were asked to judge the emotional valence of the visual stimuli (session 3) were compared with scores recorded in the visual session (session 2). The presentation of a happy tone to the right ear (WN/HA) led to higher evaluations of the sad/happy chimeric face (t(15) = −3.724, p = .032). In the same way, scores recorded in the bimodal session in which participants were asked to judge the emotional valence of acoustic stimuli (session 4) were compared with the scores recorded in the auditory session (session 1). The presentation of the sad/sad face led to a lower evaluation of all acoustic stimuli (HA/WN: t(15) = 5.021, p = .002; WN/HA: t(15) = 3.764, p = .030; SA/WN: t(15) = 3.68, p = .036; WN/SA: t(15) = 5.51, p < .001), whereas the presentation of the happy/happy face led to higher evaluations of the sad syllable presented to both the left (SA/WN: t(15) = −4.69, p = .005) and the right ear (WN/SA: t(15) = −5.397, p = .001).
In order to compare the performance of the patients and the control group in each session, the patients' means were compared with the 95% confidence intervals of the control group scores.
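The comparison logic can be sketched as follows, assuming a conventional t-based 95% confidence interval around the control-group mean for each condition (the paper does not specify how the intervals were computed, so this is an illustration, not the authors' analysis code; the function names are hypothetical).

```python
import statistics

T_CRIT_DF15 = 2.131  # two-tailed t critical value, alpha = .05, df = 16 - 1

def ci95(scores):
    """95% confidence interval of the control-group mean for one condition."""
    m = statistics.mean(scores)
    sem = statistics.stdev(scores) / len(scores) ** 0.5
    return m - T_CRIT_DF15 * sem, m + T_CRIT_DF15 * sem

def outside_ci(patient_mean, control_scores):
    """True if a patient's condition mean falls outside the control interval."""
    lo, hi = ci95(control_scores)
    return patient_mean < lo or patient_mean > hi
```

A patient's mean in a given condition is then flagged as above or below the cut-off whenever `outside_ci` returns True for that condition's 16 control scores.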
In session 1 (dichotic syllables), A.P.’smean wasabove the
cut-offofthe95%confidenceintervalsintheHA/WNcondition,
whereasD.D.V.’smeanswereaboveofthe95%confidenceintervals
inallconditions(seeFig.4).
In session 2 (chimeric faces), A.P.'s means were above the cut-off of the 95% confidence intervals in the HA/HA and HA/SA conditions, and they were below in the SA/SA and SA/HA conditions. In this session, D.D.V.'s means were above the cut-off of the 95% confidence intervals in the HA/HA and SA/HA conditions, and they were below in the SA/SA and HA/SA conditions (see Fig. 5).
In session 3 (bimodal presentation with visual evaluation), A.P.'s means were above the cut-off of the 95% confidence intervals in all conditions in which HA/HA and SA/SA faces were presented, regardless of dichotic stimulation, and when HA/SA and SA/HA faces were presented together with SA/WN and WN/SA dichotic stimuli. In this session, D.D.V.'s means were above the cut-off of the 95% confidence intervals when the HA/HA face was presented together with WN/HA, SA/WN and WN/SA stimuli and when the SA/HA face was presented together with HA/WN, SA/WN and WN/SA stimuli; they were below the cut-off of the 95% confidence intervals when the HA/HA face was presented together with the HA/WN dichotic stimulus and in all the conditions in which SA/SA and HA/SA faces were presented, regardless of dichotic stimulation (see Fig. 6).
Fig. 5. 95% confidence intervals of the control group (lines) and patients' means (A.P.: cross; D.D.V.: triangle) in the second session: 4 chimeric faces.
In session 4 (bimodal presentation with auditory evaluation), A.P.'s means were above the cut-off of the 95% confidence intervals when the HA/WN dichotic stimulus was presented together with SA/SA and HA/SA faces, when the WN/SA dichotic stimulus was presented together with SA/SA and SA/HA faces, and in all conditions in which WN/HA and SA/WN stimuli were presented together with any chimeric face. In this session, D.D.V.'s means were above the cut-off of the 95% confidence intervals when the HA/WN stimulus was presented together with the HA/HA face, when the WN/HA stimulus was presented together with HA/HA and SA/HA faces, when the SA/WN dichotic stimulus was presented together with the SA/HA face, and when the WN/SA stimulus was presented together with HA/HA, HA/SA and SA/HA faces; they were below when the HA/WN stimulus was presented together with both SA/SA and HA/SA faces, when the WN/HA stimulus was presented together with the HA/SA face, and when the SA/WN stimulus was presented together with the SA/SA face (see Fig. 7).
4. Discussion
The performances of the two patients were very different from one another. In general, the total split-brain patient (D.D.V.) showed a rightward bias; moreover, his evaluations in bimodal presentations were mainly based on visual rather than acoustic stimuli, even when he was explicitly required to judge the auditory input. On the other hand, the results of the partially callosotomized patient (A.P.) largely resembled those of the control group and he did not show difficulties in paying attention to either visual or acoustic stimuli, when required. Differently from the control group, however, A.P. showed a trend for a leftward bias in the emotional evaluation of both acoustic and visual stimuli presented in isolation. This pattern of results confirms a right-hemispheric superiority in emotion detection, already known in the visual domain (see [24] for similar lateralization results in the same patient), and adds new evidence in the auditory domain.
As regards the results of D.D.V., in the auditory session they seem to be in line with the valence hypothesis, given that he attributed a higher score to happy syllables presented to the right than to the left ear, and at the same time he attributed a lower score to sad syllables presented to the left than to the right ear [5–7]. However, this pattern was not evident in the other conditions. In particular, with visual presentation, D.D.V. seemed totally incapable of detecting emotions expressed by the left hemiface: his judgments were based solely on the expression of the right hemiface. At first glance, this suggests a left-hemispheric
Fig. 6. 95% confidence intervals of the control group (lines) and patients' means (A.P.: cross; D.D.V.: triangle) in the third session, in which chimeric faces and dichotic stimuli were presented together and participants were asked to evaluate the visual stimuli: 16 bimodal conditions.
dominance, which could be ascribed either to a hemispheric asymmetry in emotional face processing or to a left-hemispheric dominance due to the use of the right hand for responding. However, another explanation could be found in the literature on split-brain patients. In fact, several studies carried out with split-brain patients showed – only following total resection – a sort of attentional deficit of the right hemisphere [58,59,24]. In the present study, the use of the sole right hand in providing responses makes it difficult to conclude whether D.D.V. showed a spatial neglect or not; however, testing the same split-brain patient (D.D.V.), Hausmann and colleagues [58] found a rightward bias when D.D.V. carried out a line bisection task using either the left or the right hand; Corballis and colleagues [59,60] described the same kind of bias in D.D.V., leading the authors to conclude that D.D.V. shows a sort of spatial hemineglect. A crucial point is that the results of D.D.V. are in contrast with those of other patients with partial (either anterior or posterior) callosotomy, who do not show hemineglect. Corballis and colleagues [59] proposed that the spatial neglect of D.D.V., caused by the complete
Fig. 7. 95% confidence intervals of the control group (lines) and patients' means (A.P.: cross; D.D.V.: triangle) in the fourth session, in which chimeric faces and dichotic stimuli were presented together and participants were asked to evaluate the auditory stimuli: 16 bimodal conditions.
absence of callosal fibers, is due to a failure to shift attention from the right hemifield to the left hemifield, and they suggested the possibility of testing the same patient without the use of the central fixation cross, in order to avoid the need to shift attention from the center to the side of the screen. In the present study this problem was in some way sidestepped, by showing the chimeric faces in the center of the screen. Nevertheless, the present results seem to confirm the presence of hemineglect in patient D.D.V. Moreover, we added new data by testing the patient with acoustic stimuli: from the present results we can conclude that the spatial neglect of D.D.V. does not extend to the auditory domain, differently from patients who suffer from hemineglect after a unilateral cerebral lesion [61], but seems to be confined to the visual domain. In the auditory domain, a consonant–vowel dichotic paradigm carried out with 4 split-brain patients [62] showed that the REA was equivalent for right- and left-hand responding. Overall, an explanation of our results as due to the use of the right hand would not be valid in the auditory session, in which D.D.V. did not show the rightward bias, even though he used the right hand only.
During the bimodal presentation, acoustic stimuli were substantially ignored by D.D.V., who based his responses on the right hemiface: given that D.D.V. could successfully carry out the dichotic task, he disregarded acoustic stimuli only when they were presented together with visual stimuli. In other words, the dichotic presentation alone highlighted the validity of the valence hypothesis, with opposite-valence emotions being processed better by the two hemispheres, whereas the simultaneous audio–visual presentation showed a supremacy of the visual stimuli overshadowing the acoustic stimuli (visual capture). This was of course true when D.D.V. was required to evaluate visual stimuli, but, importantly, also when required to evaluate acoustic stimuli.
This set of results shows a multifaceted picture. Whereas Corballis and colleagues [59] concluded that the attentional deficit of D.D.V. is attributable to a deficit in shifting attention, an ability involving the ventral attentional system, and that it could disappear if D.D.V. can base his responses on the dorsal attentional system, our results seem to support another perspective: the main deficit of the patient does not seem to reside in attentional shifting, but in his attentional bias toward the right visual field. In fact, chimeric faces were presented centrally, thus requiring no gaze orienting or attentional shift, but nonetheless D.D.V. showed a rightward bias. Moreover, among the perceptual modalities we tested, the visual modality, but not the auditory modality, revealed a spatial hemineglect. In conclusion, the results of the total split-brain patient provided support for the valence hypothesis in the auditory domain, showed a hemineglect in the visual domain, and revealed a strong visual capture, thus confirming the hemineglect, in audio–visual processing.
The results of the anterior split-brain patient, A.P., were different from those of D.D.V. in all sessions. In dichotic presentations, A.P.'s evaluations seem to support the right hemisphere hypothesis: happy and sad syllables were evaluated as happier and sadder, respectively, when presented in the left ear (right hemisphere) than in the right ear (left hemisphere), supporting the right-hemispheric dominance in emotional processing of both positive and negative expressions. The results in the visual modality also provide support for the right hemisphere hypothesis: the happy/sad chimeric face was evaluated as happier by A.P. than by the control group, and the sad/happy chimeric face was evaluated as sadder by A.P. than by the control group. This pattern shows that in the partial split-brain patient, the right-hemispheric dominance for both positive and negative emotions might be even more evident than in the control group. In audio–visual presentations, A.P. did not show difficulties in evaluating either visual or acoustic stimuli. However, in the bimodal session in which visual evaluations were required, A.P.'s results did not support the right hemisphere hypothesis, but they seemed to be in line with the valence hypothesis: the evaluations of the chimeric faces were lower when the sad syllable was presented to the left ear (right hemisphere) and higher when the happy syllable was presented to the right ear (left hemisphere). In the bimodal condition in which auditory evaluations were required, the results of A.P. showed that when the emotional syllables, regardless of their valence, were presented to the left ear (right hemisphere), they led to a greater weight of the (hemi)facial expression shown to the left (thus, to the right hemisphere), with lower scores for the chimeric sad/happy than happy/sad face; when the emotional syllables were presented to the right ear (left hemisphere), they led to a greater weight of the (hemi)facial expression shown to the right (thus, to the left hemisphere), with lower scores for the chimeric happy/sad than sad/happy face. These results suggest that when auditory evaluations were required, the hemisphere contralateral to the ear presented with the syllable was more activated, which led to a greater salience of the contralateral hemiface. To conclude, the results of the anterior callosotomized patient provide support for the right hemisphere hypothesis only with unimodal stimulation. It is important to note that the difference between D.D.V. and A.P. is the presence of the splenium in A.P., who however lacks all the other portions of the CC. By means of diffusion-weighted tensor magnetic resonance imaging and probabilistic tractography, Beer et al. [63] have recently shown the presence of direct connections between visual and auditory cortices within each hemisphere, and that white matter projections from both visual and auditory areas cross hemispheres in the posterior portion of the CC, with auditory and visual information passing through the superior and inferior splenium, respectively. This anatomical evidence lends support to the finding that the performance of A.P. was very similar to that of the control group, because in both cases audio–visual information may be exchanged between hemispheres through the fibers of the splenium. However, some difference exists between A.P. and the control group, suggesting that other callosal regions are implicated in this task.
The control group did not show hemispheric asymmetries in unimodal presentations. As reported above, we expected a pattern of lateralization in favor of either the valence or the right hemisphere hypothesis. The absence of evidence in any direction in visual and auditory sessions could be attributed to the long presentation time used. As we specified above, we chose longer presentation times compared to the classical range of tachistoscopic presentation (around 150 ms) to ensure stimulus processing by the patients. However, a sign of hemispheric lateralization is evident in the interaction between target modality and presentation ear: in fact, when an auditory judgment was required, syllables presented to the right ear received higher evaluations than syllables presented to the left ear. This result could be interpreted in the direction of the valence hypothesis: when the right hemisphere is activated by the presentation of a syllable, auditory judgments decrease, possibly because of the right-hemispheric sensitivity to negative emotions. Similarly, when auditory evaluations were required, the happy/happy face was judged as happier than the sad/happy face, but not than the happy/sad face. Again, this result could suggest that the right hemisphere is more sensitive than the left in detecting the sad hemiface. On the other hand, direct comparisons between unimodal and bimodal sessions point in the same direction: the sad/happy chimeric face was judged as happier when it was presented together with the happy syllable to the right ear (left hemisphere), whereas no differences were recorded when the same syllable was presented to the left ear (right hemisphere): in accordance with the valence hypothesis, the higher activation of the left hemisphere, more involved in positive emotion processing, could have led to judging the face as happier. The absence of other evidence in support of the valence hypothesis in healthy […]
[…] explain the patients' more asymmetric performances. In fact, in these conditions A.P. based his ratings on the leftward stimuli. Probably, in the case of unimodal stimulation, after the bilateral exchange of perceptual analysis, which can occur through the intact splenium of the patient, the output of each hemisphere could be projected toward the frontal areas (by intrahemispheric fibers), where the absence of anterior interhemispheric connections prevents the balance between hemispheres, leading to the right-hemispheric superiority in emotional processing. This possibility is in accordance with the 'modified valence hypothesis' (MVH, [64,65]): the classical hemispheric superiority in valence-specific emotional analysis (valence hypothesis) would depend on the prefrontal specialization of each hemisphere for positive or negative emotional content, while the posterior regions would show the right-hemispheric superiority in all emotional processing. Although the MVH has not received great consideration, recent studies support its validity. For example, Stefanics et al. [66] have recently found support for the MVH, recording ERPs during the processing of unattended facial emotions, with early components supporting the right hemisphere hypothesis, and late lateralized components supporting the valence hypothesis. Considering that A.P.'s frontal lobes were disconnected, the MVH could explain his performance.
However, in the unimodal acoustic condition, the performance of D.D.V. provided support for the valence hypothesis. From this pattern of results, we can suppose that in the case of total interhemispheric disconnection, each hemisphere shows its superiority in a specific emotional domain. This pattern of results could also be viewed in accordance with the MVH: the complete callosal section in D.D.V. does not allow an exchange between the hemispheres, thus the relative dominance of each hemisphere could emerge. From this perspective, the frontal areas of D.D.V. receive a 'specialized' output from the posterior areas in which a valence-specific processing was carried out. Thus, as regards the two main hypotheses on lateralized emotion processing, we can speculate that they are both true, with the right hemisphere hypothesis supported in the case of partial hemispheric connections (A.P.) and the valence hypothesis supported in the case of hemispheric independence (D.D.V.). Moreover, in bimodal sessions, the increased cognitive load, due to the simultaneous stimulation of both hemispheres, reduces the right-hemispheric superiority, supporting in any case the valence hypothesis, except for the performance of the split-brain patient, which we attribute to an attentional bias rather than to an asymmetric emotional processing. Finally, it must be considered that the mean age of the control group is lower than that of the patients. However, it is well known that strong differences exist in callosal morphometry concerning both sex and age [67]. The effect of sex was controlled in this study, by testing all male participants. As regards the effect of age, callosal morphometric changes occur especially before adulthood is reached [68], when the callosal fibers are not completely myelinated, but a number of studies lacked in […]
[…]tive emotion processing are found in (i) bimodal presentations in the control group (connected hemispheres), (ii) bimodal presentations in A.P. (anteriorly disconnected hemispheres), and (iii) acoustic presentations in D.D.V. (totally disconnected hemispheres). As regards bimodal presentations, we found a strong visual capture in the case of totally disconnected hemispheres. Importantly, when visual stimuli were presented (alone or together with acoustic stimuli), only the total split-brain patient showed a predominant role of the left hemisphere, probably due to an attentional bias, rather than an asymmetric emotional processing, which was not present in auditory processing. The debate on the validity of the right hemisphere/valence hypotheses remains controversial, but the present study adds evidence in support of both theories depending on the task, and suggests that the valence hypothesis better accounts for the results as the cognitive load becomes heavier.
Acknowledgments

We thank Professor Gabriele Polonara for providing us with the MRI images of the patients. We also warmly thank D.D.V. and A.P. for their willingness to collaborate in this study.
References
[1] Demaree HA, Everhart DE, Youngstrom EA, Harrison DW. Brain lateralization of emotional processing: historical roots and a future incorporating dominance. Behav Cogn Neurosci Rev 2005;4(1):3–20.
[2] Levy J, Heller W, Banich MT, Burton LA. Asymmetry of perception in free viewing of chimeric faces. Brain Cogn 1983;2:404–19.
[3] Van Lancker D. Rags to riches: our increasing appreciation of cognitive and communicative abilities of the human right cerebral hemisphere. Brain Lang 1997;57:1–11.
[4] Borod JC, Cicero BA, Obler LK, Welkowitz J, Erhan HM, Santschi C, Grunwald IS, Agosti RM, Whalen JR. Right hemisphere emotional perception: evidence across multiple channels. Neuropsychology 1998;12:446–58.
[5] Davidson RJ, Mednick D, Moss E, Saron C, Schaffer CE. Ratings of emotion in faces are influenced by the visual field to which stimuli are presented. Brain Cogn 1987;6:403–11.
[6] Ahern GL, Schwartz GE. Differential lateralization for positive vs negative emotion. Neuropsychologia 1979;17:693–8.
[7] Baijal S, Srinivasan N. Emotional and hemispheric asymmetries in shifts of attention: an ERP study. Cogn Emot 2011;25:280–94, http://dx.doi.org/10.1080/02699931.2010.492719.
[8] Brancucci A, Babiloni C, Babiloni F, Galderisi S, Mucci S, Tecchio F, Zappasodi F, Pizzella V, Romani GL, Rossini PM. Inhibition of auditory cortical responses to ipsilateral stimuli during dichotic listening: evidence from magnetoencephalography. Eur J Neurosci 2004;19:2329–36, http://dx.doi.org/10.1111/j.1460-9568.2004.03302.x.
[9] Stefanatos GA, Joe WQ, Aguirre GK, Detre JA, Wetmore G. Activation of human auditory cortex during speech perception: effects of monaural, binaural, and dichotic presentation. Neuropsychologia 2008;46:301–15, http://dx.doi.org/10.1016/j.neuropsychologia.2007.07.008.
[10] Kimura D. Functional asymmetry of the brain in dichotic listening. Cortex 1967;3:163–8.
[11] D'Anselmo A, Reiterer S, Zuccarini F, Tommasi L, Brancucci A. Hemispheric asymmetries in bilinguals: tongue similarity affects lateralization of second language. Neuropsychologia 2013;51:1187–94, http://dx.doi.org/10.1016/j.neuropsychologia.2013.03.016.
[12] Marzoli D, Tommasi L. Side biases in humans (Homo sapiens): three ecological studies on hemispheric asymmetries. Naturwissenschaften 2009;96(9):1099–106.
[13] Westerhausen R, Hugdahl K. The corpus callosum in dichotic listening studies of hemispheric asymmetry: a review of clinical and experimental evidence. Neurosci Biobehav Rev 2008;32:1044–54.
[14] Gadea M, Espert R, Salvador A, Martí-Bonmatí L. The sad, the angry, and the asymmetrical brain: dichotic listening studies of negative affect and depression. Brain Cogn 2011;76:294–9, http://dx.doi.org/10.1016/j.bandc.2011.03.003.
[15] Erhan H, Borod JC, Tenke CE, Bruder GE. Identification of emotion in a dichotic listening task: event-related brain potential and behavioral findings. Brain Cogn 1998;37:286–307.
[16] Herrero JV, Hillix WA. Hemispheric performance in detecting prosody: a competitive dichotic listening task. Percept Mot Skills 1990;71(2):479–86.
[17] Schepman A, Rodway P, Geddes P. Valence-specific laterality effects in vocal emotion: interactions with stimulus type, blocking and sex. Brain Cogn 2012;79:129–37, http://dx.doi.org/10.1016/j.bandc.2012.03.001.
[18] Meyers M, Smith BD. Hemispheric asymmetry and emotion: effects of nonverbal affective stimuli. Biol Psychol 1986;22(1):11–22.
[19] Wildgruber D, Pihan H, Ackermann H, Erb M, Grodd W. Dynamic brain activation during processing of emotional intonation: influence of acoustic parameters, emotional valence, and sex. NeuroImage 2002;15(4):856–69.
[20] Voyer D, Voyer SD, Tramonte L. Free-viewing laterality tasks: a multilevel meta-analysis. Neuropsychology 2012;26(5):551–67, http://dx.doi.org/10.1037/a0028631.
[21] Drebing CE, Federman EJ, Edington P, Terzian MA. Affect identification bias demonstrated with individual chimeric faces. Percept Mot Skills 1997;85(3):1099–104.
[22] Killgore WDS, Yurgelun-Todd DA. The right-hemisphere and valence hypotheses: could they both be right (and sometimes left)? Soc Cogn Affect Neurosci 2007;2:240–50, http://dx.doi.org/10.1093/scan/nsm020.
[23] Alves NT, Aznar-Casanova JA, Fukusima SS. Patterns of brain asymmetry in the perception of positive and negative facial expressions. Laterality: Asymmetries Body, Brain Cogn 2009;14(3):256–72, http://dx.doi.org/10.1080/13576500802362927.
[24] Prete G, D'Ascenzo S, Laeng B, Fabri M, Foschi N, Tommasi L. Conscious and unconscious processing of facial expressions: evidence from two split-brain patients. J Neuropsychol 2013, http://dx.doi.org/10.1111/jnp.12034 [Epub ahead of print] PMID: 24325712.
[25] Jansari A, Rodway P, Goncalves S. Identifying facial emotions: valence specific effects and an exploration of the effects of viewer gender. Brain Cogn 2011;76:415–23, http://dx.doi.org/10.1016/j.bandc.2011.03.009.
[26] Kumar D, Srinivasan N. Emotion perception is mediated by spatial frequency content. Emotion 2011;11:1144–51, http://dx.doi.org/10.1037/a0025453.
[27] Prete G, Laeng B, Tommasi T. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions. Laterality: Asymmetries Body, Brain Cogn 2014;19(4):439–54, http://dx.doi.org/10.1080/1357650X.2013.862255.
[28] de Gelder B, Bertelson P. Multisensory integration, perception and ecological validity. Trends Cogn Sci 2003;7(10):460–7, http://dx.doi.org/10.1016/j.tics.2003.08.014.
[29] Collignon O, Girard S, Gosselin F, Roy S, Saint-Amour D, Lassonde M, Lepore F. Audio-visual integration of emotion expression. Brain Res 2008;1242:126–35, http://dx.doi.org/10.1016/j.brainres.2008.04.023.
[30] Petrini K, McAleer P, Pollick F. Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence. Brain Res 2010;1323:139–48, http://dx.doi.org/10.1016/j.brainres.2010.02.012.
[31] de Gelder B, Vroomen J. The perception of emotions by ear and by eye. Cogn Emot 2000;14(3):289–311.
[32] Kreifelts B, Ethofer T, Grodd W, Erb M, Wildgruber D. Audiovisual integration of emotional signals in voice and face: an event-related fMRI study. NeuroImage 2007;37:1445–56, http://dx.doi.org/10.1016/j.neuroimage.2007.06.020.
[33] Campanella S, Belin P. Integrating face and voice in person perception. Trends Cogn Sci 2007;11(12):535–43, http://dx.doi.org/10.1016/j.tics.2007.10.001.
[34] Pourtois G, de Gelder B, Vroomen J, Rossion B, Crommelinck M. The time-course of intermodal binding between seeing and hearing affective information. NeuroReport 2000;11:1329–33.
[35] Pourtois G, de Gelder B, Bol A, Crommelinck M. Perception of facial expressions and voices and of their combination in the human brain. Cortex 2005;41:49–59.
[36] Ethofer T, Pourtois G, Wildgruber D. Investigating audiovisual integration of emotional signals in the human brain. Prog Brain Res 2006;156:345–61.
[37] Jeong JW, Diwadkar VA, Chugani CD, Sinsoongsud P, Muzik O, Behen ME, Chugani HT, Chugani DC. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation. NeuroImage 2011;54:2973–82, http://dx.doi.org/10.1016/j.neuroimage.2010.11.017.
[38] Ethofer T, Bretscher J, Wiethoff S, Bisch J, Schlipf S, Wildgruber D, Kreifelts B. Functional responses and structural connections of cortical areas for processing faces and voices in the superior temporal sulcus. NeuroImage 2013;76:45–56, http://dx.doi.org/10.1016/j.neuroimage.2013.02.064.
[39] de Gelder B, Böcker KBE, Tuomainen J, Hensen M, Vroomen J. The combined perception of emotion from voice and face: early interaction revealed by human electric brain responses. Neurosci Lett 1999;260:133–6.
[40] Hayakawa Y, Mimura M, Murakami H, Kawamura M. Emotion recognition from stimuli in different sensory modalities in post-encephalitic patients. Neuropsychiatr Dis Treat 2010;6:99–105.
[41] Gazzaniga MS. Cerebral specialization and interhemispheric communication. Does the corpus callosum enable the human condition? Brain 2000;123:1293–326.
[42] Gazzaniga MS. Forty-five years of split-brain research and still going strong. Nature 2005;6:653–9.
[43] Sperry RW. Lateral specialization in the surgically separated hemispheres. In: Schmitt F, Worden F, editors. Neurosciences third study program. Cambridge, MA: MIT Press; 1974. p. 1–12.
[44] Gazzaniga MS, LeDoux JE. The integrated mind. New York, NY: Plenum Press; 1978.
[45] Levy J, Trevarthen C, Sperry RW. Perception of bilateral chimeric figures following hemispheric deconnexion. Brain 1972;95:61–78.
[46] Springer SP, Gazzaniga MS. Dichotic testing of partial and complete split brain subjects. Neuropsychologia 1974;13:341–6.
[47] Musiek FE, Weihing J. Perspectives on dichotic listening and the corpus callosum. Brain Cogn 2011;76:225–32, http://dx.doi.org/10.1016/j.bandc.2011.03.011.
[48] Stone VE, Nisenson L, Eliassen JC, Gazzaniga MS. Left hemisphere representations of emotional facial expressions. Neuropsychologia 1996;34(1):23–9.
[49] Làdavas E, Cimatti D, Pesce MD, Tuozzi G. Emotional evaluation with and without conscious stimulus identification: evidence from a split-brain patient. Cogn Emot 1993;7(1):95–114.
[50] Gainotti G. Unconscious processing of emotions and the right hemisphere. Neuropsychologia 2012;50(2):205–18.
[51] Oldfield RC. The assessment and analysis of handedness: the Edinburgh Inventory. Neuropsychologia 1971;9:97–114.
[52] Fabri M, Del Pesce M, Paggi A, Polonara G, Bartolini M, Salvolini U, Manzoni T. Contribution of posterior corpus callosum to the interhemispheric transfer of tactile information. Cogn Brain Res 2005;24:73–80, http://dx.doi.org/10.1016/j.cogbrainres.2004.12.003.
[53] Lundqvist D, Flykt A, Öhman A. The Karolinska directed emotional faces (KDEF). Stockholm: Department of Neurosciences Karolinska Hospital; 1998.
[54] Rutherford MD, Chattha HM, Krysko KM. The use of aftereffects in the study of relationships among emotion categories. J Exp Psychol Hum Percept Perform 2008;34(1):27–40, http://dx.doi.org/10.1037/0096-1523.34.1.27.
[55] Angenstein N, Brechmann A. Left auditory cortex is involved in pairwise comparisons of the direction of frequency modulated tones. Front Neurosci 2013;7:1–8, http://dx.doi.org/10.3389/fnins.2013.00115.
[56] Coolican J, Eskes GA, McMullen PA, Lecky E. Perceptual biases in processing facial identity and emotion. Brain Cogn 2008;66:176–87, http://dx.doi.org/10.1016/j.bandc.2007.07.001.
[57] Butler SH, Harvey M. Effects of aging and exposure duration on perceptual biases in chimeric face processing. Cortex 2008;44:665–72.
[58] Hausmann M, Corballis MC, Fabri M. Line bisection in the split brain. Neuropsychology 2003;17(4):602–9, http://dx.doi.org/10.1037/0894-4105.17.4.602.
[59] Corballis MC, Corballis PM, Fabri M, Paggi A, Manzoni T. Now you see it, now you don't: variable hemineglect in a commissurotomized man. Cogn Brain Res 2005;25:521–30, http://dx.doi.org/10.1016/j.cogbrainres.2005.08.002.
[60] Corballis MC, Corballis PM, Fabri M. Redundancy gain in simple reaction time following partial and complete callosotomy. Neuropsychologia 2004;42(1):71–81, http://dx.doi.org/10.1016/S0028-3932(03)00152-0.
[61] Pavani F, Husain M, Làdavas E, Driver J. Auditory deficits in visuospatial neglect patients. Cortex 2004;40(2):347–65.
[62] Clarke JM, Lufkin RB, Zaidel E. Corpus callosum morphometry and dichotic listening performance: individual differences in functional interhemispheric inhibition? Neuropsychologia 1993;31(6):547–57.
[63] Beer AL, Plank T, Greenlee MW. Diffusion tensor imaging shows white matter tracts between human auditory and visual cortex. Exp Brain Res 2011;213:299–308, http://dx.doi.org/10.1007/s00221-011-2715-y.
[64] Davidson RJ. Affect, cognition, and hemispheric specialization. In: Izard CE, Kagan J, Zajonc R, editors. Emotion, cognition, and behaviour. New York, NY: Cambridge University Press; 1984.
[65] Borod JC. Cerebral mechanisms underlying facial, prosodic and lexical emotional expression: a review of neuropsychological studies and methodological issues. Neuropsychology 1993;7:445–63.
[66] Stefanics G, Csukly G, Komlósi S, Czobor P, Czigler I. Processing of unattended facial emotions: a visual mismatch negativity study. NeuroImage 2012;59(3):3042–9.
[67] Driesen NR, Raz N. The influence of sex, age, and handedness on corpus callosum morphology: a meta-analysis. Psychobiology 1995;23(3):240–7.
[68] Witelson SF, Kigar DL. Anatomical development of the corpus callosum in humans: a review with reference to sex and cognition. In: Molfese DL, editor. Brain lateralization in children: developmental implications. New York: Guilford Press; 1988. p. 35–57.