README
Dataset 1
Presentation
This dataset was collected between 2012 and 2013 at Rennes University Hospital in Rennes, France. It comprises two conditions in which participants named and spelled the names of visually presented objects. The experiment was approved by an independent ethics committee and authorized by the French institutional review board (IRB): "Comité de Protection des Personnes dans la Recherche Biomédicale Ouest V" (CCPPRB-Ouest V).
This study was registered under the name "conneXion" and the agreement number 2012-A01227-36.
Participants
Twenty-three right-handed healthy volunteers participated in this study: 12 females aged 19 to 40 years (mean age 28 years) and 11 males aged 19 to 33 years (mean age 23 years). (See participants.json and participants.tsv for more details.)
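Participant metadata is stored as tab-separated values in participants.tsv. As a minimal sketch of how such a file can be read, the snippet below parses a hypothetical excerpt with Python's standard csv module; the actual column names may differ, so check participants.json for the real field definitions.

```python
import csv
import io

# Hypothetical excerpt of participants.tsv -- the real file may use
# different column names and more fields.
tsv_text = """participant_id\tsex\tage
sub-01\tF\t28
sub-02\tM\t23
"""

# participants.tsv is tab-separated, so set the delimiter explicitly.
rows = list(csv.DictReader(io.StringIO(tsv_text), delimiter="\t"))
ages = [int(row["age"]) for row in rows]
print(len(rows), ages)  # -> 2 [28, 23]
```

Replacing the inline string with `open("participants.tsv")` reads the real file the same way.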
Experiment
* The experiment begins with the verification of inclusion/exclusion criteria.
* The participants read the information notice and the consent form.
* They then complete and sign two questionnaires.
* Each subject performs two conditions (naming and spelling), with two runs per condition.
* Each run contains 74 stimuli.
* The spelling task always follows the naming task, and its instructions were not given until the naming task was completed, to avoid any reminiscence of the words' orthographic structures.
* Each run contains balanced numbers of animals and objects as well as long and short words.
* Pictures are presented on a computer screen, and the experimental paradigm is run using E-Prime (Psychology Software Tools).
* The responses produced by the participants were recorded via a Logitech microphone and analyzed to detect speech onsets using Praat v5.3.13 (University of Amsterdam, 1012VT Amsterdam, The Netherlands).
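Praat has its own onset-detection tools; purely as an illustration of the general idea (not the pipeline actually used here), the sketch below finds the first window of a recording whose short-time energy exceeds a threshold. The frame length, threshold, and test signal are all made-up example values.

```python
import math

def speech_onset(samples, sr, frame_ms=10, threshold=0.01):
    """Return the time (s) of the first frame whose mean energy
    exceeds `threshold`, or None if the signal stays below it."""
    frame = max(1, int(sr * frame_ms / 1000))
    for start in range(0, len(samples) - frame + 1, frame):
        window = samples[start:start + frame]
        energy = sum(x * x for x in window) / frame
        if energy > threshold:
            return start / sr
    return None

# Synthetic test signal: 0.5 s of silence followed by a 440 Hz tone.
sr = 16000
silence = [0.0] * (sr // 2)
tone = [0.5 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr // 2)]
onset = speech_onset(silence + tone, sr)
print(onset)  # -> 0.5
```

Real speech-onset detection (as in Praat) is more robust, typically combining intensity and voicing cues rather than a single fixed threshold.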
EEG acquisition
* HD-EEG system (EGI, Electrical Geodesics, Inc., 256 electrodes)
* Sampling frequency: 1000 Hz
* Impedances were kept below 5 kΩ
Contact
* If you have any questions or comments, please contact:
* Ahmad Mheich: mheich.ahmad@gmail.com