A multiscale imaging and modelling dataset of the human inner ear
Understanding the anatomy of the human inner ear and its internal structures is paramount to advancing hearing implant technology. While the emergence of imaging devices has allowed researchers to improve their understanding of intracochlear structures, the difficulty of collecting appropriate data has resulted in studies conducted with few samples. To assist the cochlear research community, a large collection of human temporal bone images is being made available. This data descriptor therefore describes a rich set of image volumes acquired using cone beam computed tomography (CBCT) and micro-CT modalities, accompanied by manual delineations of the cochlea and its sub-compartments, a statistical shape model encoding its anatomical variability, and data for electrode insertion and electrical simulations. These data constitute an important asset for future studies in need of high-resolution images and related statistical shape objects of the cochlea with which to test scientific hypotheses. They are of relevance to anatomists, audiologists, computer scientists working in image analysis, computer simulation, and image formation, and to biomedical engineers designing new strategies for cochlear implantation, electrode design, and related applications.
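A statistical shape model of the kind mentioned above is commonly built by applying principal component analysis to corresponding landmark coordinates across the sample population. The following is a minimal sketch of that idea, not the dataset's actual pipeline; the function names and the flat `(n_samples, n_points * 3)` array layout are illustrative assumptions.

```python
import numpy as np

def build_ssm(shapes):
    """Build a PCA-based statistical shape model.

    shapes: (n_samples, n_points * 3) array of corresponding landmark
    coordinates, one row per (already aligned) cochlea sample.
    Returns the mean shape, the principal modes of variation, and the
    variance explained by each mode.
    """
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data matrix yields the principal modes
    # (rows of vt) and singular values related to the variances.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = (s ** 2) / (len(shapes) - 1)
    return mean, vt, variances

def synthesize(mean, modes, variances, coeffs):
    """Generate a new shape from mode coefficients given in standard
    deviations along each of the first len(coeffs) modes."""
    k = len(coeffs)
    return mean + (np.asarray(coeffs) * np.sqrt(variances[:k])) @ modes[:k]

# Toy usage: ten random "shapes" of ten 3-D landmarks each.
rng = np.random.default_rng(0)
shapes = rng.normal(size=(10, 30))
mean, modes, variances = build_ssm(shapes)
# Zero coefficients reproduce the mean shape exactly.
reconstructed = synthesize(mean, modes, variances, np.zeros(3))
```

Setting all coefficients to zero returns the mean shape, while varying a coefficient within a few standard deviations sweeps through the anatomical variability captured by that mode.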
See the related publication in Nature Scientific Data: https://www.nature.com/articles/sdata2017132