<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
<record>
<leader>00000cab a2200000 4500</leader>
<controlfield tag="001">MAP20200035671</controlfield>
<controlfield tag="003">MAP</controlfield>
<controlfield tag="005">20210429141008.0</controlfield>
<controlfield tag="008">201110e20200601esp|||p |0|||b|eng d</controlfield>
<datafield tag="040" ind1=" " ind2=" ">
<subfield code="a">MAP</subfield>
<subfield code="b">spa</subfield>
<subfield code="d">MAP</subfield>
</datafield>
<datafield tag="084" ind1=" " ind2=" ">
<subfield code="a">931</subfield>
</datafield>
<datafield tag="100" ind1=" " ind2=" ">
<subfield code="0">MAPA20200022022</subfield>
<subfield code="a">López-Cabrera, José Daniel</subfield>
</datafield>
<datafield tag="245" ind1="1" ind2="0">
<subfield code="a">Classification of breast cancer from digital mammography using deep learning</subfield>
<subfield code="c">José Daniel López-Cabrera, Luis Alberto López Rodríguez, Marlén Pérez-Díaz</subfield>
</datafield>
<datafield tag="520" ind1=" " ind2=" ">
<subfield code="a">Breast cancer is the most frequent cancer in females. Mammography has proven to be the most effective method for the early detection of this type of cancer. Mammographic images are sometimes difficult to interpret due to the nature of the anomalies, low image contrast and the composition of the mammary tissues, as well as technological factors such as the spatial resolution of the image or noise. Computer-aided diagnostic systems have been developed to increase the accuracy of mammographic examinations; physicians can use them as a second opinion when reaching a final diagnosis, thereby reducing human error. Convolutional neural networks are a current trend in computer vision tasks because of the performance they have achieved. The present investigation used this type of network to classify images into three classes: normal, benign and malignant tumour. Because the miniMIAS database used contains a low number of images, the transfer learning technique was applied to the pre-trained Inception v3 network. Two convolutional neural network architectures were implemented: the three-class architecture reached 86.05% accuracy, while the architecture with two neural networks in series reached 88.2%.</subfield>
</datafield>
<datafield tag="650" ind1=" " ind2="4">
<subfield code="0">MAPA20080540500</subfield>
<subfield code="a">Cáncer</subfield>
</datafield>
<datafield tag="650" ind1=" " ind2="4">
<subfield code="0">MAPA20080557850</subfield>
<subfield code="a">Diagnóstico</subfield>
</datafield>
<datafield tag="650" ind1=" " ind2="4">
<subfield code="0">MAPA20080562236</subfield>
<subfield code="a">Enfermedades</subfield>
</datafield>
<datafield tag="650" ind1=" " ind2="4">
<subfield code="0">MAPA20080624842</subfield>
<subfield code="a">Redes neuronales artificiales</subfield>
</datafield>
<datafield tag="650" ind1=" " ind2="4">
<subfield code="0">MAPA20080560331</subfield>
<subfield code="a">Radiografía</subfield>
</datafield>
<datafield tag="700" ind1="1" ind2=" ">
<subfield code="0">MAPA20200022121</subfield>
<subfield code="a">López Rodríguez, Luis Alberto</subfield>
</datafield>
<datafield tag="700" ind1="1" ind2=" ">
<subfield code="0">MAPA20200022138</subfield>
<subfield code="a">Pérez-Díaz, Marlén</subfield>
</datafield>
<datafield tag="773" ind1="0" ind2=" ">
<subfield code="w">MAP20200034445</subfield>
<subfield code="t">Revista Iberoamericana de Inteligencia Artificial</subfield>
<subfield code="d">IBERAMIA, Sociedad Iberoamericana de Inteligencia Artificial, 2018-</subfield>
<subfield code="x">1988-3064</subfield>
<subfield code="g">01/06/2020 Volumen 23 Número 65 - junio 2020 , p. 56-66</subfield>
</datafield>
<datafield tag="856" ind1=" " ind2=" ">
<subfield code="q">application/pdf</subfield>
<subfield code="w">1108610</subfield>
<subfield code="y">Recurso electrónico / Electronic resource</subfield>
</datafield>
</record>
</collection>