Similarity-Based Rough Sets with Annotation Using Deep Learning

Conference paper · Year: 2021

Dávid Nagy, Tamás Mihálydeák, Tamás Kádek

Abstract

In the authors’ previous research, the possible use of correlation clustering in rough set theory was investigated. Correlation clustering is based on a tolerance relation that represents the similarity among objects. Its result is a partition, which can be treated as the system of base sets. However, singleton clusters carry very little information about the similarity. If the singleton clusters are discarded, then the approximation space obtained from the partition is partial. In this way, the approximation space focuses on the similarity itself (represented by a tolerance relation) and differs from the covering-type approximation space that relies on the tolerance relation. In this paper, the authors examine how the partiality can be decreased by inserting the members of some singletons into base sets and how this annotation affects the approximations. This process can be performed by the user of the system; however, in the case of a huge number of objects, the annotation can take a tremendous amount of time. This paper shows an alternative solution to this issue using neural networks.
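
As an illustration only (not the authors' implementation), the following minimal Python sketch shows the mechanics suggested by the abstract: non-singleton clusters obtained from correlation clustering serve as base sets, singleton clusters are discarded (which makes the approximation space partial), and annotating a singleton's member into an existing base set changes the upper approximation. The helper names lower_approx and upper_approx and the toy clusters are assumptions made for this example.

    # Hypothetical sketch of the idea in the abstract; not the authors' code.
    # Base sets = non-singleton clusters from correlation clustering.

    def lower_approx(target, base_sets):
        # Union of base sets entirely contained in the target set.
        return set().union(*(b for b in base_sets if b <= target))

    def upper_approx(target, base_sets):
        # Union of base sets that intersect the target set.
        return set().union(*(b for b in base_sets if b & target))

    # Toy clustering result (assumed for illustration).
    clusters = [{1, 2, 3}, {4, 5}, {6}]              # {6} is a singleton cluster
    base_sets = [c for c in clusters if len(c) > 1]  # discard singletons -> partial space

    target = {2, 3, 6}
    print(lower_approx(target, base_sets))   # set(): no base set lies inside the target
    print(upper_approx(target, base_sets))   # {1, 2, 3}: object 6 is not covered at all

    # Annotation: insert the member of the singleton {6} into an existing base set.
    annotated = [{1, 2, 3}, {4, 5, 6}]
    print(upper_approx(target, annotated))   # {1, 2, 3, 4, 5, 6}: partiality decreased

In this toy case the annotated member enlarges the upper approximation, because its new base set now intersects the target, while the lower approximation is unaffected.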
Main file: 512271_1_En_8_Chapter.pdf (220.59 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03741710, version 1 (01-08-2022)

Cite

Dávid Nagy, Tamás Mihálydeák, Tamás Kádek. Similarity-Based Rough Sets with Annotation Using Deep Learning. 4th International Conference on Intelligence Science (ICIS), Feb 2021, Durgapur, India. pp.93-102, ⟨10.1007/978-3-030-74826-5_8⟩. ⟨hal-03741710⟩