Abstract: The Self-Organizing Map (SOM) is an excellent tool in the exploratory phase of data mining. It projects the input space onto prototypes of a low-dimensional regular grid that can be effectively utilized to visualize and explore properties of the data. When the number of SOM units is large, similar units need to be grouped, i.e., clustered, to simplify quantitative analysis of the map and the data. Self-organizing maps are known for their clustering, visualization, and classification capabilities. The SOM is a popular unsupervised artificial neural network algorithm that produces a low-dimensional, discretized representation of the input space of the training samples, called a feature map. Such a map retains the salient features of the input data. The Expectation-Maximization (EM) algorithm was proposed to compute the Maximum Likelihood Estimator (MLE) in the presence of missing observations. We present an EM algorithm that yields topology-preserving maps of data based on probabilistic mixture models. Compared to other mixture-model approaches to self-organizing maps, the function our algorithm maximizes has a clear interpretation: it sums the data log-likelihood and a penalty term that enforces self-organization. This formulation also allows a principled handling of missing data and the learning of mixtures of self-organizing maps. The proposed SOM and EM algorithms are implemented in MATLAB.
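To make the feature-map idea concrete, the following is a minimal sketch of the classical online SOM training rule (best-matching unit plus Gaussian grid neighborhood), not the paper's MATLAB implementation or its EM variant; the grid size, decay schedules, and function name `train_som` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid_shape=(5, 5), n_iters=500, lr0=0.5, sigma0=2.0):
    """Online SOM training with exponentially decaying learning rate
    and neighborhood radius (illustrative parameters)."""
    n, d = X.shape
    rows, cols = grid_shape
    # Grid coordinates of each unit, used by the neighborhood function.
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)],
                      dtype=float)
    # Initialize prototype vectors uniformly within the data range.
    W = rng.uniform(X.min(0), X.max(0), size=(rows * cols, d))
    for t in range(n_iters):
        lr = lr0 * np.exp(-t / n_iters)          # learning-rate decay
        sigma = sigma0 * np.exp(-t / n_iters)    # neighborhood shrinkage
        x = X[rng.integers(n)]                   # random training sample
        # Best-matching unit: prototype closest to the sample.
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))
        # Gaussian neighborhood on the grid around the BMU enforces
        # topology preservation: nearby units move together.
        h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                   / (2 * sigma ** 2))
        W += lr * h[:, None] * (x - W)
    return W
```

After training, each prototype row of `W` is a model vector of one grid unit, and the grid as a whole is the discretized feature map described above.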
Keywords: Self-Organizing Map (SOM), Neural Networks, Feature Map, Clustering, Unsupervised Learning, Mixture Models, EM Algorithm
| DOI: 10.17148/IJIREEICE.2018.6104