
Dr. David Renato Domínguez Carreta
Universidad Autónoma de Madrid

Research Keywords & Expertise

Neural Networks
Sustainability
Topology Optimization
Sociophysics
Stochastic Dynamics

Fingerprints

Neural Networks
Sustainability


Feed

Journal article
Published: 02 March 2021 in Neurocomputing

The present study analyzes the retrieval capacity of an ensemble of diluted attractor neural networks for real (i.e., non-random) patterns, as is the case with human fingerprints. We explore the optimal number of attractor neural networks in the ensemble for achieving maximum fingerprint storage capacity. The retrieval performance of the ensemble is measured in terms of the network connectivity structure, comparing 1D ring and 2D cross-grid topologies across random-shortcut ratios. Given the nature of the network ensemble and the differing characteristics of the patterns, the assignment of pattern subsets to the ensemble modules can be optimized. The specialization of the ensemble into several modules of attractor networks is explored with respect to the activity of the patterns and to the correlations of the pattern subsets assigned to each module.

ACS Style

Mario González; Ángel Sánchez; David Dominguez; Francisco B. Rodríguez. Ensemble of diluted attractor networks with optimized topology for fingerprint retrieval. Neurocomputing 2021, 442, 269-280.

AMA Style

Mario González, Ángel Sánchez, David Dominguez, Francisco B. Rodríguez. Ensemble of diluted attractor networks with optimized topology for fingerprint retrieval. Neurocomputing. 2021;442:269-280.

Chicago/Turabian Style

Mario González; Ángel Sánchez; David Dominguez; Francisco B. Rodríguez. 2021. "Ensemble of diluted attractor networks with optimized topology for fingerprint retrieval." Neurocomputing 442: 269-280.

Journal article
Published: 26 June 2020 in Heliyon

The present study analyzes the offshoring network constructed from the information contained in the Panama Papers, characterizing worldwide regions and countries as well as their intra- and inter-relationships. The 2016 Panama Papers disclosure is the largest leak of offshoring and tax-avoidance documentation. The leak, approximately 2.6 terabytes in volume, involves more than two hundred thousand enterprises in more than two hundred countries. From this information, the offshore connections of individuals and companies are constructed and aggregated by their countries of origin. The top offshore financial regions and countries of the network are identified, and their intra- and inter-relationships are mapped and described. We are able to identify the top countries in the offshoring network and characterize their connectivity structure, discovering the most prominent actors in the worldwide offshoring scenario and their range of influence.
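The aggregation step described above can be sketched in a few lines: individual offshore links are collapsed into a weighted country-to-country graph, from which the most connected countries fall out. This is an illustrative reconstruction, not the paper's pipeline; the toy links and function names are hypothetical.

```python
from collections import defaultdict

def build_country_network(links):
    """Collapse individual offshore links (origin country, offshore
    jurisdiction) into a weighted country-to-country edge map."""
    weights = defaultdict(int)
    for origin, offshore in links:
        weights[(origin, offshore)] += 1
    return dict(weights)

def top_countries(weights, k=3):
    """Rank countries by total weighted connection strength (in + out),
    a simple proxy for prominence in the network."""
    strength = defaultdict(int)
    for (a, b), w in weights.items():
        strength[a] += w
        strength[b] += w
    return sorted(strength, key=strength.get, reverse=True)[:k]

# Toy, invented links -- not actual Panama Papers records.
links = [("ES", "PA"), ("ES", "PA"), ("EC", "PA"), ("ES", "VG")]
net = build_country_network(links)
print(top_countries(net, k=2))
```

A real analysis would of course work on the leaked entity-intermediary records and richer centrality measures; the country-level collapse is the part this sketch captures.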

ACS Style

David Dominguez; Odette Pantoja; Pablo Pico; Miguel Mateos; María Del Mar Alonso-Almeida; Mario González. Panama Papers' offshoring network behavior. Heliyon 2020, 6, e04293.

AMA Style

David Dominguez, Odette Pantoja, Pablo Pico, Miguel Mateos, María Del Mar Alonso-Almeida, Mario González. Panama Papers' offshoring network behavior. Heliyon. 2020;6(6):e04293.

Chicago/Turabian Style

David Dominguez; Odette Pantoja; Pablo Pico; Miguel Mateos; María Del Mar Alonso-Almeida; Mario González. 2020. "Panama Papers' offshoring network behavior." Heliyon 6, no. 6: e04293.

Conference paper
Published: 18 December 2018 in Privacy Enhancing Technologies

This study evaluates the performance of a Blume-Emery-Griffiths neural network (BEGNN) on two datasets of corruption indicators, namely the Corruption Perceptions Index and the Global Corruption Barometer. Bilinear and biquadratic terms are added to the Hamiltonian of the model, as well as to the order parameters used to measure the network's retrieval efficiency. The network is tested at different noise levels in the patterns' initial state during the retrieval phase, in order to measure its robustness and basin of attraction. The network connectivity is diluted periodically, and performance is tested at different levels of dilution. Finally, the network is analyzed in terms of pattern load, mixing the real corruption patterns with random patterns to assess the transition from the retrieval to the non-retrieval phase.

ACS Style

Mario González; David Dominguez; Guillermo Jerez; Odette Pantoja. Periodically Diluted BEGNN Model of Corruption Perception. Privacy Enhancing Technologies 2018, 289-298.

AMA Style

Mario González, David Dominguez, Guillermo Jerez, Odette Pantoja. Periodically Diluted BEGNN Model of Corruption Perception. Privacy Enhancing Technologies. 2018:289-298.

Chicago/Turabian Style

Mario González; David Dominguez; Guillermo Jerez; Odette Pantoja. 2018. "Periodically Diluted BEGNN Model of Corruption Perception." Privacy Enhancing Technologies: 289-298.

Conference paper
Published: 18 December 2018 in Privacy Enhancing Technologies

This work models Corporate Sustainability reporting under the Global Reporting Initiative (GRI) using a ternary attractor network. A dataset covering 15 years of GRI reports for a worldwide set of companies was compiled from a recent work and adapted to match the pattern coding of a ternary attractor network. We compare the performance of the network with that of a classical binary attractor network. Two criteria were used for encoding the ternary network, a simple and a weighted threshold, and retrieval performance was better for the latter, highlighting the importance of how the real patterns are transformed into the three-state coding. The ternary network exceeds the retrieval performance of the binary network for the chosen correlated (GRI) patterns. Finally, the ternary network proved robust in retrieving the GRI patterns from noisy initial states.
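A minimal sketch of the two three-state encoding criteria mentioned above, under assumed normalized scores and illustrative thresholds (the paper's actual coding rules are not reproduced here):

```python
def ternary_encode(score, low=0.33, high=0.66):
    """Simple-threshold coding: map a normalized score in [0, 1] to a
    three-state unit (-1, 0, +1). Thresholds here are illustrative."""
    if score < low:
        return -1
    if score < high:
        return 0
    return 1

def weighted_thresholds(scores, q=(1 / 3, 2 / 3)):
    """Data-driven ('weighted') thresholds placed at quantiles of the
    observed scores, so each state covers a comparable share of patterns."""
    s = sorted(scores)
    pick = lambda p: s[min(int(p * len(s)), len(s) - 1)]
    return pick(q[0]), pick(q[1])

scores = [0.0, 0.1, 0.2, 0.5, 0.6, 0.9]
low, high = weighted_thresholds(scores)
coded = [ternary_encode(x, low, high) for x in scores]
print(coded)
```

The quantile-based variant adapts the cut points to the score distribution, which is the intuition behind preferring a weighted threshold over fixed cuts for correlated real-world patterns.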

ACS Style

Mario González; David Dominguez; Odette Pantoja; Carlos Guerrero; Francisco De Borja Rodriguez. Modeling Sustainability Reporting with Ternary Attractor Neural Networks. Privacy Enhancing Technologies 2018, 259-267.

AMA Style

Mario González, David Dominguez, Odette Pantoja, Carlos Guerrero, Francisco De Borja Rodriguez. Modeling Sustainability Reporting with Ternary Attractor Neural Networks. Privacy Enhancing Technologies. 2018:259-267.

Chicago/Turabian Style

Mario González; David Dominguez; Odette Pantoja; Carlos Guerrero; Francisco De Borja Rodriguez. 2018. "Modeling Sustainability Reporting with Ternary Attractor Neural Networks." Privacy Enhancing Technologies: 259-267.

Book chapter
Published: 18 May 2017 in Advances in Computational Intelligence
ACS Style

Mario González; David Dominguez; Ángel Sánchez; Francisco B. Rodríguez. Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons. Advances in Computational Intelligence 2017, 497-506.

AMA Style

Mario González, David Dominguez, Ángel Sánchez, Francisco B. Rodríguez. Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons. Advances in Computational Intelligence. 2017:497-506.

Chicago/Turabian Style

Mario González; David Dominguez; Ángel Sánchez; Francisco B. Rodríguez. 2017. "Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons." Advances in Computational Intelligence: 497-506.

Journal article
Published: 01 April 2017 in Expert Systems with Applications

Highlights: an ensemble of diluted attractor neural networks for pattern retrieval; increased storage capacity via a divide-and-conquer split into subnetworks; the ensemble triples the maximal capacity of a single network at the same wiring cost; engineering applications to memory-limited systems such as embedded devices or smartphones.

This work presents an ensemble of attractor neural network (ANN) modules that increases pattern storage at a computational cost similar to that of a single-module ANN system. We build the ensemble of ANN components and divide the set of uniform random patterns into disjoint subsets during the learning stage, such that each subset is assigned to a different component. In this way, a larger overall number of patterns can be stored by the ANN ensemble, where each module carries a moderate pattern load and is able to retrieve its assigned subset with the desired quality. Allowing some noise in the retrieval, we can recall a larger number of patterns while discriminating between the pattern subsets assigned to each component. We showed that an ANN ensemble with N = 10^4 units approximately triples the maximal capacity of a single ANN with similar wiring costs. We tested the modularized ensemble at different levels of component dilution while keeping the wiring costs constant. This approach could be implemented, for instance, with parallel computing to tackle computationally costly real-world problems.
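The divide-and-conquer scheme described above can be sketched with a toy Hopfield implementation: the pattern set is split into disjoint subsets and each subset is learned by its own module. This is a minimal illustration, not the authors' diluted-topology implementation; sizes and names are chosen for clarity.

```python
import random

def sign(x):
    """Binary threshold unit."""
    return 1 if x >= 0 else -1

class Hopfield:
    """Minimal fully connected Hopfield module with Hebbian learning
    (no dilution; a toy stand-in for one ensemble component)."""

    def __init__(self, n):
        self.n = n
        self.w = [[0.0] * n for _ in range(n)]

    def learn(self, patterns):
        # Hebbian outer-product rule with zero self-coupling.
        for p in patterns:
            for i in range(self.n):
                for j in range(self.n):
                    if i != j:
                        self.w[i][j] += p[i] * p[j] / self.n

    def retrieve(self, state, sweeps=5):
        # Asynchronous updates until (approximately) settled.
        s = list(state)
        for _ in range(sweeps):
            for i in range(self.n):
                s[i] = sign(sum(self.w[i][j] * s[j] for j in range(self.n)))
        return s

def ensemble_learn(patterns, modules):
    """Divide-and-conquer: each module stores a disjoint pattern subset,
    keeping every module's load moderate."""
    nets = [Hopfield(len(patterns[0])) for _ in range(modules)]
    for k, net in enumerate(nets):
        net.learn(patterns[k::modules])
    return nets
```

During recall one would present the cue to every module and keep the answer with the highest retrieval quality; that subset-discrimination step is omitted here for brevity.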

ACS Style

Mario González; David Dominguez; Ángel Sánchez; Francisco De Borja Rodriguez. Increase attractor capacity using an ensembled neural network. Expert Systems with Applications 2017, 71, 206-215.

AMA Style

Mario González, David Dominguez, Ángel Sánchez, Francisco De Borja Rodriguez. Increase attractor capacity using an ensembled neural network. Expert Systems with Applications. 2017;71:206-215.

Chicago/Turabian Style

Mario González; David Dominguez; Ángel Sánchez; Francisco De Borja Rodriguez. 2017. "Increase attractor capacity using an ensembled neural network." Expert Systems with Applications 71: 206-215.

Journal article
Published: 01 September 2016 in Physica A: Statistical Mechanics and its Applications

The ability of a metric attractor neural network (MANN) to learn structured patterns is analyzed. In particular, we consider collections of fingerprints, which present local features rather than being modeled by random patterns. Network retrieval proved robust to variations in pattern activity, threshold strategy, and the topological arrangement of the connections, and under several types of noisy configurations. We found that the lower the activity of the fingerprint patterns, the higher the load ratio and retrieval quality. A simplified theoretical framework for the unbiased case is developed as a function of five parameters: the load ratio, the finite connectivity, the density degree of the network, the randomness ratio, and the spatial pattern correlation. Linked to the latter, a new neural dynamics variable appears: the spatial neural correlation. The theory agrees quite well with the experimental results.

ACS Style

Felipe Doria; Rubem Erichsen; Mario González; Francisco B. Rodríguez; Ángel Sánchez; David Dominguez. Structured patterns retrieval using a metric attractor network: Application to fingerprint recognition. Physica A: Statistical Mechanics and its Applications 2016, 457, 424-436.

AMA Style

Felipe Doria, Rubem Erichsen, Mario González, Francisco B. Rodríguez, Ángel Sánchez, David Dominguez. Structured patterns retrieval using a metric attractor network: Application to fingerprint recognition. Physica A: Statistical Mechanics and its Applications. 2016;457:424-436.

Chicago/Turabian Style

Felipe Doria; Rubem Erichsen; Mario González; Francisco B. Rodríguez; Ángel Sánchez; David Dominguez. 2016. "Structured patterns retrieval using a metric attractor network: Application to fingerprint recognition." Physica A: Statistical Mechanics and its Applications 457: 424-436.

Journal article
Published: 01 November 2015 in Neurocomputing

This work experimentally explores a metric attractor neural network for modeling the Corporate Sustainability Reporting patterns of a set of global companies. A small-world topology configuration is used for the metric network and compared, in terms of the usual dilution and shortcut ratios, with a configuration obtained from the mutual information (MI) between companies. The resulting MI topology configuration is depicted for mesoscopic blocks distributed by continent and economic sector. The reporting sequence is learned both as static patterns and as a temporal sequence from 1999 to 2013. Retrieval of the sequence showed a saturation point around 2010, where the reporting pattern stalled. We showed that the MI topology configuration obtained for continents reinforces previous research on the role of Europe as a driver of sustainability and its influence worldwide. The MI configuration also outlines the recent (post-crisis) behavior of the involved economic sectors.

ACS Style

Mario Gonzalez; María Del Mar Alonso-Almeida; Cassio Avila; David Dominguez. Modeling sustainability report scoring sequences using an attractor network. Neurocomputing 2015, 168, 1181-1187.

AMA Style

Mario Gonzalez, María Del Mar Alonso-Almeida, Cassio Avila, David Dominguez. Modeling sustainability report scoring sequences using an attractor network. Neurocomputing. 2015;168:1181-1187.

Chicago/Turabian Style

Mario Gonzalez; María Del Mar Alonso-Almeida; Cassio Avila; David Dominguez. 2015. "Modeling sustainability report scoring sequences using an attractor network." Neurocomputing 168: 1181-1187.

Journal article
Published: 01 February 2012 in Physica A: Statistical Mechanics and its Applications
ACS Style

David Dominguez; Mario Gonzalez; Francisco De Borja Rodriguez; Eduardo Serrano; R. Erichsen; W.K. Theumann. Structured information in sparse-code metric neural networks. Physica A: Statistical Mechanics and its Applications 2012, 391, 799-808.

AMA Style

David Dominguez, Mario Gonzalez, Francisco De Borja Rodriguez, Eduardo Serrano, R. Erichsen, W.K. Theumann. Structured information in sparse-code metric neural networks. Physica A: Statistical Mechanics and its Applications. 2012;391(3):799-808.

Chicago/Turabian Style

David Dominguez; Mario Gonzalez; Francisco De Borja Rodriguez; Eduardo Serrano; R. Erichsen; W.K. Theumann. 2012. "Structured information in sparse-code metric neural networks." Physica A: Statistical Mechanics and its Applications 391, no. 3: 799-808.

Journal article
Published: 31 July 2011 in Neurocomputing

The goal of this work is to learn and retrieve a sequence of highly correlated patterns using a Hopfield-type attractor neural network (ANN) with a small-world connectivity distribution. For this model, we propose a weight-learning heuristic that combines the pseudo-inverse approach with a row-shifting schema. The influence of the random-connectivity ratio on retrieval quality and learning time is studied. Our approach has been successfully tested on complex patterns, as is the case with traffic video sequences, for different combinations of the involved parameters. Moreover, it has proved robust with respect to highly variable frame activity.

ACS Style

Mario Gonzalez; David Dominguez; Ángel Sánchez. Learning sequences of sparse correlated patterns using small-world attractor neural networks: An application to traffic videos. Neurocomputing 2011, 74, 2361-2367.

AMA Style

Mario Gonzalez, David Dominguez, Ángel Sánchez. Learning sequences of sparse correlated patterns using small-world attractor neural networks: An application to traffic videos. Neurocomputing. 2011;74(14-15):2361-2367.

Chicago/Turabian Style

Mario Gonzalez; David Dominguez; Ángel Sánchez. 2011. "Learning sequences of sparse correlated patterns using small-world attractor neural networks: An application to traffic videos." Neurocomputing 74, no. 14-15: 2361-2367.

Journal article
Published: 31 October 2009 in Neurocomputing

A model of an attractor neural network on a small-world topology (local and random connectivity) is investigated. The synaptic weights are random, driving the network towards a disordered state of neural activity. An ordered macroscopic neural state is induced by a bias in the network's weight connections, and the network evolution when initialized in blocks of positive/negative activity is studied. The retrieval of the block-like structure is investigated, and an application to the Hebbian learning of a pattern carrying local information is presented. The block and global attractors compete according to the initial conditions, and the change of stability from one to the other depends on the long-range character of the network connectivity, as shown by a flow-diagram analysis. Moreover, a larger number of blocks emerges as the network is diluted.

ACS Style

Mario González; David Dominguez; Francisco B. Rodríguez; David Renato Domínguez Carreta. Block attractor in spatially organized neural networks. Neurocomputing 2009, 72, 3795-3801.

AMA Style

Mario González, David Dominguez, Francisco B. Rodríguez, David Renato Domínguez Carreta. Block attractor in spatially organized neural networks. Neurocomputing. 2009;72(16-18):3795-3801.

Chicago/Turabian Style

Mario González; David Dominguez; Francisco B. Rodríguez; David Renato Domínguez Carreta. 2009. "Block attractor in spatially organized neural networks." Neurocomputing 72, no. 16-18: 3795-3801.

Article
Published: 10 February 2009 in Physical Review E

The retrieval abilities of spatially uniform attractor networks can be measured by the global overlap between patterns and neural states. However, we found that nonuniform networks, for instance, small-world networks, can retrieve fragments of patterns (blocks) without performing global retrieval. We propose a way to measure the local retrieval using a parameter that is related to the fluctuation of the block overlaps. Simulation of neural dynamics shows a competition between local and global retrieval. The phase diagram shows a transition from local retrieval to global retrieval when the storage ratio increases and the topology becomes more random. A theoretical approach confirms the simulation results and predicts that the stability of blocks can be improved by dilution.
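The proposed local-retrieval measure can be illustrated directly: compute the per-block overlaps and take their standard deviation, which is large when only some blocks are retrieved and near zero for uniform retrieval. A minimal sketch, assuming equal contiguous blocks of binary units:

```python
def block_overlaps(pattern, state, blocks):
    """Overlap m_k between pattern and state on each of `blocks` equal
    contiguous blocks of units (values in [-1, 1] for +/-1 units)."""
    size = len(pattern) // blocks
    ms = []
    for k in range(blocks):
        idx = range(k * size, (k + 1) * size)
        ms.append(sum(pattern[i] * state[i] for i in idx) / size)
    return ms

def local_retrieval_parameter(ms):
    """Standard deviation of the block overlaps: near zero for uniform
    (global) retrieval, large when only some blocks are retrieved."""
    mean = sum(ms) / len(ms)
    var = sum((m - mean) ** 2 for m in ms) / len(ms)
    return var ** 0.5
```

For a state that matches the pattern on one half and its inverse on the other, the global overlap vanishes while this parameter is maximal, which is exactly the regime of block retrieval without global retrieval.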

ACS Style

David Dominguez; Mario González; Eduardo Serrano; Francisco B. Rodríguez. Structured information in small-world neural networks. Physical Review E 2009, 79, 021909.

AMA Style

David Dominguez, Mario González, Eduardo Serrano, Francisco B. Rodríguez. Structured information in small-world neural networks. Physical Review E. 2009;79(2):021909.

Chicago/Turabian Style

David Dominguez; Mario González; Eduardo Serrano; Francisco B. Rodríguez. 2009. "Structured information in small-world neural networks." Physical Review E 79, no. 2: 021909.

Journal article
Published: 01 April 2007 in Neural Computation

A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and randomness of the links. Both learning and attractor abilities of a neural network can be measured by the mutual information (MI) as a function of the load and the overlap between patterns and retrieval states. In this letter, we use MI to search for the optimal topology with regard to the storage and attractor properties of the network in an Amari-Hopfield model. We find that while an optimal storage implies an extremely diluted topology, a large basin of attraction leads to moderate levels of connectivity. This optimal topology is related to the clustering and path length of the network. We also build a diagram for the dynamical phases with random or local initial overlap and show that very diluted networks lose their attractor ability.
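For unbiased binary units, the MI between a pattern and a retrieval state with overlap m can be computed by treating retrieval as a binary symmetric channel with flip probability (1 - m)/2. This is a standard simplification for the unbiased case, not the paper's exact load-dependent expression:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_info(m):
    """MI per neuron (bits) between an unbiased +/-1 pattern and a
    retrieval state with overlap m, modeling retrieval as a binary
    symmetric channel with flip probability (1 - m) / 2."""
    return 1.0 - h2((1.0 - m) / 2.0)
```

Perfect retrieval (m = 1) yields one bit per neuron, no retrieval (m = 0) yields zero, and scanning this quantity over topology parameters is the kind of MI-based search the abstract describes.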

ACS Style

D. Dominguez; K. Koroutchev; Eduardo Serrano; Francisco De Borja Rodriguez. Information and Topology in Attractor Neural Networks. Neural Computation 2007, 19, 956-973.

AMA Style

D. Dominguez, K. Koroutchev, Eduardo Serrano, Francisco De Borja Rodriguez. Information and Topology in Attractor Neural Networks. Neural Computation. 2007;19(4):956-973.

Chicago/Turabian Style

D. Dominguez; K. Koroutchev; Eduardo Serrano; Francisco De Borja Rodriguez. 2007. "Information and Topology in Attractor Neural Networks." Neural Computation 19, no. 4: 956-973.

Conference paper
Published: 01 January 2007 in AIP Conference Proceedings

The retrieval abilities of spatially uniform attractor networks can be measured by the average overlap between patterns and neural states. Metric networks (with local connections), like small-world graphs modeled by the connectivity γ and the randomness ω, however, display a richer distribution of memory attractors. We found that metric networks can carry information structured in blocks without any global overlap, and there is a competition between global and block attractors. We propose a way to measure the block information, related to the fluctuations of the overlap over the blocks. The phase diagram with the transition from local to global information shows that the stability of blocks grows with dilution, but decreases with the storage rate and disappears for random topologies.

ACS Style

David Renato Domínguez Carreta. Block information and topology in memory networks. AIP Conference Proceedings 2007, 887, 107-114.

AMA Style

David Renato Domínguez Carreta. Block information and topology in memory networks. AIP Conference Proceedings. 2007;887:107-114.

Chicago/Turabian Style

David Renato Domínguez Carreta. 2007. "Block information and topology in memory networks." AIP Conference Proceedings 887: 107-114.

Preprint
Published: 29 July 2005

The retrieval abilities of spatially uniform attractor networks can be measured by the average overlap between patterns and neural states. We found, however, that metric networks with local connections can carry information structured in blocks without any global overlap; there is a competition between global and block attractors. We propose a way to measure the block information, related to the fluctuation of the overlap. The phase diagram with the transition from local to global information shows that the stability of blocks grows with dilution, but decreases with the storage rate and disappears for random topologies.

ACS Style

David Dominguez; Kostadin Koroutchev; Eduardo Serrano; Francisco B. Rodriguez. Structured Information in Metric Neural Networks. 2005, 1.

AMA Style

David Dominguez, Kostadin Koroutchev, Eduardo Serrano, Francisco B. Rodriguez. Structured Information in Metric Neural Networks. 2005:1.

Chicago/Turabian Style

David Dominguez; Kostadin Koroutchev; Eduardo Serrano; Francisco B. Rodriguez. 2005. "Structured Information in Metric Neural Networks." 1.

Preprint
Published: 20 June 2005

A neural network works as an associative memory device if it has a large storage capacity and the quality of retrieval is good enough. Both the learning and attractor abilities of the network can be measured by the mutual information (MI) between patterns and retrieval states. This paper deals with a search for the optimal topology of a Hebb network in the sense of maximal MI. We use a small-world topology: the connectivity $\gamma$ ranges from an extremely diluted to a fully connected network, and the randomness $\omega$ ranges from purely local to completely random neighbors. It is found that, while stability implies an optimal $MI(\gamma,\omega)$ at $\gamma_{opt}(\omega)\to 0$, for the dynamics the optimal topology holds at a certain $\gamma_{opt}>0$ whenever $0\leq\omega<0.3$.
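The topology family parameterized by $(\gamma, \omega)$ can be generated with a Watts-Strogatz-style rewiring sketch. The construction below is illustrative, assuming a ring of n units with K = γn neighbors each:

```python
import random

def small_world_neighbors(n, gamma, omega, rng=None):
    """Neighbor lists on a ring: each unit links to its K = gamma * n
    nearest units; each link is rewired to a uniformly random unit with
    probability omega (Watts-Strogatz-style construction)."""
    rng = rng or random.Random(0)
    k = max(2, int(gamma * n))
    nbrs = []
    for i in range(n):
        # K/2 neighbors on each side of the ring.
        local = [(i + d) % n for d in range(1, k // 2 + 1)]
        local += [(i - d) % n for d in range(1, k // 2 + 1)]
        wired = []
        for j in local:
            if rng.random() < omega:
                # Rewire to a random unit (no self-loops).
                j = rng.randrange(n)
                while j == i:
                    j = rng.randrange(n)
            wired.append(j)
        nbrs.append(wired)
    return nbrs
```

With omega = 0 this reduces to a purely local ring lattice; with omega = 1 the neighbors are essentially random, spanning the $(\gamma, \omega)$ range the abstract sweeps over.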

ACS Style

David Dominguez; Kostadin Koroutchev; Eduardo Serrano; Francisco B. Rodriguez. Dynamical Neural Network: Information and Topology. 2005, 1.

AMA Style

David Dominguez, Kostadin Koroutchev, Eduardo Serrano, Francisco B. Rodriguez. Dynamical Neural Network: Information and Topology. 2005:1.

Chicago/Turabian Style

David Dominguez; Kostadin Koroutchev; Eduardo Serrano; Francisco B. Rodriguez. 2005. "Dynamical Neural Network: Information and Topology." 1.

Journal article
Published: 01 November 2000 in Physica A: Statistical Mechanics and its Applications
ACS Style

D Bollé; David Renato Domínguez Carreta. Mutual information and self-control of a fully-connected low-activity neural network. Physica A: Statistical Mechanics and its Applications 2000, 286, 401-416.

AMA Style

D Bollé, David Renato Domínguez Carreta. Mutual information and self-control of a fully-connected low-activity neural network. Physica A: Statistical Mechanics and its Applications. 2000;286(3-4):401-416.

Chicago/Turabian Style

D Bollé; David Renato Domínguez Carreta. 2000. "Mutual information and self-control of a fully-connected low-activity neural network." Physica A: Statistical Mechanics and its Applications 286, no. 3-4: 401-416.

Article
Published: 01 April 1989 in Physical Review B

The q-state Potts model in a random field with a discrete distribution of statistically independent fields ordered along any of the q states is studied in mean-field theory. Detailed phase diagrams are obtained in a two-component order-parameter theory for q=3 and a one-component theory for general q. Lines of critical and tricritical points are found in the first case and lines of critical points in the second one, in the presence of a sufficiently large, constant uniform field.

ACS Style

José F. Fontanari; W. K. Theumann; David R. C. Dominguez. Potts model in a random field. Physical Review B 1989, 39, 7132-7139.

AMA Style

José F. Fontanari, W. K. Theumann, David R. C. Dominguez. Potts model in a random field. Physical Review B. 1989;39(10):7132-7139.

Chicago/Turabian Style

José F. Fontanari; W. K. Theumann; David R. C. Dominguez. 1989. "Potts model in a random field." Physical Review B 39, no. 10: 7132-7139.