Neural Networks - AI III


Restricted Boltzmann Machines for feature extraction: A stopping criterion using Hamming Distance

Restricted Boltzmann Machines (RBMs) are stochastic neural networks capable of learning a probability distribution over their inputs. This characteristic makes them useful in many different and complex tasks, the most popular of which are dimensionality reduction, feature learning, classification and collaborative filtering. Nowadays, RBMs have gained much interest and are studied in many variants and scientific fields, using multiple types of data.
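As an illustrative sketch only (not the paper's implementation), the fragment below trains a tiny RBM with one-step contrastive divergence (CD-1) and stops when the binarised hidden codes of the training data stop changing between epochs, measured by Hamming distance. The network sizes, learning rate and zero-distance stopping threshold are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: 8 visible units, two repeated patterns (assumed example data).
data = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
                 [0, 0, 0, 0, 1, 1, 1, 1]] * 10, dtype=float)

n_vis, n_hid = 8, 4
W = rng.normal(0, 0.1, (n_vis, n_hid))
b_v = np.zeros(n_vis)   # visible bias
b_h = np.zeros(n_hid)   # hidden bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_probs(v):
    return sigmoid(v @ W + b_h)

lr = 0.1
prev_codes = None
for epoch in range(50):
    # CD-1: one Gibbs step starting from the data.
    ph = hidden_probs(data)
    h = (rng.random(ph.shape) < ph).astype(float)
    pv = sigmoid(h @ W.T + b_v)
    ph2 = hidden_probs(pv)
    W += lr * (data.T @ ph - pv.T @ ph2) / len(data)
    b_v += lr * (data - pv).mean(axis=0)
    b_h += lr * (ph - ph2).mean(axis=0)

    # Binarised hidden codes (the extracted features) for the whole data set.
    codes = (hidden_probs(data) > 0.5).astype(int)
    if prev_codes is not None:
        # Hamming distance between this epoch's codes and the previous ones.
        hamming = int(np.sum(codes != prev_codes))
        if hamming == 0:   # codes have stabilised -> stop training
            break
    prev_codes = codes
```

The idea of the criterion, as the title suggests, is that once the binary feature representations no longer change between epochs, further training adds little; the exact threshold and averaging scheme used in the paper may differ.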

APPLICATION OF DEEP LEARNING AND CHAOS THEORY FOR LOAD FORECASTING IN GREECE

Decision making and operation of the power grid are directly related to the electrical load, and consequently its accurate prediction is of major importance. However, the electric load is considered a complex signal due to the non-linear and stochastic behavior of consumers. Despite extensive research in this area, more accurate forecasting models are still needed. In this article, a novel technique that combines deep learning and chaos theory is proposed for short-term electric load forecasting in Greece.
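A common way chaos theory enters such forecasting pipelines is phase-space reconstruction: the scalar load series is turned into delay vectors that can then feed a deep learning model. The sketch below shows this delay embedding on a synthetic load-like signal; the embedding dimension, lag and the signal itself are assumptions for illustration, not the paper's choices.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Phase-space reconstruction: row i is the delay vector
    [x(i), x(i + tau), ..., x(i + (dim - 1) * tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n]
                            for i in range(dim)])

# Synthetic "load-like" signal: a daily cycle plus noise (illustrative only).
t = np.arange(0, 24 * 14, 1.0)   # two weeks of hourly samples
load = (50 + 10 * np.sin(2 * np.pi * t / 24)
        + np.random.default_rng(1).normal(0, 1, t.size))

dim, tau, horizon = 3, 24, 1      # assumed embedding parameters
X = delay_embed(load, dim, tau)   # candidate model inputs
y = load[(dim - 1) * tau + horizon:]   # one-step-ahead targets
X = X[:len(y)]                    # align inputs with targets
```

The resulting `(X, y)` pairs could then be passed to any regressor, e.g. a neural network, for short-term prediction; choosing `dim` and `tau` from the data (e.g. via false nearest neighbours and mutual information) is the usual chaos-theoretic refinement.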

Agentpy - Agent-based modeling in Python

Numerous modeling and simulation tools have been developed to support the development of agent-based models (ABMs) [1]. Recent applications often require high complexity, including large numbers of agents and simulation steps, multiple environments, parameter sampling, Monte Carlo simulations, and data analysis. Existing simulation frameworks that support such complexity are arguably not as approachable and easy to use as traditional tools like NetLogo.
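To make the ABM terminology concrete, here is a minimal framework-free sketch of an agent-based model (a simple wealth-transfer toy model); the class and function names are hypothetical and do not reflect the agentpy API.

```python
import random

class Agent:
    """Minimal agent holding one unit of wealth (hypothetical toy model)."""
    def __init__(self):
        self.wealth = 1

def step(agents, rng):
    # One simulation step: each agent with wealth gives one unit
    # to a randomly chosen partner.
    for a in agents:
        if a.wealth > 0:
            partner = rng.choice(agents)
            a.wealth -= 1
            partner.wealth += 1

def run(n_agents=100, n_steps=50, seed=0):
    rng = random.Random(seed)
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_steps):
        step(agents, rng)
    return agents

agents = run()
total = sum(a.wealth for a in agents)   # total wealth is conserved
```

A framework like agentpy wraps exactly this pattern (agent setup, per-step behaviour, repeated runs) and adds the parameter sampling, Monte Carlo repetition and data analysis mentioned above.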

Teaching Recurrent Neural Networks to Modify Chaotic Memories by Example

The ability to store and manipulate information is a hallmark of computational systems. Recent efforts have made progress in modeling the representation and recall of information in neural systems. However, precisely how neural systems learn to modify these representations remains far from understood. Here we drive a recurrent neural network (RNN) with examples of translated, linearly transformed, or pre-bifurcated time series from a chaotic Lorenz system, alongside an additional control signal c that changes value for each example.
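For reference, the chaotic Lorenz system mentioned above and a "translated" example of the kind used as a training target can be generated as follows; the simple Euler integration, step size and shift amount are assumptions for illustration.

```python
import numpy as np

def lorenz_series(n_steps=5000, dt=0.01,
                  sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system
    dx/dt = sigma (y - x), dy/dt = x (rho - z) - y, dz/dt = x y - beta z
    with simple Euler steps (illustrative, not the paper's integrator)."""
    xyz = np.array([1.0, 1.0, 1.0])
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        x, y, z = xyz
        dxyz = np.array([sigma * (y - x),
                         x * (rho - z) - y,
                         x * y - beta * z])
        xyz = xyz + dt * dxyz
        out[i] = xyz
    return out

traj = lorenz_series()
# A translated example: the same attractor shifted along x. In the paper's
# setup, a control signal c would flag which transformation was applied.
shifted = traj + np.array([5.0, 0.0, 0.0])
```

Such trajectories (original, translated, linearly transformed, or taken from a pre-bifurcation parameter regime) would serve as the example time series driving the RNN.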

Contact

For information please contact:
ccs2020conf@gmail.com