User profiles for L. Ambrogioni
Luca Ambrogioni, Donders Institute for Brain, Cognition and Behaviour. Verified email at donders.ru.nl. Cited by 812
Generative adversarial networks for reconstructing natural images from brain activity
… From conv1 we also collected l_{f,m} on negative feature activations, using the threshold 1.0 …
… the terms with a weight: (4) loss = l_Ω = λ_px · l_px + λ_{f,m} · l_{f,m}, where we chose λ_px = 100.0 …
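The weighted loss in Eq. (4) of the snippet above can be sketched as a simple weighted sum of a pixel loss and a feature loss. λ_px = 100.0 comes from the excerpt; the feature-loss weight and the toy loss values are hypothetical placeholders, since the snippet elides them.

```python
import numpy as np

def combined_loss(l_px, l_fm, lam_px=100.0, lam_fm=1.0):
    """Weighted sum of a pixel loss and a feature loss, as in Eq. (4).

    lam_px = 100.0 follows the excerpt; lam_fm is an assumed
    placeholder, as its value is elided in the snippet.
    """
    return lam_px * l_px + lam_fm * l_fm

# toy example: pixel MSE between a reconstruction and its target
recon = np.zeros((4, 4))
target = np.ones((4, 4))
l_px = float(np.mean((recon - target) ** 2))  # = 1.0
l_fm = 0.5                                    # stand-in feature loss
total = combined_loss(l_px, l_fm)             # 100.0 * 1.0 + 1.0 * 0.5 = 100.5
```

The large λ_px simply makes the pixel term dominate the optimization relative to the feature term.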
Theta oscillations locked to intended actions rhythmically modulate perception
DOI: 10.7554/eLife.25618.001. Ongoing brain oscillations are known to influence perception, and
to be reset by exogenous stimulations. Voluntary action is also accompanied by prominent …
Spontaneous symmetry breaking in generative diffusion models
G Raya, L Ambrogioni - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Generative diffusion models have recently emerged as a leading approach for generating
high-dimensional data. In this paper, we show that the dynamics of these models exhibit a …
Neural dynamics of perceptual inference and its reversal during imagery
After the presentation of a visual stimulus, neural processing cascades from low-level
sensory areas to increasingly abstract representations in higher-level areas. It is often …
Hyperrealistic neural decoding for reconstructing faces from fMRI activations via the GAN latent space
Neural decoding can be conceptualized as the problem of mapping brain responses back
to sensory stimuli via a feature space. We introduce (i) a novel experimental paradigm that …
Wasserstein variational inference
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian
inference based on optimal transport theory. Wasserstein variational inference uses a new …
The kernel mixture network: A nonparametric method for conditional density estimation of continuous random variables
This paper introduces the kernel mixture network, a new method for nonparametric estimation
of conditional probability densities using neural networks. We model arbitrarily complex …
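The kernel-mixture idea in the abstract above can be illustrated with a minimal sketch: a density is modeled as a weighted mixture of fixed Gaussian kernels, where in the actual method a neural network outputs the weights as a function of the conditioning input x. The function name, bandwidth, and kernel placement here are assumptions for illustration only.

```python
import numpy as np

def kernel_mixture_density(y, centers, weights, bandwidth=0.5):
    """Evaluate p(y) as a weighted mixture of Gaussian kernels.

    In a kernel mixture network the `weights` would be produced by a
    neural network conditioned on an input x, yielding p(y | x); here
    they are fixed constants for illustration.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the mixture integrates to 1
    centers = np.asarray(centers, dtype=float)
    z = (y - centers) / bandwidth
    kernels = np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return float(np.sum(weights * kernels))

# two equal-weight kernels at -1 and +1: the density is symmetric about 0
left = kernel_mixture_density(-0.3, [-1.0, 1.0], [1.0, 1.0])
right = kernel_mixture_density(0.3, [-1.0, 1.0], [1.0, 1.0])
```

Fixing the kernels and learning only the weights is what makes the estimate nonparametric yet cheap to evaluate.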
Gait-prop: A biologically plausible learning rule derived from backpropagation of error
…, MA van Gerven, L Ambrogioni - Advances in Neural …, 2020 - proceedings.neurips.cc
… Given an input-target pair (y0, tL) we can define a quadratic loss function l as … − 1 is equal to the number of units in the subsequent layer indexed l. We can therefore describe the (l − 1)-th …
End-to-end neural system identification with neural information flow
… We further regularize the input using an ℓ1 loss on all components (pixel values) of I. The ℓ1 leads to the suppression of noise in the image, which otherwise easily occurs in this …
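The ℓ1 image regularizer described in this snippet can be sketched as below; the penalty weight `lam` is a hypothetical placeholder, since the excerpt does not give its value.

```python
import numpy as np

def l1_image_penalty(image, lam=1.0):
    """l1 penalty on all components (pixel values) of an image I,
    as described in the excerpt. Minimizing this term shrinks small
    noisy pixel values toward zero. `lam` is an assumed weight.
    """
    return lam * float(np.sum(np.abs(image)))

img = np.array([[1.0, -2.0], [0.0, 0.5]])
penalty = l1_image_penalty(img)  # 1.0 + 2.0 + 0.0 + 0.5 = 3.5
```

Unlike an ℓ2 penalty, the ℓ1 norm drives many pixels exactly to zero, which is why it suppresses the speckle-like noise mentioned in the abstract.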
Structurally-informed Bayesian functional connectivity analysis
… The employed ℓ1 regularizer encourages sparse precision matrices as determined by the regularization parameter λ. This maximization problem can be solved using established …
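The ℓ1-penalized maximization mentioned above has the standard graphical-lasso form: maximize the Gaussian log-likelihood of the precision matrix minus an ℓ1 penalty weighted by λ. The sketch below evaluates that objective; it is an illustrative stand-in under that assumption, not the structurally-informed Bayesian estimator the paper develops.

```python
import numpy as np

def penalized_loglik(theta, S, lam):
    """Graphical-lasso-style objective for a precision matrix Theta:
    log det(Theta) - tr(S @ Theta) - lam * sum(|Theta_ij|),
    where S is the sample covariance and lam is the regularization
    parameter λ from the excerpt. A sketch of the classical
    maximum-likelihood form, not the paper's Bayesian model.
    """
    sign, logdet = np.linalg.slogdet(theta)
    if sign <= 0:
        return -np.inf  # Theta must be positive definite
    return logdet - np.trace(S @ theta) - lam * np.sum(np.abs(theta))

# toy check: identity precision against identity sample covariance
S = np.eye(3)
theta = np.eye(3)
obj = penalized_loglik(theta, S, lam=0.5)  # 0 - 3 - 0.5*3 = -4.5
```

Larger λ pushes off-diagonal entries of the maximizer to exact zeros, which is what "sparse precision matrices as determined by λ" refers to.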