User profiles for L. Ambrogioni

Luca Ambrogioni

Donders Institute for Brain, Cognition and Behaviour
Verified email at donders.ru.nl
Cited by 812

Generative adversarial networks for reconstructing natural images from brain activity

K Seeliger, U Güçlü, L Ambrogioni, Y Güçlütürk… - NeuroImage, 2018 - Elsevier
… From conv1 we also collected l_{f,m} on negative feature activations, using the threshold 1.0
… the terms with a weight: (4) loss = l_Ω = λ_px · l_px + λ_{f,m} · l_{f,m}, where we chose λ_px = 100.0 …
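The weighted loss in the snippet above can be sketched as follows; a minimal illustration assuming mean-squared pixel and feature losses (the function names and the value of λ_{f,m} are assumptions, only λ_px = 100.0 comes from the excerpt):

```python
import numpy as np

def combined_loss(recon, target, feat_recon, feat_target,
                  lambda_px=100.0, lambda_fm=1.0):
    """Weighted sum of a pixel-space loss and a feature-matching loss,
    mirroring loss = lambda_px * l_px + lambda_fm * l_fm from the snippet.
    lambda_fm=1.0 and the MSE form of each term are assumptions."""
    l_px = np.mean((recon - target) ** 2)            # pixel-space loss l_px
    l_fm = np.mean((feat_recon - feat_target) ** 2)  # feature loss l_{f,m}
    return lambda_px * l_px + lambda_fm * l_fm
```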

Theta oscillations locked to intended actions rhythmically modulate perception

A Tomassini, L Ambrogioni, WP Medendorp, E Maris - eLife, 2017 - elifesciences.org
DOI: 10.7554/eLife.25618.001. Ongoing brain oscillations are known to influence perception, and
to be reset by exogenous stimulations. Voluntary action is also accompanied by prominent …

Spontaneous symmetry breaking in generative diffusion models

G Raya, L Ambrogioni - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Generative diffusion models have recently emerged as a leading approach for generating
high-dimensional data. In this paper, we show that the dynamics of these models exhibit a …

Neural dynamics of perceptual inference and its reversal during imagery

N Dijkstra, L Ambrogioni, D Vidaurre, M van Gerven - eLife, 2020 - elifesciences.org
After the presentation of a visual stimulus, neural processing cascades from low-level
sensory areas to increasingly abstract representations in higher-level areas. It is often …

Hyperrealistic neural decoding for reconstructing faces from fMRI activations via the GAN latent space

T Dado, Y Güçlütürk, L Ambrogioni, G Ras, S Bosch… - Scientific reports, 2022 - nature.com
Neural decoding can be conceptualized as the problem of mapping brain responses back
to sensory stimuli via a feature space. We introduce (i) a novel experimental paradigm that …

Wasserstein variational inference

L Ambrogioni, U Güçlü, Y Güçlütürk… - Advances in …, 2018 - proceedings.neurips.cc
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian
inference based on optimal transport theory. Wasserstein variational inference uses a new …
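The optimal-transport cost underlying this line of work can be illustrated in one dimension, where the W1 distance between two equal-size empirical samples reduces to the mean absolute difference of the sorted samples (a standard identity; the paper's construction is far more general than this sketch):

```python
import numpy as np

def wasserstein_1d(x, y):
    """1-D Wasserstein-1 distance between two equal-size empirical samples.
    For sorted samples the optimal transport plan pairs order statistics,
    so W1 is just the mean absolute difference after sorting."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    return float(np.mean(np.abs(x - y)))
```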

The kernel mixture network: A nonparametric method for conditional density estimation of continuous random variables

L Ambrogioni, U Güçlü, MAJ van Gerven… - arXiv preprint arXiv …, 2017 - arxiv.org
This paper introduces the kernel mixture network, a new method for nonparametric estimation
of conditional probability densities using neural networks. We model arbitrarily complex …
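A kernel-mixture conditional density of the kind described can be sketched as a weighted sum of Gaussian kernels at fixed centers; in the paper the mixture weights are produced by a neural network conditioned on the input, whereas here they are passed in directly (the bandwidth value and function names are assumptions):

```python
import numpy as np

def kmn_density(y, weights, centers, bandwidth=0.5):
    """Evaluate a mixture of Gaussian kernels at point y.
    weights: unnormalized mixture weights (network outputs in the paper);
    centers: fixed kernel centers; bandwidth: kernel scale (assumed)."""
    w = np.asarray(weights, dtype=float)
    w = w / np.sum(w)                                  # normalize mixture weights
    z = (y - np.asarray(centers, dtype=float)) / bandwidth
    k = np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return float(np.dot(w, k))                         # weighted kernel sum
```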

Gait-prop: A biologically plausible learning rule derived from backpropagation of error

…, MA van Gerven, L Ambrogioni - Advances in Neural …, 2020 - proceedings.neurips.cc
… Given an input-target pair (y_0, t_L) we can define a quadratic loss function l as … − 1 is equal
to the number of units in the subsequent layer indexed l. We can therefore describe the (l − 1)-th …

End-to-end neural system identification with neural information flow

K Seeliger, L Ambrogioni, Y Güçlütürk… - PLOS Computational …, 2021 - journals.plos.org
… We further regularize the input using an ℓ1 loss on all components (pixel values) of I.
The ℓ1 loss leads to the suppression of noise in the image, which otherwise easily occurs in this …
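The noise-suppressing effect of an ℓ1 penalty on pixel values can be illustrated with its proximal operator, soft thresholding, which shrinks every pixel toward zero and sets small (noisy) values exactly to zero; this is a generic illustration of ℓ1 regularization, not the paper's optimization procedure, and the threshold value is an assumption:

```python
import numpy as np

def soft_threshold(image, lam):
    """Proximal operator of lam * ||image||_1: shrink each pixel toward
    zero by lam. Pixels with magnitude below lam become exactly zero,
    which is why an l1 term on pixel values suppresses small noise."""
    image = np.asarray(image, dtype=float)
    return np.sign(image) * np.maximum(np.abs(image) - lam, 0.0)
```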

Structurally-informed Bayesian functional connectivity analysis

M Hinne, L Ambrogioni, RJ Janssen, T Heskes… - NeuroImage, 2014 - Elsevier
… The employed ℓ1 regularizer encourages sparse precision matrices as determined by
the regularization parameter λ. This maximization problem can be solved using established …
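The objective behind this kind of ℓ1-penalized precision estimation (the graphical lasso family) can be sketched as the penalized Gaussian log-likelihood below; penalizing only the off-diagonal entries is a common convention, and the exact form used in the paper may differ:

```python
import numpy as np

def penalized_loglik(precision, emp_cov, lam):
    """log det(Theta) - tr(S Theta) - lam * sum_{i != j} |Theta_ij|,
    the l1-penalized Gaussian log-likelihood maximized in sparse
    precision-matrix estimation (off-diagonal-only penalty assumed)."""
    theta = np.asarray(precision, dtype=float)
    s = np.asarray(emp_cov, dtype=float)
    _, logdet = np.linalg.slogdet(theta)               # log-determinant term
    offdiag_l1 = np.sum(np.abs(theta)) - np.trace(np.abs(theta))
    return logdet - np.trace(s @ theta) - lam * offdiag_l1
```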