Perception is not determined solely by the light that hits our eyes. Rather, what we perceive is strongly shaped by our prior knowledge of the world. When incoming sensory signals match our expectations, they become easier to process; conversely, when sensory signals are noisy, expectations can bias our perception. I will present work from fMRI and MEG studies, using multivariate decoding techniques, demonstrating that prior expectations influence the processing of sensory signals from the earliest cortical stages onward. This work suggests that perception is fundamentally an inferential process, combining bottom-up sensory signals with top-down expectations. I will highlight two recent endeavours to reveal the neural mechanisms underlying this process: one targeted at the laminar level of V1, and the other at identifying the top-down sources of expectation signals.