The Architect Within: How Our Brains Construct Perceptual Reality

Our intuitive understanding of the world often paints a picture of passive observation: our senses act as clear windows, faithfully transmitting an objective reality directly to our consciousness. However, decades of research in neuroscience and psychology reveal a far more intricate and fascinating truth. The reality we experience is not a direct reflection of the external world, but rather a sophisticated, dynamic construction—an elaborate “controlled hallucination” meticulously crafted by our brains.1 This article delves into the remarkable processes by which our brains act as tireless architects, gathering raw sensory materials and transforming them into the rich, coherent, and deeply personal tapestry of our perceived reality. Far from being passive receivers, our brains are active interpreters and predictors, constantly working to make sense of an ambiguous and ever-changing environment.2 Understanding this constructive process challenges our most fundamental assumptions about experience and has profound implications for fields ranging from philosophy to mental health and artificial intelligence.

The very definition of perception, as the set of processes that “perform computations on sensory data to construct and transform representations of the external environment, acquire information from, and make predictions about, the external world, and guide action,” underscores the brain’s active role.4 If perception were merely a passive recording, the language used would involve terms like “reflecting” or “registering.” Instead, the emphasis on construction, transformation, and prediction highlights that the brain is not simply receiving a pre-formed reality. This active construction is essential because the sensory information reaching us is often incomplete, noisy, or ambiguous.2 The brain, therefore, must engage in complex computations to interpret these signals, fill in missing information, and generate a stable and meaningful experience. This necessity for active interpretation is evident even at fundamental physical levels. For instance, light travels significantly faster than sound.6 If the brain merely registered these signals as they arrived, everyday events involving both light and sound would feel subtly disjointed. To create a “cohesive reality,” the brain extrapolates and adjusts for small temporal discrepancies, effectively predicting and aligning these sensory inputs.6 Only when the gap grows too large to bridge does the seam show, as when a distant lightning strike is seen well before its thunder is heard. This indicates that even before complex cognitive interpretations, the brain is actively timing, integrating, and constructing a unified experience, a process sometimes referred to as temporal binding. This foundational understanding—that our brains create our reality—sets the stage for exploring the intricate mechanisms behind this daily marvel.
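
A rough calculation shows the scale of the timing problem (a back-of-the-envelope sketch: the speeds are standard approximate physical constants, and the function below is ours, not drawn from any cited source):

```python
# Back-of-the-envelope sketch of the audio-visual arrival gap the brain
# must reconcile. Speeds are standard approximate physical constants.
SPEED_OF_SOUND_M_S = 343.0           # in air at roughly 20 °C
SPEED_OF_LIGHT_M_S = 299_792_458.0

def audio_visual_lag_s(distance_m: float) -> float:
    """Seconds by which sound lags light from the same event."""
    return distance_m / SPEED_OF_SOUND_M_S - distance_m / SPEED_OF_LIGHT_M_S

for d in (10, 100, 1000):
    print(f"{d:>5} m: sound lags by {audio_visual_lag_s(d) * 1000:.0f} ms")
# 10 m ≈ 29 ms (small enough to be bound into one perceived event)
# 1000 m ≈ 2915 ms (far too large: we see the flash, then hear the thunder)
```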

The Sensory Gateway: Gathering Raw Materials

The brain’s construction of reality begins with the collection of raw data from the environment through our specialized sensory organs. These organs act as gateways, capturing various forms of physical energy and chemical information. We are most familiar with the traditional five senses: sight (vision), hearing (audition), taste (gustation), smell (olfaction), and touch (somatosensation). However, our sensory repertoire extends further to include the vestibular sense (balance and spatial orientation, detected by the inner ear), proprioception (the sense of our body’s position and movement in space), and interoception (the perception of internal bodily states like hunger, thirst, and pain).7 Each sensory system is exquisitely designed to detect specific types of stimuli from our internal and external worlds.7

At the heart of each sensory organ are sensory receptors, specialized cells or structures that respond to particular forms of energy. These include:

  • Photoreceptors (rods and cones in the eyes) that respond to light.7 Rods are sensitive to dim light, while cones are responsible for color vision and fine detail.9
  • Mechanoreceptors (in the skin, ears, muscles) that respond to mechanical pressure, vibration, or distortion.7
  • Chemoreceptors (in the nose and on the tongue) that respond to chemical stimuli, giving rise to smells and tastes.7 The olfactory system, for example, possesses around 350 subtypes of receptors, each responsive to a limited array of odors.9
  • Thermoreceptors (in the skin) that detect changes in temperature.7
  • Nociceptors (throughout the body) that signal pain in response to potentially damaging stimuli.8

The diversity and specialization of these receptors are crucial. They deconstruct the complex external world into manageable informational components. The brain doesn’t receive a holistic “picture” of the environment; rather, it receives discrete streams of data about specific features. This initial filtering and categorization is a fundamental step in information processing, setting the stage for the brain’s subsequent task of integrating these disparate streams into a coherent whole.

Once a sensory receptor is stimulated by its adequate stimulus (e.g., light striking the retina), a critical process called transduction occurs. This is the conversion of physical energy into electrical and chemical signals—nerve impulses or action potentials—that the nervous system can understand and process.8 For instance, stimulation of a receptor cell can trigger the release of a protein, initiating a biochemical cascade that generates electrical charges in a neuron, causing it to fire.9 This transformation is fundamental: the physical world, in its original form, never directly enters the brain. Instead, the brain works with an encoded representation, a neural language of electrical pulses and chemical messengers.
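
As a loose illustration of what this encoding means, the toy sketch below maps stimulus intensity onto a firing rate through a saturating response function. It is a deliberate cartoon, not a biophysical model, and every parameter in it is invented for illustration:

```python
def firing_rate_hz(intensity: float,
                   max_rate_hz: float = 100.0,
                   half_saturation: float = 1.0) -> float:
    """Toy transduction: map stimulus intensity to a saturating firing rate.
    The physical quantity itself never enters the brain; only a code like
    this does."""
    return max_rate_hz * intensity / (intensity + half_saturation)

for intensity in (0.1, 1.0, 10.0):
    print(f"intensity {intensity:>4}: ~{firing_rate_hz(intensity):.0f} spikes/s")
# intensity  0.1: ~9 spikes/s; 1.0: ~50 spikes/s; 10.0: ~91 spikes/s
```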

These neural signals then embark on a journey along sensory neurons, forming pathways to the central nervous system.7 For most senses, this journey includes a crucial stop at the thalamus, an egg-shaped structure located deep within the brain, often described as a relay station or a central switching hub.9 The thalamus receives sensory information from the eyes (vision), ears (audition), tongue (taste), and skin (touch), and directs these signals to the appropriate areas of the cerebral cortex for further processing.9 This anatomical convergence suggests that the thalamus is more than a simple relay; it likely plays a role in initially filtering, gating, or modulating sensory information based on factors like arousal and attentional state before it reaches higher cortical areas or conscious awareness.11

The sense of smell (olfaction) follows a unique path, largely bypassing the thalamus. Olfactory signals travel from receptors in the nose directly to the olfactory bulb and then to more ancient parts of the brain, including regions of the limbic system, which is heavily involved in emotion and memory.9 This direct connection helps explain the powerful and often immediate way smells can evoke vivid memories and strong emotional responses, as they access these systems with less intermediate filtering than other senses.

Table 1: Overview of Human Sensory Systems

| Sensory Modality | Primary Stimulus | Receptor Cells/Organs | Key Brain Regions for Initial Processing (Thalamic Nuclei & Primary Cortex) |
| --- | --- | --- | --- |
| Vision | Electromagnetic radiation (light) | Photoreceptors (rods and cones) in the retina (eyes) | Lateral Geniculate Nucleus (LGN) of Thalamus → Primary Visual Cortex (V1) in Occipital Lobe 9 |
| Audition | Sound waves (vibrations in air/medium) | Hair cells (mechanoreceptors) in the cochlea (inner ear) | Medial Geniculate Nucleus (MGN) of Thalamus → Primary Auditory Cortex (A1) in Temporal Lobe 9 |
| Somatosensation | Pressure, vibration, texture, temperature, pain | Mechanoreceptors, thermoreceptors, nociceptors in skin, muscles, joints | Ventral Posterior Nucleus (VPN) of Thalamus (e.g., VPL for body, VPM for face) → Primary Somatosensory Cortex (S1) in Parietal Lobe 11 |
| Olfaction (Smell) | Airborne chemical molecules | Olfactory receptor neurons in the olfactory epithelium (nasal cavity) | Olfactory Bulb → Primary Olfactory Cortex (e.g., piriform cortex, entorhinal cortex, amygdala) in Temporal Lobe (bypasses thalamus for direct cortical projection) 10 |
| Gustation (Taste) | Dissolved chemical molecules | Taste receptor cells in taste buds (tongue, palate) | Ventral Posteromedial Nucleus (VPM) of Thalamus → Primary Gustatory Cortex (Insular Lobe, Parietal Operculum) 11 |
| Vestibular Sense | Head movement, gravity, acceleration | Hair cells (mechanoreceptors) in semicircular canals, utricle, and saccule (inner ear) | Vestibular Nuclei (Brainstem) → various pathways, including to the Ventral Posterior Nucleus (VPN) of Thalamus → multiple cortical areas, including the parietal-insular vestibular cortex 8 |
| Proprioception | Body/limb position and movement | Mechanoreceptors in muscles (muscle spindles), tendons (Golgi tendon organs), and joints | Spinal cord pathways → Ventral Posterior Lateral Nucleus (VPL) of Thalamus → Primary Somatosensory Cortex (S1) in Parietal Lobe (integrated with touch) 8 |
| Interoception | Internal physiological states (e.g., visceral sensations, pain, temperature, itch, hunger, thirst) | Various interoceptors in internal organs and tissues | Pathways via spinal cord and brainstem → various thalamic nuclei (including VPM/VPL, MDN) → Insular Cortex, Anterior Cingulate Cortex (ACC), Somatosensory Cortex 8 |

The Brain’s Workshop: Interpreting Sensory Blueprints

Once sensory signals, encoded as neural impulses, have journeyed from the periphery and, for most senses, passed through the thalamic checkpoint, they arrive at the cerebral cortex—the brain’s outer layer responsible for higher-level processing. Here, in what can be likened to a sophisticated workshop, these “sensory blueprints” are meticulously interpreted and transformed into meaningful perceptions.

This interpretation begins in primary sensory cortices, which are distinct regions of the cortex specialized for the initial processing of information from a single sensory modality.12

  • The Primary Visual Cortex (V1), located in the occipital lobe, is the first cortical station for visual information. It begins to analyze basic features like edges, orientations, colors, and motion.9
  • The Primary Auditory Cortex (A1), situated in the temporal lobe, processes sounds, deconstructing them into components like pitch, loudness, and temporal patterns.9
  • The Primary Somatosensory Cortex (S1), found in the parietal lobe just behind the central sulcus, receives information about touch, temperature, pain, pressure, and body position.15 A fascinating feature of S1 is its somatotopic organization, often visualized as the “sensory homunculus,” where different body parts are represented in proportion to their sensory acuity rather than their physical size.15 Adjacent to S1 is the secondary somatosensory cortex (S2), which is involved in further processing, such as identifying the shape and texture of an object by touch and integrating this with spatial and tactile memory.16
  • The Primary Gustatory Cortex, located in the insular lobe and parts of the parietal lobe, is responsible for processing taste sensations.11
  • The Primary Olfactory Cortex, found in the temporal lobe near emotion and memory centers like the amygdala and parahippocampal gyrus, processes smells.15

The existence of these specialized primary sensory cortices suggests that the brain first deconstructs complex sensory experiences into their fundamental components. For example, the visual system initially breaks down a scene into lines, edges, and colors, while the auditory system analyzes sounds into different frequencies and intensities. This modular design allows for highly specialized and efficient processing of different types of information. However, this initial deconstruction necessitates subsequent stages of integration to form a holistic and unified percept.

Information processing within these sensory pathways is often hierarchical. Basic features detected in primary areas are relayed to association cortices for more complex analysis.11 For instance, in visual processing, information flows from V1 to areas that specialize in recognizing shapes, objects, and eventually, complex scenes like faces. The processing of faces, for example, involves an initial analysis of local features in areas like the occipital face area (OFA), followed by more holistic processing in regions such as the fusiform face area (FFA).21

Crucially, the brain does not process sensory information in isolated streams. Instead, it engages in multisensory integration, combining inputs from different modalities to create a richer, more robust, and often more accurate perception of the environment.12 This integration is not merely additive; it can be synergistic, leading to perceptual outcomes that are different from, or superior to, what would be achieved by any single sense acting alone. For example, the ventriloquism effect, where the visual information of a puppeteer’s moving mouth can “capture” the perceived location of their voice, illustrates how visual cues can dominate auditory localization.12 Similarly, effective hand-eye coordination relies on the tight integration of visual and somatosensory (proprioceptive and tactile) information.12 This ability to use redundant and complementary information from multiple senses helps the brain resolve ambiguity and construct a coherent and actionable model of the world.
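
One influential way to formalize this synergy is reliability-weighted cue combination, in the spirit of maximum-likelihood integration models of perception. The sketch below is illustrative only; the variances are invented values chosen to mimic the ventriloquism case, where vision localizes far more precisely than audition:

```python
def fuse(visual_estimate, visual_var, auditory_estimate, auditory_var):
    """Combine two noisy estimates of the same quantity, weighting each by
    its reliability (inverse variance); fused variance beats either input."""
    w_visual = (1 / visual_var) / (1 / visual_var + 1 / auditory_var)
    fused = w_visual * visual_estimate + (1 - w_visual) * auditory_estimate
    fused_var = 1 / (1 / visual_var + 1 / auditory_var)
    return fused, fused_var

# Vision localizes the mouth precisely (low variance); audition localizes
# the voice coarsely (high variance). The fused location sits near the mouth.
location, variance = fuse(visual_estimate=0.0, visual_var=1.0,
                          auditory_estimate=10.0, auditory_var=25.0)
print(round(location, 2), round(variance, 2))  # 0.38 0.96
```

Because vision is the more reliable cue here, the fused estimate lands almost on the visual location, and the combined uncertainty is lower than that of either sense alone.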

From these primary and association cortices, sensory information engages even wider and more diverse brain networks. For instance, signals from the olfactory cortex have direct connections to the limbic system, which can imbue a smell with intense emotional significance.9 Similarly, circuits that store memories are activated to give meaning to what we see and hear, transforming raw sensations into recognized objects, familiar voices, or meaningful events.9 Thus, the output of the primary sensory cortices—the basic features—serves as the input for these higher-order areas, establishing a causal chain from simple sensory processing to complex, meaningful, and emotionally colored perception. The brain’s workshop, therefore, is not just passively assembling blueprints; it is actively interpreting them, adding layers of meaning derived from an individual’s unique history and internal state, reinforcing the deeply personal and subjective nature of perception.

Building from the Ground Up & The Top Down: A Two-Way Construction

The brain’s interpretation of sensory blueprints relies on two fundamental modes of processing that operate in constant dialogue: bottom-up processing and top-down processing. Understanding their interplay is crucial to appreciating how perception is actively constructed.

Bottom-up processing, also known as data-driven processing, refers to perception that is built directly from incoming sensory information.22 It begins with the stimulation of sensory receptors and involves the transmission of this raw data to the brain, where it is analyzed in a hierarchical fashion, starting with basic features and gradually assembling them into a more complex percept.24 In this mode, perception is primarily determined by the physical characteristics of the stimulus itself, without significant influence from prior knowledge, expectations, or context.22 An example would be encountering an entirely unfamiliar object for the first time; the brain would attempt to make sense of its shape, color, and texture based purely on the visual input it receives.26 Similarly, if one were to look at an image composed of random dots, like the classic ambiguous image of a Dalmatian dog, the initial experience of seeing just spots, before any recognition of the dog occurs, is a result of bottom-up processing alone.27

In contrast, top-down processing, or conceptually-driven processing, involves the influence of an individual’s existing knowledge, beliefs, past experiences, expectations, emotions, and the current context on the interpretation of sensory data.9 Higher-level cognitive processes actively shape how incoming sensory information is perceived, allowing the brain to make sense of ambiguous, incomplete, or noisy signals by “filling in the gaps” based on what is already known or anticipated.28 As succinctly put, “we see and hear what we think we will”.9 Everyday examples abound: our ability to effortlessly read misspelled words in a familiar sentence, or to recognize a friend’s face in a crowded and visually complex environment, relies heavily on top-down processing.30 A powerful illustration is the “B” or “13” illusion, where an identical ambiguous shape is perceived as the letter “B” when surrounded by other letters, but as the number “13” when surrounded by numbers.23 The context provided by the surrounding characters creates a strong expectation that guides the interpretation of the ambiguous shape.

Perception in the real world is rarely a purely bottom-up or purely top-down affair. Instead, it is typically a dynamic interplay between these two streams of influence.23 Sensory data arriving from the environment (bottom-up) provides the foundational evidence, while our cognitive framework (top-down) shapes how this evidence is interpreted and organized into a meaningful experience. Hierarchical processing in the brain reflects this duality, with ascending (bottom-up) neural pathways transmitting sensory information from the periphery towards higher cortical areas, and descending (top-down) pathways carrying modulatory signals from higher cognitive centers back to earlier sensory processing stages.11 These top-down operations are particularly crucial when the sensory input is incomplete or ambiguous, allowing the brain to make educated guesses and construct a coherent percept.21

The very necessity of top-down processing implies that raw sensory data, processed in a purely bottom-up fashion, is often insufficient on its own to create a meaningful or actionable perception of reality. The environment is complex, and sensory input can be noisy or fragmented.2 If our brains relied solely on bottom-up analysis, our perceptual world would likely be a confusing jumble of disconnected sensations, and we would struggle to interpret novel variations of familiar objects or navigate rapidly changing situations. Top-down mechanisms, by leveraging past learning and contextual cues, enable rapid, efficient, and generally adaptive interpretations.30

This interaction can be conceptualized as a continuous “dialogue” or “negotiation” within the brain. Bottom-up signals provide the “evidence” from the current sensory environment, while top-down signals offer “hypotheses,” “predictions,” or “contextual frameworks” based on prior knowledge. The brain constantly works to reconcile these two streams of information, a process fundamental to resolving ambiguity and forming a stable perception. This interplay allows for both consistency in our perceptions (due to the influence of established knowledge) and flexibility (the ability to update our interpretations based on new sensory information).

Furthermore, top-down influences can causally alter how bottom-up information is processed, even at the very early stages of sensory pathways. It’s not just that our expectations influence how we interpret sensations after they have been fully processed; our mindset, attention, and emotional state can actually modify the firing of neurons in primary sensory cortices.9 This suggests that top-down signals can “prime” or “tune” sensory areas, making them more or less receptive to certain types of incoming bottom-up information. This means the construction of our perceptual reality is even more deeply intertwined with our internal state from the earliest moments of sensory processing. This understanding is crucial for appreciating phenomena like perceptual sets (how expectations shape what we perceive) 28, the powerful influence of context 23, and the fundamental reasons why different individuals can interpret the exact same sensory event in vastly different ways. It also lays the groundwork for understanding more sophisticated models of perception, such as predictive processing.

Table 2: Bottom-Up vs. Top-Down Processing in Perception

| Feature | Bottom-Up Processing | Top-Down Processing |
| --- | --- | --- |
| Starting Point | Sensory receptors; raw sensory data 22 | Higher-level cognitive processes (knowledge, expectations, context) 22 |
| Direction of Influence | From sensory input upwards to higher brain centers 11 | From higher brain centers downwards to influence sensory interpretation 11 |
| Primary Driver | Characteristics of the physical stimulus 24 | Internal mental states (prior knowledge, beliefs, goals, context) 9 |
| Role of Prior Knowledge/Context | Minimal or no influence 24 | Significant influence; used to interpret and organize sensory data 23 |
| Processing Speed | Can be slower, as it analyzes all incoming details | Often faster and more efficient, especially in familiar situations, as it uses shortcuts and makes inferences 30 |
| Example | Perceiving an unfamiliar object based solely on its visual features 26; seeing random spots before recognizing a pattern 27 | Reading misspelled words easily 30; interpreting the “B” or “13” illusion based on surrounding characters 23 |

The Predictive Engine: How Our Brains Anticipate Reality

Emerging theories in cognitive neuroscience are revolutionizing our understanding of perception by casting the brain not merely as an interpreter of incoming data, but as a sophisticated prediction machine. This perspective suggests that the brain is constantly, and largely unconsciously, generating hypotheses about the causes of sensory signals it expects to receive, based on its vast store of prior experiences and internal models of the world.1 As stated by Lisa Feldman Barrett, “what your brain is doing, when it’s making a prediction, it’s creating a category of instances from the past…in order to predict what’s going to happen next”.2 This predictive capacity is crucial; if the brain were a purely stimulus-response organ, passively waiting for sensory information to be fully processed before acting, our interactions with the world would be far too slow and inefficient.32

The Predictive Coding framework offers a specific neurocomputational theory of how this predictive processing might be implemented.29 It posits a hierarchical organization in the brain where higher levels continuously send top-down predictions about expected sensory activity to lower levels. These lower levels then compare the incoming sensory input with these predictions. Any mismatch between the prediction and the actual input generates a prediction error signal. Crucially, according to this model, it is primarily these error signals—the unexplained or surprising components of the sensory information—that are fed forward (bottom-up) to higher levels of the processing hierarchy.29 The brain’s overarching goal is to minimize this prediction error across all levels. Perception, from this viewpoint, is essentially the brain’s “best guess” or the internal hypothesis that best explains the current sensory input by minimizing overall prediction error.29 This framework even proposes distinct populations of neurons: “expectation units” that transmit the top-down predictions and “error units” that signal the bottom-up prediction errors.29
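
A deliberately minimal sketch can make this loop concrete. The toy below tracks a single scalar feature and omits everything that makes real predictive coding rich (hierarchy, separate expectation and error units, precision-weighting), but it shows the core cycle of predict, compare, and update:

```python
def perceive(sensory_samples, belief=0.0, learning_rate=0.1):
    """Minimal predict-compare-update loop over one scalar feature."""
    for observation in sensory_samples:
        prediction = belief                 # top-down prediction sent "down"
        error = observation - prediction    # bottom-up prediction error sent "up"
        belief += learning_rate * error     # revise the model to explain away error
    return belief

# Noisy input around a true value of 5.0: the belief converges toward it,
# and as it does, the errors (the only "news" passed upward) shrink.
print(perceive([4.8, 5.3, 5.1, 4.9, 5.2] * 20))
```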

Closely related to predictive coding is the Bayesian Brain hypothesis. This influential theory conceptualizes the brain as a statistical inference engine that operates according to the principles of Bayesian probability.34 In this view, the brain continuously updates its beliefs about the world (its internal models or “priors”) in light of new sensory evidence (the “likelihood” of that evidence occurring given a particular state of the world). The combination of these priors and the likelihood results in a “posterior belief,” which forms the basis of our perceptual experience.33

A critical component of these predictive models is the concept of precision-weighting. The brain doesn’t just make predictions and calculate errors; it also estimates the reliability or “precision” of both its prior beliefs and the incoming sensory signals.29 More weight is given to information deemed more reliable. For instance, if sensory input is noisy or ambiguous (low precision), the brain might rely more heavily on its strong prior expectations. Conversely, if the sensory input is clear and unambiguous (high precision), it will have a greater influence in updating the brain’s internal models. Attention is thought to play a key role in modulating this precision, effectively increasing the “gain” on prediction errors arising from attended stimuli or features, making them more influential in shaping perception.29
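
In the simple Gaussian case, the link between Bayesian updating and precision-weighting can be written out explicitly; the following is a standard textbook identity rather than a formula drawn from any one predictive-coding paper:

```latex
% Prior belief N(mu_prior, sigma_prior^2), sensory evidence x with noise
% N(0, sigma_sens^2); the posterior mean is a precision-weighted average:
\mu_{\text{post}}
  = \frac{\pi_{\text{prior}}\,\mu_{\text{prior}} + \pi_{\text{sens}}\,x}
         {\pi_{\text{prior}} + \pi_{\text{sens}}},
\qquad
\pi_{\text{prior}} = \frac{1}{\sigma_{\text{prior}}^{2}},
\quad
\pi_{\text{sens}} = \frac{1}{\sigma_{\text{sens}}^{2}}
```

The posterior mean is simply a precision-weighted average of the prior expectation and the sensory evidence: when the input is noisy (low sensory precision), the prior dominates; when the input is crisp, the evidence does.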

This predictive framework elegantly explains a range of perceptual and neural phenomena. For example, repetition suppression, the observed decrease in neural response to repeated, and therefore predictable, stimuli, can be understood as a consequence of reduced prediction error.29 Similarly, expectation suppression occurs when stimuli that are cued or expected elicit weaker neural responses. Many perceptual illusions can also be interpreted as the brain’s predictive mechanisms attempting to reconcile ambiguous input with its internal models.29

The predictive processing perspective fundamentally inverts the classical view of perception. Traditionally, bottom-up signals were seen as carrying the primary content of our sensory world, which was then interpreted by higher brain areas. In predictive coding, however, it is the top-down predictions that embody the brain’s current hypothesis about the state of the world. The bottom-up signals primarily convey the difference between this hypothesis and reality—the prediction error.29 Thus, what ascends the sensory hierarchy is largely news of surprise or deviation from expectation. Perception, then, is the process of finding the internal model that best “explains away” this error.

This continuous cycle of prediction generation, comparison with sensory input, calculation of prediction error, and subsequent model updating forms a dynamic causal loop. This loop is the process of perception. It is constantly striving for a state of minimized prediction error, meaning our perception is not a static snapshot but an ever-evolving construction, subtly refined by each new wave of sensory information. If our perception of reality is the brain’s most successful working hypothesis about the causes of its sensory inputs, this has profound implications. It helps us understand how deeply held beliefs (which form part of our brain’s generative models) can actively shape our experience, and provides a framework for investigating conditions like psychosis, which may involve disturbances in this predictive machinery, such as the generation of aberrant prediction errors or the mis-weighting of priors and sensory evidence.34 Indeed, predictive processing models are increasingly being used to understand phenomena like symptom perception in medicine, including the powerful effects of placebos, where expectation plays a critical role.35

The Subjective Tapestry: Why Our Realities Are Personal

The brain’s construction of reality is not a uniform process yielding identical outputs for everyone. Instead, each individual’s perception is a unique, subjective tapestry woven from a multitude of interacting factors. This inherent subjectivity explains why two people can witness the same event or encounter the same stimulus yet experience it in profoundly different ways.

Prior knowledge, memories, beliefs, and expectations form a significant part of the loom upon which this tapestry is woven. Our brains are not blank slates; they carry a lifetime of experiences that shape how we interpret new sensory data.2 Prior knowledge creates expectations that can influence perception even at very early neural stages, sometimes within 80-95 milliseconds after a stimulus appears.37 Memories are not passive recordings but are actively reconstructed and edited by the brain, used to fill in missing information and provide context for current sensations.1 An example of this is contraction bias, where our memory of the average magnitude of past stimuli biases our judgment of a current stimulus.38 If repeatedly shown balls of varying sizes, our memory of an “average sized ball” will lead us to underestimate the size of a larger-than-average ball and overestimate a smaller one.
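
A toy model makes contraction bias easy to see; the pull weight below is an invented illustrative value, not an empirically fitted one:

```python
def judged_size(actual, remembered_average, pull=0.3):
    """Blend the current percept with the remembered average of past stimuli."""
    return (1 - pull) * actual + pull * remembered_average

average_ball = 10.0
print(judged_size(14.0, average_ball))  # 12.8: a larger-than-average ball is underestimated
print(judged_size(6.0, average_ball))   # 7.2:  a smaller-than-average ball is overestimated
```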

Emotions and mood are also deeply intertwined with our cognitive processes and significantly color our perception of the world.9 Emotion is not merely a reaction to what we perceive; it actively “determines how we perceive our world”.40 Emotional arousal can enhance the vividness of perceived scenes and strengthen memory encoding.9 For instance, anxious individuals might be able to detect threatening odors at lower concentrations than non-anxious individuals, reflecting heightened activity in primary olfactory centers.9 Similarly, our current mood can bias interpretation; listening to sad music might lead one to perceive the homophone “mourning” when “morning” is spoken.31

Attention acts as a crucial filter and modulator in this constructive process.42 Given the brain’s limited processing capacity, attention allows us to selectively prioritize certain sensory information while downplaying or ignoring other inputs. This selection can be voluntary (top-down attention, e.g., searching for a specific face in a crowd) or involuntary (bottom-up attention, e.g., a sudden loud noise capturing our focus).43 Beyond simply filtering, attention can alter the perceived attributes of stimuli, such as their apparent contrast or length 42, and even influence our subjective experience of time.44

Cognitive biases are systematic patterns of deviation from rational judgment that can subtly distort our perception and decision-making.45 These biases often operate automatically and unconsciously. Examples include:

  • Confirmation bias: The tendency to seek out, interpret, favor, and recall information that confirms or supports one’s pre-existing beliefs or hypotheses.45
  • Anchoring bias: Over-reliance on the first piece of information offered (the “anchor”) when making decisions.45
  • Halo effect: Where an overall impression of a person influences thoughts and feelings about their specific traits.45

These biases demonstrate how our inherent thinking patterns actively shape what we notice, how we interpret it, and ultimately, the reality we construct.

Beyond these cognitive and emotional factors, individual differences rooted in our biology and life experiences contribute significantly to perceptual variability:

  • Culture: Our cultural background shapes learned heuristics and the statistical regularities of our sensory environments, influencing everything from the perception of gender roles 47 to susceptibility to visual illusions like the Müller-Lyer illusion (explained by the “carpentered world” hypothesis, which suggests that exposure to environments rich in rectangular structures tunes our visual system in specific ways).48
  • Genetics and Age: Genetic factors contribute to individual differences in personality traits and self-perception, which can, in turn, influence how sensory information is processed and interpreted.49 Age also brings changes in perception, partly due to physiological changes in sensory organs and the brain, and partly due to the accumulation of unique experiences.50
  • Sensory Processing Differences: Individuals vary greatly in how they modulate and discriminate sensory input. Some may exhibit sensory over-responsivity (SOR), feeling overwhelmed by typical levels of stimulation, while others may show sensory under-responsivity (SUR), requiring more intense input to register a sensation. Conditions like dyspraxia involve difficulties in planning and executing motor actions based on sensory information.51 These differences lead to vastly different subjective experiences of the same physical stimuli.

The subjective tapestry of our reality is thus woven from an intricate interplay of relatively stable factors, such as long-term memories, deeply ingrained cultural norms, and genetic predispositions, alongside more transient states like our current mood, attentional focus, and recent experiences. This means that while our perception of the world has a degree of consistency, it is also highly dynamic, context-dependent, and continuously modulated. Many of these shaping influences operate pre-consciously or very early in the perceptual pipeline.9 We are often unaware of why we perceive things the way we do because the construction largely happens “behind the scenes.” This results in our subjective reality feeling direct, immediate, and “true,” even though it is the product of extensive filtering and interpretation, making it challenging to recognize our own biases or the subtle ways our internal states shape our experience of the external world.

The profound subjectivity of perception has significant implications for interpersonal understanding. If two individuals genuinely experience different perceptual realities of the same event due to their unique internal landscapes and histories, then disagreements may stem not just from differing opinions, but from fundamentally different experienced “facts”.32 This underscores the critical importance of empathy, active listening, and a willingness to consider that another’s perspective, however different from our own, may be a valid reflection of their constructed reality.

When Construction Bends: Insights from Illusions and Altered States

Perceptual illusions and various neurological phenomena serve as powerful windows into the brain’s active, constructive, and sometimes fallible, role in creating our reality. These are not mere “errors” of an imperfect system; rather, they are often logical outcomes that arise when the brain applies its normal processing rules and heuristics to ambiguous, incomplete, or specifically crafted stimuli.53 By studying these instances where perception diverges from physical reality, we gain invaluable insights into the “hidden workings” of the brain’s constructive mechanisms.

Visual illusions offer compelling demonstrations:

  • The Kanizsa Triangle is formed by three Pac-Man-like shapes oriented inwards, creating the perception of a bright, white triangle whose contours are not physically present.55 The brain “fills in” these missing edges, showcasing its tendency to organize visual information into coherent, familiar forms based on top-down expectations and predictions.29
  • The Müller-Lyer Illusion involves two lines of equal length; one with outward-pointing arrowheads appears longer than one with inward-pointing arrowheads.54 This is often attributed to the brain misapplying 3D depth cues (learned from environments with corners and perspectives, the “carpentered world” hypothesis) to a 2D image, leading to a misjudgment of size.48
  • The Ponzo Illusion occurs when two identical objects or lines placed between converging lines (like railway tracks receding into the distance) appear to be different sizes. The line perceived as “further away” due to the depth cues is judged as larger.57 Functional MRI studies have linked this illusion to shifts in the position of population receptive fields in early visual cortical areas (V1-V3).57
  • The Ebbinghaus Illusion demonstrates how the perceived size of a central circle is influenced by the size of surrounding circles: it appears smaller when surrounded by larger circles and larger when surrounded by smaller ones, highlighting the impact of relative context on size perception.56
  • Ambiguous figures like the Coffer Illusion (which can appear as sunken panels or protruding circles) or Gianni Sarcone’s Mask of Love (perceived as a single mask or two faces kissing) illustrate the brain “flipping” between equally plausible interpretations as it strives to identify objects and resolve ambiguity.53

Auditory illusions similarly reveal the constructive nature of hearing:

  • The McGurk Effect is a striking multisensory illusion where visual information (seeing lip movements for one sound, e.g., /ga/) changes the perception of an incongruent auditory speech sound (e.g., hearing /ba/), often resulting in the perception of a third sound (e.g., /da/).60 This powerfully demonstrates the brain’s integration of auditory and visual information in speech perception, sometimes leading vision to override or fuse with audition to create a novel percept.62
  • Binaural Beats are perceived when two pure tones of slightly different frequencies are presented separately to each ear. The listener perceives a third “beat” at the difference frequency, a sound not physically present in the stimulus but created by the brain’s processing.63 (A minimal stimulus sketch follows this list.)
  • The Illusory Continuity of Tones occurs when a tone that is briefly interrupted by a louder sound (like a burst of noise) is perceived as continuous, as if it played through the noise. The brain “fills in” the missing auditory information based on contextual expectations of continuity.60
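
Of these, the binaural-beat stimulus is especially easy to sketch; the frequencies and duration below are illustrative choices, and audio playback is omitted:

```python
import math

SAMPLE_RATE_HZ = 44_100

def pure_tone(freq_hz: float, duration_s: float = 2.0) -> list[float]:
    """Generate one pure sine tone as a list of samples."""
    n = int(SAMPLE_RATE_HZ * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE_HZ) for i in range(n)]

left_ear = pure_tone(220.0)   # 220 Hz to the left ear only
right_ear = pure_tone(224.0)  # 224 Hz to the right ear only
# Presented dichotically (stereo, one tone per channel), listeners report a
# 4 Hz beat (224 - 220) that exists in neither waveform; mixed into a single
# speaker, the same pair produces an ordinary physical amplitude beat instead.
```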

Tactile illusions show that our sense of touch is also a construction:

  • The Cutaneous Rabbit Illusion (or Somatosensory Saltation) involves rapid sequential taps delivered to two distinct skin locations (e.g., the wrist and then the elbow). This can create the compelling illusion of a series of taps hopping along the skin at intervening locations that were never actually stimulated.64 This demonstrates spatial and temporal mislocalization, with the brain “filling in” sensations. Remarkably, this illusion can activate the primary somatosensory cortex (S1) at a somatotopic location corresponding to the illusory touch 66, and the “rabbit” can even be perceived to hop out of the body onto an object held between the stimulated points.67
  • The Aristotle Illusion occurs when a single small object (like a pea, or even one’s own nose) is touched with two crossed fingers, often leading to the sensation of touching two separate objects.64
  • Funneling is experienced when simultaneous vibrations at two or more skin locations are perceived as a single sensation localized to a point somewhere between the actual stimulation sites.64

Beyond transient illusions, phenomena like phantom limbs offer profound insights. Following the amputation of a limb, the vast majority of individuals continue to experience vivid sensations as if the limb were still present, often including pain (phantom limb pain, or PLP).68 This phenomenon is strongly linked to cortical reorganization in the somatosensory and motor cortex. Brain areas that previously received input from and controlled the amputated limb do not remain silent; instead, they are often “invaded” or recruited by neighboring cortical regions representing other body parts (e.g., the area for the face might expand into the former hand area).68 Consequently, touching the face might evoke sensations in the phantom hand. This demonstrates profound neuroplasticity and reveals that the brain maintains a persistent representation of the body (a body schema or image) that can endure even after drastic physical changes. Peripheral nerve changes, such as the formation of neuromas and nerve hyperexcitability at the stump site, as well as central sensitization in the spinal cord, also contribute to phantom limb sensations and pain.68

The study of these illusions and neurological conditions powerfully underscores that our perception is not a direct readout of the physical world or our physical body. Instead, it is a dynamically constructed neural representation. Many illusions demonstrate the brain’s drive to create a single, coherent model of reality by integrating information across senses, even if it means altering the input from one sense to align with another, or “filling in” missing data based on prior experience and inherent processing rules. The fact that these are predictable outcomes of a normally functioning brain, rather than system failures, makes them invaluable tools for dissecting the constructive processes of perception. This has far-reaching implications, influencing our understanding of the reliability of perception in everyday life (e.g., eyewitness testimony), the subjective nature of experiences like pain, and the remarkable flexibility of our brain’s body representation.

Table 3: Illuminating Illusions: What They Reveal About Perceptual Construction

| Illusion Name | Sensory Modality(ies) Involved | Brief Description of the Illusion | Key Insight into Brain’s Perceptual Construction |
| --- | --- | --- | --- |
| Kanizsa Triangle | Vision | Perception of illusory contours forming a shape (e.g., a triangle) that is not physically present. 55 | Brain’s tendency for closure, “filling-in” missing information based on top-down expectations to create coherent forms. 29 |
| Müller-Lyer Illusion | Vision | Lines of equal length appear different due to the orientation of arrowheads at their ends. 54 | Misapplication of 3D depth cues (learned from “carpentered” environments) to 2D figures, affecting size perception. 48 |
| Ponzo Illusion | Vision | Two identical lines appear different in size when placed between converging lines, suggesting depth. 57 | Brain interprets linear perspective cues for depth, scaling perceived size based on apparent distance; linked to receptive field shifts in visual cortex. 57 |
| Ebbinghaus Illusion | Vision | Perceived size of a central circle is influenced by the size of surrounding circles. 56 | Demonstrates the impact of relative context on size perception; the brain judges size comparatively rather than absolutely. |
| McGurk Effect | Audition, Vision | Visual information of lip movements alters the perception of an auditory speech sound. 61 | Powerful multisensory integration where vision can override or fuse with auditory input to create a novel, unified percept. |
| Cutaneous Rabbit Illusion | Touch (Somatosensation) | Rapid sequential taps at two skin locations create illusory taps at intervening locations. 66 | Brain “fills in” missing tactile information based on spatiotemporal patterns, demonstrating predictive interpolation and dynamic remapping in somatosensory cortex. 66 |
| Phantom Limb Sensation | Somatosensation, Proprioception | Vivid sensation that an amputated limb is still present, often including pain. 68 | Profound cortical reorganization (neuroplasticity) in somatosensory and motor areas; the brain’s internal body map persists and can generate sensations without peripheral input. |

The Malleable Mind: Neuroplasticity and Evolving Perceptions

The brain’s construction of reality is not a static, predetermined process fixed at birth. Instead, it is profoundly shaped and reshaped throughout life by neuroplasticity—the brain’s remarkable ability to change its own structure and function in response to experience, learning, and injury.70 This inherent malleability means that our perceptual abilities, and indeed the very way we experience the world, can evolve. Key mechanisms underlying neuroplasticity include the formation of new neurons (neurogenesis, though more limited in the adult brain), the programmed death of cells (apoptosis, crucial during development), and activity-dependent synaptic plasticity—the strengthening (long-term potentiation, LTP) or weakening (long-term depression, LTD) of connections between neurons based on their activity patterns. These processes often involve physical changes in dendritic spines, the tiny protrusions on neurons where synapses are formed.70
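
The flavor of such activity-dependent rules can be captured in a toy, loosely Hebbian update in which co-varying activity strengthens a connection (LTP-like) and anti-correlated activity weakens it (LTD-like). The constants are illustrative and the rule is a cartoon, not a biophysical model:

```python
def update_weight(weight, pre_activity, post_activity,
                  learning_rate=0.01, threshold=0.5):
    """Covariance-style rule: activity that co-varies around the threshold
    strengthens the synapse; anti-correlated activity weakens it."""
    change = learning_rate * (pre_activity - threshold) * (post_activity - threshold)
    return min(max(weight + change, 0.0), 1.0)  # keep weight in a bounded range

w = 0.5
w = update_weight(w, pre_activity=1.0, post_activity=1.0)  # co-active: LTP-like, w rises
w = update_weight(w, pre_activity=1.0, post_activity=0.0)  # mismatched: LTD-like, w falls
print(w)
```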

Perceptual learning is a direct consequence of neuroplasticity, referring to the improvement in our ability to perceive and discriminate sensory information through practice and experience.72 Examples are ubiquitous: musicians develop a refined ability to distinguish subtle differences between musical notes 74, radiologists become adept at detecting faint anomalies in medical images, and individuals learning a new language gradually become better at perceiving and producing unfamiliar speech sounds. Even everyday tasks like learning to navigate a new city involve perceptual learning as we become more attuned to relevant landmarks and spatial relationships.75 Recent research indicates that visual perception can be enhanced efficiently, with minimal stimulus exposure, through brief memory reactivations. This type of learning engages distinct neural mechanisms compared to standard repetition-based learning, often involving higher-order brain regions, such as the intraparietal sulcus (IPS), that are associated with control and attentional resources.72 This suggests multiple pathways through which plasticity can refine perception.

Neuroplasticity can be broadly categorized into two types based on the nature of the experiences that drive it 71:

  1. Experience-expectant plasticity: This refers to brain development that “expects” certain universal environmental inputs during specific sensitive or critical periods. For example, the visual system anticipates exposure to patterned light and contrast borders shortly after birth to develop normally.9 If such expected input is absent during this window (e.g., if an infant’s eye is covered for an extended period), visual perception may be permanently impaired.9
  2. Experience-dependent plasticity: This involves neural changes that are unique to an individual’s specific life experiences, learning opportunities, and environment.71 This type of plasticity is not strictly tied to critical periods and allows for lifelong learning and adaptation. Language acquisition provides a classic example of the shift from experience-expectant mechanisms (young infants can discriminate phonetic sounds from all languages) to experience-dependent mechanisms (with exposure, they become specialists in their native language(s) while losing the ability to easily distinguish non-native sounds).71

The effects of sensory deprivation dramatically illustrate the brain’s plastic capabilities. When one sensory modality is lost (e.g., through blindness or deafness), brain areas typically devoted to processing that sense can be recruited by the remaining, intact senses. This phenomenon is known as cross-modal plasticity.76 For instance, in individuals who are blind, the visual cortex (occipital lobe) can become responsive to auditory or tactile stimuli.9 This reorganization is not merely a passive takeover; it can be functionally relevant. Sometimes, it leads to compensatory enhancements in the remaining senses, such as blind individuals demonstrating superior tactile discrimination thresholds or enhanced sound localization abilities.9 However, such compensation is not universal, and the extent of benefit can vary widely.77 Moreover, cross-modal plasticity can sometimes have maladaptive consequences. For example, if sensory function is later restored (e.g., through a cochlear implant for deafness or, hypothetically, a visual prosthesis for blindness), the reorganized cortex might not efficiently process the newly restored input, potentially hindering rehabilitation.76 In age-related hearing loss, the auditory cortex may begin processing visual and tactile information, which, while adaptive in one sense, can reduce its efficiency for processing sounds even when hearing aids are used, leading to increased cognitive load.78

Conversely, exposure to an enriched environment (EE)—one that provides ample sensory, cognitive, social, and motor stimulation—can promote positive neuroplastic changes. EE has been shown to enhance neural development, support brain repair after injury, and improve learning and memory capacity by inducing morphological, cellular, and molecular adaptations in the brain.79 The specific effects of EE can vary depending on the developmental stage at which it is experienced and the particular components of the enrichment (e.g., physical exercise, social interaction, novel objects).79

Neuroplasticity thus reveals that the brain’s construction of reality is a dynamic and continuously evolving process. Our perceptual world is not set in stone but is constantly being updated and refined based on our interactions with the environment. The distinction between experience-expectant and experience-dependent plasticity highlights a fundamental developmental strategy: the brain is pre-wired to utilize certain common environmental inputs for basic perceptual scaffolding, yet it remains highly adaptable to individual-specific experiences, enabling lifelong learning and perceptual refinement. The causal chain is clear: changes in sensory input or experience (or lack thereof) can directly induce structural and functional modifications in the brain, which, in turn, causally alter our perceptual capabilities and subjective experience. While this offers immense hope for learning, adaptation, and rehabilitation, it also underscores that prolonged exposure to impoverished or biased environments could negatively shape our perceptions. Plasticity is an adaptive mechanism, but its outcomes are context-dependent and not invariably beneficial.

The Bigger Picture: Implications of a Constructed Reality

Understanding that our perception of reality is an active construction by the brain, rather than a passive reflection of an objective external world, carries profound implications that ripple across philosophy, mental health, technology, and ethics.

Philosophical Considerations:

The notion of a constructed reality forces a re-examination of the relationship between subjective experience and objective reality. If what we perceive is a brain-generated model, how much direct access do we truly have to the “thing-in-itself” (the Ding an sich, as Immanuel Kant termed it)—the world as it exists independently of our perception?80 Some neuroscientists and philosophers suggest that, for us, “Reality in capitals… does not exist” in an accessible form; instead, we subjectively invent or construct the reality that is meaningful and useful for our survival.81 As Lisa Feldman Barrett has noted, reality for us is what we can sense and what we can make sense of with the signals in our brain.2 This doesn’t necessarily deny the existence of an external world but emphasizes that our experience of it is always mediated and shaped by our neural architecture.

While the constructive nature of perception might seem to lend a sliver of plausibility to solipsism—the extreme philosophical idea that only one’s own mind is sure to exist 82—this position is generally considered unproductive by most scientists and philosophers. The brain-in-a-vat thought experiment, where a disembodied brain is fed artificial sensory inputs to create a simulated reality, plays on these themes but primarily serves to explore the limits of skepticism.82

More significantly, the construction of perception is intimately linked to the nature of consciousness and subjective experience, often referred to as qualia (the qualitative, “what it’s like” character of experience).17 Some theories propose that consciousness itself might be a sophisticated construct, perhaps arising from the brain’s social perceptual machinery used to model not only others’ minds but also our own attentional state.85 The concept of a “neural subjective frame,” rooted in the brain’s representation of the body’s internal state (interoception), suggests a foundational biological mechanism for first-person perspective, an anchoring point from which subjective experience unfolds.17

Impact on Understanding Mental Health and Developing Treatments:

Distortions in perception are a hallmark of many mental health conditions. For example, individuals with body dysmorphic disorder have a distorted perception of their own appearance; anxiety disorders can lead to a hyper-perception of threat; and conditions like schizophrenia can involve hallucinations (perceiving things that aren’t there) and delusions (firmly held false beliefs often based on misinterpretations of reality).36

Recognizing perception as a construct is pivotal for treatments like Cognitive Behavioral Therapy (CBT). CBT works by helping individuals identify, challenge, and modify maladaptive thought patterns and interpretations that contribute to their distress.32 By changing these top-down influences, individuals can alter their perception of situations and, consequently, their emotional and behavioral responses. The placebo effect, where an inert treatment can lead to genuine perceived improvement, can also be understood within this framework. Expectations (a powerful top-down process) can significantly influence perceived outcomes, including pain relief.32 Predictive processing models are particularly helpful in accommodating phenomena like symptom perception in the absence of clear pathophysiology, and the efficacy of placebos.35 Furthermore, substances like hallucinogens, which profoundly alter perception, are being investigated for their therapeutic potential in treating conditions like depression and anxiety, possibly by disrupting rigid, maladaptive perceptual patterns and promoting neuroplasticity.86

Relevance for Artificial Intelligence (AI) Development:

The principles of constructive and predictive perception in humans offer valuable insights for the development of more sophisticated AI. If human intelligence and adaptability stem, in part, from the ability to build internal models of the world, make predictions, learn from errors, and integrate context, then AI systems designed to interact intelligently with complex environments might benefit from incorporating similar mechanisms.87 Current challenges in AI, such as imbuing machines with robust common sense, nuanced context-awareness, and the ability to handle ambiguity, are all areas where human constructive perception excels. Moreover, understanding how humans perceive and interact with AI is crucial for its development and societal adoption. As AI becomes more integrated into daily life, human-AI interaction can shift perceptions of AI from a mere tool to something akin to a companion, reflecting a social construction of technology itself.88

Ethical Considerations of Subjective Realities:

The deeply subjective and constructed nature of our realities raises significant ethical questions. If individuals can genuinely experience the same event differently, how do we navigate disagreements, establish shared truths for societal functioning, or assign responsibility?89 This underscores the critical importance of empathy, communication, and a willingness to acknowledge the potential validity of others’ subjective experiences, even when they starkly contrast with our own. The inherent risk of cognitive biases 45 shaping our constructed realities, potentially leading to prejudice, stereotyping, or unfair judgments, also demands constant vigilance and critical self-reflection. For instance, societal assumptions based on gender, race, or age can unconsciously filter into our perceptions and subsequent actions.91

The legal system, for example, often relies heavily on eyewitness testimony, which implicitly assumes a reasonably objective and accurate recall of events. However, the understanding that perception and memory are reconstructive processes, influenced by stress, emotion, expectation, and bias, challenges this assumption.1 This necessitates a more nuanced and cautious approach to evaluating evidence that depends on human perception.

Ultimately, recognizing that perception is constructed—and can be influenced by a myriad of internal and external factors—prompts a re-evaluation of “truth” and “objectivity” across many domains. It fosters intellectual humility, encouraging a shift from asserting the sole correctness of one’s own viewpoint to curiously exploring why perceptions might differ. In scientific inquiry itself, this understanding reinforces the need for rigorous methodologies, such as double-blinding and peer review, to mitigate individual cognitive biases and strive for a more intersubjectively agreed-upon understanding of phenomena.

Conclusion: Embracing Our Intricately Woven Worlds

The journey from raw sensory input to the rich, meaningful, and subjectively real world we each inhabit is one of the most extraordinary and complex processes known to science. Our exploration has revealed that the human brain is far from a passive recipient of external information; it is an active, tireless architect, constantly constructing our individual perceptual realities.1 This construction is not an arbitrary illusion but a highly adaptive process, meticulously refined by evolution to enable us to navigate, interact, and ultimately survive within a complex and ever-changing environment.4 The brain, in essence, creates the reality it is “interested in for the survival of the organism”.81

This intricate weaving of perception involves a symphony of neural mechanisms: the specialized sensory organs gathering diverse forms of energy and chemical information; the crucial act of transduction converting these inputs into a neural code; the hierarchical processing within dedicated cortical areas that analyze features and integrate information across modalities; and the constant, dynamic interplay between data-driven bottom-up signals and conceptually-driven top-down influences. At the forefront of our current understanding is the notion of the brain as a predictive engine, tirelessly generating hypotheses about the world, comparing these with incoming sensory evidence, and updating its internal models to minimize prediction error—a process that is our perception.
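
In schematic terms, this predictive account is often summarized in the language of Bayesian inference. The formulation below is a standard textbook rendering rather than an equation taken from the works cited here: the brain’s task is cast as inferring a hypothesis h about the world from sensory evidence s, weighting top-down expectation against bottom-up data.

    P(h \mid s) = \frac{P(s \mid h)\, P(h)}{P(s)}, \qquad \varepsilon = s - \hat{s}, \qquad \hat{h} \leftarrow \hat{h} + k\,\varepsilon

Here the prior P(h) stands in for expectation, the likelihood P(s|h) for how well a hypothesis explains the incoming signal, ε for the prediction error between the actual input s and the predicted input ŝ, and the gain k for how much weight new evidence receives relative to the existing model.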

The deeply subjective nature of this constructed reality, shaped by our unique memories, beliefs, emotions, attention, cultural backgrounds, and even our genetic predispositions, means that each of us experiences a personalized version of the world. Yet, despite this inherent subjectivity, humans are remarkably adept at co-constructing “social reality” through shared language, agreements, and cultural norms, allowing for collective understanding and cooperation.2

Recognizing that our reality is a brain-based construction can be profoundly empowering. It opens the door to understanding that by mindfully altering our thoughts, managing our expectations, directing our attention, and engaging in new experiences, we can leverage the brain’s inherent neuroplasticity to influence our own perceived reality and, consequently, our well-being.32 This understanding demystifies aspects of mental health, informs therapeutic interventions, and offers valuable perspectives for the development of more sophisticated artificial intelligence.

The human brain’s capacity to transform simple physical stimuli into a vibrant, coherent, and emotionally resonant perceptual world is a testament to the elegance and complexity of biological computation. While many mysteries remain, the ongoing scientific quest to unravel the mechanisms of perception continues to illuminate the profound ways in which we are all architects of our own experience, each living within an intricately woven world of our brain’s own making. This awareness invites not only scientific curiosity but also a deeper appreciation for the subjective experiences of others and a more critical, yet wondrous, reflection upon our own.

Works cited

  1. Understanding How the Brain Constructs Our Perception of Reality – Roxana Murariu, accessed May 9, 2025, https://www.roxanamurariu.com/understanding-how-the-brain-constructs-our-perception-of-reality/
  2. How your brain creates reality, explained by a neuroscientist – Big Think, accessed May 9, 2025, https://bigthink.com/the-well/what-is-reality/
  3. REALITY, PERCEPTION, AND SIMULATION: A PLAUSIBLE THEORY – Johns Hopkins University Applied Physics Laboratory, accessed May 9, 2025, https://secwww.jhuapl.edu/techdigest/content/techdigest/pdf/V15-N02/15-02-Powell.pdf
  4. www.nimh.nih.gov, accessed May 9, 2025, https://www.nimh.nih.gov/research/research-funded-by-nimh/rdoc/constructs/perception#:~:text=Perception%20refers%20to%20the%20process,external%20world%2C%20and%20guide%20action.
  5. Perception – National Institute of Mental Health (NIMH), accessed May 9, 2025, https://www.nimh.nih.gov/research/research-funded-by-nimh/rdoc/constructs/perception
  6. oxsci.org, accessed May 9, 2025, https://oxsci.org/how-should-we-define-reality-and-does-it-exist/#:~:text=Reality%20can%20be%20described%20as,perceptions%20with%20our%20physical%20experiences.
  7. Sensory Perception – Advanced | CK-12 Foundation, accessed May 9, 2025, https://flexbooks.ck12.org/cbook/ck-12-advanced-biology/section/17.15/primary/lesson/sensory-perception-advanced-bio-adv/
  8. Sense – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Sense
  9. The Senses — A Primer (Part I) – BrainFacts, accessed May 9, 2025, https://www.brainfacts.org/thinking-sensing-and-behaving/vision/2013/the-senses-a-primer-part-i
  10. Sensory Perception | Process of Sensory Perception – Vedantu, accessed May 9, 2025, https://www.vedantu.com/biology/sensory-perception
  11. Sensory pathways | Perception Class Notes – Fiveable, accessed May 9, 2025, https://library.fiveable.me/perception/unit-1/sensory-pathways/study-guide/c5bEkGFzVNyBabkH
  12. Sensory processing – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Sensory_processing
  13. my.clevelandclinic.org, accessed May 9, 2025, https://my.clevelandclinic.org/health/body/22652-thalamus#:~:text=Your%20thalamus%20is%20an%20egg,your%20body%20to%20your%20brain.
  14. Thalamus: What It Is, Function & Disorders – Cleveland Clinic, accessed May 9, 2025, https://my.clevelandclinic.org/health/body/22652-thalamus
  15. How does the human brain work? – Speechneurolab, accessed May 9, 2025, https://speechneurolab.ca/en/comment-fonctionne-le-cerveau-humain/
  16. Neuroanatomy, Somatosensory Cortex – StatPearls – NCBI Bookshelf, accessed May 9, 2025, https://www.ncbi.nlm.nih.gov/books/NBK555915/
  17. The neural subjective frame: from bodily signals to perceptual consciousness – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3965163/
  18. Sensory cortex – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Sensory_cortex
  19. Neuroanatomy, Cerebral Cortex – StatPearls – NCBI Bookshelf, accessed May 9, 2025, https://www.ncbi.nlm.nih.gov/books/NBK537247/
  20. www.ncbi.nlm.nih.gov, accessed May 9, 2025, https://www.ncbi.nlm.nih.gov/books/NBK555915/#:~:text=The%20primary%20somatosensory%20cortex%20is,internal%20capsule%20and%20corona%20radiata.
  21. The bottom-up and top-down processing of faces in the human occipitotemporal cortex – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC7000216/
  22. pressbooks.online.ucf.edu, accessed May 9, 2025, https://pressbooks.online.ucf.edu/lumenpsychology/chapter/reading-what-is-perception/#:~:text=Perception%20refers%20to%20the%20way,are%20built%20from%20sensory%20input.
  23. 10.4: Top-Down vs. Bottom-Up (Conceptually-driven vs. Data-driven processing) – Social Sci LibreTexts, accessed May 9, 2025, https://socialsci.libretexts.org/Bookshelves/Psychology/Cognitive_Psychology/Cognitive_Psychology_(Andrade_and_Walker)/10%3A_Perception/10.04%3A_Top-Down_vs._Bottom-Up_(Conceptually-driven_vs._Data-driven_processing)
  24. What is Perception? – General Psychology – UCF Pressbooks, accessed May 9, 2025, https://pressbooks.online.ucf.edu/lumenpsychology/chapter/reading-what-is-perception/
  25. Top-Down Processing and Bottom-Up Processing | What is Top-Down Processing – Structural Learning, accessed May 9, 2025, https://www.structural-learning.com/post/top-down-processing-and-bottom-up-processing
  26. Fundamentals of Cognition | Soma Cognition – Soma Technologies, accessed May 9, 2025, https://blog.soma-npt.ch/what-is-cognition-and-its-components-understanding-the-basics/
  27. www.researchgate.net, accessed May 9, 2025, https://www.researchgate.net/figure/Example-of-top-down-signals-in-visual-perceptionBottom-up-data-driven-processes_fig1_12763753#:~:text=Bottom%2Dup%20(data%2Ddriven,experiencing%20bottom%2Dup%20signals%20alone.
  28. Perception Psychology: Exploring Key Perception Theories – BetterHelp, accessed May 9, 2025, https://www.betterhelp.com/advice/psychologists/perception-psychology-definition-and-how-we-see-things/
  29. Evaluating the neurophysiological evidence for predictive …, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC7187369/
  30. Top Down vs Bottom Up Processing – Strategy Capstone, accessed May 9, 2025, https://strategycapstone.org/top-down-vs-bottom-up-processing/
  31. Video: Factors Affecting Perception – JoVE, accessed May 9, 2025, https://app.jove.com/science-education/17777/factors-affecting-perception
  32. Perception, a constructed reality – ScIU – IU Blogs – Indiana University, accessed May 9, 2025, https://blogs.iu.edu/sciu/2024/03/11/perception-a-constructed-reality/
  33. Symptom Perception From a Predictive Processing Perspective, accessed May 9, 2025, https://cpe.psychopen.eu/index.php/cpe/article/view/2553/2553.html
  34. The Bayesian Brain – Wellcome Centre for Human Neuroimaging …, accessed May 9, 2025, https://www.fil.ion.ucl.ac.uk/bayesian-brain/
  35. Symptom perception, placebo effects, and the Bayesian brain – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6319577/
  36. Reality – psychology-lexicon.com, accessed May 9, 2025, https://www.psychology-lexicon.com/cms/glossary/51-glossary-r/22889-reality.html
  37. Early effects of previous experience on conscious perception – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6084554/
  38. Contraction bias: How previous experiences influence our perception – Scientifica UK, accessed May 9, 2025, https://www.scientifica.uk.com/neurowire/contraction-bias-how-previous-experiences-influence-our-perception
  39. 3.5 The Impact of Emotions on Cognition – Open Educational Resources Collective, accessed May 9, 2025, https://oercollective.caul.edu.au/neuroscience-psychology-conflict/chapter/3-5-the-impact-of-emotions-on-cognition/
  40. The impact of emotion on perception, attention, memory, and decision-making, accessed May 9, 2025, https://smw.ch/index.php/smw/article/download/1687/2255
  41. The Influences of Emotion on Learning and Memory – Frontiers, accessed May 9, 2025, https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2017.01454/full
  42. How Does Attention Alter Length Perception? A Prism Adaptation Study – Frontiers, accessed May 9, 2025, https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2020.02091/full
  43. The Neural Correlates of Consciousness and Attention: Two Sister Processes of the Brain, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6842945/
  44. Attention modulates subjective time perception across eye movements – PubMed, accessed May 9, 2025, https://pubmed.ncbi.nlm.nih.gov/39778360/
  45. Understanding Cognitive Biases: An In-depth Look at 20 Common Mental Traps – Achology, accessed May 9, 2025, https://achology.com/psychology/20-common-cognitive-biases/
  46. What Is Cognitive Bias? 7 Examples & Resources (Incl. Codex) – Positive Psychology, accessed May 9, 2025, https://positivepsychology.com/cognitive-biases/
  47. www.tutorchase.com, accessed May 9, 2025, https://www.tutorchase.com/answers/a-level/psychology/how-does-cultural-background-influence-perceptions-of-sexuality#:~:text=Moreover%2C%20cultural%20background%20can%20influence,be%20more%20passive%20and%20submissive.
  48. Culture and Perception, part II: The Muller-Lyer illusion …, accessed May 9, 2025, https://www.cognitionandculture.net/blogs/simons-blog/culture-and-perception-part-ii-the-muller-lyer-illusion/index.html
  49. Genes, Environments, Personality, and Successful Aging: Toward a Comprehensive Developmental Model in Later Life – PMC, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3326243/
  50. Genetic and Familial Influences on Self-Perception in Early Childhood and Self-Esteem in Adulthood: A Cross-Sectional Analysis | Twin Research and Human Genetics | Cambridge Core, accessed May 9, 2025, https://www.cambridge.org/core/journals/twin-research-and-human-genetics/article/genetic-and-familial-influences-on-selfperception-in-early-childhood-and-selfesteem-in-adulthood-a-crosssectional-analysis/3230365794EB63A721660B4AE679E37E
  51. Patterns or Subtypes of Differences in Sensory Integration & Processing – STAR Institute, accessed May 9, 2025, https://sensoryhealth.org/basic/patterns-or-subtypes-of-differences-sensory-integration-processing
  52. Interindividual differences in perception – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Interindividual_differences_in_perception
  53. Three visual illusions that reveal the hidden workings of the brain – The University of Sydney, accessed May 9, 2025, https://www.sydney.edu.au/news-opinion/news/2017/07/31/three-visual-illusions-that-reveal-the-hidden-workings-of-the-br.html
  54. The Constructivist Theory of Perception – Algor Cards, accessed May 9, 2025, https://cards.algoreducation.com/en/content/WWHkcBVL/constructivist-perception-theory
  55. www.pageon.ai, accessed May 9, 2025, https://www.pageon.ai/blog/best-optical-illusions#:~:text=The%20Kanizsa%20Triangle,-The%20Kanizsa%20Triangle&text=Your%20brain%20perceives%20the%20triangle,visual%20information%20into%20coherent%20forms.
  56. Optical Illusions and Data Viz – Do Mo(o)re with Data, accessed May 9, 2025, https://domoorewithdata.com/2023/05/02/optical-illusions-and-data-viz/
  57. Position shifts of fMRI-based population receptive fields in human visual cortex induced by Ponzo illusion – PubMed, accessed May 9, 2025, https://pubmed.ncbi.nlm.nih.gov/26314755/
  58. Optical Illusions – Vivid Vision, accessed May 9, 2025, https://www.seevividly.com/info/Physiology_of_Vision/Optical_Illusions
  59. Ebbinghaus Illusion – (Intro to Brain and Behavior) – Fiveable, accessed May 9, 2025, https://library.fiveable.me/key-terms/introduction-brain-behavior/ebbinghaus-illusion
  60. Auditory illusion – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Auditory_illusion
  61. McGurk Effect: How the Brain Plays Tricks – UC Irvine – YouTube, accessed May 9, 2025, https://www.youtube.com/watch?v=501_cVo9LdY
  62. What is the McGurk effect? – PMC, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC4091305/
  63. www.344audio.com, accessed May 9, 2025, https://www.344audio.com/post/auditory-illusions#:~:text=This%20perceived%20beat%20is%20not,is%20called%20a%20binaural%20beat.
  64. A Brief Taxonomy of Tactile Illusions and Demonstrations That Can Be Done In a Hardware Store, accessed May 9, 2025, https://www.cim.mcgill.ca/~haptic/pub/VH-BRB-07.pdf
  65. Brain Process for Perception of the “Out of the Body” Tactile Illusion for Virtual Object Interaction – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC4431253/
  66. The Cutaneous Rabbit Illusion Affects Human Primary Sensory Cortex Somatotopically | PLOS Biology, accessed May 9, 2025, https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0040069
  67. The “Cutaneous Rabbit” Hopping out of the Body | Journal of Neuroscience, accessed May 9, 2025, https://www.jneurosci.org/content/30/5/1856
  68. Phantom Limb Pain – StatPearls – NCBI Bookshelf, accessed May 9, 2025, https://www.ncbi.nlm.nih.gov/books/NBK448188/
  69. The perception of phantom limbs. The D. O. Hebb lecture – PubMed, accessed May 9, 2025, https://pubmed.ncbi.nlm.nih.gov/9762952/
  70. pmc.ncbi.nlm.nih.gov, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6871182/#:~:text=According%20to%20the%20theories%20of,and%20activity%E2%80%90dependent%20synaptic%20plasticity.
  71. Neural plasticity of development and learning – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6871182/
  72. Distinct Neural Plasticity Enhancing Visual Perception – PubMed, accessed May 9, 2025, https://pubmed.ncbi.nlm.nih.gov/39103221/
  73. Distinct Neural Plasticity Enhancing Visual Perception – Journal of Neuroscience, accessed May 9, 2025, https://www.jneurosci.org/content/44/36/e0301242024
  74. Perceptual learning | EBSCO Research Starters, accessed May 9, 2025, https://www.ebsco.com/research-starters/education/perceptual-learning
  75. What are some everyday examples of neuroplasticity, that is, of how the brain is changed and shaped by experience? – Quora, accessed May 9, 2025, https://www.quora.com/What-are-some-everyday-examples-of-neuroplasticity-that-is-of-how-the-brain-is-changed-and-shaped-by-experience
  76. Neural Reorganization Following Sensory Loss: The Opportunity Of Change – PMC, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3898172/
  77. Why does the cortex reorganize after sensory loss? – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC7382297/
  78. Brain’s Reorganization After Hearing Loss – Mitelos, accessed May 9, 2025, https://www.mitelos.com/brains-reorganization-after-hearing-loss/
  79. The role of enriched environment in neural development and repair – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC9350910/
  80. Objectivity | Internet Encyclopedia of Philosophy, accessed May 9, 2025, https://iep.utm.edu/objectiv/
  81. [Does the brain create reality?] – PubMed, accessed May 9, 2025, https://pubmed.ncbi.nlm.nih.gov/16524238/
  82. Solipsism – Wikipedia, accessed May 9, 2025, https://en.wikipedia.org/wiki/Solipsism
  83. Solipsism and Its Implications – Torah and Science, accessed May 9, 2025, https://quantumtorah.com/solipsism-and-its-implications/
  84. A First Principles Approach to Subjective Experience – PMC – PubMed Central, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC8888408/
  85. Human consciousness and its relationship to social neuroscience: A novel hypothesis, accessed May 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3223025/
  86. Perception | BetterHelp, accessed May 9, 2025, https://www.betterhelp.com/mental-health/disorders-conditions/perception/
  87. Perceptions of Artificial Intelligence (AI) in the Construction Industry Among Undergraduate Construction Management Students: Case Study—A Study of Future Leaders – MDPI, accessed May 9, 2025, https://www.mdpi.com/2075-5309/15/7/1095
  88. Human and Artificial Intelligence Interaction from the Perspective of Social Construction of Technology – ResearchGate, accessed May 9, 2025, https://www.researchgate.net/publication/389606310_Human_and_Artificial_Intelligence_Interaction_from_the_Perspective_of_Social_Construction_of_Technology
  89. Chapter 3–The Role of Objective and Subjective Thinking – Ethical Decision-Making – University System of New Hampshire Pressbooks, accessed May 9, 2025, https://dev.pressbooks.usnh.edu/ld821ethicaldecisionmaking/chapter/chapter-3-the-role-of-objective-and-subjective-thinking/
  90. Thinking Critically about the Subjective-Objective Distinction – Sandy LaFave’s Web Page, accessed May 9, 2025, https://lafavephilosophy.x10host.com/subjective_objective.html
  91. Perception and Perspective – The Subjective Writer – Academic Writing Skills, accessed May 9, 2025, https://uq.pressbooks.pub/academicwritingskills/chapter/perception-and-perspective-the-subjective-writer/