10/2025 —
02/2026
UX,
UI,
immersive
technologies,
research
with
Jan Schlieben
formfinder











What happens to social interactions in times when individuals create their own perceptions of reality? When digital technologies are embedded in the physical world, and these phygital worlds are individual, visible to some and invisible to others?
This project constructs a world that vividly imagines a future in which immersive digital technologies shape everyday perception. From this world, it formulates rules and regulations for such technologies in relation to different places and their distinct connotations within urban space.
Through the research process, core tensions began to crystallize, mirroring three intertwined structures of conflict: person to person, person to reality, and person to space. In the third, interestingly, the familiar relationship, in which space is shaped by us and thus remains the object, begins to shift. Space, understood as a distributed canvas across different locations, starts to shape our content and experience in return, subtly repositioning itself as the subject. In this role, it shapes the object (the user) through its own form.
Thus, different locations have different impacts on perception.
A speculative video prototype tells the story of four individuals with different types and intensities of engagement with the technology. Moving away from binary storytelling, it presents multiple points of view that highlight relativity, exploring how conflicts emerge across different locations.
Potential regulatory implications, especially concerning personal freedoms:
Article 19
Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.
Article 19.1
Every individual has the inalienable right to be perceived in their natural state, free from external alterations or distortions imposed by sensory adaptation layers or similar technologies.
The project imagines an alternative file system beyond folders and icons. Files appear as geometric objects whose form and materiality carry meaning about versioning, relations, and collaboration. Organisation emerges through interaction, specifically filtering on the search level. The system becomes a spatial illustration of what files are today: dynamic, networked, processual, fractal.
Two strands of visual impulses led to translating the information within a file into 3D geometry: data visualisations exploring file structures, which laid the ground for designing with time as the main compositional and organisational factor, and a second concept that approached files as semantic entities capable of forming meaning-based clusters with a strong visual character.
This translation operates on two levels: on the intra-level, geometry is generated from a file's data through rules; on the inter-level, material and colour communicate relational information.
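The two-level translation described above can be sketched in code. This is a minimal illustration, not the project's implementation: the field names (`size_bytes`, `version`, `collaborators`) and the specific mapping rules are all hypothetical stand-ins for whatever data and rules the system actually uses.

```python
from dataclasses import dataclass
import hashlib

@dataclass
class FileRecord:
    """Hypothetical file metadata the system might read."""
    name: str
    size_bytes: int
    version: int
    collaborators: int  # number of people who have edited the file

def intra_geometry(f: FileRecord) -> dict:
    """Intra-level: a geometric form derived from the file's own data through rules."""
    return {
        # more versions -> more facets, so revision history becomes legible in form
        "facets": 4 + f.version,
        # size maps logarithmically to scale so large files stay comparable
        "scale": 1.0 + f.size_bytes.bit_length() / 10.0,
    }

def inter_material(f: FileRecord) -> dict:
    """Inter-level: material and colour communicating relational information."""
    # hash the name to a stable hue so related files can share a palette
    hue = int(hashlib.sha256(f.name.encode()).hexdigest(), 16) % 360
    # heavily collaborated files read as smoother, more "worked" surfaces
    return {"hue": hue, "roughness": 1.0 / (1 + f.collaborators)}

doc = FileRecord("thesis.txt", size_bytes=2048, version=3, collaborators=2)
print(intra_geometry(doc))
print(inter_material(doc))
```

The point of the sketch is the separation of concerns: the intra-level functions only ever see one file's data, while the inter-level functions encode properties that matter across files.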
This concept opens a new position for the visual representation of a file on the scale between abstraction and informational value: it sits between the icon, a strongly abstract entity, and the preview, a direct, non-translated representation.
When a third dimension, user distance, is brought into this matrix as an expression summarizing informational value and abstraction, the geometry again positions itself somewhere in between, possibly nearer the preview. It holds this position even under further deconstruction of its visual parts and contents, retaining the potential for recognition and file-specific memorability.
The system is search-based, relying on time, relationships, and type rather than names. Filters enable detailed, customizable search, allowing users to move between files through relational chains in time. Files can also be grouped within collaborative spaces and tagged in more complex ways, using nodes that automate tagging and collaboration-based classification.
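The search model above, combinable filters over time, type, and relations instead of name-based folder browsing, can be sketched as follows. The record fields and filter names are illustrative assumptions, not the project's actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class File:
    """Hypothetical record: files are found by time, type, and relations, not names."""
    id: str
    kind: str                       # file type, e.g. "text" or "image"
    modified: datetime              # the main compositional factor: time
    related: set = field(default_factory=set)  # ids of linked files

def search(files, kind=None, after=None, related_to=None):
    """Combine filters instead of browsing folders; any filter may be omitted."""
    out = []
    for f in files:
        if kind and f.kind != kind:
            continue
        if after and f.modified < after:
            continue
        if related_to and related_to not in f.related:
            continue
        out.append(f)
    # order results in time, newest first, so relational chains unfold chronologically
    return sorted(out, key=lambda f: f.modified, reverse=True)

files = [
    File("a", "image", datetime(2025, 10, 1), {"b"}),
    File("b", "text", datetime(2025, 11, 5), {"a"}),
    File("c", "text", datetime(2025, 9, 1), set()),
]
print([f.id for f in search(files, kind="text", after=datetime(2025, 10, 1))])  # ['b']
print([f.id for f in search(files, related_to="b")])                            # ['a']
```

Moving between files through relational chains then amounts to re-running `search` with `related_to` set to the id of the file just visited.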