Paul Nylund

My goal is to advance productivity and connection through technology. Whether it's hardware, software, development, design, or entrepreneurship, they're all means to create useful solutions.


I am originally from Los Angeles, California, and was raised speaking Norwegian. My interest in dissolving the boundaries between design and engineering led me to study in the Netherlands, Denmark, and Norway. I've also spent some time working in New York, Malmö, and Melbourne.

My life's trajectory tends to confuse a lot of people, so I made a map.

I am also incredibly passionate about cooking and playing music.

Feel free to download my CV, send me an email, or follow me here:




Responsive navigation — graduation project

Fall 2018 — Los Angeles, CA & Melbourne, AUS


Lace is a technical concept for augmenting navigation cues in public spaces in response to the dynamic and turbulent movements of crowds and objects through those spaces. The navigation cues were rendered in a mobile augmented reality view.

Download the report

The speed and immediacy of the image recognition technology we employed in our solution garnered some interest from retail stores as well. As a result, we organized meetings with Norwegian Rain, Ting, and Kiwi (Norway's largest grocery store chain) to evaluate Lace's potential as "webshop-style analytics for brick-and-mortar". However, after studying potential market competitors, who were already much further along, we ultimately put the project on ice.

My partner on this project, Michael, and I began our journey researching public signage interpretation and navigation in increasingly globalized societies. In narrowing down our scope, we settled on using augmented reality to realize our solution.

We spoke to architects about the flow of pedestrians through spaces and how the spaces themselves affected that flow. We spoke to Michael Knopoff, of Montalba Architects, for example, about how his firm utilized circular flow in the new Tom Bradley International Terminal at LAX.

We also had the privilege of meeting with Curtis Barrett, Principal Technical Program Manager at Google and former engineering lead of Google Glass, at Google's headquarters in Mountain View, California.

Following a barrage of feedback from Curtis, we narrowed our approach: rather than attempting to reliably predict the movements of each individual user (a nearly impossible task given our timeframe), we would design a dynamic digital signage system that relied on simple cameras and machine learning to understand the movements of all actors through a space in real time.

Then we set off to Melbourne, a cool city where people speak English, which would make it easier to conduct user studies. Our tutor, Danielle Wilde, was also going to be there for a conference at some point. Nice.

Michael and I divided our development tasks, considering the estimated workload and short timespan. I developed the pathfinding algorithm and the AR rendering of the path that would guide the user to their destination. Michael worked on the image recognition algorithm and the corresponding server that would identify the positions of people in a space from a video feed.
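The actual pathfinding was built in Unity with C#, but the core idea can be sketched more compactly. Below is an illustrative grid-based A* search in Python, a standard approach for routing around obstacles (such as crowds detected by the camera feed). All names and the grid representation here are hypothetical, not taken from the project:

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* search on a 4-connected grid.

    grid[y][x] == 1 marks a blocked cell (e.g. a wall or a dense crowd);
    start and goal are (x, y) tuples. Returns the shortest path as a
    list of cells from start to goal, or None if no path exists.
    """
    def h(a, b):
        # Manhattan distance: an admissible heuristic for 4-connected grids.
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    tie = count()  # tiebreaker so the heap never compares cell tuples
    open_set = [(h(start, goal), next(tie), start)]
    came_from = {start: None}
    g_score = {start: 0}

    while open_set:
        _, _, node = heapq.heappop(open_set)
        if node == goal:
            # Reconstruct the path by walking parent links back to start.
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            in_bounds = 0 <= ny < len(grid) and 0 <= nx < len(grid[0])
            if in_bounds and not grid[ny][nx]:
                ng = g_score[node] + 1
                if ng < g_score.get((nx, ny), float("inf")):
                    g_score[(nx, ny)] = ng
                    came_from[(nx, ny)] = node
                    heapq.heappush(
                        open_set, (ng + h((nx, ny), goal), next(tie), (nx, ny))
                    )
    return None
```

In a live system, the grid's blocked cells would be refreshed from the camera-derived crowd data, and the resulting path handed to the AR layer for rendering.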

To build the solution in augmented reality, I decided to learn Unity development with C# in three months, having never touched either before. Probably a silly idea in retrospect.

Throughout the process, we conducted several user studies, from online surveys, to recruiting pedestrians in Melbourne's Carlton Gardens, to one-on-one interview sessions. The goal of these studies was to identify which visual elements users reacted to, as well as how they were able to relate visual elements to their own position within a space.

Our prototype ended up working, mostly. It earned us a 12, the highest possible grade at a Danish university!

University of Southern Denmark