The Varsovian Film Festival
The Varsovian Film Festival (or TVFF) is a (fictional) independent film festival showcasing unique stories about Warsaw and its inhabitants. Its new website—with a built-in award nomination platform—aims to streamline the film evaluation process to better serve the needs of the festival's judges, filmmakers, and budget.
Dates: Oct. 2021
Tools: Adobe XD, Figma
Client: Student project at Google

The Problem
The festival's current evaluation process is paper-based and lengthy, making it time-consuming, expensive to administer, and not particularly secure. Careful research and prioritization are required to streamline this process in a way that doesn't dilute the integrity of the award committee's decisions. Additionally, the visual design of the new website needs to reflect Poland's rich cinematic tradition.
The Solution
Simplify the film evaluation process so that judges can evaluate the merit of each festival submission with ease, fairness, and efficiency. Redesign the festival’s brand identity to better reflect Polish film history and visual culture.
Note: This project was originally designed as a top-down responsive website, but for the sake of brevity and clarity, this case study will focus only on the desktop/laptop version.
Pain Points
1
Alphabetical organization.
Many evaluation platforms organize film titles alphabetically. This can make it difficult for judges to find the films they want to evaluate if they don't remember the titles (after all, they often see dozens if not hundreds of films during the festival season). Due to a cognitive bias known as the serial position effect, it also tends to mean that film titles falling in the middle of the alphabet are evaluated less thoroughly.
2
No way to track progress.
Few evaluation platforms offer a simple way to keep track of which films a judge has or hasn’t already rated. This forces judges to document and organize this information in programs like Excel—a solution which is neither efficient nor convenient.
3
Confusing and redundant systems.
Competing platforms often rate films out of 10 stars in at least 10 categories. Many of these categories are not easily distinguishable from one another (for example, one might be asked to rate a film's "originality" and also its "creativity"). This not only causes confusion for judges but can also skew the rating algorithm by weighting similar criteria too heavily against others.

Persona: Paweł
Paweł is a 28-year-old PhD candidate from Warsaw specializing in Polish film history. He and his boyfriend live in the Mokotów neighborhood of Warsaw with their cat, Orson. Between film screenings, classes, and bartending gigs, Paweł is stretched pretty thin during the festival's run in March. Sometimes he's too tired or doesn't have enough time to evaluate films as thoroughly as he would like. He needs a more efficient, equitable way to evaluate festival submissions so he can be as fair a judge as possible.
Age : 28 years old
Education: MFA in Screenwriting
Hometown: Warsaw, Poland
Family: Boyfriend, cat
Occupation: PhD candidate
“The more complicated the evaluation process gets, the more subjective it becomes.”
Goals
1. Rate as many films as he can
2. Be fair and thorough
3. Keep track of his progress
Frustrations
1. The process is time-consuming
2. He sometimes forgets a film's title and then has trouble finding it again
Site Features
After conducting interviews, mapping Paweł's journey, creating an affinity diagram, completing a competitive audit, and building a preliminary sitemap (none of which I've documented here), I knew that the final product needed to include features like...
An image-based schedule
Films are organized within a screening schedule instead of alphabetically, and the film posters are displayed front and center. This makes it easier for both judges and festival guests to navigate the site; users often remember what day or time they saw a film, or what the poster looked like, more easily than they remember the film's title or director.

Evaluation history
Any judge can view their evaluation history after logging in. This includes a list of films they've already evaluated and a list of films they haven't. Films can be sorted by the number of completed evaluations, program schedule, or title. The default sorting is completed evaluations (least to most) so that judges are encouraged to prioritize under-evaluated films.
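To make that sorting behavior concrete, here is a minimal TypeScript sketch of how the history list could be ordered, with the least-evaluated films surfacing first by default. The field and function names (FilmEntry, completedEvaluations, sortHistory) are illustrative assumptions, not the site's actual data model.

```ts
// Hypothetical shape of one row in a judge's evaluation history.
interface FilmEntry {
  title: string;
  screeningTime: Date;          // the film's slot in the program schedule
  completedEvaluations: number; // evaluations submitted for this film so far
}

type SortKey = "evaluations" | "schedule" | "title";

// Returns a new array sorted by the chosen key; the default surfaces
// under-evaluated films first, as described above.
function sortHistory(films: FilmEntry[], key: SortKey = "evaluations"): FilmEntry[] {
  const sorted = [...films];
  switch (key) {
    case "evaluations":
      sorted.sort((a, b) => a.completedEvaluations - b.completedEvaluations);
      break;
    case "schedule":
      sorted.sort((a, b) => a.screeningTime.getTime() - b.screeningTime.getTime());
      break;
    case "title":
      sorted.sort((a, b) => a.title.localeCompare(b.title));
      break;
  }
  return sorted;
}
```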

Streamlined criteria
The platform uses only three main criteria to evaluate films: cinematography, script, and acting. Interviews revealed that other criteria (originality, production value, mise en scène, and more) are usually spoken about within the context of those three categories and don't necessarily merit ratings of their own. If there is a notable criterion that doesn't seem to fit (breathtaking sound design, for instance), judges can still write about it in the "notes" section at the end.
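As a rough illustration of why fewer, distinct criteria keep scoring fair, here is a small TypeScript sketch of one evaluation record. The 1-10 scale, the equal weighting, and the field names are assumptions made for the example; because each criterion appears exactly once, no single quality (say, "originality" doubling as "creativity") gets counted twice in the overall score.

```ts
// Hypothetical evaluation record under the streamlined criteria.
interface Evaluation {
  filmId: string;
  judgeId: string;
  cinematography: number; // assumed 1-10 scale
  script: number;         // assumed 1-10 scale
  acting: number;         // assumed 1-10 scale
  notes?: string;         // free text for anything outside the three criteria
}

// Each criterion is weighted equally and counted once, so overlapping
// categories can't skew the result.
function overallScore(e: Evaluation): number {
  return (e.cinematography + e.script + e.acting) / 3;
}
```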


Secure login
Paper-based evaluations always bring up security concerns. To evaluate a film on the website, judges must input a 3-digit ID number and password specific to them. These IDs can be changed as often as the festival deems necessary and also act as digital tags, ensuring that no one can submit multiple evaluations of the same film.
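As a sketch of how a judge's ID could act as that digital tag, the TypeScript below rejects a second evaluation of the same film by the same judge. The in-memory Set and the function name submitEvaluation are assumptions for illustration; a real build would check against the festival's database instead.

```ts
// Keys of evaluations already received, in the form "judgeId:filmId".
const submitted = new Set<string>();

// Accepts an evaluation only if this judge hasn't already rated this film.
function submitEvaluation(judgeId: string, filmId: string): boolean {
  const key = `${judgeId}:${filmId}`;
  if (submitted.has(key)) {
    return false; // duplicate: this judge has already evaluated this film
  }
  submitted.add(key);
  return true;
}
```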
The Process
Now that we've seen some of the final screens and key features, let's break down the process I followed to get there. Transforming my research-based insights from theory to reality required steps like building a sitemap, creating paper and digital wireframes, making low- and high-fidelity prototypes, and conducting usability tests.

Paper Wireframes
I experimented with a number of ways to display a judge's evaluation history. The solution I ultimately chose for the final iteration was the simplest of these (2), but not the one I originally carried forward into my low-fidelity prototype (5). That's one of the beautiful things about user testing: we often learn that our simplest instinct was more correct than we thought!
Usability Study Round 1
After repeating this exploratory process with other screens, I made digital wireframes in Adobe XD. I then added interactions until I had a low-fidelity prototype, which I used to conduct the first usability study. The following are pain points I discovered specific to the evaluation history page, along with the actions I took to address them in my high-fidelity iteration (completed in Figma):
Parameters:
Study type: Moderated
Duration: 10-15 minutes each
Participants: 5 people, ages 24 to 65
Location: United States, in person
Pain Points / Actions:
1
Users were confused by the presence of the "evaluate" tab in the navigation bar before logging in, so I changed it to appear only after a judge has logged in.
2
Users didn't intuitively know to look at their profile page for information on which films they had or hadn't evaluated, so I moved the evaluation history to the "evaluate" tab.
3
Users didn't like having to read a key to know which films they had or had not evaluated, so instead of using colored outlines, I separated these films into two separate, titled lists.
4
Users wanted to know how many completed evaluations each film had or where it fell in the program schedule, so I added a sorting feature.


Usability Study Round 2
As I mentioned earlier, this project was originally designed as a top-down responsive website, meaning that I designed from the largest screen to the smallest. But that's not the only reason I decided to focus on the desktop/laptop iteration in this case study. My second usability study—which had users test the site on a laptop, a tablet, and a mobile phone—showed that users naturally gravitated toward using the platform on a larger screen. Why? There were two main reasons:
1
To watch trailers. Because laptop/desktop screens are larger, they're better for watching trailers, which users noted could help jog their memory of a film before evaluating it.
2
To type better. Even in a simulated situation like a usability study, users feel a sense of responsibility when evaluating films. Many said that completing evaluations on a computer lets them be more careful and thoughtful, especially because it's easier for them to type and edit their writing.
Final Prototype

Brand Design Inspiration
It was important to the festival coordinators that the TVFF website visually reflect Poland's rich cinematic tradition. Vintage Polish film posters are widely acclaimed within graphic design circles, so I decided to take inspiration from these—along with a few Polish posters in other categories—when selecting the project’s typography and color palette. Below are a few samples:

What I Learned
Responsive designs are challenging but worthwhile. Completing this project allowed me to better understand how and why users choose certain devices over others, and that sometimes the reasons are unexpected! For example, I intuitively hypothesized that users would prefer using the site on a larger screen, but I had never considered that it might be because they find it useful to watch film trailers in a separate tab. Nor had I anticipated how stressful users find typing notes, and that, due to insecurities over spelling and grammar, they far prefer typing on a laptop or desktop to typing on a tablet or phone. In short, this project taught me the importance of considering not only the users but also the devices you're designing for.
Next Steps
1
Third usability study.
Conduct a third usability study to determine whether user pain points have been effectively addressed or whether there are any new areas of need that have not been accounted for yet.
2
Add a built-in trailer feature.
Users wanted to watch film trailers to jog their memories before completing evaluations. While it's not strictly necessary, this feature could certainly be explored in future iterations.
3
Subtly expand criteria.
While the limited set of criteria in the evaluation flow is important for maintaining efficiency and fairness, future iterations could use data from the notes section to add criteria that are missing. For example, if it becomes clear over time that users frequently mention sound or set design in their notes, these categories could be added to the flow.
4
Crowd-based voting.
If budget allows, future iterations could explore how to securely allow non-judges to use the evaluation platform for limited purposes, like casting ballots for a "crowd favorite" award.
Thank you!
Thank you for taking the time to review my work on The Varsovian Film Festival website! If you’re interested in connecting, please reach out via the links below or through my contact page.