The Systems Behind the
Australian Game Developer Awards
How empathy, ethics and Excel were used to transform a small conference after-party into a sold-out cocktail gala.
Executive Summary
Areas of Focus
Systems in Focus:
The Australian Game Developer Awards 2019 – 2023
Primary Audience:
Australian Game Developers (Applicants, Judges & Ceremony Attendees)
Key Challenges:
A history of ambiguous, opaque processes and inconsistent messaging that created distrust, called into question the integrity of the judging system and the legitimacy of the awards, and fostered an overall negative perception of the ceremony.
Desired Outcome:
Design an ethical and trustworthy judging system that highlights innovation in the Australian games industry and gives the entire industry access to impactful career opportunities.
Introduction
Award ceremonies are prevalent in all creative industries, and regardless of the profession, they have a vital role to play. They validate hard work, provide recognition, and offer a platform to showcase excellence. In the games industry, competition is particularly fierce and with development cycles often spanning years, visibility for a game is crucial. When it comes time for developers to release their titles, having their hard work recognised by winning an award can be extremely meaningful and impactful on both a personal and professional level.
The Australian Game Developer Awards (AGDAs) is a long-running annual awards show that has been celebrating Australian-made video games for over 20 years. In that time, the awards have changed names, management, venues, and categories many times over. The Australian games industry has also undergone many evolutions in that period, making for an ever-changing landscape within the sector.
When I joined the AGDAs design team in 2019, the awards were in a completely different place to where they are now. Back then, the AGDAs’ main objective was to act as a kind of ‘after party’ marking the conclusion of Games Connect Asia Pacific (GCAP), the Australian games industry conference. The judging process responsible for determining each year’s winning games was ambiguous and inconsistent, and that ambiguity and inconsistency could be seen across most aspects of the AGDAs in those years, most notably in the unexplained removal of individual award categories altogether in 2017 and 2018.
Though my contribution to the 2019 AGDAs was smaller compared to the years that followed, the decisions made in 2019 laid the foundations for the AGDAs we know today. In 2020 when the video game association that represented developers (GDAA), and the association that represented publishers and distributors (IGEA) merged, I was given the unique opportunity to take on the role of AGDAs Director.
As the first two years of this role fell in 2020 and 2021, most of my time was spent redesigning the ceremony so it could be delivered effectively as an entirely online experience, COVID-19 lockdowns having ruled out an in-person ceremony. Although these were uncertain times, the pivot to online ceremonies allowed the AGDAs to rapidly grow its reach and engagement, averaging approximately 1,600 simultaneous online attendees during the livestream. It also made the AGDAs accessible to non-GCAP ticket holders for the first time.
In 2022 and 2023, the AGDAs made a triumphant return to an in-person ceremony, to great acclaim. On the back of the significant growth of the online years, attendees were allowed to purchase tickets to the AGDAs separately for the first time, which doubled attendance compared with the last in-person ceremony in 2019. By 2023, the AGDAs was attended by more people than GCAP itself.
This transformational growth was made possible through the development and implementation of a new management system, event production schedule, and by building a new judging process from the ground up. These new frameworks were iterated on and improved upon year on year in order to deliver three key design objectives:
Maintain Integrity;
Showcase Innovation; and
Represent the Whole Industry.
These design objectives were built by:
Conducting research on the most common systems currently being used to deliver games award ceremonies across the globe;
Analysing how the needs of the Australian game development industry differ from that of the global industry; and
Understanding what the perceived issues with the AGDAs were within the various circles of the Australian game development community.
The Common Judging Systems
The Small Jury System
Looking at examples from across the globe, there are two primary models that are most commonly used to determine award winners in a typical games award ceremony.
The most common system organises a hand-picked group of individuals (or judges) into a jury tasked with reaching a consensus on which game submissions should be selected and which should be eliminated. Juries are typically made up of no more than 12 people who are expected to have the necessary knowledge between them to assess every application’s suitability for each category. Members of the jury express their judging opinions by advocating for the games they believe have the most merit, and debating against the games they feel are not suitable.
The process concludes once all differing opinions have been settled, and the jury have reached a final consensus on the winners.
The Pros & Cons of a Jury System are as follows:
PROs of a Jury System:
Considered Deliberation: When a jury is selected with care and consideration, the mix of alternative viewpoints, backgrounds and perspectives can unlock a deeper level of understanding of the unique qualities and shortcomings of each game.
Utilitarian Curation: Generally, juries tend to take a more utilitarian approach to assigning winners. Some juries will (knowingly or unknowingly) restrict the ability for a game to win in multiple categories, in order to distribute the available awards more evenly amongst the development teams/games.
CONs of a Jury System:
Groupthink: Because the debate doesn’t conclude until a unanimous winner is decided, a consensus can sometimes be reached not through critical reasoning and mutual agreement, but through a desire to achieve an outcome without conflict or repercussions. This can come in the form of:
Persistence Over Persuasion: In a system that requires group consensus, oftentimes the least open-minded, most stubborn and loudest voice in the room can get their way by simply doubling down on their opinion until the rest of the jury ends up too tired to continue debating.
Fear of Judgement & Repercussions: Jury members may be afraid that their opinions could be held against them or used to block them from future opportunities, and as such will not voice their honest opinion in the debate.
Integrity Degradation: The limited size of the jury creates a number of issues that worsen as the number of game submissions increases. As submissions grow, the expectations placed on the judges become too high. This can result in a decline in judging quality as judges dedicate little to no time to the games they are less interested in playing, in order to spend more time on the games that align more closely with their personal interests and disciplines. The result can be submissions assessed by only one judge or, in some cases, game submissions never being played at all, and ultimately being “skipped” for assessment entirely.
Availability over Ability: The heavy time commitment the Jury System demands, both to assess the games and to align schedules for long or multiple jury meetings, inevitably creates a jury pool that favours jurors with fewer commitments during the judging period over jurors who have the expertise best suited to judging the submissions, but limited availability to attend deliberations.
The Large Judging Pool System
Another common model is to utilise a large pool of judges to evaluate the games. The idea is simple: more perspectives should result in a more balanced assessment of quality.
Typically, judges individually score the game submissions. Games are either assigned to judges by the manager of the judging process, or judges are allowed to ‘opt-in’ to the specific games they wish to judge; on some occasions, a combination of these two approaches is used.
The individual scores from this pool of judges are collated to form an averaged score, typically using either a weighted average or median. This score is then used to determine which games move forward. For many games award shows, this type of judging serves simply as the initial round, which establishes a shortlist to be used in the next phase of judging.
This next phase of judging will most commonly be delivered in one of the following two formats:
Grand Jury Debate: A smaller panel debates the merits of all shortlisted games in each category, and deliberates to reach an overall consensus on all winners.
Category-Specific Committees: Each category has its own specialised judging committee responsible for evaluating and selecting a winner within their designated genre or discipline.
The Pros & Cons of a Large Judging Pool System are as follows:
PROs of a Large Judging Pool System:
Distinctive Gaps: Calculating averages or medians from a large dataset reduces the likelihood of games receiving identical averaged scores, making shortlisting more straightforward. A smaller panel might struggle with close calls and need a tie-breaker policy, but a broad voting base naturally separates top contenders.
Outlier Resistance: In a system with a smaller pool of judges, a single judge scoring a game unusually high or low due to personal taste or preconceived expectations can be the difference between a high-quality game making the shortlist and going on to take the win, or that game not receiving a single nomination. A larger judging pool that utilises a median or weighted average system can mitigate outliers and ensure that extreme swings won’t disproportionately impact the overall results, and thus should create shortlists that are reflective of industry-wide opinion.
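The outlier resistance described above is easy to demonstrate. The following Python sketch is purely illustrative (the article doesn’t prescribe a specific aggregation formula): a median shrugs off a single extreme score where a simple mean does not.

```python
import statistics

def robust_score(scores: list[float]) -> float:
    """Aggregate a game's judge scores using the median,
    which resists a single judge's outlier score."""
    return statistics.median(scores)

# Five judges score a game out of 10; one judge is an extreme outlier.
scores = [8, 9, 8, 9, 1]

print(statistics.mean(scores))   # the mean is dragged down to 7.0
print(robust_score(scores))      # the median stays representative: 8
```

With a jury of only three or four, that single outlier could sink the game entirely; across a large pool, the median keeps the result aligned with the broader consensus.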
CONs of a Large Judging Pool System:
Monitoring Issues: A large pool of judges introduces challenges in oversight. With so many individuals involved, it becomes difficult to verify that judging is delivered on time and to a consistent level of quality.
Motivated by the Keys: In many large judging pool systems, judges often select which games they want to review, rather than being assigned a fixed set of games. This can lead to ethically questionable behaviour, such as “donkey voting” where judges submit rushed or arbitrary scores simply to gain access to free game keys. If left unchecked, this behaviour can compromise the integrity of the results.
Balance: Ensuring each game gets a fair number of judges assessing it is another challenge in a large judging pool system. Without strict oversight, popular titles will receive more attention, creating situations where lesser-known games are not reviewed as thoroughly. This can lead to an uneven playing field where well-marketed or better-known games have an advantage over hidden gems.
The Australian Audience
Unlike the Independent Games Festival (IGF) and the British Academy of Film and Television Arts Awards (BAFTAs), the AGDAs require that all submissions are primarily developed within Australia. This bespoke rule means that each year the award ceremony is a huge opportunity for Australian game developers to have their work recognised in front of their peers.
This bespoke rule also created a series of problems, as the Australian games industry is a relatively small and tight-knit community, and each year’s AGDAs results proved a divisive topic amongst the industry. Fortunately, this criticism was an opportunity to better understand how the system had previously failed its audience, whether that failure was perceived or actual.
Perceived Issues
A list of common criticisms the AGDAs received, expressed both online via social media, and discussed in-person (both publicly and privately) was compiled and boiled down to the following problem statements:
Melbourne Bias: “Winners are selected based on the State they are from; that’s why Melbourne always wins everything”
Lack of Integrity: “My game wasn’t nominated because no-one even played it”
Indie Darling Bias: “Only the artsy PC games can win an AGDA; excellent design work done in my sector of industry (mobile, live service etc.) never gets any recognition”
Insider Bias: “Only those that are connected with the organisers of the event will win, so there’s no point even submitting”
Meaningless: “No one cares about the AGDAs winners. It’s not like winning one ever led to any opportunities”
It was clear that several years of minimal clarity around the judging process had created distrust, called into question the integrity of the judging system and the legitimacy of the awards, and resulted in an overall negative perception of the ceremony.
The system going forward would need a strong framework behind it, one that would rebuild trust and engagement by communicating the processes clearly and creating opportunities for a diverse range of games from across Australia to have their strengths showcased.
Three design objectives were created to serve as the key pillars that would address these issues and create an event that the entire industry could get behind.
Key Design Objectives
Maintain Integrity
Build faith in the integrity of the system by proudly displaying a fair and transparent judging process that doesn’t live behind closed doors.
Showcase Innovation
Showcase titles that exemplify excellence in innovation and originality, and spotlight the unique strengths of the Australian games industry to consumers and investors globally.
Represent the Whole Industry
Ensure that proper due diligence is conducted and inclusive practices are utilised so that the whole Australian games industry is represented.
Equipped with the knowledge around the strengths and weaknesses of other games award ceremonies and judging processes, as well as a thorough understanding of the needs of the Australian games industry, 10 core design frameworks were developed.
After I assumed the role of AGDAs Director, every aspect of the AGDAs was redesigned from the ground up and incrementally improved upon, year on year. It was important to ensure that these core frameworks supported developers, and allowed the AGDAs to grow in both quality and size.
The 10 core design frameworks are as follows:
Core Design Frameworks
1. Respect all Games Submissions
In Australia, the differences in team size (from solo developers, to teams of 100+), budget, and game length vary wildly from project to project. As such, it was important to establish an even playing field where smaller teams with smaller games, or lower budgets weren’t excluded from an opportunity to have their work recognised in a category that they excelled in.
A mission statement was developed: Games must be assessed not just on their general product appeal or popularity, but by evaluating the specific aspects of each game’s design (audio, art, gameplay etc.) to find examples of engaging and innovative work.
The first component of delivering this mission statement was to make sure that each team who submitted their game to the AGDAs was given a fair chance to have their work recognised. This was delivered via a publicly stated judging guarantee: Every submission will be judged.
Note: This may sound obvious, but it is surprisingly not a common rule. For the most part, global award shows have cleverly worded eligibility criteria that allow games to be culled from the initial round before they’ve even had a chance to be assessed.
This rule, that every submission will be judged, was displayed publicly on the website and further specified that every eligible submission will be assessed by at least 5 judges.
In reality though, the system for ensuring each game was assessed was far more intricate than this. A better description is that each game would be evaluated by at least 5 judges, all from different backgrounds and skillsets, in the first round of judging.
Each game was assessed by a mix of designers, journalists, artists, academics, and engineers in the first round, each of whom would assess the game against the same criteria, but would all have their own unique frame of reference to ensure that excellence was spotted from every angle.
If a game was successful enough to secure multiple nominations during the judging process, it would have been played by, on average, over 16 different judges.
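As an illustration of the kind of assignment logic involved, here is a hypothetical Python sketch that gives every game at least five judges while spreading them across disciplines. The discipline names, the `MIN_JUDGES` constant and the round-robin strategy are my assumptions for the sketch, not the AGDAs’ actual implementation:

```python
from collections import defaultdict
from itertools import cycle

MIN_JUDGES = 5  # the publicly stated minimum judges per game

def assign_judges(games, judges_by_discipline):
    """Give every game at least MIN_JUDGES judges, covering a
    different discipline for each slot where possible.

    judges_by_discipline: {"design": [...], "art": [...], ...}
    Returns {game: [(discipline, judge), ...]}.
    """
    pools = {d: cycle(names) for d, names in judges_by_discipline.items()}
    discipline_order = cycle(judges_by_discipline)
    assignments = defaultdict(list)
    for game in games:
        covered = set()
        while len(assignments[game]) < MIN_JUDGES:
            d = next(discipline_order)
            # Prefer a discipline that hasn't yet looked at this game.
            if d in covered and len(covered) < len(pools):
                continue
            covered.add(d)
            assignments[game].append((d, next(pools[d])))
    return dict(assignments)
```

Cycling through both the disciplines and each discipline’s judge pool keeps workloads roughly even while guaranteeing that, with five or more disciplines available, every game is seen from five distinct professional angles.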
2. Evaluate with Experts
When there are a large number of high-quality games in the running, it becomes much harder to distinguish which game shows the highest level of excellence in an individual category and ultimately deserves to claim first place. To ensure that the passionate creatives working in speciality fields didn’t have their innovative work go unnoticed, a multi-layered judging system was introduced.
This two-phase system ensures that games are first filtered through a broad industry perspective before being examined with deeper scrutiny by a more specialised group.
AGDAs Judging Process Overview:
A step-by-step guide to how the judging process works
The first round requires judges (referred to as Generalist Judges) to assess games in their entirety. Generalist Judges take a broad overview of the various qualities in their assigned games to identify excellence in its many different forms. Generalist Judges assess their games against the judging criteria, which is utilised as a guide for what constitutes excellence in each award category.
The scores from the generalist round of judging are collated, and a shortlist is formed from the highest performing games in each respective award category.
In the second round, each award category’s shortlist is provided to a separate set of judges (referred to as Specialist Judges), who have expertise in, and best understand the intricacies of, the discipline of their designated award category. Specialist Judges utilise the judging criteria to critique the shortlist, compare the strengths and weaknesses of each game, and rank them in order from first to third. The ranking orders from the specialist round of judging are collated to ultimately decide which game should be crowned the winner of each respective category.
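The two-phase flow above can be sketched roughly as follows. This is an illustrative Python outline only; the actual aggregation formulas (median versus weighted average for the shortlist, and the rank-collation method) are assumptions, as the article doesn’t specify them:

```python
import statistics

def shortlist(generalist_scores, top_n=3):
    """Round 1: collate each game's generalist scores (median
    assumed here) and keep the top_n games per category."""
    medians = {g: statistics.median(s) for g, s in generalist_scores.items()}
    return sorted(medians, key=medians.get, reverse=True)[:top_n]

def pick_winner(specialist_rankings):
    """Round 2: each specialist submits an ordered list, best first.
    The game with the lowest total rank position wins."""
    totals = {}
    for ranking in specialist_rankings:
        for position, game in enumerate(ranking):
            totals[game] = totals.get(game, 0) + position
    return min(totals, key=totals.get)
```

The key property of the design survives any reasonable choice of formula: the broad generalist pool filters, and the small specialist panel decides, without either group overriding the other’s role.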
As a way to both ensure that specialist judges were highly skilled and unlikely to have a potential conflict of interest with the local development teams they were assessing, international judges were more commonly used in the specialist judging round.
To mitigate the risk of the system experiencing any potential blockers, a number of safeguards were designed:
Tie breaker event: In the unlikely scenario of a tie occurring in the final round of judging, the following procedures would be followed to ensure a singular winner was crowned:
Primary Tie Breaker | Generalist Verification: The game that performed the best in the previous round (the generalist judging round) would be considered the winner.
If this was also a tie:
Secondary Tie Breaker | Weighted Innovation: The game that the specialists deemed to be the strongest in the innovation core consideration pillar (as per the judging criteria) would be considered the winner.
If this was ALSO a tie, then:
Final Tie Breaker | Specialists Adjudication: An additional specialist would be brought on board to act as an adjudicator.
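The cascade above maps naturally onto a simple fall-through function. This is a hypothetical Python sketch of the procedure as described; the score structures and the `adjudicate` callback are my assumptions:

```python
def resolve_tie(tied_games, generalist_scores, innovation_scores, adjudicate):
    """Apply the tie-breaker cascade to games tied in the final round.

    generalist_scores / innovation_scores: {game: score}
    adjudicate: callback representing the additional specialist
    brought in as the final resort.
    """
    # 1. Generalist Verification: best generalist-round performance wins.
    best = max(generalist_scores[g] for g in tied_games)
    tied_games = [g for g in tied_games if generalist_scores[g] == best]
    if len(tied_games) == 1:
        return tied_games[0]
    # 2. Weighted Innovation: strongest on the innovation pillar wins.
    best = max(innovation_scores[g] for g in tied_games)
    tied_games = [g for g in tied_games if innovation_scores[g] == best]
    if len(tied_games) == 1:
        return tied_games[0]
    # 3. Specialist Adjudication: an extra specialist decides.
    return adjudicate(tied_games)
```

Each stage only narrows the tied set, so the procedure is guaranteed to terminate with a single winner, matching the stated goal of always crowning exactly one game per category.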
In reality though, ties were extremely rare in the specialist round. Specialists were often aligned on the decision of which game should win, despite the fact that blind judging was in place (see core design framework 6. “Keep the Judging Impartial”).
Judge Reassignments: In the unlikely scenario of an external factor preventing a judge from completing their assessment of one or more of their assigned games, games would be reassigned to other judges to ensure that the minimum number of judges required to assess each game was met.
This was a necessary safeguard in the initial years, when COVID-19 was far more likely to suddenly affect a judge’s health or limit their ability to access the equipment needed to judge their assigned games. In the later years, it was a less common occurrence.
3. Put an Open Call Out for Judges
An important aspect of building audience trust in the judging process was being open and transparent about how the judging was conducted. One of the best ways to do this was to allow a broad range of industry folk a chance to participate in the process and become judges themselves.
For the very first time, there was a public open call where anyone could apply to be an AGDAs judge. This procedure not only allowed for the size of the judging committee to grow to an average of 80 judges per year, but it also acted as a show of good faith in the process. This helped build the reputation of the AGDAs and establish a public sense of integrity by avoiding a “members only” approach to judging in favour of a more inclusive one that more accurately represented the industry. It also meant that the process had the variety of judges needed in order to provide an effective and fair assessment of the diverse range of games being made each year in Australia.
Note: It’s important to note that the games being submitted weren’t all premium PC products with their Steam keys ready to go, needing only a mouse, keyboard and a working PC to play. Far from it! The submissions came in all shapes and sizes, so our judges needed to not only offer their unique perspectives on a game, but also have access to the tools needed to effectively play the products being developed across the many sectors of the Australian games industry. From free to premium, console to PC, the Oculus Quest 2 to Apple Arcade, this open call allowed us to get the right people for the job, and a lot of them!
4. Craft Relevant Award Categories
In 2017 and 2018 the AGDAs removed all named game categories (other than game of the year) and opted instead to award 8 games an undefined AGDA. This confused both the audience and the winners, but it also presented a unique opportunity for the categories to be redesigned to reflect the strengths of the Australian games industry as it then stood.
Over the years following this strange category naming event, the following game award categories were added:
Mixed Reality (AR/VR) was added as a way to allow the many studios working on PSVR and Oculus projects an opportunity to showcase their work and to make sure advancements made in this newer technology were always being showcased.
Impact (Serious Games) was added to allow games focused on changing behaviours and educating audiences an opportunity to have their work treated with the same respect as entertainment focused products.
Mobile* was added to ensure that the large workforce in the mobile games space had an opportunity to have their work respected, at a time when those working on PC and console would unfortunately often look down on mobile game developers.
Emerging was added to ensure that games made by smaller and less experienced teams (often with lower budgets) were able to have their debut titles showcased. This award would also have the benefit of providing those newer teams with a unique opportunity to be recognised by, and connect with industry at an event that over half of the Australian games industry attended.
Live Service (Ongoing) was added so that developers that worked on products that didn’t follow the traditional “develop and release cycles” were provided their own unique pathway to showcase the hard work they were delivering in the form of content updates, DLCs and quality of life changes.
Narrative was added to allow writers and narrative designers an opportunity to have their contributions highlighted, in particular for the titles made by passionate teams pouring their heart and soul into meaningful stories and compelling characters.
In addition to these new categories, Audio was split into two awards, recognising Music and Sound Design individually. The split allowed a sound designer’s high-quality work (which is sometimes, ironically, less noticeable the better it is) to be recognised separately from the evocative scores and catchy soundtracks created by composers. Since the split, no single game has won both the Music and the Sound Design award, underscoring the distinction between the disciplines.
Finally, the Gameplay, Art & Accessibility awards were all re-established.
* Note: I later discovered that “Best Mobile Title” was actually a category a decade earlier, though very different in nature, as the iPhone hadn’t yet launched, so touch screens and mobile apps weren’t yet normalised.
5. Publicise the Judging Criteria
In order to alleviate concerns regarding ambiguous and opaque judging processes, and allow for games applicants and judges to be aligned in their understanding of what excellence meant for each award category, a judging criteria was designed and publicly displayed via the newly created AGDAs website.
This included a mission statement for each category which was designed as an easy way for the entire audience to understand the purpose of the category, which was also displayed alongside the category on the main page of the website.
The criteria were broken up into 3 core consideration pillars that allowed the judges to assess the games through 3 lenses:
Game Feel (Intuitive, Immersive & Engaging)
Game Polish (High Quality, Coherent & Consistent)
Innovative Approach (Unique, Surprising, Ground-breaking OR Subversive)
Each of these core considerations was contextualised for its relevant category and written in the form of prompt questions. This was done to avoid a prescriptive approach to evaluation, where only games following tried and tested designs would score highly. The goal instead was to allow novel or non-traditional designs that broke the mould, and offered players a memorable experience, to remain competitive throughout the judging process.
6. Keep the Judging Impartial
Cognitive bias in its many forms presents a huge problem to all judging systems globally and the AGDAs was no different. This was a particularly hard problem to solve as biases are, by their very nature, not visible to the person experiencing them. In order to address this the system needed to mitigate as many biases in the process as possible to ensure that the results were fair.
The new process was inspired by the somewhat famous story of the symphony orchestras, who were able to make their hiring practices more impartial by implementing a blind audition system. Musicians who were trying out for the orchestra that year would perform their auditions behind a screen, obscuring their identity, gender, and all other visual factors which allowed the selection committee to mitigate their unconscious bias and judge the applicants on talent alone.
For the new AGDAs judging process, a number of metaphorical “screens” were introduced so that judges were able to judge the games on excellence alone, and allow them to not be swayed or influenced by factors that could potentially skew their results. In order to keep the judging impartial, and make it as fair as possible for the games in the running, a blind judging system was established to address the following:
Groupthink: By avoiding debate as part of the process, the system eliminated the ability of the most persistent (sometimes most stubborn) and loudest voice in the room to control who won. This was especially useful because judges who weren’t as comfortable in public group settings were often excellent judges who appreciated a format where they could more easily deliver their verdict.
Fear of Repercussion & Judgment: The confidentiality that the blind system created meant judges were able to be more candid about their critical analysis of the games they were assessing.
By implementing a blind judging system to address the above, it allowed for:
Confidential Evaluation: Blind judging ensured that one judge was not aware of another judge’s opinion of the same game. This would ensure that each judge's assessment was independent, and would prevent any scores from changing to align with that of another judge’s evaluation.
These changes in the design of the judging process allowed for a fairer, and less biased result, which ultimately addressed all Key Design Objectives. One of the most notable outcomes of this approach was that the results started to more accurately “Represent the Whole Industry”.
In the years since this blind judging approach was implemented, there has been a wider diversity of developers and games acknowledged through both nominations and wins. There is greater representation from all states and territories, and smaller, lesser-known games are receiving the recognition they deserve.
As an additional measure to ensure that the judging was fair, and that bias was addressed in all forms, there was also a clear procedure to identify conflicts of interest. Before judging commenced, judges were provided with the full list of all eligible games requiring judging and instructed to mark any games with which they had a potential conflict of interest; those games were not assigned to the respective judges, ensuring a fair assessment of each game. Because the open call had increased the number of judges available, assigning games in both rounds of judging while simultaneously avoiding conflicts of interest remained achievable.
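The conflict-of-interest procedure can be sketched as a filtering step before assignment. Again, this is an illustrative Python outline; the function names and the least-loaded-first balancing strategy are assumptions, not the actual AGDAs tooling:

```python
def eligible_judges(game, judges, conflicts):
    """conflicts: {judge: set of games they flagged a conflict with}.
    Returns the judges who may be assigned this game."""
    return [j for j in judges if game not in conflicts.get(j, set())]

def assign_without_conflicts(games, judges, conflicts, per_game=5):
    """Assign per_game judges to every game, never crossing a
    declared conflict, spreading workload as evenly as possible."""
    load = {j: 0 for j in judges}  # per-judge workload tracker
    assignments = {}
    for game in games:
        pool = eligible_judges(game, judges, conflicts)
        if len(pool) < per_game:
            raise ValueError(f"Not enough conflict-free judges for {game}")
        # Pick the least-loaded eligible judges to balance effort.
        pool.sort(key=load.get)
        chosen = pool[:per_game]
        for j in chosen:
            load[j] += 1
        assignments[game] = chosen
    return assignments
```

The `ValueError` branch reflects why the open call mattered: with a small, fixed judge pool, a well-connected judge’s conflicts could easily make an assignment impossible, whereas a pool of ~80 judges leaves plenty of conflict-free options.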
7. Cultivate a Positive Judging Experience
To ensure that games were given the respect they deserved, it was important that judges in turn were treated with respect and also provided with adequate support.
This was done by:
Reasonable Time Allocations: Checking the length of the games and the amount of time a judge had, and aligning those two measures to ensure a reasonable amount of time was able to be dedicated to each game. It also ensured that judges wouldn’t be required to allocate an unreasonable amount of time to complete all their judging by the assigned deadline.
Clear Run-down: Judges were clearly communicated with regarding the entire judging process, and all the relevant deadlines and dates upon signing up to be a judge. Ongoing communication was maintained throughout the process, and a dedicated AGDAs inbox was established to ensure there was always a point of contact for judges should they need it.
Tech Checks: Completing and booting all of the games assigned to judges was separated into an earlier stage of the process. This mitigated a huge number of problems as approximately 80% or more of any tech issues came from the initial installation stage (keys > install > boot).
Separating this process allowed the judges to run a “tech check” on the games before things took off, which meant that when it came time to sit down and judge the games, everything was ready to go.
Iterative UX Design: Judging forms had continuous user-experience focused improvements made every year, based on feedback provided by judges through a series of feedback loops and annual review systems.
Thank Them: All judges had their hard work acknowledged once results were locked in. This was done via a thank you reel at the end of the ceremony. In addition, they were provided a complimentary AGDAs ticket as a small reward that allowed them to enjoy the fruits of their labour and celebrate alongside other members of industry.
8. Provide Consistent Outreach
For years, a lack of faith in, and understanding of, the AGDAs process meant that many developers often:
Weren’t aware that they needed to submit their games to be in the running;
Weren’t aware when submissions were open or would be closing soon;
Had no faith that their game would be considered of value to the judges; or
Felt the system was rigged, and it wouldn’t be worth their time submitting.
This meant that, to ensure there were no glaring omissions from the titles in the running each year, and to give the system the best chance of producing winners from several different states and territories, a proactive and diligent approach was needed to keep track of which titles were in development in every state and territory, and when they were coming out.
I’d love to tell you that this was an intricately designed system with a map of Australia and timers counting down to each game’s release via a fancy app that scrapes Steam data… but it wasn’t.
This was me with a list of games on a spreadsheet, persistently finding the social media handles and emails of developers I had likely never met, so that I could message them, ask when their game was coming out, and politely but firmly encourage them to submit to the AGDAs.
I was able to keep track of hundreds of games and their release dates by:
Accessing great resources like the SIFTER list;
Posting and lurking in all the various state-based game developer discord communities; and
Having “boots on the ground” developers in every state and territory to tell me who I should keep an eye on.
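The tracker itself was an ordinary spreadsheet; purely for illustration, here is a rough code equivalent of the outreach filter it enabled. The column names, game titles, and studios below are all invented, not real entries.

```python
import csv
import io
from datetime import date

# Hypothetical columns mirroring the spreadsheet: one row per tracked game.
TRACKER_CSV = """\
title,studio,state,expected_release,contacted
Reef Runner,Coral Games,QLD,2023-06-01,no
Outback Odyssey,Red Dust Studio,WA,2024-03-15,no
Laneway Legends,Hoddle Interactive,VIC,2023-09-30,yes
"""

def outreach_list(tracker_csv, year):
    """Return games expected out in `year` whose devs haven't been contacted."""
    rows = csv.DictReader(io.StringIO(tracker_csv))
    return [
        row["title"]
        for row in rows
        if date.fromisoformat(row["expected_release"]).year == year
        and row["contacted"] == "no"
    ]
```

Filtering on expected-release year plus a contacted flag is all the “system” really needs to tell you who to message next.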
This may not have been one of my most intricate approaches to designing a solution, but it’s important to include it as a key framework, because sometimes persistence really is the piece you need to make things happen. And my goodness, did this work well!
There are multiple teams whose games made it into the running who received a message from me every year asking “is your game coming out this year?”, delivered, I’d imagine, with the same level of annoyance as a child in the back seat of a car asking “are we there yet?”.
This, combined with a dedicated AGDAs email posted on every major AGDAs announcement, meant the AGDAs was far more approachable, and ultimately more trustworthy, to the average developer than it had previously been.
9. Make the Moment Meaningful
Winning an award should be a moment of prestige that feels impactful for both finalists and winners, so it was important that value was provided to them long after the ceremony ended. To achieve this, the way the AGDAs celebrates its winners and finalists was dramatically improved through the following features:
Laurels for Nominees & Winners: Winners and nominees were provided with official laurels allowing them to highlight their achievements across multiple platforms and distinguish themselves in a competitive marketplace. These were provided in different aspect ratios so they could be proudly displayed on Steam store pages, websites, press kits, or even email signatures.
The Finalists Reel: A video was produced for the ceremony consisting of clips from the finalists, shown before the winner of each category was announced. These reels featured judges’ commentary, offering deeper insight into what made each game stand out. This kept the audience in sync with the judging process, reinforced the credibility of the system, and helped them appreciate the value of games they might otherwise have known nothing about.
Live Stream: The ceremony was streamed in its entirety, allowing the event to reach beyond the limits of a physical location on a single night of the year. This digital expansion meant that developers could celebrate the AGDAs with their teams even if they weren’t physically present, and it provided additional opportunities for audience engagement. By making the ceremony more inclusive and accessible, the AGDAs transformed from a singular, exclusive, paywalled event in Melbourne into a nationwide moment of recognition that everyone could celebrate.
Note: The live stream also meant that, for the first time, a full video recording of the annual AGDAs ceremony was available to anyone who wished to access it.
10. Don’t Rest on your Laurels
To ensure that the AGDAs judging process continued to meet its design objectives at the level of effectiveness required, an iterative approach was taken so that opportunities for improvement weren’t missed.
This came in the form of three design techniques that added an experiential learning perspective and a human-centric approach to the design:
User Journeys: From the first question on the application form to the final group photo on the red carpet, each touchpoint that applicants, judges and attendees interacted with was mapped out regularly in order to identify and understand where the pain points could be reduced and where design changes could be made to improve the user experience.
Feedback Loops: Data was collected via multiple feedback loops in the form of various anonymous feedback surveys, retrospective reviews, and one-on-one discussions to ensure that multiple perspectives were heard and incorporated into future improvements.
Reflection: When issues arose, they weren’t dismissed or ignored. Instead, each problem was logged, documented and reviewed to find ways to solve the issues in question, improve system effectiveness, and identify opportunities for future growth. This ensured that even the smallest inefficiencies or concerns had a clear path toward resolution in the iterations made in the following years.
Conclusion
When all is said and done, it’s important to acknowledge that ranking and evaluating creative works will always be an inherently subjective process.
This is why the goal of the AGDAs was never to build the perfect system, but rather to get better year on year at delivering an ever-growing system that:
Improves incrementally with each iteration;
Makes being a finalist mean something;
Guarantees that no one gets left behind;
Honours those that make it all possible;
Maintains its integrity diligently;
Doesn’t operate exclusively behind closed doors;
Represents the needs of the industry;
Includes industry in the conversation;
Ensures excellence is spotted from every angle; and
Respects games as the art form that they are.
The Australian games industry is filled with extremely talented individuals who deserve appropriate recognition for their hard work on both a national and global stage. The AGDAs is just one way to achieve this, but if you’re going to elevate and celebrate Australian talent, what a fun way to do it.