A request for feedback was circulated by email with a Google Form used to collect the responses. Thank you to the 70 people who took the time to respond to the survey. Your feedback is much appreciated.
Overall most respondents were very satisfied (5) or satisfied (4) with all aspects of the event. A number of useful suggestions were made to improve future events.
Additional comments in the responses were analysed and have been summarised in the attached PDF file, or click "Read More" to view the summary below.
Download 25ARC Post-event Survey Summary
2025 Australian Rogaining Championships: Murrumbidgee Wayfaring
Post-event Survey Summary
15-16 March 2025
Cooinbil Hut, Long Plain, Kosciuszko National Park
An email invitation was sent to competitors after the event to provide feedback on the event in general and on 13 specific aspects of the event. For each question, people were asked to provide a ranking between 1 (very dissatisfied) and 5 (very satisfied).
A free-text box was also provided to allow comments related to the specific aspects. Further free-text questions were:
- What did you think was done well for this event?
- What could have been improved?
- Any overall feedback for the event?
This summary analyses responses to all aspects and provides some of the more pertinent comments. The comments are particularly valuable in determining improvements or aspects of the event organization that should be retained.
176 teams comprising 392 participants took part, and feedback was received from 70 people. This is about 18% of participants (roughly 40% of teams), although it is possible some feedback represents the views of the team as well as the individual respondent.
Overall satisfaction
96% of respondents ranked the event as a 4 or 5 (satisfied or very satisfied). This has been taken to indicate that, overall, the event achieved the goal of a well-run event that met competitors' expectations for an Australasian Rogaining Championship.
It is noted, though, that the score could also reflect a reluctance by competitors to provide negative feedback, especially as there were no 2 or 1 scores (dissatisfied or very dissatisfied).
Table 1: Ranking of satisfaction for each aspect (1 = very dissatisfied, 5 = very satisfied).
Specific aspects
Event website and pre-event information
The event website was a dedicated site separate from the ACT Rogaining Association’s main site. It was built on a Joomla platform and used the QRA Cognito Forms entry system.
88% of people were satisfied or very satisfied with the event website and pre-event information.
10% were neutral and one person commented “the website was kinda confusing with what felt like too many pages and a homepage with only a giant photo”.
Event entry process
The event entry process had a page of entry information including dates, entry fees, the refund policy and items for sale. Links to the entry form, team changes and the team list were included. The entry system used the QRA Cognito Forms that many Australian rogainers will be familiar with through their own state associations.
97% of people were satisfied or very satisfied with the event entry process. No-one was dissatisfied.
Final instructions
Final instructions were posted on a dedicated page on the event website. This is how the ACT Rogaining Association publishes final instructions for all events, and these instructions used a standard template.
93% were satisfied or very satisfied with the final instructions. 7% were neutral or dissatisfied. There were a few useful comments on the final instructions:
- A number of people commented that the final instructions were not emailed to competitors. It is noted that emailing the instructions is standard practice for some state associations. All competitors did receive an automatic message on entry that said “Thanks for your entry. Check back on the ACTRA website in the week leading up to the event for final instructions.” It would be useful for future ACTRA-hosted Australasian Rogaining Championships (or other states that don’t use this process) to email competitors to advise them that the final instructions are available, with a link to the web page, in lieu of emailed instructions.
- A copy of the final instructions should be made available as a PDF. This was an oversight and should have been done.
- One person suggested a link on the home page for the event to the final instructions – a useful suggestion.
- A number of people arrived late on Friday evening at the Hash House site. A map was posted on a sign at the entry to show where the camping areas could be found. One person suggested this could have been included in the final instructions. This would be useful for Hash House sites where it is not obvious how the areas are laid out when arriving in the dark.
Bus transportation
A bus was provided to the event from Canberra. All seats on the 48-seater bus were sold. Only 11 of the 70 survey respondents were on the bus (23% of bus passengers), so the feedback here is limited. The bus had no problems and ran to time.
All of these respondents were neutral, satisfied or very satisfied with the bus transport. There was one comment from a New Zealand team who were grateful for the organised bus transport. One respondent requested a bus from Sydney.
Camping and Parking
Camping and parking for 420 people requires a large area. The Cooinbil Hut campground is big but has a number of constraints: no-camping areas, horse yards and fences. Organisers attempted to make the best use of the available areas by having some camp-by-your-car sites and another area of walk-in camping. The number of cars meant some parking was on unmown areas.
94% of people were satisfied or very satisfied with camping and parking. There were three comments:
- One person was accidentally directed to park in the unmown area where there was a large rock (fortunately no damage).
- Ample parking and able to set up tent close to car.
- Nice camping area
Toilets
There were four permanent long-drop toilets on site and a further six portaloos were brought in. At roughly one toilet per 42 people, this is slightly better provision than the usual 1:50 ratio that the ACT Rogaining Association uses.
83% of people were satisfied or very satisfied with the toilets supplied, and 17% were neutral. This lower level of satisfaction compared with other aspects of the event reflected the toilet paper, and the water in the portaloos, running out. There were a lot of comments related to this.
Without making excuses (we should have brought more paper), the portaloos were supplied with less paper than expected and did not have full water reservoirs on delivery. Organisers added at least 40 litres of water to every portaloo, but this was a time-consuming, hard job (carrying water from a creek about 200 m away) and was still insufficient.
There were no comments about the cleanliness or otherwise of the portaloos or long-drops. The toilets were cleaned at least once during the event by organisers.
Two key learnings:
- Two more portaloos should have been ordered (this would have made a ratio 1:35 people).
- A lot more toilet paper should have been brought.
Registration
Registration at the event was done prior to map hand-out, with time available on Friday evening and on Saturday morning. Map tokens were supplied to exchange for maps and course setters' notes. Gear checks were done at the time of registration. See below for additional commentary on compulsory gear.
We were not well organised with the process for bagging GPS devices and should have:
- Recorded who took bags and written team numbers on the bags
- Informed competitors that they would need to show us the bagged devices to enter the start corral.
This would have made for less confusion and a quicker process at the start.
94% of people were satisfied or very satisfied with the registration process. Three teams were neutral and one was dissatisfied. One neutral team registered at a time when we were unable to do the gear checks and needed to return later, which is a sub-optimal experience. The person who scored the registration as dissatisfied did not provide a comment as to why.
Two comments were made about a quick registration (and finish) process.
One comment suggested “Some people were asking about briefing punches. If not needed, perhaps mention that in the final instructions and briefing notes (if not already).” Mentioning this in the final instructions or at registration would be an easy improvement to make.
There were two comments suggesting there should have been a check in and check out punch at registration for teams returning overnight. This was considered but not implemented because it is difficult to ensure compliance and the only value in it is if the team does not return at the end of the event and a search is commenced. It is noted that rogaining rule 23 requires this.
Competition map
The competition map was custom-made using lidar data for the contours and a combination of three vegetation sources, including SPOT5. The vegetation layer used was judged to be the best compromise.
96% of people were satisfied or very satisfied with the competition map. One person was neutral and two dissatisfied.
Unsurprisingly, there were quite a few comments on the map. 11 people made the general comment that the map was done well.
There was one comment from a dissatisfied person “Got tripped up by inset at night - should have been highlighted in briefing. Partly our fault … but wasn’t clear that inset was a paste in (as in dark contours matched surprisingly well)”. Perhaps a thicker border may have alleviated this problem, although it was not mentioned by anyone else.
There were several comments about the vegetation, with one suggesting an alternative data source. Vegetation mapping is improving rapidly, with different methods for generating data being developed. It is recommended that mappers consider and trial options for future events (during setting, all setters had several map options to compare against each other while in the field).
- I know you did a bit of work on the vegetation density in the in the watercourses but the vegetation in general was a bit hit and miss. The latest point cloud data from ELVIS is dated 2018 and I thought that it may have been utilised to provide a better representation of the overall vegetation. Other than that the map was great.
- … just how precise the map was - in particular the dense (dark green) vegetation.
- Map was awesome with great detail particularly the lidar data for vegetation.
- Very accurate map and vegetation marking.
- The only challenge with the map was that there seemed no way to determine what was really scungy bush with lots of deadfall as opposed to just heavy forest that was better going. Making such a distinction may have been just too hard.
The following comment is similar to a couple relating to checkpoint locations. “Despite being LIDAR based we felt there were some subtleties in the terrain (very minor gullies) that did not show up clearly on the map and caused us great confusion in at least one location.” The area does have a lot of small or subtle features that do not appear well in the contours, even at 5 m intervals. The setters were conscious of this challenge but, in some places, it was difficult to avoid having competitors encounter areas of subtle features.
Five people commented on the dead wood on the ground or the thickness of the vegetation and the physical challenge this posed. They were looking for more information about this in the notes or on the map. The course setters' notes did include a mention of fallen timber and vegetation; however, this feedback suggests insufficient emphasis was placed on this text.
Checkpoint locations
A total of 91 controls/checkpoints were placed on the course, the most in any Australasian Rogaining Championship.
88% of people were satisfied or very satisfied with the checkpoint locations, 10% neutral and one dissatisfied.
Three responses suggested an expectation that higher-point controls should be more difficult (navigationally or physically). Perhaps this could be the subject of an information email from the ARA to state associations, reminding members that this is not how scoring on a course is intended to be set.
- Some checkpoints were inconsistent e.g. some on very vague features (not necessarily in the flat area), some on very prominent features. Scores also didn’t reflect difficulty.
- The scoring of the controls could be tweaked. Some very different low pointers and similarly some very easy high pointers.
- The only negative that I recall is that some low scoring controls were on marginal minor features whilst all the high scoring controls that we visited were on major features.
There were also three complaints that some checkpoints were in scrub or thick bush. Several potential CP locations were removed from the course after setters felt the area was thick enough to be unfairly slow to move through. However, it was not possible to avoid every thick area (or to know where it all was).
- Some CP locations really were buried in scrub and could easily have been move to nearby better locations.
- The checkpoint we failed to find was due to giving up because of the difficulty of pushing through and seeing through the bush (at night time). … I think fine to have challenging navigation but not be limited by the difficulty of the bush.
- I felt that the location of CPs 56, 106 and 96 had the potential to be unfair as they were all on vague features and sometimes in thick vegetation.
A response from a long-time rogainer made an observation: “The use of CP sites that were adjacent to, rather than in, watercourses was also good and very suitable for the area plus they were well described. This approach was reasonably common in rogaines 30 years ago but had drifted out of the sport.” This approach was taken where the actual watercourse was not very nice to get into or did not have a suitable tree, but there was a good adjacent site that could be found from all directions.
Three people thought the water drops should also be checkpoints. This is what ACTRA usually does; however, the lack of “surplus” numbers between 20 and 109 meant that we did not allocate points to the water drops.
Navlight scoring and results
There were no problems at the event with the Navlight software. The finish system worked efficiently: tags were read quickly and teams were provided with a printout to check their results.
See the Registration section above for the comment about informing competitors whether a briefing punch is required for the Navlight tags.
96% of people were satisfied or very satisfied with the Navlight scoring and results. The other 4% were neutral. There were no comments on the Navlight system, which is a good indication that it was functioning well.
Catering
Catering was undertaken by ACTRA volunteers, with a food truck serving spit-roasted meat and salads at the finish. Soup, curries, cheese toasties, sausages and breakfasts were served continuously. This was the first event of this scale ACTRA has catered for with our own volunteers, and there are a number of lessons we took from our observations.
91% of people were satisfied or very satisfied with the catering. 7% were neutral and one person was dissatisfied.
Competitors' expectations for catering are anticipated to be highly variable depending on what they are used to in their own state. This was reflected in the contradictory comments about build-your-own pizzas!
- The spit-roast food was a winner with some people, while others found it greasy and salty.
There was one comment about defrosting food in the sun not meeting food safety standards. This has been taken seriously and looked into.
The overnight food was appreciated by one person: “The food options for those stopping by in the small, dark hours were excellent”, although another found food to be scarce around 5 am.
Presentations
At the presentations, certificates were given out for second and third places, and a certificate and a trophy for first. Subsequent to the event, winners also received glasses (we forgot to take them to the event).
We did not plan the presentations well enough and should have made some sort of podium for all the place-getters in each category to stand on. This would have reduced some of the confusion noted in the feedback.
90% of people were satisfied or very satisfied with the presentations, with 8% neutral and 2% dissatisfied. One comment suggested including the state/country with the score and names of team members.
Some state associations read out all team results from lowest to highest score, and there were comments requesting this. However, this would have taken too long for an event of this size, with over 170 teams (an estimated 30 minutes or more just to read names and scores). This form of presentation is also incompatible with presentations of category placings.
Post-event results
Preliminary results were posted late on 16 March and presented using the rogaine-results.com website the following day. Other results features were added over the following days, such as the GPS tracks and Strava Fly-by. Organisers uploaded GPS tracks that could be found on Strava for around 50 teams, with others added by competitors; 65 of 176 teams had tracks within a month of the event.
96% of people were satisfied or very satisfied with the post-event results, with 4% neutral.
Comments were:
- Further ideas for results: an overlaid heat map of checkpoint pairs and a replay where you can watch all the little team dots wander around the map.
- The interactive rogaine-results website was used to show tracks. One person commented “I love when the rogaine-results website is used for results. … The gps overlay on the map is a great resource for reflecting and learning from mistakes.”
- The results website was really great and so interesting to analyse.
These three comments indicate there is value in adding to the results page the analysis tools that have already been built. A heat map of checkpoint visits is one that was not included this time but might be of interest.
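As a rough illustration of the checkpoint-pair heat map idea, the sketch below (Python, not part of any existing results tooling) tallies how many teams travelled each leg between consecutive checkpoints. The team routes shown are hypothetical examples of what could be extracted from Navlight punch records; the resulting counts could be drawn over the competition map as lines of varying thickness.

    from collections import Counter

    # Hypothetical input: each team's checkpoints in visit order, as might be
    # extracted from Navlight punch records. These routes are made-up examples,
    # not real event data.
    team_routes = [
        ["HH", "23", "41", "56", "HH"],
        ["HH", "23", "41", "70", "HH"],
        ["HH", "30", "41", "56", "HH"],
    ]

    def leg_counts(routes):
        """Count how many teams travelled each leg (unordered pair of consecutive checkpoints)."""
        counts = Counter()
        for route in routes:
            for a, b in zip(route, route[1:]):
                counts[frozenset((a, b))] += 1
        return counts

    if __name__ == "__main__":
        # Print legs from most- to least-travelled; these weights are what a
        # heat map overlay would visualise.
        for leg, n in leg_counts(team_routes).most_common():
            print(" - ".join(sorted(leg)), n)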
Other feedback
Water
The course was constrained in where water drops could be placed, as we had no access to fire trails beyond locked gates. Five drops were put out. Competitors were also given water purification tablets sufficient to treat 5 litres, for use with natural water on the course.
Water drops were generally located so that they could be visited between any two of three or more adjacent controls, rather than only between one pair of controls. Two of the drops (W1 and W5) were on the northern boundary of the course, requiring a diversion to get water.
We had initially thought we might put a map at registration giving an indication of where natural water might be found. However, at the time of hanging the flags, water was found in many locations but only where the vetters happened to go, so we thought it would bias people's courses to say exactly where the vetters found water.
If a course is similarly constrained in the future, either a map of known locations (even with the biases noted above) or more guidance in the course setters' notes on where water may be found should be included.
Key comments were:
- Water points should have a point value (i.e. be a control too). Some, particularly W1, were a little out of a straight line between controls on the course. As mentioned above, we did not have extra “points” to allocate. An option that should have been used was to make them worth 10 points, as there were “spare” 10s.
- Checkpoint descriptions should have been included for the water drops. This was an unfortunate oversight that should not have happened.
- “It was a matter of luck, in the end, whether we would find water in a particular creek or whether we would need to divert for water (and matters of luck should be minimised).”
Physical nature of course
Eight people commented that they found the terrain physically challenging. Some specifically mentioned fallen timber, others thick bush. A comment from another person acknowledged: “We did hear feedback from others that the course was brutal and they came back early but 24hrs is always going to be hard.”
In contrast, other respondents were complimentary about the country and vegetation: “I thought the vegetation was easier going on the whole than any of the NSW or ACT 24 hour events that I have done in the last 5 years”.
It is not possible to set a 24-hour rogaine in perfect country that no-one will find difficult. For example, at sites in South Australia without thick vegetation or fallen timber, people complain that their feet are sore from the hard rocks.
The setters deliberately did not use part of the area that had the thickest bush but did not think the rest of the course was unfairly thick.
European wasps
The Hash House site and places on the course had European wasps. Competitors were warned, but unfortunately a few teams did step on nests and were stung. More problematic were the nests and numerous wasps around the Hash House site. The setters did not find any on their last visit two weeks before the event, despite looking for them. Some nests were eradicated with insecticide on the Friday evening, but this had limited effect on overall numbers. The wasps were particularly bad for the Sunday lunch catering. There are probably limited options for organisers here, other than putting a request to land managers if the problem is identified before the event.
Compulsory Gear
The organisers decided to add compulsory equipment beyond the basic whistle that the rogaining rules require. This was advertised in the final instructions and included a space blanket or bag (per person), a basic first aid kit including two compression/snake bite bandages (per team), a head torch (per person), a beanie/buff and gloves (per person), a thermal top and bottom (per person), and a waterproof jacket (per person). At the event, with a warm weather forecast, competitors were advised at registration that gloves and thermal bottoms were not required.
There is a school of thought, and this was reflected in one comment, that competitors need to be responsible for their own actions, including gear that they take. The comment was: “I have a serious problem with any requirement for compulsory gear beyond a whistle. I have participated in over 200 rogaines ... In all of that time I have NEVER worn a waterproof coat. Thus, if I am required to carry one during an event it is simply a weight penalty. Also, why should I be required to carry a light at all times?”. Other rogainers expressed similar views verbally at the event. In hindsight, the head torch may have been better placed on the “recommended” list of equipment.
However, the organisers are aware that competitors will expect to be rescued if they get into difficulty during an event, and with a course that was substantially inaccessible without calling in external resources (ranger, emergency services), the active decision was taken to include these items. This decision needs to be made on an event-by-event basis, taking into account the location and possible weather.
The Australian rogaining rules are silent on compulsory gear beyond rule 15 “Competitors shall carry a whistle at all times whilst on the course. In an emergency a competitor shall give a series of short blasts on their whistle”. The Australian Rogaining Association Council may wish to consider whether the rules require modification to allow for additional compulsory gear for safety.
Summary
Overall, most teams appeared to be very satisfied with the event, and most aspects of the organisation went smoothly. A few key points highlighted above should be addressed next time, noting that these need to be incorporated into the overall organisation that worked well this time rather than becoming a specific focus in their own right.