08.10.2020 Author: IPMA Awards

IPMA virtual mega project award assessment: what a challenge!


This year was my first experience with the IPMA Global Project Excellence Award. I had already been an assessor and judge several times for the PMO of the World Award, but those assessments are completely different. You watch a one-hour presentation of a PMO on your own. You assess that PMO on six criteria: the PMO's journey, client service, best practices, innovation, community, and value generation, and you summarize strong points and areas for improvement. On top of this, you compare it with another PMO on the same six criteria. In the final round you do this for the four finalists. A great experience, and you can learn a lot.

The IPMA Global Project Excellence Award is different. This is a search for excellent projects in three categories: Small/Medium-Sized (budget < €50 million), Large (budget < €200 million), and Mega (budget > €200 million).

To become an assessor, you have to complete a three-day training to understand and be able to use the Project Excellence Model (PEM).

So, I followed this training and certification in March 2020 (just before the lockdown due to COVID-19) in Vilnius, Lithuania, and I was looking forward to the first assignment, hopefully in Asia. Several months later, after the original site-visit period had been rescheduled, I received a request from the IPMA Awards PMO asking whether I was available as an assessor for the IPMA Project Excellence Award. Yes, but as you can understand I was a little disappointed too: no travelling, no real-life experience of the culture or the atmosphere; it would be completely online!

(Virtual) site visit preparation

At the end of June, I received a mail from the IPMA Awards PMO confirming that I was selected as an assessor for the Mega-Sized project category, together with several documents, e.g. the Global PE Awards Guidelines for the virtual assessment process and the application report of the local applicant. I also received the name of the Team Lead Assessor (TLA). The rest of the team was not yet known (to me).

After a couple of days, I received a mail from our TLA, introducing himself and making arrangements for our team virtual meeting. We had to analyse the application report, perform an individual assessment, and record the results (strengths, areas for improvement, questions, scores) in a standard assessment spreadsheet.

It took me around a day to read the application report and fill in the individual assessment. To make sure I was on the right track, I started with one criterion and asked our TLA for feedback. We discussed some attention points via Skype, and that definitely helped me perform the individual assessment.

The virtual meeting was the first moment to meet the complete team. To get to know each other we used a speed-dating format, and everyone introduced themselves.

It was, as expected, a very diverse team: six people in total, two women and four men, six nationalities, and different backgrounds (at least one assessor must have industry knowledge relevant to the application and one must be a native speaker of the applicant's language).


                Country                 Time zone
TLA             Germany                 CEST
team member 1   Iran                    CEST + 02:30
team member 2   Poland                  CEST
team member 3   England                 CEST − 01:00
team member 4   Nepal (based in USA)    CEST − 06:00
me              Netherlands             CEST


As you can understand it was quite a challenge to find the right moment to have online meetings or calls because the total time difference was 8.5 hours.

After discussing our findings and individual scoring (big individual differences were discussed and sometimes adjusted), we prepared the first Judges Report. We had a look at the virtual work guideline and agreed on the next Skype call.

In total we had three Skype calls in the evening to discuss, among other things, the agenda for the virtual site visit (in August 2020). We did two tests with the applicant to check the conference software. We started with Skyroom but had lots of connection problems; Zoom wasn't allowed in the applicant's country, and the applicant had problems with Skype too. We ended up with Google Meet, managed by the applicant, and used Skype for our own team meetings. Besides the Skype meetings we used mail and WhatsApp to exchange pictures and videos from the applicant, and our local team member sent us lots of pictures and videos to give us a flavour of the beautiful local culture, as well as some information on the political and economic situation.

Based on our assessment we selected, for each criterion, the top five questions to answer during the site visit (everyone got three criteria to prepare questions for).

A specific topic was our dress code in front of our own PC cameras. In line with the IPMA guidelines and the local habits, we agreed that the men would wear a jacket and tie and the ladies would wear a scarf to cover their hair (to be honest, for me it was shorts, shirt, tie and jacket).

Virtual site visit

The Sunday evening before, we had our virtual site visit kick-off. The agenda showed several moments where we split our team into two smaller groups to make it possible to get all our questions answered. Because several of the documents were in the local language, we decided to split the document review into two groups too (English and local language). For the first three days we started at 09:00 local applicant time and finished at 18:00, with our own team meeting at 18:30 to wrap up the day and adjust the program if needed.

The virtual site visit started with an introduction from their side. To feel the atmosphere and make the mental switch to the virtual site visit, the opening was in the local language and directly translated into English. They showed a video of the project site and a presentation of their project team structure. Our TLA gave an overview of the IPMA PEM and we introduced ourselves. After the lunch break we got an explanation of the site (the project result), they gave us a 360° live view of the project site, and we finished the day by discussing Part B, Processes and Resources, using our prepared questions.

During our evening team meeting we reviewed the day. We hadn't managed to complete both subcriteria in B, so we had to find some room for that in the agenda of the next day. On top of that, we noticed that most of the questions had been answered by only one or two people; we would have liked to hear more from the others as well. To make this happen, we agreed to change the agenda to allow interviews in smaller groups and to explicitly mention in which group the different roles from their side had to be present. On day 3 we would need three independent meeting links.

On day 2 we started with a first document review session. This is also quite an experience. In line with IPMA's guidelines, we were not allowed to receive the documents ourselves (in my case, as part of the group looking at documents written in the local language, that was no hardship because I couldn't read them anyway). Reviewing documents meant that we asked for a document, they opened it in their document system (sharing their screen) and scrolled through it, and our local team member translated for us. This is a challenging exercise, and you are completely dependent on the local team assessor for the on-the-spot translation.

During our evening meeting we filled in the complexity matrix to be used by the judges. This was new for everyone except our TLA, who had used it in previous assessments. As a PM method freak, I will briefly explain this tool. There are many debates about complexity: many people call something complex when in reality it is complicated. For me, and I expressed that during our meeting, something is complex when you don't know the relationship between cause and effect; only in hindsight, e.g. during a retrospective, can you find out that relationship. In these complex situations, in line with Snowden's Cynefin model (see figure), you have to start with experiments before you can make decisions.

In the complexity matrix used by IPMA, both the applicant and the assessor team have to give their view.

We had to give our score and justification on the following six aspects:
• Complexity related to project objectives
• Complexity related to stakeholders
• Complexity related to cultural and social diversity & geographical environment
• Complexity related to the project delivery process or its outcome (products & services)
• Complexity related to the project team and contract management
• Complexity related to uncertainty (risks & opportunities) and volatility of the project.

As a guideline we received several examples for each of the six aspects. The score could be low, some, medium, high, or extreme complexity. I would say it is quite arbitrary to find the right score. Personally, I would prefer the scale simple/obvious, complicated, complex, or chaotic.
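To make the mechanics of the matrix concrete, here is a minimal sketch in Python. The aspect names and the five-level scale come from the description above; the function and variable names are my own invention, not part of any IPMA tooling, and this is only an illustration of how the applicant's and assessors' views could be compared aspect by aspect.

```python
# Hypothetical sketch of the IPMA complexity matrix: six aspects,
# each scored on a five-level ordinal scale by both the applicant
# and the assessor team, so the two views can be compared.

ASPECTS = [
    "project objectives",
    "stakeholders",
    "cultural and social diversity & geographical environment",
    "delivery process or outcome",
    "project team and contract management",
    "uncertainty and volatility",
]

SCALE = ["low", "some", "medium", "high", "extreme"]  # ordinal levels


def compare_views(applicant: dict, assessors: dict) -> dict:
    """Return, per aspect, the gap in scale steps between the two views.

    A positive gap means the assessors see more complexity than the
    applicant claims; zero means both views agree.
    """
    gaps = {}
    for aspect in ASPECTS:
        a = SCALE.index(applicant[aspect])
        b = SCALE.index(assessors[aspect])
        gaps[aspect] = b - a
    return gaps
```

Treating the levels as an ordinal scale like this also shows why the labels feel arbitrary: the step from "some" to "medium" carries no defined meaning, which is exactly the objection raised above.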

During day three of our virtual site visit we had lots of interviews (in pairs) with local and international contractors and stakeholders to discuss satisfaction. The last interview was with the complete assessor team and the applicant's team, to discuss project results and impact on the environment. During the closing session there were lots of compliments on both sides, a virtual team picture, and applause in the rooms and behind our screens.

During our subsequent meeting on Skype we discussed the next steps. We agreed that everyone would fill in their own finding sheet with approximately three strengths and three areas for improvement for each category (single sentences, to be used as bullet points in the 2nd Judges Report), to be delivered by the next morning at the latest.

On day 4 of the virtual site visit we started enthusiastically at 11:00 AM CEST to discuss the combined finding sheet, and more specifically we started with the category "Realization of results as defined in project objectives". It took us almost two hours to discuss this first category.
I asked for a process intervention, because it was my first time on this journey: what did we have to deliver today?

We had to clean up the finding sheet (rephrasing, removing duplications), and the results would be copied into the 2nd Judges Report. Next we had to do a final scoring and discuss the big differences, and finally we had to complete the 2nd Judges Report. Luckily, we had already finished the complexity matrix. The feedback report didn't need to be created that day; based on our finding sheet, we only needed to translate the findings into the right wording. So, to make it more practical, I stated that if we continued working the way we were, we had 18 criteria to discuss: 18 times 2 hours is 36 hours. That would be a long day! So we decided on a timebox of 15 minutes per criterion, which still meant 4.5 hours without breaks for the clean-up. In the end we had two short breaks and finalized the clean-up around 8:40 PM (an average of 30 minutes per category).

We still had the 2nd Judges Report and our final scoring to complete. In the report we had to answer the additional judges' questions and give the five most important project strengths and the five most important improvement potentials. To complete the report, we had to give the judges our recommendation for this project and the main arguments supporting it.

Filling in our final score after absorbing so much information about the project was a piece of cake. Yes, around 10:45 PM we managed to finish the job, congratulated each other, and wished everyone a good night. Our TLA would integrate our cleaned-up finding sheet into the 2nd Judges Report, and we all agreed to translate the findings into the correct wording by the end of the next week and to have a retrospective in a couple of weeks.

Four weeks later, we had our online retrospective. We structured it along the phases of the assessment:
• Phase 1: individual assessment
• Phase 2: virtual team meeting and preparation of the site visit
• Phase 3: online site visit, second virtual team meeting and feedback report
See the appendix for our lessons learned; in our opinion, this is something every assessor team should do and share with others.

The online site visit was challenging but definitely not agony. It was my first assessment, but thanks to the training, the Project Excellence Model was no longer new to me. I could use it to do the individual assessment, I could carry out the interviews like the other assessor team members, and I participated in building the judges' reports. It was intensive and very interesting, and the days behind the screen were long, but it worked out well. We had fun, especially during our evening sessions and during breaks, when we used our own private Skype environment. Of course, we all missed the local atmosphere, the sometimes very important (informal) talks with local project team members around the coffee machine, and looking someone in the eye during an interview. I believe the quality of our assessment would have been higher at the applicant's premises, but I think the virtual site visit delivered enough information of good quality for the judges to make their decision (and all applicants are in the same situation).

I found this assessment an awesome experience. Being part of a completely newly composed team (different genders, skills, nationalities) performing the assessment was wonderful. The opportunity to look into a multi-billion project running in a completely different culture was amazing and enriched my world view. I am looking forward to participating in the next one, even if it's again a virtual assessment.

Appendix: Lessons Learned

As a structure, we followed the phases of the assessment.

Phase 1: individual assessment

Went well:
• enough time to do the assessment
• there was room for Q&A to avoid uncertainties with the TLA
• numbering of the application text helps with orientation

Areas for improvement:
• additional instruction: write sentences with one finding/question per topic, not several topics per sentence
• the list of referenced documents was very dense and packed; give a short explanation of each document
• for each finding in the individual comments, indicate whether it is a strength (S) or an area for improvement (AFI), the PDCA element, and the line numbers

Phase 2: virtual team meeting and preparation of the site visit

Went well:
• TLA demonstrated flexibility in integrating the assessors into the team
• helpful to have several one-hour Skype sessions for the assessor team
• helpful to test the technical environment with the applicant
• selecting the 5 most important questions for each criterion

Areas for improvement:
• introduction of the complexity matrix during the assessor training
• parallelize more sessions in the initial agenda for the site visit / at least for day 2 and 3
• check with the Awards PMO, which kind of additional documents can be sent to the team

Phase 3: online site visit, second virtual team meeting and feedback report

Went well:
• helpful to use Skype and WhatsApp for internal team communication / use different devices for communication with the applicant
• use of the list of findings was helpful to discuss the team results and as basis for the feedback report

Areas for improvement:
• include non-managers as interview partners, to get information from different levels
• check that agenda changes are communicated to all of the applicant's interview partners
• confirm all timelines and team agreements in writing
• set an agenda for day 4, to make time management possible
• do a cross check between the complexity matrix and the 5 major strengths and areas for improvement

Final conclusions:
• good teamwork in special conditions
• awesome experience
• team operated in a flexible manner
• interesting insights into a different culture and complicated and complex project
