I have to admit that this is my favourite post of the year to write. I love looking through the UFE report released approximately every May and reading the Board of Evaluators (BOE) comments. You’ll see a lot of these as well when you’re working through your simulations and using the UFE Report as your answer sheet. The BOE comments can be somewhat harsh, which often gives me a chuckle.
I clearly need to get a life. Until then, here’s the 2013 edition.
A reminder that this should be read in conjunction with the 2012 edition, since rumour has it that the UFE is on a two-year cycle. I avoid getting too specific so as not to give away cases that you’re going to write soon.
Another reminder, I have no inside knowledge of what’s going into this year’s UFE. I just believe that the UFE reports are the best guidance we have as to what to watch for.
Overall, this year’s UFE report seems to be a pretty positive one compared to some previous years, indicating that candidates are performing well and steadily. This is, of course, good news. Now, on to the useful stuff.
Examining the 2012 Report
In one of the most strongly worded warnings I’ve seen, the BOE cautions this year’s writers on Page 7:
The Board would like to caution candidates about a dangerous exam-writing strategy the Board believes is occurring. It appears as though some candidates are assuming there is a specific number of issues that the Board is looking for on each indicator. As a result, candidates appear to be repeatedly choosing to address only a few issues on each indicator even when numerous issues are outlined in the simulation. By choosing to limit the breadth of their discussions to a particular number of issues, candidates are risking not being able to demonstrate their competence in a specific area. Candidates should attempt to discuss all the issues they deem significant in the simulations, and not limit their analysis because they think they have reached the minimum number of issues the Board is looking for.
In 2010, when I wrote the UFE, the marking guides that are now so prevalent were new. In years past, candidates had to estimate which issues were required, and how many, from the wording in the UFE reports. That was, of course, more challenging, but it also built up your ranking and decision-making skills on the UFE. It was part of the process. I think these marking guides are a factor in this warning: candidates are playing the numbers game. Don’t do that this year. I strongly suspect that the BOE will come down on this heavily this year. Always rank and write according to the issue, and don’t (necessarily) stop just because you’ve written the top three when a fourth might be equally valid.
On Pages 9 and 10, the BOE discusses Assurance at length. The indication seems to be that the UFE will continue to place you in unusual assurance roles and ask you to ‘make the leap’ by thinking about how your role can add value to the client. These indicators also seem to be more undirected.
In 2012, candidates also had difficulty with audit planning; the main issue was where in the audit they were locating themselves. Remember to consider whether you are deciding to accept an engagement or whether you have already accepted one. The conversation can be totally different depending on that factor.
On Page 10, the BOE identifies some of the PMR weaknesses they saw:
Candidates who were unable to demonstrate their competence in this area typically did not provide indepth discussions of the issues. Generally, they did not support their discussions with relevant case facts and Handbook guidance, but instead jumped to conclusions. Some candidates also performed poorly as a result of not addressing a sufficient number of issues, as mentioned above in the discussion of overall deficiencies in candidates’ performance.
Case facts, friends. Stating criteria alone does not add value and therefore does not score points. This was also mentioned as an overall weakness in 2012.
Next up: not-for-profit organizations were a weakness in 2012. On page 10:
Candidates did not seem to be familiar with the standards applicable to not-for-profit organizations. They did not address some of the accounting issues central to not-for-profit organizations, such as how to recognize donated goods and services, as frequently as the Board had hoped.
Don’t forget that there is a large not-for-profit and public sector out there, and that both of these areas have been tested before. Although most of the standards are similar to those for private enterprises, they’ll probably test the differences, so don’t be afraid of diving into the Handbook to look them up.
Regarding Tax, performance seems to be pretty stable. Candidates are focusing more time and effort on the simpler material they know rather than the complex issues, as brought up by the BOE on Page 11:
In general, candidates seemed to struggle with the more complex tax concepts tested, and provided extensive discussions on the basic concepts they were comfortable with (such as interest deductibility or RRSP withdrawals). The Board reminds candidates that ranking of issues remains an important skill, and as a result, they should not avoid more complex issues if they are significant to the simulation.
This is pretty standard. There will probably continue to be one or two simple indicators and another one that’s complex. It might serve you well to tackle the complex stuff if you can find the right section of the Tax Act to refer to quickly.
In MDM, the BOE had suggestions for candidates on Page 12:
Before deciding what type of analysis to perform, candidates need to reflect upon what their analysis needs to accomplish quantitatively and what would be useful to the client.
Along with this, the BOE encourages candidates to take their analysis all the way through and offer recommendations where warranted.
And finally, the often-feared PQ indicator was tested more in 2012 than previously, which signals that the BOE wishes to continue testing the more undirected material heavily. Candidates seem to be recognizing the issues that are more commonly looked for, such as fraud, but continue to miss the “big picture integration” that the BOE wants every year.
The BOE leaves us with an interesting comment at the very end, on Page 14.
The Board continues to emphasize the importance of being able to identify and appropriately address underlying issues on the UFE. These analytical skills are critical for a chartered accountant, and will continue to play an important part not only in the Level 1 assessment on the UFE, but also in the assessment of competence at Levels 2 and 3.
I find it interesting: it makes sense that PQ affects Level 1 (the overall score), but they also seem to be indicating it might make a difference at Levels 2 and 3, which could mean that regular indicators will become more undirected, similar to what’s been mentioned above for Assurance. Keep your eye out for this.