Knowledge Stewardship: an approach to taxpayer data analysis in large accounting firms: work process, structure of the workpapers, and software

1990; Todd Boyle, Tax Manager, Ernst & Young, Tokyo

Analysis of taxpayer data is the single biggest task for tax professionals. Clearly, expensive professional-level people are a necessary condition for achieving acceptable accuracy. However, they are not a *sufficient* condition for accuracy or, more importantly, for meeting the economic objectives of clients and the firm.

Other "necessary conditions":

A. Developing more capacity in professional staff

The firm could accelerate the process of skills development, which now takes years, by taking a more systematic, procedural approach.

We need to think more deeply about what people actually do when they analyze taxpayer data. In my opinion, there is a large but finite set of rules that staff must learn; we are providing training on those rules now, but we are not providing help in how to approach and untangle freeform taxpayer data. We are delivering the building materials, but no blueprint for *how* to analyze taxpayer data.

One result is that staff focus excessively on the numbers they can *see*, without adequate common-sense analysis of the whole taxpayer data package: what might be missing, what may be over-reported, and so on.

CPAs always think the work they do is different from other clerical processes, which have been systematized in various ways. Reports from the fields of artificial intelligence and expert systems development indicate this is a common problem developers encounter when trying to understand an expert's decision processes well enough to write software. For example, expert systems exist for medical diagnosis that can generate a list of possible diseases from various symptoms, but the judgement of a true expert runs far ahead of the capabilities of these systems, partly because experts simply *cannot explain* how they approach analytical problems.

In my opinion, taxpayer data analysis skills can and should be taught and communicated by managers, in training sessions.

B. Efficiency in transmitting knowledge into the machine

The firm must provide a decent analytic framework and software for inputting taxpayer data. The present software (Computax, FastTax, etc.) does not provide a good interface for inputting taxpayer data.

What you see on the screen is a poor abstraction of the underlying conceptual quantities in many of the special areas, such as compensation income, alternative minimum tax, etc.

It is deficient in letting the user interact with the machine to find the right input cell, see the results, and see the layout of the printed output.

The challenge of representing data with more than two dimensions has been worked out successfully by other software with view-flipping, such as Lotus Improv and Excel 97 pivot tables, and by OLAP products such as RedBrick, Pilot, etc. Lotus 1-2-3 Release 3 made a noble effort, but its hard-coded three-dimensional limit is obsolete.

If you want to navigate the data in a symmetrical, coherent way, it must be represented to the user in a more standard, generalized way. The reply of vendors like Computax and FastTax to this whole challenge is a hard-coded custom interface and custom data structures each year.

Perhaps each quantity of taxpayer data may be viewed as a single value at the intersection of a series of coordinates. The storage model would require several such dimensions, and some users might require additional ones. Presenting all these dimensions on screen is *not* a problem with view-flipping software. View flipping is a metaphor which users immediately pick up. We are using outdated software and paying a price for it.
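To make this concrete, here is a minimal sketch, in modern terms, of what such a storage model could look like: every amount is a single value keyed by a tuple of named coordinates, and "flipping the view" is simply choosing which dimensions label the rows and columns. The dimension names and amounts below (year, entity, category, country, source) are illustrative assumptions, not the dimensions of any vendor's actual product.

    from collections import defaultdict

    # Each tax quantity is one value at the intersection of named coordinates.
    # Dimension names and amounts are hypothetical, for illustration only.
    facts = [
        ({"year": 1990, "entity": "executive_A", "category": "compensation",
          "country": "JP", "source": "employer statement"}, 12000000),
        ({"year": 1990, "entity": "executive_A", "category": "compensation",
          "country": "US", "source": "W-2"}, 45000),
        ({"year": 1990, "entity": "executive_A", "category": "interest",
          "country": "US", "source": "1099-INT"}, 1200),
    ]

    def pivot(facts, rows, cols):
        """Flip the view: pick which dimensions label rows and columns,
        summing values over every dimension not selected."""
        table = defaultdict(float)
        for coords, value in facts:
            key = (tuple(coords[d] for d in rows),
                   tuple(coords[d] for d in cols))
            table[key] += value
        return dict(table)

    # One view: income category by country.  Another view is one call away.
    print(pivot(facts, rows=["category"], cols=["country"]))
    print(pivot(facts, rows=["source"], cols=["year"]))

Once the data is stored this generally, any view is one function call away, rather than a new hard-coded screen and data structure each year.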

C. Transmission of the information from machines to reviewers

Our stewardship of taxpayer data is poor. After people spend hours analyzing the data and inputting it, much of the knowledge is wasted and must be recreated by the reviewer.

  • The knowledge acquired last year about the taxpayer data is almost entirely lost. The software is so inadequate that the only information we can see is perhaps a few descriptions that happened to exist last year. You have to go to last year's paper files and embark on another analysis task.

  • Information transmission from preparer to reviewer is accomplished almost entirely by notes in the workpapers and by the taxpayer data lead sheet, a two-dimensional paper object. As a result, all the other information is ignored or, at best, recorded in "custom" comments which are little better than the raw taxpayer data and are often made obsolete by corrections.

  • The waste of knowledge acquired by the firm's tax practices in other cities, especially the overseas tax practices, is nearly 100%.

  • The waste of knowledge which exists in other executives' files for the same employer is also nearly 100%.

  • Needless to say, there is a vast potential for error-checking and condition checking within the current tax prep software, and it is only partially realized.
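As an illustration of the kind of condition checking meant here, the following sketch applies a few rules to a dictionary of input data. The field names, rules, and amounts are invented for the example; real tax prep software would need a far larger rule base, maintained year to year.

    # Minimal sketch of rule-based condition checking over input taxpayer data.
    # Field names and rules are hypothetical, for illustration only.

    def check_return(data):
        """Return a list of conditions for a preparer or reviewer to clear."""
        warnings = []
        if data.get("wages", 0) > 0 and not data.get("w2_forms"):
            warnings.append("Wages reported but no W-2 in the file.")
        if data.get("mortgage_interest", 0) > 0 and not data.get("form_1098"):
            warnings.append("Mortgage interest claimed without a Form 1098.")
        if data.get("foreign_tax_credit", 0) > 0 and data.get("foreign_income", 0) == 0:
            warnings.append("Foreign tax credit claimed but no foreign income entered.")
        if data.get("estimated_payments", 0) == 0 and data.get("prior_year_balance_due", 0) > 0:
            warnings.append("No estimated payments, although a balance was due last year.")
        return warnings

    example = {"wages": 95000, "w2_forms": [], "foreign_tax_credit": 800}
    for w in check_return(example):
        print("CHECK:", w)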

The firm has made an economic and strategic decision to outsource software development for individual tax returns. I believe the firm should build its own 1040 software and take a very long-term approach toward moving the firm culture toward a more cross-disciplinary tax professional: one who applies computers effectively to professional problems and has the skills to modify the software as well as enter data.

In any case, the firm should communicate effectively with its tax software vendors about problems in the software. While conceding it is cost-prohibitive to encourage staff to spend unlimited time pursuing bugs and writing memos, I believe we should move somewhat further in that direction. Good software is a marriage between users and programmers. Controlling the software decisions of professional staff may be unavoidable, but if their opinion of the software is low and there is no way to fix it, the firm incurs negative effects on motivation and retention.

Without a systematic infrastructure to capture knowledge about the taxpayer data, the firm relies excessively on professional staff's familiarity with client fact patterns. Thus you pay excessively to acquire knowledge in new staff, pay excessively each year to retain them, and dread the loss when they leave.
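One possible shape for that infrastructure, sketched here with assumed record fields: preparer conclusions stored as records keyed to the same coordinates as the numbers themselves, so a reviewer can pull them up this year and next year's preparer inherits them automatically, instead of the knowledge living only in this year's paper workpapers.

    import json
    from datetime import date

    # Illustrative sketch: a preparer's conclusion stored against the same
    # coordinates as the data it explains.  All field names are assumptions.
    note = {
        "coords": {"year": 1990, "entity": "executive_A", "category": "compensation"},
        "author": "preparer_1",
        "date": date.today().isoformat(),
        "conclusion": "Housing allowance excluded per assignment letter; see employer memo.",
        "carry_forward": True,   # surface this note to next year's preparer
    }

    with open("annotations.json", "a") as f:
        f.write(json.dumps(note) + "\n")

    # A reviewer, or next year's preparer, filters notes by coordinates.
    def notes_for(path, **coords):
        with open(path) as f:
            for line in f:
                rec = json.loads(line)
                if all(rec["coords"].get(k) == v for k, v in coords.items()):
                    yield rec

    for rec in notes_for("annotations.json", entity="executive_A"):
        print(rec["date"], rec["conclusion"])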

In conclusion, we need a more rapid, systematic way of dissecting the raw taxpayer data, with control totals and other techniques. Then we need better software, designed for the exact purpose of interactive analysis of taxpayer data, especially in complex areas. Finally, there should be a much more standardized, effective means of communicating the conclusions of the preparer to subsequent reviewers.
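As a small sketch of the control-total idea, assuming invented categories and amounts: totals taken directly off the source documents are reconciled against totals of what was actually input, so omissions and double entries surface before review rather than during it.

    # Sketch of a control-total check: totals taken straight from the source
    # documents are compared with totals of the amounts actually input.
    source_document_totals = {"wages": 140000, "interest": 2350, "dividends": 4100}

    input_entries = [
        ("wages", 95000), ("wages", 45000),
        ("interest", 2350),
        ("dividends", 4100), ("dividends", 4100),   # double-entered by mistake
    ]

    input_totals = {}
    for category, amount in input_entries:
        input_totals[category] = input_totals.get(category, 0) + amount

    for category, expected in source_document_totals.items():
        entered = input_totals.get(category, 0)
        if entered != expected:
            print("OUT OF BALANCE:", category, "source", expected, "input", entered)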

Each of these three areas (procedures, workpapers, software) should be considered together, not as separate questions.