WHAT TO DO
The single most important step is to acknowledge the need to evaluate, make a clear commitment to do so, and then mandate that commitment in the code. Planners often talk about monitoring and evaluation, but a plan to do so is often lost in the shuffle of code revision and the morass of codesmithing paperwork, not to mention the hovering anxiety of trying to finish a revision project. Further, the focus on administering the new code immediately after adoption, and after the long endurance run of drafting, often obliterates any thoughts of evaluation. The mentality is, "Let's just get it done." No one really wants to shop the day after Christmas, but those who do reap the benefits when the next holiday season rolls around. When planning for a comprehensive code revision, think about a code evaluation plan with the same energy.

Mandating an evaluation report with a deadline or at fixed intervals following adoption is necessary. It ensures that the evaluation will not be ignored or forgotten. It also provides for a concrete expression of the commitment to monitor and evaluate. It can also be helpful in gaining consensus on some of the more contentious issues associated with the new regulations. It assures both the public and decision makers that there is a clear, agreed-upon time after implementation when there will be more discussion, and proof of what is working and what is not.

Although we mandated a review after 24 months, our code did not specify review methods, nor did it give direction as to how the report was to be approached. A plan for conducting the monitoring and evaluation of a code will suggest ways that data can be generated during the regular course of administration. If the evaluation plan is given the same attention as drafting the regulations, it is easy enough to specify up front what the parameters of the report should be. I suggest that requests for proposals for code revision assistance include specifications for a continuing evaluation plan following adoption.

Of course, such proposals may not always be popular. New codes can often be controversial or contentious. Politically, community leaders may be averse to putting the cards on the table at a specified time, particularly if failure must be acknowledged. In addition, some code provisions may inherently be difficult to measure in terms of success or failure, and the effect of external forces or developments on code performance needs to be assessed and accounted for or discounted, depending on the circumstances. Every effort should be made during the adoption process to define what is to be measured and how. Different methods may work better for some regulations than for others. The devil lies in the details.

MORE THAN NUMBERS
Effective evaluation and monitoring of a code consists of more than producing reports in a specified format and at certain intervals. It is not just a numbers game. It should go beyond numerical data and percentages and attempt to look carefully at individual cases and experiences. While there is a place for quantitative analysis, particularly for certain code provisions, it is the qualitative analysis that lends itself best to an understanding of how the code is affecting the community and effecting the plan. Numbers and percentages alone cannot account for motivations, social situations, market sensitivities, or the human condition. A qualitative evaluation should pay close attention to the results that the code was intended to achieve. It should consider experience and try to reduce the gap between theory and practice. Planners can obtain data from experience, personal contact, discussions, interviews, and detailed document analysis. They should emphasize the particulars, meanings, and descriptions. To this extent, evaluating a code is more art than science, more craft than calculation. Explanation replaces measurement and understanding replaces statistics. The process should be participatory and engage all players and stakeholders.

The evaluation should challenge the theories, policies, and objectives upon which the code is based and consider the experience of actual practice. It may lead to a reconsideration of the theory or an adjustment to the tools, or both. The idea is to understand how things are working and whether the code is meeting its expectations. Did we do what we said we would do? Could it have been approached differently? But even this is not enough. A useful recommendation must be part of any evaluation.

The recommendation may be directed toward things that are going well or that are not; it may suggest curative action or state that the code feature is "right on" at that point. It may simply suggest a wait-and-see approach and concede that more time for analysis and understanding is needed before a recommendation can be made. In our case, 24 months proved to be too soon to analyze fully all aspects of the code. It was, however, enough time to gain a level of comfort that the new approaches were conceptually the right ones.