Task: Review the Architecture
This task defines when and how to conduct the review of an Architecture and how to address review findings.
Purpose
To uncover any unknown or perceived risks in the schedule or budget.
To detect any architectural design flaws. Architectural flaws are known to be the hardest to fix and the most damaging in the long run.
To detect a potential mismatch between the requirements and the architecture: over-design, unrealistic requirements, or missing requirements. In particular, the assessment may examine aspects that are often neglected in the areas of operation, administration, and maintenance. How is the system installed? Updated? How do we transition the current databases?
To evaluate one or more specific architectural qualities: performance, reliability, modifiability, security, and safety.
To identify reuse opportunities.
Relationships
Roles
Main:
Technical Reviewer
Additional:
Assisting:
Inputs
Mandatory:
Risk List
Software Architecture Document
Optional:
Supplementary Specifications
External:
None
Outputs
Review Record
Steps
General Recommendations
Purpose
General recommendations for each review.
Seen from 20,000 feet there is not much that distinguishes a software architecture assessment from any other assessment
or review.
However, one important characteristic of the software architecture is the lack of specific measurements for many
architectural quality attributes: only a few architectural qualities can be objectively measured. Performance is an
example where measurement is possible. Other qualities, such as conceptual integrity, are more qualitative or
subjective. Moreover, it is often hard to decide what a metric means in the absence of other data or a reference for comparison.
If a reference system is available and understood by the target audience, it is often convenient to express some of the
results of the review relative to this reference system. This may happen in a context where the system under
design can be compared to an earlier design.
When in the life-cycle this assessment takes place also affects its purpose or usefulness.
At the end of the inception phase in an initial development cycle, there is usually little of a concrete
architecture in place, but a review may uncover unrealistic objectives, missing pieces, missed opportunities for
reusing existing products, etc.
The most natural place for a software architecture assessment is at the end of the elaboration phase. This phase is
primarily focused on exploring the requirements in detail and baselining an architecture. An architecture review
is mandated by the RUP at this milestone, and this is the case where a broad range of architectural qualities is
examined.
More focused assessments may take place during the construction phase to examine specific quality attributes, such
as performance or safety, and at the end of the construction phase for any lingering issues that may make the
product unfit to be put in the hands of its users.
Damage-control assessments may take place late in the construction or even transition phases, when things have gone
really wrong: construction does not complete, or an unacceptable level of problems arises in the installed base
during the transition.
Finally, an assessment may take place at the end of the transition phase, in particular to inventory reusable assets
for an eventual new product or evolution cycle.
The "peer" reviewer has the same staffing profile as that of the Role: Software Architect, although with a more narrow focus on the technical issues. Leadership, maturity, pragmatism, and
result-orientation are important to lesser degrees, but still important-a reviewer may uncover architectural defects
that are likely to be unpopular if they threaten the schedule of the project. Still, it's better to raise critical
issues early, when they can be resolved, rather than blindly following a schedule that leads the project team down the
wrong path. The architecture reviewer needs to balance risks against costs, remaining sensitive to the broader issues
of project success. The architecture reviewer also needs to be a persuasive communicator who can raise and discuss
sensitive issues.
Recommended Review Meetings
Purpose
To define the scope and the goals of the review.
To define the approaches used for each specific scope/goal combination.
Several approaches can be used to conduct the review:
representation-driven
information-driven
scenario-driven
Representation-driven review
Obtain (or build) a representation of the architecture, then ask questions and reason based on this representation.
There is a wide range of situations here: from organizations that are very architecture-literate and will provide
an intelligible description to start with, to organizations where you need to identify who the software architect is
(even if hidden under some other title) and extract the information from that person, to places where software
architecture is a totally unknown concept. In that last case the process is called "mining the architecture," and in practice it looks
literally like that: digging the architecture out of the software or its documentation with a pickax, looking at source code,
interfaces, configuration data, etc.
One model that can be used to organize the representation is the set of architectural views presented in the
Software Architecture Document: the logical view organizes the main classes (the object model), the process view
describes the main threads of control and how they communicate, the development view shows the various subsystems and
their dependencies, and the physical view describes the mapping of elements of the other views onto one or several physical
configurations. Organize issues along the various views.
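As one concrete way to "ask questions and reason based on this representation," the subsystem dependencies captured from the development view can be treated as a small graph and checked mechanically, for example for cyclic dependencies. The sketch below is illustrative only: it assumes the dependencies have already been extracted, and the subsystem names and edges are not drawn from any particular system.

```java
import java.util.*;

/** Minimal sketch: reason over a captured development view by checking the
 *  subsystem-dependency graph for cycles. Subsystem names and dependencies
 *  below are illustrative assumptions, not taken from any real system. */
public class DependencyCycleCheck {

    // Depth-first search; a node revisited while still "in progress" means a cycle.
    static boolean hasCycle(String start, Map<String, List<String>> deps,
                            Set<String> visiting, Set<String> done) {
        if (done.contains(start)) return false;
        if (!visiting.add(start)) return true;          // back edge: cycle found
        for (String dep : deps.getOrDefault(start, List.of())) {
            if (hasCycle(dep, deps, visiting, done)) return true;
        }
        visiting.remove(start);
        done.add(start);
        return false;
    }

    public static void main(String[] args) {
        Map<String, List<String>> deps = Map.of(
            "ui",          List.of("services"),
            "services",    List.of("persistence", "ui"),   // suspicious upward dependency
            "persistence", List.of());

        for (String subsystem : deps.keySet()) {
            if (hasCycle(subsystem, deps, new HashSet<>(), new HashSet<>())) {
                System.out.println("Potential issue: cyclic dependency involving " + subsystem);
            }
        }
    }
}
```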
Information-driven review
Establish the list of information (data, measurements) that is needed for the reasoning, get the information, and compare
this information to either the requirements or some accepted reference standard. This applies well to investigating
certain quality attributes, such as performance or robustness.
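A minimal sketch of what the comparison step can look like, assuming the measurements have already been gathered; the attribute names, units, thresholds, and figures are illustrative, not values from any actual specification.

```java
import java.util.*;

/** Minimal sketch of an information-driven check: compare collected measurements
 *  against requirement thresholds. All names and figures are illustrative. */
public class QualityAttributeCheck {

    record Measurement(String attribute, double measured, double required, String unit) {}

    public static void main(String[] args) {
        List<Measurement> data = List.of(
            new Measurement("mean response time", 320, 250, "ms"),
            new Measurement("recovery time after failover", 45, 60, "s"));

        for (Measurement m : data) {
            // Here "lower is better"; a real review would record the direction per attribute.
            String verdict = m.measured() <= m.required() ? "meets requirement" : "potential issue";
            System.out.printf("%s: %.0f %s (required <= %.0f %s) -> %s%n",
                m.attribute(), m.measured(), m.unit(), m.required(), m.unit(), verdict);
        }
    }
}
```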
Scenario-driven review
This is the systematic "what if" approach. Transform the general questions being asked into a set of scenarios the
system should go through, and ask questions based on those scenarios. Examples of such scenarios are:
The system runs on platforms X and Y. (The real quality attribute probed is portability.)
The system does this (additional) function F. (The real quality attribute is extensibility.)
The system processes 200 requests per hour. (The real quality attribute is scalability.)
The system is being installed on this kind of site by the user. (The real quality attribute is completeness or
usability.)
The advantage of such an approach is that it puts the task in a very concrete perspective, understandable by all
parties. It also makes it possible to probe for omissions or flaws in the requirements, especially when the requirements are
informal, unwritten, or very general and terse. The disadvantage is that it does not take the architecture itself as
the object being reviewed, but treats the system as a black box into which we are only sending some probes.
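A scenario catalog can be kept as simple structured data so that findings remain traceable to the quality attribute each scenario really probes. The sketch below reuses the example scenarios above; the record layout and grouping logic are illustrative assumptions, not part of any prescribed format.

```java
import java.util.*;

/** Minimal sketch: record "what if" scenarios together with the quality
 *  attribute they really probe, so findings can be grouped by attribute. */
public class ScenarioCatalog {

    record Scenario(String description, String qualityAttribute) {}

    public static void main(String[] args) {
        List<Scenario> scenarios = List.of(
            new Scenario("The system runs on platforms X and Y.", "portability"),
            new Scenario("The system offers additional function F.", "extensibility"),
            new Scenario("The system processes 200 requests per hour.", "scalability"),
            new Scenario("The system is installed on-site by the user.", "usability"));

        // Group the scenarios by the attribute they probe.
        Map<String, List<String>> byAttribute = new TreeMap<>();
        for (Scenario s : scenarios) {
            byAttribute.computeIfAbsent(s.qualityAttribute(), k -> new ArrayList<>())
                       .add(s.description());
        }
        byAttribute.forEach((attr, list) ->
            System.out.println(attr + ": " + String.join(" / ", list)));
    }
}
```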
In practice, things are not so clearly separated, and we end up doing a bit of all three approaches.
Identifying issues
Uncovering potential issues is mostly done by human judgment based upon knowledge and experience. Certain failure
patterns are repeated from project to project, from organization to organization. Certain heuristics can be used to
uncover problem areas. Check-lists can be useful (some very generic ones are proposed later), as well as results from
previous reviews, if any.
Capture potential issues as they appear, describing them in a neutral tone: no finger pointing, no
"catastrophism." You may use little cardboard cards, as AT&T reviewers do, or CRC cards as we do, to help with
prioritizing, organizing, and eliminating them.
Later, sort the candidate issues by decreasing scope or impact, and if there are many, tackle first the ones that are
directly related to the question at hand, leaving the "other suggestions" for later if time permits. Then assert the
reality of the problem: very often one can perceive a problem, but it may not be real; we may simply not have spoken to the right
person or looked at the right piece of information. Sort again. Ensure multiple data points to verify the reality of a
problem. (Inexperienced assessors tend to be too single-threaded.)
When the problem has been confirmed, rapidly examine what could eliminate it, without necessarily trying to do
on-the-fly redesign of the system. Write down potential simplifications, reuse opportunities, and alternatives (for example, buy vs.
build).
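The sorting and corroboration steps described above can be kept in a small triage structure. The sketch below orders candidate issues so that those directly related to the review question come first, then by decreasing impact, and flags issues supported by fewer than two data points; the issue titles and scores are illustrative assumptions.

```java
import java.util.*;

/** Minimal sketch of issue triage: in-scope issues first, then by decreasing
 *  impact, with a flag for issues needing more corroborating data points. */
public class IssueTriage {

    record Issue(String title, int impact, boolean relatedToQuestion, int dataPoints) {}

    public static void main(String[] args) {
        List<Issue> issues = new ArrayList<>(List.of(
            new Issue("Single database connection shared by all threads", 8, true, 3),
            new Issue("No rollback story for failed installation", 6, false, 1),
            new Issue("Subsystem naming is inconsistent", 2, false, 2)));

        issues.sort(Comparator
            .comparing((Issue i) -> !i.relatedToQuestion())                // in-scope issues first
            .thenComparing(Comparator.comparingInt(Issue::impact).reversed())); // then by decreasing impact

        for (Issue i : issues) {
            String note = i.dataPoints() < 2 ? " [needs corroboration]" : "";
            System.out.println(i.title() + " (impact " + i.impact() + ")" + note);
        }
    }
}
```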
Allocate Defect Resolution Responsibilities
Purpose
To take action on the defects identified.
After the review, allocate responsibility for each defect identified. "Responsibility" in this case may not mean fixing
the defect, but coordinating additional investigation of alternatives, or coordinating the resolution of the defect
if it is far-reaching or broad in scope.
Properties
Multiple Occurrences
Event Driven
Ongoing
Optional
Planned
Repeatable
More Information
Guidelines
Software Architecture Document
Tool Mentors
Comparing and Merging Rational Rose Models Using Model Integrator
© Copyright IBM Corp. 1987, 2006. All Rights Reserved.