Analysis tooling before annotations?


Elias Ross

Dec 14, 2007, 7:11:04 PM
to JSR-305: Annotations for Software Defect Detection

I'm new to this group, and though I've read several old threads, I'm
sorry if this discussion has already been had. I hear that one of the
biggest concerns is developers using the various annotations
incorrectly: the annotations would then amount to false information
and create problems. Another concern is that they may be too difficult
to understand and so wouldn't be used at all.

One solution may be to create tools that do "perfect" analysis and
insert the annotations into the code automatically. Such a tool could
work on an entire project or on a single class. In the per-class case,
if a developer hoped to qualify a class as @Immutable, for instance,
the tool could explain why the code is not @Immutable. Or, if the
class were nearly so, for example immutable once constructed, it could
"award" @Immutable with some sort of variance noted.
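
As a rough sketch of what such a tool might look at (assuming the
JSR-305/JCIP-style annotation, e.g. javax.annotation.concurrent.Immutable),
it could verify and insert the annotation on the first class below, and
point at the non-final field in the second:

    // Point.java
    import javax.annotation.concurrent.Immutable;

    // Every field is final and of an immutable type, so the tool
    // could verify this class and insert @Immutable automatically.
    @Immutable
    public final class Point {
        private final int x;
        private final int y;

        public Point(int x, int y) {
            this.x = x;
            this.y = y;
        }

        public int getX() { return x; }
        public int getY() { return y; }
    }

    // Label.java
    // "Nearly" immutable: the field is never written after
    // construction, but it isn't final, so the tool would explain
    // why strict @Immutable doesn't apply, or award it with that
    // caveat.
    public final class Label {
        private String text;   // not final -- the tool would point here

        public Label(String text) {
            this.text = text;
        }

        public String getText() { return text; }
    }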

The point of running the analysis and annotating first is to create
documentation, and the more (accurate) documentation, the better. An
automatic processing tool would also help with older code bases,
without anyone having to go through every class and remember the
original intent. And within the code-maintenance lifecycle, if these
annotations are present earlier, more regressions might be caught
sooner: once the tool has run and the annotations are in place, it's
unlikely that a later change breaking a class's @Immutable condition
would go unnoticed.
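
For instance (a hypothetical sketch, again assuming a JCIP-style
@Immutable and a checker that understands it), an edit like this one
to an already-annotated class is exactly the kind of thing an analysis
run could flag:

    import javax.annotation.concurrent.Immutable;

    @Immutable                        // recorded earlier by the tool
    public final class Celsius {
        private double degrees;       // "final" dropped in a later change

        public Celsius(double degrees) {
            this.degrees = degrees;
        }

        // Mutator added later: it contradicts the recorded @Immutable,
        // so a checker can report the regression instead of letting it
        // slip through silently.
        public void setDegrees(double degrees) {
            this.degrees = degrees;
        }

        public double getDegrees() {
            return degrees;
        }
    }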

If somebody working on JSR-305 could provide a simple tool, perhaps
running within Eclipse or NetBeans, that could check the "qualities"
of people's existing code, it would lead to wide adoption of the
annotations.

Perhaps there's already such a tool?