Community:IonelVirgilPop about Reactor pattern

IonelVirgilPop about Reactor pattern (Revision ID: 11175)

Overall suggestion (score): -1 - reject

Review Summary: The main reason I propose rejecting this pattern is that it appears to be, just like the so-called "Standard Enforcer Pattern", an attempt to generalize my "OOPMetrics" content pattern without citing the source.

Reviewer Confidence: High (after all, it looks like a generalization of my own pattern).

Problems: The main reason I propose rejecting this pattern is that it appears to be, just like the so-called "Standard Enforcer Pattern", an attempt to generalize my "OOPMetrics" content pattern without citing the source. Even though this is a collaborative website, I believe the source should still be cited if the pattern was based on it.

There were straightforward ways of citing the source, such as putting a link to the OOPMetrics ontology pattern in the "Web References" and "Related CPs" sections.

The Reactor Pattern was submitted after OOPMetrics (this can be checked on this website). Here are just a few reasons (there are others) why I believe it is a variant (generalization) of OOPMetrics:

- It uses metrics in the same manner as OOPMetrics; of course, here we have hasMeasure instead of hasOOPMetric (see the sketch after this list).

- In the scenario, instead of metrics about OOP, this pattern has some metrics regarding carbon, total energy, etc. They are used to detect things like waste output in the same manner I used OOPMetrics to detect design flaws.

- I believe that the so-called OntoMDL ontology doesn't even exist. It's just an excuse to create a scenario much like the God Class scenario I presented in the article and described on this website. If the author had understood her own ontology pattern, she would have been able to create this OntoMDL ontology and put a link under Examples. It would have been something more than what OOPMetrics already has. But, of course, since OOPMetrics doesn't have an ontology example, neither does this pattern.

- The sloppy text (words without spaces between them; the property "hasEnvironemntalCondition" that should be "hasEnvironmentalCondition", as another reviewer observed) shows that this was done in a rush.

- Labels are lacking, as in the first version of OOPMetrics.
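
To make the structural-similarity claim in the first bullet concrete, here is a minimal sketch in Python using rdflib. Only the property names hasOOPMetric and hasMeasure come from the two patterns under review; the namespace and all class/individual names below are hypothetical illustrations, not taken from either ontology:

```python
# Minimal sketch of the claimed structural parallel between OOPMetrics
# and the Reactor pattern. Only hasOOPMetric and hasMeasure come from
# the patterns under review; all other names here are hypothetical.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/sketch#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# OOPMetrics shape (hypothetical example): a class under analysis is
# linked to a metric value used to detect a design flaw such as God Class.
g.add((EX.SuspectClass, EX.hasOOPMetric, EX.HighWMC))

# Reactor-pattern shape (hypothetical example): an observed process is
# linked to a measure value used to detect e.g. excessive waste output.
g.add((EX.ProductionProcess, EX.hasMeasure, EX.CarbonOutputLevel))

# Both triples have the same subject-property-object shape;
# only the vocabulary differs.
print(g.serialize(format="turtle"))
```

The point of the sketch is that detecting a design flaw via hasOOPMetric and detecting waste output via hasMeasure are, at the triple level, the same modelling move with renamed vocabulary.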

Moreover, both the "Reactor Pattern" and the "Standard Enforcer Pattern" were submitted AFTER THE DEADLINE, while "OOPMetrics" was submitted just before the deadline.

It can be clearly seen, if someone clicks on "history" at the top of the pattern page, that the first submission was on August 13, while the extended deadline was August 10. I understand that the time format on this website may not be Hawaii time as in the call, but there are still three days between the two dates. Of course, the author could not have submitted the pattern earlier, if it was based on my pattern, because I submitted my own pattern just a few hours before the deadline. Now I am glad I did.

And I believe the article that should have been submitted on EasyChair, if one was submitted by this author as stated in the call, could not possibly have been submitted before the deadline either, unless one of the organizers/evaluators facilitated a further extension of the deadline for this author. I hope this author didn't have access to the article I sent on EasyChair as well, since normally the article is not public and should not be published until/unless accepted. Normally, only organizers/evaluators should have access to it.

Community Relevance: None (I don't see why we need two generalizations of OOPMetrics, such as both the "Reactor Pattern" and the "Standard Enforcer Pattern", other than to look different and to improve the chances that at least one of them is accepted).

Relation to Best Practices: None (this pattern does not look like it was based on best practices for creating ontologies; it looks like it was made in 10 minutes based on OOPMetrics, without citing the source).

Reusability: It has worse reusability than OOPMetrics, even though it attempts to be more general. It tries to appear more sophisticated by adding equivalences and comments instead of "real content".

Relations to Other Patterns: I believe it is related to the "Standard Enforcer Pattern" and to my "OOPMetrics", in the sense that Reactor Pattern + Standard Enforcer Pattern = a generalization of OOPMetrics.

Overall Understandability: A generalization of OOPMetrics would be useful, but I don't understand why this particular pattern would be, and I can't see why we need two generalizations: Standard Enforcer and Reactor Pattern.

Clear Problem Description: Bad

Clear Relevance and Consequences: Bad

Clear Figures and Illustrations: I don't see why rdfs:subClassOf appears so many times in the diagram instead of the "inheritance" relation. There is too much text in the diagram.

Missing Information: Citations are missing from this pattern description. This website may be collaborative, but I still believe citations are required if it was based on another pattern; otherwise one could do even stranger things, like simply copying an existing pattern and changing only the author's name. And perhaps the citation is missing from the article that should have accompanied this pattern as well. Also, the domain is not stated; perhaps it's "general".

I gave a similar review to the "Standard Enforcer Pattern". I'm sorry if I repeated myself, but some "mistakes" seem to be common to both.

I am proud that one reviewer appreciated this pattern so much; after all, it is a generalization of my own pattern. But I still believe it is a bad generalization.

I would recommend that other reviewers take note of my review and compare these three patterns, or they may fall into the trap of following the principle: "Let's reject the original, so we can accept the 'copy'".

Posted: 2012-08-24 | Last modified: 2012-08-24
