The main thing is that when we set a single value on top of a set, we have to test membership, but when we set another set on top of a set, we have to test that the intersection is non-empty (and possibly set a new, narrower value).
This gives something like:
```racket
(define (consistent? newval v)
  ...
  (match (list newval v)
    [`(,(? Set? x) ,(? Set? y)) (intersect x y)]
    [(or `(,(? Set? x) ,(? (negate Set?) y))
         `(,(? (negate Set?) y) ,(? Set? x)))
     (member? x y)])  ; member? returns y or (gensym)
  ...)
```

This gets us the value to set. Note that this also changes the meaning of consistent? to what might be called makeConsistent: an operator that takes two values of some type and returns a value that is consistent with both of them.

This can be achieved by way of generics. The various set functions can be used with any values that they need, and Connector only needs to know about the generic function makeConsistent. This allows us to define any number of types, each with its own definition of makeConsistent, without requiring any changes to Connector -- the two concerns are completely separated.

Going back to the original point of this post, we still have to change Connector's use of consistent? to makeConsistent, and change the semantics of the constraint solver so that any constraint can set values on the connector (this was the case before) but may no longer be recorded as the setter, since the value that is finally set doesn't belong to any of them. However, assuming the makeConsistent functions are correctly implemented, we have to notify neither the setter of the old value nor the setter of the new one, since the result will be consistent with them both. This may make the logic slightly more complicated, but the advantage of allowing users to extend the system with their own types outweighs that concern.
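As a concrete illustration of the generics approach, here is a minimal sketch using Racket's `define-generics`. The names `gen:mergeable`, `make-consistent`, and the `sset` struct are my own assumptions for this sketch, not code from the post; a real Connector would call `make-consistent` instead of `consistent?` and store whatever value comes back, without ever inspecting the types involved.

```racket
#lang racket
(require racket/generic)

;; Hypothetical generic interface: each datatype supplies its own way of
;; merging two values into one value consistent with both, or signals a
;; contradiction when no such value exists.
(define-generics mergeable
  (make-consistent mergeable other))

;; A set-valued datum. Merging two sets intersects them; merging a set
;; with a plain value tests membership and narrows to that value.
(struct sset (elems)
  #:transparent
  #:methods gen:mergeable
  [(define (make-consistent a b)
     (cond
       [(sset? b)
        (define common (set-intersect (sset-elems a) (sset-elems b)))
        (if (set-empty? common)
            (error 'make-consistent "contradiction")
            (sset common))]
       [(set-member? (sset-elems a) b) b] ; collapse to the single value
       [else (error 'make-consistent "contradiction")]))])

;; Usage sketch: the connector just dispatches through the generic.
(make-consistent (sset (set 1 2 3)) (sset (set 2 3 4))) ; a narrower set
(make-consistent (sset (set 1 2 3)) 2)                  ; the plain value 2
```

Because dispatch happens through the generic interface, a user can add an interval type, a symbolic type, and so on, each with its own `make-consistent` method, and Connector's code never changes.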