Subsets of Algebras, Subalgebras, and Preserving Concepts

Let F be a field and let S \subseteq F be closed under addition and multiplication, and suppose that, under these operations, S is itself a field. Is it then necessarily true that the other field concepts on S (identities and inverses) agree with those on F? What about for other algebras?

We suspect that the agreement must in fact hold. We now attempt to prove this.

In any field, identities and inverses are unique; thus, we can denote the identities on S by 0_{S} and 1_{S}, and those on F by 0_{F} and 1_{F}. We use similar notation for inverses.

We know that if 1_{F} \in S then we must have 1_{S} = 1_{F} (by the standard argument showing that identities are unique), and similarly for 0_{F}. So let’s try to show that 1_{F} \in S.

For any a \in S, we have a1_{S} = a. But since a \in F, we also have a1_{F} = a. Thus, a1_{S} = a1_{F}. This is an equation in F, as we don’t yet know that 1_{F} is in S. But in F we can cancel a provided we choose a \neq 0_{F}, in which case we get 1_{S} = 1_{F}, as desired. Choosing such an a is possible as long as S \neq \left\{ 0_{F} \right\}; and since S is a field, it contains at least two elements, so this holds. Thus, the multiplicative identities do in fact agree.

The argument is even simpler for 0_{S} = 0_{F}, since additive cancellation works without choosing a nonzero element. Thus, the additive identities do in fact agree.

Now, we know that in F no element can have two inverses, additive or multiplicative. And since the identities agree between F and S, the inverses must agree too. Thus, S is automatically closed under the identities and inverses of F, as desired.
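As a quick sanity check (not part of the proof), here’s the phenomenon in a small concrete case: GF(2) = {0, 1} sitting inside GF(4), with the GF(4) operation tables written out by hand.

```python
# GF(4) = {0, 1, a, b} with b = a + 1, built from explicit operation
# tables: characteristic 2 (so x + x = 0), and multiplicatively
# a * a = b, b * b = a, a * b = 1.
E = ["0", "1", "a", "b"]

add = {}
for x in E:
    add[("0", x)] = add[(x, "0")] = x  # 0 is the additive identity
    add[(x, x)] = "0"                  # characteristic 2
add[("1", "a")] = add[("a", "1")] = "b"
add[("1", "b")] = add[("b", "1")] = "a"
add[("a", "b")] = add[("b", "a")] = "1"

mul = {}
for x in E:
    mul[("0", x)] = mul[(x, "0")] = "0"
    mul[("1", x)] = mul[(x, "1")] = x  # 1 is the multiplicative identity
mul[("a", "a")] = "b"
mul[("b", "b")] = "a"
mul[("a", "b")] = mul[("b", "a")] = "1"

# The subset S = {0, 1} is closed under both operations...
S = {"0", "1"}
assert all(add[(x, y)] in S and mul[(x, y)] in S for x in S for y in S)

# ...and, as the argument predicts, its multiplicative identity is the
# ambient one.
id_S_mul = [e for e in S if all(mul[(e, x)] == x for x in S)]
print(id_S_mul)  # ['1']
```

Of course, for fields this is exactly what the cancellation argument above guarantees; the check just makes it tangible.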

What are some algebraic structures for which this doesn’t hold?

Well, we’ve seen it for rings in my ring theory class: for example, if R is a ring and e is a non-identity idempotent (there are many R for which nontrivial such e exist), then e is an identity for eRe. So it’s possible for eRe \subseteq R to be closed under addition and multiplication (which eRe always is) and to be a ring in its own right (which it is), but with a different identity.
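To make this concrete, here’s a minimal sketch in Python, taking R to be the ring of 2×2 integer matrices and e the idempotent [[1, 0], [0, 0]] (my choice of instance; any nontrivial idempotent works):

```python
# The eRe example, concretely: R = 2x2 integer matrices, and
# e = [[1, 0], [0, 0]] is a nontrivial idempotent (e*e == e, e != I).
def mmul(A, B):
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

e = ((1, 0), (0, 0))
I = ((1, 0), (0, 1))
assert mmul(e, e) == e and e != I

# Elements of eRe have the form [[x, 0], [0, 0]]; e is an identity for
# them, even though the identity of R itself is I.
r = ((3, 5), (7, 2))
s = mmul(mmul(e, r), e)
print(s)  # ((3, 0), (0, 0))
assert mmul(e, s) == s == mmul(s, e)
```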

Can we come up with a simpler example? (I’m currently writing an abstract algebra textbook, and it’d be nice to have a simpler example there.) Can we do a “simple” example for groups?

Say S \subseteq G is closed under multiplication and S on its own is a group (with the same multiplication operation.) Can we find such S and G such that S has an identity and/or inverse different from G?

Well, say some a \in S had a different inverse in S than in G: for some b \in S, we have ab = ba = e_{S}, and since S is a group, b is unique. We also have an inverse in G, ac = ca = e_{G}, where c is unique. If e_{G} = e_{S}, then ab = ac, and cancellation in the overall group G implies b = c. Thus, we must have e_{G} \neq e_{S}. Conversely, if the identities don’t agree, then neither do the inverses: if b = c, we’d have e_{S} = ab = ac = e_{G}. Thus, for such an example to exist, the identities can’t agree, and the inverses can never agree for any element.

For the identities to not agree, we know that S must not contain e_{G}. In fact, e_{G} \notin S is equivalent to our desired condition, as e_{G} can’t agree with the identity of S if it is not in S. So let’s try to find an S that is closed under multiplication and a group, but not containing e_{G}.

Actually, we can use the same field reasoning to show that such an S can’t exist. Assuming S is nonempty (as a basic condition for our discussion), let a \in S. We have ae_{S} = a and ae_{G} = a. Thus, ae_{S} = ae_{G}, as an equation in G, and in G we can cancel a to yield e_{S} = e_{G}.

So any subset of a group that is closed just under multiplication and a group on its own must be a subgroup, with the identity and inverses agreeing.
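We can brute-force this conclusion on a small example, say (Z/6, +): every subset closed under the operation that is a group in its own right turns out to contain the ambient identity 0.

```python
# Brute-force check in (Z/6, +): every nonempty subset closed under the
# operation that is a group on its own (has its own identity and
# inverses) contains the ambient identity 0.
from itertools import combinations

G = range(6)
op = lambda x, y: (x + y) % 6

def is_group(S):
    S = set(S)
    if any(op(x, y) not in S for x in S for y in S):
        return False  # not closed under the operation
    ids = [e for e in S if all(op(e, x) == x == op(x, e) for x in S)]
    if len(ids) != 1:
        return False  # no identity of its own
    e = ids[0]
    return all(any(op(x, y) == e for y in S) for x in S)  # inverses exist

groups = [S for n in range(1, 7) for S in combinations(G, n) if is_group(S)]
print(groups)  # [(0,), (0, 3), (0, 2, 4), (0, 1, 2, 3, 4, 5)]
assert all(0 in S for S in groups)
```

Unsurprisingly, the survivors are exactly the subgroups of Z/6, matching the proof above.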

Rings are, in particular, groups under addition, so why does this result hold for groups but fail for rings? It’s because of cancellation, which comes from the inverses. But rings don’t have multiplicative inverses. So any subset of a ring that is a ring in its own right must have the same additive identity 0, but it need not have the same multiplicative identity (under multiplication, rings are just monoids.) In particular, the ring example shows that a subset of a monoid that is a monoid on its own need not have the same identity.
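Here is one candidate for the “simpler example,” offered as a suggestion: inside the monoid (Z, ×), whose identity is 1, the subset {0} is closed under multiplication and is a monoid whose identity is 0.

```python
# A minimal instance of the monoid phenomenon (a suggested example,
# simpler than eRe): inside the monoid (Z, *) with identity 1, the
# subset {0} is closed under * and is a monoid -- with identity 0.
S = {0}
assert all(x * y in S for x in S for y in S)  # closed under multiplication
assert all(0 * x == x == x * 0 for x in S)    # 0 is an identity for S
assert 1 not in S                             # the ambient identity is absent
```

This is really the idempotent example in miniature: 0 is an idempotent of (Z, ×), and {0} is the analogue of eRe.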

For my abstract algebra textbook, how do I cover this? I’m currently starting with fields (somewhat nonstandardly), so do I continue with this, or should I re-architect things to have better examples upfront? Actually, I think it’s not worth it to re-architect, both because of the effort on my part and because of the familiarity of fields (much of real-number algebra transfers over “cleanly”; this is the reason I started with fields.) Though readers may not see concrete examples of the value of the distinction at the beginning, they should have enough mathematical rigor to see why the distinction is necessary and what is being proven here.

It seems that cancellation is necessary to have subsets that are algebras be subalgebras. Is this always true? Can we show that (for one operation first, for simplicity) the subsets of an algebra A that are algebras are in fact subalgebras if and only if A is a group (or a special kind of group like an abelian group)?

Let’s formalize this. Let \mathcal{A} be a class of algebras, and suppose that for any A \in \mathcal{A}, all subsets of A that are members of \mathcal{A} are in fact subalgebras of A. Must \mathcal{A} be a subclass of the class of groups?

Actually, “subalgebra” generally just means that the operations are closed. The idea that we wanted the identities and inverses to agree too, when they were not part of the signature, is our informally motivated desire, not something included in this formalization. Really, what this is about is translating between signatures. If we have signatures L \subseteq L^{'}, then what we’re investigating is: for which axiom sets A in the language of L is it the case that, in every model of A, every subset closed under the L-operations is also closed under the induced L^{'}-operations?

This is somewhat connected to something I was investigating with my logical generation theory. There, I was looking at the difference between, say, groups under the equational logic definition and groups under the first-order logic definition. The conclusion we reached there, which we should remember here, is that in general, definitional equivalence is ultimately an informal concept, and, modulo the “measure of equivalence,” amounts to the existence of an isomorphism between the sets/classes of structures that the definitions define. Or maybe, since isomorphism has more formalizable meanings in contexts involving operations, relations, and so on, really this just amounts to a bijection with additional conditions; what these conditions are is in general not formally defined (informal), but they can be formalized depending on the context.

But in the case of one signature being contained in another, we can capture our goal more formally: if L \subseteq L^{'} are languages, and A,A^{'} are axiom sets in the respective languages, then we say that A and A^{'} are equivalent if, for every model of A, we can keep the same operations for the symbols in L and define unique operations for the remaining symbols of L^{'} (the symbols of L^{'} - L) such that it becomes a model of A^{'}, and if every model of A^{'} is accounted for uniquely in this manner. To be clear, this doesn’t depend on any kind of “definability”: the condition is only that the extra operations (or relations in general) for L^{'} - L must exist and be unique.

Is it then actually true that groups in \left\{ \times \right\} and groups in \left\{ \times ,e,x^{- 1} \right\} are equivalent, under this definition? (In this case, clearly L = \left\{ \times \right\} and L^{'} = \left\{ \times ,e,x^{- 1} \right\}.) Clearly, we know that operations for L^{'} - L exist, based on the standard construction of them; are they necessarily unique? Is it possible for there to be two different group structures on the same underlying set with the same multiplication? Say we have \left\{ e_{1},i_{1}:G \rightarrow G \right\} and \left\{ e_{2},i_{2}:G \rightarrow G \right\} as the two structures. We have e_{1}a = ae_{1} = a for all a, as well as e_{2}a = ae_{2} = a for all a. Thus, e_{1}a = e_{2}a for all a. Both structures satisfy associativity; multiply on the right by either of the inverses, say i_{1}(a), to get \left( e_{1}a \right)i_{1}(a) = \left( e_{2}a \right)i_{1}(a), which by associativity yields e_{1}\left( ai_{1}(a) \right) = e_{2}\left( ai_{1}(a) \right). From the first group structure, we have ai_{1}(a) = e_{1}, so e_{1}e_{1} = e_{2}e_{1}. From the first group structure, the LHS is e_{1} and the RHS is e_{2} (as far as the first group structure is concerned, e_{2} is just some element of the group, which multiplication by the identity e_{1} leaves fixed.) Thus, e_{1} = e_{2}.

But inverses are unique, so since the identities are the same, the inverse operations must be the same too. This implies that the group structures actually coincide, so we have uniqueness of operations in L^{'} - L.
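To see the uniqueness argument play out on a concrete table, here’s a small check: from the bare multiplication of Z/4 we recover exactly one identity, and from it exactly one inverse map, so there is no second group structure with the same multiplication.

```python
# From the bare multiplication table of Z/4, recover the unique identity
# and the unique inverse map -- the group structure on {x} is forced.
G = range(4)
op = lambda x, y: (x + y) % 4

identities = [e for e in G if all(op(e, x) == x == op(x, e) for x in G)]
print(identities)  # [0]
assert len(identities) == 1

e = identities[0]
inverse = {x: [y for y in G if op(x, y) == e == op(y, x)] for x in G}
assert all(len(v) == 1 for v in inverse.values())
print({x: v[0] for x, v in inverse.items()})  # {0: 0, 1: 3, 2: 2, 3: 1}
```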

OK, what about the rest of the conditions in our definition of definitional equivalence? We need every model of A^{'} to be accounted for uniquely by this construction. Well, given a model M^{'} of A^{'}, certainly we can produce a model of A that maps to it (just drop the identity and inverse operations; the dropped values then witness the existence conditions in A.) And no two models of A map to the same model of A^{'} in this way: if two models A_{1},A_{2} of A mapped to M^{'}, then the underlying sets of A_{1} and M^{'}, and of A_{2} and M^{'}, must be the same, so A_{1},A_{2} have the same underlying set; similarly, they have the same multiplication, which shows that A_{1} and A_{2} are the same model (since a model of A is determined by its underlying set and multiplication.)

By the way, this discussion is actually closer to what I was doing earlier with my “constructed compatibility theory” (as I called it then), until I decided to change the formalization of that since it didn’t match what I intuitively wanted. Or really, this is a special case of that theory.
