Boolean Algebra Hard Examples


Smacka Shock

Aug 3, 2024, 5:17:33 PM
to fortofurtka

I have recently been studying computer science and was introduced to Boolean algebra. It seems that Boolean algebra is used to simplify logic gates in hardware in order to make the circuit design minimal and thus cheaper. Is there any similar way to use it to reduce the number of lines of code in your software, in higher-level languages like C++, C# or any other language?

Note that Boolean algebra is not just a tool for simplifying logic gates in hardware. It can sometimes be used for such cases (as well as for the opposite, or for completely different purposes).

For example, if your program contains an overly complicated Boolean expression or sequence of conditionals, Boolean algebra might help you simplify the expression and the surrounding code. But that does not necessarily lead to fewer lines of code. In fact, complex one-line Boolean snippets sometimes become more maintainable when you split them up into several lines, and Boolean algebra can help you do this correctly, as in the sketch below.
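Here is a hedged illustration of that kind of split (the user object and its fields are invented for this example, not taken from any real code):

    # A dense one-liner (hypothetical):
    #   if user.age >= 18 and (user.has_consent or user.is_emancipated) and not user.is_banned: ...
    # The same condition, split into named parts. Boolean algebra
    # guarantees the rewrite computes exactly the same value.
    def may_register(user):
        is_adult = user.age >= 18
        has_permission = user.has_consent or user.is_emancipated
        return is_adult and has_permission and not user.is_banned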

So IMHO your question is like asking, "can I use a pocket calculator to find the shortest route when traveling from A to B?" Sure you can: take a map with distance information for the individual roads, use the calculator to add them up, and then pick the route with the smallest sum. But you could also use the calculator for finding longer routes, or for calculating completely different things.

Consider, say, a function that runs a list of checks on a person object and exits as soon as one of them is true, like the sketch below. The meaning of that is pretty clear, and anyone who reads it, even a non-programmer, can understand what checks we're doing.
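A minimal sketch of that pattern (the person fields here are invented for illustration):

    # Early-return guard clauses: each check is one readable line,
    # and the function exits as soon as any check is true.
    def is_ineligible(person):
        if person.age < 18:
            return True
        if not person.has_valid_id:
            return True
        if person.is_banned:
            return True
        return False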

So no, you do not reduce the number of lines of code using Boolean algebra. You use it whenever there is a binary choice to be made. If your problem is complex, you are going to have a lot of such choices in your software solution.

Well, OK. The reason electronics engineers want to reduce the number of logic gates in a circuit is probably a matter of construction cost and operational performance of their final product. They are building the very thing that will do the work in the end.

When you are programming in any language other than assembly, the computer will not execute your code in the end, but a transformed version of it. This transformation from the programmer-friendly language to the computer-friendly language may be called compilation. Boolean logic is indeed used by the compiler to achieve operational efficiency.

You may code in assembly and do the compiler's job yourself, but on a day-to-day basis it is less and less worth the hassle. Modern compilers know more and more of the tricks, and apply them better than you would.

On the other hand, if you ask whether there exists a similar practice in higher-level languages: yes, it's called refactoring, which means applying well-defined transformations to code that change its structure and not its behaviour. In the 90s, Martin Fowler published a book that is a catalog of probably the most recurring basic refactorings in object-oriented languages.
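One of the refactorings in that catalog, Decompose Conditional, is a good example; the sketch below follows its spirit (the seasonal-rate domain is the book's classic illustration, but the exact names here are mine):

    # Before: the intent is buried inside the expression.
    #   if date.month < 3 or date.month > 11: charge = quantity * winter_rate
    # After: the condition gets a name that states the intent.
    def is_winter(date):
        return date.month < 3 or date.month > 11

    def charge(date, quantity, winter_rate, summer_rate):
        rate = winter_rate if is_winter(date) else summer_rate
        return quantity * rate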

However, in the case of high-level languages, this may cause more problems than it solves. High-level languages are designed for readability and ease of understanding. You can simplify several lines of code into a single one, but it may become unreadable and complex, so that much more time is required to grasp the idea of the code. Most compilers perform this kind of optimization by default anyway.

All in all, I suggest not thinking too much about reducing lines of code; consider instead the runtime behaviour of the program, its memory usage and its architectural design. Generally, when you apply Boolean algebra to your circuits by hand, you are doing the job that the compiler already does for most existing languages.

Usually, breaking a complex line into multiple, simpler lines will increase readability without significantly reducing performance. Only if benchmarking shows that there is an unacceptable performance issue should you refactor to increase performance at the expense of readability.

To add to the existing discussion, including its idea that fewer lines is good if it also enhances readability, I'll just mention one thing you can do with Boolean algebra in a language where all values are truthy or falsy, such as Python. Here's a FizzBuzz solution in Python, written in a style that's achievable in just about any high-level language:
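(A sketch of such a solution; the exact variable names are my own:)

    # Build the output with plain conditionals, then use Python's `or`
    # to fall back to the number: an empty string is falsy, so
    # `s or i` evaluates to i exactly when no word was appended.
    for i in range(1, 101):
        s = ""
        if i % 3 == 0:
            s += "Fizz"
        if i % 5 == 0:
            s += "Buzz"
        print(s or i)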

In this case the benefit comes because the same function, print, needn't be mentioned in two lines of which only one is executed; we therefore seek a single expression that equals the right argument in every scenario, and we can get it using this not-all-that-little-known fact about how Python handles Boolean operators: x or y evaluates to x when x is truthy, and to y otherwise. I'd hate to have to write the two-branch version instead:
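That is, the last line above replaces something like this (shown with the same variables as the sketch):

    # Two mentions of print, only one of which runs per iteration.
    if s:
        print(s)
    else:
        print(i)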

You'll find several Python solutions to FizzBuzz here, using this trick but differing in what further Pythonic tricks they use to shorten the code even further. (They even show you can do the whole thing in one line, once you learn what happens when you multiply strings by integers.) These might be less conducive to readability, although of course they're worth learning how to do.
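For instance, string multiplication lets the whole body collapse into one expression (a sketch of that style, not any particular solution from the thread):

    # "Fizz" * True is "Fizz" and "Fizz" * False is "", so the
    # concatenation yields "", "Fizz", "Buzz" or "FizzBuzz", and
    # `or` again falls back to the number when the string is empty.
    for i in range(1, 101):
        print("Fizz" * (i % 3 == 0) + "Buzz" * (i % 5 == 0) or i)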

Boolean algebra was introduced by George Boole in his first book The Mathematical Analysis of Logic (1847),[1] and set forth more fully in his An Investigation of the Laws of Thought (1854).[2] According to Huntington, the term "Boolean algebra" was first suggested by Henry M. Sheffer in 1913,[3] although Charles Sanders Peirce gave the title "A Boolian [sic] Algebra with One Constant" to the first chapter of his "The Simplest Mathematics" in 1880.[4] Boolean algebra has been fundamental in the development of digital electronics, and is provided for in all modern programming languages. It is also used in set theory and statistics.[5]

A precursor of Boolean algebra was Gottfried Wilhelm Leibniz's algebra of concepts. The usage of binary in relation to the I Ching was central to Leibniz's characteristica universalis, and it eventually created the foundations for the algebra of concepts.[6] Leibniz's algebra of concepts is deductively equivalent to the Boolean algebra of sets.[7]

Boole's algebra predated the modern developments in abstract algebra and mathematical logic; it is however seen as connected to the origins of both fields.[8] In an abstract setting, Boolean algebra was perfected in the late 19th century by Jevons, Schröder, Huntington and others, until it reached the modern conception of an (abstract) mathematical structure.[8] For example, the empirical observation that one can manipulate expressions in the algebra of sets, by translating them into expressions in Boole's algebra, is explained in modern terms by saying that the algebra of sets is a Boolean algebra (note the indefinite article). In fact, M. H. Stone proved in 1936 that every Boolean algebra is isomorphic to a field of sets.

In the 1930s, while studying switching circuits, Claude Shannon observed that one could also apply the rules of Boole's algebra in this setting,[9] and he introduced switching algebra as a way to analyze and design circuits by algebraic means in terms of logic gates. Shannon already had at his disposal the abstract mathematical apparatus, thus he cast his switching algebra as the two-element Boolean algebra. In modern circuit engineering settings, there is little need to consider other Boolean algebras, thus "switching algebra" and "Boolean algebra" are often used interchangeably.[10][11][12]

Efficient implementation of Boolean functions is a fundamental problem in the design of combinational logic circuits. Modern electronic design automation tools for very-large-scale integration (VLSI) circuits often rely on an efficient representation of Boolean functions known as (reduced ordered) binary decision diagrams (BDD) for logic synthesis and formal verification.[13]
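To give a rough feel for the idea (a toy sketch, nothing like what production EDA tools do): a Boolean function can be turned into a shared decision graph by Shannon expansion, applying two reduction rules on the way up: drop a node whose two branches are equal, and share structurally identical subgraphs:

    # Toy reduced ordered BDD: a node is a (var, low, high) tuple and
    # terminals are the booleans False and True. Illustrative only;
    # it evaluates f on all 2**n assignments, which real tools avoid.
    def make_bdd(f, num_vars, unique, var=0, assignment=()):
        if var == num_vars:
            return f(*assignment)              # terminal value of f
        low = make_bdd(f, num_vars, unique, var + 1, assignment + (False,))
        high = make_bdd(f, num_vars, unique, var + 1, assignment + (True,))
        if low == high:                        # rule 1: redundant test
            return low
        node = (var, low, high)
        return unique.setdefault(node, node)   # rule 2: share subgraphs

    unique = {}
    maj = make_bdd(lambda a, b, c: (a and b) or (a and c) or (b and c), 3, unique)
    print(len(unique))  # 4 shared internal nodes, versus 8 truth-table rows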

Logic sentences that can be expressed in classical propositional calculus have an equivalent expression in Boolean algebra. Thus, Boolean logic is sometimes used to denote propositional calculus performed in this way.[14][15][16] Boolean algebra is not sufficient to capture logic formulas using quantifiers, like those from first order logic.

Although the development of mathematical logic did not follow Boole's program, the connection between his algebra and logic was later put on firm ground in the setting of algebraic logic, which also studies the algebraic systems of many other logics.[8] The problem of determining whether the variables of a given Boolean (propositional) formula can be assigned in such a way as to make the formula evaluate to true is called the Boolean satisfiability problem (SAT), and is of importance to theoretical computer science, being the first problem shown to be NP-complete. The closely related model of computation known as a Boolean circuit relates time complexity (of an algorithm) to circuit complexity.
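To make the SAT problem concrete, here is a brute-force satisfiability check; it is exponential in the number of variables, whereas practical solvers use much cleverer search:

    from itertools import product

    # Try every assignment of n Boolean variables; the formula is
    # given as a Python predicate over those variables.
    def is_satisfiable(formula, n):
        return any(formula(*bits) for bits in product([False, True], repeat=n))

    # (p or q) and (not p or r) is satisfiable, e.g. by p=False, q=True.
    print(is_satisfiable(lambda p, q, r: (p or q) and (not p or r), 3))  # True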

Boolean algebra also deals with functions whose values lie in the set {0, 1}. A sequence of bits is a commonly used example of such a function. Another common example is the totality of subsets of a set E: to a subset F of E, one can associate the indicator function that takes the value 1 on F, and 0 outside F. The most general example is the set of elements of a Boolean algebra, with all of the foregoing being instances thereof.
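As a small illustration (the particular sets are arbitrary):

    # Indicator function of F as a subset of E: 1 on F, 0 outside F.
    E = [1, 2, 3, 4, 5]
    F = {2, 4}
    def indicator(subset):
        return lambda x: 1 if x in subset else 0
    chi_F = indicator(F)
    print([chi_F(x) for x in E])  # [0, 1, 0, 1, 0]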

One might consider that only negation and one of the two other operations are basic, because of the following identities that allow one to define conjunction in terms of negation and disjunction, and vice versa (De Morgan's laws): x ∧ y = ¬(¬x ∨ ¬y), and x ∨ y = ¬(¬x ∧ ¬y).[22]
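Since each variable ranges over just two values, the identities can be checked exhaustively:

    from itertools import product

    # Verify both De Morgan identities on all four input combinations.
    for x, y in product([False, True], repeat=2):
        assert (x and y) == (not ((not x) or (not y)))
        assert (x or y) == (not ((not x) and (not y)))
    print("De Morgan's laws hold for every x, y")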

All of the laws treated thus far have been for conjunction and disjunction. These operations have the property that changing either argument either leaves the output unchanged or changes it in the same direction as the input: changing any variable from 0 to 1 never results in the output changing from 1 to 0. Operations with this property are said to be monotone. Thus the axioms so far have all been for monotonic Boolean logic. Nonmonotonicity enters via complement, as follows.[5]
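The property is easy to test concretely (a small illustrative check, using 0 and 1 for the truth values):

    from itertools import product

    # Flipping any single input from 0 to 1 must never flip the
    # output from 1 to 0 if the operation is monotone.
    def is_monotone(f, n):
        for bits in product([0, 1], repeat=n):
            for i in range(n):
                if bits[i] == 0:
                    flipped = bits[:i] + (1,) + bits[i + 1:]
                    if f(*bits) == 1 and f(*flipped) == 0:
                        return False
        return True

    print(is_monotone(lambda x, y: x & y, 2))  # True: conjunction
    print(is_monotone(lambda x, y: x | y, 2))  # True: disjunction
    print(is_monotone(lambda x: 1 - x, 1))     # False: complement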
