Don’t confuse the “and”/“or” operators with the “&&”/“||” operators.
In the early years of computing, there was just one “and” operator, because it mapped directly to a machine-language instruction that performed a “bitwise and”. That was good enough for truth tests, too: 00000001 AND 00000001 yields 00000001, and since the result is nonzero, it counts as true.
Later, we used the symbol “&” for “and” and “|” for “or”. And this was fine, for a while.
As people created more sophisticated programming languages, there was a desire to separate “logical and” from “bitwise and”. A “bitwise and” combines the corresponding bits of two binary values and produces a new binary value. A “logical and” returns true or false, depending on whether both of the expressions being compared are true. This resulted in using “&” for “bitwise and” and “&&” for “logical and”.
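The distinction is easy to see in action. Here is a short sketch (Ruby is assumed here purely for illustration):

```ruby
# Bitwise AND: combines corresponding bits, yielding a new integer.
bits = 0b0101 & 0b0011   # 0b0001
puts bits                # prints 1

# Logical AND: yields true only when both operands are true.
puts(true && true)       # prints true
puts(true && false)      # prints false
```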
There was also some disagreement about where “logical and” should fall in the order of operations. So in some languages a compromise was reached: two “logical and” operators, the symbol “&&” and the word “and”, differing only in operator precedence.
In languages such as Ruby and Perl, the English versions, “and” and “or”, have lower operator precedence than the assignment operator.
x = y and z means (x = y) and z
x = y && z means x = (y && z)
In the first, the assignment happens first, and z is evaluated only if the value assigned to x (that is, y) is true.
In the second, x is assigned the result of y && z: true if both y and z are true, false otherwise.
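The two parses can be demonstrated directly. This sketch assumes Ruby, where “and” binds more loosely than assignment:

```ruby
# "and" binds more loosely than "=", so the assignment happens first.
x = true and false    # parsed as (x = true) and false
puts x                # prints true

# "&&" binds more tightly than "=", so the right side is evaluated first.
x = true && false     # parsed as x = (true && false)
puts x                # prints false
```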
As you can see, these are two totally different actions. Be mindful of which operator you use, and if ever in doubt, add parentheses.