What does the single ampersand (&) sign mean in Ruby?

I saw the & sign recently in Ruby and after searching for a while I can't find an explanation online.
As in:
62 & 15 #=> 14

This is the Fixnum#& method (Integer#& in current Ruby versions), which works as the bitwise AND (&) operator in Ruby.
Bitwise AND operator &
The & (bitwise AND) operator compares each bit of its first operand to the corresponding bit of the second operand. If both bits are 1's, the corresponding bit of the result is set to 1. Otherwise, it sets the corresponding result bit to 0.
Both operands must have an integral or enumeration type. The usual arithmetic conversions on each operand are performed. The result has the same type as the converted operands.
Because the bitwise AND operator has both associative and commutative properties, the compiler can rearrange the operands in an expression that contains more than one bitwise AND operator.
The following example shows the values of a, b, and the result of a & b represented as 16-bit binary numbers:
bit pattern of a 0000000001011100
bit pattern of b 0000000000101110
bit pattern of a & b 0000000000001100
Note: The bitwise AND (&) operator should not be confused with the logical AND (&&) operator. For example,
1 & 4 evaluates to 0
while
1 && 4 evaluates to 4, which is truthy (in Ruby, && returns its second operand when the first is truthy)
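You can check this directly in irb; the binary forms below are printed with Integer#to_s(2):
62.to_s(2)         #=> "111110"
15.to_s(2)         #=> "1111"
(62 & 15).to_s(2)  #=> "1110"
62 & 15            #=> 14
1 & 4              #=> 0   (the bit patterns 001 and 100 share no bits)
1 && 4             #=> 4   (&& just returns its second operand here)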

& is a bitwise and.
See also:
Ruby operators at Tutorialspoint

Related

Can the shunting-yard algorithm be used to parse expressions containing a mix of logical, comparison and arithmetic operators?

According to Wikipedia the shunting-yard algorithm is used to parse mathematical expressions. But is there any reason it could not be used with a mix of logical and arithmetic expressions, as well as comparisons?
As an example, could one use it to parse this:
a+b<17 && a+b>3 || a==b
As far as I can see, you could just define logical operators to have the lowest precedence, followed by comparison operators, and then have the usual arithmetic operators with increasing precedence. Or am I missing something?
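To illustrate that proposal, a hypothetical precedence table for the example (the numbers are arbitrary; only the relative order of the levels matters) could look like this in Ruby:
# Hypothetical precedence table: a larger number binds more tightly.
PRECEDENCE = {
  "||" => 1,                          # logical OR, lowest
  "&&" => 2,                          # logical AND
  "==" => 3, "<" => 3, ">" => 3,      # comparisons
  "+"  => 4, "-" => 4,                # additive
  "*"  => 5, "/" => 5,                # multiplicative, highest
}
# With these levels, a+b<17 && a+b>3 || a==b groups as
# (((a+b) < 17) && ((a+b) > 3)) || (a == b)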

Conceptual issue while converting infix notation to postfix notation

How can I work out the precedence of operators while converting infix notation to postfix notation:
" * ", " / ", " + ", " - ", " ^ ", " ) ", " ( ".
I understand that one can just look at an algorithm for this and solve this but I don't want to do that. What should be my thought process?
Operator precedence is a convention and property of a formal infix language to indicate which operations should evaluate first. A higher precedence operation means that it should operate before lower precedence operations.
Grouping parentheses are not part of the operator precedence. (Other kinds of parentheses, such as function call parentheses, may be; I am assuming you do not mean those here.) They are used to explicitly indicate the order of operations, and they are only needed for this in infix notation. The purpose of an operator precedence convention in a given language is to avoid parentheses in most cases. So, for example, if I want to multiply 4 by 5 and then add 7 to the result, I can write:
4*5+7
This is valid under normal arithmetic operator precedence rules because multiplication ('*') has higher precedence than addition ('+'). But if I want to add 3 and 4 and then multiply this result by 8, I need to write:
(3+4)*8
In this case I want the order of operations to be different from the normal order of "higher precedence operations first." In other words, parentheses are only necessary when we are using infix notation and want operations to execute in an order other than precedence order.
In standard arithmetic, exponentiation ("^") has the highest precedence. Next is multiplication and division (equal precedence) and finally addition and subtraction. Therefore, an infix expression written using these operators without parentheses will evaluate all exponentiation first, then all multiplications and divisions (in left to right order) and finally all additions and subtractions, again in left to right order.
If you want to infer the operator precedence of an unknown language, you would need to look at the places where parentheses are and are not used. Since it is valid to use parentheses everywhere even when unnecessary, this is only a heuristic. For the examples above, I can write:
((4*5)+7)
And this gives no hint about operator precedence, because every binary operator here is fully parenthesized; at least one of the two sets of parentheses is redundant (assuming addition and multiplication do not share the same precedence).
Similarly, looking at the next example:
(3+4)*8
since parentheses were used around the addition but not the multiplication, we can infer that probably in this language addition has lower precedence than multiplication. Otherwise, the parentheses would be redundant. So look for the pattern where parentheses are and are not used to try to figure out operator precedence in unknown languages. It is more common to assume a certain precedence level based on the formal specification of the language under consideration. Most formal languages have an operator precedence chart for the infix form to avoid this ambiguity.
We never need parentheses in prefix or postfix languages because the order of terms and operators already makes the order of evaluation explicit. Therefore this issue is really an infix-language-specific problem.
If parentheses are balanced properly you can always find a parenthesis-free subexpression, which reduces the problem to that case.
Now just ask yourself, according to precedence rules, which operation in such an expression should be performed first?
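For a concrete feel of how a precedence table drives the conversion, here is a minimal shunting-yard sketch in Ruby. It assumes already-tokenized input, treats every operator as a left-associative binary operator (a simplification; ^ is usually right-associative), and does no error handling:
PREC = { "+" => 1, "-" => 1, "*" => 2, "/" => 2, "^" => 3 }

def to_postfix(tokens)
  output = []
  ops = []
  tokens.each do |tok|
    case tok
    when "("
      ops.push(tok)
    when ")"
      output << ops.pop while ops.last != "("
      ops.pop                                   # discard the "("
    when *PREC.keys
      # Left associativity: pop operators of higher or equal precedence first.
      output << ops.pop while ops.any? && ops.last != "(" && PREC[ops.last] >= PREC[tok]
      ops.push(tok)
    else
      output << tok                             # operand
    end
  end
  output.concat(ops.reverse)
end

to_postfix(%w[( 3 + 4 ) * 8])  #=> ["3", "4", "+", "8", "*"]
to_postfix(%w[4 * 5 + 7])      #=> ["4", "5", "*", "7", "+"]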

Why does a unary operator have associativity?

In an expression like "10 - 3 - 2", it's easy to understand why the - and + operators are left-associative: to match mathematical convention and get 5 instead of 9 as the result. As I understand it, associativity determines the order of evaluation when operators have the same precedence level.
But what relevance does this have for unary operators? I don't get why a unary operator has associativity.
Note: The question is about general programming, but if you have to answer in a language dependent manner C is preferred.
**Arithmetic Operators** (Left-to-Right)
+ Additive operator (a + b)
- Subtraction operator (a - b)
* Multiplication operator (a * b)
/ Division operator (a / b)
% Remainder operator (a % b)
**Unary operators** (Right-to-Left)
+ Unary plus operator; indicates positive value (numbers are positive without this, however)
- Unary minus operator; negates an expression
++ Increment operator; increments a value by 1
-- Decrement operator; decrements a value by 1
! Logical complement operator; inverts the value of a boolean
But when we consider unary operators, for example:
a = +1
a = -1
a++
a-- and so on,
the 10 - 3 - 2 you mention does not involve any unary operation at all.
So the evaluation proceeds left-to-right. Therefore:
10 - 3 equals 7, then
7 - 2 equals 5
Not as given below (arithmetic operators always associate left-to-right, not right-to-left):
3 - 2 equals 1, then
10 - 1 equals 9, which is wrong.
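Ruby (where the question's 10 - 3 - 2 example behaves the same way) shows the contrast directly: binary operators of equal precedence group left-to-right, while stacked prefix operators simply apply from the inside out, i.e. right-to-left:
10 - 3 - 2   #=> 5      binary minus: (10 - 3) - 2
- -5         #=> 5      two unary minuses: -(-5), applied right-to-left
!!false      #=> false  two logical NOTs: !(!false)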
For further details, please check the below references:
Precedence and Associativity
Assignment, Arithmetic, and Unary Operators (I'm not very familiar with C, but these operators are common across languages.)
Unary operators don't have associativity, really. At least, not as individual operators. As you say, it's impossible for a unary operator to compete with another instance of itself for the same operand; that can only happen with a binary operator.
On the other hand, the grammatical definition of associativity doesn't just apply between two identical operators. It applies between two operators of the same class. For example, + and - associate mutually, since they both have the same precedence.
Postfix operators (including, for example, array subscripting and function application) normally bind more tightly than any other operator, including prefix operators. We could express that by putting postfix operators in a higher precedence level, or by putting prefix and postfix operators in the same precedence level and then saying that that level associates to the right.
Both of those descriptions would explain why *f() means "dereference the result of calling f", i.e. *(f()), and not "call the function referenced by what f points at", i.e. (*f)().
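The same interaction is easy to poke at in Ruby, for what it's worth: the "postfix" syntax of indexing and method calls binds more tightly than a prefix operator, so the prefix operator applies to the result of the call:
a = [3, 4]
-a[0]      #=> -3   parsed as -(a[0]); (-a)[0] would fail, since Array has no unary minus
-a.first   #=> -3   parsed as -(a.first)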
Personally, I find it more intuitive to put postfix operators in a different precedence level than prefix operators, and prefix operators in their own precedence levels as well. But the original precedence declaration syntax for yacc did not provide a declaration for precedence levels without associativity. (%nonassoc is different; it prevents chaining, which actually modifies the grammar.) So you had to declare unary operators as %left or %right. Of those two choices, %right makes more sense, because postfix operators (usually) associate more strongly than prefix operators. So that's the usual style. But it's really not very clear.
Bison does allow you to declare a precedence level without associativity, using the %precedence declaration. That's handy for unary operators. But many other yacc-based parser generators do not provide this feature, so you'll still often see unary operators declared with %right.

Shunting yard algorithm for immediate evaluation

Generally, programs which evaluate an infix mathematical expression use some variation of the Shunting Yard Algorithm to first translate the expression into Reverse Polish Notation, and then evaluate the Reverse Polish Notation to get a single final value.
My question is, is there some well-known algorithm that bypasses the INFIX -> RPN step, and just evaluates the initial infix expression in place, using some kind of recursive descent parsing?
Presumably, it might be useful when writing compilers or parsers to translate INFIX -> RPN. The RPN is sort of a "compiled form" of the expression (an AST), which can be more easily evaluated by a computer using a simple output stack. But, if you're just simply writing code that immediately translates an infix expression into a numeric output value, you might not have any use for caching the intermediate RPN form.
So, is there any well-known algorithm for parsing infix expressions WITHOUT first converting to RPN? Or is converting to RPN generally more efficient than any other approach?
You can easily modify the shunting-yard algorithm to evaluate the expression immediately as you go rather than building up an RPN representation. Specifically, whenever you would normally pop an operator off the operator stack and append it to the output, instead pop two operands off the operand stack, apply the operator to them, and push the result back onto the operand stack. Finally, at the very end, drain what remains on the two stacks the same way: repeatedly pop two operands and an operator, compute the result, and push it back, until a single value is left.
For example, to evaluate 3 * 4 + 2, you'd do the following:
Process 3:
Operators
Operands 3
Process *:
Operators *
Operands 3
Process 4:
Operators *
Operands 3 4
Process +:
Operators +
Operands 12
Process 2:
Operators +
Operands 12 2
End of input:
Operators
Operands 14
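Here is one way that might look in Ruby, as a rough sketch (space-separated tokens, left-associative binary operators only, no parentheses, no error handling):
PREC = { "+" => 1, "-" => 1, "*" => 2, "/" => 2 }

def apply_top(operators, operands)
  op = operators.pop
  right = operands.pop
  left = operands.pop
  operands.push(left.send(op, right))    # e.g. 3.send("*", 4) #=> 12
end

def evaluate(expr)
  operators = []
  operands = []
  expr.split.each do |tok|
    if PREC.key?(tok)
      # Reduce anything of higher or equal precedence before pushing tok.
      apply_top(operators, operands) while operators.any? && PREC[operators.last] >= PREC[tok]
      operators.push(tok)
    else
      operands.push(Integer(tok))
    end
  end
  apply_top(operators, operands) while operators.any?
  operands.pop
end

evaluate("3 * 4 + 2")  #=> 14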
Hope this helps!

Manually evaluating logical expressions

I'm not sure if this belongs here, but I was told to evaluate
(00110101 ^ (10010101 v 10100000))
What is my answer supposed to look like?
I was wondering how I would do this.
I'm thinking of treating each of those values as a variable, like (a ^ (b v c)),
then making a truth table? Is that what I'm supposed to do?
Do the stuff in the parentheses first:
10010101
| 10100000
-----------
101.....
Then AND the result with the first number:
00110101
& 101.....
-----------
001.....
For bitwise logic, simply evaluate the expression bit by bit. Take the first bit of b and the first bit of c and OR them, then AND the result with the first bit of a. Then move to the second bit, and so on. You should end up with a bit string of the same length as the starting strings.
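If you want to sanity-check your hand-worked bits, one way (sketched here in Ruby, and reading ^ as AND and v as OR, as above) is to parse the bit strings as integers, combine them with | and &, and format the result back as an 8-bit string:
a = "00110101".to_i(2)
b = "10010101".to_i(2)
c = "10100000".to_i(2)
(a & (b | c)).to_s(2).rjust(8, "0")   # the full 8-bit answer, filled in bit by bit above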
