Hey guys,

During an assignment for my Digital Electronics class, I started trying to simplify a few expressions...well...let me write it out for you.

Given inputs: A, B, C, D
where b = NOT B

Here is the starting equation:
AB + AbCD + CD

Now, the basic rules (absorption: X + XY = X) say I can eliminate the AbCD term, which makes sense, because:
If CD is 0, then AbCD is 0
If CD is 1, then AbCD doesn't matter

So the initial equation simplifies to:
AB + CD
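
(As a sanity check, here's a rough Python sketch, just brute-forcing all 16 input combinations on 0/1 values, that backs this one up:)

from itertools import product

# Rough check: compare AB + AbCD + CD against AB + CD for every input combo.
for A, B, C, D in product([0, 1], repeat=4):
    original   = (A and B) or (A and (not B) and C and D) or (C and D)
    simplified = (A and B) or (C and D)
    assert bool(original) == bool(simplified)
print("AB + AbCD + CD matches AB + CD for all 16 inputs")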

Then I started thinking...

Again, back to the original equation:
AB + AbCD + CD

If B is 0, AB is eliminated, and AbCD reduces to ACD
If B is 1, AB reduces to A, and AbCD is eliminated

Therefore, does that not simplify to:
A + ACD + CD
Which further simplifies to:
A + CD (same reasoning as the first simplification)

However, these two conclusions imply that:
AB + CD = A + CD

Which just makes my mind want to implode. Did I make an error in my thought process? Does B actually not matter? Are they actually equal?
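
(I suppose the same kind of brute-force check would settle it: compare the original expression against A + CD and print any input where they disagree. Again, just a rough Python sketch:)

from itertools import product

# Rough check: does AB + AbCD + CD really equal A + CD?
# Print every input combination where the two expressions disagree.
for A, B, C, D in product([0, 1], repeat=4):
    original = (A and B) or (A and (not B) and C and D) or (C and D)
    second   = A or (C and D)
    if bool(original) != bool(second):
        print("Differ at A=%d B=%d C=%d D=%d" % (A, B, C, D))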

This assignment's already been submitted, and I stuck with the first simplification (the one that keeps B). Can anyone tell me where I went wrong in that second simplification, though?

I would just like to know whether my reasoning is off, because otherwise this may come back to bite me on an exam (and the prof isn't exactly what you'd call easily accessible). I posted it here because I figure most college/university programmers have gone through an introductory Digital Electronics course.

If anybody can shed some light, it would be much appreciated.

Thanks.