Is division from zero bad programming?

In an honors C++ class, a classmate of mine wrote a program in which he divided zero by a nonzero number (specifically 0/8, as integer division). The teacher (who shall not be named) took off massive points, claiming we should have known that division involving zero is bad, and then pointed to the passage in the book saying that division by zero is an error. The student argued that division by zero and division from zero are different things. They had somewhat of an argument, but it was ultimately left unresolved, except that the teacher docked a large number of points from the student's grade for not catching the "error."
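To make the two cases concrete, here is a minimal C++ sketch (variable names are my own) contrasting dividing zero by a nonzero integer with dividing a nonzero integer by zero:

```cpp
#include <iostream>

int main() {
    // Case 1: zero divided by a nonzero integer (the classmate's 0 / 8).
    // This is well defined in C++ and evaluates to 0.
    int numerator = 0;
    int denominator = 8;
    std::cout << numerator / denominator << '\n';  // prints 0

    // Case 2: dividing by zero (e.g. 8 / 0) is undefined behavior in C++,
    // which is what the textbook warns about. The usual practice is to
    // guard the denominator before dividing.
    int x = 8, y = 0;
    if (y != 0) {
        std::cout << x / y << '\n';
    } else {
        std::cout << "division by zero avoided\n";
    }
    return 0;
}
```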

The question is, who is right? Is division from zero in fact a standard thing to avoid in programming? Or was that just made up?

Which companies disallow division from zero in their coding practices? Which allow it? Which expect a programmer to have no problem with division from zero? Google, Microsoft, Apple, Amazon, etc.? I would like a long list to help settle the disagreement, so it is clear who is right. You don't necessarily need to cite your source (although it helps if you can); you probably know which companies allow division from zero and which don't.

Any other resources on division from zero would be helpful as well.

Note: The question is whether division from zero is something that should in general be avoided in programming.

(Note: I am extremely sure I know who is right, but I wanted this question to be neutral.)


Source: c++
