As you observed, the eigenvalues of a matrix are the roots of its characteristic polynomial. This fact is useful in theory (and for getting a good grade in your linear algebra class :-) ), but, in real life, it would be very rare to calculate eigenvalues this way.
There are very good numerical methods for calculating eigenvalues and eigenvectors. For example, look at LAPACK, or EISPACK, or the Numerical Recipes books. This software was written by world-class experts, and in many cases it's quite old, so it has been very thoroughly tested. None of these methods uses the characteristic polynomial; they typically work by iteratively transforming the matrix in some way (Householder transformations or Jacobi rotations, for example).
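To see what this looks like in practice, here's a minimal sketch using numpy, whose `np.linalg.eig` is a thin wrapper around LAPACK's eigenvalue routines (the matrix is reduced to Hessenberg form and the QR algorithm is applied; the characteristic polynomial is never formed):

```python
import numpy as np

# A small symmetric matrix whose eigenvalues are easy to check by hand:
# [[2, 1], [1, 2]] has eigenvalues 3 and 1 (eigenvectors [1,1] and [1,-1]).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig calls LAPACK under the hood; no polynomial is involved.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues.real))  # [1.0, 3.0] (up to rounding)
```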
Actually, the ironic thing is that the relationship between polynomial roots and eigenvalues is often exploited in the opposite direction. If you want to find the roots of a polynomial, one approach is to construct its companion matrix, and then find its eigenvalues. This approach is used in the root finder in the Chebfun system, for example -- it routinely finds roots of polynomials whose degrees are in the hundreds.
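The companion-matrix trick is easy to demonstrate. The sketch below is not Chebfun's implementation, just the same idea on an ordinary polynomial; the `companion` helper is my own illustrative function (this is also essentially what numpy's `np.roots` does internally):

```python
import numpy as np

def companion(c):
    """Companion matrix of the polynomial with coefficients c
    (leading coefficient first). Its eigenvalues are the roots."""
    c = np.asarray(c, dtype=float)
    c = c / c[0]                 # normalize to a monic polynomial
    n = len(c) - 1
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)   # ones on the subdiagonal
    C[:, -1] = -c[:0:-1]         # last column: negated coefficients, reversed
    return C

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
coeffs = [1.0, -6.0, 11.0, -6.0]
roots = np.sort(np.linalg.eigvals(companion(coeffs)).real)
print(roots)  # approximately [1. 2. 3.]
```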
In some sense, finding eigenvalues is easier than finding polynomial roots -- certainly more high-quality numerical software is available to help. And for any realistic eigenvalue problem, numerical methods are unavoidable. Even for 3x3 matrices, where you could get the roots of the characteristic polynomial from the cubic formula, the numerical methods will often give you more accurate answers (though they might be a bit slower).
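As a quick sanity check of the two routes on a 3x3 example (the matrix here is just a hypothetical test case), you can compare eigenvalues computed directly against roots of the characteristic polynomial; on a well-behaved matrix like this they agree to many digits, but the polynomial route is the one that degrades for clustered or ill-conditioned spectra:

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Route 1: eigenvalues straight from LAPACK.
direct = np.sort(np.linalg.eigvals(A).real)

# Route 2: characteristic polynomial coefficients (np.poly),
# then its roots (np.roots, itself a companion-matrix eigensolve).
via_poly = np.sort(np.roots(np.poly(A)).real)

print(direct)
print(via_poly)
```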