On the original Wii, a game rendered at 480p and output over analog component cables reaches the TV as a 480p signal, and the TV itself has to upscale that image to its native resolution.
With the Wii U, the game is still rendered at 480p, but the Wii U itself scales the rendered image up to 1080p. Once the image has been upscaled, it is output over HDMI to the TV as a 1080p signal.
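For a rough sense of the numbers involved, here is the scaling math, assuming a nominal 640x480 source frame and a 1920x1080 panel (actual Wii framebuffer sizes vary by game, so treat this purely as an illustration):

```python
# Rough scaling math for 480p -> 1080p. A nominal 640x480 (4:3) source frame
# and a 1920x1080 panel are assumed here only for illustration; real Wii
# games use a few different framebuffer sizes.
src_w, src_h = 640, 480
dst_w, dst_h = 1920, 1080

print(dst_h / src_h)            # 2.25 -> each source row has to cover 2.25 panel rows
print(dst_w / src_w)            # 3.0  -> only if the 4:3 image is stretched to fill 16:9

# Keeping the 4:3 aspect ratio means using the vertical factor on both axes
# and pillarboxing the sides: 640 * 2.25 = 1440 of the 1920 columns get used.
print(src_w * (dst_h / src_h))  # 1440.0
```

The awkward part is that 2.25 is not a whole number, so the scaler can't just repeat each pixel a fixed number of times; it has to interpolate, and how well it interpolates is exactly where scaler quality comes in.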
The difference is WHERE the upscaling takes place. With standard Wii component output, it happens in the TV. With the Wii U, it happens in the console itself, and the TV does no upscaling at all.
There are some very high-quality TVs where there may be little to no difference, because those sets handle upscaling well on their own. But most modern fixed-resolution displays are terrible at upscaling (and even downscaling, for that matter). Feed them any signal other than their native resolution and you get a drop in quality, simply because their built-in scalers aren't very good. That drop usually shows up as scaling artifacts in the image: blurring, blockiness, and rough edges that make the final picture look like a JPEG that has been blown up.
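To get a concrete feel for what "good" versus "bad" scaling means, here is a minimal sketch in plain Python comparing two toy upscalers on a tiny grayscale image: nearest-neighbour (a stand-in for a crude scaler) and bilinear interpolation (a step toward what a better scaler does). The function names and test data are made up for illustration; real TV and console scalers are dedicated hardware with fancier filters, but the difference in edge quality already shows up on a toy image:

```python
# Toy upscalers for a grayscale image stored as a list of rows.
# scale_nearest copies the closest source pixel (blocky result);
# scale_bilinear blends the four surrounding pixels (smoother result).

def scale_nearest(img, new_w, new_h):
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        src_y = min(old_h - 1, int(y * old_h / new_h))
        row = []
        for x in range(new_w):
            src_x = min(old_w - 1, int(x * old_w / new_w))
            row.append(img[src_y][src_x])   # just copy the nearest source pixel
        out.append(row)
    return out

def scale_bilinear(img, new_w, new_h):
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        fy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0.0
        y0, ty = int(fy), fy - int(fy)
        y1 = min(old_h - 1, y0 + 1)
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0.0
            x0, tx = int(fx), fx - int(fx)
            x1 = min(old_w - 1, x0 + 1)
            # blend the four surrounding source pixels instead of copying one
            top = img[y0][x0] * (1 - tx) + img[y0][x1] * tx
            bot = img[y1][x0] * (1 - tx) + img[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out

if __name__ == "__main__":
    # A 4x4 "image" with a hard diagonal edge, scaled up 2.25x (like 480 -> 1080).
    src = [[255 if x > y else 0 for x in range(4)] for y in range(4)]
    for name, fn in [("nearest", scale_nearest), ("bilinear", scale_bilinear)]:
        print(name)
        for row in fn(src, 9, 9):
            print(" ".join(f"{int(v):3d}" for v in row))
```

On the hard diagonal edge in the test image, the nearest-neighbour output keeps a chunky 0/255 staircase (the blocky look the JPEG comparison above is getting at), while the bilinear output fills in intermediate values for a smoother edge.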
With the Wii U, all of the upscaling happens on the console itself, so the TV never has to do any scaling of its own. That gives you more consistent quality across different TVs, as well as a generally cleaner final image. The picture is still derived from a 480p-maximum original, but the scaling is of a higher quality, which keeps the final image clean and relatively free of artifacts.