Ian Griffiths wrote an interesting article. However, I have to take issue with a bit of it:
"A clear example of where general purpose CPUs have become fast enough to render special purpose chips obsolete is in mobile phones. (Or 'cellphones', as I believe they're usually called in the USA.) A few years ago, all digital mobile phones had two processors in them: a general purpose one to handle the signalling protocol and user interface, and a specialised digital signal processor (DSP). [...] Having these two processors was a problem. The extra processor made phones bigger, shortened battery life, and decreased reliability as a result of the increased component count. But it was necessary because only a specialised DSP was fast enough to perform the necessary processing, while a general purpose CPU was required to handle the signalling protocol and user interface. But a few years ago, the performance of low-power embedded CPUs (and in particular the ARM CPU) got to the point where one general purpose CPU could do all of the work."
My understanding is that the DSP is in fact still used - but it's no longer a separate chip. Instead it's been absorbed as an 'IP core' into the same chip which houses the display controller, keyboard controller, serial interface controllers, memory controller and the CPU core. It's still loosely labelled the 'CPU' but to reflect the changed role it's now sometimes called an 'application processor'.
See, the way chips are designed has changed quite a bit. When I was at University, still studying Electronics, we were taught a hardware description language known as VHDL. Originally this language was used for simulating circuits before producing them, but a changeover was in progress towards synthesizing circuits instead: getting a compiler to generate a configuration for a programmable logic chip, and eventually actual masks for producing silicon.
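To give a flavour of what that looks like, here's a minimal sketch of synthesizable VHDL - a 4-bit counter (the entity name `counter4` and port names are my own, just for illustration). The point is that this same source text can either be fed to a simulator to test its behaviour, or to a synthesis tool that turns it into actual gates and flip-flops:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- A 4-bit counter: increments on each rising clock edge,
-- returns to zero when reset is asserted.
entity counter4 is
    port (
        clk   : in  std_logic;
        reset : in  std_logic;
        count : out std_logic_vector(3 downto 0)
    );
end entity counter4;

architecture rtl of counter4 is
    signal value : unsigned(3 downto 0) := (others => '0');
begin
    process (clk)
    begin
        if rising_edge(clk) then
            if reset = '1' then
                value <= (others => '0');
            else
                value <= value + 1;
            end if;
        end if;
    end process;

    count <= std_logic_vector(value);
end architecture rtl;
```

A synthesis tool recognises the clocked process as a register and the `+ 1` as an incrementer; scale the same idea up far enough and you have a CPU core expressed entirely as source code - which is exactly what makes a market in IP cores possible.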
This has led to a market in, essentially, source code for circuits: the ability to buy in IP cores for a dedicated chip. By combining IP cores, a single chip can be produced to run an entire device - and if you're producing millions of them, that saves a heck of a lot. A major reason ARM have been so successful is that they've licensed their processor cores, in addition to selling standalone CPUs - basically the core connected directly to the pins. Almost every ARM-compatible application processor out there - from Texas Instruments, Samsung, whoever - is basically a licensed ARM core - an actual ARM design - decorated with whatever other cores the ODM (Original Design Manufacturer) decided to throw in.
Take my new Dell Axim X30. It features an Intel PXA270 processor. This single chip has the latest XScale core (Intel's own design implementing the ARMv5 instruction set - Intel actually design their own cores, rather than using ARM's designs), a memory controller capable of controlling SDRAM, static RAM and Flash ROM, an LCD controller with 256KB of on-board RAM, a USB host controller, a camera capture interface, a Secure Digital media I/O controller, a SIM card interface, a keypad controller, 3 standard UARTs (serial ports), an audio controller, a CompactFlash controller, and a USB client implementation. Some of these parts are probably licensed cores (although Intel do have the resources to design their own).
[Edit] I need to review these things before posting them. I forgot to add this bit:
Recent ARM cores apparently do have Java bytecode interpreters in hardware - ARM call this Jazelle. However, it must be remembered that Java was originally designed to be interpreted. CIL was not - it was designed to be JIT-compiled. Eventually even Sun had to concede that JIT-compiling Java produced far better-performing programs than interpretation ever could.